
NREL’s ESIF Data Center Houses a Supercomputer, Efficiently

The petaflop supercomputer is water-cooled and provides space heat to the offices.


An exterior view of the 182,500-square-foot Energy Systems Integration Facility at NREL in Golden, Colo.

By MOLLY RIDDELL

Since its completion in spring 2013, the Energy Systems Integration Facility (ESIF) at the National Renewable Energy Laboratory (NREL) has earned LEED Platinum certification with 56 points — well above the 52 points required for the highest LEED designation.

Already 40 percent more efficient than the baseline building-performance rating under ASHRAE/IESNA Standard 90.1-2004, the facility reaches 46.2 percent better-than-baseline efficiency thanks to a 720-kilowatt (kW) solar array on its Golden, Colo., campus.

As the nation’s first research facility dedicated to integrating renewable energy into the electric grid, the ESIF enables researchers and manufacturers to test technologies at a megawatt (MW) scale. The 182,500-square-foot (ft²) facility houses 15 laboratories, several outdoor test beds, offices for 200 researchers and a high-performance computing (HPC) data center with one of the most powerful supercomputers ever built.


The hot aisle containment racks over the HPC data center storage tape library capture waste heat that is then used to heat office spaces in the ESIF.

The ESIF petaflop-scale supercomputer performs a quadrillion calculations per second, allowing researchers to simulate how multiple renewable energy technologies interact with each other and the grid.

Power usage effectiveness (PUE) is the ratio of a data center's total power draw to the power drawn by its computing equipment alone. Compared with a typical data center's annualized average PUE of 1.80, the HPC data center achieves a PUE of 1.04, making it more than 30 percent more efficient than even a typical "green" data center.
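The PUE arithmetic can be sketched as follows. The individual load figures below are hypothetical, chosen only to reproduce the article's reported PUE of 1.04; they are not NREL's metered data.

```python
# PUE = total facility power / IT equipment power.
# All load figures are illustrative assumptions.

it_load_kw = 1000.0      # servers, storage, networking
cooling_kw = 25.0        # pumps for the warm-water loop (assumed)
power_dist_kw = 10.0     # transformer/UPS losses (assumed)
lighting_kw = 5.0        # data center lighting (assumed)

total_kw = it_load_kw + cooling_kw + power_dist_kw + lighting_kw
pue = total_kw / it_load_kw
print(f"PUE = {pue:.2f}")    # prints "PUE = 1.04" with these assumed loads
```

The key design lever is shrinking everything except the IT load: at ESIF, eliminating mechanical chillers removes most of the non-computing overhead.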

According to Shanti Pless, LEED AP and senior research engineer at NREL, the ESIF is on track to meet or exceed its target energy use intensity of 26.7 kBTU/ft² per year, 87 percent more efficient than the average commercial building in the Commercial Buildings Energy Consumption Survey (CBECS). In designing the data center, housed separately from the office spaces, engineers assumed 65 watts of continuous consumption for each employee. Of the target 26.7 kBTU/ft² per year, they assigned 16.84 kBTU (about 63 percent) to the server and equipment plug loads, leaving 9.9 kBTU for the pumps, lighting, fans and HVAC systems, of which only 6.6 kBTU were actually used.
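The budget arithmetic can be checked directly from the figures quoted above (a sketch; only the four numbers from the article are used):

```python
target_eui = 26.7       # kBTU/ft² per year, whole-building target
plug_loads = 16.84      # kBTU budgeted for server/equipment plug loads
systems_budget = 9.9    # kBTU left for pumps, lighting, fans, HVAC
systems_actual = 6.6    # kBTU those systems actually used

# The two budget lines sum back to (approximately) the target...
assert abs((plug_loads + systems_budget) - target_eui) < 0.1

# ...and the building systems came in well under their allocation:
headroom = systems_budget - systems_actual
print(f"plug-load share: {plug_loads / target_eui:.0%}")   # about 63%
print(f"systems headroom: {headroom:.1f} kBTU/ft² per year")
```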

A “Systems Approach” to Efficiency


ESIF office spaces support more than 200 research staff.

Following a whole-building, integrated-design approach, the ESIF incorporates long skylights and clerestory glazing to flood occupied spaces with natural light, operable windows for cooling and ventilation, solar-powered fans and recycled materials. The centerpiece is the HPC data center's warm-water liquid cooling system, used in place of mechanical chillers to keep computer components from overheating.

Because water has approximately 1,000 times the cooling capacity of an equal volume of air, pumping water is more efficient than running fans to circulate air. Water reaches the servers at approximately 75°F (24°C) and returns at more than 100°F (38°C). This heated "outflow" circulates to warm laboratory and office spaces and to melt snow and ice on outside walkways.
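A back-of-the-envelope calculation shows how much heat that temperature rise lets the water carry. The flow rate and specific heat below are textbook assumptions, not ESIF design figures:

```python
# Heat carried by a moving fluid: Q = m_dot * c * delta_T
flow_kg_per_s = 1.0      # 1 L/s of water is about 1 kg/s (assumed flow)
c_water = 4186.0         # specific heat of water, J/(kg·K)
t_supply_c = 24.0        # ~75°F supply temperature, from the article
t_return_c = 38.0        # ~100°F return temperature, from the article

q_watts = flow_kg_per_s * c_water * (t_return_c - t_supply_c)
print(f"heat removed: {q_watts / 1000:.1f} kW per L/s of flow")
```

A single liter per second of water moves nearly 59 kW of heat across that 14°C rise; moving the same heat with air would require an enormously larger volume flow, and correspondingly more fan power.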

Integrated Control Room

The integrated control room underneath the HPC data center.

Because the ESIF uses no mechanical or compressor-based cooling systems, and because the cooling liquid is supplied indirectly from cooling towers, the ambient temperature is warmer than in typical data centers.

On its first day of operation the facility beat its initial energy utilization efficiency target of 0.9 by 20 percent; it currently operates at 0.7, putting 30 percent of the waste heat to use.
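One plausible reading of this figure (an assumption, not stated in the article) is an energy-reuse-effectiveness (ERE) style metric: total facility energy minus reused heat, divided by IT energy. Reusing waste heat is the only way such a ratio can drop below 1.0. In the sketch below, the PUE of 1.04 comes from the article, while the reuse fraction is chosen purely for illustration:

```python
# ERE-style metric: (total facility energy - reused heat) / IT energy.
# The 33% reuse fraction is an illustrative assumption.

pue = 1.04
it_energy = 1000.0             # arbitrary IT energy, kWh
reused = 0.33 * it_energy      # assume ~33% of the IT heat is reused

total = it_energy * pue
ere = (total - reused) / it_energy
print(f"ERE = {ere:.2f}")      # about 0.71 with these assumptions
```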

Over time, the data center’s ultra-high efficiency will translate to cost savings. At a first cost of $10 million, it is expected to save about $1 million in annual operating costs compared with traditional data centers, including $800,000 in electrical energy savings and $200,000 in thermal energy savings.

Powerfully Integrated Systems


The ESIF Supervisory Control and Data Acquisition System monitors and controls research facility processes and gathers and disseminates real-time data for collaboration and visualization.

The computers are fully integrated with a utility-scale power hardware-in-the-loop (PHIL) testing capability, connecting physical devices such as photovoltaic (PV) inverters to a simulated electric grid.

Researchers evaluate component and system performance under actual loads. The PHIL capability includes a programmable 1-MW AC grid simulator, a 1-MW PV simulator and a 1-MW load bank connected through the ESIF’s Research Electrical Distribution Bus. This power-integration circuit connects energy sources in all of the laboratories with “plug-and-play” test components, using two AC and two DC ring buses. The Supervisory Control and Data Acquisition System monitors and controls all of the ESIF systems, recording real-time, high-resolution data.
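Conceptually, a power-hardware-in-the-loop step closes a feedback loop between a grid simulation and the device under test each timestep. The toy sketch below is purely illustrative; the class names, voltage model and ratings are assumptions, and the real ESIF setup drives physical megawatt-scale hardware rather than software models:

```python
# Toy PHIL loop: a simulated grid sets a voltage, a modeled device
# responds with current, and the measured power is fed back into
# the simulation each timestep. Everything here is illustrative.

class GridSimulator:
    def __init__(self, nominal_v=480.0):
        self.voltage = nominal_v           # nominal line voltage, V

    def step(self, device_power_w):
        # A real grid model solves power flow; here we just sag the
        # voltage slightly under load (assumed linear droop).
        self.voltage = 480.0 - 1e-5 * device_power_w
        return self.voltage

class InverterModel:
    """Stand-in for the physical device under test."""
    def __init__(self, setpoint_w=500_000.0):  # 500-kW inverter
        self.setpoint_w = setpoint_w

    def respond(self, voltage):
        # Constant-power output: current rises as voltage sags.
        return self.setpoint_w / voltage       # amps (single-phase toy)

grid = GridSimulator()
inverter = InverterModel()
power_w = 0.0
for _ in range(10):              # ten closed-loop timesteps
    v = grid.step(power_w)       # simulator applies a voltage
    i = inverter.respond(v)      # device reacts with a current
    power_w = v * i              # measured power fed back to the sim
print(f"settled at {power_w / 1000:.0f} kW, {grid.voltage:.1f} V")
```

The value of the real facility is that "device" in this loop is actual hardware, so a utility can observe genuine equipment behavior against simulated grid conditions without risking a live circuit.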

In 2013, NREL researchers and partner Advanced Energy evaluated the grid support features of AE’s 500-kW PV inverter and connected the inverter to a simulation of a simplified distribution system. They studied the inverter’s response to a simulated islanding event, the first step in evaluating the PHIL using a real-world distribution system model. The ongoing project is among the first to prove that electric utilities and manufacturers can use ESIF capabilities to test new and potentially groundbreaking changes to their circuits with no risk to the utilities or their customers.

Molly Riddell is a science and technology writer and contributing editor with SOLAR TODAY. Contact her at mriddell@solartoday.org.

