Saving Energy with Free Cooling and How Well It Works

Saving Energy with Free Cooling and How Well It Works
Brent Draney, HPC Facility Integration Lead
Data Center Efficiency Summit 2014

NERSC is the primary computing facility for the DOE Office of Science
- Directly supports the DOE science mission
- Award-winning science, including 2 Nobel prizes (2007 and 2011)
- 1,500 journal publications per year; 10-12 cover articles per year (17 journal covers in 2012)
- Broad user base: 4,500 users, geographically distributed across 47 states and 41 countries
- Diverse workload: many codes (600+) and algorithms; computing at scale and at high volume

What is free cooling?
- Cooling without the use of mechanical refrigeration
- Utilizes an environmental cooling source: lake water (CSCS), aquifer (iVEC), open air / evaporation (NERSC)
- Not really free: the laws of thermodynamics say that nothing is free! Pumps, fans, etc. still draw power, but 2-3x less
- Can be utilized full- or part-time

The coldest winter I ever spent was a summer in San Francisco.

Berkeley average temperatures
[Figure] The daily average low (blue) and high (red) for Oakland/Berkeley, with percentile bands (inner 25% to 75%, outer 10% to 90%). Data from weatherspark.com.

Edison (Cray XC30)
- First Cray XC30: Intel processors, Aries interconnect, air/near-water cooled
- 30 compute cabinets, 23 blower cabinets
- 5,200 compute nodes, 333 TB memory
- 6.4 PB storage @ 140 GB/s
- 12 login nodes, 6 TB
- 1.6 MW average, 2.0 MW peak. That's about 67 kW per rack (2.0 MW peak across 30 compute cabinets)!

Cray XC30 cooling mechanism
- One blower assembly for each cabinet pair (group)
- Compute rack cooling coil valve adjusts flow rate to maintain outlet air temperature
- Inlet air preconditioning option
- Exhaust air can be room neutral or require residual cooling

XC30 Cabinet Design
[Figure] Compute cabinet, blower cabinet, compute chassis, power shelf, vertical water coil

XC30 Blower Cabinet
- N+1 configurations
- Hot-swap blower assembly
- Low pressure and velocity of air
- Low noise (TN26-6 standard, 75 dB/cabinet)
- Heat generated by blowers is extracted by radiators in the compute cabinets

Under-row Cooling Manifold
- Piping can get tight: dedicate rows for power and cooling
- An 8 in. pipe is the largest you can get in a single row, and that should work for 16 racks
- Consider feeding power and cooling from opposite ends to avoid conflicts

Parallel-Flow vs. Counter-Flow Heat Exchangers
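As a hedged reminder of why the counter-flow arrangement matters here (the slide itself only shows the two flow diagrams): for the same four terminal temperatures, the counter-flow exchanger has the larger log-mean temperature difference, so it needs less surface area or, equivalently, can reach a closer approach temperature between the water loops.

```latex
% Log-mean temperature difference (LMTD) for a two-stream heat exchanger,
% where \Delta T_1 and \Delta T_2 are the terminal temperature differences:
%   parallel flow:  \Delta T_1 = T_{h,in} - T_{c,in},   \Delta T_2 = T_{h,out} - T_{c,out}
%   counter flow:   \Delta T_1 = T_{h,in} - T_{c,out},  \Delta T_2 = T_{h,out} - T_{c,in}
\Delta T_{lm} = \frac{\Delta T_1 - \Delta T_2}{\ln\!\left(\Delta T_1 / \Delta T_2\right)},
\qquad
Q = U A \, \Delta T_{lm}
```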

Things to consider for liquid cooling
- Water temperature
- Wet-bulb temperature
- Approach
- Dew point
- Flow rate
- Differential pressure
- Processor type
- Altitude
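To show how the flow-rate item interacts with water temperature, here is a minimal sketch (not from the talk): it computes the water flow one cabinet needs from Q = m_dot * cp * delta_T. The 67 kW per-cabinet load comes from the Edison slide; the 10 F water temperature rise is an assumed value. As supply water gets warmer, the allowable rise shrinks and the required flow grows, which is why the lessons-learned slide says to size flow at the highest water temperature.

```python
# Minimal sketch: required cooling-water flow for one cabinet.
# Assumption (not from the talk): allowable water delta-T of 10 F.

CP_WATER = 4186.0   # J/(kg*K), specific heat of water
RHO_WATER = 997.0   # kg/m^3, density of water near room temperature


def required_flow_gpm(heat_load_kw: float, delta_t_f: float) -> float:
    """Water flow (US gallons per minute) to absorb heat_load_kw with a delta_t_f rise."""
    delta_t_k = delta_t_f * 5.0 / 9.0                           # F-degrees -> K-degrees
    mass_flow = heat_load_kw * 1000.0 / (CP_WATER * delta_t_k)  # kg/s
    vol_flow_m3s = mass_flow / RHO_WATER                        # m^3/s
    return vol_flow_m3s * 15850.3                               # m^3/s -> US gpm


if __name__ == "__main__":
    # ~67 kW per Edison cabinet (earlier slide), assumed 10 F water delta-T
    print(f"{required_flow_gpm(67.0, 10.0):.1f} gpm per cabinet")
```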

Things to consider for liquid cooling: water quality

  Parameter                                            Required
  pH                                                   9.5 to 10.5
  Nitrite (NO2, ppm)                                   > 600
  Molybdate (MoO4, ppm)                                > 250
  Conductivity (µS)                                    < 5,000
  Combined sulfate and chloride (Na2SO4 + NaCl, ppm)   < 500
  Copper (ppm)                                         < 0.2
  Aerobic bacteria (per ml)                            < 1,000
  Anaerobic bacteria (per ml)                          < 10
  Total hardness (ppm)                                 < 10
  Iron (ppm)                                           < 1
  Manganese (ppm)                                      < 0.1
  Suspended solids (ppm)                               < 20
  Turbidity (NTU)                                      20

The Bay Area climate provides the ability to generate cool water all year round

Supplied water temperature (hours per year):

              <70F    70-71F   71-72F   72-73F   73-74F   74-75F   >75F
  One cell    8,258   261      101      64       45       19       12
  Two cell    8,636   63       32       19       7        2        1
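A hedged sketch of how a table like this can be derived: given hourly wet-bulb temperatures for the site and an assumed cooling-tower approach, the supplied-water temperature each hour is roughly wet-bulb plus approach, and the hours are then binned. The approach values below are illustrative placeholders, not NERSC's design figures.

```python
# Illustrative sketch: bin cooling-tower supply-water temperature by hour.
# Assumption: supply_temp ~= wet_bulb + approach; the approach numbers are
# made up for illustration (a second tower cell reduces the approach).

from collections import Counter


def bin_supply_temps(hourly_wet_bulb_f, approach_f):
    """Count hours per year in each supply-water temperature band (deg F)."""
    bands = Counter()
    for wb in hourly_wet_bulb_f:
        supply = wb + approach_f
        if supply < 70:
            bands["<70F"] += 1
        elif supply >= 75:
            bands[">75F"] += 1
        else:
            lo = int(supply)                 # e.g. 72.4 F falls in the 72-73F band
            bands[f"{lo}-{lo + 1}F"] += 1
    return bands


if __name__ == "__main__":
    # hourly_wet_bulb_f would come from a TMY/weather file for Oakland/Berkeley
    hourly_wet_bulb_f = [55.0] * 8760        # placeholder data only
    print(bin_supply_temps(hourly_wet_bulb_f, approach_f=7.0))  # one cell (assumed)
    print(bin_supply_temps(hourly_wet_bulb_f, approach_f=5.0))  # two cells (assumed)
```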

[Figure] Power consumption with and without chillers

Power Savings and Payback Period Analysis

Annual average electrical consumption:
- WSE (free cooling): 54 kW; chiller plant: 270 kW
- Average annual load savings of 216 kW (for WSE)
- 216 kW * 365 days * 24 hours/day = 1,892,000 kWh

Initial cost of WSE (free cooling), total rough estimate = $643,500
- New construction of WSE system = $474,000
- Demolition/removal of packaged chiller plant = $169,500

Initial cost of chiller plant, total rough estimate = $168,300

Annual savings = 1,892,000 kWh * $0.10/kWh = $189,200
Cost premium for free cooling = $643,500 - $168,300 = $475,200
PG&E rebate: $416,000
$60,000 net install cost vs. $189,200 annual power savings!

Lessons Learned / Observations
- Warm-water cooling is not well understood by mechanical engineers
- Computer system needs and responses are even less well understood, by computer vendors as well as by mechanical engineers
- Carefully calculate flow-rate needs at the highest water temperatures, including pressure drop in the distribution pipes
- You must have the computer's Delta-T/Delta-P from the manufacturer AND in your contract!
- Determine whether room neutral or a fixed set point is right for your machine room
- Stainless steel fittings need to be handled carefully to avoid leaks
- Temperature stratification from the bottom to the top of the rack can still occur

Computer and cooling are a single system
- Feedback between the compute system and the Building Automation System: adjust pressure to meet compute needs and save on pumping costs
- Control delays may introduce temperature and pressure oscillations: the compute system modulates cooling by adjusting water flow through the cabinets, but the effects may not be seen at the cooling plant for several minutes due to the travel time of water through the pipes; tune plant controls to dampen oscillations (see the sketch below)
- Optimize the tradeoff between system power and cooling energy: processor chips draw less leakage current and perform more reliably (frequency scaling) at lower temperatures, so determine the temperature that reduces total power, not just the one that improves PUE
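One common way to dampen the oscillations described above is to low-pass filter the plant setpoint that follows the compute side's flow demand, so a minutes-long transport delay through the piping does not turn the feedback loop into a temperature/pressure see-saw. The following is a sketch under assumptions, not NERSC's actual building-automation logic; every name and number in it is illustrative.

```python
# Sketch: exponential smoothing of a differential-pressure setpoint so the
# cooling plant follows compute-side demand without oscillating.
# All names and numbers are illustrative assumptions, not the actual controls.


def smooth_setpoint(current_setpoint: float, demanded_setpoint: float,
                    alpha: float = 0.1) -> float:
    """Move a fraction alpha of the way toward the demanded value each control step.

    Small alpha = heavy damping (slow response); large alpha = fast response but
    prone to oscillation when the water transport delay is long.
    """
    return current_setpoint + alpha * (demanded_setpoint - current_setpoint)


if __name__ == "__main__":
    setpoint = 12.0                        # psi differential pressure (assumed)
    demand = [12.0] * 5 + [18.0] * 25      # compute suddenly asks for more flow
    for d in demand:
        setpoint = smooth_setpoint(setpoint, d)
    print(f"setpoint after 30 steps: {setpoint:.2f} psi")
```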

Edison Power Efficiency

                                          Edison (latest)   Hopper XE6 (previous)
  System cabinets                         1,540 kW          2,215 kW
  Login nodes and storage                 69 kW             125 kW
  Blowers                                 90 kW             Incl. in system
  Cooling plant                           80 kW             538 kW
  PUE (blowers in IT load)                1.047             1.23
  PUE (blowers in mech. load)             1.106             N/A
  Sustained performance (SSP)             236 TF            144 TF
  Performance per watt (incl. cooling)    132 GF/kW         50 GF/kW
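The two PUE rows above differ only in whether the 90 kW of blower power is counted as IT load or as mechanical load. A small sketch reproducing the Edison figures from the table, using PUE = total facility power / IT power:

```python
# Reproduce the Edison PUE figures from the table above.
# PUE = total facility power / IT power; the only question is where the
# blower power is counted.

system_kw = 1540.0    # system cabinets
storage_kw = 69.0     # login nodes and storage
blowers_kw = 90.0
cooling_kw = 80.0     # cooling plant

total_kw = system_kw + storage_kw + blowers_kw + cooling_kw

pue_blowers_in_it = total_kw / (system_kw + storage_kw + blowers_kw)
pue_blowers_in_mech = total_kw / (system_kw + storage_kw)

print(f"PUE (blowers in IT load):    {pue_blowers_in_it:.3f}")    # ~1.047
print(f"PUE (blowers in mech. load): {pue_blowers_in_mech:.3f}")  # ~1.106
```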

Actual Construction Cost and Rebate

Power Savings and Payback Period: Actual

Annual average electrical consumption:
- WSE (free cooling): 54 kW; chiller plant: 270 kW
- Average annual load savings for WSE: estimated 216 kW, actual 211.2 kW
- Estimated 216 kW * 365 days * 24 hours/day = 1,892,000 kWh; actual 1,689,044 kWh

Initial cost of WSE (free cooling): estimated $643,500; actual $2,575,600 (roughly 4x!)
- Estimated new construction of WSE system = $474,000
- Estimated demolition/removal of packaged chiller plant = $169,500

Initial cost of chiller plant: estimated $168,300; actual $673,618 (roughly 4x)

Annual savings: estimated 1,892,000 kWh * $0.10/kWh = $189,200; actual $168,904
Cost premium for free cooling: estimated $643,500 - $168,300 = $475,200; actual $1,901,981
PG&E rebate: estimated $416,000; actual $274,477

$1,627,505 net install cost vs. $168,904 annual power savings: 9.6-year payoff
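The simple-payback arithmetic behind the 9.6-year figure, using the numbers from this slide (a sketch of the calculation only; all cost and savings figures are taken from the slide):

```python
# Simple payback on the water-side economizer (WSE), using the slide's figures.


def simple_payback_years(wse_cost, chiller_cost, rebate, annual_savings):
    """Net install premium for free cooling divided by annual energy savings."""
    premium = wse_cost - chiller_cost   # extra cost of WSE over a chiller plant
    net_cost = premium - rebate         # after the utility rebate
    return net_cost / annual_savings


# As-built (actual) figures from the slide
actual = simple_payback_years(wse_cost=2_575_600, chiller_cost=673_618,
                              rebate=274_477, annual_savings=168_904)

# Original estimate for comparison (matches the earlier "$60,000 net" slide)
estimate = simple_payback_years(wse_cost=643_500, chiller_cost=168_300,
                                rebate=416_000, annual_savings=189_200)

print(f"actual payback:    {actual:.1f} years")    # ~9.6
print(f"estimated payback: {estimate:.1f} years")  # ~0.3
```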

NERSC will move to CRT in Spring 2015
- Mixed office and data center building: 300 offices on two floors
- 20 ksf HPC floor, expandable to 28 ksf; mechanical space; 140 ksf gross
- Extremely energy efficient: LEED Gold design, free cooling for air and water, heat recovery
- Expandable power: 12.5 MW at move-in, 42 MW capacity to the building

Local weather and hillside design enable energy savings
- Air cooling: 75°F air year-round without chillers; computer room exhaust heat used to warm the office floors
- Liquid cooling: 74°F water year-round without chillers; 65°F water using chillers for 560 hours/year (6%)
- Predicted Power Usage Effectiveness (PUE): < 1.1
[Figure] PUE simulation for two system configurations

National Energy Research Scientific Computing Center