Results of Study Comparing Liquid Cooling Methods

School of Mechanical Engineering, Faculty of Engineering. Results of Study Comparing Liquid Cooling Methods. Dr Jon Summers (J.L.Summers@Leeds.Ac.UK), Institute of ThermoFluids (itf). 54th HPC User Forum, September 15-17, 2014, Seattle, WA.

Topical: HPC in Leeds, England. Tier 2 HPC for eight northern English universities, plus Tier 3 facilities for researchers in all faculties. The Tier 1 HPC facility (Archer) is in Edinburgh, Scotland; Tier 1 is a national facility. Scottish independence would mean the Tier 1 facility should be repatriated! Government can influence!

Liquid cooling is not new.

Traditional air cooling. Source: ASHRAE Datacom, Volume 2.

Liquid Cooling Solutions: moving from air towards total liquid cooling (mixes facility and IT!).
- ILC = Indirect Liquid Cooling: 100% air at the server, with liquid to the rack.
- DLC = Direct Liquid Cooling: between 0% and 100% air, with liquid to the rack.
- TLC = Total Liquid Cooling: 0% air, with liquid to the rack.

Indirect Liquid Cooling to the racks. Photographs

Indirect Liquid Cooling to the rack: performance. Heat-exchanger energy balance $Q_{bdc} = \dot{m}_h\,c_p\,(T_{air} - T_{ref}) = U A\,\Delta T$, with the overall coefficient $U$ correlated after Tang (2009), coefficient 0.71. OnRak and Compact Hot Aisle experiments by Adam Thompson, PhD student, at the Airedale test facilities in 2011.

Indirect Liquid Cooling to the rack: performance (3rd November 2010). Almoli, Ali, et al. "Computational fluid dynamic investigation of liquid rack cooling in data centres." Applied Energy 89.1 (2012): 150-155.

Indirect Liquid Cooling to the rack: performance. High Performance Computing (HPC) installation using ILC, installed in 2010 and 2011. The system delivers 45 TFlops in 6 racks: 3 with passive ILC rear doors and 3 with active ILC rear doors. One power measurement showed a consumption of 137 kW (104 kW compute, 33 kW cooling), giving a pPUE of 1.32 (137/104). 45 TFlops / 137 kW = 0.33 GFlops/W. The HPC team reported a running cost saving of 50k per annum over traditional air cooling.
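
As a quick cross-check, a minimal Python sketch of the pPUE and efficiency arithmetic, using only the figures quoted on the slide:

```python
# Cross-check of the quoted ILC installation figures.
compute_kw = 104.0                      # measured compute (IT) power, kW
cooling_kw = 33.0                       # measured cooling power, kW
total_kw = compute_kw + cooling_kw      # 137 kW measured total

ppue = total_kw / compute_kw                     # partial PUE: 137/104 ~= 1.32
gflops_per_watt = 45_000 / (total_kw * 1_000)    # 45 TFlops over 137 kW ~= 0.33 GFlops/W

print(f"pPUE = {ppue:.2f}, {gflops_per_watt:.2f} GFlops/W")
```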

Direct Liquid Cooling into the server. Retrofit into HPC nodes decommissioned in 2007. The setup is in a laboratory; the primary research goal is to look at cooling on demand.

Direct Liquid Cooling into the server. Using StressLinux, a load profile is run for 6 minutes each time. [Plots: idle-test temperatures over 35 minutes for three air-cooled tests and three DCLC tests; component temperatures (ambient, RAM 0, RAM 1, CPU 0, CPU 1, HDD, Ethernet, SP) over 40 seconds for the air-cooled and liquid-cooled servers.]

Direct Liquid Cooling into the server: 60 pumps, CHx liquid-to-liquid unit, and cooling manifold.

Direct Liquid Cooling into the server. Each air-cooled server consumes 230 W. At maximum load on the air-cooled server the air temperature difference is around 18°C, giving a flow rate of 9 litres per second. The direct liquid cooled (DLC) server loses 4 of its 8 fans. With half the flow rate and a measured temperature difference of 8°C, the rate of heat taken up by the air for each server is 44 W. For 30 servers in a rack, the liquid-cooled retrofit harvests 78% of the heat into the water, and the heat load in the laboratory is estimated to be 1.4 kW. The pPUE is difficult to calculate because the building water supplies many laboratories.
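
These figures follow from the air-side energy balance Q = rho * V_dot * c_p * dT. A minimal sketch of that arithmetic, assuming standard air properties (rho ~ 1.2 kg/m^3, c_p ~ 1005 J/(kg K)) and an approximate 200 W DLC server power after removing four fans; the property values and the 200 W figure are assumptions, not stated on the slide:

```python
# Air-side energy balance Q = rho * V_dot * c_p * dT for the DLC retrofit.
RHO_AIR = 1.2     # kg/m^3, assumed air density
CP_AIR = 1005.0   # J/(kg K), assumed specific heat of air

v_dot = 0.009 / 2   # m^3/s: half of the ~9 L/s air-cooled flow rate
delta_t = 8.0       # K: measured air-side temperature rise on the DLC server

q_air = RHO_AIR * CP_AIR * v_dot * delta_t   # ~44 W left in the air per DLC server
rack_air_kw = 30 * q_air / 1_000             # ~1.3 kW into the laboratory (slide estimates 1.4 kW)

dlc_server_w = 200.0                         # assumed DLC server power after losing 4 fans
water_fraction = 1 - q_air / dlc_server_w    # ~78% of the heat harvested in the water

print(f"{q_air:.0f} W to air per server, {rack_air_kw:.1f} kW per rack, "
      f"~{water_fraction:.0%} captured in water")
```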

Total Liquid Cooling by full immersion. Facility deployment at the University of Leeds: Iceotope and 3M.

Total Liquid Cooling by full immersion. The Iceotope solution does not use air, and therefore has no fans. There is no need to consider server approach temperatures or humidity; only the CPU and system temperatures matter. In the Leeds deployment there are four heat transfer steps:
1. Microelectronics to the water jacket.
2. Server water jacket to the rack water circuit.
3. Rack to the secondary heating water circuit.
4. Heat transfer by convection into the laboratory.
All heat transfer is in liquids; no fans are required at all.

Total Liquid Cooling by full immersion: microelectronics-level heat transfer. Heat transfer capabilities were investigated using a controlled proxy mainboard.

Water in | Water out | Water flow rate        | Electrical power in | Water power out
27°C     | 36°C      | 1.15×10⁻⁵ m³ s⁻¹       | 500 W               | 432 W
29°C     | 38°C      | 1.13×10⁻⁵ m³ s⁻¹       | 500 W               | 441 W
30°C     | 39°C      | 1.13×10⁻⁵ m³ s⁻¹       | 500 W               | 448 W
32°C     | 42°C      | 1.10×10⁻⁵ m³ s⁻¹       | 500 W               | 464 W
35°C     | 45°C      | 1.105×10⁻⁵ m³ s⁻¹      | 500 W               | 461 W

Hopton, P., & Summers, J. (2013, March). Enclosed liquid natural convection as a means of transferring heat from microelectronics to cold plates. In Semiconductor Thermal Measurement and Management Symposium (SEMI-THERM), 29th Annual IEEE (pp. 60-64). IEEE.
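
The reported water-side powers can be cross-checked against Q = rho * V_dot * c_p * (T_out - T_in). A minimal sketch, assuming nominal water properties (rho ~ 998 kg/m^3, c_p ~ 4186 J/(kg K)), which are not stated on the slide:

```python
# Water-side cross-check of the proxy-mainboard measurements.
RHO_WATER = 998.0    # kg/m^3, assumed water density
CP_WATER = 4186.0    # J/(kg K), assumed specific heat of water

rows = [
    # (T_in C, T_out C, flow rate m^3/s, electrical power in W, reported water power out W)
    (27, 36, 1.15e-5, 500, 432),
    (29, 38, 1.13e-5, 500, 441),
    (30, 39, 1.13e-5, 500, 448),
    (32, 42, 1.10e-5, 500, 464),
    (35, 45, 1.105e-5, 500, 461),
]

for t_in, t_out, v_dot, p_in, p_reported in rows:
    q_water = RHO_WATER * CP_WATER * v_dot * (t_out - t_in)
    print(f"dT = {t_out - t_in} K: {q_water:.0f} W computed, {p_reported} W reported "
          f"({p_reported / p_in:.0%} of electrical input captured)")
```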

Total Liquid Cooling by full immersion: rack-to-laboratory heat transfer. Uses a 20 W central heating pump and two 2.5 kW domestic hot water radiators. Thermal images show the surface temperatures of the rack and the radiators. With 2.2 kW of IT and 95 W of pumps, this gives a pPUE of 1.05.

Liquid cooling summary:

Method | pPUE | Efficiency    | Heat capture in liquid
ILC    | 1.32 | 0.33 GFlops/W | 80 to 91% at 18°C
DLC    | TBC  | -             | up to 78% at approx. 25°C
TLC    | 1.05 | 1.07 GFlops/W | 89% at 45°C

Liquid Cooling Performance Comparison (system model): ILC (RDHx) vs TLC (Immersion).

Server / IT level:
Type         | Load      | Number | ILC (RDHx)   | TLC (Immersion)
Motherboard  | 305 W     | 672    | 204.96 kW    | 204.96 kW
Cooling fans | 23.4 W    | 672    | 15.72 kW     | N/A
Storage      | 0.85 W    | 672    | 0.571 kW     | 0.571 kW
PSU fan      | 15.8 W    | 168    | 2.65 kW      | N/A
PSU loss     | 7% loss   | 168    | 15.67 kW     | 14.43 kW
Performance  |           |        | 223.1 TFLOPS | 223.1 TFLOPS
Total load   |           |        | 223.91 kW    | 206.27 kW

Cabinet level:
Cabinet fan  | 161 W     | 14     | 2.254 kW     | N/A
Pump         | 45 W      | 14     | N/A          | 0.63 kW
Telco        | 33.4 W    | 28     | 0.935 kW     | 0.935 kW
PDU          | 3.5% loss | 28     | 7.84 kW      | 7.84 kW
Component    |           |        | 11.03 kW     | 8.78 kW
Total load   |           |        | 234.94 kW    | 215.05 kW

Total facility power:
UPS          | 4% loss            | 2  | 9.40 kW  | N/A... | 8.60 kW
Chiller      | EER=3.03 (on)      | 2  | 77.54 kW | N/A
Chiller      | EER=19.3 (off)     | 2  | N/A      | 11.14 kW
Ventilation  | 880 W              | 10 | 8.8 kW   | N/A
Component    |                    |    | 95.73 kW | 19.74 kW
Total load   |                    |    | 330.67 kW| 234.80 kW

Total PUE     | 1.477    | 1.138
MFLOPS/W      | 674.7    | 950.2
Total cooling | 98.17 kW | 11.77 kW
Power saving (TLC vs ILC): 95.88 kW

MFLOPS/W is based on the full facility. Systems: Supermicro X9D with dual Xeon (Sandy Bridge) E5-2670.

Chi, Y. Q., et al. (2014, March). Case study of a data centre using enclosed, immersed, direct liquid-cooled servers. In Semiconductor Thermal Measurement and Management Symposium (SEMI-THERM), 30th Annual (pp. 164-173). IEEE.
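
The headline PUE and MFLOPS/W figures follow directly from the table totals. A minimal sketch of that arithmetic, assuming the IT load is the server/IT-level total (consistent with the slide's PUE values):

```python
# Reproduce the headline figures from the system-model totals above.
PERF_TFLOPS = 223.1

it_kw = {"ILC (RDHx)": 223.91, "TLC (Immersion)": 206.27}        # server/IT-level totals
facility_kw = {"ILC (RDHx)": 330.67, "TLC (Immersion)": 234.80}  # total facility power

for system in it_kw:
    pue = facility_kw[system] / it_kw[system]                        # 1.477 and 1.138
    mflops_per_w = PERF_TFLOPS * 1e6 / (facility_kw[system] * 1e3)   # 674.7 and 950.2
    print(f"{system}: PUE = {pue:.3f}, {mflops_per_w:.1f} MFLOPS/W")

saving = facility_kw["ILC (RDHx)"] - facility_kw["TLC (Immersion)"]
print(f"Power saving: {saving:.2f} kW")   # ~95.9 kW (slide: 95.88 kW)
```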

Advantages of liquid cooling:
- A data centre using xLC offers better space utilisation; good for HPC facilities.
- Thermal inertia: with DLC/TLC, a failure in the cooling system gives more than 60 minutes before the IT overheats; for air cooling this is around 120 seconds.
- Increased IT reliability: less thermal stress and/or corrosion of components with TLC.
- Reduced power consumption: the server requires no fans (TLC) or a reduced number of fans (ILC).
- xLC raises the limits of rack density (good for HPC).
- TLC with good insulation offers a reduction in extra data centre infrastructure.
- xLC offers greater potential for waste heat re-use, with potential for carbon offsetting.
- TLC does not need any dew point control: no condensation risk, and it can run hotter.

Disadvantages of liquid cooling:
- Requirement for extensive leak detection.
- RDHx (ILC) is well established and used in HPC; in-line heat exchangers (e.g. APC) are now used.
- Extra complexity in the cooling infrastructure at the rack level, which is perhaps not worth the associated risks for situations with less than 5 kW per cabinet (not usually HPC).
- Requires a raised floor or void for the facility water circuits, not overhead.
- Liquid cooling to the rack or the row would need numerous valves, or two facility water feeds, to be concurrently maintainable.

Summary. xLC can achieve a pPUE of around 1.05 (and the "1" really corresponds to less than 0.93 of an equivalent air-cooled server's power, since server fans can consume between 7 and 25% of server power). This yields more power for compute. CapEx is lower if most of the heat is harvested in the liquid, and OpEx is lower than the best air cooling. Liquid cooling has many environmental advantages: lower server power consumption, greater rack density, and a less constrained environment. Having the heat in the liquid offers greater potential for its reuse, which could be incentivised by carbon offsetting.

Any questions, please? Email address: J.L.Summers@Leeds.Ac.UK. Thanks to colleagues and PhD students who contributed to the current research. Many thanks for the financial and in-kind support from The University of Leeds Digital Innovations Hub, Technology Roadmapping, and the EU-funded PEDCA project (number 320013).