NREL Validates LCS Technology. Presented by Herb Zien, CEO, LiquidCool Solutions


Learning Objectives
As much as 40% of data center power is used to circulate air, and fans in a typical 30-megawatt data center circulate 1.4 million cubic feet of air per minute. All data centers need air conditioning and use onsite water for cooling, either in cooling towers or evaporative cooling units. Exposure to air degrades electronics. "Free" energy can be expensive.
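The airflow figure above can be put in context with a simple sensible-heat balance. This is a sketch, not a number from the slides: the air density, specific heat, and the assumption that the entire 30 MW leaves as sensible heat in that airflow are mine.

```python
# Sensible-heat balance for the "30 MW / 1.4 million CFM" figure quoted above.
# Assumed air properties (not from the slides): rho ~ 1.2 kg/m^3, cp = 1005 J/(kg K).

CFM_TO_M3S = 0.0283168 / 60  # cubic feet per minute -> cubic metres per second

def implied_air_delta_t(it_power_w: float, airflow_cfm: float,
                        rho: float = 1.2, cp: float = 1005.0) -> float:
    """Average air temperature rise implied by Q = m_dot * cp * dT."""
    m_dot = airflow_cfm * CFM_TO_M3S * rho  # mass flow of air, kg/s
    return it_power_w / (m_dot * cp)

dt = implied_air_delta_t(30e6, 1.4e6)
print(f"Implied average air temperature rise: {dt:.1f} K")
```

Under these assumptions the quoted airflow implies an average rise of roughly 38 K across the air path, which illustrates how much air must move to carry data-center heat away.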

Question
Data centers are the intersection of power, fiber and cooling. Until now, cooling has driven the decision tree. What if cooling were not a constraint? What would your data center look like? Where would it be located? How much power would you need?

Imagine a Data Center that...
- Reduces capital cost by 15%
- Reduces operating cost by 40%
- Reduces the power to cool by 98%
- Reduces total energy use by 40%
- Reduces white space by 60%
- Eliminates the need for water
- Supports over 50 kW per rack
- Improves reliability and extends server lifetime
LiquidCool Solutions technology enables that data center today!

The Situation
The major considerations for successful deployment of IT equipment in the data center are power, space and cooling. Cooling is a major cost factor: the power required to cool a data center can match or exceed the power used to run the IT equipment itself, and cooling is often the limiting factor in data center capacity.

This year, Cisco released a whitepaper* describing typical data center airflow, based on ASHRAE requirements, for room cooling, row cooling and cabinet cooling. (*Cisco Unified Computing System Site Planning Guide: Data Center Power and Cooling)

It's based on the same air-driven technology Google used in its first data center in 1999, and Facebook in its first data center in 2009. Nothing has changed except the size.

The Issue
The problem is air! Air is an insulator and has very little thermal mass. Air wastes space, limits power density and reduces reliability: oxidation and corrosion, large temperature swings, fan failures. Air cooling also wastes energy:
- Up to 15% of total power is used to move air
- Up to 20% more power is used by rack fans
- Fans generate heat that must itself be dissipated

Air Cooling Wastes Energy
In central systems, 20% to 60% of data center power is used for cooling.

At the rack, 20% of IT power is actually used to circulate air. (LiquidCool LSS220 compared to Dell PowerEdge R620; 2013 LiquidCool Solutions, Inc.)

The Solution
Liquid cooling - but which one?
1. Cold plates / heat pipes: requires air; shifts heat from processors; mechanical cooling needed above 85°F; maximum rack density ~30 kW; does not save much energy.
2. In-row cooling: requires air; makes the room smaller; mechanical cooling needed above 80°F; maximum rack density ~35 kW.
3. Total immersion in fluid (two-phase or single-phase): eliminates air cooling completely; removes 100% of the heat in the liquid; rack density > 75 kW.

Cold Plates
Indirect cooling of servers through chip-specific, fluid-cooled heat exchangers mounted on each chip. Cools the chips through closed-loop circulation of high-pressure water through cold plates, dumping waste heat via fluid-to-fluid or fluid-to-air heat exchangers. Requires water cooling infrastructure to the rack, plus CRAC units and fans to cool all other components.

In-Row Cooling
Moves the air-cooling radiators to the back of the rack. Cools the server through closed-loop circulation of water through radiators, dumping waste heat via fluid-to-fluid or fluid-to-air heat exchangers. Requires water infrastructure to the rack, and fans are still required on the servers and the back door.

Two-Phase Total Immersion
Servers are fully immersed in low-temperature boiling halogenated fluids. Cools the server through vaporization of the fluid by hot electronics and re-condensation of the fluid on heat exchangers. Waste heat is dissipated with a liquid-to-liquid heat exchanger.
Issues:
- Requires water cooling to the rack, or a vapor containment system
- Operationally complex
- Reduced reliability due to material incompatibilities
- Very costly, because constant coolant replenishment is required
- Environmental and employee safety issues

Single-Phase Total Immersion
Reduces complexity and cost. Eliminates all water in the data room. Silent and simple: no CRAC, no fans, no vibration, no air filters, no airborne contaminants. Extremely energy efficient:
- Thermal efficiency with a true PUE < 1.02
- 98% lower power-to-cool for a data center compared to air
- 94% of input energy can be recaptured in the form of a 55°C liquid
- Exceptionally low operating expense
- No water needed
Other benefits: reduces infrastructure, saves space, electronics last longer, and input energy can be recycled.

Consider this: LCS single-phase data center cooling addresses four key data center issues:
1. Eliminates air cooling completely,
2. Removes 100% of the heat in the liquid,
3. Eliminates water use completely, and
4. Captures high-quality heat (above 135°F) for reuse.
That's right! It uses no water.

Also consider this: it would take enough air to fill the Empire State Building more than four times to remove as much heat as an Olympic-size pool filled with OptiCool.

Server of the Future, Available Today
- Off-the-shelf electronics
- No fans or other moving components within the chassis
- Cool liquid is circulated directly to the hottest components first; remaining components are cooled by bulk flow as the dielectric liquid circulates and then exits the chassis
- Rack-mounted devices are easy to install, remove and maintain
- 76 nodes in a 48U rack; hot-swap servers in two minutes
- Liquid Submerged Server dielectric fluid flow rate: ~0.25 gal/min
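The quoted ~0.25 gal/min per server can be turned into a heat-removal estimate with a single-phase energy balance. The fluid density, specific heat and the 10 K coolant temperature rise below are assumed values typical of synthetic hydrocarbon dielectric coolants, not published LiquidCool figures.

```python
# How much heat 0.25 gal/min of dielectric coolant can carry away per server.
# Assumed fluid properties (not from the slides): rho ~ 830 kg/m^3, cp ~ 2200 J/(kg K).

GPM_TO_M3S = 3.78541e-3 / 60   # US gallons per minute -> m^3/s

def heat_removed_w(flow_gpm: float, delta_t_k: float,
                   rho: float = 830.0, cp: float = 2200.0) -> float:
    """Q = V_dot * rho * cp * dT for a single-phase liquid loop."""
    return flow_gpm * GPM_TO_M3S * rho * cp * delta_t_k

q = heat_removed_w(0.25, 10.0)
print(f"Heat removed per server: {q:.0f} W")
```

Under these assumptions the flow carries away on the order of 290 W per server, a plausible load for a dense 1U-class node and consistent with the deck's point that a trickle of liquid replaces a large volume of air.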

Heat Dissipation: The World's Smartest Water Heater
A pump supplies cool dielectric liquid to multiple IT racks. Reuse the hot fluid, or circulate it to a dry cooler. Incoming cool fluid can be as warm as 45°C for most applications.

Reduces Infrastructure
Not needed: mechanical refrigeration, cooling towers, air handlers, humidity control, blade and rack fans, mixing dampers, high ceilings, hot aisles, water. Lower cost upfront, and less maintenance every day.

Saves Space
Retrofit of a 2,000 sq. ft. data center for a federal government agency.

Electronics Last Longer
A long-term reliability test has been in continuous operation for 45 months, with over 200,000 server hours logged running 24/7 at 100% CPU load. No changes to dielectric fluid properties or cooling performance have been detected. (Internal components shown after 36 months of operation.)

Safe & Eco-Friendly Coolant
Synthetic hydrocarbon oil formulations with hindered-phenol antioxidant and viscosity-control additives. NOT mineral oil: free of wax and other impurities often found in those oils.
- 1,400 times more heat capacity by volume than air
- Cost-effective, non-volatile and non-toxic
- Available in standard and low-viscosity Mil-Std formulations
- Flash points high enough for use with electronics (160°C to 190°C)
- The cooling circuit is sealed, and the core coolant never needs to be replaced
- Highest thermal efficiency reduces data center energy usage
- Recycled heat can reduce net data center energy use by 90%
- No water in the chassis, rack or white space
- Less e-waste, because the chassis can be reused during upgrades
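The "1,400 times more heat capacity by volume than air" claim can be cross-checked from volumetric heat capacities. The oil properties below are assumed values typical of synthetic hydrocarbon coolants (the deck does not publish them); air is taken at room conditions.

```python
# Checking the "1,400x more heat capacity by volume than air" claim.
# All property values are assumptions, not LiquidCool specifications.

air_rho, air_cp = 1.2, 1005.0        # air: kg/m^3 and J/(kg K) at ~room conditions
oil_rho, oil_cp = 830.0, 2200.0      # assumed typical dielectric hydrocarbon oil

ratio = (oil_rho * oil_cp) / (air_rho * air_cp)
print(f"Volumetric heat capacity ratio (oil/air): {ratio:.0f}x")
```

With these assumed properties the ratio lands around 1,500x, in the same ballpark as the slide's 1,400x figure; the exact value depends on the specific formulation.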

Reliability / Ease of Service
LCS technology eliminates most causes of electronics failures:
- Protection from airborne contaminants
- Cooler operating temperatures
- Reduced temperature fluctuation
- No fans or other moving parts
- No vibration within devices
- Protection from ESD effects
- Highly compatible, protective coolant
A device can be swapped in less than two minutes with no fluid loss.

NREL Validation
The LSS technology has the following advantages over traditional air-cooled server systems:
1. Maintains a small difference between central processing unit (CPU) and memory temperatures and inlet coolant temperature, allowing for:
   a. Use of higher-temperature coolant
   b. Higher-temperature heat recovery
   c. Ability to maintain lower electronics temperatures
2. Eliminates the parasitic cooling energy used by fans
3. Lowers capital and energy costs in a building that contains a data center, because the heat removed from the servers can:
   a. Be rejected to ambient more effectively, using lower-cost systems
   b. Be reused effectively as a heating source for the building

NREL Test Setup

NREL Test Results
Data from National Renewable Energy Laboratory tests show that 93% of the electric energy used by LCS servers can be recycled at 60°C for hot water or building heat. (Source: "Results for Liquid Submerged Server Experiments at NREL's Advanced HVAC Systems Laboratory," Eric Kozubal, National Renewable Energy Laboratory, pre-published report, July 25, 2016.)
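The NREL result can be turned into a hot-water production estimate. The 45 kW rack and the 45°C return temperature are taken from other slides in this deck; pairing them this way, and the water properties, are my assumptions for illustration.

```python
# Turning "93% of server energy recoverable at 60 C" into a hot-water flow estimate.
# Assumed scenario: a 45 kW rack heating water from a 45 C return to a 60 C supply.

WATER_CP = 4186.0  # specific heat of water, J/(kg K)

def hot_water_flow_kg_s(it_power_w: float, recovery_fraction: float,
                        t_supply_c: float, t_return_c: float) -> float:
    """Mass flow of water heated from t_return to t_supply by the recovered heat."""
    q_recovered = it_power_w * recovery_fraction
    return q_recovered / (WATER_CP * (t_supply_c - t_return_c))

flow = hot_water_flow_kg_s(45_000.0, 0.93, 60.0, 45.0)
print(f"Hot water produced per 45 kW rack: {flow:.2f} kg/s (~{flow*3600:.0f} L/h)")
```

At roughly 0.67 kg/s (about 2,400 L/h, taking 1 kg of water as ~1 L), a single rack becomes a substantial building-heat source, which is the point of the "smartest water heater" slide above.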

Product Building Blocks
LCS produces three platform IT devices: the Liquid Submerged Server (LSS), the Submerged Cloud Server (SCS) and the Submerged Graphics Server (SGS).

LSS Rack Features:
- 30-kilowatt rack, equivalent to four air-cooled racks
- Energy efficient, scalable, easy to maintain
- No moving parts, no water
- Electronics protected from humidity, dust, oxidation and corrosive gases
(LSS server; 72-server rack)

SCS Rack Features:
- 45 kW rack, equivalent to five air-cooled racks
- Energy efficient, scalable, easy to maintain
- No moving parts, no water
- Electronics protected from humidity, dust, oxidation and corrosive gases
(48U, 96-node rack with fluid expansion reservoir)

RT Rugged Terrain Features:
- Rugged all-metal enclosure with waterproof I/O connectors
- Electronics protected from humidity, dust, oxidation and corrosive gases
- Fan-free design produces a system that is highly reliable, energy efficient and silent
- Choices of motherboard, CPU and I/O configurations are available to support a wide array of applications

Edge / Fog Computing Features:
- No A/C, high power density, siting flexibility
- High reliability, energy efficiency, quiet operation
- Easy to maintain, no water
- Deployable in the office, the factory or outdoors

200 kW Prefabricated Module Features:
- 200-kilowatt prefabricated module; can be transported by cargo plane; rapid field setup
- Energy-efficient cooling, PUE < 1.03
- No water; easy to maintain
- Electronics protected from humidity, dust, oxidation and corrosive gases
- Overall dimensions: 12 x 12 x 20

750 kW Prefabricated Module Features:
- 750-kilowatt prefabricated module; replaces five air-cooled modules
- Energy-efficient cooling, PUE < 1.03
- No water; easy to maintain; no moving parts
- Electronics protected from humidity, dust, oxidation and corrosive gases
- Overall dimensions: 12 x 12 x 42

Lowest Total Cost of Ownership
- Reduces capital cost: saves space and eliminates mechanical refrigeration, evaporative cooling, rack fans, air handlers, high ceilings and raised floors
- Reduces operating cost: cuts power to cool by 98% versus air cooling; more than 90% of waste heat can be reclaimed for other uses; there are no chillers, DX units or fans to maintain
- Conserves water: no water is used when the ambient dry-bulb temperature is below 110°F
- Increases server lifetime: sealed system isolates electronics from the environment
- Improves reliability: eliminates fans, isolates electronics from environmental impurities, dampens thermal cycling and lowers operating temperatures
- Virtually silent operation: fan noise and vibration are eliminated
- Supports high density

LCS Makes Liquid Cooling Practical
Thank you. Herb Zien, CEO, LiquidCool Solutions, 414-289-7171, herb.zien@liquidcoolsolutions.com