System Overview. Liquid Cooling for Data Centers. Negative Pressure Cooling Without Risk of Leaks


Liquid Cooling for Data Centers: System Overview

The Chilldyne Cool-Flo System is a direct-to-chip liquid cooling system that delivers coolant under negative pressure. Chilldyne's technologies were designed specifically to eliminate the risks associated with liquid cooling while keeping deployment and operating costs low. The Chilldyne system mitigates risk with its patented leak-proof design.

Negative Pressure Cooling Without Risk of Leaks

Retains Air Cooling. Using standard finned heat sinks modified for liquid cooling, the Chilldyne system retains the ability to air cool servers and can operate as a standard air-cooled system to minimize downtime.

Leak-Proof System. Cool-Flo applies negative pressure on both supply and return, so if a leak occurs anywhere, air flows into the system instead of coolant leaking out.

Failure Tolerant. The system maintains cooling even with one server open to air. Leaks are a maintenance issue; they do not reduce uptime.

Increased Density. With liquid cooling, power density is limited only by the electricity that can be delivered to the chip. Higher density means shorter connections between servers and higher-speed data transfer.

Low Cost and Easy Installation. The Chilldyne system has no hidden installation costs or delays. Plumbing is required only for the CDU; the racks and servers can be installed by data center technicians.

Automatic Coolant Evacuation. The Cool-Flo No-Drip/Hot Swap Connector automatically evacuates coolant from a server when it is disconnected from the system. The racks can also be drained automatically.

Low Cost, High Volume. The system uses low-cost plastic tubing and simple connections, minimizing cost and allowing data center technicians, not plumbers, to reconfigure racks.

Reduced Setup Time. The CDU automatically fills and drains the system, monitors the coolant, and adds or drains coolant as needed. Air purging is automatic, reducing setup time and maintenance effort.

INNOVATIVE TECHNOLOGIES

No-Drip Hot Swap Connector. The No-Drip/Hot Swap Connector automatically evacuates coolant from a server when it is disconnected, allowing easy maintenance with no downtime. The connector fits into a standard PCI slot, or it can be built in.

Liquid and Air Cooled Heat Sinks. The Chilldyne system uses standard finned heat sinks modified for liquid cooling, providing air-cooled backup, free rear door cooling, easy deployment and server resale. Liquid cooling capability is added to the server, so it can still go through the standard manufacturing and test process.

GPU Cooling Capabilities. The Chilldyne system can efficiently cool high-power GPU cards with direct cold plate technology. It interfaces with stock liquid-cooled GPU cards, or a custom GPU cold plate can be designed.

Pistonless Pump Technology. The Chilldyne pistonless pump will last 12 years or more with annual maintenance, thanks to its slow-moving valves and low-speed liquid ring vacuum pump.

[Figures: liquid and air cooled heat sinks; no-drip hot swap connector in PCI slot]

A liquid-cooled data center has 75% less A/C, more room and higher density. Before: four CRAC (Computer Room Air Conditioner) units cost $4/watt to install and computing density is constrained. After: one CRAC unit at $1/watt plus $1/watt for liquid cooling, with opex reduced 20-50% per year and the option to increase density.
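The before/after figures above lend themselves to a quick back-of-the-envelope comparison. The sketch below applies the quoted $/watt installation costs to a hypothetical 1 MW IT load; the load value is an assumption for illustration, not a figure from the brochure.

```python
# Illustrative sketch only: before/after cooling capex using the $/watt
# figures quoted above, applied to a hypothetical 1 MW IT load.
IT_LOAD_W = 1_000_000            # assumed example load, not a figure from the brochure

capex_air_per_w = 4.0            # four CRAC units, $4/watt installed
capex_hybrid_per_w = 1.0 + 1.0   # one CRAC unit at $1/watt + $1/watt for liquid cooling

capex_air = capex_air_per_w * IT_LOAD_W        # $4,000,000
capex_hybrid = capex_hybrid_per_w * IT_LOAD_W  # $2,000,000

print(f"Air-cooled capex:     ${capex_air:,.0f}")
print(f"CRAC + liquid capex:  ${capex_hybrid:,.0f}")
print(f"Capex avoided:        ${capex_air - capex_hybrid:,.0f}")
```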

POWER AND COST SAVINGS

Liquid Cooling Savings

Chilldyne's Cool-Flo System is an efficient, low-cost liquid cooling system that reduces data center power consumption in three ways:
- 75-100% reduction in HVAC power
- 75% reduction in server fan power
- 5-10% reduction in CPU power

This example shows a legacy data center power reduction of 45% with the Cool-Flo System. Any data center can bring its Power Usage Effectiveness (PUE) down to 1.2 or less, plus additional power savings at the server.

Liquid Cooling Payback Case Study

SDSC Comet, 1984 nodes, 551 kW average, 1 MW peak, efficient system.

Savings over 4 years ($M): $1.024 = (1.22 - 0.58) + 4 x 0.096, i.e. (CAPEX air - CAPEX liquid) + 4 years of OPEX savings.

CAPEX, 1 MW Cooling (300 tons)

Air:
  Air handler plus installation                 $500,000
  Chilled water supply                          $720,000
  Total (air)                                 $1,220,000

Liquid:
  Cooling tower plus installation               $110,000
  CDUs (4 + 1 spare @ $60K)                     $300,000
  Plumbing & connectors (28 racks * $1,200)      $60,000
  Server-side parts (1984 * $60 each)           $120,000
  Server-side plumbing installation              $50,000
  Heat sink offset (1984 * $15 * 2)             -$60,000
  Total (liquid)                                $580,000

OPEX (less HVAC, fan and CPU power)

  Category                          Fan Cooled (kW)  Cool-Flo (kW)  Reduction (%)  Share of total (%)  Savings ($/yr)
  HVAC                                        113.0           19.4          82.9%               13.8%         $65,623
    (Rear Door HX Effect)                                      10.9                              1.6%
  Server Fan                                   39.7            7.9          80.0%                4.7%         $22,246
  CPU                                         436.5          413.8           5.2%                3.3%         $15,873
  Other Server                                 75.4           75.4           0.0%                0.0%              $0
  Total Server                                551.6          497.2
  Other (cooling tower and CDU power)          13.6           24.2         -78.2%               -1.6%         -$7,430
  Total                                       678            540.7          20.3%               20.3%         $96,312
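The four-year savings figure can be reproduced directly from the CAPEX and OPEX lines above. The sketch below redoes that arithmetic; the roughly $0.08/kWh electricity rate is an assumption inferred from the $/yr figures, not a number stated in the brochure.

```python
# Minimal sketch reproducing the case-study arithmetic from the tables above.
# The electricity rate is an assumption (roughly consistent with the $/yr figures).
ELECTRICITY_RATE = 0.08      # $/kWh, assumed for illustration
HOURS_PER_YEAR = 8760

capex_air_m = 1.22           # $M, air-cooled CAPEX total
capex_liquid_m = 0.58        # $M, liquid-cooled CAPEX total
opex_savings_m = 0.096       # $M/yr, net OPEX savings from the table

savings_4yr_m = (capex_air_m - capex_liquid_m) + 4 * opex_savings_m
print(f"4-year savings: ${savings_4yr_m:.3f}M")        # ~ $1.024M

# Cross-check the annual OPEX savings against the total power reduction:
kw_saved = 678 - 540.7                                  # fan-cooled vs Cool-Flo, kW
annual_savings = kw_saved * HOURS_PER_YEAR * ELECTRICITY_RATE
print(f"Annual OPEX savings: ${annual_savings:,.0f}")   # ~ $96,000
```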

COOLING DISTRIBUTION UNIT

The Chilldyne Cooling Distribution Unit (CDU) is a negative pressure system that uses liquid to cool up to 300 kW of server heat. The CDU can use cooling tower water at 15-30°C (59-86°F) to remove up to 300 kW of server heat at a 15°C rise. Its innovative design and energy efficiency allow for effective cooling of servers in high-density applications.

Key Features
- Touchscreen controls
- Data logging of key performance parameters
- Remote monitoring via web page
- 6 cooling loops for easy hose routing
- 300 lpm cooling flow at 0.5 bar
- Monitors water temperature and quality; fills, drains, and tests for leaks
- Measures heat removed and facility water flow
- Automatic control of anti-corrosion fluid
- (N+1) redundancy: backup CDU stays in active idle mode with minimal power and wear

CDU Standard Components
1. Pump Chamber. The chamber is where coolant is stored, supplied to the servers and received back from them. The system cycles between the main and auxiliary chambers, allowing a steady flow.
2. Heat Exchanger (2x). Transfers the heat created by the servers to the cooling tower or chiller. The heat exchangers are connected in series to minimize processor temperature on hot, humid days with warm facility water.
3. Liquid Ring Pump (LRP). The LRP uses water as a seal to provide the vacuum needed to propel the coolant. The water seal does not wear out.
4. Microprocessor Control. The temperature in the fluid reservoir is controlled to keep the coolant temperature above the dew point in the data center.
5. Water Quality Control. The water quality is monitored and controlled to maintain corrosion and bacterial protection. Automatic fill, drain, air purge and leak test are included, and coolant additive is stored on board.
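As a sanity check on those ratings, the quoted 300 lpm flow and 15°C rise are consistent with roughly 300 kW via the usual heat-balance relation Q = m_dot * cp * dT. The sketch below works that out assuming water-like coolant properties; it is an illustration, not a Chilldyne specification.

```python
# Heat-balance sanity check: Q = m_dot * cp * dT, assuming water-like coolant.
FLOW_LPM = 300          # cooling flow from the CDU key features
DELTA_T_C = 15          # coolant temperature rise quoted above
RHO = 1000              # kg/m^3, density of water (assumed)
CP = 4186               # J/(kg*K), specific heat of water (assumed)

m_dot = FLOW_LPM / 60 / 1000 * RHO     # kg/s (300 lpm -> about 5 kg/s)
q_watts = m_dot * CP * DELTA_T_C       # ~314 kW, consistent with the 300 kW rating
print(f"Heat removed: {q_watts / 1000:.0f} kW")
```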

DIRECT LIQUID COOLING

Liquid and Air Cooled Heat Sinks

The Chilldyne hybrid air-liquid heat sinks are designed for maximum direct-to-chip liquid cooling while retaining air-cooling capability. Chilldyne can modify existing hardware or design and develop custom hybrid heat sinks to provide a unique liquid cooling solution.
- Leak-proof, failure-tolerant liquid cooling.
- Fins provide additional cooling to the data center while liquid cooled.
- Backup air cooling when liquid cooling is off or during test and setup.
- Low-cost tubing and plumbing thanks to negative pressure technology.
- Easy to add to any server: fits in an existing PCI slot, tall or short, or the chassis can be punched for two 6 mm holes.

Dual Xeon Kit. Liquid and air cold plate/heat sink with no-drip hot swap connector. Standard 90 mm heat sink design.

GPU Cold Plate. The Chilldyne GPU cold plate removes heat from the GPU, RAM and voltage regulators. Chilldyne uses turbulators within drilled holes to tune heat transfer for each customer's requirements. Also works with OEM sealed low-pressure cold plates. Custom cold plate for a 300-watt Fury X GPU uses the OEM VRM cooler.

NO DRIP HOT SWAP CONNECTOR

The patented No-Drip Hot Swap Connector automatically drains the server upon disconnection and automatically purges the air. Its low-cost design has no valves on the server side, and it fits into a standard PCI slot.

CHILLDYNE COMPONENTS

Each server is equipped with a custom internal manifold that is connected to a rack manifold, which in turn connects to the CDU's inter-rack plumbing system.

Internal Server Manifold. Distributes flow to all heat sinks and GPU cards. Individual servers and cards may be removed for service. The design does not need expensive connectors or hose clamps.

Servers and Rack Manifold. Rack-mounted manifolds distribute coolant to all the servers. The negative pressure system allows servers to be connected and disconnected while the system is in operation. Hoses that are not connected have automatic shut-off valves. Broken fittings leak air into the system, not water onto the servers, so minor leaks do not cause downtime. Low-flow-resistance connectors, manifolds and tubing provide consistent cooling flow for distant racks 30 ft (10 m) or more from the CDU.