Passive RDHx as a Cost-Effective Alternative to CRAH Air Cooling. Jeremiah Stikeleather, Applications Engineer


Passive RDHx as a Cost-Effective Alternative to CRAH Air Cooling
While CRAH cooling has long been a common data center cooling solution, RDHx cooling can do a better job of minimizing today's energy consumption and operating costs. This will only grow in significance as we look to the future and interest in sustainability increases.

Agenda
- Industry Terminology
- Benefits of the Passive Rear Door Heat Exchanger
- The Study: Comparing 3 Cooling Designs
  o Traditional CRAH Units
  o RDHxs with a Primary Piping Manifold
  o RDHxs with CDUs and a Secondary Water Loop
- Summary of the 3 Design Alternatives

Industry Terminology
- CRAH: Computer Room Air Handler
- CRAC: Computer Room Air Conditioner
- RDHx: Rear Door Heat Exchanger
- IRC: In-Row Cooler
- CDU: Coolant Distribution Unit
- CAPEX: Capital Expenditure
- OPEX: Operating Expense
- Close-Coupled Cooling: cooling that is adjacent to the server racks
- Greenfield Building: construction of a building on greenfield land, where there are no existing work constraints

Benefits of the Water-Cooled Passive Rear Door Heat Exchanger (RDHx)
- Takes up very little space
- Air-to-water heat exchanger
- Close-coupled cooling solution
- Removes the heat at the source
- Very little energy required
(Diagram labels: Rear of Enclosure, Front of Enclosure)

Benefits of the Water-Cooled Passive Rear Door Heat Exchanger (RDHx)
- Significant energy reduction versus a typical CRAH solution
- The heat exchange process occurs at the rear of the rack
- Water's thermal capacity is roughly 3,400 times greater than air's (a quick check follows below)
- Significant reduction in maintenance costs
(Diagram: server exhaust air at 91-93 °F leaves the RDHx at 72-73 °F)
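The 3,400x figure is consistent with the ratio of volumetric heat capacities; a back-of-envelope check using standard property values (my own arithmetic, not from the slides):

$$\frac{\rho_{water}\, c_{p,water}}{\rho_{air}\, c_{p,air}} \approx \frac{1000\ \mathrm{kg/m^3} \times 4186\ \mathrm{J/(kg \cdot K)}}{1.2\ \mathrm{kg/m^3} \times 1005\ \mathrm{J/(kg \cdot K)}} \approx 3{,}470$$

That is, a given volume of water absorbs about 3,400 times more heat than the same volume of air for the same temperature rise.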

The Study: 3 Cooling Designs
Designs compared:
- Design 1: Traditional 30-ton CRAH units
- Design 2: RDHxs with a primary piping manifold system
- Design 3: RDHxs with Coolant Distribution Units and a secondary water loop
The study includes all aspects of deploying each solution in a 1 MW data center:
- Supply and installation of the cooling system
- Electrical connections, valves, piping
- Building monitoring system integration
- Leak detection, smoke/fire detection
- Condensate removal

The Data Center Configuration:
- 1 MW of IT power in a raised-floor environment
- 5,000 sq. ft. of white space
- Planned deployment of 177 IT enclosures
- Infrastructure for a space loading of 200 watts/ft²
- 28 ft² per IT enclosure (assume 5.7 kW per rack; a consistency check follows below)
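The stated figures are mutually consistent; a quick check (my own arithmetic, not from the deck):

```python
# Back-of-envelope check of the stated data center configuration.
it_load_w = 1_000_000      # 1 MW of IT power
white_space_ft2 = 5_000    # raised-floor white space
enclosures = 177           # planned IT enclosures

print(it_load_w / white_space_ft2)    # 200.0 -> 200 W/ft^2 space loading
print(white_space_ft2 / enclosures)   # ~28.2 -> ~28 ft^2 per enclosure
print(it_load_w / enclosures / 1000)  # ~5.65 -> ~5.7 kW per rack
```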

The Benchmark Air Cooling System:
- Chilled-water Computer Room Air Handlers (CRAHs)
- (12) 30-ton operating units around the perimeter
- CRAH unit air discharge temperature of 68 °F to 70 °F
- Two additional CRAH units installed for redundancy
- Cold air discharged under an 18-inch raised floor
- Hot aisle / cold aisle arrangement

The Benchmark Air Cooling System:
- CRAHs running at a reduced load of 80% (4.6 kW); see the capacity check below
- Chilled water for the CRAHs (100% water, no glycol)
- Branch connected from a main chilled water loop running external to the white space
- Chiller, water supply, and related energy costs are not included in any of the cooling designs
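A rough capacity check (my own arithmetic, assuming 1 ton of refrigeration ≈ 3.517 kW):

$$12 \times 30\ \mathrm{tons} \times 3.517\ \mathrm{kW/ton} \times 0.80 \approx 1013\ \mathrm{kW}$$

so twelve 30-ton units at 80% load line up with the 1 MW IT load, with the two redundant units held in reserve.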

Design 1: Traditional CRAH Units
- Fourteen 30-ton CRAH units: 12 active, 2 standby
- Assume a 25% CRAH performance reduction
- Large area with unpredictable airflow
- Obstructions (columns, cable runs, etc.) alter airflow
- Wasted air (openings in tiles that do not deliver air directly to rack intakes)
- White space consumption and the required service clearance are factored in
- The footprint required by the CRAH system complicates future expansion of IT enclosures

Design 1 Summary: Traditional CRAH Units
Cost includes:
- Supply and installation of CRAH units
- Space fit-out
- Fire protection/suppression systems required for the access and CRAH footprint
Notes:
- De-rating the published sensible cooling capacity by 25% is considered conservative, given the built-in inefficiencies of CRAH-based air cooling systems
- Power consumption is much higher due to fans, humidification, and reheat functions
- Increased rack power density will force a change in cooling infrastructure: more CRAHs, supplemental cooling, hot aisle/cold aisle containment

Design 1: Traditional CRAH Units (Cost Summary)

Design 2: RDHx with Manifold System
- Dedicated chiller
- 177 RDHx units
- Chilled water distribution from a prefabricated manifold system
- 2 CRAH units for humidification control and room-cooling backup
- CAPEX reduced by not using additional pumps or plate-and-frame heat exchangers

Design 2: RDHx with Manifold System
Piping manifold alternatives:
- Manifold and pump tapping into the bypass/mixing line (manifold return water discharges back into the bypass)
- Manifold and three-way valve tapping into the supply and return lines (mixing building return water with supply water to achieve higher supply water temperatures for the RDHx)
Both are similar to methods used in the radiant piping industry.

Design 2: RDHx with Manifold System
Controls for the proposed system rely on supply air temperature sensors for the racks and their corresponding RDHxs on the manifold, together with a modulating control valve or circuit setter at the manifold return; a sketch of this scheme follows below.
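As an illustration of that control scheme, here is a minimal proportional-control sketch (my own; the deck does not specify the control algorithm, and the setpoint and gain values are hypothetical):

```python
# Modulate the manifold return valve so the rack supply (cold-aisle)
# air temperature tracks a setpoint.
SETPOINT_F = 72.0   # target rack supply air temperature (hypothetical)
GAIN = 0.05         # valve travel fraction per deg F of error (hypothetical)

def valve_position(supply_air_f: float, current_position: float) -> float:
    """Return a new valve position in [0, 1] (1 = fully open).

    If the supply air runs warm, open the valve to push more chilled
    water through the RDHx coils; if it runs cool, throttle back.
    """
    error = supply_air_f - SETPOINT_F
    return min(1.0, max(0.0, current_position + GAIN * error))

# Example: supply air trending warm at 74 F nudges the valve open.
pos = valve_position(74.0, 0.5)   # -> 0.6
```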

Design 2 Summary: RDHx with Manifold
- CAPEX comparable to the CRAH design (Design 1)
- OPEX for the RDHx is minimal (about 3% of the total power consumed by the CRAH units)
- RDHx units are passive
- Small RDHx white-space footprint:
  o Reduces the overall building footprint
  o Greenfield project savings
  o Construction savings for future expansion
- RDHx ROI within the first year of operation
- Over 3x IT expansion headroom, from 5.7 kW to 18 kW per rack (RDHx nominal cooling capacity: 18 kW)

Design 2: RDHx with Manifold System (Cost Summary)

Design 3: RDHx with CDU and Secondary Water Loop
- 4 Coolant Distribution Units (CDUs)
- The CDU is a floor-mounted device: heat exchanger, pumps, controls, distribution manifold
- The CDU connects to water from the chiller (or cooling tower)
- 2 CRAH units (humidity control and room-cooling backup)
- No condensate (a temperature/humidity sensor regulates the secondary loop temperature 2 degrees above dew point)
- The RDHx water loop is isolated from the primary water loop
- CDU power consumption: 3.7 kW each

Design 3 Summary: RDHx with CDU
- The CDU increases CAPEX
- Low OPEX: the CDU pumps use about 15% of the power consumed by the CRAH units (see the check below)
- Break-even point is in Year 3
- Designs 2 and 3 are future-proof: 5.7 kW racks can grow up to 18 kW
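For scale (my own arithmetic; the deck implies rather than states the total CRAH power draw): the four CDUs draw

$$4 \times 3.7\ \mathrm{kW} = 14.8\ \mathrm{kW}, \qquad \frac{14.8\ \mathrm{kW}}{0.15} \approx 99\ \mathrm{kW}$$

so the 15% figure implies a CRAH system drawing on the order of 100 kW, which is plausible once fan, humidification, and reheat loads are included.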

Design 3: RDHx with CDU (Cost Summary)

Summary
- At 5 kW per rack, CAPEX for RDHx and CRAH cooling is approximately equal (CAPEX could be reduced further by implementing alternating RDHxs, with savings of up to 25% compared to populating every rack with an RDHx)
- OPEX is significantly reduced with the RDHx designs:
  o Reduced energy consumption
  o Reduced demand charges
  o Reduced maintenance costs
- RDHx allows for future growth without new construction costs
- RDHx performs well with elevated water temperatures:
  o Minimizes chiller energy usage
  o Reduces chiller OPEX

Summary
- OPEX savings increase when waterside economizers are used (the free-cooling window widens with elevated water temperatures)
- A hybrid system that pairs some CRAH units with the RDHxs adds redundancy for greater system availability
- Increasing rack density to 18 kW can minimize infrastructure space (CAPEX savings of 30-40%)

Study Conclusions
- The common misconception that liquid cooling is too expensive to deploy is disproved: CAPEX for liquid cooling and traditional air cooling is approximately the same at 5 kW per rack
- Increasing energy costs encourage data center owners and operators to consider liquid cooling
- Passive liquid cooling enables expansion and flexibility at a lower, incremental capital expenditure

Future Considerations
A similar study by a third-party consulting engineering firm is comparing RDHxs to IRCs:
- 4 MW data center, 5,000 sq. ft.
- IRC: CAPEX $4.5M, 6 MW cooling capacity
- RDHx: CAPEX $2.5M, 7.5 MW cooling capacity
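On a cost-per-kW basis (my own arithmetic from the figures above):

$$\mathrm{IRC:}\ \frac{\$4.5\mathrm{M}}{6{,}000\ \mathrm{kW}} = \$750/\mathrm{kW} \qquad \mathrm{RDHx:}\ \frac{\$2.5\mathrm{M}}{7{,}500\ \mathrm{kW}} \approx \$333/\mathrm{kW}$$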

Thank You
Patrick Giangrosso, General Manager, Coolcentric
pgiangrosso@coolcentric.com
(603) 479-4806