Future of Cooling High Density Equipment. Steve Madara Vice President and General Manager Liebert Precision Cooling Business Emerson Network Power


Agenda
- Issues facing the Data Center
- Managing the explosive growth
- Power consumption getting the attention of governments
- Developing the right cooling strategy for today and into the future
- Examples of how the right cooling strategy can lower the Total Cost of Ownership
- Myth: High Density computing is more costly

Managing High Density Servers
Rack density trend when fully populated with the newest server technology, and how most sites are dealing with server density:
- 2000: 28 x 2U servers, 2 kW heat load
- 2002: 42 x 1U servers, 6 kW heat load
- 2006: 6 blade centers, 24 kW heat load
- 2009: blades, 40 kW heat load, heading toward 50 kW

Progression to High Density
The average server replacement cycle is 3-4 years, and rack loads progress from 1 kW to 2 kW, 5 kW, 10 kW, 15 kW, 20 kW.
Issues facing the IT Manager:
- Getting air out of the racks
- Hot air mixing with the inlet of other racks
- Diversity of loads in the Data Center
- Not being aware that more fans create more heat
- Flexibility of on-demand cooling

DCUG Survey Results
What are the biggest issues facing the IT Manager? (survey chart) Heat Density (Cooling) topped the list at 22%, followed by Space Constraints/Growth (19%) and Power Density (18%); the remaining issues ranged from 8% down to 0%: Adequate Monitoring Capabilities, Availability (Uptime), Technology Changes / Change, Energy Costs / Equipment Efficiency, Other, Security (Physical or Virtual), Data Center Consolidations, Data Storage, Hardware Reliability, Regulatory Compliance, Staffing/Training Limitations.

Cooling Presents an Opportunity For Energy Savings
Cooling draws about 37-45% of data center power. Data center power draws (Source: EYP Mission Critical Facilities Inc., New York): IT Equipment 50%, Cooling 25%, Air Movement 12%, Transformer/UPS 10%, Lighting, etc. 3%.
Sources of energy waste:
- Fans/blowers running on redundant units
- Lack of air containment (cable openings, room leakage)
- Unnecessary cooling unit cycling on and off
- Lack of humidification control coordination between units
- Mixing of hot and cold air, lowering the effectiveness of the cooling unit
- Excess fan energy that turns into heat

US EPA Report on Data Center Efficiency (Public Law 109-431)
- Energy consumed by servers and data centers has doubled in the last 6 years and is expected to double again in the next 5 years, to more than 100 billion kWh
- State-of-the-art technologies and best management practices could reduce electricity use by up to 55%
- Recommendations include: standardized performance measurements for data centers and equipment, incentive programs, research and development, and industry collaboration and partnerships
www.energystar.gov/ia/partners/prod_development/downloads/epa_datacenter_report_congress_final1.pdf

Understanding The Basics
- Heat generated is directly related to the server power (100% of it becomes heat)
- As server power (kW) increases, the airflow (CFM) through the rack increases proportionally
- Raised floor tiles are limited in airflow (about 500-1000 CFM)
- Higher entering air temperatures on a cooling unit provide more capacity and increased efficiency
- Higher density servers will have a greater range of temperatures leaving the rack over time (larger swings of server load)
- Fan horsepower to move the air is significant, and all of that power turns into heat (a 100 kW cooling unit typically uses a 10 HP motor that generates 8.5 kW of heat)
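The proportionality between server power and airflow can be made concrete with the standard sea-level sensible-heat relation q[BTU/hr] ≈ 1.08 × CFM × ΔT[°F], which is not stated on the slide but explains why floor tiles become the bottleneck. The 25 °F temperature rise and 750 CFM per tile below are assumed illustrative values (the slide gives only the 500-1000 CFM tile range):

```python
import math

def rack_airflow_cfm(load_kw, delta_t_f=25.0):
    """Airflow needed to carry away a rack's heat load.

    Standard sea-level sensible-heat relation:
    q [BTU/hr] = 1.08 * CFM * dT [F], with 1 kW = 3412 BTU/hr.
    """
    return load_kw * 3412.0 / (1.08 * delta_t_f)

def tiles_needed(load_kw, cfm_per_tile=750.0, delta_t_f=25.0):
    """Perforated tiles required, assuming ~750 CFM per tile
    (midpoint of the slide's 500-1000 CFM range)."""
    return math.ceil(rack_airflow_cfm(load_kw, delta_t_f) / cfm_per_tile)

for kw in (2, 6, 24, 40):
    print(kw, round(rack_airflow_cfm(kw)), tiles_needed(kw))
```

A 2 kW rack needs roughly 250 CFM (one tile), while a 40 kW rack needs over 5,000 CFM, far more than the tiles in front of a single rack can deliver, which is the case for supplemental cooling.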

Having The Right High Density Cooling Strategy Delivers...
- Cooling for high density racks: increase servers per rack, increase the number of racks per room
- Energy efficiency: lower operating costs, more power allocated for IT/server loads
- Get more out of your existing facility

Planning for High Density Requires a Systems Approach to the Cooling System
Use traditional floor-mount cooling through the first 100-150 W/sq ft (or 4-5 kW per rack) and supplemental cooling above that level, up to about 40 kW per rack.
- Traditional floor-mount: controls humidity and filtration, creates the base airflow distribution
- Supplemental cooling: provides high density sensible cooling at the source

Critical Requirements Of The Base Cooling Load Equipment
- Cooling units with variable cooling capacity: DX compressors such as the Digital Scroll, VFDs on the fans
- Controls so that units can work as a team: eliminate dehumidification/humidification fighting, balance the load, optimize the cooling performance
- High efficiency condensers
- Green refrigerant products

Moving the Heat
Heat path: Chip (heat sink, memory, other components) -> Servers -> Rack (plus network devices, power supplies) -> Room -> CRAH -> Chiller -> Heat Rejection.
Areas for improvement:
- Reduced server fan power
- Higher temperatures over cooling coils (room temperature / less mixing of hot and cold air)
- Reduced resistances, hence lower fan power

Cooling Solutions to Meet the Higher Density Requirements
Meeting higher densities requires moving the cooling closer to the heat source, to precisely cool the specific load rather than waste energy brute-force cooling the whole room.
Cooling coils may be in multiple locations:
- External to the rack: overhead, back, side
- In the rack: under, side
- Part of the server
A fluid must be delivered to the cooling coils to transport the heat out of the Data Center:
- Chilled Water (CW)
- Refrigerant (pumped, low pressure)

Energy Efficiency Benefits of Cooling Closer to the Load
Traditional cooling only:
- Fan power: 8.5 kW per 100 kW of cooling (cooling unit blower working against higher resistances)
- Average entering air temperature of 80-84 °F
Liebert XD & base cooling:
- Fan power: 3.5 kW per 100 kW of cooling (XD modules alone: 2 kW per 100 kW), up to 65% less fan power
- Average entering air temperature of 96-98 °F: greater cooling coil effectiveness
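As a rough check on the slide's fan-power figures, the reduction can be computed directly from the kW-per-100-kW values; the slide's "65% less" presumably reflects a particular mix of XD and base units, since the two endpoints below bracket it:

```python
def fan_power_reduction(baseline_kw_per_100, improved_kw_per_100):
    """Fractional fan-power reduction relative to the baseline,
    both expressed as kW of fan power per 100 kW of cooling."""
    return 1.0 - improved_kw_per_100 / baseline_kw_per_100

traditional = 8.5   # kW fan power per 100 kW cooling (slide figure)
xd_plus_base = 3.5  # Liebert XD + base cooling (slide figure)
xd_only = 2.0       # XD modules alone (slide figure)

print(f"{fan_power_reduction(traditional, xd_plus_base):.0%}")  # 59%
print(f"{fan_power_reduction(traditional, xd_only):.0%}")       # 76%
```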

Liquid Cooling Configurations with Building Chilled Water
(Diagram: the building chiller supplies chilled water through a valve either to an XDWP/CDU, whose heat exchanger, tank, and pumps feed a second chilled water loop, or to an XDP, which feeds a pumped refrigerant loop.)

Open and Closed Architectures
Open and closed architecture systems, as defined by ASHRAE:
- Open architecture systems place cooling coils near the heat load, either inside or outside the server rack, and use the room air volume as thermal storage to ride through short power outages.
- Closed architecture systems fully enclose the rack with the cooling coil inside; other provisions are required for power-loss ride-through.

Secondary Fluid Comparisons
Pumped refrigerant system:
- Advantages: no water in the Data Center and no electrical hazard; micro-channel coil efficiency (+50%) and lower air-side pressure drop, hence lower operating costs; smaller piping requirements; cooling modules can be located anywhere; scalable capacity (2-3x relative to CW)
- Disadvantages: limited room scalability; higher fluid cost
Chilled water based system:
- Advantages: lowest fluid cost; no limitation on room size
- Disadvantages: electrical hazard; lower operating efficiency; may require fluid treatment to prevent fouling; limited overhead cooling options

Flexibility - Extends Your Existing Infrastructure Investment
- On-demand, plug-and-play flexibility to add additional capacity
- Cooling at the source of heat with advanced compact heat exchangers
- Multiple module configurations to meet any Data Center layout: Liebert XDO for hot aisle / cold aisle configurations; Liebert XDV and XDH for hot spots, zones and hot rooms
- Works with any brand of racks
- Cooling fluid is a gas (no water)
- Self-regulating capacity
- 100% sensible cooling

Plug and Play Capacity on Demand

Liebert XD Energy Efficiency Benefits
- Cooling closer to the source: dramatically less fan power required to move the air; higher air temperature entering the coil results in increased performance
- Coil technology: microchannel coils are the most efficient coil surface
- Sensible cooling: all cooling modules operate the coil at 5 °F above the dew point of the room, so they do not unnecessarily dehumidify and then require additional humidification (worth about 7% in efficiency)

The Traditional Way (120 racks @ 8 kW/rack)
- (12) 30-ton CW air handlers: 10 operational for the load, 2 standby
- Floor space: 4,256 sq ft
- Requires a 48-inch raised floor
(Floor plan: six rows of racks with PDUs, ringed by the AC units on the room perimeter.)

Liebert XD Solution (120 racks @ 8 kW/rack)
- (4) 20-ton CW air handlers: 3 operational for the load, 1 standby
- (6) XDP with (96) XDV: 5 XDP / 80 XDV operational, 1 XDP / 16 XDV redundant
- Floor space: 3,640 sq ft

Total Room Load Calculations (120 racks, 8 kW per rack)

                     Floor Mount AH   Liebert XD & Floor Mount
Rack Loads           960.0            960.0
Fans (BHP)           101.7            44.6
Room Latent          5.1              5.1
Excess Latent        137.3            29.8
PDU                  28.8             28.8
People               1.5              1.5
Building Envelope    7.9              7.9
Lights               5.6              5.6
Total (kW)           1247.9           1083.3

Liebert XD benefits: less fan power, 100% sensible cooling, and a smaller load to size the chiller.
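The slide's totals can be verified by summing the component loads; the XD case differs only in the fan and excess-latent lines:

```python
# Component loads in kW, taken from the slide's table.
traditional = {"racks": 960.0, "fans": 101.7, "room_latent": 5.1,
               "excess_latent": 137.3, "pdu": 28.8, "people": 1.5,
               "building_envelope": 7.9, "lights": 5.6}

# Liebert XD & floor mount: same loads except less fan power
# and far less excess latent load (100% sensible cooling).
xd = dict(traditional, fans=44.6, excess_latent=29.8)

print(round(sum(traditional.values()), 1))  # 1247.9
print(round(sum(xd.values()), 1))           # 1083.3
```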

Summary of Equipment (120 racks, 8 kW per rack)

                         Floor Mount AH   Liebert XD & Floor Mount
Chillers                 (3) 250 ton      (3) 200 ton
CW Pumps                 (3) 25 HP        (3) 20 HP
Floor Mount Units        (12) 30 ton      (4) 20 ton
Liebert XD               -                (96) XDV
Floor Space (sq ft)      4256             3640
Raised Floor Height (in) 48               24

Liebert XD vs. the traditional method: 20% less chiller plant, 15% less floor space, and a scalable platform with a scalable design from 8 kW to 20 kW per rack.

Annual Energy Consumption (kW) (120 racks, 8 kW per rack)

                     Floor Mount AH   Liebert XD & Floor Mount
Prec Air Units       101.7            25.4
XDV                  0.0              19.2
XDP Pumps            0.0              6.3
CW Pumps             42.4             33.9
Chiller              195.1            169.4
Tower Fans           15.3             13.2
Condenser Pumps      15.3             13.2
Rehumidification     51.5             11.2
Total (kW)           421.3            291.9

Operating costs (@ $0.08/kWh): $295,231 vs. $204,571, a saving of $90,660 (-31%). Cooling closer to the source is more efficient.
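The operating-cost line follows from running the total draw for a full year at the slide's $0.08/kWh rate; the computed values land within rounding of the slide's figures:

```python
HOURS_PER_YEAR = 8760
RATE = 0.08  # $/kWh, as on the slide

def annual_cost(avg_kw):
    """Annual energy cost of a constant electrical draw."""
    return avg_kw * HOURS_PER_YEAR * RATE

trad = annual_cost(421.3)  # ~$295,247 (slide: $295,231)
xd = annual_cost(291.9)    # ~$204,564 (slide: $204,571)
print(round(trad), round(xd), round(trad - xd))
```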

Capital Costs - Total Impact (120 racks, 8 kW per rack)

                 Floor Mount AH   Liebert XD & Floor Mount
Chiller          $187,500         $150,000
Cooling Units    $135,360         $258,015
Installation     $322,860         $452,265
Floor Space      $0               $(123,200)
Total (E,I,S)    $645,720         $737,080

Delta: $91,360 extra capital; operating savings of $90,660 per year give a payback of about 1.0 year.
Floor space results: traditional 4,256 sq ft vs. Liebert XD 3,640 sq ft (616 sq ft saved), valued in this example at $200/sq ft (industry range $250-1000/sq ft).
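The one-year payback is simply the extra capital for the XD design divided by the annual operating savings; a minimal sketch using the slide's figures:

```python
def simple_payback_years(extra_capital, annual_savings):
    """Simple payback: extra up-front cost divided by yearly savings."""
    return extra_capital / annual_savings

# Figures from the slide's capital-cost table.
delta_capital = 737_080 - 645_720  # $91,360 more for the XD design
annual_savings = 90_660            # from the energy-consumption slide

print(round(simple_payback_years(delta_capital, annual_savings), 2))  # 1.01
```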

Liebert XD Technology - Fluid Cooling at the Source Drives Down Operating Costs
(Charts comparing a traditional CW CRAC, a CW enclosed rack, and refrigerant modules.)
- Chiller capacity per kW of sensible heat load (chiller capacity = sensible load + latent load + fan load): refrigerant modules need about 20% less capacity from the support equipment, i.e. chiller(s), cooling tower/condensers, chilled water circulating pumps, emergency generators, and electrical switchgear.
- Annual power consumption (kW of power to cool 1 kW of sensible heat, split across fan, CDU pump, CW pump, and chiller): about 30% lower for refrigerant modules.

Liebert XD - Full Range of Opportunities
- Base infrastructure: XDC or XDP pumping units (160 kW), with future pumping units of larger capacity
- Standard cooling modules (XDO20, XDV10, XDH20/32): 10-35 kW and beyond
- Embedded cooling (microchannel intercoolers): tested at 35-60 kW, developing up to 100 kW
- Embedded & chip cooling (microchannel intercoolers and Cooligy chip cooling): tested at 50 kW (100% redundant), capable of over 100 kW

Additional System Opportunities for Improved Cooling Efficiencies
kW for cooling per kW of server heat load (chart):
- Data center best practices, 1990-2005: 0.65 down to 0.42 (optimal)
- Enclosed rack, 2006/2007: 0.36
- XD modules (10-35 kW), 2004/2006, and Egenera (10-20 kW), 2006/2007: 0.31, about 30% below traditional cooling
- Embedded cooling (35-60 kW), 2010, and component cooling (>50 kW), future: down to 0.2, about 45% below traditional cooling
Energy savings are driven by reductions in fan power (cooling system and server) plus heat transfer efficiency.

Cooling Process Throughout The Range Of Rack Loads
(Chart: for each rack-load band - >0-5, >5-10, >10-15, >15-20, >20-25, >25-30, >30-35, >35-50, and >50 kW - the preferred cooling process, rated on availability, flexibility and TCO, progresses from traditional CRAH, to rack mounted CW, to rack mounted refrigerant, to embedded rack refrigerant, and finally to chip/component cooling.)

Improving The Total Cost Of Ownership with the Liebert XD Cooling Systems
How -> Results:
- Cooling closer to the source of the heat makes the heat exchangers more efficient (higher entering air temperatures) -> more cooling capacity for the energy consumed
- Lower total fan HP -> less power (energy consumed)
- Sensible cooling eliminates the wasted energy of dehumidifying unnecessarily and then having to re-humidify -> less power (energy consumed)
- Less chiller or DX infrastructure required -> less power and capital equipment
- Overhead cooling modules require no additional floor space -> less floor space consumed
- Cooling solutions that meet the requirements to fill racks with high density servers -> less floor space consumed
- Infrastructure for modules today and for future server/rack designs -> extends your capital life

Lower Power For Cooling Provides More Power For IT Equipment
Baseline data center power draws: IT Equipment 50%, Cooling 25%, Air Movement 12%, Transformer/UPS 10%, Lighting, etc. 3%. With more efficient cooling the split becomes roughly 59% IT equipment, 26% cooling and air movement, and 12% for the rest.
For the same building power, you can allocate 18% more power to the IT equipment.
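The 18% figure follows from the IT share growing from 50% to 59% of the same building feed, as read off the slide's pie charts:

```python
def it_power_gain(it_share_before, it_share_after):
    """Relative increase in power available to IT equipment
    for the same total building power."""
    return it_share_after / it_share_before - 1.0

# Shares of total building power from the slide's before/after charts.
print(f"{it_power_gain(0.50, 0.59):.0%}")  # 18%
```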

Take Aways
- Cooling solutions for higher density will need to move closer to the load and will require a reliable fluid delivery means
- Cost effective cooling solutions exist today that meet future needs and allow racks to be fully populated
- Cooling at the source of the heat load will actually lower your incremental energy consumption
- Less power for the cooling system provides more power for the IT equipment
- The result is more available floor space and growth capability for the IT equipment

Thank You
Contact: Steve.Madara@EmersonNetworkPower.com