Power and Cooling for Ultra-High Density Racks and Blade Servers


Power and Cooling for Ultra-High Density Racks and Blade Servers
APC White Paper #46
Richard L. Sawyer, Director of Data Center Technology, American Power Conversion Company
© 2005 APC Corporation

What has changed? 20th Century vs. 21st Century

20th Century:
- The data center is fixed in time and place
- The data center is a protected repository
- Information is processed in batches
- The data center is a low-power investment (< 50 W/sq. ft.)

21st Century:
- Data centers are virtual, mobile, and dynamic
- Important data can be anywhere, anytime
- Information is processed in real time
- Data centers demand high power densities (> 100 W/sq. ft.)

20th Century solutions don't work. Welcome to 21st Century computing! What do we learn from blade server technology?

The fastest computers (Nov 2004): the ability to configure is the key!

Basic IBM blade server attributes:
- Nameplate power consumption: 5400 watts per frame
- Measured power consumption: 4050 watts per frame at 100% CPU utilization
- Standard power supplies: 2000 watts each, 4 per chassis
- Voltage requirement: 200-240 VAC, 50-60 Hz, single phase
- Input power conductor configuration: IEC C20
- Power supplies are hot-swappable: 24 power supplies in a fully configured rack, each with its own power cord
- Power per 42U rack (6 frames): 24.3 kilowatts
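
The 24.3 kW figure is simply the measured per-frame draw multiplied out across the rack; a minimal Python sketch of that arithmetic:

```python
# Back-of-the-envelope rack power check for a fully configured
# blade rack, using the figures from the slide above.

FRAMES_PER_RACK = 6           # a 42U rack holds 6 blade chassis (frames)
MEASURED_W_PER_FRAME = 4050   # measured at 100% CPU utilization
NAMEPLATE_W_PER_FRAME = 5400  # nameplate rating per frame

measured_kw = FRAMES_PER_RACK * MEASURED_W_PER_FRAME / 1000
nameplate_kw = FRAMES_PER_RACK * NAMEPLATE_W_PER_FRAME / 1000

print(f"Measured rack load:  {measured_kw:.1f} kW")   # 24.3 kW
print(f"Nameplate rack load: {nameplate_kw:.1f} kW")  # 32.4 kW
```

Note the gap between nameplate and measured load: sizing to nameplate would overstate the rack by roughly a third.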

Recommended power solutions:

Chassis per rack   Solution                                                         APC part numbers
1                  (2) Rack PDU, Metered, 20 A, Zero U, 5.7 kW, 3-phase, 208 VAC    (2) AP7864
2-3                (4) Rack PDU, Metered, 20 A, Zero U, 5.7 kW, 3-phase, 208 VAC    (4) AP7864
4-6                (4) Rack PDU, Metered, 50 A, Zero U, 12.5 kW, 3-phase, 208 VAC   (4) AP7868

For 2N redundancy, plug power supplies 1 and 3 into feed A, and power supplies 2 and 4 into feed B!
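
To see why the table steps up to (4) 12.5 kW PDUs for 4-6 chassis, here is a hedged sizing sketch: the `pdu_headroom` helper and the assumption that 2N means surviving the loss of one full feed are ours for illustration, not APC's method.

```python
# Hypothetical 2N sizing check: does the recommended PDU set still
# carry a fully loaded rack after one feed (half the PDUs) is lost?

def pdu_headroom(chassis, w_per_chassis, pdus, kw_per_pdu):
    """Return spare capacity (kW) with one of two feeds failed."""
    load_kw = chassis * w_per_chassis / 1000
    surviving_kw = (pdus // 2) * kw_per_pdu  # feed A or feed B remains
    return surviving_kw - load_kw

# 6 chassis at 4050 W measured, on (4) 12.5 kW PDUs (AP7868), per the table
print(f"{pdu_headroom(6, 4050, 4, 12.5):+.1f} kW")  # +0.7 kW: it just fits
```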

[Diagram: chassis rear showing power supplies 1-4, with supplies 1 and 3 on feed A and supplies 2 and 4 on feed B]

It's hot and getting hotter!
- 1 kilowatt of power consumption by a server requires about 160 cubic feet of air per minute (cfm), with the air taking a 20°F temperature rise across the server
- Most servers can operate with inlet air of 85-90°F
- ASHRAE TC 9.9, Thermal Guidelines for Data Processing Environments:
  - Allowable temperature range: 59-90°F; recommended range: 68-77°F
  - Allowable humidity range: 20-80% RH; recommended range: 40-55% RH
- A 24 kW blade server rack will therefore require 3840 cubic feet of air per minute!
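
The 160 cfm per kW rule follows from the standard sensible-heat airflow equation, CFM = BTU/hr ÷ (1.08 × ΔT°F); a minimal sketch, assuming sea-level air (that is what the 1.08 factor encodes):

```python
# Airflow needed to carry away a heat load at a given air temperature rise.
# 1 W = 3.412 BTU/hr; 1.08 is the standard sea-level air factor.

def required_cfm(load_kw, delta_t_f=20.0):
    btu_per_hr = load_kw * 1000 * 3.412
    return btu_per_hr / (1.08 * delta_t_f)

print(f"{required_cfm(1):.0f} cfm per kW")      # ~158 cfm; the slide rounds to 160
print(f"{required_cfm(24):.0f} cfm for 24 kW")  # ~3791 cfm; 24 x 160 gives the 3840 quoted
```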

Conventional cooling won't work:
- A conventional data center layout with one vented tile per rack simply cannot cool racks above approximately 6 kW over a sustained area

Limits are limits:
- No matter how well you manage a raised floor, or how well you use current technologies, the practical upper limit for raised-floor cooling is about 3-4 kW per rack
- This equates to an average density of 100-140 W/sq. ft. (the range follows if each rack, with its share of aisle space, occupies roughly 28 sq. ft. of floor, a typical planning figure)
- Beyond this limit, you have more space allocated to cooling than to the IT servers

[Chart: Floor tile cooling ability — rack power (kW) that can be cooled by one tile vs. tile airflow (0-1000 cfm / 0-472 L/s). Zones run from "Typical Capability" (perforated tile) through "With Effort" (grate tile) and "Extreme" to "Impractical". Standard IT equipment falls in the typical range; blade servers fall well above it.]

[Diagram: floor layout with 30 kW UPS, PDU, and 40 kW CRAC units flanking a row of blade server racks at a 24 kW load — 2N UPS and cooling required for the 24 kW load.]

[Chart: Increased density and total floor area — area (sq. ft. per kW, with m² per kW on the second axis) vs. power density (1-19 kW per rack), showing the area occupied by IT equipment, the area occupied by power and cooling infrastructure, and their total. As per-rack density rises, infrastructure consumes a growing share of the floor.]

The Cooling Challenge (assume dual-corded blade chassis):
- POWER: 24 kW — (2) 30-amp circuits at 208/230 V
- COOLING: see the challenges below

The Cooling Challenge: a 24 kW rack needs 3840 cfm of air in and 3840 cfm out.

CHALLENGE #1: Supply 3840 cfm of cool air to the rack.

Typical raised-floor airflow: one 300 cfm vented tile per rack.

Perforated tiles?
- A 24 kW rack would require 13 perforated tiles (see the arithmetic below)
- Aisle width would need to be substantially increased
- Spacing between racks would need to be substantially increased
- Perforated tiles cannot cool a 24 kW rack in a typical data center
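
The 13-tile figure referenced above is just the 160 cfm/kW rule divided by per-tile airflow and rounded up; a one-line sketch:

```python
# How many ~300 cfm perforated tiles does a 24 kW rack need?
import math

CFM_PER_TILE = 300   # typical vented-tile delivery, per the slide
CFM_PER_KW = 160     # rule of thumb from the earlier slide

print(math.ceil(24 * CFM_PER_KW / CFM_PER_TILE))  # 13 tiles for one rack
```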

[Chart repeated: Floor tile cooling ability. The "Typical Capability" zone corresponds to 300-500 cfm perforated tiles; the "With Effort" (grate tile) zone requires careful raised-floor design, careful CRAC placement, and control of under-floor obstacles (pipes and wiring).]

Increased floor depth?

[Chart: maximum tile airflow variation (%) vs. floor plenum depth (1-6 ft / 0.30-1.83 m), for 25%-open perforated tiles and 56%-open grate tiles. With grate-type tiles, airflow in some cases reverses! Variation remains large even for a very deep plenum.]

Grate-type tiles?
- Grate-type tiles dramatically alter under-floor pressure gradients, making cooling non-uniform and unpredictable
- Grate-type tiles in one area impact airflow in neighboring areas
- Large airflow variations when using grate-type tiles mean some locations will NOT receive enough cooling
- Even if an extreme cooling design could solve these large airflow variations, it would still take 3-4 grate-type tiles to cool one 18 kW rack

Optimizing raised-floor environments (a fan-control sketch follows this list):
- Install blanking panels in open U spaces in racks
- Seal cable cutouts in floor panels
- Limit the number of perforated tiles
- Employ a hot-aisle/cold-aisle layout, with no perforated tiles in the hot aisles
- Monitor under-floor static pressure at multiple points
- Use VFDs to control CRAC fan speeds and stabilize static pressure
- Remove dead data and power cables; dictate routes for new cables
- Check airflow balance on a regular basis
- Use CFD modeling to engineer the airflow
- Balance critical loads; spread out high-density loads
- Minimize latent cooling by raising the chilled-water supply temperature
- Raise the computer-room temperature
- Avoid mixing of supply and exhaust air
- Eliminate turning vanes
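
As an illustration of the VFD/static-pressure item above (the sketch referenced in the list), here is a simple proportional fan-speed loop. The setpoint, gain, and clamp values are illustrative assumptions, not vendor settings or any CRAC's actual API:

```python
# Illustrative proportional control: nudge CRAC fan speed toward an
# under-floor static pressure setpoint read from plenum sensors.

def next_fan_speed(speed_pct, pressure_pa, setpoint_pa=25.0, gain=2.0):
    """One control step: raise fans when pressure is low, lower when high."""
    error = setpoint_pa - pressure_pa
    new_speed = speed_pct + gain * error
    return max(30.0, min(100.0, new_speed))  # clamp to a safe fan range

# e.g. pressure sagging to 20 Pa at 60% fan speed -> speed up to 70%
print(next_fan_speed(60.0, 20.0))
```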

Rack Air Distribution Unit (ARU) airflow diagram — conditioned room air is:
- Pulled in from underneath the raised floor
- Delivered from the bottom to the top of the rack by dual fans
- Drawn in by the IT equipment

Benefits:
- Provides cooler air to the rack
- Provides better cooling for IT equipment, reducing heat-related failures
- Extends the life of equipment in the rack

CHALLENGE #2: Remove 3840 cfm of hot air from the 24 kW rack.

3 ways to remove heat:
- Through the room
- Through a duct
- Through a ceiling plenum

Rack Air Removal Unit airflow diagram:
- Fans pull in the rack equipment's exhaust air
- Cable impedance is overcome by high-powered fans
- A ducted exhaust system (optional) delivers hot air to the plenum
- Hot air is prevented from mixing with room air
- Proper airflow through the enclosure is ensured
- Cool inlet air moves freely to equipment in the rack

[Diagram: ADU airflow]

CHALLENGE #3: Keep hot exhaust air away from equipment intakes.

[Diagram: one 18 kW rack placed in a row of 3 kW racks, with its exhaust recirculating toward neighboring intakes]

Blanking panels (Strategy #4):
- Snap-in blanking panels block internal recirculation through open U spaces

[Diagram: side views of the rack front before and after installing blanking panels. Before: internal recirculation drives rack-front inlet temperatures to roughly 80-95°F (27-35°C). After: inlet temperatures fall to roughly 70-83°F (21-28°C).]

[Diagrams: airflow with no blanking panels vs. airflow with blanking panels]

Cooling the data center — the ultimate strategy: a dedicated high-density area
- Supports maximum-density racks
- Doesn't require spreading out high-density equipment
- Optimal floor-space utilization
- New technologies can deliver predictable, highly efficient cooling

Dedicated high-density area — considerations:
- Requires prior knowledge of the number of high-density racks
- The high-density area must be planned in advance and space reserved for it
- Requires the ability to segregate high-density equipment
- Airflow patterns must be accounted for

Dedicated high-density area — power/cooling technology (hot-aisle containment: capture the heat):
- Equipment racks take in ambient air from the front; back-to-front airflow
- All exhaust air is captured within the chamber and neutralized to ambient temperature before being returned to the room
- A roof over the hot aisle converts it into a plenum, minimizing air mixing
- Chamber doors provide access to the hot aisle and the rear of the IT equipment, with locks for security
- Integral rack air conditioner (InfraStruXure High Density dedicated air unit)
- Integral rack power distribution system (UPS is optional)
- Can operate on a hard floor or a raised floor

[Diagram: ISX High Density airflow pattern]

Questions? Thank you!

© 2005 APC Corporation