Energy Efficient Data Centers


IBM Systems and Technology Group Lab Services
Energy Efficient Data Centers
Steven Ahladas, IBM Corp., ahladas@us.ibm.com
"Keeping our clients in the race."

All trends going the same way
- IT servers were an asset; they are becoming a commodity. Energy was a commodity for servers; it is becoming a cost.
- Data center energy cost now appears as a critical limitation on server expansion: heat growth can become a growth killer by blocking the introduction of new servers (data center room limitations).
- The energy economics of IT servers must be rebalanced with new energy-conscious criteria to stop the cost increase: 1) stop the trend, 2) reduce power consumption.
[Chart: worldwide spending (US$B, 1996-2010) on new servers vs. power and cooling, plotted against the installed base (M units). Source: IDC presentation, "The Impact of Power and Cooling on Data Center Infrastructure," Doc #201722, May 2006]

Energy/Performance Metrics
- Server performance is increasing much faster than server power.
[Chart: annual change (%) in performance (capacity, for storage), power, performance/space, and performance/watt for high-end System z, xSeries 1U, p550, and high-end storage. Source: Roger Schmidt]

Comparison of Typical Server Utilization Rates
[Chart: used vs. wasted capacity for mainframe, UNIX, and x86 servers]

Energy Management Policy Example: Server Consolidation Conserves Energy
- Before: APP 1 through APP 8 run on Systems 1 through 8, each system ~10% busy and drawing ~1 kW. Total power: 8 kW.
- After advanced virtualization: all eight applications run on a single system, ~70% busy and drawing ~4 kW. Total power: 4 kW.
- Server consolidation exploiting virtualization is the most effective tool for reducing energy costs (see the sketch below).
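
The arithmetic behind this slide is simple enough to script. A minimal sketch, assuming the per-server figures stated above (eight 1 kW servers consolidated onto one 4 kW host); the electricity price is an assumption for illustration, not a number from the deck:

```python
# Server-consolidation power savings: a sketch with assumed constants.

HOURS_PER_YEAR = 8760
PRICE_PER_KWH = 0.10  # assumed electricity price, USD/kWh (not from the slide)

def annual_energy_cost(power_kw: float) -> float:
    """Annual electricity cost of a constant load, in USD."""
    return power_kw * HOURS_PER_YEAR * PRICE_PER_KWH

before_kw = 8 * 1.0  # eight standalone systems, ~10% busy, ~1 kW each
after_kw = 4.0       # one virtualized host, ~70% busy

saved_kw = before_kw - after_kw
print(f"Power: {before_kw:.0f} kW -> {after_kw:.0f} kW ({saved_kw:.0f} kW saved)")
print(f"Annual savings at ${PRICE_PER_KWH:.2f}/kWh: ${annual_energy_cost(saved_kw):,.0f}")
```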

Cool Blue: Virtualization on IBM Systems and Storage
- "This shift has eliminated one floor of servers, cut power and cooling costs by 80 percent, and reduced the administrative staff (and costs) by 10 administrators."
- No other platform has demonstrated vertical-scaling (scale-up) savings as clearly as the mainframe. Virtualization was first introduced by IBM in the 1960s to allow the partitioning of large mainframe environments; System z virtualization remains the gold standard today.
- IBM SAN Volume Controller (SVC) for storage virtualization increases storage utilization rates by 30% while reducing the growth rate of storage capacity by 20%.
- VMware, Microsoft Virtual Server, and Xen offerings are available for IBM System x and IBM BladeCenter.

Annual power cost per logical server:
  Dell PowerEdge 1950                 $920.70
  Dell PowerEdge 1950, virtualized    $353.14
  Dell PowerEdge 1955                 $550.06
  Dell PowerEdge 1955, virtualized    $207.57
  IBM System z BC S07, small          $127.23
  IBM System z BC S07, expanded        $73.06
  IBM System z EC S18                  $32.80

Source: "Avoiding the $25 Million Server: Data Center Power, Cooling, and Growth," Sine Nomine Associates, 4/6/07

Cool Blue: Storage Virtualization
- Pool and share I/O resources for improved utilization and reduced energy needs.
[Diagram: separate EMC, IBM, HP, and Hitachi arrays at ~50% combined utilization, pooled behind the IBM SAN Volume Controller to reach ~90% combined utilization]

Cool Blue: Deploy Efficient IBM System Storage
- 10-year TCO example using blended tape and disk best practices.
- Deploy more power-efficient storage, and utilize storage more efficiently.
- Scenario: 250 TB of storage with 25% growth over 10 years.

Reduce Consumption Through Provisioning
- De-provision standby servers: 875 servers remain provisioned (active); 125 servers are de-provisioned to standby.
- Power savings accrue while the de-provisioned servers sit in standby mode.

Energy Efficient Data Centers: Data Center Power Usage Effectiveness
[Histogram: number of data centers vs. PUE (building power / IT equipment power), with bins at roughly 1.53, 1.81, 2.10, 2.38, and above, and a cumulative-percentage curve on the secondary axis]
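
PUE is the ratio of total facility power to IT equipment power. A minimal sketch of the metric; the kW figures are illustrative (they happen to match Data Center B on the next slide, where computer loads are 63% of 1,700 kW):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# 1,700 kW facility with 63% of its power reaching IT equipment:
print(f"PUE = {pue(1700.0, 0.63 * 1700.0):.2f}")  # -> PUE = 1.59
```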

DATA CENTER ENERGY EFFICIENCIES

Data Center A (total power = 580 kW):
  HVAC 54% | computer loads 38% | UPS losses 6% | lighting 2%

Data Center B (total power = 1,700 kW):
  computer loads 63% | HVAC, chilled water plant 14% | HVAC, air movement 9% | UPS losses 13% | lighting 1%

- Data Center A: 54% HVAC; Data Center B: 23% HVAC.
- Data Center A could possibly save ~180 kW.
- Even those that did many things well still had room to improve.
From the LBNL data center energy study, W. Tschudi
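
The ~180 kW figure follows from moving Data Center A's HVAC share down to Data Center B's. A sketch of that arithmetic, using the percentages from this slide:

```python
TOTAL_KW_A = 580.0
HVAC_SHARE_A = 0.54  # Data Center A: HVAC fraction of total power
HVAC_SHARE_B = 0.23  # Data Center B: chilled water (14%) + air movement (9%)

# If Data Center A ran its HVAC at Data Center B's share of total power:
potential_savings_kw = (HVAC_SHARE_A - HVAC_SHARE_B) * TOTAL_KW_A
print(f"Potential HVAC savings: ~{potential_savings_kw:.0f} kW")  # ~180 kW
```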

TYPICAL RAISED FLOOR DATA CENTERS
GOAL: provide the specified inlet temperature to every IT rack as efficiently as possible.

3D Temperature Distributions of an IBM Supercomputer
[Figure: measured 3D temperature maps of a data center, shown as horizontal slices at heights z = 0.5 to 8.5 feet; temperatures span roughly 13.01 °C to 54.47 °C, cold and hot aisles are clearly visible, and there is a hot spot at the long aisle]

Vertical Temperature Gradients
[Plot: rack inlet air temperature T_inlet (°C, 15-45) vs. height above the raised floor (0-9 feet) for twelve measurement locations, with the upper inlet-temperature specification marked]

COOLING EFFICIENCY
#1: INTERMIXING between hot and cold air raises local inlet air temperatures (forcing excessively low CRAC discharge temperatures; this hurts cooling production efficiency).
#2: RECIRCULATION causes small differences between return and CRAC discharge temperatures (CRAC units are not fully utilized; this hurts cooling delivery efficiency).
A diagnostic sketch for both symptoms follows.
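
Both symptoms can be spotted from four averaged temperatures. A minimal diagnostic sketch; the readings and the flagging thresholds are illustrative assumptions, not values from the deck:

```python
def cooling_diagnostics(crac_discharge_c: float, rack_inlet_c: float,
                        rack_outlet_c: float, crac_return_c: float) -> None:
    """Flag intermixing and recirculation symptoms from averaged temperatures."""
    # Intermixing: rack inlets run warmer than the CRAC discharge because
    # hot exhaust mixes into the cold air on its way to the inlets.
    intermix_dt = rack_inlet_c - crac_discharge_c
    # Under-utilized CRACs: return air comes back barely warmer than the
    # discharge because cold air short-circuits back to the CRAC.
    crac_dt = crac_return_c - crac_discharge_c

    print(f"Inlet rise over CRAC discharge: {intermix_dt:.1f} K")
    print(f"CRAC return-discharge delta:    {crac_dt:.1f} K")
    if intermix_dt > 3.0:  # illustrative threshold
        print("-> significant hot/cold intermixing; hurts cooling production")
    if crac_dt < 0.8 * (rack_outlet_c - rack_inlet_c):
        print("-> CRACs see a diluted return; hurts cooling delivery")

# Example with assumed readings:
cooling_diagnostics(crac_discharge_c=16.0, rack_inlet_c=24.0,
                    rack_outlet_c=36.0, crac_return_c=22.0)
```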

PERFORATED TILES
- In many places, perforated tiles do not support the required cooling capacity.
- Use high-throughput tiles (~50% open or more).
- Remove dampers to enhance airflow (300 cfm ~ 2 kW).
[Plot: static pressure (in. H2O, 0-0.12) vs. airflow (0-3,000 CFM) for a 56%-open Tate GrateAire tile, with no damper and with a 100%-open damper]
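
The 300 cfm ~ 2 kW rule of thumb is just the sensible-heat equation for air. A sketch under assumed conditions; the ~11.7 °C (~21 °F) server temperature rise and the air properties are assumptions chosen so the rule of thumb comes out:

```python
CFM_TO_M3S = 0.000471947  # 1 cubic foot per minute in m^3/s
RHO_AIR = 1.2             # kg/m^3, assumed air density near sea level
CP_AIR = 1005.0           # J/(kg*K), specific heat of air

def sensible_cooling_kw(airflow_cfm: float, delta_t_c: float) -> float:
    """Heat carried by an airstream: Q = rho * cp * V_dot * dT."""
    v_dot = airflow_cfm * CFM_TO_M3S  # volumetric flow, m^3/s
    return RHO_AIR * CP_AIR * v_dot * delta_t_c / 1000.0

# One tile at 300 cfm with an assumed ~11.7 C server temperature rise:
print(f"{sensible_cooling_kw(300, 11.7):.1f} kW")  # ~2.0 kW, matching the slide
```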

UNDER-FLOOR BLOCKAGE
Examples of blockage: cables, water pipes.
- Avoid any blockage in the critical path to perforated tiles; route cabling etc. under the hot aisles.
- Consider deploying under-floor boundaries: limit underfloor airflow to the boundaries of the zone to be cooled. Escaping air lowers underfloor air pressure and room airflow, so use under-floor barriers to confine the airflow to the places that need it.

LEAKS AND CABLE CUTOUTS
- Block cable cutouts using brushes, bags, mats, or pillows.
- Add perforated tiles and use higher-throughput tiles.
- Ensure good sealing between the tiles.

LEAKAGE in Sample Raised Floors

Room | CRACs | Total CRAC flow / capacity(1)  | Perf. tiles | Perf. tile flow / capacity(1) | Ceiling tile flow / capacity(1) | Not-targeted flow / cooling(2)       | Total HW / room power     | Margin
#1   | 11    | 67,500 cfm / 450 kW (130 tons) | 118         | 29,500 cfm / 200 kW (58 tons) | 1,900 cfm / 12 kW (3 tons)      | 38,000 cfm / 250 kW (72 tons) (57%)  | 300/360 kW (87/104 tons)  | 90 kW (26 tons) (25%)
#2   | 7     | 42,700 cfm / 285 kW (82 tons)  | 43          | 11,000 cfm / 73 kW (21 tons)  | 2,000 cfm / 13 kW (4 tons)      | 31,700 cfm / 211 kW (61 tons) (74%)  | 110/140 kW (32/41 tons)   | 145 kW (42 tons) (103%)
#3   | 17    | 84,000 cfm / 560 kW (162 tons) | 24          | 6,000 cfm / 40 kW (12 tons)   | NA                              | 78,000 cfm / 520 kW (151 tons) (93%) | 175/230 kW (51/68 tons)   | 330 kW (96 tons) (143%)
#4   | 35    | 194 kcfm / 1,294 kW (375 tons) | 185         | 46.5 kcfm / 310 kW (90 tons)  | NA                              | 137.5 kcfm / 918 kW (266 tons) (71%) | 585/730 kW (170/212 tons) | 564 kW (164 tons) (77%)

1) At 60 °F (16 °C) CRAC discharge temperature.
2) Cable cutouts and other losses. Percentages give the not-targeted share of total CRAC flow and the margin as a share of maximum room power.
H. Hamman, 2/2007
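
Two derived quantities in the table, the not-targeted share and the cooling margin, follow directly from the raw columns. A sketch using room #1's figures (small differences from the table are rounding in the original):

```python
# Room #1 from the table above.
crac_flow_cfm = 67_500     # total CRAC airflow
crac_capacity_kw = 450.0   # total CRAC cooling capacity at 60 F discharge
not_targeted_cfm = 38_000  # flow escaping through cutouts and other losses
room_power_max_kw = 360.0  # maximum total hardware power in the room

# Share of CRAC airflow that never reaches a targeted tile:
share = not_targeted_cfm / crac_flow_cfm
print(f"not-targeted share: {share:.0%}")  # ~56%; the slide rounds to 57%

# Cooling margin: capacity left after covering the room's maximum power.
margin_kw = crac_capacity_kw - room_power_max_kw
print(f"margin: {margin_kw:.0f} kW ({margin_kw / room_power_max_kw:.0%})")  # 90 kW (25%)
```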

Understanding Where Costs Hide is Critical
- The rear door heat exchanger can remove over 50% of a rack's heat output.
- No new fans or electricity needed; the door is fed by chilled-water lines.
- Attaches to the back of the rack (adds 5 inches of depth); no rearrangement of the data center.
- Cost effective: 1 kW of cooling ~ $286. The Cool Blue heat exchanger adds cooling capacity at ~1/4 the cost of traditional methods.
[Diagram: "Normal Operation of Server" vs. "Improved Operation of Server with Rear Door Heat Exchanger" on an IBM Enterprise Rack; cold air enters the front, hot air exits the back; labels: perf tile, cable opening, tile floor, underfloor chilled air, subfloor, water lines]
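
A sketch of the cost comparison implied by the slide's figures; the rack load is an assumption for illustration, and the "traditional" price is back-derived from the ~1/4-cost claim:

```python
RDHX_COST_PER_KW = 286.0                        # from the slide
TRADITIONAL_COST_PER_KW = 4 * RDHX_COST_PER_KW  # implied by the ~1/4-cost claim

rack_kw = 20.0           # assumed rack heat output (not from the slide)
door_kw = 0.5 * rack_kw  # the door removes ~50% of the rack's heat

print(f"RDHx cost for {door_kw:.0f} kW:        ${door_kw * RDHX_COST_PER_KW:,.0f}")
print(f"Traditional cooling, same load: ${door_kw * TRADITIONAL_COST_PER_KW:,.0f}")
```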

Motivation for Liquid Cooling
- Increased heat removal performance: liquids have superior thermal properties compared to air.
- Design flexibility: sensible heat can be transported to locations with available space. On the secondary loop, long-distance transport is possible; air-side heat transport is limited by fin efficiency.
- Centralized secondary heat exchanger, with an efficient water-water heat exchanger to the primary loop.

  Property                                Air      H2O
  Thermal conductivity [W/(m*K)]          0.0245   0.6
  Volumetric heat capacity [kJ/(m^3*K)]   1.27     4176

- Disadvantage: increased complexity.
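
The volumetric heat capacities in the table translate directly into how much fluid must move per kW. A quick sketch comparing the flow required to carry 10 kW at an assumed 10 K coolant temperature rise:

```python
# Volumetric heat capacity, kJ/(m^3*K), from the slide's table.
C_AIR = 1.27
C_WATER = 4176.0

def flow_m3_per_s(heat_kw: float, delta_t_k: float, c_vol: float) -> float:
    """Volumetric flow needed to carry heat_kw with a delta_t_k rise."""
    return heat_kw / (c_vol * delta_t_k)

heat, dt = 10.0, 10.0  # 10 kW load, assumed 10 K rise (illustrative)
air = flow_m3_per_s(heat, dt, C_AIR)
water = flow_m3_per_s(heat, dt, C_WATER)
print(f"air:   {air:.3f} m^3/s (~{air / 0.000471947:,.0f} cfm)")
print(f"water: {water * 60_000:.1f} L/min")
print(f"ratio: air needs ~{air / water:,.0f}x the volume flow of water")
```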

Data Center Power Distribution Systems & Efficiency
One watt of chip power requires this much power at the facility level, once power conversion and distribution losses are counted. Cumulative watts per 1 W at the load, stage by stage (feeder transformer / UPS system / raised-floor transformer / distribution, all combined / server bulk PS / VRM / load):

  Std raised floor, 208 VAC servers:           1.79 / 1.75 / 1.58 / 1.52 / 1.47 / 1.25 / 1
  480 V servers (~5% facility-power savings):  1.70 / 1.67 / 1.50 / 1.50 / 1.47 / 1.25 / 1
  400 V DC servers (~12% savings):             1.57 / 1.54 / 1.46 / 1.46 / 1.40 / 1.25 / 1

Conversion chains:
  Std raised floor: 11.2 kV AC to 480 VAC; UPS 480 VAC to 400 VDC to 480 VAC; 480 VAC to 208 VAC; bulk PS 208 VAC to 12 VDC; VRM 12 VDC to 1 VDC; 1 VDC to heat.
  480 V servers: 11.2 kV AC to 480 VAC; UPS 480 VAC to 400 VDC to 480 VAC; no raised-floor transformer; bulk PS 480 VAC to 400 VDC to 12 VDC; VRM 12 VDC to 1 VDC; 1 VDC to heat.
  400 V DC servers: 11.2 kV AC to 480 VAC; UPS 480 VAC to 400 VDC; no raised-floor transformer; 400 VDC to 12 VDC; VRM 12 VDC to 1 VDC; 1 VDC to heat.

Plus capital / cost-of-acquisition savings.
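
The cumulative watts in the table are just per-stage efficiencies multiplied up from the load. A sketch that reproduces the standard raised-floor column; the individual stage efficiencies are back-calculated from the slide's cumulative numbers, so treat them as illustrative, not as rated equipment data:

```python
# Per-stage efficiencies back-calculated from the standard raised-floor
# column above (rounded, illustrative).
STAGES = [
    ("feeder transformer", 0.978),
    ("UPS system",         0.903),
    ("raised-floor xfmr",  0.962),
    ("distribution",       0.967),
    ("server bulk PS",     0.850),
    ("VRM",                0.800),
]

def facility_watts_per_chip_watt(stages) -> float:
    """Walk back from 1 W at the chip through each conversion stage."""
    watts = 1.0
    for name, eff in reversed(stages):
        watts /= eff  # each stage must be fed its output plus its losses
        print(f"input to {name:<18}: {watts:.2f} W")
    return watts

facility_watts_per_chip_watt(STAGES)  # ends near the slide's 1.79 W
```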

Thank You! Questions?
Steve Ahladas, Data Center Services: ahladas@us.ibm.com