Welcome Association of Energy Engineers to the Microsoft Technology Center
Today's Agenda
10:00-10:30  What's new in data center energy efficiency planning and implementation?
10:30-11:00  ASHRAE data center economizer update: new economizer requirements and technologies
Break
11:00-11:30  Where is the low-hanging fruit for data center efficiency improvements?
11:30-Noon   DCIM (Data Center Infrastructure Management), the hot topic in the data center industry right now for optimal operational and energy efficiency
Noon-1:00    Lunch and MTC data center tours
What's new in data center energy efficiency planning and implementation?
Key Market Dynamics
Cloud computing: in 2012, cloud was estimated to be 25% of all new IT spend; 15-25% adoption in 2011, rising to 25-45% adoption by 2013
- External / public: pay-per-use hosting of virtual servers at an external service provider
- Internal / private: your data center, where your IT department offers virtual servers as a service
Co-location growth: providing capacity for expansions that have been delayed; SMB outsourcing
Source: IDC Research
Rack Power Densities Are Increasing
Survey question: "What is the average power density (in kW) per rack in your data center?"
2006: 6.0 kW   2007: 6.1 kW   2008: 7.3 kW   2009: 7.4 kW   2012 (est.): 12.0 kW
63% of respondents have racks over 12 kW today
Source: Data Center Users Group Survey
Density: Next Build or Expansion
Best practice: build / retrofit with higher densities in mind
Source: Data Center Users Group Survey
Increased Server Operating Temp. (Dell PowerEdge R320)
Standard operating range (warranty):      10 C (50 F) to 35 C (95 F), 10-80% RH
<10% of annual operating hours (900 h):   5 C (41 F) to 40 C (104 F), 5-85% RH
<1% of annual operating hours (90 h):     -5 C (23 F) to 45 C (113 F), 5-90% RH
Normal operating range:                   15 C (59 F) to 26 C (78.8 F), 40-65% RH
How server heat challenges are addressed at 27 C (80.6 F) free-air hours:
- Design: use worst-case server thermal design
- System management: power capping and real-time monitoring
ASHRAE Thermal Guidelines 2011
Class A1 (2008 class 1): data center; enterprise servers, storage products; tightly controlled environment
Class A2 (2008 class 2): data center; volume servers, storage products, personal computers, workstations; some control
Class A3 (new in 2011): data center; volume servers, storage products, personal computers, workstations; some control
Class A4 (new in 2011): data center; volume servers, storage products, personal computers, workstations; some control
Class B (2008 class 3): office, home, transportable environment, etc.; personal computers, workstations, laptops, printers; minimal control
Class C (2008 class 4): point of sale, industrial, factory, etc.; point-of-sale equipment, ruggedized controllers, PDAs; no control
ASHRAE Thermal Guidelines 2011: Equipment Operating Environmental Specifications
Recommended (applies to all A classes): 64.4 to 80.6 F DB; 41.9 F DP to 60% RH and 59 F DP
Allowable (humidity range non-condensing):
A1: 59 to 89.6 F DB; 20-80% RH; max DP 62.6 F
A2: 50 to 95 F DB; 20-80% RH; max DP 69.8 F
A3: 41 to 104 F DB; 10.4 F DP and 8-85% RH; max DP 75.2 F
A4: 41 to 113 F DB; 10.4 F DP and 8-90% RH; max DP 75.2 F
B: 41 to 95 F DB; 8-80% RH; max DP 82.4 F
C: 41 to 104 F DB; 8-80% RH; max DP 82.4 F
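An allowable envelope like the ones above can be expressed as a simple range check in monitoring code. A minimal sketch, assuming the class A2 limits from the table; the dew point is supplied directly as a sensor input here rather than derived psychrometrically from dry bulb and RH:

```python
# Sketch: check a sensor reading against the ASHRAE 2011 class A2
# *allowable* envelope (values from the table above).

A2_ALLOWABLE = {
    "db_f": (50.0, 95.0),    # dry-bulb range, deg F
    "rh_pct": (20.0, 80.0),  # relative humidity range, %
    "max_dp_f": 69.8,        # maximum dew point, deg F
}

def within_a2(db_f: float, rh_pct: float, dp_f: float) -> bool:
    """Return True if the reading is inside the A2 allowable envelope."""
    lo_db, hi_db = A2_ALLOWABLE["db_f"]
    lo_rh, hi_rh = A2_ALLOWABLE["rh_pct"]
    return (lo_db <= db_f <= hi_db
            and lo_rh <= rh_pct <= hi_rh
            and dp_f <= A2_ALLOWABLE["max_dp_f"])

print(within_a2(75.0, 45.0, 52.0))  # typical cold aisle -> True
print(within_a2(98.0, 45.0, 52.0))  # too warm for A2 -> False
```

The same pattern extends to the other classes by swapping in their limits.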
ASHRAE Thermal Guidelines 2011 (psychrometric chart showing the recommended envelope for classes A1-A4 and the allowable envelopes for classes A1, A2, A3, and A4)
Standard Rating Conditions
Return air conditions by application class:
Class 1: 75.0 F DB, 52.0 F DP
Class 2: 85.0 F DB, 52.0 F DP
Class 3: 95.0 F DB, 52.0 F DP
Class 4: 105.0 F DB, 52.0 F DP
Rating points:
Air cooled: 95 F, 80 F, 65 F, 40 F ambient
Water cooled: 83 F, 70 F, 55 F, 35 F EWT
Glycol cooled: 104 F, 85 F, 65 F, 35 F EGT
Chilled water: 50 F ECWT
Efficiency ratings: NSenCOP / ISenCOP
Customer Trend: Alternate Cooling Methods
- ASHRAE lower dew point limit discussion: using lower dew points and higher cold aisle temperatures to get more hours of air- and water-economizer operation
- Raising cold aisle temperatures to gain cooling capacity and efficiency; many are oblivious to the impact of the server fan power ramp
- Efficiency, efficiency, efficiency
- No-cooling or one-off experiments: Yahoo, Microsoft, Facebook (Prineville, Oregon); the Yahoo Computing Coop (Lockport, New York), which emphasizes free cooling and air flow management in prefabricated metal structures, has only 212 hours per year when it cannot use free cooling
The Noise in the Marketplace
- Good if you have another data center, the means, and a customer willing to move the work
- Not a Tier 4 facility
Customer Trend: Modularity
Specific Modular Data Center Assemblies Integrated Modular Chiller Plant Modular Buildings AC Power Enclosures AC Power Skids
Modular at the row level
ASHRAE data center economizer update: new economizer requirements and technologies
ASHRAE 90.1 Standard: Minimum Efficiency Standard for Non-Residential Buildings
The 90.1-2010 standard eliminates the data center exclusion.
Mandatory requirements:
- Establishes minimum efficiency requirements for CRAC and CRAH units (row-based and supplemental coolers are not covered)
- Efficiency rating per Standard 127-2007: SCOP vs. EER vs. COP
- Method to prevent units from fighting (heating / cooling / humidifying / dehumidifying)
- Strongly favors a third-party certification program (AHRI)
Prescriptive requirements:
- Economizers: air or water
- Total cost calculator
Timing: standard published June 2010; adoption varies by state
Exceptions to the Economizer Prerequisite
- Data centers with a design load < 250 tons; or, in a building with a chiller plant, a design load < 50 tons
- Exemption from all economizers if a water supply for a cooling tower is not reliable
- The requirement that fluid economizers provide 100% cooling at 50 F changed to 100% cooling at 40 F
- Systems for computer rooms where a minimum of 75% of the design load serves:
  1) spaces classified as an essential facility (per IBC 2006)
  2) spaces having a mechanical cooling design of Tier IV as defined by ANSI/TIA-942
  3) spaces classified under NFPA 70 Article 708, Critical Operations Power Systems (COPS)
  4) spaces where core clearing and settlement services are performed, such that failure to settle pending financial transactions could present systemic risk, as described in the Interagency Paper on Sound Practices to Strengthen the Resilience of the US Financial System (April 7, 2003)
The standard does not state what type of economizer must be used; fluid economizers are fully acceptable.
Types of Economizers
Chilled water systems (fluid economizers):
- Parallel chiller / tower
- Series chiller / tower
- Series air-cooled
DX refrigerant cooling systems:
- Glycol system: drycooler or cooling tower
- Refrigerant-only system: pumped refrigerant
Air economizers:
- Direct
- Indirect evaporative
CRAH with Air Economizer System (diagrams: normal operation and economizer operation)
Air Economizers: Real World
- Only 1 out of 4 economizers works properly: failed components, improper controls, lack of maintenance
- Failed economizers can increase annual energy consumption by significantly more than they save
- New England Power Services Co. study: only 44% of economizers evaluated were operational after 2 years
- New construction: one study suggests 50% of new installations never work properly
Source: Energy Design Resources Design Brief on Economizers (2nd Edition, August 2011)
CRAH units with air-cooled chiller (diagram: outdoor air-cooled chiller supplying 45 F chilled water, 55 F return; why not a 55 to 65 F supply temperature?)
CRAH units with air-cooled chiller and drycooler for free cooling or pre-cooling (diagram: 45 F chilled water supply, 55 F return, at 30 F outdoor ambient; what about a 50 or 55 F supply temperature?)
CRAH units with water-cooled chiller (diagram, summer operation: chiller supplies 45 F chilled water, 55 F return; evaporative cooling tower supplies 85 F condenser water, 95 F return)
CRAH units with water-cooled chiller: water-side economizer, low ambient (diagram, winter operation: evaporative cooling tower supplies 42 F water; 45 F chilled water supply, 55 F return)
Air-Cooled Pumped Refrigerant System Overview (diagram: indoor unit, air-cooled condenser, pump module)
Air-Cooled Operation (summer): outdoor ambient 95 F; kW @ 70% load: 24.1; SCOP = 3.6; pPUE = 1.28. Compressors on in both circuits; refrigerant pumps off.
Partial Economization (fall / evening): outdoor ambient 65 F; kW @ 70% load: 15.1; SCOP = 5.8; pPUE = 1.17. One circuit runs on its compressor; the other runs on its refrigerant pump with the compressor off.
Full Economization (winter): outdoor ambient 45 F; kW @ 70% load: 7.9; SCOP = 11.2; pPUE = 1.09. Both compressors off; the refrigerant pumps carry the full cooling load.
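The pPUE figures in the three operating modes above follow directly from SCOP: if a cooling unit moves Q watts of heat using P watts of input power, then SCOP = Q / P, and the partial PUE attributable to that unit is (Q + P) / Q = 1 + 1/SCOP. A quick check against the slide values:

```python
# Relationship between a cooling unit's SCOP and its partial PUE:
# pPUE = (heat moved + power consumed) / heat moved = 1 + 1/SCOP.

def ppue_from_scop(scop: float) -> float:
    """Partial PUE attributable to the cooling unit alone."""
    return 1.0 + 1.0 / scop

for mode, scop in [("air-cooled operation", 3.6),
                   ("partial economization", 5.8),
                   ("full economization", 11.2)]:
    print(f"{mode}: SCOP {scop} -> pPUE {ppue_from_scop(scop):.2f}")
# prints 1.28, 1.17, 1.09 -- matching the three slides above
```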
Economizer Hours Dependent on Temperature
Dry-bulb bin data for Chicago, IL (hours per year in each 5 F bin):
below 5 F: 130    5-9: 112     10-14: 152    15-19: 226    20-24: 317
25-29: 511       30-34: 744    35-39: 783    40-44: 622    45-49: 597
50-54: 586       55-59: 611    60-64: 689    65-69: 729    70-74: 755
75-79: 578       80-84: 361    85-89: 179    90-94: 63     95 F and above: 13
(The original slide also lists the mean-coincident wet-bulb and average dew point temperature for each bin.)
Hours of free economization, dependent on technology and aisle temperature (cold aisle 70 F, hot aisle 95 F; full + partial economization):
Fluid economizers: parallel chiller / tower 54%; series chiller / tower 54% + 45%; series air-cooled 48% + 38%
Glycol system: drycooler 42% + 38%; cooling tower 48% + 52%
Air economizers: direct 45%; indirect 78% + 21%
Pumped refrigerant system: dry condenser 41% + 45%; evaporative condenser 56% + 44%
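Full-economization hours can be estimated from bin data by summing the hours in every bin at or below the system's changeover temperature. A minimal sketch using the Chicago dry-bulb bins above; the 45 F changeover is an illustrative assumption, not a recommendation, and a real analysis would also weigh the coincident wet-bulb or dew point for evaporative systems:

```python
# Estimate annual free-cooling hours from dry-bulb bin data.
# (bin upper edge in deg F, annual hours in that bin) -- Chicago, IL
CHICAGO_BINS = [
    (4, 130), (9, 112), (14, 152), (19, 226), (24, 317),
    (29, 511), (34, 744), (39, 783), (44, 622), (49, 597),
    (54, 586), (59, 611), (64, 689), (69, 729), (74, 755),
    (79, 578), (84, 361), (89, 179), (94, 63), (120, 13),
]

def free_cooling_hours(changeover_f: float) -> int:
    """Hours/year in bins wholly below the changeover temperature."""
    return sum(hrs for upper, hrs in CHICAGO_BINS if upper < changeover_f)

total = sum(hrs for _, hrs in CHICAGO_BINS)
hours = free_cooling_hours(45.0)
print(f"{hours} of {total} h/yr ({100 * hours / total:.0f}%) below 45 F")
```

For this data set the 45 F cutoff yields about 41% of the year, which is in the same range as the full-economization percentages listed for the dry-condenser technologies above.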
The Issues Must Be Understood
Fluid economizers: water usage; complexity of valve system and controls; freezing weather; transient changeover; capital cost
DX glycol systems: hours of free cooling; extra coil air pressure drop
Pumped refrigerant: new technology
Air economizers: humidity control; contamination; freezing coils; transient changeover; cost of indirect systems
Where is the low hanging fruit for data center efficiency improvements?
Simple Improvement Hot Aisle Cold Aisle Reduces mixing of the hot and cold air streams
Simple Improvement Ceiling Plenum Return Keeps hot air from migrating back to the cold aisle. Increases the temperature back to the CRACs, increasing both capacity and efficiency.
Simple Improvement: Aisle Containment. Ensures that cold air stays in the cold aisle until it is taken in by the server. Prevents hot air from entering the cold aisle.
Other Simple Improvements Blanking Panels Seal floor penetrations Create density zones when planning the data center layout.
Simple Improvement: More Efficient IT Equipment! 1 watt saved at the server component level results in cumulative savings of about 2.84 watts in total consumption.
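The cascade works because a watt consumed at the component also incurs losses in the server power supply, the distribution and UPS chain, and the cooling plant that removes the resulting heat. A sketch of that arithmetic; the efficiency and cooling figures below are illustrative assumptions chosen to reproduce the ~2.84x multiplier quoted above, not the measured breakdown behind that number:

```python
# Illustrative cascade: facility watts per watt used at the component.
# All efficiency values are assumptions for the sake of the example.

def cascade_multiplier(dc_dc=0.88, ac_dc=0.90, pdu=0.98, ups=0.92,
                       cooling_w_per_w=1.028):
    server_input = 1.0 / (dc_dc * ac_dc)        # server power supply / VRM losses
    utility_input = server_input / (pdu * ups)  # distribution and UPS losses
    return utility_input * (1.0 + cooling_w_per_w)  # cooling rides on top

print(f"1 W saved at the component avoids "
      f"{cascade_multiplier():.2f} W at the utility meter")
```

Running the cascade in reverse is the point of the slide: savings made closest to the component are the most valuable, because every upstream loss is avoided along with them.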
DCIM (Data Center Infrastructure Management), the HOT topic in the data center industry right now for optimal operational and energy efficiency
Emerging Trend: Holistic Data Center Infrastructure Management (DCIM)
- Collaboration of IT and Facilities resources
- Aligning virtual computing to physical infrastructure: IT goes where there is infrastructure capacity
- Optimal alignment of computing demand to infrastructure supply
(Diagram: data center room, cooling system, IT devices, power system, and distributed infrastructure tied together by infrastructure management)
Fluctuating Loads Create Challenges and Opportunities
- Even in a virtualized environment, data center utilization remains below 25% for large parts of the day (chart: server utilization in a virtualized environment over 24 hours)
- The data center is dynamic and variable, with peaks and valleys
- Best practice: take advantage of this variable utilization with a variable infrastructure
Source: McKinsey & Company, Revolutionizing Data Center Efficiency
Managing Infrastructure: Measuring PUE
PUE = total energy entering the data center / total IT equipment energy usage
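In practice this is two meter readings taken over the same interval. A minimal sketch of the calculation per the definition above; the example meter values are hypothetical:

```python
# PUE from two energy meter readings over the same interval.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total energy entering the data center / IT equipment energy."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# e.g. 1,740 kWh at the utility meter vs 1,000 kWh at the UPS output meter
print(f"PUE = {pue(1740.0, 1000.0):.2f}")  # -> PUE = 1.74
```

Measuring energy (kWh) over a full year, rather than spot power (kW), avoids flattering the number with a single mild-weather reading.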
What should I monitor?
- Power: uninterruptible power systems and batteries, power distribution units, automatic transfer switches, surge suppression, board-level and server power supplies
- Cooling: precision cooling, extreme-density precision cooling, rack cooling
- Rack: integrated racks, rack monitoring, rack power distribution units, KVM, cable management, rack performance management
- Facility: fire pump controller, control room, infrastructure management and monitoring systems
Data Center Infrastructure Management Maturity
Customers need to evolve through levels of maturity in DCIM:
1. Monitor and access (early warning, reactive): How are my assets operating? Am I getting real-time notification of alarms and alerts? How do I get my server back up and running?
2. Data capture and planning (improved planning): What and where are the assets in the data center? How are they interconnected? Do we have the space, cooling, and power to meet future needs? Can I populate my planning tools with actual performance data?
3. Analyze and diagnose (reduced MTTR and effort): How can I commission and decommission more efficiently? How do I extend the life of the data center? How do I reduce mean time to repair (MTTR)?
4. Recommend and automate (availability at optimal performance, proactive): How do I sync infrastructure with virtualization automation? How are we doing against SLAs? How do I anticipate potential failures and automatically shift compute and physical load to eliminate downtime? How can I optimize efficiency across my data center?
The Final Word: Apply These Best Practices for Optimal Performance
1. Maximize the return air temperature at the cooling units to improve capacity and efficiency (through containment, control, or both)
2. Match cooling capacity and airflow with IT loads
3. Utilize cooling designs that reduce energy consumption
4. Select a power system to optimize your availability and efficiency needs
5. Design for flexibility using scalable architectures that minimize footprint
6. Enable data center infrastructure management and monitoring to improve capacity, efficiency, and availability
7. Utilize local design and service expertise to extend equipment life, reduce costs, and address your data center's unique challenges
White paper: Seven Best Practices for Increasing Efficiency, Availability and Capacity: The Enterprise Data Center Design Guide
For more resources, please visit us at www.dvlnet.com