Recapture Capacity for Existing Data Centers Through Cooling and Airflow Optimization


Agenda: Introduction; Challenges and Trends; Cooling & Airflow Optimization; Takeaways

Learning Objectives Recognize opportunities that can optimize an existing facility's operational performance and PUE. Identify fundamental improvements that can be implemented within a legacy facility's equipment limitations. Better understand advanced techniques and improvements that can provide greater insights into a facility's operational characteristics.

Introduction Many existing data centers have inefficiencies that can be reduced through cooling and airflow optimization. The result of this optimization is not only a data center that operates in a MORE EFFICIENT manner, but one that may RECAPTURE previously unusable or lost capacity. We will review the challenges present in typical existing data centers and how they can be addressed in a manner that can improve operational performance.

How is Capacity Defined? Data center CAPACITY is the total amount of IT equipment (in terms of power) that the facility can FULLY support. It is dependent on several factors, including: Physical Space, Network Bandwidth, Available Power, and Cooling Capacity.

Data Center Challenges What are the industry drivers & challenges? Compaction and increasing power density of IT equipment; virtualization and consolidation (higher & steadier loads); increased networking requirements; more energy efficient operation / lower PUE. How can we meet these challenges in an existing data center? What were the design power densities? Have the ITE deployments been optimal? Are the available space, power, cooling, and network capacity all of equal concern?

Increasing Power Density of ITE [Figure: 2X density]

Virtualization and Consolidation In legacy data centers, servers would be DEDICATED to a specific service. Their level of utilization would be proportional to the real-time demand on the hosted service. The design of today's IT equipment combined with RADICAL changes in software is DOUBLING or even TRIPLING the IT equipment demand / utilization. It is very possible to experience demands of 50% to 75%. [Chart: demand under previous experience vs. demand with virtualization / technology]

Power Usage Effectiveness Power Usage Effectiveness (PUE) measures how efficiently a data center operates: the ratio of total facility power to the power delivered to the IT equipment. PUE Rating: 1.0 Ideal; 1.2 Very Efficient; 1.5 Efficient; 2.0 Average; 2.5 Inefficient; 3.0 Very Inefficient. Alternatively, consider PUE as: [figure]
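To make the definition concrete, here is a minimal sketch (Python, not from the original presentation) that computes PUE as total facility power divided by IT equipment power and maps it to the rating bands above; the function names, the band boundaries between labels, and the example power values are illustrative assumptions.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power (dimensionless, >= 1.0)."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

def rating(pue_value: float) -> str:
    """Map a PUE value to the rating bands shown on the slide (band edges assumed)."""
    bands = [(1.2, "Very Efficient"), (1.5, "Efficient"),
             (2.0, "Average"), (2.5, "Inefficient")]
    for threshold, label in bands:
        if pue_value <= threshold:
            return label
    return "Very Inefficient"

# Illustrative example: 906 kW of IT load in a facility drawing 1500 kW total.
value = pue(1500, 906)
print(f"PUE = {value:.2f} ({rating(value)})")   # PUE = 1.66 (Average)
```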

Recent Greenfield Data Center PUEs PUEs continue to trend downward for greenfield data centers; a low PUE is easier to achieve in a greenfield than in a retrofit situation. [Chart: PUE values between 1.07 and 1.22 for Facebook Oregon, Google Belgium, Google Fleet, Microsoft Chicago, and Yahoo New York, versus 1.80 for the average data center (Uptime Institute)]

Design vs. Actual ITE Deployment [Chart: power per rack (kW, 0 to 4) under the design distribution (homogeneous load) vs. the actual deployment (mixed load)]

Inefficiencies Reduce Available Capacity Due to a lack of available cooling, ONLY 10% of the data center capacity remains; this will result in stranded power capacity. [Chart: capacity used (%) for current used capacity, predicted maximum capacity based on cooling constraints, and ideal capacity]

Overcooling, Yet Still Hot Spots? Due to the complexity of a data center environment, a space can be overcooled, yet hot spots still form. [CFD plots with 4 CRAC units and 7 CRAC units: hot spots still present in both cases; ASHRAE temperature legend from below 59°F to above 89.6°F]

Overcooling, Yet Still Hot Spots? Total IT Load: 906 kW. Total Cooling Capacity: 1160 kW (28% Excess Cooling). Racks range from 5.2 to 10 kW. [CFD plot: hot spots still present; ASHRAE temperature legend from below 59°F to above 89.6°F]

Cooling & Airflow Optimization

Maximize Available Cooling Available cooling is often a limiting factor in data center capacity. How can we maximize available cooling? Optimize Airflow: measure current performance & rebalance; eliminate mal-distributed air & separate hot / cold airstreams. Increase Efficiency & Operational Performance: change control setpoints; increase operational temperatures & delta Ts. Maintain Improved Cooling: continued monitoring of data center conditions; use CFD to simulate cooling requirements prior to future IT equipment deployments.

Air Measurement Points ASHRAE TC 9.9 recommended measurement points at the IT equipment inlet for installation verification and troubleshooting.

Airflow Optimization Various techniques are available to ensure cooling is getting to where it should be and at the proper conditions: Blanking Panels; Hot / Cold Aisle Containment; Floor Tile Balancing; Increasing Underfloor Static Pressure; CRAC Fan Speed Optimization; Underfloor Baffles; Plenum and Floor Sealing. An invaluable tool in the optimization process is CFD MODELING. It allows you to test potential cooling solutions virtually, before committing to a final design.

Mal-Distributed Air Mal-distributed air is air that does not reach its intended destination, which both wastes energy and limits the amount of cooling available to the IT equipment. Mal-distribution of air is caused by: Leakage through ducts / raised flooring (often ranges from 5% to 15% of the total cooling airflow); Unsealed raised floor cutouts such as cable openings (consider using brush seals, grommets, etc.); Incorrectly located air outlets / perforated floor tiles (air outlets should ONLY be in cold aisles near active IT equipment).

Blanking Panels Blanking panels and side panels are an affordable solution for maximizing supply air effectiveness. [CFD figures: recirculated air where blanking panels or side panels are missing. Without blanking panels or side panels: 50% of racks above the ASHRAE recommended range. With blanking panels & side fillers: 0% of racks above the ASHRAE recommended range.]

Floor Tile Balancing The tile layout must be designed to match the IT equipment in both location and amount of airflow.
ITE Thermal Report (condensed):
Description | Heat Release (Watts) | Airflow (CFM) Typical | Airflow (CFM) Max
Minimum | 420 | 26 | 40
Full | 600 | 30 | 45
Typical | 450 | 26 | 40
Adjust floor damper settings to minimize BYPASS AIR (air that does not enter the server is WASTED CAPACITY). [Figure: bypass air escaping through floor tiles away from the IT intakes]
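To illustrate why tile airflow should match ITE airflow, the minimal sketch below (Python, not from the original presentation; the airflow values are illustrative assumptions) estimates the bypass fraction, i.e. the share of supplied cold air that never passes through a server and is therefore wasted capacity.

```python
def bypass_fraction(tile_supply_cfm: float, ite_demand_cfm: float) -> float:
    """Fraction of supplied cold air that bypasses the IT equipment."""
    if tile_supply_cfm <= 0:
        raise ValueError("Tile supply airflow must be positive")
    bypass = max(tile_supply_cfm - ite_demand_cfm, 0.0)
    return bypass / tile_supply_cfm

# Illustrative cold aisle: tiles deliver 3,200 CFM but the racks only draw 2,400 CFM.
print(f"Bypass air: {bypass_fraction(3200, 2400):.0%}")   # 25% of the supply is wasted
```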

Increasing Underfloor Static Pressure The UNDERFLOOR STATIC PRESSURE is important in maintaining proper airflow throughout the data center. Typical readings range from 0.01 to approximately 0.20 in. W.G. Advantages of higher static pressure: More resiliency in case of lost CRAC unit Higher flow rates through grilles More consistent flow rates How to overcome low static pressure: Increase fan speed Reduce leakage Adjust damper positions

Increasing Underfloor Static Pressure Example 1: Dampers Set to 50% Open Static Pressure: 0.005 to 0.025 in. W.G. Min Airflow: 46 CFM Max Airflow: 990 CFM Standard Deviation: 240 CFM

Increasing Underfloor Static Pressure Example 2: Dampers Set to 20% Open Static Pressure: 0.035 to 0.055 in. W.G. Min Airflow: 418 CFM Max Airflow: 700 CFM Standard Deviation: 61 CFM

Containment [Diagrams: Full Hot Aisle (panels extend up to ceiling, end caps) vs. Partial Hot Aisle (end caps only); Full Cold Aisle (ceiling panels, end caps) vs. Partial Cold Aisle (end caps only)]

Containment Example Data Center Overview Supply Air Temperature = 55°F; 10,000 ft²; Slab-to-Deck Height: 25 ft; 138 ITE cabinets totaling 1080 kW (ranging from 5.2 to 16 kW); 24,000 CFM of Cooling Supply Air

Containment Example Full Hot Aisle Containment Supply Air Temperature = 74°F
Scenario Name | Supply Air Temp (°F) | Containment | Rack Failures | Max Inlet Temp (°F)
Base Model (No Containment) | 55 | None | 4 | 95.9
Full Hot Aisle Containment | 74 | Full | 0 | 82.5

Increase Efficiency & Performance ASHRAE TC 9.9 Green Tips contains good PRACTICAL advice for improving efficiency and performance in existing data centers. Some considerations: Change control setpoints & increase deadbands; Increase operational temperatures; Reduce fan speeds; More efficient humidification.

ASHRAE TC 9.9 Thermal Envelope 2008 Criteria (Legacy Data Centers): Low Temp 64.4°F; High Temp 80.6°F; Low Moisture 41.9°F DP; High Moisture 60% RH & 59°F DP. We are no longer confined to such a small set of operating conditions.

Precision Cooling? 2008 ASHRAE Thermal Guidelines Recommended Envelope: Many Operating Points (290). [Table: count of dry bulb temperature (64.4°F to 80.6°F) and relative humidity (40% to 60%) combinations falling within the recommended envelope, totaling 290 operating points]

ASHRAE TC 9.9 Thermal Envelope 2011 ALLOWABLE The allowable envelope is where the IT manufacturers test their equipment in order to verify that the equipment will function within those environmental boundaries.

Reduce Fan Speeds Fan affinity laws tell us that fan power (P) is proportional to the cube of the fan speed (N), while airflow (Q) is directly proportional. This can translate into SIGNIFICANT energy savings for a fairly small reduction in fan speed.
Fan Speed / Airflow (%) | Fan Power (HP) | Power Savings (%)
100 | 10.00 | 0
90 | 7.29 | 27.1
80 | 5.12 | 48.8
70 | 3.43 | 65.7
Consider a total of 5 CRACs arranged at N+1 running at 100%. What if they run at 80% speed?
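The table follows directly from the affinity laws (airflow proportional to N, power proportional to N cubed). Below is a minimal Python sketch, not from the original presentation, that reproduces the numbers above using the 10 HP baseline from the table; the function and variable names are illustrative only.

```python
BASE_POWER_HP = 10.0  # fan power at 100% speed, per the table above

def fan_power(speed_fraction: float, base_power: float = BASE_POWER_HP) -> float:
    """Affinity laws: airflow scales linearly with speed, power with its cube."""
    return base_power * speed_fraction ** 3

for pct in (100, 90, 80, 70):
    hp = fan_power(pct / 100)
    savings = (1 - hp / BASE_POWER_HP) * 100
    print(f"{pct:>3}% speed: {hp:5.2f} HP, {savings:4.1f}% savings")
```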

Advantages of Raised Temperatures Depending on the configuration of the cooling system, raising the supply air temperature in a data center provides an opportunity for significant ADVANTAGES: Increased chiller capacity; Increased chiller energy efficiency; Ability to take advantage of more economizer hours. In ANY data center, improved separation of hot & cold airstreams and airflow management can INCREASE the ΔT (return temp minus supply temp): Increased CRAC / CRAH capacity; Potentially INCREASED capacity of piping distribution systems; Increased chiller efficiency.
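One way to see why a larger ΔT recaptures capacity: sensible cooling scales linearly with the return-minus-supply temperature difference for a fixed airflow. The sketch below uses the standard sea-level approximation Q (BTU/hr) ≈ 1.08 × CFM × ΔT (°F); it is illustrative only, and the airflow and ΔT values are assumptions, not figures from the presentation.

```python
def sensible_cooling_kw(airflow_cfm: float, delta_t_f: float) -> float:
    """Approximate sensible cooling for air at sea level: 1.08 * CFM * dT (BTU/hr)."""
    btu_per_hr = 1.08 * airflow_cfm * delta_t_f
    return btu_per_hr / 3412.0  # convert BTU/hr to kW

airflow = 100_000  # CFM of CRAH supply air (illustrative)
for delta_t in (15, 20, 25):
    print(f"dT = {delta_t} F -> {sensible_cooling_kw(airflow, delta_t):.0f} kW of cooling")
```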

Available Economizer Hours [Chart: air temperatures for Toronto (air mixing to maintain 59°F min. temp)]

Takeaways Performance can be improved and CAPACITY can be REGAINED through optimizing airflow and cooling. For a given Total Facility load, more IT power capacity can be made available by implementing a more efficient cooling system. Modify control RANGES to ELIMINATE fighting. ASHRAE guidelines show elevated temperature ranges are possible without sacrificing reliability. CFD analysis is an invaluable tool in evaluating current & FUTURE cooling requirements. Continued measurement & monitoring is vital for efficient operation.