Data Center Temperature Design Tutorial Ian Seaton Chatsworth Products

What are we going to do today? Explore a process for determining what data center operating temperature will deliver the best total cost of ownership, while meeting uptime and reliability objectives, for a wide range of boundary variables

What are we going to do today? Background of ASHRAE environmental guidelines for data centers Difference between comfort cooling and data center cooling Minimum best practices Considerations for a design without cooling economization Considerations for a design with cooling economization Considerations for a design without mechanical cooling How to weigh impact of design elements on computer equipment reliability and availability BREAK

What are we going to do today? Quiz Design element assessment practice exercise Special considerations for a building retrofit project Special considerations for an efficiency upgrade project Summary/review

Learning Objectives You will understand how to integrate ASHRAE TCO guidelines into a building design project. You will be able to articulate the differences between comfort cooling and data center cooling. You will be able to apply the different data center class temperature thresholds to the process of evaluating relative merits of competing design concepts. You will be able to de-couple inherent power usage effectiveness (PUE) errors from total building energy use estimates. You will be able to calculate the impact on computer equipment reliability of specific building design strategies.

Appendix C: Detailed Flowchart for the Use and Application of the ASHRAE Data Center Classes. ANSI/BICSI 002-2014, Data Center Design and Implementation Best Practices, Appendix G: Design for Energy Efficiency.

2015 Update: Thermal Guidelines for Data Processing Environments, ASHRAE, 2012

Dry Bulb Temperatures (°F)
Server Class   Recommended     Allowable
A1             64.4 to 80.6    59 to 89.6
A2             64.4 to 80.6    50 to 95
A3             64.4 to 80.6    41 to 104
A4             64.4 to 80.6    41 to 113

Let's Get Started: Desire to Reduce TCO and Explore Higher Temperatures. My data center project is? New Construction / Major Retrofit / Existing, looking for Efficiency Gains

NOTES FOR APPENDIX C FLOW CHARTS
1. To use the highest inlet temperatures, ensure excellent airflow segregation is in place to avoid recirculation of warmer IT outlet flows.
2. Higher-temperature IT equipment loses its primary benefit if mixed with standard IT equipment; a common cooling system must meet the most demanding IT equipment requirements.
3. Ensure the no-chiller option choice meets the availability requirements of the IT workload; investigate site extreme temperature and humidity conditions and economizer uptime risk.
4. Ensure the operational and safety aspects of high hot-aisle temperatures are understood; temperatures of 140°F to 160°F may be expected.
5. Ensure the higher airflow requirement for IT equipment operating above the recommended range is understood; data center airflow may have to increase dramatically.

New Construction. Include best practices: airflow management, containment, VFDs, control cooling at the IT inlet. Will you have an economizer? No / Yes

Economizer? No (Note 1). Operate the data center within the recommended environmental range at the IT inlet. Investigate warmer chilled water temperatures. Procure Class A1 or A2 ITE going forward.

NOTES FOR APPENDIX C FLOW CHARTS
1. To use the highest inlet temperatures, ensure excellent airflow segregation is in place to avoid recirculation of warmer IT outlet flows.
2. Higher-temperature IT equipment loses its primary benefit if mixed with standard IT equipment; a common cooling system must meet the most demanding IT equipment requirements.
3. Ensure the no-chiller option choice meets the availability requirements of the IT workload; investigate site extreme temperature and humidity conditions and economizer uptime risk.
4. Ensure the operational and safety aspects of high hot-aisle temperatures are understood; temperatures of 140°F to 160°F may be expected.
5. Ensure the higher airflow requirement for IT equipment operating above the recommended range is understood; data center airflow may have to increase dramatically.

Maximize Airflow Segregation. When: T_i max = T_SLA. Then: T_i min - 10°F = LWT (chilled water leaving temperature).

Maximize Airflow Segregation

Economizer? No (Note 1). Operate the data center within the recommended environmental range at the IT inlet. Investigate warmer chilled water temperatures. Procure Class A1 or A2 ITE going forward?

Where to procure Class A1 servers?

Economizer? No (Note 1). Operate the data center within the recommended environmental range at the IT inlet. Investigate warmer chilled water temperatures. Assess accelerating the technology refresh.

Chiller Efficiency at Higher Set Points: Why T_i min is so important. Chart from ASHRAE Datacom Series Book 6, Best Practices for Datacom Facility Energy Efficiency.

With Economizer. Will you have an economizer? Yes →
1. Do a detailed TCO analysis considering hours of economization at various Ti (including the added cost of A3/A4 servers and increased energy use at higher Ti).
2. Evaluate the acceptable reliability of higher Ti.
3. Assess the impact of Class A3/A4 servers.
4. Define the optimum Tmax.

Yes, with Economizer

TCO Worksheet, with economizer. Sample scenario: 500 kW total IT load, with 100 cabinets @ an average of 5 kW each, in 5 rows of 20 cabinets. Seven 30-ton CRACs @ N+2 redundancy, on a slab floor with chimney cabinets.

TCO Analysis Steps
1. Obtain relevant outdoor temperature bin data.
2. Determine (estimate) the total sensible cooling load.
3. Obtain an economizer operating cost estimate.
4. Determine the cost of expediting technology refresh from legacy Class A2 servers, and the cost difference between A3 and A4 servers.

Outdoor Ambient Operating Conditions
http://www.ncdc.noaa.gov/oa/climate/climatedata.html
http://weatherbank.com
http://www.thegreengrid.org/en/global/Content/Tools/NAmericanFreeCoolingTool

Water-Side Economization
Parallel economizer: on or off (may limit free-cooling hours); simple pump control; dead band increases approach temperature; should be controlled by an engineer; easier retrofit; better ROI for larger data centers; long, cold winter climates.
Series economizer: partial economization adds free-cooling hours; complex pump control; approach figured on return temperature; can be controlled by automation; more complex retrofit; better ROI for smaller data centers; temperate climates.

Affinity Law Fan Energy Economies. 10 CRAH units with 12 hp fan motors: 797,160 kWh @ 100% rpm; 581,130 kWh @ 90% rpm; 408,146 kWh @ 80% rpm; 273,426 kWh @ 70% rpm.
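As a quick check, the reduced-speed figures above follow directly from the fan affinity laws (power, and therefore energy, varies with the cube of fan speed). A minimal sketch using the slide's 797,160 kWh baseline:

```python
# Fan affinity-law sketch: fan energy varies with the cube of fan speed.
# The 797,160 kWh/yr baseline at 100% rpm is the slide's figure.

BASELINE_KWH = 797_160  # 10 CRAH units with 12 hp fan motors at 100% rpm

def fan_energy_kwh(rpm_fraction: float, baseline_kwh: float = BASELINE_KWH) -> float:
    """Annual fan energy at a reduced speed, per the affinity laws."""
    return baseline_kwh * rpm_fraction ** 3

for fraction in (1.0, 0.9, 0.8, 0.7):
    print(f"{fraction:.0%} rpm -> {fan_energy_kwh(fraction):,.0f} kWh/yr")
# 100% -> 797,160 | 90% -> 581,130 | 80% -> 408,146 | 70% -> 273,426
```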

Affinity Law Fan Energy Economies. 2 MW data center case study example, N+2 CRAH redundancy: $290,765 to operate 44 CRAHs @ 94% fan rpm; $42,075 to operate 44 CRAHs @ 49% fan rpm; $141,417 to operate 24 CRAHs @ 90% rpm; $500,000 capital expense avoidance.

Year One: $1,390,765 with 44 CRAHs and no containment; $1,142,075 with 44 CRAHs and containment; $741,417 with 24 CRAHs and containment.

Year Five: $2,553,823 with 44 CRAHs and no containment; $1,307,743 with 44 CRAHs and containment; $1,307,086 with 24 CRAHs and containment.

Year Ten: $4,087,645 with 44 CRAHs and no containment; $1,518,116 with 44 CRAHs and containment; $2,014,172 with 24 CRAHs and containment.
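The year-one/five/ten totals behave like a one-time capital cost plus N years of fan energy. The sketch below assumes capital figures of roughly $1.1M for 44 CRAHs and $600K for 24 CRAHs (inferred from the year-one totals, not stated on the slide); it reproduces the 24-CRAH column exactly and the 44-CRAH columns to within a few percent, the residual presumably reflecting rounding or escalation in the original worksheet.

```python
# Cumulative-cost sketch: one-time capital plus N years of CRAH fan energy.
# The capital figures below are ASSUMED (inferred from the slide's year-one
# totals); the fan operating costs are the slide's values.

scenarios = {
    "44 CRAHs, no containment": {"capex": 1_100_000, "fan_opex": 290_765},
    "44 CRAHs, containment":    {"capex": 1_100_000, "fan_opex": 42_075},
    "24 CRAHs, containment":    {"capex":   600_000, "fan_opex": 141_417},
}

def cumulative_cost(capex: float, fan_opex: float, years: int) -> float:
    """Capital expense plus flat annual fan energy cost over the period."""
    return capex + fan_opex * years

for years in (1, 5, 10):
    print(f"Year {years}:")
    for name, s in scenarios.items():
        print(f"  {name}: ${cumulative_cost(s['capex'], s['fan_opex'], years):,.0f}")
```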

Energy Use in Data Centers: IT energy (UPS load), cooling equipment, power conversion, lighting.

Cooling Requirements in Data Centers: IT load, electrical room, battery room, power substation, CRAC/CRAH fan motors, power conversion losses, lighting, make-up air, pump motors, bypass air, building and room skin losses.

Energy Use in Data Centers

Determine Sensible Cooling Load

Forecast the Life of the Data Center. Source: Datacom Equipment Power Trends and Cooling Applications, 2nd edition, ASHRAE, 2012, Table 4.1, Volume Server Power Trends.

Apply Power Trends to Plan. How many technology refreshes is the data center being planned to see? How much application growth is expected? Plot refresh growth from the point on the trend curves where you are today. Determine how much of the forecasted load to include in the initial capacity.

Using CFD to fine-tune TCO Analysis: 64% airflow volume, less than 1.5% over-supply, higher supply air temperature.

Using CFD to fine-tune TCO Analysis All server inlet temperatures are within recommended environmental envelope

Server Airflow versus Temperature. An airflow ratio of 1.4 = a 40% increase in airflow; a 40% increase in airflow increases fan energy by a factor of 2.74 (1.4³ = 2.74). Example: a 700-watt server with 70 watts of fan energy @ 68°F; fan energy increases to 191.8 watts @ 86°F, so total server energy increases from 700 to 891.8 watts @ 86°F (Thermal Guidelines for Data Processing Environments, Fourth Edition, ASHRAE).

Server Airflow versus Temperature: Example Implications. 500 servers = 840,000 kWh additional @ 86°F. 4 CRACs @ 0 hours = 1.5 million kWh avoidance; 4 CRACs @ half the year = 750,000 kWh avoidance. Realistic scenario: 10% of the year running CRACs when ambient exceeds 86°F; 50% of the year running no CRACs between 68°F and 86°F; 40% of the year running no CRACs below 68°F.

Server Airflow versus Temperature. Convection heat transfer from a surface to an airflow is governed by the equation q = hA(T_w - T_f), where q = the heat transfer rate, h = the convection heat transfer coefficient, A = the surface area, T_w = the temperature of the surface, and T_f = the temperature of the fluid. Sensible cooling airflow: CFM = 3.1 × W / ΔT, where CFM = cubic feet per minute of airflow, W = watts (sensible heat load), and ΔT = temperature rise in °F.
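A small sketch of the sensible-cooling rule of thumb above; the 25°F temperature rise is an assumed, illustrative value rather than a figure from the presentation.

```python
# Sensible-cooling airflow sketch using the rule of thumb CFM = 3.1 * W / dT
# (dT in degrees F). The 25 F rise below is an assumed illustrative value.

def required_cfm(watts: float, delta_t_f: float) -> float:
    """Airflow (CFM) needed to remove a sensible load at a given temperature rise."""
    return 3.1 * watts / delta_t_f

print(required_cfm(700, 25))    # ~86.8 CFM for the 700 W server example
print(required_cfm(700, 12.5))  # halving the temperature rise doubles the airflow
```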

Server Airflow versus Temperature. Hewlett-Packard research project reported at the 2011 Uptime Symposium:
Server Inlet Temperature   Total Energy Cost   PUE
68°F                       $510                1.27
77°F                       $501                1.20
86°F                       $565                1.19
95°F                       $605                1.18

Server Airflow versus Temperature (chart from Thermal Guidelines for Data Processing Environments, Fourth Edition, ASHRAE).

Server Airflow versus Temperature: Hypothetical Illustrative Example. One-megawatt data center with 1,250 × 800-watt servers; Class A3 servers @ 95°F = 8% fan energy penalty (i.e., 806.4 watts per server); build the new data center in Seattle.

Server Airflow versus Temperature: Seattle Illustrative Example with A3 Servers. 3 hours a year at or above 96°F; 3 hours a year right at 95°F; 3 hours a year at 90-95°F; 24 hours a year at 85-90°F; 102 hours a year at 80-85°F; 220 hours a year at 75-80°F; 442 hours a year at 69-75°F; 7,864 hours at 68°F or lower.

Server Airflow versus Temperature (Seattle fan energy penalty roll-up, at $0.10 per kWh):
3 hours on chiller = no penalty
3 hours @ 8% penalty = $0.24 penalty
3 hours @ 5% penalty = $0.15 penalty
24 hours @ 3.5% penalty = $0.85 penalty
102 hours @ 2.5% penalty = $2.57 penalty
220 hours @ 1% penalty = $2.22 penalty
442 hours @ 0.5% penalty = $2.27 penalty
7,864 hours @ baseline = $0 penalty
Total fan energy penalty for Class A3 servers in the 1 MW Seattle data center = $8.25

Server Airflow versus Temperature. With an allowable maximum server inlet of 95°F, total energy savings for 8,757 hours without a chiller = $195,290. 2013: 200% up to 700% five-year ROI, 33- to 83-week payback. 2015 and following: 2.4 million % one-year ROI, 22-minute payback.

Server Airflow versus Temperature: San Antonio Illustrative Example with A3 Servers. 9 hours a year at or above 96°F; 4 hours a year right at 95°F; 161 hours a year at 90-95°F; 693 hours a year at 85-90°F; 915 hours a year at 80-85°F; 1,657 hours a year at 75-80°F; 1,692 hours a year at 69-75°F; 3,629 hours at 68°F or lower.

Server Airflow versus Temperature (San Antonio fan energy penalty roll-up, at $0.10 per kWh):
9 hours on chiller = no penalty
4 hours @ 8% penalty = $3.20 penalty
161 hours @ 5% penalty = $80.50 penalty
693 hours @ 3.5% penalty = $242.55 penalty
915 hours @ 2.5% penalty = $228.75 penalty
1,657 hours @ 1% penalty = $165.70 penalty
1,692 hours @ 0.5% penalty = $84.60 penalty
3,629 hours @ baseline = $0 penalty
Total fan energy penalty for Class A3 servers in the 1 MW San Antonio data center = $805.30
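The bin-weighted penalty roll-ups above (Seattle and San Antonio alike) reduce to a few lines of arithmetic. The sketch below uses the San Antonio bins and assumes the penalty percentages apply to an aggregate server-fan baseline of roughly 100 kW (1,250 servers at about 80 W of fan power each); that baseline is not stated on the slide, but it reproduces the line items and the $805.30 total.

```python
# Bin-weighted fan-energy penalty sketch for the 1 MW San Antonio example.
# ASSUMPTION (not stated on the slide): the penalty percentages apply to an
# aggregate server-fan baseline of ~100 kW, which reproduces the $805.30 total.

FAN_BASELINE_KW = 100.0   # assumed aggregate server fan power at baseline
RATE = 0.10               # $ per kWh

bins = [                  # (hours per year, fan-energy penalty fraction)
    (9,    0.0),          # at or above 96 F: on the chiller, no fan penalty counted
    (4,    0.08),         # right at 95 F
    (161,  0.05),         # 90-95 F
    (693,  0.035),        # 85-90 F
    (915,  0.025),        # 80-85 F
    (1657, 0.01),         # 75-80 F
    (1692, 0.005),        # 69-75 F
    (3629, 0.0),          # 68 F or lower: baseline, no penalty
]

total = sum(hours * frac * FAN_BASELINE_KW * RATE for hours, frac in bins)
print(f"Annual fan energy penalty: ${total:,.2f}")  # $805.30
```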

Server Airflow versus Temperature. With an allowable maximum server inlet of 95°F, total energy savings for 8,751 hours without a chiller = $195,156. 2013: 200% up to 650% five-year ROI, 34- to 84-week payback. 2015 and following: 24,134% one-year ROI, 1.5-day payback.

Cooling Cost Roll-up, example:
12 kW to operate 5 cooling units @ 64% capacity
41.8 kW to operate the chiller @ 65°F leaving water temperature
0.06 kW for fresh air (N+1)
0.56 kW for the electrical room (N+1)
0.06 kW for the battery room (N+1)
15 kW for the substation (N+1, but not simultaneously running redundancy)
608,119 kWh per year
$58,800 cooling budget at $0.10 per kWh (0.23 annualized mechanical load component)

TCO Calculation


The Big Worry: How do we know the effect of temperature on ICT equipment reliability and life? How do we know how long it is OK to operate in the allowable temperature range? The X-Factor:
1. Baseline = 24/7 operation @ 68°F server inlet temperature.
2. Based on OEM historical data from user failure reports.
3. Performance variations above, at, and below the 68°F baseline.
4. The premise is that the data center temperature follows Mother Nature.

Evaluate acceptable reliability of higher inlet temperatures. Failure rate x-factor by inlet temperature:
Ti (°F)   Lower Bound   Average Bound   Upper Bound
59.0      0.72          0.72            0.72
63.5      0.80          0.87            0.95
68.0      0.88          1.00            1.14
72.5      0.96          1.13            1.31
77.0      1.04          1.24            1.43
81.5      1.12          1.34            1.54
86.0      1.19          1.42            1.63
90.5      1.27          1.48            1.69
95.0      1.35          1.55            1.74
99.5      1.43          1.61            1.78
104.0     1.51          1.66            1.81
108.5     1.59          1.71            1.83
113.0     1.67          1.76            1.84
Thermal Guidelines for Data Processing Environments, Fourth Edition, ASHRAE TC 9.9

Implications of X-Factor. If 1,000 servers produced 30 failures during the warranty period (a 3% failure rate) running all year at 68°F, then those servers would produce 44 failures running all year @ 90°F inlet temperature. BUT a profile of 438 hours below 59°F, 876 hours at 59°F, 2,190 hours @ 63.5°F, 3,066 hours @ 68°F, 1,314 hours @ 72.5°F, 438 hours @ 77°F, and 438 hours @ 81.5°F results in 29 failures, or one fewer than running all year @ 68°F.
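A sketch of the time-at-temperature weighting described above, using the average-bound x-factors from the ASHRAE table; the hours below 59°F are assumed here to use the 59°F value, which reproduces the slide's result of roughly 29 failures.

```python
# Time-at-temperature weighted x-factor sketch, using the average-bound values
# from the ASHRAE table above. Hours below 59 F are assumed to use the 59 F
# x-factor (an assumption; the slide does not say how it treats them).

BASELINE_FAILURES = 30   # failures during warranty for 1,000 servers at 68 F, 24/7

bins = [                 # (hours per year, average-bound x-factor)
    (438,  0.72),        # below 59 F
    (876,  0.72),        # at 59 F
    (2190, 0.87),        # at 63.5 F
    (3066, 1.00),        # at 68 F
    (1314, 1.13),        # at 72.5 F
    (438,  1.24),        # at 77 F
    (438,  1.34),        # at 81.5 F
]

total_hours = sum(h for h, _ in bins)                   # 8,760
weighted_x = sum(h * x for h, x in bins) / total_hours  # ~0.97
print(round(weighted_x, 3))
print(round(BASELINE_FAILURES * weighted_x))            # ~29 failures
```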

Implications of X-Factor. Thermal Guidelines for Data Processing Environments, Fourth Edition, ASHRAE TC 9.9

Evaluate acceptable reliability of higher inlet temperatures. Time-at-temperature weighted failure rate calculations for IT equipment in Chicago:
Inlet Temperature     X-Factor   % of Hours
59°F ≤ T ≤ 68°F       0.865      67.6%
68°F ≤ T ≤ 77°F       1.13       17.2%
77°F ≤ T ≤ 86°F       1.335      10.6%
86°F ≤ T ≤ 95°F       1.482      4.6%
Net x-factor = 0.99
2011 Thermal Guidelines for Data Processing Environments - Expanded Data Center Classes and Usage Guidance, white paper, ASHRAE TC 9.9

TCO Preliminary Results. Analysis shows positive returns? No → Operate the data center within the recommended environmental range at the IT inlet; investigate warmer chilled water temperature; procure Class A1 or A2 ITE; think technology refresh. Yes → Will you have a chiller? Yes / No

Economizer with Chiller (Notes 4, 5). Yes → Set the economizer to provide free cooling from 59°F up to the lowest-TCO Tmax; select ITE for Ti as: A1: 90°F, A2: 95°F, A3: 104°F, A4: 113°F. Ensure all data center components are capable of meeting the ASHRAE class.

NOTES FOR APPENDIX C FLOW CHARTS
1. To use the highest inlet temperatures, ensure excellent airflow segregation is in place to avoid recirculation of warmer IT outlet flows.
2. Higher-temperature IT equipment loses its primary benefit if mixed with standard IT equipment; a common cooling system must meet the most demanding IT equipment requirements.
3. Ensure the no-chiller option choice meets the availability requirements of the IT workload; investigate site extreme temperature and humidity conditions and economizer uptime risk.
4. Ensure the operational and safety aspects of high hot-aisle temperatures are understood; temperatures of 140°F to 160°F may be expected.
5. Ensure the higher airflow requirement for IT equipment operating above the recommended range is understood; data center airflow may have to increase dramatically.

CRAC/CRAH Selection. Criteria: cooling capacity requirement (growth + redundancy); efficiency (EER/SEER or COP); free-cooling opportunities.
Example unit efficiencies: CRAH-1 w/EC, EER = 58; CRAH-2 w/EC, EER = 133; CRAH-3 centrifugal, EER = 43; close-coupled, EER = 38; CRAC-1, EER = 14; CRAC-2, EER = 11.
kW per ton:
Chiller Type                        New             Existing
Reciprocating                       0.78 to 0.85    0.90 to 1.2 or higher
Screw                               0.62 to 0.75    0.75 to 0.85 or higher
Centrifugal (high efficiency)       0.50 to 0.62    NA
Centrifugal (moderate efficiency)   0.63 to 0.70    0.70 to 0.80 or higher
CRAC (DX)                           1.0 to 2.0      1.5 to 2.5 or higher
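To put the EER figures above on the same footing as the kW-per-ton chiller table, a simple conversion can be sketched: one ton of cooling is 12,000 BTU/h and EER is BTU/h of cooling per watt of input, so kW/ton = 12 / EER. The unit names and EER values below are the slide's labels.

```python
# EER-to-kW/ton conversion sketch (kW/ton = 12 / EER, since one ton of cooling
# is 12,000 BTU/h and EER is BTU/h of cooling per watt of electrical input).

units = {
    "CRAH-1 w/EC":        58,
    "CRAH-2 w/EC":        133,
    "CRAH-3 centrifugal": 43,
    "Close-coupled":      38,
    "CRAC-1":             14,
    "CRAC-2":             11,
}

for name, eer in units.items():
    print(f"{name}: EER {eer} -> {12 / eer:.2f} kW/ton")
```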

Economizer with Chiller (Notes 4, 5). Yes → Set the economizer to provide free cooling from 59°F up to the lowest-TCO Tmax; select ITE for Ti as: A1: 90°F, A2: 95°F, A3: 104°F, A4: 113°F. Ensure all data center components are capable of meeting the ASHRAE class.

Annual Hours of Operation with Mechanical Cooling and Class A2 Servers
Columns (economizer with data center containment): HA/CA | Water-Side | Wheel | Air-Side | IEC
At recommended inlet temperatures:
Chicago:      70%   | 14.6% | 14.6% | 12.7% | 10.2%
Seattle:      92.5% | 4%    | 2.9%  | 2.4%  | 0%
San Antonio:  98%   | 58.4% | 36.5% | 33.4% | 42.2%
Denver:       71.9% | 6.8%  | 13%   | 12%   | 22.2%
At Class A2 allowable inlet temperatures:
Chicago:      70%   | 0%    | 0.2%  | 0%    | 0%
Seattle:      92.5% | 0%    | 0.1%  | 0%    | 0%
San Antonio:  98%   | 0.9%  | 1%    | 0.1%  | 18.6%
Denver:       71.9% | 0%    | 2%    | 0.7%  | 0%

Annual Hours of Operation with Mechanical Cooling and Class A3 Servers
Columns (economizer with data center containment): HA/CA | Water-Side | Wheel | Air-Side | IEC
At recommended inlet temperatures:
Chicago:      70%   | 14.6% | 14.6% | 12.7% | 10.2%
Seattle:      92.5% | 4%    | 2.9%  | 2.4%  | 0%
San Antonio:  98%   | 58.4% | 36.5% | 33.4% | 42.2%
Denver:       71.9% | 6.8%  | 13%   | 12%   | 22.2%
At Class A3 allowable inlet temperatures:
Chicago:      70%   | 0%    | 0%    | 0%    | 0%
Seattle:      92.5% | 0%    | 0%    | 0%    | 0%
San Antonio:  98%   | 0.6%  | 0%    | 0%    | 0.6%
Denver:       71.9% | 0%    | 0%    | 0%    | 0%

Economizer without Chiller (Notes 3, 4, 5). No → Set the economizer to provide free cooling from 59°F up to the maximum annual outdoor temperature; select ITE for Ti as: A1: 90°F, A2: 95°F, A3: 104°F, A4: 113°F. Ensure all data center components are capable of meeting the ASHRAE class.

NOTES FOR APPENDIX C FLOW CHARTS
1. To use the highest inlet temperatures, ensure excellent airflow segregation is in place to avoid recirculation of warmer IT outlet flows.
2. Higher-temperature IT equipment loses its primary benefit if mixed with standard IT equipment; a common cooling system must meet the most demanding IT equipment requirements.
3. Ensure the no-chiller option choice meets the availability requirements of the IT workload; investigate site extreme temperature and humidity conditions and economizer uptime risk.
4. Ensure the operational and safety aspects of high hot-aisle temperatures are understood; temperatures of 140°F to 160°F may be expected.
5. Ensure the higher airflow requirement for IT equipment operating above the recommended range is understood; data center airflow may have to increase dramatically.

Economizer without Chiller (Notes 3, 4, 5). No → Set the economizer to provide free cooling from 59°F up to the maximum annual outdoor temperature; select ITE for Ti as: A1: 90°F, A2: 95°F, A3: 104°F, A4: 113°F. Ensure all data center components are capable of meeting the ASHRAE class.

Final Note
No single design concept is best for all situations.
Well-executed airflow management is the foundation for everything.
Newer ICT equipment opens more design options.
Free cooling does not necessarily mean exposure to outside elements.
ICT procurement/specification decisions affect mechanical design and performance.
Mechanical design decisions affect ICT deployment.
Consider EVERYTHING while you are still in the design phase.

10 MINUTE BREAK

QUICK REVIEW

What do you remember from before the break? What are the differences between comfort cooling and data center cooling?

What do you remember from before the break? What are the ASHRAE recommended temperature thresholds?

What do you remember from before the break? What are the ASHRAE allowable temperature thresholds for class A3 servers?

What do you remember from before the break? Where can you buy Class A1 servers today?

What do you remember from before the break? How can your PUE go down and your operating costs actually increase for doing the same amount of work?

What do you remember from before the break? How do you determine if your PUE anomaly is actually a serious issue?

What do you remember from before the break? What might make an x-factor of 1.3 acceptable?

Let's apply that:
1. Open the file BICSI Practice Exercise w-omaha data.xls from the provided USB device.
2. Go to the Input tab.
3. Enter data only in yellow-shaded cells.
4. N.B.: Use only ONE approach temperature (not both wet and dry).
5. Use the AFM (airflow management) factor on the card provided.
6. You can pick the maximum server inlet temperature, the class of servers, and the type of economization.

Why We're Not Examining Class A1 Servers Today. The Class A1 server chart shows fan airflow; each point is an airflow ratio that needs to be cubed to get fan energy. At 95°F the ratio is 1.78, and 1.78³ = 5.64, or 564%. 40 watts × 564% = 225.6 watts of fan energy. For a 400-watt server with 40 watts of fan power: 400 - 40 + 225.6 = 585.6 watts. 2,500 servers = 1,464 kW total server energy = a 464 kW server fan energy penalty.

Why We're Examining Class A2 and A3 Servers Today. The Class A2 and A3 charts show actual server energy; each point is a total server energy variation. At 95°F the factor is 1.125, or 112.5%. 400 watts × 112.5% = 450 watts. 2,500 servers × 450 watts = 1,125 kW total server energy = a 125 kW server fan penalty.
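A sketch of the two calculations above, showing why the A1 chart's airflow ratio must be cubed while the A2/A3 chart values can be used directly. The wattages and chart ratios are the slide's illustrative values; the 400 - 40 split of server versus fan power is how the slide's 585.6-watt result is reconstructed here.

```python
# Sketch of the two fan-penalty calculations. Class A1 charts report fan
# AIRFLOW, so the chart ratio must be cubed to get fan energy; Class A2/A3
# charts report TOTAL server energy and can be used directly.

SERVERS = 2500
SERVER_W = 400      # baseline total server power, watts
FAN_W = 40          # baseline fan power, watts (Class A1 example)

# Class A1 at 95 F: airflow ratio 1.78 -> fan energy ratio 1.78**3 (~5.64)
a1_fan_w = FAN_W * 1.78 ** 3                   # ~225.6 W
a1_server_w = (SERVER_W - FAN_W) + a1_fan_w    # ~585.6 W
a1_total_kw = SERVERS * a1_server_w / 1000     # ~1,464 kW
print(a1_total_kw, a1_total_kw - SERVERS * SERVER_W / 1000)  # ~464 kW penalty

# Class A3 at 95 F: total server energy ratio 1.125 from the A3 chart
a3_server_w = SERVER_W * 1.125                 # 450 W
a3_total_kw = SERVERS * a3_server_w / 1000     # 1,125 kW
print(a3_total_kw, a3_total_kw - SERVERS * SERVER_W / 1000)  # 125 kW penalty
```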

Let's apply that:
1. Open the file Omaha BICSI Practice Exercise.xls from the provided USB device.
2. Go to the Input tab.
3. Enter data only in yellow-shaded cells.
4. N.B.: Use only ONE approach temperature (not both wet and dry).
5. Use the AFM (airflow management) factor on the card provided.
6. You can pick the maximum server inlet temperature, the class of servers, and the type of economization.
7. Enter your server failure experience.
8. Enter your actual server inlet temperature experience.
9. Give a short report on findings.

Let's apply that. Points of reference: 2.0 PUE = 8,760,000 kWh for cooling energy; 1.5 PUE = 3,504,000 kWh for cooling energy; 1.25 PUE = 1,314,000 kWh for cooling energy. WRITE THAT DOWN!

Let's apply that. GO TO WORK. WHAT DID YOU DISCOVER?

The Rest of the Story. Desire to Reduce TCO and Explore Higher Temperatures. My data center project is? New Construction / Major Retrofit / Existing, looking for Efficiency Gains

Major Retrofit. Include best practices: airflow management, containment, VFDs, control cooling at the IT inlet. Do you have or are you adding an economizer? Yes / No. Is the new space thermally and airflow-segregated from existing IT? Yes / No

With Economizer and Segregated New Space. Yes →
1. Do a detailed TCO analysis considering hours of economization at various Ti (including the added cost of A3/A4 servers and increased energy use at higher Ti).
2. Evaluate the acceptable reliability of higher Ti.
3. Assess the impact of Class A3/A4 servers.
4. Define the optimum Tmax.
Analysis shows positive returns? No / Yes → Will you have a chiller? Yes / No

With Economizer and Segregated New Space Segregated high density space with cold aisle containment and partitioned return air space

Desire to Reduce TCO and Explore Higher Temperatures. My data center project is? New Construction / Major Retrofit / Existing, looking for Efficiency Gains

Existing Data Center, Looking for Efficiency Gains. Include best practices: airflow management, containment, VFDs, control cooling at the IT inlet. Do you have good airflow management (Rack Cooling Index and Return Temperature Index)? No → Implement best practices for airflow management: hot/cold aisle arrangement, containment, blanking panels, correct airflows, VFDs (see BICSI 002-2011, the ASHRAE Datacom book series, and the EU CoC best practices). Yes → continue.

Rack Index Over Temperature Rack Cooling Effectiveness in Data Centers and Telecom Central Offices: The Rack Cooling Index (RCI), Magnus Herrlin, ASHRAE Transactions, Volume 111, Part 2 (DE-05-11-2)

Rack Index Under Temperature Rack Cooling Effectiveness in Data Centers and Telecom Central Offices: The Rack Cooling Index (RCI), Magnus Herrlin, ASHRAE Transactions, Volume 111, Part 2 (DE-05-11-2)

Rack Index Under Temperature. Rack Cooling Effectiveness in Data Centers and Telecom Central Offices: The Rack Cooling Index (RCI), Magnus Herrlin, ASHRAE Transactions, Volume 111, Part 2 (DE-05-11-2). Ian's suggestion for an improvement target: set your minimum recommended temperature within 3°F of your maximum recommended temperature, set your minimum allowable temperature 1-2°F below your minimum recommended temperature, and set 100% as your project target.

Existing Data Center, Looking for Efficiency Gains, with Effective Airflow Management. Do you measure/control cooling @ the IT inlet temperature? No → Correct the control strategy. Yes → Do you have an economizer? No / Yes

NOTES FOR APPENDIX C FLOW CHARTS
1. To use the highest inlet temperatures, ensure excellent airflow segregation is in place to avoid recirculation of warmer IT outlet flows.
2. Higher-temperature IT equipment loses its primary benefit if mixed with standard IT equipment; a common cooling system must meet the most demanding IT equipment requirements.
3. Ensure the no-chiller option choice meets the availability requirements of the IT workload; investigate site extreme temperature and humidity conditions and economizer uptime risk.
4. Ensure the operational and safety aspects of high hot-aisle temperatures are understood; temperatures of 140°F to 160°F may be expected.
5. Ensure the higher airflow requirement for IT equipment operating above the recommended range is understood; data center airflow may have to increase dramatically.

Final Note
No single design concept is best for all situations.
Well-executed airflow management is the foundation for everything.
Newer ICT equipment opens more design options.
Free cooling does not necessarily mean exposure to outside elements.
ICT procurement/specification decisions affect mechanical design and performance.
Mechanical design decisions affect ICT deployment.
Consider EVERYTHING while you are still in the design phase.

QUESTIONS? Ian Seaton Chatsworth Products, Inc. iseaton@chatsworth.com (512) 809-5767