Is Data Center Free Cooling Feasible in the Middle East?
By Noriel Ong, ASEAN Eng., PMP, ATD, PQP

This case study evaluates the feasibility of data center free cooling in the Middle East using indirect evaporative cooling (IEC).

General Assumptions:
- 1 MW data center IT load
- Cooling demand at 100% IT load
- The data center is pressurized, with no infiltration of outside air into the white space
- The analysis focuses on the white space only; the energy consumed cooling other spaces is not part of this case study
- Data center cabinets are arranged in a hot-aisle/cold-aisle configuration with containment

Evaporative Cooling

Evaporative cooling uses water as a medium to cool air through its evaporation. The adiabatic evaporation of water provides the cooling effect, which requires less energy than the typical vapor-compression system used for air conditioning. There are two types of evaporative cooling: direct and indirect.

According to the ASHRAE HVAC Systems and Equipment Handbook, with direct evaporative cooling, water evaporates directly into the airstream, reducing the air's dry-bulb temperature and raising its humidity. Direct evaporative equipment cools air by direct contact with the water, either through an extended wetted-surface material (e.g., packaged air coolers) or with a series of sprays (e.g., an air washer).

With indirect evaporative cooling, on the other hand, secondary air removes heat from the primary air through a heat exchanger. The water does not make any contact with the primary air - in this case, the data center air. In one indirect method, water is evaporatively cooled by a cooling tower and circulated through a heat exchanger; supply air to the space passes over the other side of the heat exchanger. In another common method, one side of an air-to-air heat exchanger is wetted and removes heat from the conditioned supply airstream on the dry side. See Figure 1.

Figure 1. A typical packaged IEC unit for data centers.
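To put the energy argument in rough numbers, the short sketch below (not part of the original study) compares the water that would have to evaporate to absorb a 1 MW heat load against the compressor power a vapor-compression plant would draw for the same load. The latent heat value, water density, and the COP of 3 are assumed round figures for illustration only.

```python
# Illustrative comparison (not from the study): rejecting 1 MW of IT heat by
# evaporating water vs. by a vapor-compression chiller.
# Assumed values: latent heat of vaporization ~2,430 kJ/kg near 30 C,
# water density ~1,000 kg/m3, chiller COP of 3.0.

HEAT_LOAD_KW = 1_000.0            # 1 MW IT load
LATENT_HEAT_KJ_PER_KG = 2_430.0   # latent heat of vaporization of water (approx.)
WATER_DENSITY_KG_PER_M3 = 1_000.0
CHILLER_COP = 3.0                 # assumed coefficient of performance

# Water that must evaporate to carry away the full load (idealized upper bound).
evap_kg_per_s = HEAT_LOAD_KW / LATENT_HEAT_KJ_PER_KG
evap_m3_per_h = evap_kg_per_s * 3600.0 / WATER_DENSITY_KG_PER_M3

# Compressor power a vapor-compression plant would draw for the same load.
chiller_power_kw = HEAT_LOAD_KW / CHILLER_COP

print(f"Evaporation rate to absorb 1 MW: {evap_m3_per_h:.1f} m3/h of water")
print(f"Chiller input power at COP {CHILLER_COP:.0f}: {chiller_power_kw:.0f} kW")
```

Under these assumptions, roughly 1.5 m3/h of evaporated water carries the same heat that would otherwise require on the order of 330 kW of compressor power, which is why evaporative approaches trade a modest water bill for a large electricity saving.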

Operation Modes

Dry Operation: When the ambient (secondary/scavenger) air dry-bulb temperature is cold enough to cool the primary (data center supply) air, the water sprayer can be switched off to save water. The primary air (completely isolated from the secondary air - no mixing) is recirculated by the primary fans and cooled as it passes through the heat exchanger. Concurrently, the secondary air absorbs heat from the heat exchanger and rejects it to the ambient air.

Wet Operation: When the ambient (secondary/scavenger) dry-bulb temperature is not cold enough for dry operation, but the wet-bulb temperature is low enough for IEC without the support of mechanical cooling, the evaporative cooler switches to wet operation. The primary air (again completely isolated from the secondary air) is recirculated by the primary fans and cooled in the heat exchanger, while on the secondary side the outside surface of the heat exchanger is wetted by the water sprayer to evaporatively cool the secondary air. The secondary air absorbs heat from the heat exchanger and rejects it to the ambient air.

Wet Operation Supported by Mechanical Cooling: When the ambient wet-bulb temperature is not low enough to achieve the required primary (data center supply) air temperature set point, the mechanical cooling system (direct expansion, in this case) operates in tandem with the wet operation of the evaporative cooler to achieve the required set point.

Free Cooling Hours

For the purpose of this case study, we start in the Kingdom of Saudi Arabia (KSA), specifically the city of Riyadh. This is a good starting point because KSA has the biggest market share in the Middle East and the data center industry there is starting to mature.

Figure 2. Riyadh's weather bin data on a psychrometric chart.

The first step in evaluating the free cooling potential is to understand the weather conditions of the city and how they coincide with the ASHRAE TC 9.9 Thermal Guidelines. Psychrometric software was used to plot the International Weather for Energy Calculation (IWEC) bin weather data for Riyadh, KSA. Overlaid on that data are the ASHRAE TC 9.9 Thermal Guidelines Class A1 recommended and allowable limits. See Figure 2.

Figure 2 shows the weather bin data plotted on the psychrometric chart. Bin data refers to hourly weather data sorted into discrete groups, or bins; each bin contains the average number of hours per year during which a particular range of weather conditions occurs - in this case, the weather conditions in Riyadh, KSA. The solid red lines near the center of the chart are the ASHRAE TC 9.9 Class A1 recommended and allowable envelopes, respectively. The solid vertical black line is the constant dry-bulb temperature line of 48°F [9°C], and the diagonal black line is the constant wet-bulb temperature line of 61°F [16°C]. The colors in the chart represent the frequency of hours within the specified dry-bulb and wet-bulb temperature bins over the year, with red representing higher frequency and blue representing lower frequency (see legend).

The area to the left of the constant dry-bulb temperature line, 48°F [9°C], shows the free cooling hours available in dry operation of the IEC. Between the constant dry-bulb line, 48°F [9°C], and the constant wet-bulb line, 61°F [16°C], are the free cooling hours available in wet operation. The hours above the constant wet-bulb line of 61°F [16°C] are the hours when mechanical cooling is required; during these hours, evaporative cooling continues while the mechanical system supplements it to deliver the design supply air temperature.

The constant dry-bulb and wet-bulb temperatures shown on the psychrometric chart were calculated using the following equation and assumptions in order to estimate the free cooling hours of the indirect evaporative cooler:

WBDE = 100 × (t1 − t2) / (t1 − ts)

where:
WBDE = wet-bulb depression efficiency, %
t1 = dry-bulb temperature of entering primary air (data center return air temperature), °F [°C]
t2 = dry-bulb temperature of leaving primary air (data center supply air temperature), °F [°C]
ts = wet-bulb temperature of entering secondary air (outside ambient air temperature), °F [°C]

According to the 2008 ASHRAE HVAC Systems and Equipment Handbook, Chapter 40, an indirect evaporative heat exchanger is capable of a 60% to 80% approach of the entering primary air (data center return air) dry-bulb temperature to the entering secondary (outside ambient) airflow wet-bulb temperature. For this case study, t1 = 100°F [38°C], t2 = 77°F [25°C], WBDE = 60%, and the ambient wet-bulb temperature at each time of the year were used to further validate the results of Figures 2 and 3. In addition, the heat exchanger was assumed to be less efficient during dry operation, with an efficiency of 45% as provided by the manufacturer; in dry operation the calculation uses the ambient dry-bulb temperature in place of the wet-bulb temperature.

Figure 3 shows the annual hours of the different operating modes as percentages, while Figure 4 shows a stacked area chart of the operating-mode hours per month. Based on these results, a substantial reduction in power consumption can be realized: approximately 82% of the year requires no mechanical cooling.
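As a worked check of the two changeover lines, the sketch below applies the wet-bulb depression relation to the stated assumptions (t1 = 100°F, t2 = 77°F, 60% wet-mode and 45% dry-mode efficiency) and then classifies weather records into the three operating modes. The limits it returns (roughly 49°F dry-bulb and 62°F wet-bulb) are consistent with the 48°F and 61°F lines used in Figure 2. The bin records and function names are placeholders of my own, not the actual Riyadh IWEC data.

```python
# Sketch of the threshold calculation and mode classification described above.
# WBDE = 100 * (t1 - t2) / (t1 - ts) is solved for the limiting ambient condition.
# Study values: t1 = 100 F return air, t2 = 77 F target supply air, 60% efficiency
# in wet mode (against ambient wet-bulb), 45% in dry mode (against ambient dry-bulb).
# The weather records below are placeholders, not the Riyadh IWEC bin data.

T1_RETURN_F = 100.0   # entering primary air (data center return), deg F
T2_SUPPLY_F = 77.0    # leaving primary air (target supply), deg F
EFF_WET = 0.60        # wet-mode wet-bulb depression efficiency
EFF_DRY = 0.45        # dry-mode heat-exchanger efficiency (per manufacturer)

# Limiting ambient wet-bulb for wet (evaporative-only) operation:
# t2 = t1 - EFF_WET * (t1 - ts)  =>  ts = t1 - (t1 - t2) / EFF_WET
wet_bulb_limit_f = T1_RETURN_F - (T1_RETURN_F - T2_SUPPLY_F) / EFF_WET   # ~61.7 F

# Limiting ambient dry-bulb for dry operation (same relation, dry-bulb based):
dry_bulb_limit_f = T1_RETURN_F - (T1_RETURN_F - T2_SUPPLY_F) / EFF_DRY   # ~48.9 F

def operating_mode(ambient_db_f: float, ambient_wb_f: float) -> str:
    """Classify an hour of weather into one of the three IEC operating modes."""
    if ambient_db_f <= dry_bulb_limit_f:
        return "dry"       # sprays off, air-to-air heat exchange only
    if ambient_wb_f <= wet_bulb_limit_f:
        return "wet"       # evaporative cooling, no mechanical assist
    return "wet+dx"        # evaporative cooling supplemented by DX

# Placeholder bin records: (dry-bulb F, wet-bulb F, hours in that bin).
sample_bins = [(45.0, 40.0, 300), (70.0, 55.0, 4000), (105.0, 70.0, 1500)]

hours = {"dry": 0, "wet": 0, "wet+dx": 0}
for db, wb, h in sample_bins:
    hours[operating_mode(db, wb)] += h

total = sum(hours.values())
for mode, h in hours.items():
    print(f"{mode:>6}: {h:5d} h ({100.0 * h / total:.0f}% of sampled hours)")
```

Running the same classification over the full 8,760-hour Riyadh data set is what produces the mode shares summarized in Figures 3 and 4.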

Figure 3. Indirect evaporative cooler: annual operating-mode contributions.

Figure 4. Indirect evaporative cooler: monthly operating modes.

Free Cooling Hours Check: Calculation of the Primary (Data Center Supply) Air Temperature

In Figure 5, the ASHRAE TC 9.9 Class A1 recommended and allowable supply air temperature limits are shown as red dashed and solid lines, respectively. The straight gold line is the target supply air temperature; the blue spiked line depicts the hourly supply air temperatures produced by the IEC unit. These were calculated with the same equation and assumptions used to derive the constant dry-bulb and wet-bulb temperatures in Figure 2, but applied to the hourly dry-bulb and wet-bulb temperatures of Riyadh to solve for the resulting supply temperature, rather than solving for the ambient limits at the target supply air temperature as in Figure 2.

Figure 5. Hourly supply air temperature.

According to Thermal Guidelines for Data Processing Environments, 3rd Edition (ASHRAE Datacom Series 1), IT manufacturers recommend that data center operators maintain their environment within the recommended envelope for extended periods of time. Exceeding the recommended limits for short periods should not be a problem, but running near the allowable limits for months could result in increased reliability issues.

The results in Figure 5 show fluctuations in supply air temperature caused by the varying capacity available through the heat exchanger. The calculation also assumed constant primary and secondary air volumes; in practice, the secondary air volume is modulated to achieve the target supply air temperature in the primary airflow. Moreover, the energy impact of modulating the secondary fan is not examined in this case study, since the focus is on validating and estimating the free cooling hours in Riyadh, KSA.
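The same relation can be applied hour by hour to estimate the supply air temperature the IEC alone would deliver, which is essentially what Figures 5 and 6 summarize. The sketch below is a simplified illustration under the constant-air-volume assumption noted above; the three weather records stand in for the full 8,760-hour Riyadh data set, and the function name is my own.

```python
# Sketch of the hourly supply-air temperature calculation behind Figures 5 and 6.
# Each hour, the achievable IEC supply temperature is estimated from the same
# wet-bulb depression relation: dry mode against the hourly dry-bulb, wet mode
# against the hourly wet-bulb. Hours whose result exceeds the 77 F target are the
# hours that need the DX section to trim the supply temperature.

T1_RETURN_F = 100.0
T2_TARGET_F = 77.0
EFF_WET, EFF_DRY = 0.60, 0.45

def iec_supply_temp_f(ambient_db_f: float, ambient_wb_f: float) -> float:
    """Supply temperature the IEC alone would deliver for one hour, deg F."""
    dry_t2 = T1_RETURN_F - EFF_DRY * (T1_RETURN_F - ambient_db_f)
    if dry_t2 <= T2_TARGET_F:
        return dry_t2  # dry mode is sufficient, sprays stay off
    return T1_RETURN_F - EFF_WET * (T1_RETURN_F - ambient_wb_f)  # wet mode

# Placeholder hourly records: (dry-bulb F, wet-bulb F).
hourly_weather = [(46.0, 41.0), (75.0, 58.0), (108.0, 67.0)]

supply_temps = [iec_supply_temp_f(db, wb) for db, wb in hourly_weather]
dx_hours = sum(1 for t in supply_temps if t > T2_TARGET_F)

print("Hourly IEC-only supply temps (F):", [round(t, 1) for t in supply_temps])
print(f"Hours needing DX assist: {dx_hours} of {len(hourly_weather)}")
```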

Figure 6 shows the total hours of the calculated supply air temperatures throughout the year, expressed as percentages within temperature ranges. The target supply air temperature, t2 = 77°F [25°C], is achievable 82% of the year without mechanical cooling. Roughly 18% of the year would exceed the desired supply air temperature and would therefore require mechanical cooling to meet the target. As in Figure 5, the results in this figure assume a primary-to-secondary air volume ratio of 1, so the energy savings from modulating the secondary fan are not covered in this case study.

Figure 6. Annual supply air temperature contribution.

Operating Cost Impact

The basic operating costs of maintaining a 1 MW data center environment using a traditional chilled water system and an IEC system were compared and analyzed. For the comparison, the annual power consumption of a chilled water system composed of an air-cooled chiller, a chilled water pump, and computer room air handling (CRAH) units was compared with that of the IEC system. The data center is assumed to be fully loaded and static. Redundant units and ancillary equipment supporting the data center were neglected; only the active components supporting the cooling of the white space were accounted for. The mechanical cooling (DX) sections of the indirect evaporative coolers were assumed to operate whenever the ambient wet-bulb temperature exceeded 61°F [16°C]. For both systems, a dedicated outdoor air system (DOAS) was assumed to provide humidity control and room pressurization while supplying the code minimum of outside air; the DOAS energy therefore cancels out between the two systems and need not be considered in this study.

Assumptions and calculation parameters:

Indirect evaporative cooling system
- Number of indirect evaporative coolers: 5
- Number of primary fans: 8
- Number of secondary fans: 8
- Number of water pumps: 1
- Primary (indoor) fan power consumption: 13.2 kW
- Secondary (outdoor/scavenger) fan power consumption: 13.2 kW
- Water pump power consumption: 6 kW
- 34-ton integral DX system power consumption: 18 kW
- Water consumption: 0.6 m³/h

Chilled water system
- Number of air-cooled chillers: 1
- Number of chilled water pumps: 1
- Number of CRAH units: 5
- 287-ton air-cooled chiller power consumption: 407 kW
- Chilled water pump power consumption: 18 kW
- 60-ton CRAH unit power consumption: 13.2 kW

Tariffs
- Power: $0.069/kWh
- Water: $1.6/m³

Figure 7. Annual power and water consumption cost comparison.
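A simplified version of the cost comparison can be sketched from the parameters above. The list does not state whether the fan, pump, and DX figures are system totals or per IEC unit, so the sketch below reads them as per-unit values, runs the DX sections only during the roughly 18% of hours that need mechanical assist, applies the spray pumps and water use to an assumed 60% of the year, and holds the chilled water plant at its listed power year-round. The result is indicative only and is not a reproduction of Figure 7.

```python
# Sketch of the annual operating-cost comparison behind Figure 7, under the
# interpretation and assumptions stated in the text above.

HOURS_PER_YEAR = 8_760
POWER_TARIFF = 0.069      # $/kWh
WATER_TARIFF = 1.6        # $/m3

N_IEC_UNITS = 5
DX_FRACTION = 0.18        # share of the year needing mechanical assist (study result)
WET_FRACTION = 0.60       # assumed share of the year with sprays on (wet + wet/DX)

# Indirect evaporative cooling system (kW figures read as per-unit values).
iec_fan_kw = N_IEC_UNITS * (13.2 + 13.2)   # primary + secondary fans
iec_pump_kw = N_IEC_UNITS * 6.0            # spray water pumps
iec_dx_kw = N_IEC_UNITS * 18.0             # 34-ton integral DX sections
iec_energy_kwh = (iec_fan_kw * HOURS_PER_YEAR
                  + iec_pump_kw * WET_FRACTION * HOURS_PER_YEAR
                  + iec_dx_kw * DX_FRACTION * HOURS_PER_YEAR)
iec_water_m3 = N_IEC_UNITS * 0.6 * WET_FRACTION * HOURS_PER_YEAR
iec_cost = iec_energy_kwh * POWER_TARIFF + iec_water_m3 * WATER_TARIFF

# Traditional chilled water system (held at listed power in this simplified model).
chw_kw = 407.0 + 18.0 + 5 * 13.2           # chiller + CHW pump + 5 CRAH units
chw_cost = chw_kw * HOURS_PER_YEAR * POWER_TARIFF

print(f"IEC system:           ${iec_cost:,.0f} per year")
print(f"Chilled water system: ${chw_cost:,.0f} per year")
print(f"Savings: {100.0 * (1.0 - iec_cost / chw_cost):.0f}%")
```

Under these assumptions the IEC system lands in the same neighborhood as the roughly 50% annual cost reduction cited in the conclusion; the exact figure depends on how the per-unit quantities and wet-hour share are interpreted.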

Conclusion

This case study has analyzed the available free cooling hours in Riyadh, Kingdom of Saudi Arabia, using its bin weather data.

Results:
- Increasing the target supply air temperature limit increases the number of free cooling hours, allowing mechanical cooling to be reduced or eliminated.
- Selecting a higher supply air temperature, e.g. 77°F [25°C], improves the economics of the economizer without even considering the option of using the allowable range.
- The supply air temperature results and the water consumption of the IEC may vary with the heat exchanger efficiencies selected by the IEC manufacturer. However, IEC units from different manufacturers are expected to perform similarly, and all would show significant improvements and cost savings compared with traditional air-cooled or water-cooled chilled water systems.

Other benefits of using IEC:
- CRAH units and chillers may not be required.
- Piping, in both length and size, is reduced, especially for IEC with an integral DX system.
- Less equipment installation, since the IEC arrives as a package or module.
- Approximately 50% lower annual cost of power and water consumption. Note that these results are based on constant primary and secondary fan air volumes; more savings could be realized if modulation of the secondary fans were accounted for.
- No mixing of outside and inside air. This eliminates the risk of introducing particulate and gaseous contaminants, which would very likely damage the IT equipment.
- Since the unit is in economizer mode at all times, the mechanical cooling only needs to supplement it; it does not need to be sized for 100% of the load. The mechanical system is therefore smaller (i.e., <285 tons), and the electrical infrastructure supporting it is reduced in size as well, which helps lower the installation cost.

This case study has shown that free cooling in Riyadh is feasible. Although not included in this report, the bin weather data of other Middle East countries and cities also indicate that free cooling may be feasible. Although the results seem favorable, data center free cooling is still in its infancy in the region. Other considerations that need to be examined include:
- Water storage.
- How the system will operate during dust/sand storms.
- Water treatment; although no cases of Legionella have been recorded, treatment may still be required.
- IEC may not be practical in locations with higher water tariffs or certain regulatory restrictions; a careful cost analysis is a must.
- Regular maintenance of the heat exchangers to mitigate fouling.
- After-sales support and spare parts.
- The equipment is larger and may often need to be placed outdoors.

In the long run, free cooling strategies such as IEC should be considered because they make good economic sense, even in the relative heat of the Middle East.

References
1. 2008 ASHRAE Handbook: HVAC Systems and Equipment.
2. Thermal Guidelines for Data Processing Environments, 3rd Edition, ASHRAE Datacom Series 1.
3. Green Tips for Data Centers, ASHRAE Datacom Series 10.
4. Emerson EFC and Munters Oasis indirect evaporative coolers (manufacturer data).