Optimizing Cooling Performance of a Data Center Using CFD Simulation and Measurements


© ASHRAE (www.ashrae.org). Used with permission from ASHRAE Journal. This article may not be copied nor distributed in either paper or digital form without ASHRAE's permission. For more information about ASHRAE, visit www.ashrae.org.

BY AMIR RADMEHR, PH.D.; JOHN FITZPATRICK; KAILASH KARKI, PH.D., MEMBER ASHRAE

Amir Radmehr, Ph.D., is a member of the technical staff at Innovative Research, Inc. John Fitzpatrick is the director of enterprise data centers at the University of Rochester. Kailash Karki, Ph.D., is a member of the technical staff at Innovative Research, Inc.

In this article, we present a case study that combines computational fluid dynamics (CFD) modeling and measurements to evaluate the cooling performance of a raised-floor data center. To improve the cooling efficiency, we propose enhancements such as equipping the blowers of the computer room air-handling (CRAH) units with variable frequency drive (VFD) electric motors, adjusting the speed of the blowers to maintain a certain pressure below the raised floor, and increasing the temperature settings of the CRAH units. These enhancements were evaluated and fine-tuned using CFD modeling. After their implementation, the temperatures of the racks and the energy consumption of the data center were monitored for several months. The data showed that the inlet temperatures of the racks stayed below the ASHRAE-recommended maximum value and that the energy consumption of the data center's cooling system was reduced by 58%. The cost of the enhancements will be recovered through the savings in operating cost in about 1.5 years.

A large number of data centers are routinely overcooled, resulting in an unnecessary increase in energy consumption and operating cost. The reasons for overcooling include concerns, mostly unfounded, about the reliability of computer equipment; the inability of the cooling infrastructure to respond to changes in the data center; and the lack of proper tools to guide the changes required to improve cooling efficiency and to predict the effect of those changes.

Several developments in recent years have eliminated much of the rationale for overcooling. These developments include:
- A better understanding of the effect of cooling-air temperature on the performance of servers;
- Availability of control systems on cooling devices; and
- Adoption of CFD modeling for predicting airflow and temperature distribution in data centers.

In this study, we took advantage of these developments to improve the cooling efficiency of a data center. We used CFD to identify the cooling issues in the data center and to evaluate various enhancements. CFD modeling has been used widely in other industries since the early 1970s. It became popular for data center applications in the early 2000s.1–3 Now, it has become a standard practice both in designing new data centers and in resolving cooling problems and inefficiencies in existing facilities.

We have used CFD simulations to propose changes in the data center and to study the effect of these changes on cooling. For this simulation-based strategy to be successful, the CFD model must be validated. For this validation, we used measurements for the current (as-is) conditions in this data center. In an operating data center, there are uncertainties in the descriptions of certain inputs needed in the model; these measurements were also used to verify and fine-tune such input parameters.

The Data Center

The data center is a raised-floor space, with a floor area of approximately 750 m² (8,000 ft²), located in Rochester, N.Y. At the time of the study, the data center housed 175 server racks positioned in the hot-aisle/cold-aisle arrangement. The total IT heat load in the data center was 320 kW (1,088 kBtu/h). The space was being cooled by eight down-flow, chilled-water CRAH units working at 100% fan speed. The data center does not have a drop ceiling; therefore, the hot air returns to the CRAH units through the room. However, extension ducts are installed at the return side of the CRAH units to pull in hot air from regions closer to the ceiling, preventing this air from reaching the racks. Perforated tiles with 25% open area, equipped with dampers, were used to deliver the airflow to the racks. For perforated tiles in front of racks with little or no heat load, the dampers were closed. Under each rack, there was a cutout in the floor for cables. Blanking panels were used to close the open spaces between the servers inside the racks. There are 12 power distribution units (PDUs) in the data center; underneath them, there were large openings for cables.

The CFD Model

The CFD model was created using a commercially available software package4 that has been designed specifically for data centers and has an extensive library of objects, such as perforated tiles, CRAH units, and server racks, needed to build layouts. The findings of this article are general and independent of the choice of the modeling tool. The details of the methodology are given in Radmehr, et al.5,6 Figure 1 shows the layout of the CRAH units, racks, perforated tiles, and other objects in (a) three-dimensional and (b) plan views.

FIGURE 1 The CFD model of the data center: (a) three-dimensional view, (b) plan view.

Measurements and Validation of the CFD Results

Measurements were performed for two purposes:
1. To obtain reliable inputs for the CFD analysis. These measurements included the power consumption of the racks and the airflow rates and supply temperatures of the CRAH units.

2. To validate the values calculated by the CFD model. These measurements included the flow rates of the perforated tiles, the inlet temperatures of the racks at various heights, and the average return temperatures of the CRAH units.

Details of the measurement process and techniques are given by Radmehr, et al.5

Figure 2 shows the comparison of the measured and calculated flow rates of the perforated tiles. The flow rates of the perforated tiles with closed dampers are significantly lower than the flow rates of the tiles with open dampers. Figures 3 and 4 show the comparison of the measured and calculated inlet temperatures for two sets of racks; the temperatures were measured at a height of 0.9 m (3 ft) for one set and at 1.8 m (6 ft) for the other. Figure 5 shows the comparison of the measured and calculated return temperatures of the CRAH units. The results for the flow rates of the tiles and the rack temperatures are similar for other rows. It can be seen that the agreement between the measured and calculated values is good, indicating that the CFD model is an accurate representation of the data center and that the results are reliable.

FIGURE 2 Comparison of the calculated and measured flow rates of the tiles for a typical row.
FIGURE 3 Comparison of the calculated and measured temperatures of the racks at 0.9 m for a typical row.
FIGURE 4 Comparison of the calculated and measured temperatures of the racks at 1.8 m for a typical row.
FIGURE 5 Comparison of the calculated and measured CRAH unit return temperatures.

Cooling Assessment

The nominal cooling capacity of each CRAH unit at 100% fan speed and a 24°C (75°F) return temperature is reported by the manufacturer to be 91 kW (309 kBtu/h). At the time of the study, all eight CRAH units were working at 100% speed, for a total nominal cooling capacity of 728 kW (2,575 kBtu/h). The total IT heat load measured at the PDUs was only 320 kW (1,088 kBtu/h), which indicated that the CRAH units were providing only partial cooling.

The cooling produced by a CRAH unit can be calculated by applying the energy equation across the unit:

Cooling produced by a CRAH unit = ρ · V · c_p · (T_return − T_supply)    (1)

This equation involves the air density ρ, the volumetric airflow rate V, the specific heat c_p, and the return and supply temperatures. (The return and supply temperatures were measured at several points across the return and supply faces, and their average values are used in Equation 1.)

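As an illustration of Equation 1, the short sketch below recomputes the cooling produced by a single unit from its measured airflow and temperatures. The flow rate and temperatures are the ones reported for CRAH 1 in Table 1; the air density and specific heat are standard assumed values, not site measurements.

```python
# Minimal sketch of Equation 1: cooling produced by a CRAH unit.
# Air properties are assumed textbook values, not measurements from the site.
RHO_AIR = 1.2      # kg/m^3, assumed air density
CP_AIR = 1005.0    # J/(kg*K), assumed specific heat of air

def crah_cooling_kw(flow_m3_s: float, t_return_c: float, t_supply_c: float) -> float:
    """Cooling produced (kW) = rho * V * cp * (T_return - T_supply)."""
    return RHO_AIR * flow_m3_s * CP_AIR * (t_return_c - t_supply_c) / 1000.0

# Values reported for CRAH 1 in Table 1: 5.53 m^3/s, 25.6 C return, 16.1 C supply.
print(f"CRAH 1: {crah_cooling_kw(5.53, 25.6, 16.1):.1f} kW")  # ~63 kW, matching Table 1
```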

TABLE 1 Cooling produced by the CRAH units.
CRAH | Flow Rate, m³/s (cfm) | Return Temperature, °C (°F) | Supply Temperature, °C (°F) | Cooling, kW (kBtu/h)
1 | 5.53 (11,712) | 25.6 (78) | 16.1 (61) | 63.4 (215.6)
2 | 5.44 (11,520) | 27.2 (81) | 15.0 (59) | 80.7 (274.4)
3 | 5.30 (11,232) | 25.0 (77) | 22.2 (72) | 17.9 (60.9)
4 | 5.37 (11,376) | 25.0 (77) | 20.0 (68) | 32.6 (110.8)
5 | 5.22 (11,072) | 24.4 (76) | 24.4 (76) | 0.0 (0.0)
6 | 5.47 (11,600) | 25.6 (78) | 18.6 (65.5) | 46.2 (157.1)
7 | 5.58 (11,824) | 26.1 (79) | 17.5 (63.5) | 58.4 (198.6)
8 | 5.51 (11,680) | 24.4 (76) | 21.7 (71) | 18.6 (63.2)
Total Cooling | | | | 317.8 (1,080.5)

The results, presented in Table 1, show that the total cooling produced by the CRAH units was 317.8 kW (1,081 kBtu/h), which matched the total IT heat load. This is an independent validation of the IT heat load. Note that the cooling produced by each CRAH unit was less than its nominal capacity of 91 kW (309 kBtu/h). Moreover, two CRAH units, Units 3 and 8, produced less than 25% of their nominal cooling, and one, Unit 5, produced no cooling. This result showed that there was a significant opportunity for improving the cooling performance and reducing the energy consumption of the data center.

To provide recommendations for improvements, it was essential to know how the supplied cooling air was distributed. The CFD model provided the split of airflow among the various openings on the floor. The results are shown in Table 2. More than 40% of the cooling air leaked through the large openings under the PDUs and the cable openings under the racks. Nearly 8%, 3.30 m³/s (7,000 cfm), leaked through the gaps between the tiles. This is called distributed leakage and cannot be avoided. Readers interested in the distributed leakage topic are encouraged to read Radmehr, et al., and Karki, et al.5,7 Only slightly more than 50% of the airflow was discharged through the perforated tiles.

TABLE 2 The split of airflow among various openings on the floor.
Opening | Flow Rate, m³/s (cfm)
Perforated Tiles | 22.18 (47,000)
Large Openings Under PDUs | 9.91 (21,000)
Cable Openings Under Racks | 8.02 (17,000)
Distributed Leakage Through the Floor | 3.30 (7,000)
Total | 43.42 (92,000)

Figure 6 shows the inlet temperatures of the racks predicted by the CFD simulation. Some of the racks received air at temperatures above 27°C (80.6°F), which is the upper limit of the range recommended by ASHRAE.8 This was because the cooling air was not distributed according to the demand of the racks. Figure 7 shows the temperature distribution in the room at 2.13 m (7 ft), just above the racks. It can be seen that a large quantity of the cooling air returned to the CRAH units at low temperature without participating in the cooling of the racks. Figure 8 highlights the areas where hot air penetrated into the cold aisles through the open spaces at the ends of the cold aisles and between the racks.

FIGURE 6 The inlet temperatures of racks.
FIGURE 7 Temperature distribution at 2.13 m (7 ft).
FIGURE 8 Hot air penetration into cold aisles; the temperature distribution is at 1 m (3.3 ft).

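To make the airflow split in Table 2 concrete, the short sketch below converts the reported flow rates into the percentages quoted above. The flow rates are taken directly from Table 2; the grouping into "leakage" versus "delivered" air is only for illustration.

```python
# Airflow split from Table 2 (as-is conditions), expressed as percentages of the
# total airflow supplied by the CRAH units. Flow rates are in m^3/s.
airflow = {
    "perforated tiles": 22.18,
    "openings under PDUs": 9.91,
    "cable openings under racks": 8.02,
    "distributed leakage": 3.30,
}
total = sum(airflow.values())  # ~43.4 m^3/s supplied by the CRAH units

for opening, flow in airflow.items():
    print(f"{opening}: {100 * flow / total:.0f}% of supply")

# Leakage through the PDU and rack cutouts alone:
leak = airflow["openings under PDUs"] + airflow["cable openings under racks"]
print(f"PDU + rack cutouts: {100 * leak / total:.0f}% of supply")  # a bit over 40%
```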

Recommendations

The results produced by the CFD simulation for the as-is conditions provided the insights to assess the cooling performance of the data center and to propose modifications, which could be further evaluated and fine-tuned with CFD simulations. The proposed modifications and their objectives are discussed below.

Balancing the Supply and Demand of Airflow for Each Cold Aisle

To provide the needed airflow at the proper temperature to the racks, the airflow supplied from the perforated tiles to each cold aisle should be equal to, or slightly higher than, the airflow demand of the racks facing the cold aisle. Moreover, the supplied airflow should not leave the cold aisle without entering the servers. The CFD simulation provides the airflow supplied to each cold aisle, which can be compared to the airflow demand of the racks. To balance the airflow for each cold aisle and make sure the supplied airflow enters the servers before leaving the cold aisle, the following modifications were recommended:
1. Closing the cable cutouts under the racks and the large openings under the PDUs to minimize airflow leakage;
2. Replacing perforated tiles with closed dampers with solid tiles to eliminate the airflow leakage through them;
3. Rearranging perforated tiles to improve the balance between supply and demand in each cold aisle; and
4. Placing vertical partitions to close the openings at the ends of the cold aisles and the large gaps between racks. This is done to prevent supplied airflow from leaving the cold aisle and to prevent penetration of hot air into the cold aisle.

Addition of VFDs to CRAH Units and Pressure Sensors Under the Floor

The CFD simulations indicated that the blowers of the CRAH units can be operated at 75% of the rated speed and still provide the required airflow. A reduction in the blower speed lowers the energy consumption of the CRAH units. Although the blower speed can be adjusted manually, it is desirable that the speed respond to changes in the data center. In this project, the blowers were equipped with variable frequency drives (VFDs), and pressure sensors were installed below the raised floor. The blower speed was adjusted to maintain a certain average pressure in the underfloor plenum.

An increase in the heat load is expected to be accompanied by an increase in the open area on the raised floor, either by using more open tiles or by adding new tiles. For the current blower speed, this increase in the open area will cause a reduction in the underfloor pressure. The control system will tend to maintain the pressure at the specified level and, therefore, will increase the blower speed. Thus, this control system properly responds to changes in the heat load.

Since it is the static pressure under the floor that drives the airflow through the perforated tiles, the pressure sensors need to measure the static pressure and not the total pressure, which is the combination of the static and dynamic pressures. Therefore, the placement of the sensors is important: they should be placed away from the CRAH units and in areas with low velocities. Pressure sensors equipped with a perforated shield that reduces the effect of the dynamic pressure are preferred. CFD simulations were used to decide the setpoint for the underfloor average pressure, the number of pressure sensors, and their optimum locations.

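The article does not describe the control algorithm used by the VFD system, but the idea of holding a static-pressure setpoint by trimming blower speed can be sketched as a simple proportional loop, shown below. The gain, speed limits, and sensor-averaging details are illustrative assumptions, not values from the project; only the 12 Pa setpoint (chosen for this data center, discussed below) comes from the case study. The closing comment simply illustrates why the sensors must read static pressure in low-velocity regions.

```python
# Hedged sketch: a proportional controller that nudges CRAH blower speed to hold
# an average underfloor static-pressure setpoint. Gain and limits are assumptions.

SETPOINT_PA = 12.0                  # underfloor static-pressure setpoint (Pa)
KP = 1.5                            # proportional gain, % speed per Pa of error (illustrative)
MIN_SPEED, MAX_SPEED = 60.0, 100.0  # allowable blower speed range, % (assumed)

def next_blower_speed(current_speed_pct: float, sensor_readings_pa: list[float]) -> float:
    """Return an updated blower speed (%) from the averaged sensor pressures."""
    avg_pressure = sum(sensor_readings_pa) / len(sensor_readings_pa)
    error = SETPOINT_PA - avg_pressure          # positive error -> plenum pressure too low
    new_speed = current_speed_pct + KP * error  # speed up if pressure is below setpoint
    return min(MAX_SPEED, max(MIN_SPEED, new_speed))

# Example: more tiles are opened, the average pressure sags to ~10.4 Pa, speed rises.
print(f"{next_blower_speed(75.0, [10.1, 10.5, 10.7, 10.3]):.1f}")  # ~77.4% speed

# Why static pressure matters: at 5 m/s near a CRAH discharge, the dynamic pressure
# (0.5 * 1.2 * 5**2 = 15 Pa) already exceeds the 12 Pa setpoint, so an unshielded
# sensor in a high-velocity region would badly misread the plenum pressure.
```
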
The required average pressure under the floor depends on the airflow demand of the equipment and the resistance characteristics of the perforated tiles. Attention needs to be paid to ensure adequate cooling for all racks with the recommended average pressure under the floor. In this case, the required average pressure under the floor was calculated to be 12 Pa (0.048 in. of water), and this value was chosen as the setpoint for the average sensor pressure.

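The link between underfloor pressure and tile airflow can be illustrated with the usual loss-coefficient (orifice-type) model for a perforated tile, Δp = K · ½ρV², where V is the face velocity. The sketch below inverts this relation to estimate the flow through a single 2 ft x 2 ft, 25%-open tile at the 12 Pa setpoint. The loss coefficient K ≈ 42 is a typical assumed value, not one taken from the article or the tile manufacturer, so the result is only indicative.

```python
# Hedged estimate of per-tile airflow from the underfloor static pressure, using a
# simple loss-coefficient model: dp = K * 0.5 * rho * V_face**2. K is assumed.
import math

RHO_AIR = 1.2            # kg/m^3, assumed
K_TILE = 42.0            # loss coefficient for a ~25%-open tile (assumed, tile-specific)
TILE_AREA = 0.61 * 0.61  # m^2, face area of a 2 ft x 2 ft tile

def tile_flow_m3_s(dp_pa: float) -> float:
    """Volumetric flow through one tile at underfloor static pressure dp_pa."""
    face_velocity = math.sqrt(2.0 * dp_pa / (RHO_AIR * K_TILE))
    return face_velocity * TILE_AREA

flow = tile_flow_m3_s(12.0)
print(f"{flow:.2f} m^3/s (~{flow * 2119:.0f} cfm) per tile at 12 Pa")  # ~0.26 m^3/s
```
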

Since the underfloor pressure in regions away from the CRAH units is nearly uniform, we decided to use only four sensors and to place them in areas where the pressure is close to the setpoint pressure. As discussed above, the sensors should be placed in regions where the velocities are low. To decide the locations, we took guidance from the airflow pattern. These are shown in Figure 9.

FIGURE 9 The location of the sensors and the pressure/velocity fields under the floor.

Increasing the Temperature Settings of the CRAH Units

The CFD model shows that, after implementing the above-mentioned modifications, there will be adequate cooling air for each rack and the variation of the inlet temperature across the face of the racks will be small. As a result, the supply temperature of the CRAH units can be increased to reduce the energy consumption of the chiller plant.

Cooling Assessment After Modifications

The CFD simulations were used to fine-tune the proposed modifications. The results shown in Table 3 and Figures 10 and 11 were generated after implementing all the modifications described under the Recommendations section.

The CFD results indicated that the CRAH unit blower speed was reduced to 75%, based on the feedback from the underfloor pressure sensors, to maintain the desired underfloor pressure of 12 Pa (0.048 in. of water).

TABLE 3 The split of airflow among various openings on the floor after modifications.
Opening | Flow Rate, m³/s (cfm)
Perforated Tiles | 25.48 (54,000)
Large Openings Under PDUs | 1.65 (3,500)
Cable Openings Under Racks | 1.89 (4,000)
Distributed Leakage Through the Floor | 3.54 (7,500)
Total | 32.56 (69,000)

Table 3 shows the split of airflow rates among the various openings on the raised floor for the modified layout. This table shows the benefit of closing the cable openings under the racks and PDUs, replacing the perforated tiles that had closed dampers with solid tiles, rearranging the perforated tiles, and operating the CRAH units with VFDs and pressure sensors. The airflow leakage through the openings under the PDUs and racks is reduced significantly, while the airflow rates through the perforated tiles are increased. Figure 10 shows the effectiveness of the partitions in preventing the penetration of hot air into the cold aisles.

FIGURE 10 Prevention of hot air penetration into cold aisles by partitions; the temperature distribution is at 1 m (3.3 ft).

The CFD simulations also indicated that the thermostat setting of the CRAH units could be increased by 4.4°C (8°F) while the inlet temperatures for all racks were still maintained below the ASHRAE-recommended maximum temperature of 27°C (80.6°F). Figure 11 shows the inlet temperatures of the racks after implementing this modification along with the ones described in the previous paragraphs. That the inlet temperatures for all racks are within the acceptable range is a significant achievement, considering that the total airflow supplied by the CRAH units is reduced by 25% and the return temperatures of the CRAH units are higher.

FIGURE 11 The inlet temperatures of racks after modifications.

These modifications inside the data center also presented opportunities for enhancements in the chiller system, such as increasing the temperature of the chilled water and reducing the speed of the pumps, which led to a further reduction in energy consumption.

After implementing these changes in the data center and the chiller system, the energy consumption of the cooling system was monitored over a few months. It was reduced by 58%, from 132 kW (449 kBtu/h) to 55 kW (187 kBtu/h), which resulted in an $86,000 reduction in annual cooling cost. The total cost of the modifications was $135,000, which will be recovered in about 1.5 years.

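As a sanity check on the reported savings, the short calculation below reproduces the 58% reduction and the roughly one-and-a-half-year simple payback from the figures quoted above. The electricity rate is not stated in the article; it is shown only as a quantity implied by the reported numbers, under the assumption of continuous year-round operation.

```python
# Reported cooling-system power before and after the modifications (kW).
power_before_kw = 132.0
power_after_kw = 55.0

reduction_pct = 100 * (power_before_kw - power_after_kw) / power_before_kw
print(f"Reduction in cooling power: {reduction_pct:.0f}%")  # ~58%

annual_savings_usd = 86_000.0   # reported annual cooling-cost reduction
project_cost_usd = 135_000.0    # reported cost of the modifications
print(f"Simple payback: {project_cost_usd / annual_savings_usd:.1f} years")  # roughly a year and a half

# Implied average electricity cost (derived, not stated in the article),
# assuming the cooling system runs 8,760 hours per year:
kwh_saved_per_year = (power_before_kw - power_after_kw) * 8760
print(f"Implied rate: ${annual_savings_usd / kwh_saved_per_year:.3f}/kWh")  # ~$0.13/kWh
```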

Conclusion

We were able to improve the cooling performance of the data center significantly with guidance from CFD simulations and field measurements. The measurements were done to validate the simulation results and to improve the accuracy of the inputs to the CFD model. The CFD simulations were particularly instrumental in the following areas:
- Identifying the cause of the cooling problems;
- Showing how the cooling air splits among the openings on the floor;
- Evaluating various strategies, such as VFDs on the CRAH units and the use of partitions;
- Finding the optimum locations for the pressure sensors; and
- Fine-tuning the recommended changes, such as reducing the flow rates of the CRAH units and increasing their temperature settings.

Each data center is unique and has its own cooling challenges. The remedies and enhancements proposed in this study may not be entirely relevant to other data centers. However, the approach based on CFD simulations and field measurements can be used in any data center to ensure efficient cooling of the computer equipment and to reduce the energy consumption and operating cost.

References
1. Kang, S., et al. 2000. "A methodology for the design of perforated tiles in raised floor data centers using computational flow analysis." ITHERM Proceedings (1):215–224.
2. Schmidt, R.R., et al. 2001. "Measurement and predictions of the flow distribution through perforated tiles in raised-floor data centers." Paper No. IPACK2001-15728, Proceedings of InterPack '01, The Pacific Rim/ASME International Electronic Packaging Technical Conference and Exhibition.
3. Karki, K.C., A. Radmehr, S.V. Patankar. 2003. "Use of computational fluid dynamics for calculating flow rates through perforated tiles in raised-floor data centers." International Journal of HVAC&R Research (9):153–166.
4. Innovative Research, Inc. 2018. TileFlow, Version 6.0.
5. Radmehr, A., R.R. Schmidt, K.C. Karki, S.V. Patankar. 2005. "Distributed leakage flow in raised-floor data centers." Paper No. IPACK2005-73273, Proceedings of ASME InterPack '05.
6. Radmehr, A., B. Noll, J. Fitzpatrick, K.C. Karki. 2013. "CFD modeling of an existing raised-floor data center." 29th IEEE SEMI-THERM Symposium.
7. Karki, K.C., A. Radmehr, S.V. Patankar. 2007. "Prediction of distributed air leakage in raised-floor data centers." ASHRAE Transactions 113(1):219–226.
8. ASHRAE. 2015. Thermal Guidelines for Data Processing Environments, 4th ed. Atlanta: ASHRAE.