Application of TileFlow to Improve Cooling in a Data Center

Case Study: Application of TileFlow to Improve Cooling in a Data Center
Innovative Research, Inc., 3025 Harbor Lane North, Suite 300, Plymouth, MN 55447 USA
Copyright 2004 Innovative Research, Inc.

Introduction

A large insurance company was experiencing cooling problems in its data center. Although the total amount of cooling air supplied by the computer room air-conditioning (CRAC) units exceeded the amount required to cool the total heat load, hot spots developed in many racks. These racks had heat loads of up to 3 kW. The company also wanted to add ten new 6 kW racks.

The facilities manager approached Innovative Research, Inc. (IRI), a company specializing in airflow modeling in data centers, for guidance on improving the cooling of the existing racks and on placing the new racks. IRI used its airflow modeling software package, TileFlow, to construct a computer model of the data center. The model was used to calculate the airflow rates for the existing layout, to investigate the available options for improving the cooling, and to determine appropriate locations for the new racks.

TileFlow

TileFlow uses the technique of computational fluid dynamics (CFD) to calculate the velocity and pressure distributions under the raised floor and the airflow rates through the various openings in the raised floor. A TileFlow model requires as input the geometry of the plenum, the details of the CRAC units, the perforated tiles and other openings such as cable cutouts, and the under-floor obstructions such as pipes and cable bundles. TileFlow has been extensively validated against airflow measurements in real-life as well as laboratory-scale data centers; the predicted airflow rates are generally within 10% of the measured values.

Results for the Current Floor Layout

IRI personnel surveyed the data center to collect the input required to construct a TileFlow model. The perforated tiles have an open area of 25%, and each rack has one perforated tile in front of it. A large number of floor panels have cable openings, with open areas ranging from 3% to 43% of the panel area. The under-floor plenum contains many pipes and cable bundles. The current layout of CRAC units, perforated tiles, and cable openings is shown in Fig. 1; a sample of the under-floor obstructions is shown in Fig. 2.

Figure 1. Current layout of the data center floor.
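The case study does not spell out TileFlow's governing equations, but the mechanism it resolves, namely flow through floor openings driven by the under-floor static pressure, can be illustrated with a simple orifice approximation. The Python sketch below is only an illustration under assumed conditions (a uniform plenum pressure, a guessed discharge coefficient of 0.6, and a standard 2 ft x 2 ft floor panel); TileFlow itself computes the full, non-uniform under-floor pressure field with CFD, so these numbers are not those of the study.

from math import sqrt

RHO_AIR = 1.2                     # air density, kg/m^3
PA_PER_IN_WG = 249.1              # pascals per inch of water gauge
CFM_PER_M3S = 2118.9              # cfm per m^3/s
PANEL_AREA_M2 = 0.6096 * 0.6096   # a standard 2 ft x 2 ft floor panel

def opening_flow_cfm(open_fraction, plenum_pressure_in_wg, cd=0.6):
    """Rough orifice estimate of the flow through one floor opening.

    open_fraction: open area as a fraction of the panel area (e.g. 0.25)
    plenum_pressure_in_wg: under-floor static pressure, inches of water
    cd: assumed discharge coefficient (illustrative value only)
    """
    dp = plenum_pressure_in_wg * PA_PER_IN_WG        # Pa
    velocity = cd * sqrt(2.0 * dp / RHO_AIR)         # m/s through the open area
    flow_m3s = velocity * open_fraction * PANEL_AREA_M2
    return flow_m3s * CFM_PER_M3S

# At the same plenum pressure, a 43% open cable cutout passes far more air
# than a 25% perforated tile, which is why large cutouts "steal" cooling air.
for frac in (0.25, 0.43):
    print(f"{frac:.0%} opening: ~{opening_flow_cfm(frac, 0.019):.0f} cfm")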

A color map of the calculated airflow rates through all openings on the raised floor is shown in Fig. 3. The flow rates vary from 30 cfm to 970 cfm, with the largest flow rates through the cable openings with 43% open area. For the perforated tiles, the flow rates range from 50 to 450 cfm. The detailed results show that although the CRAC units supply 107,100 cfm of cooling air, the total airflow through the perforated tiles is only 47,700 cfm. Of the rest, 2,410 cfm leaks through the seams between the panels and 56,990 cfm escapes through the cable openings. Thus, only 44% of the cooling air is currently being used for direct cooling of the computer equipment.

Figure 2. Under-floor obstructions.

Figure 3. Airflow rates for the current floor layout.

The maximum acceptable heat load in a rack depends on the cooling capacity of the cooling air it receives. The cooling capacities of the airflow through the perforated tiles in two rows are shown in Fig. 4. The cooling capacities range from 1 to 3 kW, with most of them around 2 kW. However, many racks have heat loads larger than 3 kW. These racks do not receive sufficient cooling air and are likely to develop hot spots. Temperature measurements in the data center verified that the TileFlow analysis correctly identified the racks with high exhaust temperatures.

Figure 4. Cooling capacities of airflow at rows 7 and 20. (See Fig. 3 for the locations of these rows.)
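The numbers above can be tied together with the standard sensible-heat relation, heat removed = 1.08 x cfm x temperature rise (in BTU/hr, with the rise in degrees F). A minimal sketch, assuming the 20 F server temperature rise quoted later in this study and reusing the flow rates reported above:

# Airflow bookkeeping and tile cooling capacity, using the figures quoted above.
# Assumes the 20 F allowable temperature rise across the servers used later in
# this case study; 1.08 is the standard BTU/hr per (cfm * F) sensible-heat factor.
BTU_PER_HR_PER_KW = 3412.0

def cooling_capacity_kw(cfm, delta_t_f=20.0):
    """Heat a given airflow can absorb for a given temperature rise."""
    return 1.08 * cfm * delta_t_f / BTU_PER_HR_PER_KW

crac_supply_cfm  = 107_100
tile_flow_cfm    = 47_700
seam_leakage_cfm = 2_410
cutout_flow_cfm  = 56_990

print(f"Air used for direct cooling:   {tile_flow_cfm / crac_supply_cfm:.0%}")    # ~44%
print(f"Air escaping through cutouts:  {cutout_flow_cfm / crac_supply_cfm:.0%}")  # ~53%

# A tile delivering 450 cfm (the largest tile flow in the current layout)
# can absorb only about 2.8 kW, which is why racks above ~3 kW run hot.
print(f"Capacity of a 450 cfm tile: {cooling_capacity_kw(450):.1f} kW")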

How to Improve Cooling?

The TileFlow model for the current layout revealed that nearly 53% of the cooling air escapes through the cable openings without providing direct cooling to the equipment in the racks. The first and most economical step is therefore to seal the large cable openings so that the airflow rates through the perforated tiles increase. To establish the benefit of sealing the cable openings, the TileFlow model was modified so that the open areas of all cable openings were reduced to 3%. The effect of sealing the cable openings on the airflow rates and on the static pressure under the raised floor is summarized in Table 1.

Table 1. Effect of sealing the cable openings

                                                           Before    After
  Airflow through perforated tiles (cfm)                   47,700    79,100
  Airflow through cable openings (cfm)                     56,990    23,900
  Average static pressure under the raised floor (in. wg)   0.019     0.053

By sealing the cable openings, the total airflow through the perforated tiles increases by nearly 66%. The revised cooling capacities for two rows of perforated tiles are shown in Fig. 5. The cooling capacities throughout the data center are now 3 kW or higher. Thus, by sealing the cable openings, all racks receive an adequate amount of cooling air and the hot spots are eliminated.

Figure 5. Cooling capacities after sealing the cable openings.
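As a rough cross-check, for a fixed set of openings the orifice approximation sketched earlier predicts that the flow scales with the square root of the plenum pressure, which is consistent with Table 1. This back-of-envelope check is not part of the TileFlow analysis:

from math import sqrt

# Back-of-envelope check on Table 1: with the large cutouts sealed, the plenum
# pressure rises from 0.019 to 0.053 in. wg, and in the orifice approximation
# the flow through a fixed opening scales roughly with sqrt(pressure).
p_before, p_after = 0.019, 0.053             # in. wg, from Table 1
tiles_before, tiles_after = 47_700, 79_100   # cfm, from Table 1

print(f"sqrt-pressure prediction: x{sqrt(p_after / p_before):.2f}")   # ~1.67
print(f"TileFlow result:          x{tiles_after / tiles_before:.2f}") # ~1.66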

Placement of New 6 kW Racks

To cool a heat load of 6 kW, approximately 900 cfm of cooling air is needed, assuming an acceptable temperature rise of 20 F across the servers. For the present layout (with the cable openings sealed), the average flow rate through a 25% open tile is around 500 cfm. Therefore, two 25% open tiles are needed in front of each 6 kW rack; for ten such racks, 20 additional perforated tiles are required. For ease of airflow management, all new racks should be located in one area where the under-floor pressure is high. One possible region is the mid-section of the left half of the data center, shown in Fig. 6.

Figure 6. Suggested location for 6 kW racks.

Another TileFlow run was made to calculate the airflow rates and cooling capacities for the suggested layout. The total cooling capacities of the cooling air received by the new racks are shown in Fig. 7. (Note that each rack receives cooling air from the two tiles in front of it.) The cooling capacities at all locations exceed 6 kW; that is, all new racks receive an adequate amount of cooling air. The cooling capacities for rows 7 and 20 are shown in Fig. 8. The cooling capacities at these and other locations are still around 3 kW. Thus, the addition of the new racks does not compromise the cooling of any of the existing racks.

Figure 7. Cooling capacity of airflow received by the 6 kW racks.

Figure 8. Cooling capacity of airflow at rows 7 and 20.
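The 900 cfm sizing rule follows from the same sensible-heat relation used above; a short check, assuming standard sea-level air:

# Reproduce the sizing rule quoted above: airflow needed to remove 6 kW with an
# acceptable 20 F temperature rise across the servers.
BTU_PER_HR_PER_KW = 3412.0
SENSIBLE_FACTOR = 1.08       # BTU/hr per (cfm * F), standard air

def required_cfm(load_kw, delta_t_f=20.0):
    """Airflow needed to absorb load_kw with a delta_t_f temperature rise."""
    return load_kw * BTU_PER_HR_PER_KW / (SENSIBLE_FACTOR * delta_t_f)

needed = required_cfm(6.0)
print(f"Airflow needed for a 6 kW rack: ~{needed:.0f} cfm")   # ~950 cfm; the study rounds to ~900 cfm
print(f"Two 500 cfm tiles supply {2 * 500} cfm, comfortably above {needed:.0f} cfm")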

Summary

In this data center, although the total amount of cooling air supplied by the CRAC units exceeds the cooling air needed to handle the total heat load, many racks were not cooled properly. A TileFlow model of the data center revealed that nearly 53% of the cooling air escaped through the cable openings without providing direct cooling to the computer equipment. The model also showed that a majority of the racks with heat loads larger than 2 kW did not receive adequate cooling air and thus developed hot spots.

The most economical way to improve the cooling is to seal the large cable openings so that more cooling air is delivered through the perforated tiles. A TileFlow model showed that this approach would produce a 66% increase in the airflow through the perforated tiles and would provide sufficient cooling airflow for all racks in the data center.

The TileFlow model also indicated that two 25% open tiles are required to cool a 6 kW rack. The current layout was therefore modified to include 20 new perforated tiles for the ten 6 kW racks. TileFlow analysis was used to check the amount of cooling air delivered to the new racks and the impact of the additional perforated tiles on the flow rates for the existing racks. This analysis shows that the new racks receive sufficient cooling air without compromising the cooling of the existing racks.

Benefits of Using TileFlow

This case study illustrates the use of TileFlow as a diagnostic and design tool. In the first phase of this project, TileFlow was used to identify the cause of the cooling problem in the existing layout. In the second phase, it was used to evaluate a possible solution to the cooling problem. In the final phase, it was used to modify the layout so that the new racks as well as the existing racks are properly cooled.

About Innovative Research, Inc.

Innovative Research, Inc. (IRI), based in Plymouth, MN, provides computer software and consulting services in the fields of fluid flow, heat transfer, turbulence, combustion, and related processes. The company's COMPACT programs for computational fluid dynamics are widely used in industry and in universities. For the solution of complex flow networks, the company has developed MacroFlow, which has extensive applications in many different industries. In 2001, IRI introduced TileFlow to calculate the airflow distribution in raised-floor data centers. TileFlow is now accepted as the modeling tool of choice by the data center community and is being used by equipment manufacturers, data-center designers and architects, and facilities engineers.

In addition to developing engineering software packages, we also offer consulting services using our commercial and customized software packages. In recent years, we have completed several projects involving data centers, providing recommendations for designing new data centers, upgrading existing facilities, placing high-density racks, and improving the overall airflow distribution.

Scientific Contribution of Innovative Research, Inc.

We have made several important contributions to the data center community. These include:
- our identification of the airflow distribution as the key to proper cooling of a data center,
- our computational model for predicting the under-floor flow and thus calculating the flow rates through the perforated tiles, and
- our software product, TileFlow, which incorporates this model.
By using TileFlow to design and reconfigure data centers, you can ensure proper cooling of your computer equipment and save on the capital and operating costs of the cooling equipment.