Combining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency

A White Paper from the Experts in Business-Critical Continuity™

Executive Summary

Energy efficiency is a top and growing concern for data center professionals today. Respondents to Emerson Network Power's Data Center Users' Group survey ranked energy efficiency their second-highest concern in spring 2009; in fall 2005, it was not even among the top five. Whether driven by a mandate to cut costs, reduce carbon footprint, or both, efforts to reduce data center energy consumption are undermined by the continuing adoption of technologies that draw more power and generate more heat.

While it is possible to improve cooling energy efficiency in incremental steps, Emerson Network Power recommends implementing a comprehensive efficiency management platform that combines containment technology with intelligent environmental controls. This solution can deliver greater than 30 percent energy savings as well as a 25 percent improvement in capacity. With the ability to cool more than 30 kW per rack, it also addresses issues of heat density now and for the future. In sum, this efficiency management platform provides the highest energy efficiency currently available, supports IT growth trends, and enhances availability.

Introduction

Data centers consume between 8 and 35 percent of the total energy used by non-manufacturing, information-intensive companies, making the data center the single largest and most concentrated contributor to enterprise carbon footprint, according to the Uptime Institute.[1] The Uptime Institute has also reported that the four-year cost of a server's electricity is typically equal to the cost of the server itself, and that, when the costs of additional power and air conditioning infrastructure are included, the minimum capital expenditure for a $1,500 server is more than five times its purchase price.[2] These statistics underscore the win-win nature of improving energy efficiency: it is good for data center managers seeking cost reductions and good for the planet.
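The server-cost comparison above can be sanity-checked with a few lines of arithmetic. The sketch below uses hypothetical figures (a 500 W server, a facility PUE of 2.0, $0.10/kWh) rather than anything from the Uptime Institute studies:

```python
# Hypothetical illustration of the server electricity-cost comparison.
# All input figures below are assumptions, not data from the paper.

def four_year_energy_cost(server_watts, pue, price_per_kwh, years=4):
    """Electricity cost over `years`, scaled by PUE to include the
    cooling and power-distribution overhead of the facility."""
    hours = years * 8760  # hours of continuous operation
    kwh = server_watts / 1000.0 * hours * pue
    return kwh * price_per_kwh

# Assumed: 500 W server, PUE of 2.0, $0.10 per kWh.
cost = four_year_energy_cost(500, 2.0, 0.10)
print(round(cost))  # ~$3,504 over four years, on the order of the server price
```

Under these assumed inputs the four-year electricity bill is on the same order as the hardware cost, which is the shape of the Uptime Institute claim.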
Yet achieving high levels of infrastructure energy efficiency is a considerable challenge, particularly for the cooling system, which is the second-largest consumer of energy in the data center (the servers are the largest). Average heat loads in larger data centers currently stand at 8 kW per rack and are projected to grow significantly over the next few years as technologies that require more power and generate more heat continue to be widely adopted. Another trend expected to continue is wide diversity in heat loads from rack to rack, which produces hot spots and hot zones throughout the data center. Imbalanced heat loads across a row make it difficult to optimize data center cooling. Typical measures taken to dissipate the heat, such as increasing airflow, cause efficiency losses and fail to provide the proper environmental conditions, thereby threatening costly downtime.

Fortunately, technologies exist that can dramatically and safely improve cooling energy efficiency. Emerson Network Power recommends a comprehensive efficiency management platform that combines cold aisle containment technology with intelligent environmental controls to deliver greater than 30 percent energy savings and a 25 percent capacity increase while enhancing availability. This solution is capable of cooling 10 kW to more than 30 kW per rack, depending on whether

internal or external cooling is used with the cold aisle containment. The basic requirement for implementing this solution is that the servers be arranged in a hot aisle/cold aisle configuration. It can be used in raised-floor data centers or in data centers built on a slab. It is easy to retrofit into conventional cooling environments, described below, and addresses efficiency deficits inherent in the conventional cooling method.

Conventional Cooling Efficiency Deficits

The conventional cooling method circulates cold air from computer room air conditioning (CRAC) units through a plenum under a raised floor. The CRAC units are located outside the rack rows, around the perimeter of the data center. Arranging server racks in a hot aisle/cold aisle configuration improves the efficiency of raised-floor data centers by raising the temperature of the air returning to the CRAC unit, which allows it to operate more efficiently. However, this practice is not fail-safe from an efficiency standpoint. Cool bypass air escapes through openings in the raised floor, and hot air recirculates through and around racks because there is no physical barrier to this air movement. The resulting mixed air temperatures can in some cases be unacceptably high for the servers at the top of the racks and at the ends of the aisles.

When users discover hot spots, they typically respond by lowering the supply air temperature, which overcools the room. Users also turn on standby units, increasing the volume of air delivered to the cold aisle. Efficiency losses result from this larger volume of air being distributed at temperatures that are too low.
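The hot-spot mechanism described above is simple mixing arithmetic: a server inlet sees a flow-weighted blend of cold supply air and recirculated exhaust. A minimal sketch, with all flows and temperatures assumed for illustration:

```python
# Effect of hot-air recirculation on server inlet temperature.
# The inlet air is a flow-weighted mix of cold supply air and
# recirculated exhaust. All numbers here are assumed examples.

def mixed_inlet_temp(supply_f, supply_frac, exhaust_f):
    """Flow-weighted mixed-air temperature (deg F) at a server inlet."""
    return supply_frac * supply_f + (1.0 - supply_frac) * exhaust_f

# A top-of-rack server drawing 70% cold supply at 60 F and
# 30% recirculated exhaust at 95 F:
print(mixed_inlet_temp(60.0, 0.70, 95.0))  # ~70.5 F at the inlet
```

Even a modest recirculated fraction pulls the inlet well above the supply temperature, which is why operators respond by overcooling the whole room.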
The most energy-efficient and reliable solution is to provide focused cool air at the right temperature and volume directly to the IT equipment, and to maintain those parameters even when conditions in the data center change. This requires a cold aisle containment system and dynamic, adaptive control of room airflow, temperature and humidity. However, to achieve the significant energy efficiency gains this efficiency management platform can deliver, cooling best practices must first be implemented.

Foundation for Energy Efficiency: Cooling Best Practices

Implementing cooling best practices provides baseline improvements in cooling efficiency that save money and improve data center performance. These practices include:

- Sealing moisture out of the data center
- Optimizing airflow (including arranging server racks in a hot aisle/cold aisle configuration)
- Raising supply air temperatures while remaining within the ASHRAE recommended range of 64.4 degrees F (18 degrees C) to 80.6 degrees F (27 degrees C)[3]
- Following proper operating guidelines
- Implementing system monitoring to support preventive maintenance

(Additional information about cooling best practices can be found in the white paper Focused Cooling Using Cold Aisle Containment, available at www.liebert.com.)

Aisle Containment Improves Traditional Cooling Efficiency

Beyond these baseline measures, data center energy efficiency can be further improved by using a physical barrier to prevent hot and cold air from mixing. Aisle containment, as this solution is called, becomes possible once the server racks have been arranged in the hot aisle/cold aisle configuration.
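The supply-temperature best practice can be expressed as a simple range check against the ASHRAE envelope cited above (18 to 27 degrees C):

```python
# Check a supply-air setpoint against the ASHRAE recommended envelope
# cited in the best practices above: 64.4-80.6 F (18-27 C).

ASHRAE_MIN_C, ASHRAE_MAX_C = 18.0, 27.0

def f_to_c(temp_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

def within_ashrae_recommended(supply_temp_f):
    """True if a supply-air temperature (deg F) falls inside the
    ASHRAE recommended 18-27 C window."""
    c = f_to_c(supply_temp_f)
    return ASHRAE_MIN_C <= c <= ASHRAE_MAX_C

print(within_ashrae_recommended(75.0))  # True: 75 F is ~23.9 C
print(within_ashrae_recommended(62.0))  # False: 62 F is ~16.7 C, overcooled
```

A 62 F setpoint, common after hot-spot firefighting, sits below the recommended floor and wastes cooling energy.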

While aisle containment was a relatively unknown practice in 2008, 30 percent of respondents to a survey conducted in mid-2009 by SearchDataCenter.com reported that they had implemented an aisle containment system.[4] The rapid adoption is not surprising, given that aisle containment substantially improves cooling efficiency at a low initial cost and is easy to implement. Aisle containment increases the cooling capacity and energy efficiency of the cooling unit by ensuring that the return air temperature to the CRAC unit is high. The high return air temperature increases the capacity available for cooling the sensible heat generated by the electronic equipment, rather than spending that energy on re-humidifying. In addition, the increased capacity, together with the separation of hot and cold air, makes it possible to cool higher heat loads per rack.

Achieving Higher Efficiency and Capacity Improvements Using Cold Aisle Containment

Data center professionals have the option of containing either the cold aisle or the hot aisle; however, cold aisle containment (CAC) is the solution preferred by Emerson Network Power, as well as by respondents to the SearchDataCenter.com survey mentioned above. When used with focused-cooling technologies, CAC substantially improves data center cooling efficiency. CAC better addresses the task of separating hot and cold air while supplying cold air to the servers. It is easily retrofitted into existing raised-floor data centers or installed on a slab, and it can be implemented so that CRAC airflow is controlled based on server needs. Cold aisle containment also provides a more comfortable environment for data center personnel, and its low initial cost makes it an economical solution as well.

CAC delivers substantial capacity improvements. By preventing the mixing of hot and cold air, CAC effectively captures wasted cooling effort and gives it back in the form of increased capacity.
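The capacity gain from keeping return air hot follows from the standard sensible-heat relation for an airstream, Q = rho * cp * flow * dT: at a fixed airflow, capacity scales directly with the supply-to-return temperature difference. A rough sketch with assumed airflow and temperature figures:

```python
# Sensible cooling capacity of an airstream: Q = rho * cp * flow * dT.
# For air at room conditions, rho * cp is roughly
# 1.2 kg/m^3 * 1.005 kJ/(kg*K), or about 1.21 kJ/(m^3*K).

RHO_CP = 1.21  # kJ per m^3 per K, approximate for air

def sensible_kw(flow_m3_per_s, delta_t_k):
    """Sensible heat (kW) removed by an airstream at a given flow and
    supply-to-return temperature difference."""
    return RHO_CP * flow_m3_per_s * delta_t_k

# Assumed 4 m^3/s of airflow through one CRAC unit:
mixed = sensible_kw(4.0, 10.0)      # 10 K dT when hot and cold air mix
contained = sensible_kw(4.0, 15.0)  # 15 K dT with containment in place
print(mixed, contained)  # ~48.4 kW vs ~72.6 kW from the same fan airflow
```

With containment raising the return temperature, the same airflow removes roughly 50 percent more heat in this example, which is the mechanism behind the capacity recovery described above.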
On average, a room with high-heat-density racks cooled by traditional raised-floor cooling will realize a 10 degrees F increase in supply air temperature when CAC is added.[5] This enables CAC to manage the same heat load using fewer CRAC units.

CAC can be implemented using external cooling or internal cooling. With the external cooling method, the cooling unit is located outside the containment (typically a raised-floor system with perimeter-located CRAC units). CAC with external cooling can typically handle a heat load of 10 to 15 kW per rack, although maximum capacity depends on site-specific factors such as raised-floor height and rack layout. With the internal cooling method, the cooling unit is located inside the containment (typically above or between the racks). CAC with internal cooling can handle a heat load of more than 30 kW per rack, and it can be deployed in a traditional raised-floor environment or on a slab. (For more information about cold aisle containment, including a detailed comparison of hot aisle and cold aisle containment, see the white paper Focused Cooling Using Cold Aisle Containment, available at www.liebert.com.)

Achieving Maximum Efficiency through Intelligent Control

To build on the efficiency improvements offered by cold aisle containment, Emerson Network Power introduced SmartAisle, a complete system that provides high availability along with the highest possible energy efficiency. These goals cannot be reached with CAC alone. To achieve the greatest reduction in energy consumption,

intelligent control capable of dynamically optimizing the operating parameters of the precision cooling environment is required. This intelligence is provided by the Liebert iCOM control system. Intelligent environmental controls enable a CAC structure to become a SmartAisle system, with the potential to reduce energy consumption by more than 30 percent.

Much of the energy reduction results from dynamic fan control of precision cooling units equipped with electronically commutated (EC) plug fans or variable frequency drive fans. Only with this dynamic fan control can the efficiency gains possible with a physical containment system be fully realized. (For a comparison of the efficiency of EC plug fans and variable frequency drive fans, see the technical note Using EC Plug Fans to Improve Energy Efficiency of Chilled Water Cooling Systems in Large Data Centers, available at www.liebert.com.)

Remote sensors in both the hot and cold aisles allow the intelligent controls to ensure proper temperature and humidity at the server inlets. This maximizes the availability of the servers and other IT equipment, the principal reason data center cooling is used. For maximum efficiency gains, the controls then provide continuous, dynamic adaptation to server requirements, demanding less fan power per kilowatt of cooling.

Adaptive Airflow Management

Providing the proper volume of cool air is important to maintaining the conditioned environment for the servers and other sensitive electronics. Dynamic control of fan speed helps maintain an even airflow distribution. As shown in Figure 1, discharge temperature control maintains the cooling capacity of the cooling unit. A dedicated temperature control loop repositions the cooling capacity based on the programmed temperature set-point.
Figure 1. Even distribution of airflow is achieved with dynamic fan speed control. (The figure shows a fan speed control sensor at the server racks and a temperature control sensor at the variable-speed cooling unit, with aisle air temperatures in the roughly 70 to 80 degrees F range.)

Fan speed control maintains the CFM required
to deliver the cool air to the inlet of the server rack. A dedicated fan speed control loop repositions the fans based on the programmed fan speed set-point. Two separate control loops allow flexibility in sensor placement, and the controls can be fine-tuned to respond to rapid or slow changes in load and CFM requirements.

When limited to fixed-speed fans, the only possible energy-saving strategy is to place one of the units in standby. A drawback of this method is that the under-floor air distribution changes when the units rotate or when the standby unit is brought online. With variable fan speed, the fans can maintain the same speed as they increase and decrease capacity, enabling the CRAC units to maintain the same air distribution.

Figure 2 shows the additional efficiency that can be achieved with dynamic fan speed control. In a typical raised-floor data center, cold aisle containment without integrated Liebert iCOM controls reduces energy consumption by 21 percent over no containment when Liebert DS precision cooling is used, and by 15 percent when Liebert CW precision cooling is used. Adding intelligent control yields an additional efficiency gain of 15 percent. This translates into a 33 percent or a 28 percent reduction in energy consumption, depending on whether the precision cooling is a Liebert DS or a Liebert CW system. While the ability to control fan speed provides the greatest energy reduction, intelligent control provides additional efficiency and availability benefits through humidity control, zone management, teamwork operation and redundancy.

Energy use by component (percent of the conventional-cooling baseline):

                   Conventional    With CAC    With CAC and
                   Cooling                     Intelligent Control
  Compressor          69.7%          50.9%          50.4%
  Condenser            9.3%           9.3%           9.3%
  Evaporator Fan      21.0%          18.5%           7.2%
  Total              100%            78.7%          66.9%
  Savings                            21%            33%

Figure 2. Dynamic control provides an additional 15 percent increase in total system efficiency over cold aisle containment alone.
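The outsized fan savings in Figure 2 follow from the fan affinity laws: fan power scales with roughly the cube of fan speed, so a modest speed reduction yields a large power reduction. A short check of the Figure 2 numbers (the speed ratio derived below is an inference from the affinity law, not a figure stated in the paper):

```python
# Fan affinity law: power scales with the cube of speed. The Figure 2
# fan-energy drop (18.5% -> 7.2% of baseline system energy) therefore
# implies a fan-speed reduction of only about 27%.

def fan_power_ratio(speed_ratio):
    """Relative fan power at a given relative speed (affinity law)."""
    return speed_ratio ** 3

def implied_speed_ratio(power_ratio):
    """Relative fan speed that yields a given relative power."""
    return power_ratio ** (1.0 / 3.0)

speed = implied_speed_ratio(7.2 / 18.5)
print(round(speed, 2))  # ~0.73: ~27% slower fans give ~61% less fan energy

# Figure 2 totals, as savings relative to the conventional baseline:
print(round(100 - 78.7))  # 21 percent with CAC alone
print(round(100 - 66.9))  # 33 percent with CAC plus intelligent control
```

This cubic relationship is why dynamic fan speed control, rather than compressor or condenser changes, accounts for most of the additional 15 percent gain.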

Humidity Control

Dew point control is critical when considering the total energy savings of the precision cooling system. If the control does not compensate for temperature changes from set-point, a relative humidity control will add moisture to the air as the temperature rises above set-point. It will then need to remove that moisture as the cooling unit drives the temperature back to set-point, resulting in efficiency losses.

Zone Management

Using intelligent control, data center zones can be created and managed to allow a select group of units to work together. Zone management enables precision cooling in rooms with diverse heat loads, dividing walls and localized high-density cooling.

Teamwork Operation

As discussed previously, heat loads in the data center have become more diverse across the server rows, and this trend is expected to continue. Without proper coordination between CRAC units, the units may operate in different modes of temperature and humidity control. For example, a unit on the north side of the room may sense low relative humidity and add moisture, while at the same time a unit on the south side senses high relative humidity and removes moisture from the air. The actual moisture content of the air is the same in both locations, but because relative humidity is a relative measurement, the higher the temperature, the lower the relative humidity reading. Advanced control systems can be deployed across all the precision cooling units in a room so that the units communicate and coordinate their operation. This prevents such a "fighting" mode and greatly increases the efficiency of the cooling system.

Redundancy

Achieving better energy efficiency should not require compromising availability. Implementing a comprehensive efficiency management platform ensures that required levels of availability are maintained. The Liebert iCOM control system features built-in lead/lag functions that enhance system reliability by ensuring continuous operation.
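The lead/lag behavior can be sketched in a few lines. This is a minimal illustration of the failover policy described in this section, not the actual Liebert iCOM logic; all unit names are hypothetical:

```python
# Minimal sketch of lead/lag failover (illustrative only, not the
# actual Liebert iCOM implementation): an alarm on a primary unit, or
# an out-of-range room condition, brings standby units online.

def units_to_run(primary_units, standby_units, room_in_range):
    """Return the names of the units that should be running."""
    active = [u for u in primary_units if not u["alarm"]]
    standby = list(standby_units)
    # Replace each alarmed primary with a standby unit, if available.
    for u in primary_units:
        if u["alarm"] and standby:
            active.append(standby.pop(0))
    # If the room is still out of range, bring remaining standbys online.
    if not room_in_range:
        active.extend(standby)
    return [u["name"] for u in active]

primaries = [{"name": "CRAC-1", "alarm": False},
             {"name": "CRAC-2", "alarm": True}]
standbys = [{"name": "CRAC-3", "alarm": False}]
print(units_to_run(primaries, standbys, room_in_range=True))
# -> ['CRAC-1', 'CRAC-3']: the standby covers the alarmed primary
```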
If an alarm occurs in a primary unit, the control system automatically activates a standby unit to maintain cooling system performance. If the room temperature or humidity cannot be regulated by the active units, Liebert iCOM will automatically activate standby units to regain control of the space. And in the unlikely event that the master unit fails, each unit defaults to its internal controls, preventing downtime.

Conclusion

The growing requirement to improve IT infrastructure energy efficiency is challenged by increasing heat density and diverse heat loads across rows of servers. The most energy-efficient and reliable solution is to provide focused cool air at the right temperature and volume directly to the IT equipment, and to maintain those parameters even when conditions in the data center change. This requires a comprehensive efficiency management platform that combines cold aisle containment technology with intelligent environmental controls. This solution can deliver greater than 30 percent energy savings as well as a 25 percent improvement in capacity. It also addresses issues of heat density now and for the future, providing substantial efficiency gains without compromising requirements for availability.

References

1. Uptime Institute, 2009. Welcome to Green IT 2.0, the Second Wave - Now What Are You Going to Do About It? www.uptimeinstitute.org.
2. Brill, Ken. Proceedings of the EmTech@MIT 2009 Conference, reported at www.arstechnica.com, October 1, 2009.
3. ASHRAE, 2008. Environmental Guidelines for Datacom Equipment - Expanding the Recommended Environmental Envelope.
4. SearchDataCenter.com, 2009. Data Center Purchasing Survey.
5. Focused Cooling Using Cold Aisle Containment, page 6. www.liebert.com, 2009.

Emerson Network Power
1050 Dearborn Drive, P.O. Box 29186, Columbus, Ohio 43229
800.877.9222 (U.S. & Canada Only) | 614.888.0246 (Outside U.S.) | Fax: 614.841.6022
EmersonNetworkPower.com | Liebert.com

While every precaution has been taken to ensure accuracy and completeness in this literature, Liebert Corporation assumes no responsibility, and disclaims all liability, for damages resulting from use of this information or for any errors or omissions. Specifications subject to change without notice. © 2009 Liebert Corporation. All rights reserved throughout the world. Trademarks or registered trademarks are property of their respective owners. Liebert and the Liebert logo are registered trademarks of the Liebert Corporation. Business-Critical Continuity, Emerson Network Power and the Emerson Network Power logo are trademarks and service marks of Emerson Electric Co. © 2009 Emerson Electric Co.

Emerson Network Power. The global leader in enabling Business-Critical Continuity. AC Power | Embedded Computing | Outside Plant | Connectivity | Embedded Power | Power Switching & Controls | DC Power | Monitoring | Precision Cooling | Racks & Integrated Cabinets | Services | Surge Protection