Running Your Data Center Under Ideal Temperature Conditions


TECHNICAL BRIEF
Mastering Data Center Environmental Controls to Help Drive Your Bottom Line

Challenge
Many data centers waste energy by using inefficient temperature and humidity controls within their computer rooms.

At Stake
Proper temperature and humidity control can extend the life of hardware and deliver significant cost savings to data centers.

Solution
Data centers can closely follow the temperature guidelines recommended by ASHRAE and use an air containment strategy to become more energy efficient and save money.

Data centers are non-stop energy hogs. They require vast amounts of power to run 365 days a year. The primary source of consumption is the server hardware stacked in racks in computer rooms. The other major energy gluttons are the refrigeration systems and fans used for cooling and air circulation. Responsible for almost a third of a data center's energy consumption, these systems fall under the larger umbrella of climate control. The following recommendations offer simple, practical steps you can take to lower cooling costs while maintaining your hardware.

Raise the Thermostat Level

Did you know that increasing room temperature by one degree Celsius can save, on average, between two and four percent of a data center's energy bill? Updated thermal guidelines from the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) for data processing environments support room temperatures of 26.6 degrees Celsius (80 degrees Fahrenheit) and higher.

"Most of our control strategies at CenturyLink revolve around ASHRAE TC 9.9 Thermal Guidelines for Data Centers," said Doug Florek, Senior Lead Engineer at CenturyLink. "We utilize the Class A1 recommended ranges for control of our cooling systems as well as humidity control; however, most service level agreements (SLAs) are written to the allowable temperature and humidity."

ASHRAE guidelines reinforce the idea that servers can perform effectively at higher temperatures. The problem is getting data center managers to buy into this concept. "Equipment manufacturers all do their own independent testing to find the mean time to failure for their equipment," said Florek. "Those numbers are much higher than that of the actual recommended control range."

Nevertheless, if you're a data center manager and your job depends on server availability, the last thing you want to do is take chances. That's why many data centers stay in the low-to-mid 70s Fahrenheit. It's better than the frozen-tundra temperatures of years gone by, but it would be optimal for data center managers to program their computer rooms to match current recommendations.
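As a rough illustration of that rule of thumb, the sketch below applies the two-to-four-percent-per-degree figure to a hypothetical annual energy bill. The bill amount and the setpoint change are invented for the example; actual savings depend on the facility and its cooling plant.

```python
# Back-of-the-envelope savings from raising the setpoint, using the
# 2-4% per degree Celsius rule of thumb cited above.

ANNUAL_ENERGY_BILL = 1_000_000    # USD/year -- hypothetical facility
SAVINGS_PER_DEG_C = (0.02, 0.04)  # 2-4% of the bill per 1 C increase

def estimated_savings(delta_c: float) -> tuple[float, float]:
    """Return (low, high) estimated annual savings for a setpoint
    increase of `delta_c` degrees Celsius."""
    low, high = SAVINGS_PER_DEG_C
    # Treat each degree as saving a fixed share of the bill.
    return (ANNUAL_ENERGY_BILL * low * delta_c,
            ANNUAL_ENERGY_BILL * high * delta_c)

# Raising the room from 22 C to 25 C:
low, high = estimated_savings(3.0)
print(f"Estimated savings: ${low:,.0f} - ${high:,.0f} per year")
# Estimated savings: $60,000 - $120,000 per year
```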

Air Containment

One of the most effective ways to improve cooling efficiency in the data center is to employ a containment strategy. With containment, you separate cold airflow from the hot exhaust air that comes from hardware. This simple separation creates a consistent temperature at the supply intake and prevents hot exhaust air from passing back through the front of the equipment.

In a cold aisle, cooling equipment blows cold air (usually through a raised floor) into the front of computer racks situated to face each other. The hot air from the servers gets discharged to the hot aisle and is returned to the cooling equipment (commonly through return grilles in the ceiling). Orienting rows in such a configuration utilizes energy better. "Our best practice for containment is getting the hottest air possible back to the cooling equipment," said Florek. But that might not be possible if your computer room air conditioning equipment is situated around the perimeter.

If you're not using a roof over a row or enclosing a space, you can use vinyl strips or hard walls to maintain the integrity between rows. The vinyl material is easy to modify to accommodate various mounting conditions.

You should also seal off all cable penetrations within the floor and containment systems, and install blanking plates in unoccupied areas within and around racks. If there's an area where someone has moved out, such as in a colocation facility, pull those perforated tiles out of the floor completely; this is much more efficient than using a damper or dialing them down. Only cold aisles should contain supply registers or perforated floor tiles for cold air supply; all supply air should be removed from areas outside of the cold aisle.

Figure: A hot/cold aisle rack/cabinet with containment.
Figure: A hot/cold aisle rack/cabinet without containment.
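Florek's point about returning the hottest possible air can be made concrete with the standard sensible-heat rule of thumb for air at sea-level density, Q (BTU/hr) ~ 1.08 x CFM x delta-T (F). The airflow and temperature figures in the sketch below are hypothetical, chosen only to show how containment, by widening the supply-to-return delta, lets the same airflow remove more heat.

```python
# Sensible cooling delivered by an air handler at a fixed airflow:
#   Q (BTU/hr) ~= 1.08 * CFM * deltaT(F)   -- standard HVAC rule of thumb
# Containment raises return-air temperature, widening deltaT.

CFM = 10_000  # hypothetical airflow through one cooling unit

def sensible_capacity_btuh(supply_f: float, return_f: float, cfm: float) -> float:
    """Heat removed per hour when air returns at `return_f` and is
    supplied at `supply_f` (degrees Fahrenheit)."""
    return 1.08 * cfm * (return_f - supply_f)

# Without containment, hot exhaust mixes with cold supply air and the
# return comes back lukewarm; with containment it comes back hot.
mixed = sensible_capacity_btuh(supply_f=68, return_f=78, cfm=CFM)
contained = sensible_capacity_btuh(supply_f=68, return_f=95, cfm=CFM)
print(f"Mixed return:     {mixed:,.0f} BTU/hr")      # 108,000 BTU/hr
print(f"Contained return: {contained:,.0f} BTU/hr")  # 291,600 BTU/hr, ~2.7x
```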

Sensors

A common challenge with containment is maintaining temperature consistency within rows. Older rooms use sensors that hang from the ceiling several feet above the equipment, and the readings fail to accurately reflect the temperature of the racks below.

More and more companies, such as CenturyLink, deploy wireless sensors that can be placed directly on a rack and provide a far more precise reading of the server inlet air temperature. "Eighty percent of our sites have wireless sensors on every third rack on the inlet," said Florek. "It lets us better control our systems and have more consistent temperature coming into each rack."

For full coverage, the sensors should be equidistant from one another in areas where the equipment is actively drawing air (avoid placing sensors on inactive sections of a cabinet). Florek also recommends placing each sensor slightly closer to the floor, four to five feet above it: racks tend to be loaded from the bottom, that positioning keeps the sensors closer to the supply side of the airflow, and the higher off the floor a sensor sits, the more likely it is to register high inlet temperatures.

Traditionally, data center temperatures have been controlled based on the air returning to the cooling units. This isn't very practical: the returning air is often warmer, which leads to wasted energy trying to cool down spaces that don't need it. By placing the sensor on the supply side, a manager gets a reading of the airflow in front of a piece of equipment, which is the temperature that needs to be controlled most. "The idea used to be to make a data center as cold as possible," said Florek. "We've learned over the course of the last ten years that having a computer room too cold can be just as harmful to the life of equipment as too hot a temperature."

Similar to the Internet of Things, wireless sensors collect and monitor information. They are automated to identify data center temperature fluctuations outside of the desired range and send messages to the cooling systems to raise or lower temperatures accordingly. Such automation means energy is consumed only when needed.

Figure: An example of sensor placement in hot/cold rows.
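A minimal sketch of the closed-loop behavior described above, assuming a hypothetical sensor-polling function and a band taken from the ASHRAE Class A1 recommended inlet range; a production system would read real telemetry and drive the actual cooling units.

```python
# A minimal sketch of supply-side control: poll rack-inlet sensors and
# act only when readings drift outside the desired band. The sensor
# function and readings are illustrative stand-ins.

DESIRED_BAND_C = (18.0, 27.0)  # ASHRAE Class A1 recommended inlet range

def read_inlet_temps() -> list[float]:
    """Stand-in for polling the wireless inlet sensors (every third rack)."""
    return [24.1, 25.3, 26.8, 23.9]  # hypothetical readings, deg C

def control_step(setpoint_c: float) -> float:
    low, high = DESIRED_BAND_C
    hottest = max(read_inlet_temps())  # worst-case inlet drives control
    if hottest > high:
        setpoint_c -= 0.5  # more cooling: lower the supply setpoint
    elif hottest < low:
        setpoint_c += 0.5  # too cold wastes energy: raise the setpoint
    # Inside the band: do nothing, so energy is spent only when needed.
    return setpoint_c

print(control_step(setpoint_c=22.0))
```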
Humidity

Humidity concerns the moisture in the air. You need to control the humidity in your computer room to prevent issues such as the potential buildup of electrostatic discharge, or fluctuations in temperature, that can have a negative impact on the life of your hardware.

Measuring humidity in data centers really means talking about relative humidity, which is directly affected by air temperature. Percent relative humidity (%RH) is a humidity measurement that varies with the temperature of the air entering the rack. By raising supply air temperatures, you can reduce the dehumidification effect and use less energy. ASHRAE recommends a humidity level of 60% RH, with an allowable range of 20-80% RH.

One of the biggest challenges with humidity is taking the uncontrollable outside air and converting it into controllable interior airflow. Humidifiers in data centers offer high-capacity, low-cost evaporative cooling. The best way to choose a humidifier is to pick a cooling system reflective of the climate of the data center building's location. Depending on the region, humidifiers may be needed minimally or not at all. In most climatic regions, humidifiers are not a critical component of a data center. If you utilize them, you do not need to install them in each of the air conditioning units; a more efficient practice is to place one humidifier in your central air unit.

A lot of data centers also use outdoor air economizers. Economizers use controllable dampers to increase or decrease the amount of air drawn into the data center, depending on the temperature outside. These tools come with high-limit switches that determine whether the outside air is appropriate for cooling; if not, the switch disables the dampers.

You would think that cooler outdoor temperatures would signal the ideal scenario for using the economizer, but this is not always the case. In some cases the outdoor air might be too cold and lack the required humidity to maintain ideal operating conditions, calling on humidification systems and wasting both energy and water. Conversely, there may be times when the return air is hotter than the outside air, but the outside air would require more cooling because of its moisture content. Humid air puts a dehumidification burden on the cooling coils, making them work harder and waste energy.
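The high-limit logic described above can be sketched as a simple gate on outside-air conditions. The dew point thresholds below are illustrative assumptions, not figures from this brief; they encode the two failure modes just discussed (air too dry, forcing humidification, and air too moist, loading the coils).

```python
# A sketch of economizer gating: outside air is admitted only when it
# is cooler than the return air AND within a workable moisture range.
# Thresholds are illustrative assumptions.

def economizer_enabled(outside_c: float, outside_dewpoint_c: float,
                       return_c: float) -> bool:
    """Decide whether the dampers should admit outside air."""
    if outside_c >= return_c:
        return False  # no free cooling available
    if outside_dewpoint_c > 15.0:
        return False  # too moist: dehumidification would waste energy
    if outside_dewpoint_c < -9.0:
        return False  # too dry: humidifiers would burn energy and water
    return True

# Cool but very dry winter air: the high-limit logic keeps dampers shut.
print(economizer_enabled(outside_c=2.0, outside_dewpoint_c=-15.0, return_c=29.0))  # False
# Mild, moderately dry air: free cooling makes sense.
print(economizer_enabled(outside_c=12.0, outside_dewpoint_c=5.0, return_c=29.0))   # True
```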

Conclusion

There is no one-size-fits-all method for achieving data center energy efficiency. Data center managers can, however, utilize temperature and humidity controls that deliver consistent readings, and save energy by adopting higher thresholds. Best of all, you can achieve these savings without buying new equipment or building new structures. Investigate your options and make the move to improve the climate within your data center and lower your energy consumption:

- Follow ASHRAE guidelines for temperature control.
- Be fanatical about containment. Keep the cold supply air separate from the hot return air to increase cooling efficiency and assure proper operating temperatures for the IT kit.
- Place more sensors closer to the equipment to monitor server inlet temperatures.
- Manage humidity to provide a stable and efficient operating environment in the data center.
- Research whether an outdoor air economizer makes sense in your situation.

Master your environmental management to maintain IT equipment and improve energy efficiency.

About CenturyLink

CenturyLink, Inc. is the third largest telecommunications company in the United States. Headquartered in Monroe, LA, CenturyLink is an S&P 500 company and is included among the Fortune 500 list of America's largest corporations. CenturyLink Business delivers innovative private and public networking and managed services for global businesses on virtual, dedicated and colocation platforms. It is a global leader in data and voice networks, cloud infrastructure and hosted IT solutions for enterprise business customers. For more information visit www.centurylink.com/enterprise.

© 2015 CenturyLink. All Rights Reserved.