Trading TCO for PUE? A Romonet White Paper: Part I

Executive Overview

Most data center vendors claim to improve Power Usage Effectiveness (PUE). The problem is that focusing on reducing PUE makes little financial or environmental sense for many enterprises and multi-tenant data center (MTDC) operators. Most organizations lack the accurate operational and financial data needed to analyze, model and predict how their facilities should, and actually do, behave, or how they compare with others. We demonstrate this with one data center design that, when modeled using Romonet's platform, showed no financial or environmental benefit despite a reduction in PUE: the initial $500,000 investment would actually deliver an overall lifetime loss of $400,000.

It is critical to understand the accuracy of any TCO claim. Terms such as total cost of ownership (TCO) and return on investment (ROI) mean nothing when analyses are based on weak proxy indicators, e.g. free cooling hours or carefully selected edge cases. Many published analyses contain claims based on inaccurate or irrelevant estimations that yield lower than claimed, or no, overall savings in the real world. Modeling your data center should not be based on guesswork, as Romonet explains in this paper.

The Example Data Center

So, how was the aforementioned loss of $400,000 identified? The data center modeled was:

- A new build 1.2 MW IT load facility designed to Tier III standards
- Direct outside air economizers and two modern IT equipment environmental ranges
- Good quality components, with a reasonable balance of capital cost and total cost of ownership

The "what if" scenario was: should we add adiabatic (1) cooling capability to the Air Handling Units (AHUs)? In theory, this enables the data center to operate without mechanical cooling for more of the year. The instinctive response to the proposed change is: yes, free cooling means better energy efficiency, especially in an arid climate such as New Mexico, as long as water costs are controlled. The issue is that the adiabatic components are expensive to purchase and maintain.

These are the challenges data center owners meet on a daily basis. Should we use non-contained or contained air flow (Appendix 1)? What is free cooling really worth to us? Is the climate too hot or too dry?

It is common to assess the performance of cooling initiatives by comparing how many free cooling hours per year are achievable. There are many rules of thumb about x% savings for each degree change in temperature, and free cooling hours are often mistaken for a proxy for cost savings. The table below shows how much of the year our site would spend in full free cooling, full mechanical cooling, or partial free cooling with mechanical top-up.

(1) Adiabatic evaporative cooling and humidification

Condition                                Normal 75°F   Adiabatic 75°F   Normal 58°F   Adiabatic 58°F
Full free cooling (no compressor)        30%           96%              21%           69%
Partial free cooling (part compressor)   14%           2%               20%           15%
Full mechanical cooling                  56%           2%               59%           16%

Calculations, Not Assumptions

With these results in mind, surely the adiabatic option is the correct choice, given its significant increase in free cooling hours over the standard outside air system? The answer lies in accurate predictive analysis. To evaluate the potential cost savings of the cooling strategies, the following conditions were used:

- A base power cost of $0.058/kWh
- An average IT load of 1,000 kW
- A cooling overhead "fudge factor" of 25%, to approximate the cost of cooling each kW of IT load

The expected operational cost savings were:

                           Normal 75°F   Adiabatic 75°F   Normal 58°F   Adiabatic 58°F
Mech. cooling hours        6,090         380              6,929         2,747
Mech. cooling kWh          1,522,500     95,000           1,732,250     686,750
Mech. cooling cost         $88,305       $5,510           $100,471      $39,832
Predicted annual saving                  $82,795                        $60,639
Predicted 10-year saving                 $827,950                       $606,390
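To make the arithmetic behind these figures explicit, the following is a minimal sketch (in Python; the variable names and structure are ours, not from the paper) of the naive "fudge factor" estimate:

```python
# Naive "fudge factor" savings estimate, reproducing the paper's arithmetic.
# These are the same numbers the paper goes on to show are misleading.

POWER_COST = 0.058        # $ per kWh (base power cost)
IT_LOAD_KW = 1_000        # average IT load in kW
COOLING_OVERHEAD = 0.25   # fudge factor: cooling power as a fraction of IT load

def mech_cooling_cost(mech_hours: float) -> float:
    """Annual cost of mechanical cooling for the given compressor hours."""
    cooling_kw = IT_LOAD_KW * COOLING_OVERHEAD   # 250 kW of cooling load
    return mech_hours * cooling_kw * POWER_COST  # kWh times $/kWh

# Annual mechanical cooling hours from the table above.
scenarios = {
    "Normal 75F": 6_090, "Adiabatic 75F": 380,
    "Normal 58F": 6_929, "Adiabatic 58F": 2_747,
}

for normal, adiabatic in [("Normal 75F", "Adiabatic 75F"),
                          ("Normal 58F", "Adiabatic 58F")]:
    saving = mech_cooling_cost(scenarios[normal]) - mech_cooling_cost(scenarios[adiabatic])
    print(f"{adiabatic}: annual saving ${saving:,.0f}, 10-year ${saving * 10:,.0f}")
```

Running this reproduces the $82,795 and $60,639 annual figures above; as the next section explains, the formula is arithmetically sound, but the flat 25% overhead and the hours-as-proxy assumption make the result misleading.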

Simple mathematics aside, and despite being derived from a reasonable formula, these estimates are completely incorrect. This is partly because of factors such as climate interactions and varying mechanical compressor efficiency. More importantly, it is because they are based on an irrelevant and deeply misleading metric.

PUE: Understanding Why

This leads us to more detailed analyses. The charts below show full hourly simulations of both the normal (red) and adiabatic (blue) designs operating at both air supply temperatures. (2)

The first chart shows that with adiabatic cooling, the PUE is almost flat for the whole year, meaning almost no mechanical cooling. Without the adiabatic option there is more mechanical cooling, but predominantly in the winter, when outside air is too dry to be supplied to the data hall, not in the summer where you might normally expect it.

The second chart shows that at 58°F, even with adiabatic cooling, the data center relies heavily on mechanical refrigeration: in the summer, the air is too hot or too humid. The non-adiabatic option is fairly constant in its use of mechanical cooling in both winter and summer.

Water: The Precious Resource

Predicted water consumption is now as critical to many operators as energy profiling. The acquisition, logistics, filtration, storage and pricing of water are crucial to controlling operational costs in the data center. To meet the minimum IT humidity and temperature targets, the humidifier sections of the adiabatic systems must use water; whenever outside air is used, this humidity must be added constantly, unlike in a traditional recirculating air system.

Next, the water consumption costs (3) were modeled against a reasonable sample price for the region ($22 per 1,000 gallons). The results (Appendix 2) show the 75°F adiabatic option costs $17,805 per year in water and the 58°F option $9,170. When this is fed into an accurate prediction methodology, the real financial outcomes become much clearer.

(2) These simulations take into account cooling load and varying DX compressor performance. Note: for readability, the PUE axis starts at 1.0, not 0.
(3) Water consumption was calculated by determining the water required to meet the change in air moisture, with a 50% overhead added to account for water lost through flushing and other processes (a conservative estimate).
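A minimal sketch of the water-cost arithmetic described in footnote 3 and Appendix 2 (Python; the constants and helper name are ours, and the liter figures come from the hourly simulation, not from this formula):

```python
# Water cost arithmetic per Appendix 2: annual liters from the hourly
# simulation, a 50% loss overhead, then a regional price per 1,000 gallons.

LITERS_PER_GALLON = 3.785
WATER_PRICE = 22.0      # $ per 1,000 gallons (sample regional price)
LOSS_OVERHEAD = 0.50    # flushing and other losses (conservative estimate)

def annual_water_cost(humidification_liters: float, adiabatic_liters: float) -> float:
    """Annual water cost including the 50% loss overhead."""
    total_liters = (humidification_liters + adiabatic_liters) * (1 + LOSS_OVERHEAD)
    gallons = total_liters / LITERS_PER_GALLON
    return gallons / 1_000 * WATER_PRICE

# Appendix 2 simulation figures for the two adiabatic options:
print(f"Adiabatic 75F: ${annual_water_cost(1_551_000, 493_000):,.0f}")  # ~ the paper's $17,805
print(f"Adiabatic 58F: ${annual_water_cost(696_000, 357_000):,.0f}")    # ~ the paper's $9,170
```

The output matches the Appendix 2 costs to within rounding of the published liter totals.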

When the Savings Don't Mount Up

Given the low water costs, you might expect there is still hope for savings from the adiabatic investment. So how did we arrive at the conclusion that the investment would actually cost the data center $400,000? As shown below, the small reduction in operating energy costs cannot offset the initial capital investment.

Capital cost differences come first. The additional cost to fit the AHUs with adiabatic sections is $500,000, with a further $10,000 per annum for maintenance and treatment. The IT power draw is modeled as rising from 250 kW to 1 MW over the first four years, then remaining at 1 MW for the remaining years. The average utility energy cost is $0.058 per kWh.

We then evaluated both adiabatic options over a 10-year period with a 7% discount rate and flat water and power prices. In fact, if price inflation for power and water were included, the losses would have been even larger.

                       Adiabatic 75°F   Adiabatic 58°F
Capital cost           $500,000         $500,000
10-year PV saving      -$415,000        -$421,000

Despite deliberately selecting a climate that favors the engineering decision, there is still no financial argument to support the additional capital expenditure. Whether building, expanding, consolidating, acquiring or managing a data center, the next time you have a financial or operational decision to make, do not rely on speculation to deliver your business outcomes. Instead, ensure you model with accuracy and the correct data.
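To illustrate the present-value arithmetic only (this is not Romonet's model), here is a minimal Python sketch of the 10-year discounted cash flow. The annual net-saving figures are placeholders, since the paper does not publish the modeled year-by-year cash flows; only the capital cost, maintenance cost, discount rate and load ramp come from the paper.

```python
# Illustrative 10-year discounted cash flow for the adiabatic retrofit.
# NOT Romonet's model: the per-year net savings below are placeholders;
# the paper's -$415,000 figure comes from its full hourly simulation.

DISCOUNT_RATE = 0.07   # per the paper's evaluation
CAPEX = 500_000        # additional cost of adiabatic AHU sections
MAINTENANCE = 10_000   # $/year for maintenance and treatment

def ten_year_pv_saving(net_savings_per_year: list[float]) -> float:
    """Present value of (savings minus maintenance) each year, less capital cost."""
    pv = -CAPEX
    for year, net in enumerate(net_savings_per_year, start=1):
        pv += (net - MAINTENANCE) / (1 + DISCOUNT_RATE) ** year
    return pv

# Hypothetical inputs: energy savings net of water cost, scaled by the
# modeled IT load ramp (250 kW -> 1 MW across the first four years).
ramp = [0.25, 0.50, 0.75, 1.00] + [1.00] * 6
placeholder_full_load_net = 20_000  # $/year at 1 MW; illustrative only
pv = ten_year_pv_saving([placeholder_full_load_net * r for r in ramp])
print(f"10-year PV saving: ${pv:,.0f}")  # negative: the capex is not recovered
```

Even with generous placeholder savings, the discounted operating benefit is an order of magnitude smaller than the $500,000 capital outlay, which is the shape of the result the paper reports.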

Appendix

Appendix 1: ASHRAE Class A1 Ranges and Control Boundaries

Parameter                  Non-Contained Air Flow      Contained Air Flow
Max IT inlet temperature   21°C / 70°F                 24°C / 75°F
Supply air temperature     14.5°C / 58°F               24°C / 75°F
Minimum humidity           20% RH at 21°C              20% RH at 24°C
Maximum humidity           17°C dew point (~75% RH)    17°C dew point (~65% RH)

Appendix 2: Water Consumption Calculation Methodology

                              Adiabatic 75°F   Adiabatic 58°F
Humidification hours          4,657            3,376
Annual liters (H2O)           1,551,000        696,000
Adiabatic cooling hours       1,226            2,122
Annual liters (H2O)           493,000          357,000
Total annual H2O (liters)     2,045,000        1,053,000
Annual H2O (incl. losses)     3,067,000        1,580,000
Annual water cost             $17,805          $9,170

About Romonet

Established in 2006, Romonet provides the data center industry's only end-to-end, cloud-based management platform built on a native Big Data architecture. This combination of modeling, simulation, financial and infrastructure performance services gives customers the capability to accurately provision, predict, model and control their owner-operated, leased and public cloud estates, the associated natural resources, and their immediate and long-term capacity. Romonet's platform is the simplest commercially viable way for the CFO and CIO to accurately understand the most strategic asset in a company's portfolio. This value extends through to the engineering teams responsible for maintaining a data center's availability, quality of service and performance management.

www.romonet.com | info@romonet.com | +1 415 578 0049 | +44 20 8256 0250