
White Paper
Intel Information Technology
Computer Manufacturing
Data Center Efficiency

Increasing Data Center Efficiency through Metering and Monitoring Power Usage

To increase data center energy efficiency at one of our older data centers in India, Intel IT's Data Center Services group collaborated on a technology study with the IT compute organization and Intel Facilities Management to develop a comprehensive approach to metering power usage. We developed methods for identifying measurable efficiency improvements and placed instrumentation to continuously track power usage effectiveness (PUE), the key metric of data center energy efficiency. Using PUE metrics allowed us to make decisions that increased efficiency, helped achieve optimum data center facility utilization, and provided data we can share with other Intel facilities around the world to proliferate energy savings.

Anand Vanchi, Sujith Kannan, and Ravi Giri, Intel Corporation
June 2009

Executive Summary

To increase energy efficiency at a five-year-old data center in India, Intel IT's Data Center Services group, the IT compute organization, and Intel Facilities Management collaborated to develop a consistent way to measure data center energy efficiency, including return on investment (ROI), IT performance, and facility utilization. Our integrated approach to measuring total IT power utilization involved metering power usage at the granularity of a row of servers.

Energy is the highest operating cost in our data centers, and Intel IT faces the challenge of supporting ever-increasing electrical energy consumption for design engineering and enterprise computing. To address these concerns at our data center in India, which was built on older design principles, we developed methods to identify efficiency improvements and implemented instrumentation to continuously track power usage effectiveness (PUE) and data center efficiency (DCE). These metrics delivered actionable inputs that helped improve energy efficiency, saving resources at this and other Intel facilities in India. By defining the diminishing returns of a narrow focus on facilities management and on improving the PUE of the data center, the metrics also reinforced the need to focus on overall IT performance efficiency to achieve greater long-term ROI for data center efficiency efforts.

Close collaboration between Data Center Services, the IT compute organization, and Intel Facilities Management throughout the project life cycle led to greater savings than originally anticipated, because we could measure each variable with a potential impact on electrical energy consumption. This integrated approach enabled us to chart a future course of action for the data center's energy efficiency roadmap. Improving the PUE of this data center from 1.99 in 2007 to 1.81 in 2008 provided performance and financial benefits including:

- Annual energy cost savings of USD 77,000 for a data center with an average IT load of 9,672 kilowatt-hours (kWh) per day
- 10 percent improvement in overall operational efficiency
- Detailed ROI justification for retrofitting the data center to further improve rack power density by 2.8x, from 5 kilowatts (kW) per rack to 14 kW per rack

Based on the success of this initiative, the project team has become a key contributor to the Intel IT Global Data Center Manageability work group. This work group addresses metering, monitoring, and efficiency requirements at Intel IT data centers around the world, helping to proliferate a collaborative strategy for an enterprise-wide "single pane of glass" view into data center manageability. An integrated IT-Facilities point of view will help us manage and correlate capacity and utilization.

Contents

Executive Summary
Business Challenge
Solution
  Building a Collaborative Team
  Defining the Project
  Addressing Business Value Considerations
  Establishing Baseline Measurements and Improvement Goals
  Implementing the Solution
Results
Next Steps
Conclusion
Authors
Acknowledgements
Acronyms and Terms

Business Challenge

Energy consumption is a critical concern for IT organizations worldwide as the cost of operating data centers increases due to the growing use of computing devices and rising energy costs. Compounding these factors, data centers that were considered state of the art just five years ago now lag behind in energy-efficient technologies.

Intel IT has taken significant steps to reduce its environmental impact, especially in data centers, where energy is a top operating cost. As part of these efforts, Intel IT's Data Center Services group developed a collaborative strategy with the IT compute organization and Intel Facilities Management to meter and monitor data center power usage at a five-year-old data center in India. While Intel has a number of state-of-the-art data centers with aggressive power usage effectiveness (PUE) numbers, we also needed to look at cost-effective ways to update our older data centers, which, as for most companies, represent the highest percentage of data centers currently in operation.

The challenges we faced at this older data center were multifaceted:

- India, like other emerging economies, was witnessing a widening gap between supply and demand for power resources.
- Available systems for measuring data center efficiency were intensely manual, error prone, and limited to a single point in time.
- Available energy consumption data resided in multiple, disparate systems across the IT and facilities organizations.
- Computing demands from Intel product groups were becoming more complex.

All of these factors made it difficult to measure electrical energy consumption in the data center and take specific actions to reduce consumption. Furthermore, some of the infrastructure systems, such as the chiller plant, were shared between the data center and other facility needs, hampering targeted measurements.

India's Data Center Efficiency Initiative

As India plans to be the world's first market for trading in energy savings, the Government of India has formed a national policy group to focus on data center energy efficiency and to develop a National Design Code for Energy Conservation in Data Centers and a Best Practice Manual for Indian Data Centers in consultation with industry. As part of our commitment to IT sustainability, Intel IT is an active participant in this group.

Solution

Through a collaborative process, we improved energy efficiency at our five-year-old India data center by metering and monitoring electrical energy consumption at a very granular level, achieved by implementing instrumentation that enabled continuous tracking of PUE, the key data center efficiency metric. Metering facility utilization for power, cooling, and losses at key points in power distribution provided practical information for implementing efficiency improvements.

Building a Collaborative Team

It was critical for the stakeholders (Data Center Services, IT computing, and Facilities Management) to collaborate throughout the project life cycle so that we could measure each variable that had a potential impact on energy consumption and maintain a holistic view of total facility operating costs.

Defining the Project

The primary goal of this project was to target potential cost-saving opportunities. We focused on identifying current operational costs, setting baseline measurements, implementing energy efficiency processes based on the information we gathered, and moving toward our defined goal. Once we implemented our project, we would continue to measure and make improvements.

Addressing Business Value Considerations

Having consistent methods for measuring data center efficiency (DCE) that encompassed important business factors like return on investment (ROI), IT performance, and facility utilization was critical in helping us address challenges such as:

- Rightsizing facility infrastructure and considering appropriate load diversity.
- Increasing the efficiency of the uninterruptible power supply (UPS) system.
- Reducing power consumption for data center cooling.
- Designing the optimum power and cooling distribution architecture within the data center.
- Eliminating redundancies to align facilities with business needs.
- Determining plans for legacy servers, which yield lower performance per watt of power consumed compared to current servers based on multi-core processors.
- Dealing with geographic factors, such as humidity, temperature, and utility reliability.

Once we completed all of our analysis and planning, we were ready to start metering energy consumption.

Establishing Baseline Measurements and Improvement Goals

Our older India data center is a 5,500-square-foot facility built on older design principles: a 24-inch raised floor, a 10-foot-high false ceiling that is not used as a return air plenum, and no hot-aisle containment. It has a power density of 110 watts per square foot (WPSF), a 2(N+1) UPS power redundancy configuration, and ductless chilled-water-based precision air conditioning (PAC) units in an N+1 cooling redundancy configuration.
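As a quick sanity check on these baseline figures, the floor area and design power density imply a total design capacity that can be compared with the average IT draw reported later in this paper (9,672 kWh per day in 2008). The sketch below is illustrative only and assumes the 110 WPSF rating applies to the full 5,500 square feet:

    # Rough capacity check for the India data center (illustrative only).
    FLOOR_AREA_SQFT = 5500      # facility size from this paper
    DESIGN_WPSF = 110           # design power density, watts per square foot

    design_capacity_kw = FLOOR_AREA_SQFT * DESIGN_WPSF / 1000.0   # 605 kW

    # Average IT draw implied by the measured 2008 load of 9,672 kWh/day.
    avg_it_load_kw = 9672 / 24.0                                  # ~403 kW

    print(f"Design capacity: {design_capacity_kw:.0f} kW")
    print(f"Average IT draw: {avg_it_load_kw:.0f} kW "
          f"({100 * avg_it_load_kw / design_capacity_kw:.0f}% of design)")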

Determining a Process for Data Collection

Before we could begin metering, we needed to address a number of challenges in our older data center:

- Isolating data center power and cooling loads from the rest of the building facilities.
- Metering facility utilization (power, cooling, and losses) at the right points in the power distribution cycle so that we collected the most useful information for efficiency improvements.
- Identifying the optimum levels of granularity for energy metering.

To resolve these issues, we determined that we needed to implement energy meters at the row level in our older data center. This enabled us to distinguish between electrical energy consumption for IT equipment and for the rest of the building facilities, to meter and monitor total facility utilization at a very granular level to establish baseline metrics, and to continually track PUE, the key data center efficiency metric.

We also measured and analyzed all electrical energy consumption of the data center operation to gather accurate efficiency measurements. This meant identifying the optimum, comprehensive level of granularity for power and cooling metering by:

- Measuring existing energy efficiency levels.
- Identifying current operational power costs and setting the baseline.
- Targeting potential cost-saving opportunities and setting the goals.
- Implementing the efficiency improvement process using the existing knowledge base.
- Continuously measuring and working on improvements in data center operational cost structures.

Key Metric: Power Usage Effectiveness

We used calculations advocated by Green Grid*, a global consortium dedicated to advancing energy efficiency in data centers and business computing ecosystems:

Power usage effectiveness (PUE) = Total facility power / Total IT equipment power

As seen in Figure 1, the reciprocal, data center efficiency, is defined as:

Data center efficiency (DCE) = 1/PUE = Total IT equipment power / Total facility power

Total facility power

Total facility power is defined as the power measured at the utility meter that is dedicated solely to the data center. Data center total facility power includes everything that supports the IT equipment load, such as:

- Power delivery components, including UPSs, switch gear, generators, power distribution units (PDUs), batteries, and distribution losses external to the IT equipment
- Cooling system components, such as chillers, computer room air conditioning (CRAC) units, direct expansion (DX) air handler units, pumps, cooling towers, and automation
- Compute, network, and storage nodes
- Other miscellaneous component loads, such as data center lighting and the fire protection system

Figure 1. Identifying power metering points to capture IT utilization data. (The figure traces power from the grid through the building load, switch gear, UPS, and battery backup, and through cooling equipment such as chillers and CRAC units, to the IT load from servers, storage, and telco equipment, with PUE and DCE computed from these metering points.)
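Once total facility power and IT equipment power are isolated by the meters, the PUE and DCE arithmetic is trivial; a minimal sketch in Python (the readings are hypothetical, chosen to land near this data center's 2008 figures):

    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        """Power usage effectiveness: total facility power / IT equipment power."""
        return total_facility_kw / it_equipment_kw

    def dce(total_facility_kw: float, it_equipment_kw: float) -> float:
        """Data center efficiency, the reciprocal of PUE."""
        return it_equipment_kw / total_facility_kw

    # Hypothetical instantaneous readings: IT load plus the power delivery,
    # cooling, and lighting overheads that make up total facility power.
    it_kw = 403.0
    overhead_kw = 326.0          # UPS losses + cooling + lighting, etc.
    total_kw = it_kw + overhead_kw

    print(f"PUE = {pue(total_kw, it_kw):.2f}")   # ~1.81
    print(f"DCE = {dce(total_kw, it_kw):.0%}")   # ~55%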

Power measurement in mixed-use buildings (buildings that house data centers as one of several functions) requires metering that distinguishes the power used for the data center. Figure 2 shows the UPS power metering layout we used to measure total facility power in our older data center.

IT equipment power

IT equipment power is defined as the effective power used by the equipment that manages, processes, stores, or routes data within the raised-floor space, including:

- The load associated with all of the IT equipment, such as compute, storage, and network equipment
- Supplemental equipment, such as keyboard, video, mouse (KVM) switches; monitors; and workstations and laptops used to monitor or otherwise control the data center

Typically, IT equipment power is monitored at the rack level using metered PDUs; however, we found continuous monitoring at the row level, using energy meters in the electrical distribution boxes, to be a more effective approach.

Figure 2. Uninterruptible power supply (UPS) power metering measured the total facility power of the data center. (The layout shows two UPS sources, one with two 400 kVA units and one with four 300 kVA units, feeding a UPS input panel, wrap panel, and four main switchboards that supply the server rows, the network room extension panel, and individual network equipment, with spare breakers on each switchboard.)
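With row-level meters, total IT equipment power is simply the sum of the row readings, and comparing that sum against the metered UPS output exposes the distribution losses external to the IT equipment. A sketch of that roll-up, with hypothetical row names and readings:

    # Hypothetical per-row energy meter readings (kW); row-level metering
    # replaces per-rack metered PDUs in our approach.
    row_meters_kw = {"A": 58.2, "B": 61.7, "C": 55.4,
                     "D": 63.0, "E": 60.1, "F": 57.9}

    it_equipment_kw = sum(row_meters_kw.values())

    # UPS output measured upstream; the gap between it and the row total
    # is distribution loss external to the IT equipment.
    ups_output_kw = 372.0
    distribution_loss_kw = ups_output_kw - it_equipment_kw

    print(f"IT equipment power: {it_equipment_kw:.1f} kW")
    print(f"Distribution losses: {distribution_loss_kw:.1f} kW")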

Table 1. Location of Data Center Power and Cooling Meters

Power metering
- Input side: uninterruptible power supply (UPS) input panel
- Output side: row-level power distribution panel

Cooling metering
- Precision air conditioning (PAC) units: at the distribution panel supplying power to the PAC units
- Multi-use chiller plant: flow meters and temperature sensors used to isolate the cooling consumption of the data center alone

Implementing the Solution

To monitor PUE in our older data center, we installed energy meters to measure IT power loads. Because the chilled water plant was shared between the data center, labs, and office space, we installed flow meters and temperature sensors along with regular energy meters to measure cooling loads. All of the meters were connected to the building management system (BMS) for continuous availability of measurements. Table 1 describes the locations of the meters, and Figure 3 shows the locations of the energy meters used to measure cooling.

Collecting Measurements

We measured the following items for potential energy efficiency savings:

- UPS and distribution losses
- IT load electrical energy utilization
- Total electrical energy utilization by PAC units in the data center
- Electrical energy utilization for running make-up air units for the data center
- Electrical energy utilization for cooling the UPS room
- Electrical energy utilization at the chiller plant, with respect to data center usage
- Lighting load measurements

These granular measurement levels provided us with optimal metrics for tracking and monitoring electrical energy utilization. We rolled out measurement activities at various times during the year, based on input received from continuous metering of facility consumption in the data center.

Figure 3. Cooling power metering isolated data center power from the rest of the building facilities. (Meters on the air conditioning panels of the two main switchboards cover the air-cooled chiller, all air conditioning units in the data center, and the air conditioning units in the UPS room.)
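The flow meters and temperature sensors on the shared chiller plant allow the data center's cooling consumption to be isolated: the thermal load on the data center's chilled-water branch follows from the measured flow rate and the supply/return temperature difference. A sketch of that standard calculation, with hypothetical sensor readings (the paper does not publish the plant's actual values):

    # Thermal load on the data center's chilled-water branch:
    # Q = m_dot * c_p * dT, using the flow meter and the supply/return
    # temperature sensors installed on that branch.
    RHO_WATER = 1000.0      # kg/m^3
    CP_WATER = 4.186        # kJ/(kg*K)

    flow_m3_per_h = 95.0    # hypothetical flow-meter reading
    t_supply_c = 7.0        # hypothetical chilled-water supply temperature
    t_return_c = 12.0       # hypothetical return temperature

    mass_flow_kg_s = flow_m3_per_h * RHO_WATER / 3600.0
    cooling_load_kw = mass_flow_kg_s * CP_WATER * (t_return_c - t_supply_c)

    print(f"Data center cooling load: {cooling_load_kw:.0f} kW thermal")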

Figure 4. Actual and projected savings from improving power usage effectiveness (PUE) in an older data center.

Power Cost Savings Yielded by Improvements in PUE, and Projected Potential Savings

Scenario                                           IT (A)   Facilities (B)   Total (A+B)   PUE    DCE   Annual Cost (USD)
2007 annual load                                   10,164   10,103           20,267        1.99   50%   887,693
2007 load normalized to 2008 IT load level          9,672    9,575           19,247        1.99   50%   843,023
2008 annual load after efficiency improvements      9,672    7,818           17,490        1.81   55%   766,066
Potential impact of retrofits (PUE = 1.68)          9,672    6,577           16,249        1.68   60%   711,706

Loads are average kilowatt-hours per day; PUE = (A+B)/A and DCE = 1/PUE.
Actual savings achieved in 2008 over 2007: USD 76,597
Potential savings in 2009 over 2007, after the planned data center retrofit: USD 131,317

Analyzing Electrical Energy Utilization

We analyzed all electrical energy utilization of the data center operation to generate reports that would allow us to make modifications to the older data center. Reports included continuous PUE measurements, the impact on PUE of variation in IT load, and the impact on PUE of variation in facility utilization corresponding to variation in load.

Modifying the Data Center

Based on our measurements and analysis, we took a number of subsequent actions:

- Paralleling the UPS to increase utilization levels, thereby increasing efficiency and reducing distribution losses.
- Increasing the PAC temperature set point from 19 to 23 degrees Celsius.
- Implementing humidity controls as needed; we found these controls were required for only three months of the year.
- Managing airflow inside the data center.
- Managing load spread across the data center floor.
- Using LED lighting for emergency lighting inside the data center.
- Managing standard lighting to reduce consumption.

Each of these actions required in-depth planning and coordinated execution. As part of our ongoing work to improve energy efficiency in our older data centers, we plan to share our experiences and best-known methods in future publications.

Results

This facility efficiency program emphasized the importance of PUE as a metric and demonstrated how we could consistently measure and manage PUE in data centers to improve energy efficiency. We achieved the following results, summarized in Figures 4 and 5:

- Achieved annualized operational power cost savings of USD 77,000 and more than 10 percent improvement in overall operational data center efficiency in 2008 by improving PUE from 1.99 to 1.81.
- Identified the potential for further savings from additional retrofits: incremental savings of 20 percent by improving PUE to 1.68, for total annualized power cost savings of USD 131,000.
- Established the relationship between power density optimization and improved efficiency.
- Contributed to the ROI justification for retrofitting the data center to a higher density and capacity, refreshing four-year-old servers based on single-core processors with high-performance, high-density servers based on multi-core processors.
- Proliferated the knowledge we gained globally within the organization, including our contributions on data and architecture.
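The annualized costs in Figure 4 follow directly from the per-day loads. Dividing the stated costs by the implied annual consumption suggests a tariff of roughly USD 0.12 per kWh; that tariff is our inference, not a figure stated in this paper. The sketch below reproduces the 2008-over-2007 comparison to within rounding of the reported savings:

    TARIFF_USD_PER_KWH = 0.12   # inferred from the Figure 4 costs, not stated

    def annualized(it_kwh_day: float, facilities_kwh_day: float) -> dict:
        """Derive the Figure 4 metrics for one scenario."""
        total_day = it_kwh_day + facilities_kwh_day
        return {
            "pue": total_day / it_kwh_day,
            "dce": it_kwh_day / total_day,
            "annual_cost_usd": total_day * 365 * TARIFF_USD_PER_KWH,
        }

    before = annualized(9672, 9575)    # 2007, normalized to 2008 IT load
    after = annualized(9672, 7818)     # 2008, after efficiency improvements

    print(f"PUE {before['pue']:.2f} -> {after['pue']:.2f}")
    print(f"Annual savings: USD "
          f"{before['annual_cost_usd'] - after['annual_cost_usd']:,.0f}")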

We learned the importance of collaborating to implement change, as well as to save resources and improve ROI, through the development of a partnership between stakeholders in data center operations, the IT computing organization, and facilities management. Because the team worked together throughout the project life cycle, this integrated approach resulted in greater savings than originally anticipated, as we could consistently measure each variable that potentially impacted energy consumption.

The results of this project also stimulated the creation of a global work group focused on data center management and provided the initial ROI values and architectural approach for our worldwide effort. This work group will investigate, develop, and evaluate design options for integrating data center IT and facility asset management systems for performance optimization and enterprise-wide monitoring and management.

Figure 5. Total facilities load reduction. Despite an increase in the overall IT load, there was a reduction in total facilities load.

Share of Total Facility Load                 2007      2008
Lighting Load                                0.79%     0.46%
Cooling Load                                 42.04%    36.50%
Uninterruptible Power Supply (UPS) Losses    7.02%     7.74%
IT Load                                      50.15%    55.30%

Next Steps

Through our study, we determined that the initial investment to improve the PUE from 1.99 to 1.81 was approximately USD 7,000. However, we also learned that efficiency gains from facilities optimization are finite: the reduction in facilities utilization cannot go on forever. While that doesn't mean the quest for higher DCE necessarily ends, we determined that further improvement in PUE to 1.68 would necessitate a much higher investment in retrofitting, estimated at several hundred thousand U.S. dollars. Based on analysis of these findings, we could identify the point of diminishing ROI returns for each further step in PUE improvement, as shown in the conceptual representation in Figure 6.

Figure 6. Diminishing return on investment (ROI) for further improvements in power usage effectiveness (PUE). (The figure plots ROI against data center efficiency, DCE = 1/PUE, contrasting facilities optimization with IT infrastructure optimization.) For the same PUE, IT infrastructure optimization (less power for the same compute output) provides additional savings and is a natural next step in the efficiency improvement process.

The efficiency gains achievable through facilities optimization are also bounded for any data center by its geographic location, its power and cooling distribution architecture, and other location-specific environmental factors. We can potentially achieve further reductions in energy consumption and reach the next level in DCE through further team collaboration on IT infrastructure optimization, server refresh, and other efforts.
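Simple payback per improvement step makes the diminishing returns concrete. The first step's cost and savings are actuals from this study; the second step's retrofit cost below is a placeholder standing in for the "several hundred thousand dollars" estimate above:

    def payback_years(investment_usd: float, annual_savings_usd: float) -> float:
        """Simple payback period, ignoring discounting."""
        return investment_usd / annual_savings_usd

    # Step 1: PUE 1.99 -> 1.81 (actuals from this study).
    step1 = payback_years(7_000, 77_000)                 # ~0.1 years

    # Step 2: PUE 1.81 -> 1.68. Incremental annual savings from Figure 4
    # (USD 766,066 - 711,706); the USD 300,000 retrofit cost is a
    # placeholder for "several hundred thousand dollars".
    step2 = payback_years(300_000, 766_066 - 711_706)    # ~5.5 years

    print(f"Step 1 payback: {step1:.2f} years")
    print(f"Step 2 payback: {step2:.1f} years")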

Conclusion

At this data center, we demonstrated a 10 percent improvement in overall operational efficiency and achieved annual cost savings of USD 77,000, with the potential for a 2.8x improvement in rack power density. These results led to improved data center facility utilization and justified the ROI of transitioning to a higher-density data center. This increase in rack power density allows us to take advantage of current-generation, high-performance multi-core server infrastructure, enabling higher compute capacity per watt of power consumed.

By taking an integrated, holistic approach and by analyzing all the factors associated with data center electrical energy usage and performance, we implemented changes that led to greater savings while developing metrics for repeatability and achieving optimum facility utilization levels at Intel data centers around the world.

This study is the first step in establishing concise operating costs per unit of IT performance and in evolving the total cost of ownership (TCO) model to deliver effective data center services in the most efficient way, from both a Facilities Management perspective and an IT perspective. This combined IT-Facilities perspective that guided our work also led Intel to develop an enterprise-wide view of data center manageability: a "single pane of glass." This integrated view of utilization will help manage IT capacity and cost. Our future focus will be to correlate IT performance with facility utilization and the benefits we can derive long term.

For additional content on Intel IT's best practices on this topic, go to www.intel.com/it

Authors

Anand Vanchi is an enterprise architect with Intel IT.
Sujith Kannan is the maintenance and operations manager for Intel India.
Ravi Giri is a staff engineer with Intel IT.

Acknowledgements

Ravi Madras is a data center operations manager with Intel IT.
Rajkumar Kambar is a data center operations manager with Intel IT.
Ashok Radhakrishnan is the IT Sustainability India program manager.

Acronyms and Terms

BMS: building management system
CRAC unit: computer room air conditioning unit
DX: direct expansion
DCE: data center efficiency; 1/PUE, or total IT equipment power / total facility power
facility utilization: total data center electrical energy consumption: power, cooling, and losses
IT equipment power: the effective power directly used by the equipment that manages, processes, stores, or routes data within the raised-floor space, such as server, storage, and network equipment
KVA: kilovolt-ampere
KVM: keyboard, video, or mouse
PAC unit: precision air conditioning unit
PDU: power distribution unit
PUE: power usage effectiveness; a metric advocated by Green Grid that is equal to total facility power / total IT equipment power
ROI: return on investment
TCO: total cost of ownership
total facility power: total facility power consumption, which includes all the power used to support the data center
UPS: uninterruptible power supply

This paper is for informational purposes only. THIS DOCUMENT IS PROVIDED "AS IS" WITH NO WARRANTIES WHATSOEVER, INCLUDING ANY WARRANTY OF MERCHANTABILITY, NONINFRINGEMENT, FITNESS FOR ANY PARTICULAR PURPOSE, OR ANY WARRANTY OTHERWISE ARISING OUT OF ANY PROPOSAL, SPECIFICATION OR SAMPLE. Intel disclaims all liability, including liability for infringement of any proprietary rights, relating to use of information in this specification. No license, express or implied, by estoppel or otherwise, to any intellectual property rights is granted herein.

Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and other countries. *Other names and brands may be claimed as the property of others.

Copyright 2009 Intel Corporation. All rights reserved. Printed in USA 0609/KAR/KC/PDF Please Recycle 322018-001US