Planning a Green Datacenter


Planning a Green Datacenter
Dr. Dominique Singy
Contribution to: Green IT Crash Course
Organizer: Green IT Special Interest Group of the Swiss Informatics Society
ETH-Zurich / February 13,

Green Data Center (DC)? Why? Energy demand for ICT?

- ICT consumes a lot of energy and is therefore responsible for 2% of global carbon dioxide (CO2) emissions. This figure is equivalent to the output of the global aviation industry! (Gartner, 2008)
- Around 50% of the total energy used at data centers is consumed by the auxiliary infrastructure, i.e. air conditioning, cooling and UPS.
- Electricity prices are rising -> energy costs could weigh heavily on IT budgets in the near future.
- A Green Datacenter focuses primarily on enhancing energy efficiency. Energy efficiency enables a reduction of operating costs and sustainable ICT.
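A back-of-the-envelope check (my own arithmetic, not from the slides) of the 50% figure: if the auxiliary infrastructure consumes half of the total energy, the IT equipment receives the other half, so

```latex
\mathrm{PUE} = \frac{\text{Total Facility Power}}{\text{IT Equipment Power}} \approx \frac{1}{0.5} = 2.0,
```

which is in line with the legacy average of around 1.9 cited on a later slide.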

Corporate Responsibility Strategy @ Swisscom -> energy-efficient datacenters

- By 2015, we intend to increase energy efficiency by 20% compared to 2010 -> deployment of energy-saving measures in the network and data centers.
- Swisscom owns and operates 12 datacenters (2 main and 10 mixed sites).
- Distribution between Hosting/Housing/Cloud for external customers and internal IT.

Datacenter metrics: PUE and DC(i)E according to The Green Grid

PUE (Power Usage Effectiveness) = Total Facility Power / IT Equipment Power

- The closer the PUE value is to 1.0, the more energy-efficient the infrastructure is. An ideal PUE value of 1.0 would imply 100% efficiency.
- According to a US EPA study in 2009, the average PUE of a sample of 121 DCs was around 1.9.
- Other metrics are also available, but PUE is currently the most widely used metric for data center infrastructure efficiency.
- The measurement point for the IT load is usually set at the UPS output.
- It is recommended to compute the PUE from measured energy rather than instantaneous power, thus including the time variation of power needs.
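To illustrate the last point, here is a minimal sketch (not from the slides; the meter readings are invented for the example) of an energy-based PUE/DCiE calculation:

```python
# Minimal sketch (assumed example): energy-based PUE/DCiE.
# Summing metered energy (kWh) over a period captures the time variation
# of power needs, unlike a single instantaneous power reading.

def pue_from_energy(facility_kwh, it_kwh):
    """PUE = total facility energy / IT equipment energy (IT measured at UPS output)."""
    total_facility = sum(facility_kwh)
    total_it = sum(it_kwh)
    if total_it <= 0:
        raise ValueError("IT energy must be positive")
    return total_facility / total_it

# Hypothetical monthly meter readings in kWh (illustrative values only).
facility = [120_000, 118_500, 121_200]
it_load  = [ 93_000,  92_400,  94_100]

pue = pue_from_energy(facility, it_load)
dcie = 1.0 / pue  # DCiE is the reciprocal of PUE, expressed as a percentage
print(f"PUE  = {pue:.2f}")
print(f"DCiE = {dcie:.0%}")
```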

Datacenter: temperature specifications

- Mostly, specifications are set according to the ASHRAE thermal guidelines, with a restrictive recommended temperature range (up to 27 °C).
- Since 2009 the telecommunication standard ETSI EN 300 019-1-3 (class 3.1) applies to data centres too. According to this standard, the temperature may vary between 5 and 40 °C under normal conditions and even up to 45 °C under abnormal conditions.
- In 2011 ASHRAE defined a new class A4 with an upper temperature limit of 45 °C, which complies with the ETSI class 3.1.

[Figure: ETSI EN 300 019-1-3 class 3.1 climatogram, plotting air temperature (°C) against relative air humidity (%), showing the normal and exceptional climatic limits and a region marked "90% of the time".]
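A small sketch (my own illustration, using only the temperature limits quoted above; the climatogram's humidity constraints and lower exceptional limit are omitted) of classifying a measured temperature against the ETSI class 3.1 bands:

```python
# Sketch (assumption-labelled): classify a temperature reading against the
# ETSI EN 300 019-1-3 class 3.1 limits quoted on this slide.
# Normal conditions: 5..40 °C; abnormal (exceptional) conditions: up to 45 °C.
# Humidity constraints and the lower exceptional bound are not reproduced here.

def etsi_class_3_1_band(temp_c: float) -> str:
    if 5.0 <= temp_c <= 40.0:
        return "normal"
    if 40.0 < temp_c <= 45.0:
        return "exceptional"
    return "outside the quoted limits"

for t in (22.0, 42.5, 47.0):
    print(f"{t:5.1f} °C -> {etsi_class_3_1_band(t)}")
```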

Energy efficiency targets for new Datacenters by Swisscom

Targets set in 2008 (for load rates > 25%):
- Cooling < 20%
- UPS power loss < 8%
- Server > 72%

The energy target for cooling (< 20%) is based on a recommendation from the Swiss Federal Office of Energy. This energy breakdown corresponds to a PUE < 1.39 and, reciprocally, a DCiE > 72%.

Targets set in 2011:
- Cooling < 10%
- UPS power loss < 9%
- Server > 81%

These targets correspond to a PUE < 1.23 and, reciprocally, a DCiE > 81%.
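The correspondence between the two figures follows directly from DCiE being the reciprocal of PUE; a quick worked check of both target sets:

```latex
\mathrm{DCiE} = \frac{1}{\mathrm{PUE}},\qquad
\frac{1}{1.39} \approx 0.72 \;\Rightarrow\; \mathrm{DCiE} > 72\%,\qquad
\frac{1}{1.23} \approx 0.81 \;\Rightarrow\; \mathrm{DCiE} > 81\%.
```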

How are green issues included in the planning of new datacenters at Swisscom?

- PUE target
- Cooling / heat re-use
- Energy provision
- Uninterruptible power supply (UPS)

Swisscom DC built in 2007 in Zollikofen (Berne) -> energy efficiency targets and measures

Targets set in the planning of this data centre:
- Energy use for cooling < 20%
- Energy efficiency of the UPS above 90% (power loss of 8%)
- DCiE value above 72%, i.e. a reciprocal PUE value below 1.39

Implemented measures to reach these targets:
- Chilled water: supply temperature of 16 °C, which enables extended use of free cooling throughout the year (mixed operation of chillers and free cooling; see the sketch below), and turbo chillers with a high COP (coefficient of performance) even at reduced load.
- Cooling design in the server rooms: room temperature set at 25 °C; cold/warm aisle topology, thus preventing mixing of cold and warm air; air re-circulation units with variable air volume rate, enabling better matching of the air volume rate to the effective needs; power and IP cables placed in dedicated cable routes inside the raised floor, thus preventing unwanted obstacles to the air flow.
- Heat recovery: currently used for internal purposes and also available to potential external recipients (district heating).
- UPS: selection of static UPS with an energy efficiency above 90%.
- Electricity provision: 100% renewable energy.
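A minimal sketch (my own illustration, assuming a simple outdoor-temperature switchover around the 16 °C chilled-water setpoint; the actual plant control is more involved) of the mixed chiller/free-cooling operating modes:

```python
# Sketch (assumed control logic, not Swisscom's actual plant control):
# with a 16 °C chilled-water setpoint, free cooling can cover the load
# whenever the outside air is cold enough; otherwise chillers assist or
# take over ("mixed operation").

CHW_SETPOINT_C = 16.0   # chilled-water supply temperature from the slide
APPROACH_K = 3.0        # assumed heat-exchanger approach temperature

def cooling_mode(outside_temp_c: float) -> str:
    if outside_temp_c <= CHW_SETPOINT_C - APPROACH_K:
        return "free cooling only"
    if outside_temp_c <= CHW_SETPOINT_C:
        return "mixed: free cooling + chiller trim"
    return "chiller only"

for t in (-5.0, 10.0, 14.5, 22.0):
    print(f"outside {t:5.1f} °C -> {cooling_mode(t)}")
```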

Swisscom DC built in 2007 in Zollikofen (Berne) -> monitored PUE

[Figure: monitored PUE of the Swisscom data centre in Zollikofen (BE) from January 2011 to mid-2014 (PUE axis 1.0 to 2.0), shown against the average PUE of legacy data centres according to a US EPA study in 2009 (around 1.9) and the planning PUE of 1.4.]

- Monitored PUE over 12 months: yearly averaged PUE of 1.29 at the end of 2012.
- The electricity needed for cooling in the DC of Zollikofen was reduced by 40% compared with a conventional datacenter. This allows us today to save the equivalent amount of electricity that would be required by around 1,400 households.

New Swisscom state-of-the-art data centre in Berne-Wankdorf

- Construction of one of the most modern and efficient data centres in Europe at the Berne-Wankdorf business park. The new data centre will begin operating in 2014.
- The data centre will initially be equipped with three modules with an effective output of 600 kW each. There is scope for subsequent expansion to up to seven modules in line with requirements.
- The new building will provide sufficient space for IT outsourcing and managed housing services. In a first stage the data centre will have a server area of 2,300 m², expandable to a maximum of 4,000 m² depending on requirements.

New Swisscom state-of-the-art data centre in Berne-Wankdorf

- PUE target: the PUE value will reach the top rating of 1.2 thanks to extremely efficient power usage. Environmental issues are also included.
- Cooling: free cooling, with additional evaporative cooling at higher outside temperatures (no chillers). For the first time, the evaporative cooling uses rain water, massively reducing, or even eliminating, the amount of fresh water consumed.
- Heat recovery: heat re-use and provision to the city of Berne's district heating network.
- UPS: instead of static UPS, use of dynamic UPS, thus eliminating the need for batteries. In the event of a network failure, a permanently operating rotational mass will drive the generator until the diesel units take over this task (a rough ride-through estimate is sketched below).
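A minimal sketch (all figures are assumed for illustration; the slides give no flywheel data) of how long such a rotating mass can carry the load before the diesel engine takes over:

```python
# Sketch (assumed figures): ride-through time of a dynamic UPS flywheel.
# The kinetic energy E = 0.5 * I * omega^2 stored in the rotating mass
# drives the generator until the diesel units take over.

import math

INERTIA_KG_M2 = 600.0   # assumed moment of inertia of the rotating mass
RPM_NOMINAL = 1800.0    # assumed nominal speed
RPM_MIN = 1500.0        # assumed minimum speed for usable output
LOAD_KW = 600.0         # effective output of one module, from the slide

def kinetic_energy_j(rpm: float) -> float:
    omega = rpm * 2.0 * math.pi / 60.0  # rpm -> rad/s
    return 0.5 * INERTIA_KG_M2 * omega ** 2

usable_j = kinetic_energy_j(RPM_NOMINAL) - kinetic_energy_j(RPM_MIN)
ride_through_s = usable_j / (LOAD_KW * 1000.0)
print(f"usable energy ~{usable_j / 1e6:.1f} MJ -> ride-through ~{ride_through_s:.1f} s")
```

The flywheel only needs to bridge the seconds until the diesel generator reaches full output, which is why a fast, reliable diesel start is essential in such designs.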

Examples of heat re-use at Swisscom

Swisscom centre in Zürich-Binz:
- In addition to internal heat re-use, provision of heat to the district heating network of the residential areas of Tiergarten and Talwiesen in Zürich.
- In the future, additional year-round provision of heat to the new Anergy network with heat storage in the ground (a project of the Familienheim-Genossenschaft Zürich) for heating around 2,000 households.
- In the medium term, total heat re-use from our centre could reach up to 20 GWh/a, which corresponds to a saving of around 2 million litres of heating oil per year.

Swisscom centre in Zürich-Herdern:
- Provision of heat to the district heating of the city of Zürich.
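The heating-oil equivalence for Zürich-Binz follows from the typical heating value of light heating oil, roughly 10 kWh per litre (my figure, not stated on the slide):

```latex
\frac{20\ \mathrm{GWh/a}}{\approx 10\ \mathrm{kWh/litre}} \approx 2 \times 10^{6}\ \text{litres of heating oil per year}.
```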

New concept developed by Swisscom -> fresh-air cooling @ datacenter

Concept of fresh-air cooling throughout the entire year at data centers (even for IT loads of several kW/m²).

To extend fresh-air cooling in data centers, the following concept is proposed:
- Arrange the racks in the rooms with hot and cold aisles
- Contain the hot aisles and connect them to the return-air plenum
- Avoid mixing of hot and cold air, thus keeping the room temperature at an appropriate level
- Supply outside air through openings in the outer façade and use exhaust fans to remove the hot air
- Possible heat re-use through a heat exchanger mounted on the exhaust side

Pilot fresh-air cooling @ datacenter at Swisscom

Pilot carried out from Nov. 2011 to Oct. 2012 (around one year) to demonstrate the feasibility of the concept.

Tested IT platform:
- 6 HP blade systems (BL 7000) with 16 servers (type BL460c) each, i.e. 96 servers in total; front-to-rear air flow through the servers
- One SAN storage module also installed
- Arrangement: 2 rows with 3 racks each
- Total IT power at 100% utilization: ca. 27 kW; specific load: ca. 1 kW/m² floor area at a 100% utilization rate
- Platform powered by a highly energy-efficient UPS (max. load: 80 kVA; 96.5% efficiency at ca. 28 kW load)
- Steady operation of the servers at a 100% utilisation rate

Fresh-air-cooling system:
- Cold air inlet: air inlet through openings in the façade and air ducts
- Warm air removal: warm-aisle containment -> no air mixing; two exhaust fans, each providing 50% of the required cooling power, in redundant parallel operation; temperature-controlled air flow volume (see the sketch below)

We acknowledge the support of HP and GE Energy.
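A minimal sketch (my own illustration with assumed setpoints; the pilot's actual controller parameters are not given in the slides) of a temperature-controlled exhaust-fan volume control:

```python
# Sketch (assumed parameters): proportional control of exhaust-fan air volume
# based on the hot-aisle return temperature. Two fans run in redundant
# parallel operation, each sized for 50% of the required cooling power.

SETPOINT_C = 25.0   # assumed target return-air temperature
BAND_K = 10.0       # assumed proportional band: full speed at setpoint + 10 K
MIN_SPEED = 0.2     # keep a minimum air flow for base ventilation

def fan_speed(return_temp_c: float) -> float:
    """Return fan speed as a fraction of maximum (0..1)."""
    error = return_temp_c - SETPOINT_C
    speed = error / BAND_K
    return max(MIN_SPEED, min(1.0, speed))

for t in (22.0, 28.0, 33.0, 38.0):
    print(f"return air {t:4.1f} °C -> both fans at {fan_speed(t):.0%} speed")
```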

Results of the pilot fresh-air cooling @ datacenter at Swisscom

- Pilot operated over ca. one year (from November 2011 to October 2012).
- The pilot was successful: the servers were working properly, even on the hottest summer days with outside temperatures above 30 °C.
- All measured temperature and relative humidity values were within the climatogram of ETSI EN 300 019-1-3, class 3.1.

[Figure: time series from the pilot for July 26-27, 2012, showing outside temperature (°C), room temperature (°C), IT load (kW) and PUE value, together with the upper temperature limits for normal and exceptional operating conditions.]

- Monitored PUE value varies between 1.05 and 1.08; yearly averaged PUE value of 1.06. (PUE = Power Usage Effectiveness = Total Facility Power / IT Power)

Advantages of fresh-air cooling:
- Energy use for cooling cut by a factor of 10 (see the check below)
- Low cost -> CAPEX lower by a factor of 3 to 4
- No refrigerants and no water needed
- Higher modularity and reliability
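A rough consistency check (my own arithmetic, with an assumed overhead breakdown): since

```latex
\mathrm{PUE} = 1 + \frac{E_{\text{overhead}}}{E_{\text{IT}}} \approx 1.06
\;\Rightarrow\; E_{\text{overhead}} \approx 6\%\ \text{of}\ E_{\text{IT}},
```

all overhead together (fans, UPS losses and residual cooling) amounts to only a few percent of the IT energy, compared with cooling alone at around 20% of facility energy in a conventional design (the 2008 target), which is consistent with the stated factor-of-10 reduction in cooling energy.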

Conclusion / Recommendations / Outlook

A Green Datacenter enables:
- Sustainable development of ICT
- Reduction of OPEX
- Promotion of the corporate image

Set energy efficiency targets (e.g. in terms of PUE) and challenge best practices in the planning of new datacenters:
- Select the best solution based on a TCO approach, including energy costs as well as additional green issues
- Possible support from the promotion program PUEDA (initiated by the Swiss Federal Office of Energy)
- For best practices refer to e.g. the EU Code of Conduct for energy-efficient datacenters
- Make better use of the allowable temperature range for IT equipment according to ETSI class 3.1, thus enabling lower CAPEX and OPEX for cooling

Promote highly energy-efficient powering and cooling solutions:
- Steps towards a first year-round implementation of fresh-air cooling at Swisscom are ongoing
- Promotion of server virtualization and green IT outsourcing

Contact information

Swisscom Ltd
Dominique Singy, Corporate Responsibility
Ostermundigenstrasse 93
CH-3050 Berne
Phone: +41 79 244 03 89
E-mail: dominique.singy@swisscom.com
www.swisscom.ch

Thank you for your attention.