Taking The Heat Off University of East Anglia Data Centre

University of East Anglia delivers a data centre to match its corporate green credentials

Quick facts
Client: UEA
Location: Norwich Research Park, Norwich
Goal: energy efficiency improvements to reach a PUE of 1.19 when the data centre is fully loaded

The University of East Anglia (UEA) is one of the leading green technology institutions in the UK. Students and staff are encouraged to think about what they do and its environmental impact. As part of this, UEA runs its own Green Impact Awards every year to encourage innovation in the field of environmental impact within the university and beyond. It should come as no surprise, then, that when UEA looked at its existing data centre, it realised there was much to be done to bring it in line with corporate goals. To ensure it got the best value from greening its data centre refurbishment, UEA approached Salix, specialists in energy efficiency loans. The result was a partnership between UEA, Salix, Future-Tech and Dataracks to deliver energy efficiency, full payback within six years and a facility that meets very stringent targets.

Existing facility

As James Willman, Head of Pre-Sales Consultancy with Future-Tech, explained: "One of the major challenges for this project was the existing facility." The UEA data centre is located in a Grade II listed building, which meant that any solution relying on making holes in walls, moving walls or changing the construction of floors and ceilings was not acceptable. Instead, the solution had to work within the existing space in order to deliver improvements.

Another restriction was time. As with any university, UEA has research programmes that operate all year round, which meant that any large works would have to be planned and executed when the majority of the student body was on leave. Like most projects, the trouble came in threes: the third issue was that the data centre is located very close to student accommodation and teaching facilities. This meant that any solution would have to be not just energy efficient but would also have to meet extremely tight noise emission standards.

Data centre requirements

As a major research hub, UEA runs a lot of computer equipment. While it is not running a mega data centre, it is representative of many mid-sized company facilities. The existing data centre consumed around 60kW with a fairly typical PUE of 2.08, and the existing cooling system was capable of supporting a load of 138kW in an N+1 configuration. Looking to the future, UEA wants to increase the capacity of the data centre in order to take on new research projects and support staff and students. This means the cooling load will need to increase by over 60% to 220kW. At the same time, the cooling efficiency has to support a dramatic lowering of the PUE to as low as 1.19 when the data centre is fully loaded. For most corporate data centre new-builds, where a PUE of around 1.4 is more common, a 1.19 PUE would be seen as an extremely optimistic target. What is important here is that the funding for this project was tied to the ability of Future-Tech and its subcontractors to achieve such a low PUE. This is an approach that enterprise customers should seriously consider.
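To put these figures in context, PUE (power usage effectiveness) is total facility power divided by IT power, so everything above 1.0 is overhead for cooling, power distribution and the like. The short sketch below illustrates what moving from 2.08 towards 1.19 means for that overhead; the 100kW IT load is an assumed round number for illustration, not a UEA measurement.

# Minimal sketch: what a change in PUE means for facility overhead power.
# PUE = total facility power / IT power; the 100 kW IT load is illustrative.

def overhead_kw(it_load_kw: float, pue: float) -> float:
    """Facility overhead (cooling, distribution losses, etc.) implied by a PUE."""
    return it_load_kw * (pue - 1.0)

it_load_kw = 100.0  # assumed IT load for illustration

for pue in (2.08, 1.4, 1.19):
    print(f"PUE {pue}: {overhead_kw(it_load_kw, pue):.0f} kW of overhead "
          f"per {it_load_kw:.0f} kW of IT load")

# PUE 2.08: 108 kW of overhead per 100 kW of IT load
# PUE 1.4:   40 kW of overhead per 100 kW of IT load
# PUE 1.19:  19 kW of overhead per 100 kW of IT load

At the target PUE, cooling and other overheads account for well under a fifth of the IT load, which is why the cooling design described below carries most of the weight of the project.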

Equipment updates

Virtually none of the existing infrastructure escaped the refresh. Decisions about the IT equipment itself were left to UEA. Some things, such as the existing racks, would not change, which meant innovative solutions would be needed when installing aisle containment.

Power

A new monitoring system was deployed that makes it very easy for operations to see exactly where power is being used and to quickly identify any potential leakage or power loss.

Key temperatures

Perhaps the most important part of this project was the change to how cooling is delivered. While there were savings from changing the electrical system, cabling and distribution board, the cooling would need to manage a greater amount of heat, be more flexible and use far less power. The target temperatures were set at a 22°C input temperature with a maximum hot-air return of 32°C. There were key reasons for these temperatures: given the historic external temperatures around the university, they ensure that for 60 per cent of the year UEA is able to rely on free air cooling. For the rest of the year, some degree of mechanical cooling is needed, but the amount depends on the temperature outside the data centre and the loading on the IT equipment.

Aisle containment

The first major change was aisle containment. After a lot of consideration, the decision was made to go with hot aisle containment, provided by Dataracks. With hot aisle containment, the room itself is flooded with cold air. However, creating the aisle containment was more complex than expected. The racks in use are Rittal, and there was no off-the-shelf Rittal containment solution that would work. A further complication was introduced by the IT team's requirements: they wanted the ability to remove an entire rack in order to work on it, and the aisle containment had to facilitate this. This meant that the containment could not be secured to the physical racks, and it had to be flexible enough to allow additional elements to be added while racks were off being worked upon. Dataracks had to create a custom solution very quickly.

In many data centres, the "Manhattan skyline" of different-sized racks is a challenge that Dataracks is used to contending with. According to their managing director, Jeremy Hartley, having all the legacy racks in the university's data centre at one height made a welcome change. The next issue was that all of the Ethernet switches are located in the rear of the racks. Hartley explained: "With the hot aisle now running at 32°C, this would lead to possible overheating and reduced reliability of the switches. This is a frequent problem with aisle containment and one that we at Dataracks are used to solving. Fortunately, all the switches in use are cooled from side to side. So, in double-quick time, we designed and manufactured bespoke air ducts that feed cold air from the front of the rack to the input side of the switch. Another air duct then takes the hot exhaust away from the switch and vents it into the hot aisle." As well as providing highly effective switch cooling, this also prevents the risk of hot air being vented into the centre of the rack and causing hot spots that would be impossible to cool effectively. This problem is not unique to UEA but is something that many data centres struggle to resolve easily.

Dataracks' bespoke air duct cools the switches from side to side.

Cooling

The Future-Tech cooling solution uses direct fresh air delivery supplemented by DX (direct expansion) coolers. As the temperature rises, the number of coolers online is increased under software control, keeping power requirements to a minimum while ensuring that cooling stays within the target temperatures. To keep the mixed and full DX modes as efficient as possible, Future-Tech has deployed EC (electronically commutated) fans, which are far more efficient at part speed than conventional fans. The fans' speed is adjusted under software control so that only the required amount of air is delivered, avoiding overcooling and the risk of condensation.
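As a rough illustration of the staged approach described above, the sketch below brings DX coolers online as the supply temperature drifts above the 22°C target and trims EC fan speed to the IT load. The thresholds, the number of DX units and the function names are assumptions for illustration; this is not the actual Future-Tech control software.

# Minimal sketch of staged cooling control: free air first, DX coolers staged
# in as the supply temperature rises above setpoint, EC fan speed trimmed to
# the heat load. Thresholds, stage sizes and names are illustrative assumptions.

SUPPLY_SETPOINT_C = 22.0   # target supply temperature from the case study
NUM_DX_COOLERS = 4         # assumed number of DX units, for illustration

def dx_stages_required(supply_temp_c: float) -> int:
    """Bring one more DX cooler online for each degree above setpoint."""
    excess = supply_temp_c - SUPPLY_SETPOINT_C
    if excess <= 0:
        return 0                      # free air alone is holding the setpoint
    return min(NUM_DX_COOLERS, int(excess) + 1)

def ec_fan_speed(it_load_kw: float, design_load_kw: float = 220.0) -> float:
    """Scale fan speed with IT load; EC fans stay efficient at part speed."""
    return max(0.3, min(1.0, it_load_kw / design_load_kw))  # 30% floor for airflow

# Example: a warm afternoon at partial IT load.
print(dx_stages_required(21.5))        # 0 -> free air only
print(dx_stages_required(24.2))        # 3 -> mixed mode, three DX units online
print(round(ec_fan_speed(140.0), 2))   # 0.64 -> fan speed as a fraction of full

Staging coolers and running EC fans at part speed means the cooling plant only draws the power the current load actually requires, which is what makes the low PUE target reachable for most of the year.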

The ability to drop into mixed mode or full DX mode immediately is also important. It means that, should there be a sudden surge in loading on the servers, UEA will be able to counter any risk of thermal runaway because the systems are already interlinked; other free air vendors require separate management systems for the free and forced air systems. As a secondary backup, should there be any failure of the extract fans, the DX system will automatically drop into a full recirculation, stand-alone mode.

Challenge met!

Retrofitting data centre facilities in listed buildings, where noise, power, cooling and energy efficiency targets are tough, is a challenge that many would avoid. It can often be easier to look at a new facility, move to the cloud or use a drop-in data centre. For the University of East Anglia, retrofitting and meeting its very strict corporate environmental targets was critical. With Salix providing the funding and ensuring that the targets were contractually binding, Future-Tech as the lead contractor and Dataracks as the containment designer and manufacturer had to be very innovative and deliver in a short space of time.

Confirming that the containment solution had met all of its design objectives, specifically the 1.19 PUE and temperature requirements, Paul Carter, Future-Tech Project Manager, said: "Dataracks were very flexible with both the detailed design and delivery of their aisle containment solution. From initial survey to installation took only three weeks. Given the time pressures on the project, this was very helpful and exceeded my expectations. The aisle containment system is built to a high standard and installed very well. I look forward to working with them again in the future."