
Nominee: FES Co Ltd
Nomination title: HP Cold Cube

The Doxford Facility is a highly secure data centre. The challenge was to take advantage of the previous investment in highly resilient infrastructure and to utilise it within the existing space to deliver the extension in the optimum time. The targeted space had a lightweight slab that dictated an enabling works package to achieve the required weight loading for a data centre. This included delivering and installing pre-manufactured steel sections assembled with over 4,800 bolted fixings. These sections were installed by hand over the live operational critical-power UPS whilst it remained in operation. The existing window elevation was replaced with continuous secure weather louvres to the HP Secure Standard. A separate CRAC unit plant zone was created between the external façade and the technical space. This unique arrangement provided the following:

- Access for maintenance personnel without the need to enter the Technical Space.
- Creation of a mixing chamber for fresh air, return air and humidification.
- Separation of noise-producing components from the operational space.

The Technical Space was designed to house 78 data racks, arranged in three Cold Cubes each containing 26 racks. The space also included areas for free-standing, non-rackable hardware.

Each Cold Cube had a maximum design load of 120 kW, with a normal individual maximum load of 4.5 kW per rack. The data hall had a maximum design load of 360 kW. IT equipment power utilisation had a design maximum of 80%, giving a normal maximum design load for the new data hall of 288 kW. From an energy-efficiency perspective, the design provided 300 kW of free cooling up to an external air temperature of 19 degrees C, which aligned with the normal maximum design load.

The systems were delivered to a Tier 3 resilience standard. Delivery was based around five downflow cooling units (CRAC units), each rated at 100 kW, mounted within the plant zone. Three of the units were configured in a free-cooling mode with dampered connections to external supply air. These free-cooling units were further configured with dampered connections to the plant zone, which itself served as a hot-aisle return plenum. The free-cooling dampered supply and the hot-aisle return air were linked within a mixing chamber above the three dedicated free-cooling DX CRAC units. The plant zone plenum was additionally provided with four extract fans, each connected via dampered connections to the external louvre.

Supply air was delivered from the base of each CRAC unit through the floor void into the Technical Space, passing into the base of the Cold Cube. The supply air was conditioned to 23 degrees C at the base of the Cold Cube, entrained through the server equipment and discharged outside the Cold Cube at approximately 38 degrees C. Five return-air egg-crate grilles, one aligned with each CRAC unit, were located at high level within the Technical Space to allow return air to pass into the CRAC unit plant zone hot-aisle plenum. Ultrasonic humidifiers, one per CRAC unit, were located adjacent to the return-air grilles at high level, above the intake to the CRAC unit supply air. The water supply was demineralised and treated. This unique arrangement formed a mixing chamber for the conditioned air away from the data hall.
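The stated design figures are internally consistent, and the 23 degrees C supply / 38 degrees C return split fixes the airflow each unit must move. The following sketch checks that arithmetic; the air density and specific heat values are standard assumptions, not figures from the nomination.

```python
# Sketch (not from the nomination): checking the stated design figures
# and estimating the supply airflow each CRAC unit must deliver.

RACKS_PER_CUBE = 26
RACK_LOAD_KW = 4.5      # normal individual maximum load per rack
CUBE_DESIGN_KW = 120    # maximum design load per Cold Cube
CUBES = 3
UTILISATION = 0.80      # IT power utilisation design maximum

normal_cube_load = RACKS_PER_CUBE * RACK_LOAD_KW   # 117 kW, within the 120 kW design
hall_design_load = CUBES * CUBE_DESIGN_KW          # 360 kW maximum design load
normal_hall_load = hall_design_load * UTILISATION  # 288 kW, matching the stated figure

# Airflow per 100 kW CRAC unit from Q = P / (rho * cp * dT),
# with the stated 23 C supply and ~38 C return (dT = 15 K).
RHO = 1.2    # kg/m^3, air density (assumed)
CP = 1.005   # kJ/(kg*K), specific heat of air (assumed)
DT = 38 - 23
airflow_m3_s = 100 / (RHO * CP * DT)   # roughly 5.5 m^3/s per unit

print(normal_cube_load, hall_design_load, normal_hall_load, round(airflow_m3_s, 2))
```

Note that the 300 kW of free cooling quoted above sits just over the 288 kW normal maximum, which is why the two figures "align" in the nomination.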

To eliminate the risk of corrosion, the specification included saline filters within the supply-air connection from outside air into the CRAC unit free-cooling mixing box, and the cooling coils were further treated to account for the potentially salt-laden coastal air.

Each CRAC unit was provided with its own integral control system, so that each unit operated independently, and the individual CRAC unit controls also drove the dampers aligned with that unit. A central supervisor and control system allowed the five CRAC units to operate as a group. CRAC supply fans were controlled via pressure sensors installed at the base of each CRAC unit: as the server fans pick up cooling demand, the pressure within the cube falls, and the pressure sensors ensured the air pressure was maintained. The three Cold Cube pressure sensors sent an average pressure signal to the five respective CRAC units. The extract air fans each had variable-speed drives so that the Technical Space pressure was maintained throughout its operational range, and pressure-differential switches at the Technical Space ingress/egress doors further allowed the system to self-adjust pressure automatically should the doors be held open. Air temperature sensors within the space reported to the ACIS central supervisory system, providing a live, fully graphical representation of conditions within the space.

The impact the solution had for the customer: the free cooling has a forecast saving of 83,828 and 160,985 kg CO2 per annum. The unique mixing chamber keeps mechanical service teams out of the secure data centre and restricts noisy machinery to the plant zone. The design delivered a highly secure data centre utilising existing infrastructure and space to bring the product to market within six months.

The major differentiators between your product/solution and those of your primary competitors:
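The pressure-control scheme described above (three cube sensors averaged into one demand signal for all five CRAC fans) can be sketched as a simple proportional loop. The setpoint, gain and function names below are illustrative assumptions; the nomination does not state the control parameters.

```python
# Hypothetical sketch of the described pressure-control scheme: the three
# Cold Cube pressure sensors are averaged and the same fan-speed demand
# is applied to all five CRAC supply fans. Setpoint and gain are assumed.

SETPOINT_PA = 10.0   # target positive pressure in the cubes, Pa (assumed)
KP = 0.05            # proportional gain (assumed)

def fan_demand(cube_pressures_pa, current_demand):
    """Return an updated fan-speed demand (0..1) shared by the CRAC units."""
    avg = sum(cube_pressures_pa) / len(cube_pressures_pa)
    error = SETPOINT_PA - avg          # pressure falls as server fans draw more air
    demand = current_demand + KP * error
    return max(0.0, min(1.0, demand))  # clamp to the drive's operating range

# Server fans pick up load -> average cube pressure drops -> demand rises.
demand = fan_demand([8.0, 9.0, 8.5], current_demand=0.5)
```

The variable-speed extract fans and the door pressure-differential switches would act as further inputs to the same kind of loop on the Technical Space side.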

- Saline filtration and coated refrigerant coils to ensure the design life is maximised
- Customised damper box to allow 100% air blending through the operational range
- Variable-speed drive control of CRAC fans
- Dedicated mixing chamber
- Piezo-electric humidification
- Demineralised water
- LED lighting
- External blades to prevent mixing of abutting supply and extract air
- 100% load IST testing
- A central supervisor CRAC unit control and management system providing a simple and intuitive interface with the full range of monitoring, reporting and diagnostic tools for the technical space and associated plant space
- Pressurised control of the Cold Cubes
- Individual rack final-circuit monitoring
- Elevated rack temperature control

Why the nominee should win:

- Delivery of a Cold Cube high-intensity rack data centre with a PUE of 1.07 and a financial/carbon saving of 83,828 and 160,985 kg CO2 per annum.
- Unique design of a plant room mixing chamber that additionally segregates maintenance zones from the operational space whilst limiting noise impact on users.
- The creation of external blades to ensure separation of abutting sections of supply and extract air, preventing short-circuiting and maximising fresh air.

- Construction of a new data hall above an existing operational data centre.
- The use of innovation, including piezo-electric humidification.
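The "100% air blending" damper box and the 19 degrees C free-cooling limit follow from a simple energy balance on the mixing chamber: a fraction of outside air is blended with the 38 degrees C hot-aisle return to hold the 23 degrees C supply condition. The sketch below derives that fraction; it is an illustration, not the nomination's control logic, and it ignores humidity effects.

```python
# Sketch (not from the nomination) of the mixing-chamber energy balance:
# a fraction f of outside air blended with hot-aisle return air must
# satisfy  f * T_out + (1 - f) * T_return = T_supply,  so
#   f = (T_return - T_supply) / (T_return - T_out)

T_RETURN = 38.0   # C, hot-aisle return temperature (from the nomination)
T_SUPPLY = 23.0   # C, conditioned supply at the Cold Cube base

def outside_air_fraction(t_out_c):
    """Damper fraction of outside air needed to hit the supply setpoint."""
    if t_out_c >= T_SUPPLY:
        return 1.0  # simplification: above the setpoint, DX cooling must assist
    return (T_RETURN - T_SUPPLY) / (T_RETURN - t_out_c)

# At the stated 19 C free-cooling limit the dampers are ~79 % outside air,
# which is why the external blades preventing supply/extract short-circuiting
# matter: nearly the full supply stream is fresh air.
fraction = outside_air_fraction(19.0)
```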