IEC DATA CENTRE INDIRECT EVAPORATIVE COOLING SYSTEM


THE GLOBAL DATA CENTRE MARKET
The data centre air conditioning market is expected to grow at a CAGR (compound annual growth rate) of about 12% over the next five years, rising from the current $4,91 billion to $8,07 billion in 2018.

GLOBAL DATA CENTRE IP TRAFFIC WILL TRIPLE WITHIN 5 YEARS
Global data centre IP traffic is on a constant growth path, with a CAGR of 25% over the period 2012-2017:

Year                        2012   2013   2014   2015   2016   2017
Traffic (zettabytes/year)    2,6    3,3    4,2    5,2    6,4    7,7

The growth will be driven by three main factors:
The need for bigger online storage resources;
The new possibility of analysing far larger amounts of data (a phenomenon called Big Data analysis, applied to complex systems such as weather forecasting or social behaviour prediction);
The growing demand for cloud applications.

Cloud data centre traffic will grow at a CAGR of 35% from 2012 to 2017, a faster rate than traditional IP traffic, establishing a 4,5-fold growth over this period. Global cloud traffic crossed the zettabyte threshold in 2012, and by 2017 over two-thirds of all data centre traffic will be based in the cloud: cloud traffic will represent 69% of total data centre traffic by 2017. Significant promoters of cloud traffic growth are the rapid adoption of and migration to cloud architectures, along with the ability of cloud data centres to handle significantly higher traffic loads.

ENERGY EFFICIENCY & PUE LEVELS: A MARKET SURVEY
PUE (Power Usage Effectiveness) is a measure of how efficiently a data centre uses energy; specifically, of how much of that energy actually reaches the computing equipment, in contrast to cooling and other overheads. It is defined as the ratio of the total amount of energy used by a data centre facility to the energy delivered to the computing equipment. An optimum PUE would be 1,0. A 2013 independent market survey found that 41% of data centre CIOs reported a PUE at or above 2,0, while the average PUE was 2,53. Only 1% of those interviewed reported a PUE lower than 1,4.

Current market average PUE: 2,53. A PUE of 1,40 was achieved by only 1% of the market.
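Expressed as a formula (the standard definition, consistent with The Green Grid white paper cited in the references at the end of this brochure):

    \mathrm{PUE} \;=\; \frac{E_{\text{total facility}}}{E_{\text{IT equipment}}} \;\geq\; 1{,}0

For example, a facility drawing 253 kWh for every 100 kWh delivered to its IT equipment has a PUE of 253/100 = 2,53, the survey average quoted above.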

DATA CENTRE COOLING REQUIREMENTS
Data centre cooling systems represent a significant portion of a facility's capital expenditure and use a substantial amount of energy. ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers) publishes specific guidelines for temperature and humidity control within data centres. The 3rd Edition of the Thermal Guidelines for Data Processing Environments defines a recommended operating window and four allowable ranges, designated A1 through A4. The newer allowable ranges (A3 and A4) are intended to remove obstacles to new data centre cooling strategies such as free cooling. Free cooling takes advantage of a facility's local climate by using outside air to cool IT equipment directly, without the use of mechanical refrigeration (chillers or air conditioners), whenever possible.

[Psychrometric chart: recommended envelope (classes A1-A4) and allowable envelopes for classes A1, A2, A3 and A4, plotted against dry bulb temperature (0-50 °C), relative humidity (10-90%) and humidity ratio in grams of moisture per kilogram of dry air.]

DIRECT AIR OPTIMISATION (DAO)
Direct Air Optimisation (DAO), employed by several key customers, moves on from closed-circuit cooling, drawing fresh air directly into the data hall and rejecting the warm air to atmosphere.

CHILLED WATER SYSTEM (CWS)
This solution comprises water chillers (generally featuring free cooling), a chilled water network and computer room air conditioning units. The chilled water distribution system delivers the required cooling to a number of elements on site. The air conditioning units operate in full recirculation mode inside a sealed environment.

INDIRECT AIR OPTIMISATION (IAO)
Indirect Air Optimisation (IAO) removes heat from the data hall return air and rejects it to ambient via a heat exchange process, subsequently returning the suitably cooled air to the data hall. An adiabatic spray system preceding the heat exchanger intervenes when ambient conditions require it. IAO is ideally suited to the growing number of applications allowing higher temperatures, offering exceptional energy savings.
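As a rough illustration of how the envelopes widen from A1 to A4, the short Python sketch below classifies a dry bulb temperature against the commonly cited ASHRAE dry-bulb limits. The numeric limits are our assumption from the published guidelines (the brochure reproduces only the chart), and the real envelopes also constrain humidity, which this sketch deliberately ignores.

    # Illustrative sketch only: classify a dry bulb temperature against the
    # commonly cited ASHRAE dry-bulb limits. The values are assumptions taken
    # from the published guidelines; the full envelopes also bound humidity.
    ASHRAE_DRY_BULB_LIMITS_C = {
        "Recommended": (18.0, 27.0),
        "A1": (15.0, 32.0),
        "A2": (10.0, 35.0),
        "A3": (5.0, 40.0),
        "A4": (5.0, 45.0),
    }

    def ashrae_classes(t_db_c: float) -> list[str]:
        """Return every envelope whose dry-bulb range contains t_db_c."""
        return [name for name, (lo, hi) in ASHRAE_DRY_BULB_LIMITS_C.items()
                if lo <= t_db_c <= hi]

    print(ashrae_classes(27.0))  # ['Recommended', 'A1', 'A2', 'A3', 'A4']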

IEC OPERATION
IEC features a unique heat exchange system which removes the heat from the data hall return air and rejects it to the external ambient. The return air from the data hall is therefore cooled to a suitable temperature for re-delivery into the data hall. The advantage of this layout is that the data hall remains a fully sealed environment, isolated from the external ambient air; consequently there is no moisture or pollutant carry-over into the data hall. The air-to-air heat exchanger acts as an intermediary device providing sensible heat transfer between the data hall return air and the ambient intake air. The use of adiabatic sprays preceding the heat exchanger, applied at higher ambient temperatures only, lowers the dry bulb temperature of the intake air, thus increasing the effectiveness of the heat exchange process when cooling the warm return air to the desired supply temperature for the data hall. On request, IEC can also be supplied with a Mechanical Cooling Module, which offers additional cooling performance in extreme conditions, as well as added system redundancy. The whole system is managed by a plug-and-play controller which defines the various cooling stages, ensuring precise temperature control combined with minimal running costs.

3 DISTINCT OPERATING RANGES
SC: range in which only Standard Cooling operates.
AC: range in which Adiabatic Cooling operates (adiabatic spray assists the cooling).
MC: range in which Mechanical Cooling is added to top up the Standard + Adiabatic Cooling.

[Psychrometric chart: SC, AC and MC ranges plotted against dry bulb temperature (-15 to 60 °C), relative humidity and humidity ratio, with the operating area for a typical data centre in Frankfurt (Germany) showing how Mechanical Cooling can be totally avoided in this application. All values refer to 100% load capacity.]

NB: for a 23 °C WB external temperature, IEC can guarantee a data centre supply air temperature of 27 °C without the need for any mechanical top-up cooling.
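A minimal Python sketch of the staging logic described above. It assumes the roughly 4 K supply-to-wet-bulb approach quoted later in this brochure and, as a simplification of ours, applies the same approach to the dry bulb for dry operation; the thresholds and function names are illustrative, not the actual IEC controller.

    def select_cooling_stage(t_supply_target_c: float, t_ambient_db_c: float,
                             t_ambient_wb_c: float, approach_k: float = 4.0) -> str:
        """Pick the IEC cooling stage for the current ambient conditions.

        SC - dry heat exchange alone reaches the supply target;
        AC - adiabatic sprays pre-cool the intake air towards its wet bulb;
        MC - the optional mechanical module tops up SC + AC.
        approach_k = 4.0 reflects the brochure's claim that supply air can be
        held about 4 C above the ambient wet bulb without mechanical cooling.
        """
        if t_ambient_db_c + approach_k <= t_supply_target_c:
            return "SC"  # ambient dry bulb low enough: standard cooling only
        if t_ambient_wb_c + approach_k <= t_supply_target_c:
            return "AC"  # sprays lower the intake dry bulb towards the wet bulb
        return "MC"      # extreme conditions: mechanical top-up joins SC + AC

    # The NB case above: 23 C wet bulb (35 C dry bulb), 27 C supply target.
    print(select_cooling_stage(27.0, 35.0, 23.0))  # -> 'AC', no mechanical cooling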

SC - STANDARD COOLING takes place when the external ambient air temperature lies below the indoor air temperature. The external air is heated as it passes through the heat exchange system, thus removing heat from the data centre hall. In this condition no adiabatic or mechanical cooling is required, thanks to the favourable external temperature conditions.

[Schematic: low external air intake temperatures; data centre return and supply air paths; external air intake and exhaust.]

AC - ADIABATIC COOLING uses adiabatic water sprays preceding the heat exchange system to pre-cool the external air. When water is introduced into the air flowing into an air-cooled cooling system it evaporates, increasing the humidity and lowering the temperature. Cooler air increases the heat exchanger's heat rejection capacity, improving system efficiency. Adiabatic cooling boosts the standard cooling process, adding extra performance when required.

[Schematic: medium external air intake temperatures; adiabatic system ON.]

MC - MECHANICAL TOP-UP COOLING (OPTIONAL) makes use of a refrigeration circuit to provide additional cooling to the supply air exiting the heat exchange system. Alternatively, chilled water can be used as the cooling medium. Mechanical cooling operates only on the rare occasions when external temperatures are too high, and always operates together with both standard and adiabatic cooling, adding a percentage of additional cooling.

[Schematic: very high external air intake temperatures; adiabatic system ON; mechanical cooling ON.]
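The temperature drop the sprays deliver can be estimated with the standard direct evaporative cooling effectiveness relation. This is a generic textbook sketch rather than Aermec's sizing method, and the 90% effectiveness figure is purely an illustrative assumption:

    def adiabatic_outlet_db_c(t_db_c: float, t_wb_c: float,
                              effectiveness: float = 0.90) -> float:
        """Dry bulb temperature of air leaving an adiabatic spray section.

        Evaporating water cools the air towards its wet bulb temperature;
        effectiveness is the fraction of the wet bulb depression recovered
        (0.90 is an illustrative assumption, not an IEC datum).
        """
        return t_db_c - effectiveness * (t_db_c - t_wb_c)

    # Intake air at 35 C dry bulb / 23 C wet bulb leaves the sprays at ~24.2 C,
    # so the heat exchanger sees far cooler air than the raw ambient.
    print(round(adiabatic_outlet_db_c(35.0, 23.0), 1))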

IEC: NEXT GENERATION DATA CENTRE COOLING

HIGHEST EFFICIENCY
IEC minimises running costs and can achieve a data centre supply air temperature only 4 °C above the ambient wet bulb temperature, without the use of mechanical cooling. For example, for a data centre in London (UK), on a day with a 23 °C ambient wet bulb temperature, combined standard + adiabatic cooling operation (i.e. without mechanical cooling) offers a 27 °C data centre supply air temperature, thus falling within the ASHRAE A1 recommended supply air conditions.

MAIN FEATURES
4 sizes available.
Cooling capacity from 84 kW to 336 kW per module.
Modular design allowing a perfect match to the cooling load.

TECHNOLOGY
Indirect air utilisation to avoid contamination of the internal air flow.
A high number of hours operating in free cooling, in combination with indirect adiabatic humidification.
Additional cooling water coil or direct expansion cooling system for peak loads and redundancy.

COMPONENTS
EC fans with efficiencies of up to 90%.
Unique aluminium crossflow heat exchanger featuring AERO-SHIELD acrylic protection, exclusive to Aermec, offering minimal pressure drops and long-term fault-free operation.
Adiabatic cooling system with high pressure atomising jets ensuring the fastest water evaporation.

REGULATION & CONTROL
Fully wired components and electrical controls mounted onboard.
Integrated programmable logic controller.
Interface for BMS.
Remote controller option, with touch screen available.
Plug-and-play operation.

NO MECHANICAL COOLING
Depending on external wet bulb temperatures, IEC can provide the full cooling requirement via highly efficient heat transfer processes. Onboard mechanical back-up to trim the temperature can be provided where necessary.

INDIRECT AIR SOLUTION
No contamination of the internal data centre air from outside. No need for extensive filtration systems, humidification and refrigeration.

CAPITAL AND OPERATIONAL COST REDUCTIONS
Approximately 95% running cost reduction compared with traditional chiller solutions. Infrastructure cost reduction of around 32% versus standard chilled water systems.

ADVANCED, FIELD-TESTED COMPONENTS

HIGH EFFICIENCY HEAT EXCHANGER
Crossflow configuration.
Designed for air flows up to 100.000 m³/h, with efficiency of up to 80%.
Eurovent certified.
Low pressure drop.
Heat recovery in high energy consumption environments.
Aluminium construction with exclusive AERO-SHIELD acrylic protection.
Additionally sealed exchanger block.
Fully removable.

FANS FEATURING EC TECHNOLOGY
Efficiencies of up to 90%.
Extended service life.
Each individual fan features its own inverter, mounted directly onboard.
Motor efficiency beyond IE4 efficiency class requirements, without using rare earth magnets.
Backward curved steel blades minimise bearing loads, maximise durability and offer high circumferential velocities.

BYPASS DAMPER SYSTEM
Modulating damper to bypass air from the hot aisle, mixing it with the cold supply air.
Ensures optimum operation whatever the conditions.

ADIABATIC COOLING SYSTEM
Creates 10 micron droplets.
High pressure system for leading performance.
Total droplet surface area 3 times higher than standard low pressure systems (increased efficiency, faster evaporation).
Inverter controlled pumps with integrated timers for optimised spraying cycles.
Noise level below 55 dB.

RESISTANT CONSTRUCTION
Stainless steel panels and base plates in all sections subject to humidity.
Ensures longevity and increased reliability.

MECHANICAL COOLING SYSTEM (OPTIONAL)
Choice between direct expansion system and cooling water coil.
Ensures precise temperature control even in extreme conditions.
Offers added redundancy.
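The "3 times higher" droplet area claim follows directly from droplet geometry: for a fixed volume of sprayed water, total surface area scales inversely with droplet diameter. A quick check in Python, assuming (hypothetically) a 30 micron droplet size for a standard low pressure system:

    import math

    def total_droplet_area_m2(water_volume_m3: float, droplet_diameter_m: float) -> float:
        """Total surface area of a water volume atomised into equal spheres."""
        r = droplet_diameter_m / 2.0
        n_droplets = water_volume_m3 / ((4.0 / 3.0) * math.pi * r ** 3)
        return n_droplets * 4.0 * math.pi * r ** 2  # simplifies to 6 * V / d

    high_pressure = total_droplet_area_m2(0.001, 10e-6)  # 1 litre at 10 microns
    low_pressure = total_droplet_area_m2(0.001, 30e-6)   # 1 litre at 30 microns (assumed)
    print(round(high_pressure / low_pressure, 1))        # -> 3.0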

EXCEPTIONAL RUNNING COST REDUCTIONS
Thanks to advanced indirect cooling technology, IEC offers the lowest levels of PUE (Power Usage Effectiveness), and as a consequence allows the data centre user to obtain both significantly reduced annual running costs and a notably reduced carbon footprint.

LOWEST pPUE
pPUE is defined as the ratio between the total amount of energy used by a data centre and the energy delivered to the computing equipment: the closer to 1,0, the more efficient the system. Compared with both current average levels and alternative technologies, IEC offers significantly lower pPUE levels:

IEC: pPUE 1,04
Dynamic Setpoint Chillers: pPUE 1,07
Traditional Free Cooling Chillers: pPUE 1,80
Average market value: pPUE 2,10

MINIMAL RUNNING COSTS
Thanks to near complete use of free cooling via the external air, IEC power consumption is drastically reduced versus alternative technologies. Mechanical cooling is either completely avoided or, in the hottest conditions only, notably reduced; the only significant energy consumption therefore derives from the advanced EC fans, with efficiencies of around 90%.

[Chart: annual cooling cost comparison between the average market value, traditional free cooling chillers, Dynamic Setpoint chillers and IEC.]

WATER CONSUMPTION
Water is used in the adiabatic cooling phase only. Thanks to Aermec's 10 micron droplet distribution network and the intrinsic efficiency of the main heat exchanger, which reduces the periods when adiabatic intervention is required, water consumption can be contained to little more than 1.000 m³ per annum for a 1 MW data centre.

MAINTENANCE
IEC makes minimal use of moving parts; this benefits not only running costs but also maintenance needs and system reliability. The air and water filters can be easily cleaned, the heat exchanger is generally self-cleaning thanks to the adiabatic process, and the fan motors, featuring sealed-for-life bearings, require no programmed maintenance.

References: The Green Grid, 2007, The Green Grid Data Centre Power Efficiency Metrics: PUE and DCiE, Technical Committee White Paper; Environmental Protection Agency, ENERGY STAR Program, 2007, Report to Congress on Server and Data Center Energy Efficiency.
Calculations assume a system energy usage of 4,0 kW per 100 kW system load for a London (UK) based data centre, with ancillary systems (UPS, lighting etc.) at 7,0% of total load.
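The footnote above lets the headline figure be reproduced: 4,0 kW of cooling consumption per 100 kW of system load gives a cooling pPUE of 1,04. A minimal check (the variable names are ours):

    # Reproducing the brochure's headline figure from its own footnote:
    # 4,0 kW of cooling power per 100 kW of system load.
    it_load_kw = 100.0
    cooling_kw = 4.0                     # IEC consumption per the footnote
    ppue = (it_load_kw + cooling_kw) / it_load_kw
    print(f"Cooling pPUE = {ppue:.2f}")  # -> 1.04, the value quoted above
    # Ancillary systems (UPS, lighting etc.) at 7,0% of total load sit on top
    # of this and raise the whole-facility PUE accordingly.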

THE IEC RANGE

IEC MODEL                        100     200     300     400
Nominal cooling capacity (kW)     84     168     252     336
Airflow (m³/h)                 21000   42000   63000   84000
Length (mm)                     5460    5460    5940    5940
Height (mm)                     4048    4048    4048    4048
Width (mm)                      1534    2334    3868    4668

Nominal conditions: external air 23 °C (wb) / 35 °C (db); supply air temperature 24 °C; Δt: 12 K. Performance at differing conditions is available on request.
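As a quick illustration of the modular sizing logic, the capacities in the table above can be used to work out how many modules of a given model cover a load at nominal conditions. This helper is our own sketch, not an Aermec selection tool, and ignores redundancy requirements:

    import math

    # Nominal capacities from the table above (kW per module).
    IEC_CAPACITY_KW = {"IEC 100": 84, "IEC 200": 168, "IEC 300": 252, "IEC 400": 336}

    def modules_for_load(load_kw: float, model: str = "IEC 400") -> int:
        """Smallest number of modules of one model covering load_kw at nominal conditions."""
        return math.ceil(load_kw / IEC_CAPACITY_KW[model])

    # A 1 MW data hall needs 3 x IEC 400 (3 x 336 kW = 1008 kW), before redundancy.
    print(modules_for_load(1000))  # -> 3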

AERMEC: THE COMPANY
Aermec, founded in 1961, counts amongst Europe's longest established air conditioning suppliers. A true pioneer, with over 50 years of innovative, customer-focused solutions, Aermec is present on all continents worldwide, with subsidiaries and affiliates in France, Germany, Italy, the Netherlands, Poland, Russia, Spain and the UK.

A GROUP OF EXPERTS
Aermec is the largest partner within the Giordano Riello International Group, each company acting as a centre of excellence covering the full air conditioning portfolio. This includes, amongst others, specific know-how in heat exchanger technology, ventilation and air sourced conditioning applications. Group turnover stands at €305m, with around 1600 employees.

INNOVATION
It is this unique structure and consolidated presence which allows Aermec to implement state-of-the-art innovation, most notably in the fields of energy saving and environmental footprint minimisation.

QUALITY AND PERFORMANCE
Product quality is an Aermec hallmark. Premium components and meticulous testing of each and every product leaving the factories ensure longevity and trouble-free operation. This is coupled with Eurovent certified performance levels, continuously verified within Aermec's advanced testing labs, which span up to 1500 kW cooling capacity. Specific labs within Aermec also cater for extreme temperature testing, noise level verification and vibration testing.

KNOWLEDGEABLE SUPPORT
Aermec offers expert support for every single project, providing honest consultancy and a partnership built upon trust; the result is a tailor-made solution optimising each individual application.

Aermec's facilities in Bevilacqua (Italy)

EXTENSIVE DATA CENTRE EXPERIENCE
Aermec's experience in data centre cooling technologies spans many years, numerous countries and countless individual projects. Aermec's expert, professional project approach, combined with system efficiency and reliability, renders it a natural choice in data centre applications.

Multiple installations in a total of 17 nations. [Map: countries featuring Aermec data centre solutions.]

CONSULTANCY AND AFTER SALES SUPPORT
Aermec offers data centre users focused technical support, accompanying its customers in strategic data centre decisions and providing a full portfolio of services, including:
System energy efficiency analysis using innovative energy simulation software, allowing you to evaluate the overall system efficiency in order to obtain the lowest possible PUE.
Accurate real operating condition witness tests in leading testing laboratories, allowing customers to validate the performance of the units prior to start-up.
Safety over time: evolved devices supplied with the system allow 24/7 control and supervision of the systems, even remotely, ensuring maximum reliability and peace of mind.
Aermec service personnel available at all times for fast and efficient troubleshooting and on-site interventions.

Aermec S.p.A. Via Roma, 996 37040 Bevilacqua (VR) - Italia Tel. +39 0442 633111 Fax +39 0442 93577 marketing@aermec.com www.aermec.com
Aermec reserves the right to make, at any time, any modifications deemed necessary to improve the product, including changes to technical data.