FREEING HPC FROM THE DATACENTRE


ku:l sistem™ COOLING REDEFINED www.iceotope.com

ku:l sistem removes the need for inefficient air-cooling technologies

FREEING HPC FROM THE DATACENTRE

Iceotope's patented immersive liquid-cooling technology provides unrivalled benefits, freeing HPC from the datacentre. At Iceotope we harvest the heat generated by power electronics at source, directly into our engineered fluid, removing the need for inefficient air cooling and its associated infrastructure, cost, space and complexity.

DEPLOYMENT SIMPLIFIED
BUILT TO LAST
RELIABLE AND CONSISTENT
SMART CITY READY
SUBSTANTIAL SAVINGS

Figure 1 - Cross-section through 1U chassis, showing coolant recirculation

"We have no need for chilled air, fans, CRAC units, hot-aisles, cold-aisles and all the space-hungry infrastructure of traditional air cooling."

KEY POINTS

Within a sealed 1U chassis, we create a protective cooling environment for precious electronics, surrounding them with dielectric coolant. Heat is harvested from every component, and we deliver the coolest coolant where it is needed most, minimising coolant volume. A gently recirculating coolant flow inside the sealed chassis transfers the harvested heat to a building circuit via a plate heat exchanger.

Our dielectric coolant is non-oily, non-solvent and non-toxic; it leaves no residue and does not affect the electronics.

Building coolant can enter the heat exchanger at up to 45°C and exit at up to 50°C. At these high operating temperatures there is no need for chillers: heat can be rejected to ambient using dry coolers, or captured and reused for heating. Using dry coolers means that our system consumes no water.

In short, we can take high performance computing out of the datacentre and run it, silently, wherever it is needed.

Figure 2 - Plan view of 1U chassis shows coolant recirculation

Figure 3 - Building blocks of the system

REDUCE COSTS

- Up to 25% CapEx reduction
- Up to 75% less floor space
- Up to 70% energy reduction
- Up to 35% overall TCO reduction
- PUE of 1.03
- Facility-level cooling water in at up to 45°C and out at up to 50°C
- No requirement for water chillers
- Losing 5°C from the 50°C output water is easily achieved using a cost-effective, lightweight dry cooler

IMPROVE PERFORMANCE

- Increases the life cycle of your IT
- Maximises memory bandwidth by up to 30%
- No thermal shock or heat fatigue: all components are kept at the same temperature
- Maximises CPU performance, allowing processors to run at full capacity (Turbo Boost mode) without throttling
- IT can run at maximum performance continuously
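The headline figures above can be sanity-checked with first-principles heat-transfer arithmetic (Q = ṁ·cp·ΔT). The sketch below is an illustrative calculation, not Iceotope data beyond the 40 kW per rack, 45-50°C water temperatures and PUE of 1.03 quoted in this brochure: it estimates the facility-water flow needed to carry a full rack's load at the stated 5°C rise, and the cooling overhead implied by the quoted PUE.

```python
# Illustrative sanity check of the brochure's headline figures,
# using the basic heat-balance relation Q = m_dot * cp * dT.

CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def water_flow_for_load(load_w: float, t_in_c: float, t_out_c: float) -> float:
    """Mass flow (kg/s) of facility water needed to carry `load_w` watts
    between the given inlet and outlet temperatures."""
    return load_w / (CP_WATER * (t_out_c - t_in_c))

def cooling_overhead(pue: float, it_load_w: float) -> float:
    """Facility overhead power (W) implied by a given PUE at a given IT load."""
    return (pue - 1.0) * it_load_w

# A 40 kW rack with water in at 45 °C and out at 50 °C (the brochure's figures):
flow = water_flow_for_load(40_000, 45.0, 50.0)
print(f"Required facility-water flow: {flow:.2f} kg/s")  # roughly 1.9 kg/s, ~1.9 L/s

# At PUE 1.03, supporting that 40 kW rack costs only ~1.2 kW of overhead:
print(f"Overhead at PUE 1.03: {cooling_overhead(1.03, 40_000):.0f} W")
```

A flow of roughly 1.9 L/s per 40 kW rack is well within the capability of an ordinary building water loop, which is consistent with the claim that no chillers or specialised plant are required.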

IMPROVE RELIABILITY

- Full server heat capture: total immersion cooling covers all IT, power and mainboard components
- Immersion cooling provides a protective environment for the IT
- Immersion cooling tackles the problem of heat at source
- IT does not become contaminated with airborne dust
- System is Tier 3 compliant straight out of the box
- Twin power feeds can easily be incorporated into the chassis design
- Cooling redundancy is easily facilitated at rack level, with A and B feeds into the rack-level manifold
- Failure domain of one chassis

REDUCE INFRASTRUCTURE

- Up to 66% reduction in floor space
- No requirement for additional air-handling equipment or air-cooling infrastructure (chillers and CRACs), nor the expensive maintenance and support of such units
- No requirement for cold aisle containment
- Deploy in the harshest environments: as we use no fans, nothing can be contaminated
- Less complex datacentre designs with less building infrastructure: no raised reinforced floors, ducting or false ceilings
- Maximise datacentre real estate: deploy anywhere, utilise dead space, use unused brownfield sites
- Increase density: more power per rack, up to 40 kW per rack
- Easy implementation into an existing air-cooled environment (chassis, rail kit, manifold)

FLEXIBILITY

- Retrofits into a standard 19-inch rack
- Can live alongside existing air-cooled kit
- The 1U chassis can be configured to accommodate a range of IT/server form factors
- Can cool high-power processors in 1U that would otherwise require 2U or 3U heatsinks if cooled with air
- IT agnostic

ADDITIONAL BENEFITS

- Silent in operation
- Highly secure
- Clean, environmentally friendly dielectric coolant (unlike other solutions on the market)
- Coolant is Factory Mutual approved, with zero flash point
- Hot output water can be used for heating
- Easy to incorporate into an existing cabinet
- Waste heat can be recaptured and reused within the building
- Lightweight heat rejection means less structural load on the roof
- The facility-level cooling system recirculates, so it consumes no water
- Deal direct with the manufacturer: one point of contact and responsibility
- We can fit into a customer's supply chain by licensing our design

GET IN TOUCH
T: +44 (0)114 224 5500
E: info@iceotope.com
E: sales@iceotope.com
AMP Technology Centre, Advanced Manufacturing Park, Brunel Way, Sheffield S60 5WG, UK

COOLING REDEFINED www.iceotope.com