
Rolf Brink Founder/CEO +31 88 96 000 00 www.asperitas.com

COPYRIGHT 2017 BY ASPERITAS. Asperitas, Robertus Nurksweg 5, 2033AA Haarlem, The Netherlands. Information in this document can be freely used and distributed for internal use only. Copying or redistribution for commercial use is strictly prohibited without written permission from Asperitas. If you would like to use the original PowerPoint version of this document, you can send an email to: Marketing@Asperitas.com. FOR MORE INFORMATION, FEEDBACK OR TO SUGGEST IMPROVEMENTS FOR THIS DOCUMENT, PLEASE SEND YOUR SUGGESTIONS OR INQUIRIES TO: WHITEPAPERS@ASPERITAS.COM

THE DATACENTRE OF THE FUTURE
A DATACENTRE IS NOT ABOUT ICT, POWER, COOLING OR SECURITY. IT IS NOT EVEN ABOUT SCALE OR AVAILABILITY OF THESE SYSTEMS. IT IS ABOUT THE AVAILABILITY OF INFORMATION.

THE CHALLENGES
INCREASE IN INFORMATION FOOTPRINT
DEMAND FOR HIGH DENSITY CLOUD
CONSOLIDATION OF POWER DEMAND
OVERLOADING OF OUTDATED POWER GRID
GLOBAL NETWORK LOAD
CREATING EXERGY

ENERGY FOOTPRINT OF INFORMATION FACILITIES
Datacentres, server rooms, network hubs, etc.
Estimated 4% of global electricity production (25 PWh)
4% = 1 PWh = 1,000,000,000,000,000 Wh (10^15 Wh)

EXPLANATION PREVIOUS SLIDE (ENERGY FOOTPRINT)
There is no real data available which can substantiate any sort of claim to the percentage mentioned. It is a very rough estimate on a global scale. The fact that the information is unavailable is part of the problem: you cannot solve a problem that cannot be identified. https://yearbook.enerdata.net/electricity/world-electricity-production-statistics.html
The only available source for a global figure is the European Commission, where Neelie Kroes mentioned datacentres being responsible for 2% of global electricity production and ICT in general for 8-10% (2012).

GLOBAL PUE: APPROX. 1.7
Global estimated PUE breakdown:
  ICT 59%, of which: Information (65%) 40%, Fans (20%) 10%, Power supply (15%) 9%
  Cooling 37%
  UPS/Nobreak 2%
  Other 2%

EXPLANATION PREVIOUS SLIDE (GLOBAL PUE ESTIMATE)
There is no real data available which can substantiate the percentages mentioned. Everything is a very rough estimate on a global scale. For both the power supply and fan percentages, the estimates are based on the following factors:
Most datacentres (colo) have no control over the internal temperature management of the IT.
Environmental temperatures in datacentres are usually too high for optimal power efficiency of the IT.
IT is generally greatly under-utilised (30% is already high in many cloud environments; corporate IT in colo is often worse), which creates a very high overhead for both the PSU and the fans.
Most cloud and corporate back-office platforms run on inefficient, cheap IT hardware, purchased on CAPEX rather than OPEX. The reason for this is that power consumption is not part of the IT budget.
Virtually all high density cloud environments (5-10 kW/rack) are based on 1U servers, which are the least efficient when it comes to fan overhead.

ACTUAL INFORMATION EFFICIENCY
Actual efficiency, 1000 TWh total energy:
  Cooling + IT fans 47%: 471 TWh
  UPS + IT power supply 11%: 112 TWh
  Other 2%: 17 TWh
  Information 40%: 400 TWh (energy actually used for information)
PUE equivalent: 1000 TWh / 400 TWh = 2.5
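
To make the arithmetic behind this slide explicit, here is a minimal sketch (plain Python, figures taken from the slide) that reproduces the 2.5 "PUE equivalent" from the estimated breakdown:

    # Estimated global figures from the slide, in TWh per year
    total = 1000.0            # total energy drawn by information facilities
    cooling_and_fans = 471.0  # facility cooling plus IT fan power
    power_loss = 112.0        # UPS and IT power-supply losses
    other = 17.0              # remaining facility overhead

    information = total - cooling_and_fans - power_loss - other
    print(information)          # 400.0 TWh actually spent on information
    print(total / information)  # 2.5 -> "PUE equivalent"
    # A conventional PUE of ~1.7 counts IT fans and PSU losses as useful IT load,
    # which is why it understates the real overhead shown here.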

ENERGY TRANSFORMATION
Global thermal energy production by information: 529 TWh (information 400 + power losses 112 + other overhead 17, all of which ends up as heat)
529 TWh x 3600 J/Wh = 1,904,400,000,000,000,000 J, or roughly 1.9 EJ (10^18 J)
Energy spent on heat rejection, EXERGY DESTRUCTION: 471,000,000,000,000 Wh (471 TWh)

WHAT IF INFORMATION FACILITIES COULD...
EXCLUDE COOLING INSTALLATIONS
REDUCE IT OVERHEAD
BALANCE THE POWER GRID
REDUCE THE DATA NETWORK LOAD
BECOME ENERGY PRODUCERS

COOLING THE CLOUD
1 MW critical load, ΔT of 10 C
Thermal production: 1 MJ/s
1 C rise with air: 1005 J/(kg C) * 0.001205 kg/L = 1.21 J/(L C)
1 MJ/s requires: 1,000,000 J/s / (10 C * 1.2 J/(L C)) ≈ 83,333 L/s of AIR

83,333 L/s AIR VS WATER
Water required for 1 MJ/s at a ΔT of 10 C:
4187 J/(kg C) * 1 kg/L = 4187 J/(L C)
10 C rise with 1 MJ/s: 24 L/s
Liquid can travel 200 TIMES THE DISTANCE with the same thermal losses
THE DATACENTRE OF THE FUTURE: 6 L/s, AND IS AN ENERGY PRODUCER...
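
A short sketch of the volumetric-flow arithmetic used on these two slides (Python; the slide rounds the air figure using 1.2 J/(L C)):

    def flow_l_per_s(heat_w, delta_t_c, heat_capacity_j_per_l_c):
        """Litres per second of coolant needed to absorb heat_w watts at a given delta-T."""
        return heat_w / (delta_t_c * heat_capacity_j_per_l_c)

    AIR = 1005.0 * 0.001205   # J/(kg C) * kg/L  ~ 1.21 J/(L C)
    WATER = 4187.0 * 1.0      # J/(kg C) * kg/L  = 4187 J/(L C)

    print(flow_l_per_s(1e6, 10, AIR))    # ~82,600 L/s of air (slide: ~83,333 with 1.2 J/(L C))
    print(flow_l_per_s(1e6, 10, WATER))  # ~24 L/s of water
    print(flow_l_per_s(1e6, 40, WATER))  # ~6 L/s at delta-T 40 C (the datacentre of the future)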

EXPLANATION PREVIOUS SLIDES (WATER VS AIR)
For simplicity, an identical ΔT is maintained for both approaches. After feedback from reviewers and the audience of Datacentre Transformation Manchester, the comparison was raised to a ΔT of 10 C. Air will usually allow the ΔT to become higher than 10 C, although due to poor utilisation this is not always achieved. In CRAC water circuits, the ΔT is usually below 10 C.

THE WRONG QUESTION
DON'T ASK WHICH TECHNOLOGY TO USE
CHOOSE BETWEEN AIR OR LIQUID, THEN COMBINE LIQUID TECHNOLOGIES

TLC - IMMERSED COMPUTING
100% removal of heat from the IT
Highest IT efficiency: fans eliminated, no air required
Level of intelligence: management control and insight, automatic optimisation of the water circuit
Optimised for high density cloud/HPC nodes, varying servers, flexible IT hardware
Feed: 18-40 C / 55 C extreme / max ΔT 10 C

DLC - DIRECT-TO-CHIP LIQUID COOLING
Removes heat from the hottest parts of the IT
Increased IT efficiency by reduced fan power; requires additional cooling (ILC)
Level of intelligence: management control and insight, automatic optimisation of the water circuit
Optimised for HPC racks with identical nodes, very high temperature chips, high density computing
Feed: 18-45 C / max ΔT 15 C

ILC - (ACTIVE) REAR DOOR COOLING
100% removal of heat from the IT
Small IT efficiency gain from assisted air circulation; acts as an air handler in the room
Level of intelligence: management control and insight, automatic optimisation of the water circuit
Optimised for IT with limited liquid compatibility: storage, network, legacy systems and high-maintenance servers
Feed: 18-23 C / 28 C extreme / max ΔT 12 C

OPTIMISING LIQUID INFRASTRUCTURES
Normal operation:
  TECHNOLOGY        INLET      OUTLET
  CRAC (generic)    6-18 C     12-25 C
  ILC (U-Systems)   18-23 C    23-28 C
  DLC (Asetek)      18-45 C    24-55 C
  TLC (Asperitas)   18-40 C    22-48 C
Extreme operation:
  TECHNOLOGY        INLET      OUTLET
  CRAC (generic)    21 C       30 C
  ILC (U-Systems)   28 C       32 C
  DLC (Asetek)      45 C       65 C
  TLC (Asperitas)   55 C       65 C
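
As an illustration only (the windows are taken from the table above; the helper names are made up), the feed windows can be expressed as data to check which technologies can accept a given water temperature, and therefore which can sit further downstream in a chained circuit:

    # Normal inlet/outlet windows from the table above, in degrees C
    FEED_WINDOWS = {
        "CRAC (generic)":  ((6, 18),  (12, 25)),
        "ILC (U-Systems)": ((18, 23), (23, 28)),
        "DLC (Asetek)":    ((18, 45), (24, 55)),
        "TLC (Asperitas)": ((18, 40), (22, 48)),
    }

    def accepts(tech, water_c):
        """True if the technology's normal inlet window covers this water temperature."""
        lo, hi = FEED_WINDOWS[tech][0]
        return lo <= water_c <= hi

    print([t for t in FEED_WINDOWS if accepts(t, 28)])
    # ['DLC (Asetek)', 'TLC (Asperitas)'] -> only DLC/TLC can take 28 C water,
    # which is why they sit at the warm end of a temperature chain.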

INCREASING ΔT WITH TEMPERATURE CHAINING
Serial implementation of the infrastructure, facility input 17 C:
  CRAC: +5 C, output 22 C
  Parallel ILC: +6 C, output 28 C
  Parallel TLC & DLC paired: +16 C, output 44 C
  TLC & DLC paired: +16 C, output 60 C
Facility output 60 C: CREATE HIGH T, USABLE HEAT
3-stage cooling for low water volume:
  Down to 35 C: free-air
  Between 35-28 C: adiabatic
  Below 28 C: chiller
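
A minimal sketch of the chaining arithmetic on this slide: the same water stream passes each stage in series, and each stage adds its own ΔT.

    inlet_c = 17.0
    stages = [("CRAC", 5), ("Parallel ILC", 6), ("Parallel TLC & DLC", 16), ("TLC & DLC paired", 16)]

    temp_c = inlet_c
    for name, rise in stages:
        temp_c += rise
        print(name, temp_c)   # 22, 28, 44, 60 C -> 60 C facility output of usable heat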

TEMPERATURE CHAINING EXAMPLE
Closed room, 3-stage configuration; the ILC setup maintains the air temperature
Water volume decreased by 85%: ΔT 6 C requires 29.9 L/s, ΔT 40 C requires 4.5 L/s
Cooling options:
  Closed cooling circuit with pumps and coolers
  Closed cooling circuit with pumps and reuse behind a heat exchanger
  Open cooling circuit with an external water source supplied for reuse
Facility input 20 C:
  Stage 1 (ILC): 120 kW ILC, +6 C, output 26 C
  Stage 2 (DLC & TLC): 400 kW TLC + 40 kW DLC, +19 C, output 45 C
  Stage 3 (optimised DLC & TLC): 160 kW TLC + 60 kW DLC, +15 C, facility output 60 C
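
The 85% volume reduction follows directly from the larger ΔT; a sketch, assuming roughly 750 kW actually reaches the water circuit (an assumption that reproduces the slide's flow figures):

    heat_w = 750e3          # ~750 kW captured by the water loop (assumption)
    c_water = 4187.0        # J/(L C)

    flow_dt6 = heat_w / (c_water * 6)    # ~29.9 L/s at delta-T 6 C
    flow_dt40 = heat_w / (c_water * 40)  # ~4.5 L/s at delta-T 40 C
    print(1 - flow_dt40 / flow_dt6)      # 0.85 -> 85% less water to circulate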

REUSE MICRO INFRASTRUCTURE
Micro datacentre or server room
Open water circuit, reusable heat requirement: 65 C
Variable volume with a feedback loop for a constant output temperature
Variable temperature facility input: 5-40 C
Constant temperature facility output: 65 C
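
A hedged sketch of the feedback idea on this slide: with an open circuit and a variable inlet temperature, the flow is varied so the outlet stays at the constant reusable 65 C (the 50 kW load is illustrative, not from the slide):

    C_WATER = 4187.0       # J/(L C)
    TARGET_OUT_C = 65.0    # constant-temperature facility output

    def required_flow_l_per_s(heat_w, inlet_c):
        """Flow at which heat_w warms the water from inlet_c up to exactly TARGET_OUT_C."""
        return heat_w / (C_WATER * (TARGET_OUT_C - inlet_c))

    for inlet_c in (5.0, 20.0, 40.0):
        print(inlet_c, round(required_flow_l_per_s(50e3, inlet_c), 2))
    # Colder facility input -> larger delta-T -> lower flow for the same heat load.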

TEMPERATURE CHAINING IMPACT
83,333 L/s AIR VS WATER
Water required for 1 MJ/s at a ΔT of 40 C:
4187 J/(kg C) * 1 kg/L = 4187 J/(L C)
40 C rise with 1 MJ/s: 6 L/s

DATACENTRE DESIGN
DESIGNED FOR AIR
  Cooling options: 100% chillers, or 100% free air/adiabatic + 100% chillers (off)
  High volume, low T (5-20 C)
  Fluid handling: spacious high capacity air ducting, air filtration, hot/cold aisle separation
  Information density (avg): 1.5 kW/m2
  Concrete floor + raised floors
  Power: UPS (IT only) 100%, gensets (facility) 100%
DESIGNED FOR MIXED LIQUID
  Cooling options: external cold water supply by reuser, or 100% free air/adiabatic + 5% chillers
  Low volume, high T (20+ C)
  Fluid handling: normal capacity water circuit, water quality management, minimal fresh-air ventilation
  Information density (mixed): 12 kW/m2
  Bare concrete floor
  Power (compared to air): UPS (IT only) 90%, gensets 60%

SITE PLANNING AND QUALIFICATION
Minimised energy footprint
Minimised installation requirements
Flexibility through minimal environmental impact
FOCUS ON 24/7 HEAT CONSUMERS

REDEFINING THE LANDSCAPE
Large facilities: core datacentres, on the edge of urban areas
Distributed micro facilities: edge nodes, inside the urban area
Energy balancing: distributed, minimised power load, focus on heat reuse

DISTRIBUTED MICRO EDGE NODES (10-100 kW)
ENERGY REUSE:
  Spas, swimming pools (100% reuse)
  Hospitals, hotels with hot water loops (100% reuse)
  Urban fish/vegetable farms with aquaponics (100% reuse)
  District heating (100% reuse)
  Aquifers for heat storage (75% reuse)
  Water mains (29% reuse)
  Canals, lakes and sewage (exergy destruction)
EDGE:
  Edge of network, within urban areas
  IoT capture and processing
  Data caching (Netflix, YouTube, etc.)
  Localised cloud services (SaaS, PaaS, IaaS)
  Minimised facilities: external cooling input, 24/7 energy rejection for reuse
  Geo-redundant, Tesla Powerpack for controlled failover
  District data hub

CORE DATACENTRES
Industrial scale reuse infrastructures, 100% 24/7 heat reuse:
  Agriculture
  Spas
  Cooking, pressurising, sterilisation, bleaching
  Distillation, concentrating, drying or kilning
  Chemical
Large facilities:
  No-break systems
  Limited cooling infrastructure
  24/7 information availability
  Edge management, replicated edge back-end
  Communication hub
Exergy destruction: rivers/ocean, liquid-to-air rejection

EDGE MANAGEMENT
Emerging platforms for decentralised cluster management
Integration with energy and heat management
(Diagram: cloud demand from cloud customers and heat demand from heat customers meet in a distribution and job-scheduling layer; computing jobs are sent to hardware installations such as Q.rads, O.MAR and AIC24, which return computing results and free, green heat.)

LIQUID INFORMATION FACILITIES
Reduced or eliminated technical installations: cooling, no-break
Reduced build cost: no raised floors, reduced space for fluid handling, increased and distributed power density, reduced m2
Reduced operational cost: reduced maintenance on installations, high IT density, higher-spec IT hardware (also for cloud), reduced software cost

LIQUID IS COMING - HOW TO PREPARE?
Design for (redundant) water to the IT
Sufficient ability to distribute piping to the whitespace
Plan for reusable heat
Plan for a liquid way of working: IT maintenance rooms for wet equipment, staff training for liquid management, proper supplies and tooling

WHAT NEEDS TO BE DONE?
Focus on heat consumption without dependency: not with an invoice, but with a free cooling guarantee
Government involvement: incentives, an intermediary for (industrial) heat reuse, information footprint as part of district planning, more (low grade) heating networks
Focus on TCO, not CAPEX
PUE: need for a new, easy metric. PUE figures are widely manipulated, PUE discourages IT efficiency, and a metric needs to give insight into the actual inefficiency (see the global PUE breakdown earlier)
FEAR OF WATER

THE DATACENTRE OF THE FUTURE? THE FUTURE IS NOW!