PROCESS & DATA CENTER COOLING
How and Why To Prepare For What's To Come
JOHN MORRIS, REGIONAL DIRECTOR OF SALES
CONGRATULATIONS
BOLD PREDICTION
Where We Are TODAY
- Computer technology NOW advances more every hour than it did in its entire first 90 years
- Today, a 7-year-old data center is considered obsolete
- HPC facilities refresh every 3-5 years
- 2018 saw 23+ billion IoT devices connected worldwide
- 2018 saw the largest HYPERSCALE data center deployments
- EXASCALE computing: systems capable of at least one exaflops, i.e. a billion billion (a quintillion) calculations per second
PROCESS & DATA CENTER COOLING
- High Density
- Free Cooling, Adiabatic & Heat Recovery Chillers
- Efficiency & Reliability: increased uptime & performance with the lowest TCO
- Maximize space, build for the future & reduce OpEx with RDHXs
HIGH DENSITY MARKET GROWTH
The global High Performance Computing (HPC) market is expected to grow to approximately USD 33 billion by 2022, at a Compound Annual Growth Rate (CAGR) of approximately 5%. (Source: Market Research Future)
Segmentation by vertical market: Retail, Manufacturing, BFSI, IT and Telecommunication, Healthcare, Energy & Utilities, Transportation, and Other.
Big Data, AI, Machine Learning
- Worldwide consumption of data is outpacing infrastructure
- Edge data centers speed up data transfer, enabling the advancement of IoT, self-driving cars, and other connected devices
- HPC adoption by enterprise companies
- High ROI on HPC investments: an average of $673 in revenue and $44 in cost savings per dollar invested in HPC
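As a quick illustration, the per-dollar return averages quoted above can be applied to a hypothetical investment. The $1M figure and the `hpc_returns` helper are illustrative assumptions, not from the slide:

```python
# Illustrative sketch of the slide's quoted HPC ROI averages:
# $673 in revenue and $44 in cost savings per dollar invested.
def hpc_returns(invested_usd, revenue_per_dollar=673.0, savings_per_dollar=44.0):
    """Return (revenue, cost savings) implied by the quoted averages."""
    return invested_usd * revenue_per_dollar, invested_usd * savings_per_dollar

# A hypothetical $1M HPC investment
revenue, savings = hpc_returns(1_000_000)
print(f"revenue: ${revenue:,.0f}, savings: ${savings:,.0f}")
```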
Multiple Cores -> Thousands of Cores
Integrated Free Cooling Air-Cooled Chiller
Maximizing Chiller Efficiencies
- Increase chilled water temperatures: chiller capacity increases by 30% when producing 60°F water compared to 44°F water
- Reduce overall chiller HP by using a smaller compressor plant
- Higher efficiencies
- Higher chilled water temps allow for more water-side economizing
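The 60°F-vs-44°F rule of thumb above can be sketched as a simple capacity model. The linear per-degree gain is an assumption derived only from the two points the slide gives, for illustration:

```python
# Sketch of the slide's rule of thumb: chiller capacity rises with
# chilled-water supply temperature (~30% more capacity at 60 F vs. 44 F).
# The linear per-degree interpolation is an assumption for illustration.
BASELINE_F = 44.0
GAIN_PER_DEG_F = 0.30 / (60.0 - 44.0)  # ~1.9% per deg F, from the slide's two points

def relative_capacity(chws_f):
    """Capacity relative to the 44 F baseline (1.0 = baseline)."""
    return 1.0 + (chws_f - BASELINE_F) * GAIN_PER_DEG_F

print(f"{relative_capacity(60.0):.2f}x capacity at 60 F")  # 1.30x
```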
Free Cooling Efficiency @ 44/54/95
Free Cooling Efficiency @ 60/70/95
HYPERSCALE FREE COOLING CHILLER PLANT
- Maximize available free cooling hours: use the highest chilled water temps possible, and use redundant chillers for additional free cooling hours when possible
- Plan for operational reliability: low-ambient start, hot-air recirculation
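The "maximize available free cooling hours" step above can be sketched as a simple count over hourly ambient dry-bulb data. The 8°F approach temperature and the toy readings are assumptions for illustration only:

```python
# Sketch: estimating free-cooling hours from hourly ambient dry-bulb data.
# Full free cooling is assumed available when ambient is at or below the
# chilled-water supply temperature minus an assumed approach of 8 F.
def free_cooling_hours(ambient_f_hourly, chws_f, approach_f=8.0):
    """Count hours where ambient is low enough for full free cooling."""
    threshold = chws_f - approach_f
    return sum(1 for t in ambient_f_hourly if t <= threshold)

# Toy stand-in readings: raising CHWS from 44 F to 60 F adds free-cooling hours.
temps = [30, 35, 40, 45, 50, 55, 60, 65]
print(free_cooling_hours(temps, 44), free_cooling_hours(temps, 60))  # 2 5
```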
Adiabatic Efficiency Booster System: Air-Cooled Free Cooling Chiller
Adiabatic Energy Analysis
- Fills the void between traditional chillers & cooling towers
- Warmer chilled water with NO tower treatment
- Demand charge savings
Simultaneous Heat Recovery Air- or Water-Cooled Chiller
USED FOR:
- VAV re-heat
- Facility hot water
- Dehumidification
Heat Recovery Chiller
- 1.3 megawatts of IT load
- ROI on heat recovery system of ~1.7 months
THE EVOLUTION OF DATA CENTER SERVER & RACK COOLING
1. Chaos air distribution systems
2. Contained air distribution systems
3. Row-based liquid-to-air systems
4. RDHx systems (active & passive)
5. Liquid cooling systems (direct-to-chip, on-chip & immersion)
CHAOS Air Distribution Systems
- Perimeter CRACs/CRAHs push massive volumes of cooled air through under-floor distribution
- Room-based air conditioning with hot aisle & cold aisle SEPARATION
LIMITATIONS: low rack density, footprint, and scalability; air is allowed to mix inefficiently and move freely throughout the data center
Contained Air Distribution Systems
- Room-based air conditioning with hot or cold aisle CONTAINMENT
- Improved efficiency & reliability: eliminated air re-circulation and allowed increased temperatures
LIMITATIONS: airflow management (hot spots) & footprint
ROW BASED LIQUID TO AIR SYSTEMS
- Row-based liquid-to-air, with or without containment
- Well-known system; good for medium densities
- Can leverage slightly warmer water temps
LIMITATIONS: airflow challenges remain, along with footprint & scalability for growth
Direct-to-Chip or On-Chip & Immersion: up to 100 kW
- Active, water-based: up to 75 kW+
- Active, refrigerant-based: up to 30 kW
- Passive, water-based: up to 20 kW *
- Passive, refrigerant-based: up to 20 kW *
Maximize Power & Space
EXAMPLE:
- 9 kW per rack existing
- 20 kW per rack potential
- 110W estimated power draw
- 130% headroom for future IT
- NO additional changes required
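The headroom arithmetic in the example can be checked directly. Note the raw ratio from 9 kW to 20 kW works out to about 122%, slightly below the 130% figure the slide quotes:

```python
# Checking the rack-headroom arithmetic from the example:
# 9 kW/rack today vs. a 20 kW/rack potential.
existing_kw, potential_kw = 9.0, 20.0
headroom = (potential_kw - existing_kw) / existing_kw
print(f"{headroom:.0%} headroom")  # 122% headroom
```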
Benefits
- Remove up to 100% of the heat at its source
- More usable rack space
- Scalable solution
- Server rack and hardware agnostic
- Remove up to 100% of CRAH units
- Can be designed with no minimum load requirements
- 100% sensible heat
- Water-based only
- Simple maintenance
CapEx COST COMPARISON
[Bar chart: capital cost vs. traditional cooling strategies for 33 racks / 1 MW of compute; categories ARDHx, CRAH, In-Row, In-Row w/ HACs; values (in thousands) $370, $440, $721, $812]
OpEx COST COMPARISON
[Bar chart: operating cost vs. traditional cooling strategies for 33 racks / 1 MW of compute at $0.1127 per kWh; ARDHx $37,552 vs. $117,229-$133,884 for CRAH, In-Row, and In-Row w/ HACs]
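The annual-OpEx arithmetic behind this comparison is straightforward: average cooling power draw times hours per year times the utility rate. The $0.1127/kWh rate is from the slide; the 38 kW average cooling draw is a hypothetical figure chosen only to show that it lands near the ARDHx operating cost:

```python
# Sketch of the annual-OpEx arithmetic behind the comparison:
# average cooling power draw (kW) x hours per year x utility rate ($/kWh).
# The 38 kW draw is a hypothetical figure for illustration.
RATE_PER_KWH = 0.1127  # rate quoted on the slide
HOURS_PER_YEAR = 8760

def annual_opex(avg_cooling_kw, rate=RATE_PER_KWH):
    """Annual operating cost in dollars for a given average draw."""
    return avg_cooling_kw * HOURS_PER_YEAR * rate

print(f"${annual_opex(38.0):,.0f}/yr")  # ~$37.5k/yr
```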
Free Cooling - North America
The industry is moving away from a TRADITIONAL design approach (Building -> Data Hall -> Server Racks -> Cooling) to a TAILORED design approach (Server Rack -> Infrastructure -> Cooling -> Data Hall).
FACILITY DESIGN
- Start from the server rack IT load and design outward
- Design the room around the selected racks
- Install optimal infrastructure
End result is a better-suited data center: lowest TCO, increased reliability
COOLING SOLUTION Design Your System with a Partner Who Understands Both IT & Infrastructure
- New innovative products
- System design flexibility
- Continued adoption of high-density applications and designs
HOW CAN WE HELP?
Motivair Cooling Solutions will improve equipment uptime and performance while reducing total cost of ownership and operation for your next high density project.
Contact Us Today: reach out to your TCA team to schedule an exploratory discussion on your existing infrastructure's ability to handle any upcoming high density installations.
THANK YOU!
JOHN MORRIS
Phone: +1 716-983-7437
Email: jmorris@motivaircorp.com