How Liquid Cooling Helped Two University Data Centers Achieve Cooling Efficiency Goals. Michael Gagnon, Coolcentric, October 13, 2011


Transcription:

How Liquid Cooling Helped Two University Data Centers Achieve Cooling Efficiency Goals. Michael Gagnon, Coolcentric. October 13, 2011

Data Center Trends
Energy consumption in the Data Center (DC):
- 2006: 61 billion kWh (~6 million households), 1.5% of total USA electricity
- 2012: 100 billion kWh (~10 million households), 2.0% of total ($7.4 billion cost)
Energy usage:
- ~40% of the electricity consumed in the DC goes to the air-conditioning system (~$3 billion)
Technology trends:
- Energy consumption is the most dominant trend over the next 5 years (Gartner Group)
- IT equipment power density is increasing
- Consolidation, virtualization and cloud computing lead to higher rack densities
- Companies want to optimize their DC white space with higher-density, fully populated racks containing IT equipment that performs work (high utilization rate)
DC costs:
- Data center costs can exceed $1,000/square foot, of which 70% goes toward mechanical, electrical and cooling
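The slide's 2012 figures are internally consistent, which a quick back-of-the-envelope check shows. The electricity rate below is implied by dividing the stated cost by the stated consumption; it is not a figure from the presentation.

```python
# Sanity check of the slide's 2012 numbers. Only dc_energy_kwh,
# total_cost_usd and cooling_share come from the slide; the $/kWh
# rate is derived, not stated.
dc_energy_kwh = 100e9          # 2012 US data center consumption (kWh)
total_cost_usd = 7.4e9         # stated total electricity cost
cooling_share = 0.40           # ~40% of DC electricity goes to cooling

rate_usd_per_kwh = total_cost_usd / dc_energy_kwh      # implied rate
cooling_cost_usd = dc_energy_kwh * cooling_share * rate_usd_per_kwh

print(round(rate_usd_per_kwh, 3))        # 0.074 ($/kWh)
print(round(cooling_cost_usd / 1e9, 2))  # 2.96 (~$3 billion, as claimed)
```

The implied ~$0.074/kWh rate times the 40% cooling share reproduces the "~$3 billion" air-conditioning cost on the slide.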

Data Center Trends
- Higher education and HPC are early adopters of liquid cooling, no longer using air cooling to remove all of the DC heat
- DC cooling needs are becoming more complex and exceed air-cooling capabilities
Traditional air-cooled systems cannot keep up with the changes:
- Inefficient delivery system, leading to over-provisioning and increased space (CAPEX)
- Higher power consumption (OPEX)
- Higher air velocities = increased noise levels
- Unpredictable cooling can lead to hot spots
Remove heat at the source: row, rack, chassis.
Passive Rear Door Heat Exchangers (RDHx):
- Cost effective
- Increase cooling efficiency for a greener approach
- Easy to deploy in retrofits or new DC builds
- Scalable: add only when you need it
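Removing heat at the rack means the chilled-water loop, not room air, carries the load away. A minimal sizing sketch using the standard relation Q = m_dot * cp * delta_T shows the water flow a rear-door coil would need; the rack load and water temperature rise are assumed example values, not figures from the presentation.

```python
# Rough sizing sketch for rack-level heat removal via chilled water.
# rack_load_w and delta_t are assumed illustrative values.
CP_WATER = 4186.0              # J/(kg*K), specific heat of water

def water_flow_kg_s(heat_w, delta_t_k):
    """Mass flow (kg/s) needed to absorb heat_w watts with a
    delta_t_k kelvin rise in water temperature: Q = m_dot*cp*dT."""
    return heat_w / (CP_WATER * delta_t_k)

rack_load_w = 20_000.0         # assumed 20 kW high-density rack
delta_t = 6.0                  # assumed water temperature rise (K)

m_dot = water_flow_kg_s(rack_load_w, delta_t)
print(round(m_dot, 2))         # ~0.8 kg/s, i.e. roughly 0.8 L/s per rack
```

Under these assumptions a 20 kW rack needs under a liter per second of chilled water, which is why a passive coil with no fans can replace a much larger volume of moving air.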

Traditional Air-cooling System

Coolcentric System Overview
- Passive RDHx: no fans, no moving parts, no power, no noise
- Equipment exhaust air passes through the coil and is cooled before exiting the RDHx
- Chilled water is used above dew point (no condensate)
- Heat is removed from the room through the return-water connection
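Keeping the chilled-water supply above the room's dew point is what guarantees a dry coil. A small sketch using the Magnus dew-point approximation illustrates the check; the room conditions and 13 °C supply temperature are hypothetical examples, not values from the presentation.

```python
import math

def dew_point_c(temp_c, rh_percent):
    """Dew point (deg C) via the Magnus approximation."""
    a, b = 17.27, 237.7
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# Hypothetical room conditions and water supply temperature.
room_temp_c, room_rh = 24.0, 45.0
supply_water_c = 13.0

td = dew_point_c(room_temp_c, room_rh)
print(round(td, 1))            # ~11.3 deg C dew point
print(supply_water_c > td)     # True: coil stays above dew point, no condensate
```

As long as the supply water stays warmer than the computed dew point, no condensate forms and the door needs no drain pan or condensate pump.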

Syracuse University Case Study
- IT requirements were growing faster than existing facilities could handle
- Syracuse faced a decision: retrofit the existing DC or build a new 650 kW DC
- Existing DC infrastructure was tired: old, inefficient and hard to retrofit
Syracuse University options:
- Add more white space: a band-aid solution that didn't address core cooling issues
- Add additional perimeter Computer Room Air Conditioners (CRACs): more space, more power, and an inefficient delivery system
- Add new in-row cooling devices: more space, more power
- Incorporate aisle containment: inflexible, still using old CRACs
Environmental and social responsibility commitment:
- Energy efficiency is a top priority for Syracuse: low PUE, green DC
- Retrofit options could not attain the energy objectives
The decision was made to build a new data center using RDHx as the primary cooling topology.

Syracuse University Case Study
Syracuse was able to achieve its goals:
- Most energy-efficient, green DC: 50% less energy consumption
- Successful deployment of a trigeneration system: gas-powered microturbines, chilled-water RDHx
- Passive RDHx provide the primary cooling; only one CRAC is used for make-up air and humidification control
- RDHx were chosen because they consume minimal floor space, consume no power at the rack and help reduce energy consumption
http://www.syr.edu/greendatacenter/
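The "low PUE" goal the slides mention is just the ratio of total facility power to IT power. A minimal sketch, using hypothetical overhead figures (only the 650 kW design load comes from the slides), shows how shrinking cooling overhead moves PUE:

```python
def pue(total_facility_kw, it_load_kw):
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_load_kw

it_load = 650.0            # 650 kW design load, from the slides
legacy_overhead = 650.0    # assumed cooling + power losses in a legacy DC
rdhx_overhead = 195.0      # assumed reduced overhead with passive RDHx

print(round(pue(it_load + legacy_overhead, it_load), 2))  # 2.0
print(round(pue(it_load + rdhx_overhead, it_load), 2))    # 1.3
```

With the same IT load, cutting the non-IT overhead from 100% to 30% of the IT load drops PUE from 2.0 to 1.3, which is roughly the "50% less energy" scale of improvement the slide claims.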

Purdue University Case Study
Rosen Center for Advanced Computing:
- High-performance computing and storage DC, used for advanced, cutting-edge science, engineering and research
- Historical building with the DC in the basement, never intended to support an IT infrastructure
- No raised floor, limited ceiling height = no flexibility and high noise levels
- The 500 kW power capacity was now being stretched by new requirements
- No more available white space
- Due to power, cooling and chilled-water capacity constraints, the data center was configured with partially populated racks at light load densities; spreading the load had reached maximum space utilization
- No room for additional CRACs, nor did Purdue want to continue using CRACs
- Active fan doors were tested as a space-saving alternative
Purdue University options:
- Fully deploy active fan doors
- Build a new $60 million DC
- Deploy liquid cooling in the form of close-coupled passive RDHx

Purdue University Case Study
Purdue's decision:
- Active fan doors were judged to have low ROI, consumed excessive power for the cooling provided and added too much noise
- The $60 million price tag for a new DC was money the university hadn't allocated
- RDHx were judged to be the most economical and best cooling option
Purdue's reasons for using liquid-cooled, passive RDHx:
- Half of the CRACs could be turned off = less noise, less power
- Remove heat at the source = the ultimate aisle-containment solution
- Scalable solution: can quickly add RDHx when needed
- Small footprint = space conserved
- Eliminates the CRAC single point of failure
- Reduced chilled-water plant load
- Significant energy savings
- Extended the useful life of the existing DC
- DC retrofit costs ~$1 million

RDHx Benefits
- Save energy, reduce carbon footprint: 90% less power draw than a CRAC
- Increase data center utilization and capacity: up to 5X more compute capacity (densify and fully populate racks)
- Save space (or free up space for expansion): up to 84% white-space savings
- Remove heat at the source = predictable cooling
- Eliminate air-movement fans = reduced noise levels
- Lower capital and operating expense: generates savings on day one, with typical TCO payback of 1 year or better
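The "payback of 1 year or better" claim reduces to a simple-payback calculation: capital cost divided by annual savings. The retrofit cost below comes from the Purdue slides; the annual-savings figure is an assumed placeholder chosen only to illustrate the arithmetic, not a number from the presentation.

```python
def simple_payback_years(capital_cost_usd, annual_savings_usd):
    """Simple payback period, ignoring discounting."""
    return capital_cost_usd / annual_savings_usd

retrofit_cost = 1_000_000      # ~$1M Purdue-style retrofit, from the slides
annual_savings = 1_050_000     # assumed combined energy + capacity savings

print(round(simple_payback_years(retrofit_cost, annual_savings), 2))  # 0.95
```

Under this assumption the retrofit pays for itself in just under a year, consistent with the slide's claim; with smaller annual savings the payback period lengthens proportionally.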