LANL High Performance Computing Facilities Operations. Rick Rivera and Farhad Banisadr. Facility Data Center Management

Slide-1 LANL High Performance Computing Facilities Operations. Rick Rivera and Farhad Banisadr, Facility Data Center Management Division.

Slide-2 Outline
- Overview: LANL computer data centers
- Power and cooling
- CFD data center modeling
- Data center power usage and efficiency
- Exa-scale challenges

Slide-3 Division Mission

Slide-4 World-class facilities to support high performance computing systems.
Nicholas C. Metropolis Center for Modeling and Simulation (SCC), 2002:
- 43,500 sq ft computer room floor
- 9.6 MW rotary UPS power
- 9.6 MW commercial power
- 6,000 tons chilled water cooling
Laboratory Data Communications Complex (LDCC), 1989:
- 20,000 sq ft computer room
- 8 MW commercial power
- 3,150 tons chilled water cooling

Slide-5 Systems
Secure Classified Computers (SCC):
- Redtail: IBM, 64.6 teraflop/s
- Roadrunner: IBM (Cell), 1376 teraflop/s (15 kW/rack)
- Hurricane: Appro, 80 teraflop/s
- Typhoon: Appro, 100 teraflop/s
- Cielo: Cray XE6, 1370 teraflop/s (54 kW/rack)
Open Unclassified Computers (LDCC):
- Yellowrail: IBM, 4.9 teraflop/s
- Cerrillos: IBM, 160 teraflop/s
- Turing: Appro, 22 teraflop/s
- Conejo: SGI, 52 teraflop/s
- Mapache: SGI, 50 teraflop/s
- Mustang: Appro, 353 teraflop/s

Slide-6

Slide-7 Electrical Power Distribution, Metropolis Center (SCC)
Building-specific SCC features:
- Multiple 13.2 kV feeders
- Double-ended substations
- Rotary uninterruptible power supply (RUPS)
- Scalability
- Power conditioning: transient voltage surge suppressor (TVSS)
Distribution path: substation, switchgear plus RUPS power distribution, PDU power distribution, supercomputing systems.

Slide-8 Metropolis Center (SCC) Infrastructure Upgrade Project
- Split substations (19.2 MW)
- Added 1,200-ton chiller
- Added cooling tower
- 14 additional air handling units (AHUs)
- Water cooling (9 MW load)

Slide-9 Air Systems. Cold air generated by the air handling units is forced up through perforated tiles from the computer room sub-floor. The air passes through the supercomputers, removing their heat, and is then vented through the ceiling tiles and return plenums to complete the path back to the air handling units below. As the heat densities associated with supercomputers increase, airflow modeling becomes critical; the graphic on this slide shows a computer room air temperature profile.

Slide-10 Heat Flow. Supercomputers require continuous and significant cooling. This diagram traces the path of heat flow through the different cooling-plant components, fluids, and systems.

Slide-11

Slide-12 LANL/Cielo Update

Slide-13

Slide-14

Slide-15 Comparative Energy Costs

Slide-16 Power Usage Effectiveness (PUE): PUE = Total Facility Power (kW) / IT Equipment Power (kW)

Slide-17 Calculating PUE
- What to measure, and how often? The importance of consistency.
- How to measure: manual readings of BAS, UPS, and PDUs; instrumentation; real-time measurement; wireless meters and sensors; branch circuit monitoring; power usage software.

Slide-18 SCC Power Usage Effectiveness
- Cooling power: substations DUS-1, D, E (99% efficient), 13.2 kV power distribution switchgear = 2.78 MW HVAC power.
- Computing power: substations A, B, C, H (99% efficient) + RUPS (92% efficient) + SWB (100% efficient) + PDUs (96% efficient) = 7.89 MW IT power.
- Load breakdown: IT load 67%, chillers 14%, AHUs/pumps 18%, miscellaneous 1%.
- Facility power 11.8 MW; IT power 7.89 MW; PUE = 11.8 / 7.89 = 1.49.
- HVAC load 2,206 tons (7.76 MW); HVAC efficiency 1.26 kW/ton; COP = 7.76 / 2.78 = 2.79.
(Diagram components: cooling towers, chiller plant, AHUs, SCC computer load.)
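The arithmetic on this slide can be reproduced directly. The snippet below is a minimal sketch (Python, with illustrative variable names) that recomputes PUE, the cooling plant's kW/ton figure, and the COP from the numbers quoted above; only values printed on the slide are used.

```python
# Reproduces the Slide-18 arithmetic; input values are the ones quoted on
# the slide, variable names are illustrative.

facility_power_mw = 11.8    # total facility power
it_power_mw = 7.89          # IT (computing) power after RUPS/SWB/PDU losses
hvac_power_mw = 2.78        # cooling (HVAC) electrical power
hvac_load_tons = 2206       # cooling load in refrigeration tons
hvac_load_mw = 7.76         # the same cooling load expressed in MW of heat

pue = facility_power_mw / it_power_mw               # 11.8 / 7.89 ~= 1.5 (the slide quotes 1.49)
kw_per_ton = hvac_power_mw * 1000 / hvac_load_tons  # ~1.26 kW per ton of cooling
cop = hvac_load_mw / hvac_power_mw                  # ~2.79: heat removed per unit of HVAC power

print(f"PUE    = {pue:.2f}")
print(f"kW/ton = {kw_per_ton:.2f}")
print(f"COP    = {cop:.2f}")
```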

Slide-19 HPC current and future efficiency practices to reduce PUE
- Data center metering/measurements
- Variable speed drives (VSDs)
- Cold aisle containment
- Close-coupled liquid cooling
- Air-side economizers
- Water-side economizers

Slide-20 Circulation power and cooling fluid choice
Air cooling:
- Circulation power grows very rapidly; 9% is typical for low-power-density data centers using CRAC units, and doubling power density could increase this eight-fold.
- Large heat sinks limit rack packing efficiency.
Water cooling:
- About 3,500 times more effective than air at heat removal, with lower pumping cost (see the sketch after this list).
- Plumbing costs; earthquake bracing needed.
- Risk of leaks and of condensation near computers in humid climates.
- Attractive at high power densities.
Variant: short air loop to a nearby heat exchanger using Freon or similar fluids:
- Stable temperature, governed by the phase transition.
- Risk of leaks; inert, but displaces oxygen.
- More complex plant, additional pumping costs, and much variation in design and efficiency.
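The "about 3,500 times" figure for water can be sanity-checked with a back-of-envelope comparison of volumetric heat capacity (density times specific heat). The sketch below uses textbook property values for water and room-temperature air, which are assumptions not taken from the slide.

```python
# Back-of-envelope check of the "~3,500x" claim: compare how much heat a
# cubic metre of each fluid carries per degree of temperature rise.
# Property values are textbook approximations (not from the slide).

rho_water, cp_water = 1000.0, 4186.0   # kg/m^3, J/(kg*K)
rho_air,   cp_air   = 1.2,    1005.0   # kg/m^3, J/(kg*K) at roughly 20 C, sea level

ratio = (rho_water * cp_water) / (rho_air * cp_air)
print(f"Volumetric heat capacity, water vs air: ~{ratio:.0f}x")
# Roughly 3,500x: far less fluid volume (and far less circulation power)
# is needed to move the same heat with water than with air.
```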

Slide-21 Multiple Equipment Summary View Individual Substation View LA-UR 11-03816

Slide-22 Power Usage Effectiveness (PUE). PUE is a measure of how efficiently a computer data center uses its power: the ratio of total facility power to IT equipment power. Facility power includes lighting, cooling, etc.; IT equipment includes computer racks, storage, etc. PUE = Total Facility Power / IT Equipment Power. NOTE: There was a communication loss on May 7th. LA-UR 11-03816

Slide-23 Data Center Infrastructure Efficiency (DCIE). DCIE is a percentage measure of computer data center efficiency, obtained as the inverse of PUE: DCIE = (IT Equipment Power / Total Facility Power) x 100. NOTE: There was a communication loss on May 7th. LA-UR 11-03816
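As a minimal sketch of the two definitions on Slides 22 and 23, the helper functions below (hypothetical names, not part of any LANL tooling) compute PUE and DCIE from the same pair of measurements; the example numbers are the SCC facility and IT powers from Slide-18.

```python
# Sketch of the PUE and DCIE definitions from Slides 22-23.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT equipment power."""
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Data Center Infrastructure Efficiency: inverse of PUE, as a percentage."""
    return 100.0 * it_equipment_kw / total_facility_kw

# Example using the Slide-18 SCC values (11.8 MW facility, 7.89 MW IT):
print(f"PUE  = {pue(11_800, 7_890):.2f}")     # 1.50 (the slide quotes 1.49 from its own rounding)
print(f"DCIE = {dcie(11_800, 7_890):.1f} %")  # ~66.9 %
```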

Slide-24 Individual Machine Monitoring and Resulting PUE. (Chart: Cielo, Redtail, and Roadrunner power in kW plotted alongside Power Usage Effectiveness, PUE.) NOTE: There was a communication loss on May 7th. LA-UR 11-03816

Slide-25 Metropolis Center Overall Performance, May 2011. (Chart: DCIE (%), PUE, IT load (kW), and total load (kW).) NOTE: There was a communication loss on May 7th. LA-UR 11-03816

Slide-26 Identify Direct Correlations. (Chart: Cielo, Redtail, and Roadrunner power (kW), total IT load (kW), total facility load (kW), PUE, and DCiE (%).) LA-UR 11-03816

Slide-27 Wireless Network Topology: Thermal Monitoring. LA-UR 11-03816

Slide-28 Ceiling Temperatures, Humidity, and Under-Floor Air Pressure. (Annotated readings: static pressure 0.5 in. w.c.; temperatures of 65 F and 103 F; humidity of 53% and 49%.) LA-UR 11-03816

Slide-29 Wireless Information Summary from Cielo. (Chart: exhaust temperature (F), ceiling temperature (F), supply temperature (F), return temperature (F), and power (kW).) LA-UR 11-03816

Slide-30 SCC Data Center Efficiencies, live values recorded May 31, 2011 at 8:00 am
- Cielo 1,578 kW; Roadrunner 1,568 kW; Redtail 1,746 kW; miscellaneous IT load 1,257 kW
- DCiE (Data Center Infrastructure Efficiency) = 74.81%
- PUE (Power Usage Effectiveness) = 1.337
- Reference scale (PUE/DCiE): 2.0/50% = average, 1.5/67% = efficient, 1.2/83% = very efficient
LA-UR 11-03816
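As an illustrative sketch (not LANL tooling), the snippet below sums the per-machine IT loads from this slide, derives DCiE from the reported PUE, and places the result on the slide's reference scale; treating the scale entries as thresholds is an assumption made here for illustration.

```python
# Sums the Slide-30 IT loads and rates the reported PUE against the
# slide's reference scale (thresholds treated as upper bounds, an assumption).

it_loads_kw = {"Cielo": 1578, "Roadrunner": 1568, "Redtail": 1746, "Misc IT": 1257}
reported_pue = 1.337

total_it_kw = sum(it_loads_kw.values())   # 6149 kW
dcie_pct = 100.0 / reported_pue           # ~74.8 %, matching the slide's 74.81 %

if reported_pue <= 1.2:
    rating = "very efficient"
elif reported_pue <= 1.5:
    rating = "efficient"
elif reported_pue <= 2.0:
    rating = "average"
else:
    rating = "worse than average"

print(f"Total IT load: {total_it_kw} kW")
print(f"DCiE: {dcie_pct:.1f} %  ->  {rating}")
```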

Slide-31 Challenges, Exa-Scale and Beyond: power, cooling, weight, environmental.

Slide-32 Energy Efficient HPC Working Group (EE HPC WG)
- Supported by the DOE Sustainability Performance Office
- Organized and led by Lawrence Berkeley National Laboratory
- Participants from DOE national laboratories, academia, and various federal agencies
- HPC vendor participation
- The group selects energy-related topics to develop

Slide-33 Liquid Cooling Sub-committee
Goal: encourage highly efficient liquid cooling through the use of high-temperature fluids.
- Eliminate compressor cooling (chillers) at National Laboratory locations.
- Standardize temperature requirements: a common understanding between HPC manufacturers and sites.
- Ensure practicality of recommendations: collaboration with the HPC vendor community to develop attainable recommended limits.
- Industry endorsement of recommended limits: collaboration with ASHRAE to adopt the recommendations in a new thermal guidelines white paper.

Slide-34 Direct Liquid Cooling Architectures. (Diagrams: cooling tower and dry cooler configurations.)

Slide-35 Summary Table to ASHRAE
Liquid Cooling Class | Main Cooling Equipment | Supplemental Cooling Equipment | Building Supplied Cooling Liquid Maximum Temperature
L1 | Cooling tower and chiller | Not needed | 17 C (63 F)
L2 | Cooling tower | Chiller | 32 C (89 F)
L3 | Dry cooler | Spray dry cooler or chiller | 43 C (110 F)
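A hypothetical helper (illustrative only, not from the ASHRAE white paper) that encodes the table above: given the warmest liquid an HPC system can accept at its inlet, it lists which classes' building-supplied water would stay within that limit, which is the sense in which higher tolerated temperatures let sites drop chillers.

```python
# Encodes the Slide-35 summary table; the selection helper is illustrative.

LIQUID_COOLING_CLASSES = [
    # (class, main cooling equipment, supplemental equipment, max supply temp in deg C)
    ("L1", "Cooling tower and chiller", "Not needed", 17),
    ("L2", "Cooling tower", "Chiller", 32),
    ("L3", "Dry cooler", "Spray dry cooler or chiller", 43),
]

def compatible_classes(equipment_max_inlet_c: float) -> list[str]:
    """Classes whose maximum building-supplied liquid temperature the
    IT equipment can tolerate at its inlet."""
    return [name for name, _, _, t_max in LIQUID_COOLING_CLASSES
            if t_max <= equipment_max_inlet_c]

print(compatible_classes(20))  # ['L1']             -> chiller still required
print(compatible_classes(35))  # ['L1', 'L2']       -> cooling tower alone can carry the load
print(compatible_classes(45))  # ['L1', 'L2', 'L3'] -> dry cooler only; compressors eliminated
```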

Slide-36 Thank You Questions? LA-UR 11-03816