Packaging of New Servers - Energy Efficiency Aspects


Packaging of New Servers - Energy Efficiency Aspects. Roger Schmidt, IBM Fellow, Chief Engineer, Data Center Energy Efficiency. Many slides provided by Ed Seminaro, Chief Architect of HPC Power6 & 7. 1st Berkeley Symposium on Energy Efficient Electronics, June 11-12, 2009.

Agenda: liquid cooling; Power6 packaging; Power7 packaging; data center issues.

Back to the Future: Rear Door Heat Exchanger, leveraging mainframe heritage capabilities. A water-cooled heat exchanger designed to remove heat generated from the back of your computer systems before it enters the room. The patented rear heat exchanger can reduce heat at the source. It utilizes chilled water to dissipate heat generated by computer systems while requiring no additional fans or electricity. Customer benefits: increase server density without increasing cooling requirements; an effective solution for a data center that wants to deal with computer room hot spots, or that is at the limit of its cooling capacity but still has usable floor space to add racks of systems. Water cooling is 3500x more energy efficient than air. IBM uses it today in System x, iDataPlex and System p.

Processor Liquid Cooling for Increased Performance. System z10 EC is cost effective and can help you go green by delivering highly energy-efficient technologies. Customer benefits: highest server utilization; high performance per watt; extremely efficient power and cooling. Power 575 turns on the water for a green supercomputer. Customer benefits: 80% heat load to water; 40% energy savings via improved cooling and power efficiencies; up to 8.4 trillion floating point operations per second plus 3.58 TB of memory in a single rack.

Today's IBM Mainframe, z10 EC: internal batteries (optional); power supplies; 2x support elements; 3x I/O cages; processor books with memory, MBA and HCA cards; Ethernet cables for the internal system LAN connecting Flexible Service Processor (FSP) cage controller cards; InfiniBand I/O interconnects; 2x cooling units.

Chart: System Heat Load Trends, 1977 to present. Heat load per system (kW, 0 to 90) across the bipolar and CMOS eras, with the cooling technology (air, water, refrigeration) indicated for each generation: 3033, 3081, 3090, 3090EX, ES9000, G2, G3, G4, G5, G6, z900, z990, z9, z10.

IBM Computer Platforms (focus of talk = System p).
System x servers, DS3K/4K midsize storage: x86 microprocessor & chip set; Linux & Windows OSs; industry-standard interconnect; key competitors: HP, Dell, etc.; principal DE site: Raleigh, NC.
System p/i servers, System z mainframes, DS8K high-end storage: System p/i and DS8K are Power microprocessor based with IBM chip set; AIX, Linux, i5 OSs; IBM firmware; industry-standard interconnect; competitors: HP, Sun, Cray, SGI, EMC, Hitachi, etc.; principal DE sites: Austin, TX; MHV, NY; Rochester, MN; Tucson, AZ. System z: 370 microprocessor based, IBM chip set throughout, IBM cluster interconnect; key competitors: HP, Sun, etc.; principal DE site: MHV, NY.
Blue Gene system, Cell blade (special-purpose HPC): BG is Power microprocessor based; Cell is Power microprocessor & SPU based; customized interconnect; competitors: Cray, etc.; principal DE sites: Yorktown, NY; Rochester, MN.

IBM Power Server Systems (shown without lids and heat sinks): Blade JS2x, 520 HV, 550 & 560 HV8, 570 L4/ML, 575 IH, 595 HE. P5 systems and P6 systems (transition to organic for some systems). P6 titles linked to press announcements; YouTube videos: 550, 570, 575, 595.

The following slides describe the P6 Power 575, a high-density mainstream HPC server announced at SC07 that began shipping to customers in 1H08. We went all out on the Power 575 to create an extreme server that represents the future of energy-efficient, mid-to-large-scale, technical or commercial compute-intensive processing today. It has a system structure that eliminates large amounts of unnecessary components. The rack is directly customer chilled-water cooled, eliminating most customer on-floor cooling requirements. The processor is directly water cooled, which reduces its operating temperature and therefore its power draw. It packs 2.5x the compute power of a full rack of BladeCenters into a single cabinet with very flexible I/O, and it combines the advantages of a large 32-way SMP with the cost effectiveness of smaller 4-way servers. UI/NCSA Blue Waters is based on the P7 follow-on product to the P6 Power 575.

P6-p575: 121.6 GFLOPs, 602 GFLOPs, 1.46 TFLOPs, 8.42 TFLOPs.

Water cooled: 32-way @ 4.7 GHz. Air cooled: 32-way @ 3.5 GHz.

Assembled Node

Water Cooled Cold Plate

Secondary Loop: a single WCU can cool all 14 nodes. WCUs are in parallel. Each WCU has a door control valve; the door is removed from the loop if temperature can't be maintained. Heat load split: processor heat load to water (51%), RDHx heat load to water (27%), air heat load to the data center (22%).
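The split above lends itself to a quick worked example. A minimal sketch, assuming the ~85 kW maximum rack load quoted later in this deck (the rack power is an assumption for illustration; the 51/27/22 percentages are from this slide):

```python
# Sketch: split a rack heat load using the percentages on this slide.
# RACK_HEAT_KW is an assumption taken from the P6-575 comparison slide
# later in this deck; actual loads vary with configuration.

RACK_HEAT_KW = 85.0  # assumed maximum rack heat load (kW)

SPLIT = {
    "processor cold plates -> water": 0.51,
    "rear-door heat exchanger -> water": 0.27,
    "residual air -> data center": 0.22,
}

for path, fraction in SPLIT.items():
    print(f"{path}: {RACK_HEAT_KW * fraction:.1f} kW")

to_water = RACK_HEAT_KW * (SPLIT["processor cold plates -> water"]
                           + SPLIT["rear-door heat exchanger -> water"])
print(f"total heat to the water loop: {to_water:.1f} kW (78% of rack load)")
```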

WCU (Water Conditioning Unit): 4U tall by half of a 24" rack unit. Contains a control valve, heat exchanger, pump, reservoir, sensors, motor drive, and interface/control card.

Diagram: P6-575 installation. Callouts: signal cable troughs, raised floor (457.2 mm / 18" minimum height), facilities power connectors, IBM power cords, IBM hoses, facilities water distribution.

The Power6 575: how we designed it for energy efficiency. The goals were to create an energy-efficient solution and to maximize the energy efficiency of the data center power and cooling infrastructure; the Power 575's power and cooling input enables high data center facility efficiency.
Power: 480 VAC balanced 3-phase direct power connection with line cord redundancy; PDU step-down losses are eliminated; lower building distribution losses and smaller wire sizes. In environments that do not require battery back-up, facility efficiency can be 98% with 480 VAC. The outstanding power-line disturbance immunity of the Power 575 makes this mode very viable: 100% single-phase line disturbance immunity, protection against under-voltages of up to 70%, and exceptional protection against over-voltage conditions (line overshoot, lightning strike, ring).
Cooling: direct building chilled-water connection with full redundancy saves on the order of 20% energy over conventional air cooling, eliminates CRAC hardware on the floor, and enables tight packing of server racks without a room temperature penalty.
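To make these claims concrete, here is a rough annual-energy comparison. It is a sketch only: the 98% distribution efficiency and the ~20% cooling-energy saving are taken from the slide, while the 1 MW IT load, the 92% baseline distribution efficiency, and the 0.45 kW-of-cooling-per-kW-of-IT baseline are assumptions chosen for illustration.

```python
# Sketch: rough annual energy comparison motivated by this slide's claims.
# The 98% distribution efficiency and ~20% cooling-energy saving come from
# the slide; the 1 MW IT load, 0.92 baseline distribution efficiency, and
# 0.45 kW-of-cooling-per-kW-of-IT baseline are illustrative assumptions.

HOURS_PER_YEAR = 8760
IT_LOAD_KW = 1000.0  # assumed IT load

def facility_kw(it_kw, dist_eff, cooling_per_it_kw):
    """Power drawn from the utility for a given IT load."""
    distribution_input = it_kw / dist_eff   # covers conversion losses
    cooling = it_kw * cooling_per_it_kw     # heat-removal overhead
    return distribution_input + cooling

baseline = facility_kw(IT_LOAD_KW, dist_eff=0.92, cooling_per_it_kw=0.45)
improved = facility_kw(IT_LOAD_KW, dist_eff=0.98, cooling_per_it_kw=0.45 * 0.80)

print(f"baseline: {baseline:.0f} kW, {baseline * HOURS_PER_YEAR / 1e6:.1f} GWh/yr")
print(f"improved: {improved:.0f} kW, {improved * HOURS_PER_YEAR / 1e6:.1f} GWh/yr")
print(f"saving:   {100 * (1 - improved / baseline):.1f}%")
```

With these particular assumptions the combined effect is roughly a 10% reduction in total facility energy; the exact figure depends entirely on the baseline chosen.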

P5-p575 (12 nodes / 2 switches, ~35 kW max): footprint of 137 racks per 100 TF (ASCI Purple); 0.73 TFlops/rack; 45 mW/MFlop; 100% thermal load to air; 6-node line cord redundancy.
P6-p575 (14 nodes / 2 WCUs, ~85 kW max): same size rack; ~6x the density of P6 rack servers and ~3x the density of P6 blade servers; footprint of 12 racks per 100 TF (ASCI Purple), less than 1/10th the space; 8.4 TFlops/rack; less than 1/5th the power; 8 mW/MFlop; 80% thermal load to H2O / 20% to air; roughly 1/6th the energy (10% cost savings minimum); 14-node line cord redundancy.
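The per-rack figures in this comparison can be re-derived from the slide's own numbers. A minimal sketch (the power figures on the slide are maxima, so the computed mW/MFlop values land slightly above the quoted 45 and 8, which presumably assume typical rather than maximum draw):

```python
# Sketch: back-of-envelope check of this slide's density figures using the
# numbers printed on it.

def rack_metrics(name, racks_per_100tf, rack_power_kw):
    tflops_per_rack = 100.0 / racks_per_100tf
    # kW -> mW is x1e6; TFlops -> MFlops is x1e6, so the ratio is mW/MFlop
    mw_per_mflop = (rack_power_kw * 1e6) / (tflops_per_rack * 1e6)
    print(f"{name}: {tflops_per_rack:.2f} TFlops/rack, "
          f"{mw_per_mflop:.0f} mW/MFlop")
    return tflops_per_rack, mw_per_mflop

p5 = rack_metrics("P5-p575", racks_per_100tf=137, rack_power_kw=35)
p6 = rack_metrics("P6-p575", racks_per_100tf=12, rack_power_kw=85)

print(f"space ratio:  {137 / 12:.1f}x fewer racks")        # < 1/10th the space
print(f"energy ratio: {p5[1] / p6[1]:.1f}x less energy per flop")
```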

P6 575 - Air vs Water. Customer benefits: 80% heat load to water; 40% energy savings via improved cooling and power efficiencies.

Systems on the Horizon. Sequoia, a system developed and manufactured by IBM, will be installed at LLNL and will start running in 2012. It will have 1.6 million Power processors and 1.6 PB of main memory, leading to a peak performance of more than 20 Pflops. Blue Waters at U. of Illinois/NCSA: the Blue Waters project will provide a computational system capable of sustained petaflop performance on a range of science and engineering applications. The system is built by IBM on the Power7 processor.

U. of Illinois/NCSA Green Data Center. Blue Waters will be an unsurpassed national asset, delivering sustained performance of 1 petaflop (that's 1 quadrillion calculations every second) for many science and engineering applications that researchers use every day, not just benchmarks. Blue Waters is being developed and built with a $208 million grant from the National Science Foundation and will be a resource for scientists and engineers across the country. Blue Waters will deploy IBM's POWER7 architecture, which is based on advanced multicore processor technology.

UI-NCSA Blue Waters Computing System & the Supporting Facilities Infrastructure

Data Center Efficiency: minimize power consumption and floor space of IT equipment (servers, storage, networking); maximize efficiency of the data center power and cooling infrastructure, including power conversion and distribution from the 13 kV building input to the equipment input and the efficiency of computer room heat removal (the power required to remove each kW of heat from the room); high-availability design of the infrastructure; minimize waste of floor space.

Potential Future Directions to Improve Data Center Efficiency: move to more integrated large-scale IT equipment; integrate the network into the server; unify storage and servers; eliminate unnecessary data flows and protocol conversions that burn power. Integration reduces IT equipment hardware content, yielding better reliability, lower cost, and potentially better performance.

Potential Future Directions to Improve Data Center Efficiency. Improve heat transfer from electronic components in IT equipment to the outside ambient environment: direct water cooling of electronic components; transfer of the heat load from IT equipment cabinets to the outside air via water at the highest possible temperature (15 deg C / 60 deg F target); eliminate the majority of room-level air cooling needs and provide the air conditioning that is needed via air-to-water heat exchange in the IT equipment. Improve power conversion and distribution efficiency in the data center: distribute higher voltages directly to the IT equipment cabinets, either 400-480 VAC directly from the output of the building transformer or 400-600 VDC directly from the output of a UPS without the inverter stage; no on-floor step-down transformers; assure low loss in power distribution cabling; use efficient building transformers (98.5%) and efficient UPSs (94% AC output / 97% DC output); assure efficiency at the actual normal operating load.
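The value of each conversion stage is easiest to see as a product of stage efficiencies. A minimal sketch: the 98.5% transformer and 94%/97% UPS figures are from the slide, while the cabling and in-cabinet conversion efficiencies are illustrative assumptions, not measured values.

```python
# Sketch: end-to-end electrical delivery efficiency as a product of stage
# efficiencies. Transformer (98.5%) and UPS (94% AC / 97% DC output) figures
# are from the slide; cabling and in-cabinet figures are assumptions.

from math import prod

ac_chain = {
    "building transformer": 0.985,
    "UPS (AC output)": 0.94,
    "distribution cabling": 0.99,    # assumption
    "in-cabinet PSU + VRMs": 0.88,   # assumption, 480 VAC down to ~1 V
}

dc_chain = {
    "building transformer": 0.985,
    "UPS (DC output, no inverter)": 0.97,
    "distribution cabling": 0.99,    # assumption
    "in-cabinet conversion": 0.90,   # assumption, 400-600 VDC down to ~1 V
}

for name, chain in (("480 VAC", ac_chain), ("DC", dc_chain)):
    eff = prod(chain.values())
    print(f"{name} chain: {eff:.1%} delivered, {1 - eff:.1%} lost before the silicon")
```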

Potential Future Directions to Improve Data Center Efficiency. Provide customer choice in input operating voltage: 200-480 VAC 3-phase nominal and 400-600 VDC, an enhanced universal input; configurations may be limited for 200-240 VAC input on the largest systems; add 277 VAC to smaller equipment to enable 480 V phase-to-neutral operation; attempt to enable the best operating efficiency for each customer. Design IT equipment power and cooling for true end-to-end efficiency: power efficiency optimization from 13 kV all the way to 1 V; 13 kV to the IT equipment input (PUE) is not a complete metric; cooling efficiency from the outside ambient all the way to the silicon junction; lower-temperature CMOS silicon dissipates less power; electronics packed closer together may dissipate less power.
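The point that lower-temperature silicon dissipates less power can be illustrated with a toy leakage model. This is only a sketch: the roughly exponential temperature dependence of subthreshold leakage is generic CMOS behavior, but the doubling interval (~20 C) and the 70 W / 30 W dynamic/leakage split are illustrative assumptions, not P6 data.

```python
# Sketch: total processor power with leakage scaled exponentially in
# junction temperature. All parameters below are illustrative assumptions.

def total_power(p_dynamic_w, p_leak_ref_w, t_junction_c,
                t_ref_c=85.0, doubling_interval_c=20.0):
    """Dynamic power plus leakage referenced to t_ref_c."""
    scale = 2 ** ((t_junction_c - t_ref_c) / doubling_interval_c)
    return p_dynamic_w + p_leak_ref_w * scale

# Assume a 100 W processor at an 85 C junction: 70 W dynamic, 30 W leakage.
for tj in (85, 65, 45):
    print(f"Tj = {tj} C: {total_power(70, 30, tj):.0f} W")
```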

Data Center Energy Efficiency Metric

Chart: delivered annual electricity consumption by sector, 1980-2030 (billion kWh), showing history and projections for the residential, commercial, and industrial sectors, with data center electricity growth highlighted.

Diagram: data center infrastructure. The heat path runs through the cooling tower air loop, the cooling tower water loop, the refrigeration chiller loop, the building chilled-water loop, and the CRAC and server fans; each stage consumes pumping power, compressor power, or electrical power.
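One way to read this diagram is as the component list of a PUE-style calculation. A minimal sketch in which every overhead value is an illustrative assumption for a nominal 1 MW IT load; the diagram supplies the categories, not the numbers.

```python
# Sketch: a PUE-style roll-up of the power consumers in this diagram.
# Every overhead number below is an illustrative assumption for ~1 MW of IT.

it_load_kw = 1000.0

overhead_kw = {
    "cooling tower fans & pumps": 60.0,
    "chiller compressors": 250.0,
    "building chilled-water pumps": 40.0,
    "CRAC & server-room fans": 120.0,
    "power conversion & distribution losses": 80.0,
}

facility_kw = it_load_kw + sum(overhead_kw.values())
pue = facility_kw / it_load_kw
print(f"facility power: {facility_kw:.0f} kW, PUE ~ {pue:.2f}")
```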

IT Equipment is HOT and HEAVY. p Series 595: 31 W x 66 D x 80 H (inches), 77,500 BTU, 3,014 lbs. p Series 690: 62 W x 50 D x 80 H (inches), 108,000 BTU, 4,233 lbs. For comparison: a Toyota Camry weighs 3,276 lbs, and about 100,000 BTUs heats a house in the Northeast.

Diagram: possible ventilation schemes. Callouts: RDHx, ceiling plenum, hot aisle / cold aisle, viewing area, egg-crate return, ducted return, racks, CRAC unit, raised floor, power/cabling.

Hot Air Recirculation Animation - Raised Floor

Summary: liquid cooling is back and improves overall energy efficiency; integrate functions more effectively to drive efficiency; view designs more holistically, as end-to-end power and cooling solutions.

Thank You