Case Study: Energy Systems Integration Facility (ESIF)

Case Study: Energy Systems Integration Facility (ESIF) at the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) in Golden, Colorado
[Slide graphic: facility rendering with areas labeled OFFICE, HIGH BAY LABORATORIES, and DATA CENTER]

Case Study Roadmap
Steve: NREL Introduction
Steve: Performance Specs & Integrated Process Design (a holistic approach to data center design and integration)
Peter: Cooling and Energy Recovery
Steve: Preliminary (early) data
All: Panel Discussion

NREL Snapshot
The only national laboratory dedicated solely to energy efficiency and renewable energy.
Leading clean-energy innovation for 37 years.
~2,300 total staff in world-class facilities.
Campus is a living model of sustainable energy; ~34,000 visitors in FY2013.
Located in Golden, Colorado. Owned by the Department of Energy; operated by the Alliance for Sustainable Energy.

Scope of Mission
Energy Efficiency: residential buildings, commercial buildings, data centers, personal and commercial vehicles.
Renewable Energy: solar, wind and water, biomass, hydrogen, geothermal.
Systems Integration: grid infrastructure, distributed energy interconnection, battery and thermal storage, transportation.
Market Focus: private industry, federal agencies, Defense Dept., state/local government, international.

ESIF: Energy Systems Integration Facility
New 185,000 s.f. research facility: office space for 220, high bay and laboratory space, and a data center.
Integrated chips-to-bricks approach.
Process: design-build with performance specs.
LEED Platinum, a significant achievement for a building with large lab and data center loads.
Planning started in 2006.

NREL Data Center
Showcase facility: 10 MW, 10,000 s.f.
Leverages the favorable climate: evaporative cooling, NO mechanical cooling.
Waste heat captured and used to heat labs & offices.
World's most energy-efficient data center, PUE 1.06!
High Performance Computing: petascale+ HPC capability in 2013; 20-year planning horizon spanning 5 to 6 HPC generations.
Insight Center: scientific data visualization, collaboration, and interaction.
Lower CapEx and lower OpEx.
Leveraged expertise in energy-efficient buildings to focus on a showcase data center, with an integrated chips-to-bricks approach.

Critical Data Center Specs
Warm-water cooling, 75°F (24°C). Water is a much better working fluid than air: pumps trump fans (see the sketch below).
Utilize high-quality waste heat, 95°F (35°C) or warmer.
90%+ of the IT heat load to liquid; up to 10% of the IT heat load to air.
High-voltage power distribution, 480 VAC: eliminate conversions.
Think outside the box: don't settle for an energy-efficient data center nestled on a campus surrounded by inefficient laboratory and office buildings. Innovate, integrate, optimize.
Dashboards report instantaneous, seasonal, and cumulative PUE values.
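
The "pumps trump fans" claim follows directly from fluid properties. A minimal sketch, assuming rough sea-level property constants and an arbitrary 100 kW load at a 20°F rise (neither number is from the slides):

    # Compare the volumetric flow of water vs. air needed to carry away
    # the same heat load at the same temperature rise (Q = m * cp * dT).
    # Property values are rough sea-level constants, assumed for illustration.
    HEAT_LOAD_W = 100_000.0      # 100 kW of IT heat to remove (assumed)
    DELTA_T_K   = 11.1           # ~20 degF temperature rise across the load

    FLUIDS = {"water": (4186.0, 997.0), "air": (1006.0, 1.2)}  # (cp J/(kg*K), density kg/m^3)

    flows = {}
    for name, (cp, rho) in FLUIDS.items():
        mass_flow = HEAT_LOAD_W / (cp * DELTA_T_K)   # kg/s
        flows[name] = mass_flow / rho                # m^3/s
        print(f"{name}: {flows[name]:.4f} m^3/s")

    print(f"air needs ~{flows['air'] / flows['water']:.0f}x the volumetric flow of water")

Moving the same heat in air takes several thousand times the volumetric flow, which is why fan energy dominates air-cooled designs.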

System Integration: Designing for Energy Reuse
HIGH PERFORMANCE COMPUTING DATA CENTER
IT load grows from 1 MW up to 10 MW. Achieving energy-efficiency goals requires:
- Evaporative hydronic cooling (not compressor-based)
- Maximizing direct heat to water; minimizing use of air for cooling
- Maximizing energy reuse in the form of low-grade heating
MUTUALLY BENEFICIAL RELATIONSHIP
OFFICE SPACE and HIGH BAY LABORATORIES
These require large volumes of ventilation (outside) air. Achieving energy-efficiency goals requires:
- Maximizing evaporative cooling; minimizing compressor use
- Utilizing energy recovery to reduce operational energy use
- Providing for static-pressure reset and exhaust-stack velocity turn-down via wind anemometer control

COOLING AND ENERGY RECOVERY

Liquid Cooling Technologies: Direct-Contact Liquid Cooling
TYPICAL SYSTEM CONFIGURATION
- A cooling distribution unit (CDU) isolates the computer cooling liquid (water) from the central building systems.
- The CDUs circulate water to liquid-cooled local heat sinks (multiple per server).
- A distribution manifold brings water to each server (radiant piping technology).
- Supports higher-density solutions with a minimal amount of air cooling required.
- Water is in direct contact with the electronic equipment (no heat-pipe interface).
- Higher water-quality requirements.
- Higher water-damage risk due to multiple system connections and distribution piping.

Benefits of Liquid Cooling: Thermal Stability
DESIGN CONSIDERATIONS
- Due to the high and variable heat loads within the servers, manufacturers favor a stable, consistent temperature profile.
- Cooling distribution units (CDUs) act as a buffer to the central building systems and give the user control over operating set points.
- Dedicated central systems serving the data center need to provide BOTH energy efficiency and temperature stability.

NREL Energy Systems Integration Facility: Design Conditions and Capacity
Liquid Cooling
- 100% of total capacity
- 75°F chilled-water supply
- Minimum 95°F return temperature
Design Capacity
- Day one: 1.0 MW IT load
- Final: 10 MW IT load
Air Cooling
- 20% of total capacity (day one: 10% capacity)
- 80°F inlet air; 25% RH minimum; 60% RH (42°F dew point) maximum
- 100°F return temperature
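
Under these design conditions, the required loop flow falls out of Q = m·cp·ΔT. A back-of-envelope sketch: the 75°F supply, 95°F return, and 1 MW/10 MW loads are the slide's; the water properties and unit conversions are standard values I've assumed:

    CP_WATER = 4186.0        # J/(kg*K), specific heat of water
    RHO_WATER = 997.0        # kg/m^3
    L_PER_S_TO_GPM = 15.85   # 1 L/s is about 15.85 US gpm

    def loop_flow_gpm(it_load_w, supply_f=75.0, return_f=95.0):
        """Water flow (gpm) needed to absorb it_load_w at the design delta-T."""
        delta_t_k = (return_f - supply_f) * 5.0 / 9.0     # degF rise -> kelvin
        mass_flow = it_load_w / (CP_WATER * delta_t_k)    # kg/s, from Q = m*cp*dT
        return mass_flow / RHO_WATER * 1000.0 * L_PER_S_TO_GPM

    for mw in (1.0, 10.0):
        print(f"{mw:g} MW IT load -> ~{loop_flow_gpm(mw * 1e6):,.0f} gpm")

Roughly 340 gpm at day one and 3,400 gpm at build-out, which is why the wide 20°F delta-T matters: halving it would double the pumping flow.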

NREL Energy Systems Integration Facility

45°F Chilled Water System: 29% of the year in full water-side economizer

75°F Chilled Water System: 100% of the year in full water-side economizer

Cooling Load (kw, total) Cooling Load (kw, per rack) % Watercooled Supply Water Temp (F) Water delta T (F) Water press. Drop (PSI) Air Inlet/ Outlet Temp (F) System I 942.9 112 96.5% 70-75 25-30 (100F RWT) 20 80/- System II 398 51 86% 75 10 (85F RWT) 15 80/87 System III 216 100 91% 75 27.1 (102.1F RWT) 21 80/96.5 System IV 416-506 44.3-55.3 100% 75 20+ 14.5 -/- 17


Heat of Vaporization & Evaporative Cooling
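
The physics behind this slide title: each kilogram of water that evaporates carries away its latent heat of vaporization, so a small evaporation stream rejects a large load. A sketch, assuming the textbook h_fg of about 2.26 MJ/kg (the value at 100°C; it is slightly higher at cooling-tower temperatures):

    H_FG = 2.26e6            # J/kg, latent heat of vaporization of water (assumed)

    def evap_rate(heat_load_w):
        """Mass of water (kg/s) that must evaporate to reject heat_load_w."""
        return heat_load_w / H_FG

    rate = evap_rate(1e6)                        # 1 MW of data center heat
    gpm = rate / 997.0 * 1000.0 * 15.85          # convert kg/s to US gpm
    print(f"rejecting 1 MW evaporates ~{rate:.2f} kg/s of water (~{gpm:.0f} gpm)")

About 7 gpm of evaporated water rejects a full megawatt, which is why evaporative towers can replace compressor-based chillers in a dry climate like Golden's.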

NREL Energy Systems Integration Facility: Heat Recovery System
Recovered heat is supplied to the lab and office AHUs.

NREL Energy Systems Integration Facility: We want the bytes AND the Btus!
[Chart: NREL-ESIF Data Center Energy Recovery at 1 MW IT Load, Annual Heat Demand & Recovery. Left axis: Annual Energy (GWh/yr); right axis: Heat (MBtu). Bars compare the energy targets (PUE 1.06, EUE 0.9) with the current design (PUE 1.05, EUE 0.7); series include Data Center Equipment Load, Thermal Energy Recovered, Recovered Heat, Campus Hot Water, and Heat for Export to Campus.]
Energy Usage Effectiveness (EUE) = (Total Data Center Annual Energy - Total Energy Recovered) / Total IT Equipment Annual Energy
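
Since total facility energy equals PUE times IT energy, the PUE and EUE targets on this chart imply how much heat must be recovered. A sketch of that relationship; the 8,760 MWh figure is simply 1 MW of IT load running for a year, not slide data:

    def required_recovery_mwh(pue, eue_target, it_energy_mwh):
        """Recovered energy needed so (total - recovered) / IT = the EUE target."""
        return (pue - eue_target) * it_energy_mwh

    it_mwh = 1.0 * 8760.0    # 1 MW of IT load for a year (assumed duty cycle)
    for pue, eue in ((1.06, 0.9), (1.05, 0.7)):
        rec = required_recovery_mwh(pue, eue, it_mwh)
        print(f"PUE {pue}, EUE {eue}: recover ~{rec:,.0f} MWh/yr of heat")

Note that an EUE below 1.0 means the recovered heat exceeds all of the data center's non-IT overhead.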

THE BOTTOM LINE

ESIF HPC Data Center PUE Calculation: NREL ESIF Data Center Energy Efficiency Metrics

POWER USE EFFECTIVENESS (PUE): Data center energy efficiency is benchmarked using the industry-standard metric of power use effectiveness (PUE). The PUE is defined as follows:
PUE = Total Facility Power / IT Equipment Power
Total Facility Power = Lighting & Plug Power + Cooling Loads + Pump Loads + HVAC Loads + IT Equipment Power
IT Equipment Power = total power used to manage, process, store, or route data within the data center

ENERGY USE EFFECTIVENESS (EUE): EUE is a metric of recovered energy beneficially used outside of the data center (heating). The EUE is defined as follows:
EUE = (Total Facility Energy - Recovered Energy) / IT Equipment Energy
Total Facility Energy = Lighting & Plug Energy + Cooling Loads + Pump Loads + HVAC Loads + IT Equipment Energy
IT Equipment Energy = total energy used to manage, process, store, or route data within the data center

[One-line metering diagram: the data center normal power service entrance switchboard feeds the normal loads through the data center electrical transformer and distribution boards (DSB-CP, DSB-MS1, DSB-DCU, DSB-DC1 through DSB-DC4): normal power panel SP3, normal lighting panel SL3, cooling towers CT-602A through CT-602D with the filtration unit and pipe trace heaters, tower water pumps P-602A/B, energy recovery pumps P-604A/B, air-handling units AHU-DC1/DC2, make-up air-handling unit MAU-DC1 with its evaporative cooling pump, PDUs PDU-DCU-1/2 and PDU-DC4-1/2, HPC flex racks #9 through #12, cooling distribution racks CDU-1 through CDU-6, and the UPS (UPM1 & UPM2). A stand-by generator with engine block and fuel heaters serves the stand-by loads (emergency lighting panel ESL2, central plant emergency board DSB-CPE, UPS distribution board DSB-U1SB) through automatic transfer switches ATS-LS, ATS-CPLE, and ATS-U1SB. Electrical power meters (EPM) sit on each branch; CRH marks power calculated from runtime hours; recovered energy beneficially used outside the data center is measured at heat exchanger HX-605A. Metered categories: lighting & plug power, cooling loads, pump loads, HVAC loads, IT equipment power.]

Preliminary ESIF HPC Data Center PUE
(The PUE and EUE definitions repeat from the previous slide: PUE = Total Facility Power / IT Equipment Power; EUE = (Total Facility Energy - Recovered Energy) / IT Equipment Energy.)
Measured loads (kW): IT equipment 636; pumps 34; fan walls 7; lighting, plug load & misc. 6; evaporative cooling towers 2.
PUE = (6 + 2 + 34 + 7 + 636) / 636 = 1.077!
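
The arithmetic above is easy to reproduce from the slide's metered loads. A minimal sketch; only PUE is computed, since the recovered-heat reading needed for EUE is not on this slide:

    # Preliminary PUE from the slide's metered branch loads (kW).
    loads_kw = {
        "it_equipment": 636.0,
        "pumps": 34.0,
        "fan_walls": 7.0,
        "lighting_plug_misc": 6.0,
        "evap_cooling_towers": 2.0,
    }
    total_kw = sum(loads_kw.values())
    pue = total_kw / loads_kw["it_equipment"]
    print(f"PUE = {total_kw:.0f} / 636 = {pue:.3f}")   # -> PUE = 685 / 636 = 1.077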

Green Data Center Bottom Line
Heat from the IT load is recovered to the ESIF offices, labs, and ventilation air (saving $200K/year). Energy recovery, NO mechanical chillers, evaporative water towers: costs less to build, costs less to operate.
Comparison of ESIF (PUE 1.06) vs. an efficient PUE 1.3 data center:
CapEx savings from building no chillers (at $1.5K/ton):
- Initial build: 600 tons avoided, $0.9M
- 10-year growth: 2,400 tons avoided, $3.6M
- 10-year savings: $4.5M
OpEx at 10 MW IT load (at $1M per MW-year):
- Utilities: $13M/year at PUE 1.3 vs. $10.6M/year at PUE 1.06
- Annual savings: $2.4M; 10-year savings: $24M (excludes the heat recovery benefit)
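
The slide's savings figures reduce to two lines of arithmetic, using its own planning assumptions ($1.5K per ton of avoided chiller capacity, $1M per MW-year for utilities):

    CHILLER_COST_PER_TON = 1_500          # $/ton, slide's planning figure
    UTILITY_RATE = 1_000_000              # $ per MW-year, slide's planning figure

    capex_saved = (600 + 2400) * CHILLER_COST_PER_TON        # chillers never built
    opex_saved_yr = (1.3 - 1.06) * 10 * UTILITY_RATE         # 10 MW IT load
    print(f"CapEx avoided: ${capex_saved / 1e6:.1f}M")       # -> $4.5M
    print(f"OpEx saved: ${opex_saved_yr / 1e6:.1f}M/yr "
          f"(${opex_saved_yr * 10 / 1e6:.0f}M over 10 years)")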

What's Next: System Integration, Demand Response, and Energy Management
MAXIMIZE ENERGY EFFICIENCY & REUSE
- Maximize beneficial daylighting; minimize lighting loads
- Active radiant (chilled) beams for perimeter cooling & heating
- Underfloor air, natural ventilation, & solar-powered relief fans
ENERGY MANAGEMENT & DEMAND RESPONSE
- Go beyond power capping
- Energy management: alter the workload to meet opportunity; alter the workload to minimize impact (see the sketch below)
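
The slides do not specify how "alter workload to meet opportunity" would be implemented. Purely as an illustration (a hypothetical scheduler, not NREL's system), a demand-response policy could defer preemptible HPC jobs when a price signal is high:

    # Illustrative demand-response scheduling sketch: defer preemptible
    # jobs during high-price events, run everything when prices are low.
    # The job model and $60/MWh threshold are assumptions.
    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        power_kw: float
        preemptible: bool

    def admissible(jobs, price_per_mwh, threshold=60.0):
        """During a high-price demand-response event, admit only non-preemptible work."""
        if price_per_mwh <= threshold:
            return list(jobs)
        return [j for j in jobs if not j.preemptible]

    queue = [Job("climate-model", 400.0, False), Job("param-sweep", 250.0, True)]
    for price in (35.0, 120.0):
        names = [j.name for j in admissible(queue, price)]
        print(f"price ${price:g}/MWh -> run {names}")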
