MSYS 4480 AC Systems, Winter 2015. Module 12: Data Centers. By Satwinder Singh, 14/05/2015

Data Centers
Data Centers are specialized environments that safeguard your company's most valuable equipment and intellectual property. Data Centers house the devices that do the following:
- Process your business transactions
- Host your website
- Process and store your intellectual property
- Maintain your financial records
- Route your e-mails

Computer Rooms and Data Centres

Server Farms
- Google, The Dalles, OR = 6,500 m²
- Yahoo, Quincy, WA = 13,000 m²
- Microsoft, Quincy, WA = 43,600 m², 48 MW
Cloud computing

Sea Containers

Data Centres: What are they?
Space for computer equipment:
- House it
- Power it
- Cool it
Why should I care, if it is just a room with servers?
- It is the heart of the organization
- Highest investment density ($/sq.ft)
- Highest power density (W/sq.ft)

What is in a typical rack?
- Rack of state-of-the-art servers: 20 kW @ 10 cents/kWh = $17,000 per year
- High density rack: 10 kW @ 10 cents/kWh = $8,500 per year
- Average rack: 5 kW @ 10 cents/kWh = $4,250 per year
A Data Center can have HUNDREDS of these racks.
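As a rough check on these figures, assuming the racks run continuously (24x7), which is how the slide's costs appear to be derived:

$$20\ \mathrm{kW} \times 8760\ \mathrm{h/yr} \times \$0.10/\mathrm{kWh} \approx \$17{,}500\ \text{per year}$$

The slide rounds this to $17,000, and the 10 kW and 5 kW racks scale proportionally.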

Data Center Types (type: power range; cooling)
- Wiring Closets (1-3 racks): 0.5-3 kW; 1-2 ton split AC unit
- Computer Rooms (1-5 racks): 3-9.9 kW; split AC / high precision ceiling AC
- Small Data Centres (5-20 racks): 10-40 kW; raised floor, overhead ducting, hot/cold aisle
- Medium Data Centres (20-100 racks): 40-200 kW; raised floor, overhead ducting, hot/cold aisle
- Large Data Centres (over 100 racks): over 200 kW; raised floor, hot/cold aisle, supplemental cooling
- Server Farms (20,000 ft² and over): MW range; raised floor, hot/cold aisle, supplemental cooling
- Sea Containers: 700 kW; cold aisle and hot-air plenum, overhead cooling

Data Center Components
Main components (more than 20 elements):
1. Architectural
2. Mechanical
3. Electrical
4. Communications
5. Security
6. Structural
7. Housekeeping

Moore's Law: the number of transistors per square inch on integrated circuits had doubled every year since the integrated circuit was invented.

Moore's Law

Data Center Energy Hogs
Data centers are energy intensive facilities:
- 10-100x more energy intensive than an office
- Server racks well in excess of 30 kW
- Surging demand for data storage
- EPA estimate: 3% of U.S. electricity
- Power and cooling constraints in existing facilities

Data Centers: High Energy Users
- Energy intensive facilities
- Server racks now designed for more than 25 kW, projected to double every 5 years
- Surging demand for data storage
- Typical facility ~1 MW (could be more)
- US energy usage for data centers: 1.5% of total, projected to double in the next 5 years
- Rising cost of ownership: the cost of utilities is surpassing IT equipment cost

Example: Data Centre, 25 ft x 25 ft = 625 ft²
IT load:
- 2 racks of state-of-the-art servers @ 20 kW = 40 kW ($34,000/yr)
- 4 high density racks @ 10 kW = 40 kW ($34,000/yr)
- 14 average racks @ 5 kW = 70 kW ($60,000/yr)
Total connected IT load: 150 kW
Annual energy consumption: 1,314,000 kWh
Annual cost: ~$130,000 @ 10 cents per kWh
Non-UPS load (lighting, cooling, etc.):
- Cooling, IT: 42 tons
- Cooling, UPS: 3 tons
- Lighting: 3 tons
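As a quick sanity check on the totals above, a minimal sketch assuming continuous 24x7 operation and the $0.10/kWh rate from the slide:

```python
# Sanity check of the example data centre's IT load, energy and cost.
# Assumes continuous 24x7 operation and $0.10/kWh, as stated on the slide.

RATE_PER_KWH = 0.10      # $/kWh
HOURS_PER_YEAR = 8760

racks = [
    (2, 20),   # 2 state-of-the-art racks @ 20 kW each
    (4, 10),   # 4 high density racks @ 10 kW each
    (14, 5),   # 14 average racks @ 5 kW each
]

total_kw = sum(count * kw for count, kw in racks)
annual_kwh = total_kw * HOURS_PER_YEAR
annual_cost = annual_kwh * RATE_PER_KWH

print(f"Total connected IT load: {total_kw} kW")    # 150 kW
print(f"Annual energy: {annual_kwh:,.0f} kWh")      # 1,314,000 kWh
print(f"Annual cost: ${annual_cost:,.0f}")          # ~$131,400; the slide rounds to $130,000
```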

How is a Data Centre different from an office?
- Annual energy costs: 15 times higher (as high as 40 times)
- Power density: office space 3-5 W/ft²; Data Center 25 W/ft² to 60 W/ft² (high, increasing, and doubling every 5 years)
- 24x7 operation
- Value of facility: at least 10 times higher (could be 30 times)
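A rough illustration of where the "15 times, as high as 40 times" figure can come from, assuming for the sake of the calculation a 50 W/ft² data center running 8760 h/yr versus a 4 W/ft² office occupied roughly 2,600 h/yr (these specific inputs are illustrative, not from the slide):

$$\frac{50\ \mathrm{W/ft^2} \times 8760\ \mathrm{h/yr}}{4\ \mathrm{W/ft^2} \times 2600\ \mathrm{h/yr}} \approx 42$$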

Safe Temperature Limits

Environmental Conditions
Data center equipment's environmental conditions should fall within the ranges established by ASHRAE, as published in the Thermal Guidelines.

Equipment Environmental Specification

Allowable / Recommended

2011 ASHRAE Thermal Guidelines

2011 ASHRAE Allowable Ranges

Cooling in Data Centers

Psychrometric Bin Analysis

Estimated Savings

Most common configuration (Chilled Air) Raised Floor Courtesy: ASHRAE Datacom Equipment Power Trends and Cooling Applications

Managing Supply & Return Using ceiling as return air plenum (Chilled Air) Courtesy: ASHRAE Datacom Equipment Power Trends and Cooling Applications

High Density Spots Overhead Cooling Units + Raised Floor (Liquid Cooling + Chilled Air) Courtesy: ASHRAE Datacom Equipment Power Trends and Cooling Applications

Close-Coupled DC Cooling

Manage Airflow

Airflow Management

Airflow Management

Air Management

Isolate Cold and Hot Aisles

Natural Ventilation

Liquid Cooling Overview
- Water and other liquids (dielectrics, glycols and refrigerants) may be used for heat removal.
- Liquids typically use LESS transport energy (a 14.36 air-to-water horsepower ratio in the example below).
- Liquid-to-liquid heat exchangers have closer approach temperatures than liquid-to-air heat exchangers (coils), yielding increased economizer hours.
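The 14.36 horsepower ratio refers to a worked example on the original slide that is not reproduced here. As a generic illustration of why liquids need far less transport energy (using textbook property values and an assumed 10 K temperature rise, not figures from the slide), compare the flows required to remove the same 10 kW heat load with \(\dot{Q} = \dot{m}\, c_p\, \Delta T\):

$$\dot{m}_{\mathrm{air}} = \frac{10\,000\ \mathrm{W}}{1005\ \mathrm{J/(kg\,K)} \times 10\ \mathrm{K}} \approx 1.0\ \mathrm{kg/s} \;\; (\approx 0.83\ \mathrm{m^3/s\ of\ air})$$
$$\dot{m}_{\mathrm{water}} = \frac{10\,000\ \mathrm{W}}{4186\ \mathrm{J/(kg\,K)} \times 10\ \mathrm{K}} \approx 0.24\ \mathrm{kg/s} \;\; (\approx 0.24\ \mathrm{L/s\ of\ water})$$

Moving roughly 0.83 m³/s of air takes far more fan power than pumping 0.24 L/s of water, which is the effect the slide's horsepower ratio captures.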

Metrics in Data Centers (Historic)
Load intensity:
- Data Center floor area: ft²
- Total load density: W/ft²
- Computing load density: W/ft²
- HVAC load density: W/ft²
Data Center electrical power demand:
- UPS loss: kW
- Computer load (UPS power): kW
- HVAC chilled water plant: kW
- HVAC: kW
- Lighting: kW
HVAC chiller plant:
- Chiller efficiency: kW/ton
- Chilled water plant efficiency: kW/ton
- Chiller load: tons
- Data Center load: tons
HVAC air systems:
- Central air handling fan power: cfm/kW
- Air handler fan efficiency
- External temperature and humidity
Design data:
- Design basis for computer load: kW/ft²
- Design basis for chilled water ΔT, air-side HVAC, and % RH
- UPS systems
- Flow rate

Metrics in Data Centers
- Power Usage Effectiveness (PUE): a PUE of 3 is equal to 33% DCiE.
- Data Center Infrastructure Efficiency (DCiE): 1/PUE.
- Computer Power Consumption Index (CPCI): the fraction of the total data center power that is used by the computers.
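In symbols (P_total for total facility power and P_IT for IT/compute power; the symbol names are mine, the relationships are those stated above):

$$\mathrm{PUE} = \frac{P_{\mathrm{total}}}{P_{\mathrm{IT}}}, \qquad \mathrm{DCiE} = \frac{1}{\mathrm{PUE}} = \frac{P_{\mathrm{IT}}}{P_{\mathrm{total}}}, \qquad \mathrm{PUE} = 3 \;\Rightarrow\; \mathrm{DCiE} = \tfrac{1}{3} \approx 33\%$$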

Data Center Efficiency Metric
Power Usage Effectiveness (PUE) is an industry-standard data center efficiency metric: the ratio of total facility power, including the power used or lost by facility infrastructure (pumps, lights, fans, conversions, UPS, ...), to the power used by compute.
It is not perfect, and some folks play games with it.
A 2011 survey estimates the industry average is 1.8. In a typical data center, roughly half of the power goes to things other than compute capability.

PUE

PUE

Power Usage Effectiveness (PUE): EPA estimated values for 2011 (Figure 1):
- Current Trends: 1.9
- Improved Operations: 1.7
- Best Practices: 1.3
- State of the Art: 1.2

PUE Less than 1
Reusing waste heat from the data center elsewhere can render the PUE below 1.0.
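A hedged sketch of the accounting behind this claim (the symbols E_total, E_reused and E_IT are mine, not from the slide): if the heat reused elsewhere is credited against total facility energy, the resulting ratio can indeed fall below 1, which is essentially the idea behind the Energy Reuse Effectiveness (ERE) metric.

$$\mathrm{PUE}_{\mathrm{effective}} = \frac{E_{\mathrm{total}} - E_{\mathrm{reused}}}{E_{\mathrm{IT}}} < 1 \quad \text{whenever } E_{\mathrm{reused}} > E_{\mathrm{total}} - E_{\mathrm{IT}}$$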

PUE Less than 1

Best Practices in DC Cooling
1. Reduce the IT load: virtualization and consolidation (up to 80% reduction).
2. Implement a contained hot aisle and cold aisle layout: curtains, equipment configuration, blanking panels, cable entrance/exit ports, etc.
3. Install economizers (air or water) and evaporative cooling (direct or indirect).
4. Raise the discharge air temperature.
5. Install VFDs on fans, pumps, chillers, and towers.
6. Reuse data center waste heat if possible.
7. Expand the humidity range and improve humidity control (or disconnect it). Raise the chilled water set point (if chilled water is used): increasing the chilled water temperature by 1°F reduces chiller energy use by about 1.4%.
8. Central plants: use a central plant (e.g., chiller/CRAHs vs. CRAC units), and use centralized controls on CRAC/CRAH units to prevent simultaneous humidifying and dehumidifying.
9. Install high efficiency equipment, including UPS, power supplies, etc.
10. Move chilled water as close to the server as possible (direct liquid cooling).
11. Consider a centralized, high efficiency, water-cooled chiller plant: air cooled ≈ 2.9 COP, water cooled ≈ 7.8 COP (a worked comparison of these COP and set-point figures follows this list).
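A minimal sketch quantifying two of the numbers quoted above, the air- vs. water-cooled chiller COPs and the ~1.4% per °F chilled-water rule of thumb. The 100-ton load and the 5°F set-point raise are illustrative assumptions, not values from the slide:

```python
# Rough comparison of chiller energy using the figures quoted on the slide:
#   air-cooled chiller COP ~ 2.9, water-cooled chiller COP ~ 7.8,
#   and ~1.4% chiller energy saved per 1 degF increase in chilled water temperature.
# The 100-ton cooling load and the 5 degF set-point raise are illustrative assumptions.

KW_PER_TON = 3.517            # 1 ton of refrigeration = 3.517 kW of heat removal
load_tons = 100
load_kw_thermal = load_tons * KW_PER_TON

for name, cop in [("air-cooled", 2.9), ("water-cooled", 7.8)]:
    compressor_kw = load_kw_thermal / cop          # electrical input = heat removed / COP
    print(f"{name:12s}: {compressor_kw:6.1f} kW electrical for a {load_tons}-ton load")

# Effect of raising the chilled water set point by 5 degF at ~1.4% savings per degF
baseline_kw = load_kw_thermal / 7.8
raise_degF = 5
savings_fraction = 1 - (1 - 0.014) ** raise_degF   # compounding the 1.4%/degF rule of thumb
print(f"Raising the CHW set point {raise_degF} degF saves ~{savings_fraction:.1%} "
      f"(~{baseline_kw * savings_fraction:.1f} kW)")
```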

Availability

Modularity

Tier Level (Reliability): Infrastructure Tiers
The higher the availability you want your Data Center to achieve, the more layers of infrastructure it must have.
- N capacity is the amount of infrastructure required to support all servers or networking devices in the Data Center, assuming that the space is filled to maximum capacity and all devices are functioning. N is most commonly used when discussing standby power, cooling, and the room's network.
- N+1 infrastructure can support the Data Center at full server capacity and includes one additional component.
- The next level, alternately called a 2N or system plus system design, involves fully doubling the required number of infrastructure components.
- Even higher tiers exist or can be created: 3N, 4N, and so on.

Tier Level (Reliability)
- Tier I is composed of a single path for power and cooling distribution, without redundant components, providing 99.671% availability.
- Tier II is composed of a single path for power and cooling distribution, with redundant components, providing 99.741% availability.
- Tier III is composed of multiple power and cooling distribution paths, of which only one is active, has redundant components, and is concurrently maintainable, providing 99.982% availability.
- Tier IV is composed of multiple active power and cooling distribution paths, has redundant components, and is fault tolerant, providing 99.995% availability.
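These availability percentages translate directly into the annual downtime figures in the classification table that follows; a minimal check (the percentages are taken from that table):

```python
# Convert tier availability percentages into the corresponding annual downtime.
HOURS_PER_YEAR = 8760

tiers = {
    "Tier I": 99.671,
    "Tier II": 99.749,   # 99.741% also appears in some Uptime Institute material
    "Tier III": 99.982,
    "Tier IV": 99.995,
}

for tier, availability_pct in tiers.items():
    downtime_h = (1 - availability_pct / 100) * HOURS_PER_YEAR
    print(f"{tier}: {availability_pct}% available -> ~{downtime_h:.1f} h downtime per year")
# -> ~28.8, ~22.0, ~1.6 and ~0.4 hours per year, matching the table below.
```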

Data Centers Tier Classification (Tier I / Tier II / Tier III / Tier IV)
- Number of delivery paths: only 1 / only 1 / 1 active + 1 passive / 2 active
- Redundant components: N / N+1 / N+1 / 2(N+1) or S+S
- Support space to raised floor ratio: 20% / 30% / 80-90% / 100%
- Initial watts/sq. ft.: 20-30 / 40-50 / 40-60 / 50-80
- Ultimate watts/sq. ft.: 20-30 / 40-50 / 100-150 / 150+
- Raised floor height: 12" / 18" / 30-36" / 30-36"
- Floor loading (pounds/sq. ft.): 85 / 100 / 150 / 150+
- Utility voltage: 208, 480 / 208, 480 / 12-15 kV / 12-15 kV
- Months to implement: 3 / 3 to 6 / 15 to 20 / 15 to 20
- Year first deployed: 1965 / 1970 / 1985 / 1995
- Construction $/sq. ft. of raised floor*: $450 / $600 / $900 / $1,100+
- Annual IT downtime due to site: 28.8 hrs / 22.0 hrs / 1.6 hrs / 0.4 hrs
- Site availability: 99.671% / 99.749% / 99.982% / 99.995%
* Excludes land and abnormal civil cost. Assumes a minimum of 15,000 sq. ft. of raised floor in an architecturally plain one-story building fitted out for the initial capacity, but with the backbone designed to reach the ultimate capacity with the installation of additional components. Make adjustments for NYC, Chicago, and other high-cost areas.

Data Center Space Classifications (numerical ranking, terminology, summary definition, Uptime Institute equivalent)
1. Uncontrolled Availability: shared building power and cooling; no generator.
2. Managed Availability: dedicated utility power; unconditioned power to load; dedicated, non-redundant HVAC; no / shared generator.
3. Controlled Availability (99.671%): dedicated utility power; uninterruptible power system (UPS); dedicated, non-redundant HVAC; generator. Uptime Institute equivalent: Tier I (Non-Redundant).
4. Moderate Availability (99.749%): dedicated utility power; redundant UPS systems; dedicated, redundant HVAC; dedicated generator. Uptime Institute equivalent: Tier II (Redundant).
5. High Availability (99.982%): dedicated utility power; redundant UPS systems; dedicated, redundant HVAC; dedicated, redundant generators. Uptime Institute equivalent: Tier III (Concurrently Maintainable).
6. Maximum Availability (99.995%): dedicated, redundant utility power; redundant UPS systems; dedicated, redundant HVAC; dedicated, redundant generators; fully automated systems fail-over. Uptime Institute equivalent: Tier IV (Fault Tolerant).