Viable Options for Data Center Solutions. By Ning Liu, Offering Group, Panduit APJ


Agenda: Market Overview; Data Center Challenges; Business Drivers; Data Center Standards; Data Center Technologies; Panduit Solutions

Market Overview. By 2010, more than 70% of companies will have carried out a formal data center project (new build, consolidation or virtualization) (Gartner). Did you know: half of the world's data centers will run out of power by 2008? (Gartner press release, November 29, 2006). Typical situation: data center cost is 30% CapEx and 70% OpEx.

Data Center Environmental Challenges: cooling, power, structured cabling, structural loading. The nature of data center infrastructure makes it challenging to find solutions that don't spawn other problems.

Cooling Issues. Today's products are hotter than yesterday's, and tomorrow's products will be hotter than today's. Data center managers prefer to pack equipment tightly to fully utilize cabinet space.

Power: Where is it going? Data center power consumption: Cooling 50%, Servers 25%, Network Equipment 12%, Power Loss 10%, Lighting 3%. Approximately 25% of data center power can be attributed to networking equipment once its share of the cooling load and power-conversion losses is included.
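The percentage split above lends itself to a quick budgeting sketch. This is illustrative only: the percentages come from the slide, while `power_budget` and the 1 MW example load are assumptions of ours.

```python
# Hypothetical sketch: split a facility's total power draw using the
# percentage breakdown cited on the slide (Cooling 50%, Servers 25%,
# Network 12%, Power Loss 10%, Lighting 3%). The 1 MW load is illustrative.

BREAKDOWN = {
    "cooling": 0.50,
    "servers": 0.25,
    "network": 0.12,
    "power_loss": 0.10,
    "lighting": 0.03,
}

def power_budget(total_kw: float) -> dict:
    """Return the kW consumed by each subsystem for a given facility load."""
    return {name: round(total_kw * share, 1) for name, share in BREAKDOWN.items()}

budget = power_budget(1000.0)  # a 1 MW facility
print(budget["cooling"])       # 500.0 kW goes to cooling alone
```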

Business Continues to Drive Data Centers: post-dot-com spending; mission-critical applications; uptime requirements; Service Level Agreements (SLAs); mobile computing; regulation and directives.

Answers to the Challenges Design Implementation Operation

Data Center Standards
ANSI/TIA-942 Telecommunications Infrastructure Standard for Data Centers
TIA/EIA-568 Copper & Fiber Cabling
TIA/EIA-569 Pathways & Spaces
TIA/EIA-606 Administration
TIA/EIA-607 Grounding & Bonding
ASHRAE Cooling/HVAC
Uptime Institute
IEEE 1100 ITE Grounding
TIA: Telecommunications Industry Association http://www.tiaonline.org/
Uptime Institute: http://uptimeinstitute.org/
Government work on server and data center energy efficiency: http://www.energystar.gov/index.cfm?c=prod_development.server_efficiency

TIA-942 Data Center Logical Layout
Access providers feed the Entrance Room (carrier equipment & demarcation).
Offices, Operations Center and Support Rooms connect over horizontal cabling to the Telecom Room (office & operations center LAN switches).
In the computer room, the Main Distribution Area (routers, backbone LAN/SAN switches, PBX, M13 muxes) links via backbone cabling to the Entrance Room, the Telecom Room and the Horizontal Distribution Areas.
Each Horizontal Distribution Area (LAN/SAN/KVM switches) serves Equipment Distribution Areas (racks/cabinets) over horizontal cabling, optionally through a Zone Distribution Area.

TIA-942 Data Center Major Elements

Data Center Tier Levels
                              Tier I     Tier II      Tier III        Tier IV
                              Basic      Redundant    Concurrently    Fault
                                         Components   Maintainable    Tolerant
Site Availability             99.671%    99.749%      99.982%         99.995%
Downtime (Hours/Year)         28.8       22.0         1.6             0.4
Operations Center             Not Req.   Not Req.     Required        Required
Redundant Access Provider     Not Req.   Not Req.     Required        Required
Redundant Backbone Pathways   No         No           Yes             Yes
Redundant Horizontal Cabling  No         No           No              Optional
UPS Redundancy                N          N+1          N+1             2N
Gaseous Suppression System    No         No           Clean Agents    Clean Agents
                                                      (FM200/Inergen) (FM200/Inergen)
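The downtime column follows directly from the availability column. A small sanity-check sketch, assuming an 8,760-hour year:

```python
# Verify that the tier table's downtime figures follow from its availability
# percentages. Assumption: a non-leap 8,760-hour year.

HOURS_PER_YEAR = 24 * 365  # 8760

def annual_downtime_hours(availability_pct: float) -> float:
    """Hours of downtime per year implied by a site availability percentage."""
    return (1 - availability_pct / 100) * HOURS_PER_YEAR

for tier, avail in [("I", 99.671), ("II", 99.749), ("III", 99.982), ("IV", 99.995)]:
    print(f"Tier {tier}: {annual_downtime_hours(avail):.1f} h/yr")
# Tier I = 28.8, Tier II = 22.0, Tier III = 1.6, Tier IV = 0.4
```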

Data Center Air Flow. Put DATA cables in HOT aisles, up high. [Diagram: data center AC delivering cold air through perforated floor tiles into COLD aisles between server cabinets; HOT aisles carry exhaust, with the telecom cable tray overhead and power cable below the raised floor.] Based on ASHRAE Thermal Guidelines for Data Processing Environments.

Data Center Grounding Standards. TIA-607 specifies the building ground system to earth ground. TIA-942 specifies the grounding of the data racks and equipment to the CBN. IEEE 1100 specifies the common bonding network (CBN), the grounding grid below the raised floor.

Panduit Data Center Solution Energy Saving Availability Scalability Security Manageability

Data Center Design Challenges: providing maximum uptime; proper planning for growth; technology upgrades; rising costs (CapEx, OpEx).
Panduit Solutions: TIA-based design; certified Panduit partner base; infrastructure expertise; design tools (AutoCAD & Visio); analysis tools (computational fluid dynamics for product development, cable pathway fill calculators, product configurators); all products RoHS compliant.
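The slide mentions cable pathway fill calculators. A minimal sketch of the underlying arithmetic, comparing the summed cable cross-section against the tray cross-section; the 50% ceiling is the commonly cited TIA-569 guideline for tray fill, and the tray dimensions and ~6.1 mm Cat6 outside diameter are illustrative assumptions, not values from the slide.

```python
# Hypothetical pathway fill check: fraction of a cable tray's cross-section
# occupied by round cables. Tray size and cable OD below are assumptions.

import math

def fill_ratio(tray_width_mm: float, tray_depth_mm: float,
               cable_od_mm: float, cable_count: int) -> float:
    """Fraction of the tray cross-section occupied by cables."""
    tray_area = tray_width_mm * tray_depth_mm
    cable_area = cable_count * math.pi * (cable_od_mm / 2) ** 2
    return cable_area / tray_area

# 300 mm x 100 mm tray carrying 500 Cat6 cables (~6.1 mm OD, an assumption)
ratio = fill_ratio(300, 100, 6.1, 500)
print(f"{ratio:.0%} full")  # flag anything over the ~50% guideline
```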

Implementation Racks and Cabinets: maximize floor space density; facilitate proper cooling practices; install blanking panels to minimize hot-air recirculation (impacts delta-T); utilize floor grommets, since more than 50% of cold air can escape through unsealed cable holes and conduits.

Implementation Cost Savings: reduce real estate costs by 23%; reduce installation time with 46% fewer components. Example: 1,500 sq. ft. data centre (36% space savings, based on 10% of the entire floor space): Tier 1 (approx. $450 / sq. ft.): $24,300; Tier 4 (approx. $1,200 / sq. ft.): $64,800.
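The slide's dollar figures can be reproduced with simple arithmetic: 10% of the 1,500 sq. ft. floor is pathway space, the 36% space saving applies to that portion, and the recovered area is priced at each tier's per-square-foot build cost. The inputs are as stated on the slide; this reading of how they combine is our interpretation.

```python
# Reproduce the slide's real-estate savings figures. The formula is our
# interpretation of how the slide's stated inputs combine.

def real_estate_saving(total_sqft: float, pathway_share: float,
                       space_saving: float, cost_per_sqft: float) -> float:
    """Dollar value of the floor space recovered."""
    return total_sqft * pathway_share * space_saving * cost_per_sqft

tier1 = real_estate_saving(1500, 0.10, 0.36, 450)   # matches the $24,300 figure
tier4 = real_estate_saving(1500, 0.10, 0.36, 1200)  # matches the $64,800 figure
print(tier1, tier4)
```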

Implementation Cabinets: prevalent in data centers. Applications: servers and switches (especially high density). Aesthetically pleasing, highly secure, wide variety, modular. Considerations: cable density, floor tile footprint vs. additional benefits, thermal issues, cooling airflow patterns.

Implementation. CFD analysis software image of the front of cabinets: reduced switch temperature in the NET-ACCESS cabinet with ducting; exhaust air prevented from re-circulating within the cabinet. [Images: NET-ACCESS cabinet with duct vs. cabinet without duct]

Enabling Hot Aisle/Cold Aisle Designs with High Density 6509 and 9513 Chassis. Example: Panduit cabinet, 45RU (32 W x 40 D x 84 H); up to 20 kW/cabinet heat rejection capability; 3 x 6509s or 3 x 9513s per rack; front-to-back airflow into hot aisles; integrated cable management; modular design to support future air handlers or spot cooling. Part # CN4-1 and CN4-2 for the MDS 9513, # CN4-3 for the Catalyst 6509E.
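As a sizing sanity check for the 20 kW/cabinet figure, the common rule of thumb CFM = 3.16 x Watts / delta-T(F) estimates the airflow needed to carry a sensible heat load at a given air temperature rise. The 20 F delta-T below is an illustrative assumption, not a value from the slide.

```python
# Rule-of-thumb airflow estimate for a given sensible heat load:
# CFM = 3.16 * watts / delta_T(F). The 20 F rise is an assumption.

def required_cfm(watts: float, delta_t_f: float) -> float:
    """Airflow (CFM) needed to remove `watts` of heat at a delta-T of `delta_t_f` F."""
    return 3.16 * watts / delta_t_f

cfm = required_cfm(20_000, 20)  # a fully loaded 20 kW cabinet, 20 F rise
print(f"{cfm:.0f} CFM")
```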

Implementation Proven Performance: engineered by IBM, with 30+ years of experience liquid-cooling computers. Passive operation. Increased density: removes up to 60% of heat, or up to 20 kW, allowing high-density deployment. Energy efficient: lessens the burden on CRAC units and is more efficient than fan-based systems.

Implementation Cabling plant Develop a strategy (current and future) Install the proper cable counts Deploy a zone cabling configuration Remove dead cables Ribbon cables: reduce the overall cable counts and bundle diameters

Server Port Configurations
Sparse (1 large server per cabinet): DATA 2, OOBM 1, SAN 2 per cabinet
3RU servers (14 per cabinet): DATA 28, OOBM 14, SAN 28 per cabinet
1RU servers (42 per cabinet): LAN 84, OOBM 42, SAN 84 per cabinet
Blade servers (84 per cabinet): LAN 168, OOBM 12, SAN 168 per cabinet
[Diagram: BladeCenter chassis front panels illustrating each configuration]
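The per-cabinet port counts above follow a simple pattern: each server contributes 2 DATA/LAN ports, 1 out-of-band management (OOBM) port and 2 SAN ports, except that blade chassis share OOBM connections, which is why the blade row shows only 12. The helper below is a hypothetical sketch of that arithmetic, not a Panduit tool.

```python
# Hypothetical port-count helper: 2 LAN + 2 SAN ports per server, with a
# configurable OOBM ratio (blade chassis share management connections).

def cabinet_ports(servers: int, oobm_per_server: float = 1.0) -> dict:
    """LAN/OOBM/SAN port counts for one cabinet of `servers` machines."""
    return {
        "LAN": servers * 2,
        "OOBM": round(servers * oobm_per_server),
        "SAN": servers * 2,
    }

print(cabinet_ports(14))         # 3RU servers
print(cabinet_ports(42))         # 1RU servers
print(cabinet_ports(84, 12/84))  # blades: 12 shared OOBM ports per cabinet
```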

Zone Cabling Solutions Benefits: less disruptive; flexible; pedestals create inherent infrastructure pathways; good utilization of real estate; the patch field is hidden for a clean look; ideal for dynamic data center environments (e.g., storage and mainframes). [Diagram: typical data cabling topology per TIA/EIA-942, from Entrance Room through MDA, HDA and ZDA to EDA]

Physical Layer Management: PanView
For customers running mission-critical applications where network downtime is very costly: PanView monitors critical data paths and can alert administrators when unsolicited changes are made; real-time visibility of the physical layer and guided patching help PanView restore connectivity more quickly.
For customers concerned about highly sensitive data and other network security issues: PanView monitors every switch port and server or appliance connection, logging information about network access; assigning physical location information to network devices allows PanView to accurately track assets via network connectivity.
[Diagram: routers, switches, patch panels and devices feeding PanView, alongside CiscoWorks and HPOV / Tivoli / CA Unicenter]

Power over Ethernet: DPoE
DPoE Powered Patch Panel: central management of all panels; selectively shut down powered ports; graphical view of power consumption.
Scalability: medium to large enterprises requiring optimal space savings use DPoE Power Patch Panels; small to medium businesses requiring flexibility and a compact footprint may choose the DPoE Power Hubs.
Flexibility: a call center will benefit from the Power Patch Panels, since every desk has an IP telephone that requires power; an office that moves personnel around will benefit from the flexibility of the Power Hub.
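When planning powered patch panels, the worst-case electrical load is simply ports times per-port sourced power. IEEE 802.3af allows up to 15.4 W sourced per port; the 48-port panel below is an illustrative assumption, since the slide gives no DPoE panel specifications.

```python
# Worst-case PoE load sketch. 15.4 W is the IEEE 802.3af per-port sourcing
# maximum; the 48-port panel size is an assumption, not a DPoE spec.

def worst_case_panel_load(ports: int, watts_per_port: float = 15.4) -> float:
    """Watts a panel must source if every port draws full PoE power."""
    return ports * watts_per_port

print(worst_case_panel_load(48))  # upper bound for a hypothetical 48-port panel
```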

Modular Data Center Example: 12 server PODs, each consisting of 4 switch cabinets for LAN & SAN and 32 server cabinets, at 12 servers per server cabinet. Totals: servers 4,032; 6509 switches 30; server/switch cabinets 399; cabinets allotted for midrange/SAN 124. [Diagram: Core 1 and Core 2 feeding Agg1 through Agg4; access pairs Acc1/Acc2 through Acc23/Acc24, 6 pair switches per aggregation block, each access pair serving 336 servers]
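The headline server count can be cross-checked against the diagram's per-POD figure: 12 PODs at 336 servers per access pair gives the quoted 4,032 servers. Both inputs are from the slide; nothing else is derived.

```python
# Cross-check of the slide's headline server count: 12 PODs x 336 servers.

def pod_totals(pods: int, servers_per_pod: int) -> int:
    """Total servers across all PODs."""
    return pods * servers_per_pod

print(pod_totals(12, 336))  # matches the slide's 4,032 total
```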

Modular Data Center Example: total white space 14,400 sq. ft.

Modular Data Center Example: Equipment Distribution Area (EDA), single POD. [Diagram: access pair Acc1/Acc2 serving 336 servers]

Modular Data Center Example EDA Application Photo

Modular Data Center Example: Horizontal Distribution Area (HDA), single POD. [Diagram: access pair Acc1/Acc2 serving 336 servers]

Modular Data Center Example HDA Application Photo

Modular Data Center Example: EDA. [Diagram: Core 1, Core 2 and Agg1 through Agg4; core routing/firewalls, LAN appliances, SAN directors]

Modular Data Center Example CFD Analysis

Summary Global Presence Delivering Solutions Anywhere in the World Technology Leader Innovative Product Sets, Large R&D Investment Breadth of Products Most Complete End-to-End Solutions Stable Organization Responsive Culture (Innovation, Quality, Service) Partners / Alliances Market Leaders (Development, Deployment and Distribution)

Thank You