Optimizing data centers for high-density computing


Technology brief, 2nd edition

Contents

Abstract
Introduction
Power consumption and heat load
  Power consumption
  Heat load
The Need for Planning
Optimizing the effectiveness of cooling resources
  Raised floors
  Perforated tiles
  Air supply plenum
  Racks
  Cooling footprint
  Internal airflow
  Hot and cold aisles
  Rack geometry
Computer room air conditioners
  Capacity of CRAC units
  Placement of CRAC units
  Discharge velocity
Airflow distribution for high-density data centers
  Ceiling return air plenum
  Dual supply air plenums
  Ceiling-mounted heat exchangers
Advanced thermal management techniques
  Static Smart Cooling
  Dynamic Smart Cooling
Conclusion
For more information

Abstract

This paper describes factors causing the increase in power consumption and heat generation of computing hardware. It identifies methods to optimize the effectiveness of cooling resources in data centers that are deploying high-density equipment or that are already fully populated with high-density equipment. The intended audience for this paper includes IT managers, IT administrators, facility planners, and operations staff.

Introduction

From generation to generation, the power consumption and heat loads of computing, storage, and networking hardware in the data center have drastically increased. The ability of data centers to meet increasing power and cooling demands is constrained by their designs. Most data centers were designed using average (per unit area) or "rule of thumb" criteria, which assume that power and cooling requirements are uniform across the facility. In actuality, power and heat load within data centers are asymmetric due to the heterogeneous mix of hardware and the varying workload on computing, storage, and networking hardware. These factors can create "hot spots" that cause problems related to overheating (equipment failures, reduced performance, and shortened equipment life) and drastically increase operating costs. The dynamic nature of data center infrastructures creates air distribution problems that cannot always be solved by installing more cooling capacity or by using localized cooling technologies. A more sophisticated scientific method can help to find the most effective solutions. Research at HP Laboratories has found that proper data center layout and improved Computer Room Air Conditioner (CRAC) utilization can prevent hot spots and yield substantial energy savings. This paper is intended to raise awareness of present and future challenges facing data centers beginning to deploy or already fully populated with high-density equipment.
This paper describes power consumption and heat load, recommends methods to optimize the effectiveness of data center cooling resources, and introduces thermal management methods for high-density data centers.

Power consumption and heat load

In the past, when data centers mainly housed large mainframe computers, power and cooling design criteria were specified in average wattage per unit area (W/ft2 or W/m2) and British Thermal Units per hour (BTU/hr), respectively. These design criteria were based on the assumption that power and cooling requirements were uniform across the entire data center. Today, IT managers are populating data centers with a heterogeneous mix of high-density hardware as they try to extend the life of their existing space. This high-density hardware requires enormous amounts of electricity and produces previously unimaginable amounts of heat. For example, IT infrastructures are now using 1U dual-processor and 4U quad-processor ProLiant blade servers that can be installed together in a rack-mounted enclosure, interconnected, and easily managed. This high-density server technology lowers the operating cost per CPU by reducing management expenses and floor space requirements. Despite speculation that high-density server technology drives up power consumption and heat load, a closer server-to-server comparison reveals that HP p-Class blades consume less power and generate less heat load.

Power consumption

HP provides online power calculators to estimate power consumption for each ProLiant server. The power calculators provide information based on actual system measurements, which are more accurate than nameplate ratings. Figure 1 displays an HP power calculator, which is a macro-driven Microsoft Excel spreadsheet. Power calculators for all current HP ProLiant servers can be found at Calculator Catalog.xls.

(Note: CRAC units are sometimes referred to as air handlers.)

Figure 1. Screen shot of ProLiant DL380 G4 server power calculator

From generation to generation, the power consumption of high-density servers is increasing due to the extra power needed for faster and higher-capacity internal components. For example, the power required by a ProLiant DL360 G3 featuring a 3.0-GHz Intel Xeon processor is 58 percent higher than its predecessor with a Pentium III 1.4-GHz processor (see Table 1).

Table 1. Increase in power consumption from generation to generation of ProLiant DL and BL servers

1U DL360 (2P, 4 GB, 2 HDD, 1 PCI):
  G2: 246 W (~839 BTU/hr)
  G3: 389 W (1,328 BTU/hr), a 58% increase over G2
  G4: 460 W (1,570 BTU/hr), an 18% increase over G3

2U DL380 (2P, 4 GB, 6 HDD, 2 PCI):
  G2: 362 W (1,233 BTU/hr)
  G3: 452 W (1,540 BTU/hr), a 25% increase over G2
  G4: 605 W (2,065 BTU/hr), a 34% increase over G3

4U DL580 (4P, 8 GB, 4 HDD, 3 PCI):
  G1: 456 W (1,556 BTU/hr)
  G2: 754 W (2,573 BTU/hr), a 65% increase over G1
  G3: 910 W (3,106 BTU/hr), a 21% increase over G2

The table above compares the power consumption of individual servers; however, standard racks can house several of these servers. Estimating the power consumption of a rack of servers is more difficult because several variables (number of servers per rack, type and number of components in each server, etc.) contribute to the amount of power consumed. For racks, a very useful metric is power density, or power consumption per rack unit (W/U). Power density captures all of the key variables that contribute to rack densification. Figure 2 illustrates the overall power density trend from generation to generation of HP ProLiant BL and DL servers.

Figure 2. Power density of HP ProLiant BL and DL servers (Watts/U)
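The W/U metric is simple arithmetic. The following sketch uses the nameplate wattages from Table 1 purely for illustration:

```python
def power_density_w_per_u(total_watts: float, rack_units: int) -> float:
    """Power density (W/U): power draw divided by the rack units occupied."""
    return total_watts / rack_units

# A 2U DL380 G4 at 605 W is ~302 W/U, while a 1U DL360 G4 at 460 W
# is 460 W/U -- the 1U platform is the denser of the two.
print(power_density_w_per_u(605, 2))   # 302.5
print(power_density_w_per_u(460, 1))   # 460.0
```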

Heat load

Virtually all power consumed by a computer is converted to heat. The heat generated by the computer is typically expressed in BTU/hr, where 1 W equals 3.412 BTU/hr. Therefore, once the power consumption of a computer or a rack of computers is known, its heat load can be calculated as follows:

Heat load [BTU/hr] = Power [W] × 3.412 BTU/hr per watt

For example, the heat load for a DL360 G4 server is 460 W × 3.412 BTU/hr/W ≈ 1,570 BTU/hr. The heat load of a 42U rack of DL360 G4 servers is 65,939 BTU/hr, which is more than that of a typical one-story house. Table 2 lists the power requirements and heat loads of racks of density-optimized ProLiant DL and BL class servers. The table shows the trend toward higher power and heat load with rack densification.

Table 2. Power and heat loads of fully configured, density-optimized ProLiant servers*

  DL580 G3: 10 servers per rack, 9.0 kW, 30,717 BTU/hr
  DL380 G4: 20 servers per rack, 12.1 kW, 41,212 BTU/hr
  DL360 G4: 42 servers per rack, 19.3 kW, 65,939 BTU/hr
  BL20p G2: 48 servers per rack, 14.8 kW, 50,374 BTU/hr

* These calculations are based on the product nameplate values for fully configured racks and therefore may be higher than the actual power consumption and heat load.

IT equipment manufacturers typically provide power and heat load information in their product specifications. HP provides a Rack/Site Installation Preparation Utility to assist customers in approximating the power and heat load per rack for facilities planning (Figure 3). The Site Installation Preparation Utility uses the power calculators for individual platforms so that customers can calculate the full environmental impact of racks with varying configurations and loads. This utility can be downloaded from the HP website.
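The watts-to-BTU/hr conversion above can be sketched directly; the 460 W figure is the DL360 G4 nameplate value from Table 1:

```python
# Virtually all power consumed becomes heat, at 3.412 BTU/hr per watt.
BTU_PER_WATT = 3.412

def heat_load_btu_hr(power_watts: float) -> float:
    """Convert equipment power draw (W) to heat load (BTU/hr)."""
    return power_watts * BTU_PER_WATT

per_server = heat_load_btu_hr(460)        # one DL360 G4: ~1,570 BTU/hr
full_rack = heat_load_btu_hr(460 * 42)    # 42U rack of them: ~65,900 BTU/hr
print(round(per_server), round(full_rack))
```

The full-rack figure differs from the paper's 65,939 BTU/hr only by rounding (the paper effectively rounds the per-server heat load before multiplying by 42).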

Figure 3. The ProLiant class Rack/Site Installation Preparation Utility available on the HP website

The Need for Planning

The server densification trend is being driven by customers' need to maximize the use of valuable data center floor space. Because concentrated heat generation is an inevitable byproduct of concentrated computing power, data centers must ensure adequate localized cooling capacity to match non-uniformly distributed heat loads. Most data centers have sufficient cooling resources already in place; their main challenge is directing cooling to racks that generate high heat loads.

For data centers that cannot afford to upgrade cooling capacity to handle concentrated heat loads, HP's small form factor servers offer the flexibility to limit rack power density (kW/rack) based on the capacity of nearby cooling resources. For example, Figure 4 shows several rack configurations using six generations of ProLiant DL360 servers, each limited to 10 kW by controlling the number of servers per rack. Some data center personnel believe that limiting rack power density neutralizes the space-saving benefits of densification because the racks are not full. However, from generation to generation the compute power per server is increasing faster than the corresponding increase in power consumption, resulting in higher efficiency. The ability to limit rack power density while increasing computing power can prolong the lifecycle of space-constrained infrastructures. Data centers that have sufficient capacity, or that can afford to add cooling capacity, can use HP small form factor servers to maximize rack computing power so that different facilities can be consolidated to reduce overall operating costs.

Figure 4. Limiting rack power density based on power consumption results in lower rack utilization.

The goal of all data centers is to optimize the effectiveness of existing cooling resources. The following section describes proven methods to achieve better airflow distribution.
Given that power consumption and cooling demands will continue to increase, future data center designs will have to take a more holistic approach that examines cooling from the chip level to the facility level. The section titled Advanced thermal management techniques outlines breakthrough research by HP Laboratories to create intelligent data centers that dynamically redistribute compute workloads and provision cooling resources for optimum operating efficiency. 7
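The Figure 4 approach of capping each rack at a fixed power budget reduces to a floor division, capped by the physical space in the rack. This is a hypothetical helper, not an HP tool, and the wattages in the example are illustrative:

```python
import math

def servers_per_rack(power_budget_w: float, server_w: float,
                     rack_u: int = 42, server_u: int = 1) -> int:
    """Servers that fit under a rack power budget, capped by rack space."""
    by_power = math.floor(power_budget_w / server_w)
    by_space = rack_u // server_u
    return min(by_power, by_space)

# Under a 10 kW budget, a rack holds 21 hypothetical 460 W 1U servers
# (power-limited) but all 42 at 230 W each (space-limited).
print(servers_per_rack(10_000, 460))   # 21
print(servers_per_rack(10_000, 230))   # 42
```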

Optimizing the effectiveness of cooling resources

This section recommends methods to optimize the effectiveness of cooling resources in raised floor infrastructures, a common configuration used in today's data centers.

Raised floors

Most data centers use a down draft airflow pattern in which air currents are cooled and heated in a continuous convection cycle. The down draft airflow pattern requires a raised floor configuration that forms an air supply plenum beneath the raised floor (Figure 5). The CRAC unit draws in warm air from the top, cools the air, and discharges it into the supply plenum beneath the floor. Raised floors typically measure 18 inches (46 cm) to 36 inches (91 cm) from the building floor to the top of the floor tiles, which are supported by a grounded grid structure. The static pressure in the supply plenum pushes the air up through perforated floor tiles to cool the racks. Most equipment draws in the cold supply air and exhausts warm air out the rear of the racks. Ideally, the warm exhaust air rises to the ceiling and returns along the ceiling back to the top of the CRAC units to repeat the cycle.

Many traditional data centers arrange rows of racks in the front-to-back layout shown in Figure 5. The mixing of cold and hot air in the aisles is very inefficient and wastes valuable cooling resources and energy. While this layout can work with lower power densities and heat loads, as the power density and heat load increase, the equipment inlet temperatures will begin to rise (shown in the figure) and overheat critical resources.

Figure 5. Traditional raised floor configuration with high-density racks arranged front to back

Perforated tiles

Floor tiles range from 18 inches (46 cm) to 24 inches (61 cm) square. The percentage and placement of perforated floor tiles are major factors in maintaining static pressure. Perforated tiles should be placed in front of at least every other rack.
In higher density environments, perforated tiles may be necessary in front of each rack. Perforated tiles are classified by their open area, which may vary from 25 percent (the most common) to 56 percent (for high airflow). A 25 percent perforated tile provides approximately 500 cubic feet per minute (cfm) at a 5 percent static pressure drop, while a 56 percent perforated tile provides approximately 2000 cfm.
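These tile ratings suggest a rough sizing exercise: estimate a rack's required airflow with the standard sensible-heat relation (cfm = BTU/hr ÷ (1.08 × ΔT), used later in this paper), then divide by the tile delivery rate. The 20 °F ΔT and the per-tile figure below are illustrative assumptions, not vendor data:

```python
import math

def tiles_needed(rack_heat_btu_hr: float, delta_t_f: float = 20.0,
                 cfm_per_tile: float = 500.0) -> int:
    """Rough count of 25%-open perforated tiles to serve one rack.

    Required airflow (cfm) = BTU/hr / (1.08 * deltaT); cfm_per_tile assumes
    the ~500 cfm delivery of a common 25%-open tile.
    """
    cfm_needed = rack_heat_btu_hr / (1.08 * delta_t_f)
    return math.ceil(cfm_needed / cfm_per_tile)

# A 10 kW rack (~34,120 BTU/hr) across a 20 degF supply/return split
# needs ~1,580 cfm, i.e. about four standard perforated tiles.
print(tiles_needed(34_120))   # 4
```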

Air supply plenum

The air supply plenum must be a totally enclosed space to achieve pressurization for efficient air distribution. The integrity of the subfloor perimeter (walls) is critical to prevent moisture retention and to maintain supply plenum pressure. This means that openings in the plenum perimeter and raised floor must be filled or sealed. Subfloor plenum dividers should be constructed in areas with large openings or with no subfloor perimeter walls.

The plenum is also used to route piping, conduit, and cables that bring power and network connections to the racks. In some data centers, cables are simply laid on the floor in the plenum, where they can become badly tangled (Figure 6). This can result in cable dams that block airflow or cause turbulence that reduces airflow and creates hot spots above the floor. U-shaped basket cable trays or cable hangers can be used to manage cable paths, prevent blockage of airflow, and provide a path for future cable additions. Another option is to use overhead cable trays to route network and data cables so that only power cables remain in the floor plenum.

Electrical and network cables from devices in the racks pass through cutouts in the tile floor to wireways and cable trays beneath the floor. Oversized or unsealed cable cutouts allow supply air to escape from the plenum, thereby reducing the static pressure. Self-sealing cable cutouts are required to maintain the static pressure in the plenum (Figure 7).

Figure 6. Unorganized cables (left) and organized cables (right) beneath a raised floor

Figure 7. Self-sealing cable cutout in raised floor

Racks

Racks (cabinets) are a critical part of the overall cooling infrastructure. HP enterprise-class cabinets provide 65 percent open ventilation using perforated front and rear door assemblies (Figure 8). To support the newer high-performance equipment, glass doors must be removed from older HP racks and from any third-party racks.

Figure 8. HP enterprise-class cabinets

Cooling footprint

The floor area that each rack requires must include an unobstructed area to draw in and discharge air. Almost all HP equipment cools from front to rear so that it can be placed in racks positioned side-by-side. The cooling footprint (Figure 9) includes the width and depth of the rack plus the area in front for drawing in cool air and the area in back for exhausting hot air. Equipment that draws in air from the bottom or side, or that exhausts air from the side or top, will have a different cooling footprint. The total physical space required for the data center includes the cooling footprint of all the racks plus free space for aisles, ramps, and air distribution. Typically, a width of two floor tiles is needed in front of the rack, and a width of at least one unobstructed floor tile is needed behind the rack to facilitate cable routing.

Figure 9. Cooling footprint

Internal airflow

Front and rear cabinet doors that are 65 percent open to incoming airflow also present a 35 percent restriction to air discharged by the equipment in the rack. Servers will intake air from the path of least resistance; therefore, they will access the higher-pressure discharge air flowing inside the cabinet more easily than the cooling air coming through the front of the cabinet. Some configurations, such as those with extreme cable or server density, may create a backpressure situation that forces heated exhaust air around the side of a server and back into its inlet. In addition, air from the cold aisle or hot aisle can flow straight through a rack with open "U" spaces. Gaskets or blanking panels must be installed in any open spaces in the front of the rack to support the front-to-back airflow design and prevent these negative effects (Figure 10).

Figure 10. Airflow in rack without blanking panels (top) and with blanking panels (bottom)

Hot and cold aisles

The front-to-rear airflow through HP equipment allows racks to be arranged in rows front-to-front and back-to-back to form alternating hot and cold aisles. The equipment draws in the cold supply air and exhausts warm air out the rear of the rack into hot aisles (Figure 11). The amount of space between rows of racks is determined as follows: cold aisle spacing should be 48 inches (two full tiles), and hot aisle spacing should be at least one full tile (24 inches minimum). This spacing is required for equipment installation and removal and for access beneath the floor. Cold aisles should be a minimum of 14 feet apart center-to-center, or seven full tiles.

Figure 11. Airflow pattern for raised floor configuration with hot aisles and cold aisles

Rack geometry

Designing the data center layout to form hot and cold aisles is one step in the cooling optimization process. Also critical is the geometry of the rack layout. Research by HP Laboratories has revealed that minor changes in rack placement can change the fluid mechanics inside a data center and lead to inefficient utilization of CRAC units. See the "Static Smart Cooling" section for more information.

Computer room air conditioners

A common question with respect to cooling resources is how many kilowatts a particular CRAC unit can cool. Assuming a fixed heat load from the equipment in its airflow pattern, the answer depends largely on the capacity of the CRAC unit, its placement in the facility, and its discharge velocity.

Capacity of CRAC units

The heat load of equipment is normally specified in BTU/hr. However, in the U.S., CRAC unit capacity is often expressed in "tons" of refrigeration, where one ton corresponds to a heat absorption rate of 12,000 BTU/hr. The "tons" capacity rating is measured at 80°F; however, the recommended operating conditions for CRAC units are 70 to 72°F and 50 percent relative humidity (RH). At 72°F, the CRAC unit output capacity is considerably reduced. Furthermore, the tons rating is very subjective because it is based on total cooling, which comprises "sensible cooling" and "latent cooling." Computer equipment produces sensible heat only; therefore, the sensible cooling capacity of a CRAC unit is the most useful value. For this reason, CRAC unit manufacturers typically provide cooling capacities as "total BTU/hr" and "sensible BTU/hr" at various temperatures and RH values. Customers should review the manufacturer's specifications and then divide the sensible cooling capacity (at the desired operating temperature and humidity) by 12,000 BTU/hr per ton to calculate the useable capacity of a given CRAC unit, expressed in tons of cooling.

Cooling capacity is also expressed in volume as cubic feet per minute (cfm). The volume of air required is related to the moisture content of the air and the temperature difference between the supply air and return air (ΔT):

Cubic feet per minute = BTU/hr ÷ (1.08 × ΔT)

The cooling capacity calculations presented here are theoretical, so other factors must be considered to determine the effective range of a particular CRAC unit.
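The two capacity relations above can be sketched together. The 240,000 BTU/hr sensible rating and 18 °F ΔT below are illustrative numbers, not the specifications of any particular unit:

```python
def usable_tons(sensible_btu_hr: float) -> float:
    """Usable CRAC capacity in tons, from the sensible cooling rating
    (one ton of refrigeration = 12,000 BTU/hr)."""
    return sensible_btu_hr / 12_000

def required_cfm(heat_btu_hr: float, delta_t_f: float) -> float:
    """Airflow needed to move a sensible heat load: cfm = BTU/hr / (1.08 * dT)."""
    return heat_btu_hr / (1.08 * delta_t_f)

# A unit rated 240,000 BTU/hr sensible is 20 usable tons; moving that
# heat across an 18 degF supply/return split takes roughly 12,350 cfm.
print(usable_tons(240_000))            # 20.0
print(round(required_cfm(240_000, 18)))
```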
The effective cooling range is determined by the capacity of the CRAC unit and the heat load of the equipment in its airflow pattern. Typically, the most effective cooling begins about 8 feet (2.4 m) from the CRAC unit. Although units with capacities greater than 20 tons are available, the increased heat density of today's servers limits the cooling range to approximately 30 feet or 9.1 m (Figure 12).

Figure 12. Cooling ranges of CRAC units

Placement of CRAC units

The geometry of the room and the heat load distribution of the equipment determine the best placement of the CRAC units. CRAC units can be placed inside or outside the data center walls. Customers should consider placing liquid-cooled units outside the data center to avoid damage to electrical equipment that could be caused by coolant leaks.

CRAC units should be placed perpendicular to the rows of equipment and aligned with the hot aisles, discharging air into the supply plenum in the same direction (Figure 13). This configuration provides the shortest possible distance for the hot air to return to the CRAC units. Discharging in the same direction eliminates dead zones that can occur beneath the floor when blowers oppose each other. Rooms that are long and narrow may be cooled effectively by placing CRAC units around the perimeter. Large, square rooms may require CRAC units to be placed around the perimeter and through the center of the room.

Figure 13. CRAC units should be placed perpendicular to hot aisles so that they discharge cool air beneath the floor in the same direction.

Discharge velocity

To force air from beneath the raised floor through the perforated tiles, the static pressure in the supply air plenum must be greater than the pressure above the raised floor. The velocity of the cooled air is highest near the CRAC unit because the entire flow is delivered through this area. The air velocity decreases as air flows through the perforated tiles away from the CRAC unit, and the decrease in velocity is accompanied by an increase in static pressure with distance from the CRAC unit. Excessive discharge velocity from the CRAC unit reduces the static pressure through the perforated tiles nearest the unit, causing inadequate airflow (Figure 14). The static pressure increases as the high-velocity discharge moves away from the unit, thereby increasing the airflow through the perforated tiles. To counter this situation, airfoils under the raised floor can be used to divert air through the perforated tiles.2 Another option is to use a fan-assisted perforated tile to increase the supply air circulation to a particular rack or hot spot. Fan-assisted tiles can provide 200 to 1500 cfm of supply air.

2. From Changing Cooling Requirements Leave Many Data Centers at Risk, W. Pitt Turner IV, P.E., and Edward C. Koplin, P.E., ComputerSite Engineering, Inc.

Figure 14. Plenum static pressure greater than pressure above the floor (left). High-velocity discharge reduces static pressure closest to the unit (right).

Airflow distribution for high-density data centers

To achieve an optimum down draft airflow pattern, warm exhaust air must be returned to the CRAC unit with minimal obstruction or redirection. Ideally, the warm exhaust air will rise to the ceiling and return to the CRAC unit intake. In reality, only the warm air closest to the intake may be captured; the rest may mix with the supply air. Mixing occurs if exhaust air goes into the cold aisles, if cold air goes into the hot aisles, or if there is insufficient ceiling height to allow for separation of the cold and warm air zones (Figure 15). When warm exhaust air mixes with supply air, two things can happen:

- The temperature of the exhaust air decreases, thereby lowering the useable capacity of the CRAC unit.
- The temperature of the supply air increases, which causes warmer air to be recirculated through computer equipment.

Figure 15. Mixing of supply air and exhaust air
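The second effect, rising supply temperature, is simple mixing arithmetic. The temperatures and recirculation fraction below are illustrative assumptions, not measured values:

```python
def mixed_inlet_f(t_supply: float, t_exhaust: float, f: float) -> float:
    """Rack inlet temperature when a fraction f of hot exhaust air
    short-circuits into the cold aisle and mixes with the supply air."""
    return (1.0 - f) * t_supply + f * t_exhaust

# If 20% of 95 degF exhaust recirculates into 60 degF supply air,
# rack inlets see roughly 67 degF instead of 60 degF.
print(mixed_inlet_f(60.0, 95.0, 0.20))
```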

Ceiling return air plenum

In recent years, raised floor computer rooms with very high heat density loads have begun to use a ceiling return air plenum to direct exhaust air back to the CRAC intake. As shown on the right of Figure 16, the ceiling return air plenum removes heat while abating the mixing of cold air and exhaust air. Once the heated air is in the return air plenum, it can travel to the nearest CRAC unit intake. The return air grilles in the ceiling can be relocated if the layout of computer equipment changes.

Figure 16. Ceiling return air plenum

Dual supply air plenums

As power and heat densities climb, a single supply air plenum under the raised floor may be insufficient to remove the heat that will be generated. High-density solutions may require dual supply air plenums, one above and one below (see Figure 17). In this configuration, additional supply air is forced downward in the cold aisle.

Figure 17. Dual air supply plenum configuration for high-density solutions

Ceiling-mounted heat exchangers

The drive to maximize the compute density of data centers has prompted research and development of alternative cooling methods that do not require floor space.3 Figure 18 shows a representation of an alternate cooling method using modular air-to-liquid heat exchangers in the ceiling. The heat exchanger units collect the hot air exhaust from the racks and cool it using circulated chilled water. The units eject the cool air downward using fan trays located on the intake side of each rack. The advantage of this approach is the proximity of the heat exchangers to the racks. With this scheme, rack cooling can be localized. Unique mechanical design ideas, such as the ability to move the intake and exhaust sections, have been implemented in the heat exchangers to direct airflow to and from the racks. Additionally, modular heat exchangers offer the flexibility to scale the cooling as needed. More importantly, ceiling-mounted heat exchangers save revenue-generating floor space and allow the raised floor to be used mainly for cable distribution.

Figure 18. Ceiling-mounted air-to-liquid heat exchangers

Advanced thermal management techniques

Heat loads vary throughout a data center due to the heterogeneous mix of hardware types and models, changing compute workloads, and the addition or removal of racks over time. The variation in heat load is too complex to predict intuitively or to address by adding cooling capacity. HP Laboratories has devised two thermal analysis approaches, Static Smart Cooling4 and Dynamic Smart Cooling, that model heat distribution throughout a data center using computational fluid dynamics (CFD). Static Smart Cooling uses CFD modeling to aid planners in designing the physical layout of the data center for optimum distribution of cooling resources and heat loads. Static Smart Cooling can also predict the changes in heat extraction of each CRAC unit when the rack layout and equipment heat load are varied.
Dynamic Smart Cooling offers a higher level of automated facility management. It enables intelligent data centers that dynamically provision cooling resources to match the changing heat dissipation of computing, networking, and storage equipment. It also redistributes compute workloads based on the most efficient use of cooling resources within a data center or a global network of data centers.

3. For more information, please read Computational Fluid Dynamics Modeling of High Compute Density Data Centers to Assure System Inlet Air Specifications.
4. For more information, please read Thermal Considerations in Cooling Large Scale High Compute Density Data Centers.

Static Smart Cooling

Static Smart Cooling uses CFD modeling to determine the best layout and provisioning of cooling resources based on fixed heat loads from data center equipment. The heat extraction of each CRAC unit is compared to its rated capacity to determine how efficiently (or inefficiently) the CRAC unit is being used, or "provisioned." The provisioning of each unit in the data center is presented as a positive or negative percentage as follows:

- An under-provisioned CRAC unit (positive percentage) indicates that the cooling load is higher than the capacity of the unit.
- A closely provisioned CRAC unit (small negative percentage) signifies that the cooling load is less than but reasonably close to the capacity of the unit, leading to efficient use of energy resources.
- An over-provisioned CRAC unit (large negative percentage) operates significantly below the capacity of the unit. This results in wasted energy if operation of the unit cannot be adjusted to match the lower cooling load.

For example, Figure 19 shows the row-wise distribution of heat loads (41 kW to 182 kW) for a combination of compute, storage, and networking equipment in a typical raised floor data center with four CRAC units. The CFD model shows that the provisioning of the CRAC units is completely out of balance.

Figure 19. Poorly provisioned CRAC units
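The text does not give the percentage formula explicitly; the sketch below assumes provisioning is the signed difference between heat extraction and rated capacity, as a fraction of capacity, which matches the sign convention described above. The 100 kW capacity, loads, and 15% "close" band are illustrative assumptions:

```python
def provisioning_pct(heat_extraction_kw: float, rated_capacity_kw: float) -> float:
    """Provisioning as a signed percentage of rated capacity:
    positive -> under-provisioned (load exceeds capacity),
    small negative -> closely provisioned,
    large negative -> over-provisioned (wasted energy)."""
    return 100.0 * (heat_extraction_kw - rated_capacity_kw) / rated_capacity_kw

def classify(pct: float, close_band: float = 15.0) -> str:
    """Map a provisioning percentage to the three categories in the text."""
    if pct > 0:
        return "under-provisioned"
    return "closely provisioned" if pct >= -close_band else "over-provisioned"

for load_kw in (120.0, 95.0, 40.0):       # hypothetical loads on a 100 kW unit
    pct = provisioning_pct(load_kw, 100.0)
    print(f"{pct:+.0f}% -> {classify(pct)}")
```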

In Figure 20, the 102-kW row and the 182-kW row have been repositioned to better distribute the heat load. This CFD model shows that the CRAC units are now provisioned within 15 percent of their capacity.

Figure 20. Statically provisioned CRAC units

Dynamic Smart Cooling

A smart data center requires a distributed monitoring system and a feedback control system that continually provisions the cooling resources based on the workload distribution.5 Computing resources must be pooled and virtualized rather than dedicated to a particular user or application so that workloads can be allocated dynamically. The control system schedules compute workloads across racks of servers in a way that minimizes energy use and maximizes cooling efficiency. The computing resources not in use are put on standby to improve operating efficiency.

Due to the high airflow rates in data centers, thermal management is achievable only if hot and cold air mixing is minimal. The hot air must return to the CRAC units with minimal infiltration into the cold zones because such mixing increases the inlet temperatures at the racks. Researchers have developed dimensionless parameters, known as the Return Heat Index (RHI) and the Supply Heat Index (SHI), that can be used as control points to allocate compute workloads and cooling to minimize energy use.6 RHI denotes the degree of mixing of the cold air with the hot return air to the CRAC units. SHI is a measure of heat infiltration into cold aisles. SHI is the primary control parameter used to minimize the energy required to meet inlet air specifications.
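This paper does not reproduce the SHI/RHI formulas. One commonly published rack-level formulation (from the Sharma et al. work cited above) computes SHI from three temperatures; the sketch below assumes that formulation, and the temperature readings are hypothetical:

```python
def shi(t_inlet: float, t_outlet: float, t_supply: float) -> float:
    """Supply Heat Index for a rack, per the commonly published form:
    SHI = (T_inlet - T_supply) / (T_outlet - T_supply), with RHI = 1 - SHI.
    SHI near 0 means little hot-air infiltration into the cold aisle."""
    return (t_inlet - t_supply) / (t_outlet - t_supply)

# Hypothetical readings: 55 degF CRAC supply, 59 degF rack inlet,
# 79 degF rack exhaust -> SHI ~0.17, RHI ~0.83.
s = shi(t_inlet=59.0, t_outlet=79.0, t_supply=55.0)
print(round(s, 2), round(1 - s, 2))
```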
Dynamic Smart Cooling is possible for a data center with the following features:

- Distributed sensors, such as:
  - temperature sensors on the racks measuring supply and exhaust air temperature of the systems
  - temperature sensors in the aisles measuring three-dimensional temperature distribution in the data center
  - temperature sensors at the CRAC return and supply
  - pressure sensors in the air distribution plenum
  - sensors that measure power drawn by machines
- Variable airflow devices to modulate flow work, and variable cooling coil temperature in the CRACs to modulate thermodynamic work
- A data aggregation system that:
  - collects sensed data from all locations
  - visually presents the real-time power draw
  - calculates data center control parameters: RHI and SHI
- A control system that modulates the variable air conditioning resources through a control algorithm for a given distribution of workloads (heat loads)
- A data center manager (computerized system) that uses thermal policies to distribute workloads in the data center

At the time of this writing, HP Laboratories is developing the control system and the data center manager, and plans to report its progress in future papers.

5. Patel, C.D., Sharma, R.K., Bash, C.E., Beitelmal, A., Friedrich, R., "Smart Cooling of Data Centers," Proceedings of IPACK'03, International Electronics Packaging Technical Conference and Exhibition, Maui, Hawaii, July 2003.
6. Sharma, R., Bash, C.E., Patel, C.D., "Dimensionless Parameters for Evaluation of Thermal Design and Performance of Large Scale Data Centers," American Institute of Aeronautics and Astronautics Conference, St. Louis, MO, June 2002.

Conclusion

The growing power consumption of computing components requires modular data center designs with sufficient headroom to handle increasing power and cooling requirements. To determine actual requirements, facilities planners must consider several factors, including room geometry and the capacity and placement of the CRAC units. Planners must also give special attention to factors that affect airflow distribution, such as supply plenum static pressure, airflow blockages beneath raised floors, and configurations that result in airflow mixing in the data center.

HP is a leader in the thermal modeling of data centers. HP Professional Services can work directly with customers to optimize existing data centers for more efficient cooling and energy consumption. The modeling services can also be used to confirm new data center designs or predict what will happen in a room when certain equipment fails. As long as the data center has the power and cooling resources to support the expected loads, Static Smart Cooling can rectify cooling problems as well as enhance the overall efficiency of air conditioning resources.
In most cases, the energy savings alone may pay for the cost of the service in a relatively short period. 19
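The SHI and RHI control parameters calculated by the data aggregation system are the dimensionless indices defined in the Sharma et al. paper cited above: the Supply Heat Index (SHI) measures how much hot exhaust air infiltrates the cold aisle before entering the racks, and the Return Heat Index (RHI) is its complement. The following is a minimal illustrative sketch, not HP's implementation; the sensor readings are hypothetical values chosen for the example.

```python
def supply_heat_index(t_supply, rack_inlets, rack_outlets):
    """Supply Heat Index per Sharma et al. (2002):
    SHI = sum(T_in - T_supply) / sum(T_out - T_supply),
    where T_supply is the CRAC supply air temperature and T_in/T_out are
    rack inlet and outlet temperatures. SHI near 0 means little mixing of
    hot and cold air streams; RHI = 1 - SHI."""
    infiltration_heat = sum(t_in - t_supply for t_in in rack_inlets)
    total_heat = sum(t_out - t_supply for t_out in rack_outlets)
    return infiltration_heat / total_heat

# Hypothetical sensor readings in degrees C: CRAC supply at 15.0,
# plus inlet/outlet temperatures for three racks.
shi = supply_heat_index(15.0,
                        rack_inlets=[18.0, 22.0, 19.5],
                        rack_outlets=[30.0, 34.0, 31.5])
rhi = 1.0 - shi
print(f"SHI = {shi:.3f}, RHI = {rhi:.3f}")  # SHI = 0.287, RHI = 0.713
```

A control system could monitor SHI over time: a rising value indicates growing recirculation of hot exhaust into the cold aisles, prompting it to adjust CRAC airflow or flag a layout problem.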

For more information

For additional information, refer to the resources listed below:

- Thermal Considerations in Cooling Large Scale High Compute Density Data Centers white paper
- HP Rack/Site Installation Preparation Utility
- Power calculators

Copyright 2004, 2005 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice. The only warranties for HP products and services are set forth in the express warranty statements accompanying such products and services. Nothing herein should be construed as constituting an additional warranty. HP shall not be liable for technical or editorial errors or omissions contained herein.

Intel and Xeon are trademarks or registered trademarks of Intel Corporation or its subsidiaries in the United States and other countries.

TC050901TB, 9/2005

Unique Airflow Visualization Techniques for the Design and Validation of Above-Plenum Data Center CFD Models Unique Airflow Visualization Techniques for the Design and Validation of Above-Plenum Data Center CFD Models The MIT Faculty has made this article openly available. Please share how this access benefits

More information

To Fill, or not to Fill Get the Most out of Data Center Cooling with Thermal Blanking Panels

To Fill, or not to Fill Get the Most out of Data Center Cooling with Thermal Blanking Panels To Fill, or not to Fill Get the Most out of Data Center Cooling with Thermal Blanking Panels The use of a hot aisle/cold aisle configuration in the data center has long been considered a best practice

More information

INNOVATE DESIGN APPLY.

INNOVATE DESIGN APPLY. INNOVATE DESIGN APPLY By A whole new cooling experience The Motivair ChilledDoor Rack Cooling System targets the critical cooling requirements of today s modern server rack designs. As technology continues

More information

Systems and Technology Group. IBM Technology and Solutions Jan Janick IBM Vice President Modular Systems and Storage Development

Systems and Technology Group. IBM Technology and Solutions Jan Janick IBM Vice President Modular Systems and Storage Development Systems and Technology Group IBM Technology and Solutions Jan Janick IBM Vice President Modular Systems and Storage Development Power and cooling are complex issues There is no single fix. IBM is working

More information

Environmental Data Center Management and Monitoring

Environmental Data Center Management and Monitoring 1 Environmental Data Center Management and Monitoring A White Paper from aritan 2 Table of Contents Introduction... 3 Sensor Design Considerations... 3 Temperature and Humidity Sensors... 4 Airflow Sensor...5

More information

1.866.TRY.GLCC WeRackYourWorld.com

1.866.TRY.GLCC WeRackYourWorld.com cool it. 1.866.TRY.GLCC WeRackYourWorld.com In order to maximize space in the data center and accommodate for increased data, rack-mount enclosures are becoming populated with high-density servers and

More information

HIGH DENSITY RACK COOLING CHOICES

HIGH DENSITY RACK COOLING CHOICES WATERCOOLED HIGH PERFORMANCE COMPUTING SYSTEMS AQUILA WHITE PAPER HIGH DENSITY RACK COOLING CHOICES by Phil Hughes Reprinted with permission of Clustered Systems Company, Inc 1 THIS WHITE PAPER DISCUSSES

More information

A separate CRAC Unit plant zone was created between the external façade and the technical space.

A separate CRAC Unit plant zone was created between the external façade and the technical space. Nominee: FES Co Ltd Nomination title: HP Cold Cube The Doxford Facility is a highly secure Data Centre. The challenge was to take advantage of the previous investment in highly resilient infrastructure

More information

Data Center Lifecycle and Energy Efficiency

Data Center Lifecycle and Energy Efficiency Data Center Lifecycle and Energy Efficiency Lifecycle infrastructure management, power management, thermal management, and simulation solutions enable data center modernization. White Paper October 2014

More information

INNOVATE DESIGN APPLY.

INNOVATE DESIGN APPLY. INNOVATE DESIGN APPLY By A whole new cooling experience The Motivair ChilledDoor Rack Cooling System targets the critical cooling requirements of today s modern server rack designs. As technology continues

More information

Virtualization and consolidation

Virtualization and consolidation Virtualization and consolidation Choosing a density strategy Implementing a high-density environment Maximizing the efficiency benefit Anticipating the dynamic data center Schneider Electric 1 Topical

More information

Schneider Electric Cooling Portfolio. Jim Tubbesing Heath Wilson APC/Schneider North Texas Rep Tubbesing Solutions

Schneider Electric Cooling Portfolio. Jim Tubbesing Heath Wilson APC/Schneider North Texas Rep Tubbesing Solutions Schneider Electric Cooling Portfolio Jim Tubbesing Heath Wilson APC/Schneider North Texas Rep Tubbesing Solutions Data Center Cooling Agenda 1. How to size data center, server rooms, IDF rooms, etc in

More information