Cooling strategies for IT equipment


Technology brief

Contents
  Introduction
  Limits of traditional cooling practices
  Free cooling strategies
    Air-side economization
    Water-side economization
    Benefits and disadvantages of free air cooling
  Air containment strategies
    Cold-aisle containment
    Hot-aisle containment
  Closed-loop cooling systems
    HP Modular Cooling System
    HP Performance-Optimized Datacenter
  Choosing the best cooling strategy
    Cooling decisions based on facility characteristics
    Cooling decisions based on room layout
    Cooling decisions based on server density/power per rack
  Managing data center cooling
    Managing ITE cooling with HP Systems Insight Manager
    Managing the data center environment with HP Environmental Edge
  Conclusion
  For more information
  Call to action

Introduction

If you're deploying the latest generation of information technology (IT) equipment or adding more servers to a crowded data center, conventional data center cooling methods and systems will likely be inadequate. The most effective cooling solution for any computing facility depends on the specific characteristics of the facility, the equipment layout, and the server density. This technology brief first explains the limitations of traditional cooling practices. It then describes a range of systems you can choose from to modify or supplement your existing cooling system and get the cooling capacity your data center requires.

Limits of traditional cooling practices

Enterprise data centers have most often used an open-area approach to cool racks of servers and storage systems. With this approach, one or more computer room air handlers (CRAHs) are placed on the periphery of the data center room, and IT equipment (ITE) racks are arranged in a cold-aisle/hot-aisle layout (Figure 1). Cool air is forced through a raised-floor plenum and up through vented floor tiles in the cold aisle toward the front of the ITE racks. The cool air is drawn through the ITE racks, and warm air is vented out the rear of the racks and upward toward the ceiling. Air circulation works on the basic strategy of providing cool air at floor level and collecting warm air near the ceiling.

Figure 1: Traditional open-area data center cooling (basic room requirement: raised floor)

The open-area strategy is generally adequate for racks using up to 10 kilowatts of power and lets data centers scale relatively easily. However, some of the warm air mixes with the cool air, reducing cooling system efficiency. Some equipment generates excessive heat, creating hot spots that need supplemental cooling or specific air channeling. The typical remedy has been to set the cooling system to run colder to compensate for the hot spots.

Server blade systems and 1U servers let you assemble high-density infrastructures; however, these systems create much more heat per square foot of floor space and more hot spots. In such cases the open-area approach cannot keep up with the demand for cool air.

Figure 2 compares two examples of server loading for a 42U rack. In Figure 2A, the rack contains 20 HP ProLiant DL380 G5 servers, each consuming approximately 424 watts, for a total rack consumption of 8.47 kW. In Figure 2B, the rack contains 42 dual-processor DL160 G6 servers consuming 383 watts per server, for a total rack consumption of just over 16 kW.

Figure 2: Examples of power consumption in 42U IT equipment racks with different server loads
A: ProLiant DL380 G5 2U servers (20 servers @ 423.9 W per server): 4.23 kW at 10 servers, 8.47 kW at 20 servers
B: ProLiant DL160 G6 1U servers (42 servers @ 383 W per server): 8.0 kW at 21 servers, 16.1 kW at 42 servers

Higher power consumption produces more heat. Virtually all power consumed by rack-mounted equipment is converted to sensible heat, which raises the temperature of the environment. The sensible heat load is typically expressed in British Thermal Units per hour (BTU/hr) or watts, where 1 W equals 3.413 BTU/hr. The rack's heat load in BTU/hr can be calculated as follows:

Heat load = Power [W] × 3.413 BTU/hr per watt

For example, the heat load for a ProLiant DL160 G6 server in a 2P configuration consuming 383 watts is calculated as follows:

383 W × 3.413 BTU/hr per W = 1,307 BTU/hr

This means that the heat load of the fully loaded 42U rack of DL160 G6 servers is 54,901 BTU/hr. In the United States, cooling capacity is often expressed in "tons" of refrigeration, which is derived by dividing the sensible heat load by 12,000 BTU/hr per ton. Therefore, the cooling requirement for the rack of DL160 G6 servers is computed as follows:

54,901 BTU/hr ÷ 12,000 BTU/hr per ton = 4.58 tons
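This calculation is easy to script when budgeting cooling for many racks. The following minimal Python sketch reproduces the DL160 G6 figures above; the function names and structure are illustrative only and are not part of any HP tool.

```python
# Rack heat load and cooling requirement, following the calculation in the text.
# 1 W of IT power = 3.413 BTU/hr of sensible heat; 1 ton of refrigeration = 12,000 BTU/hr.

BTU_PER_HR_PER_WATT = 3.413
BTU_PER_HR_PER_TON = 12_000

def rack_heat_load_btu_hr(servers: int, watts_per_server: float) -> float:
    """Sensible heat load of a rack in BTU/hr (virtually all input power becomes heat)."""
    return servers * watts_per_server * BTU_PER_HR_PER_WATT

def cooling_tons(heat_load_btu_hr: float) -> float:
    """Cooling requirement in tons of refrigeration."""
    return heat_load_btu_hr / BTU_PER_HR_PER_TON

if __name__ == "__main__":
    load = rack_heat_load_btu_hr(servers=42, watts_per_server=383)   # DL160 G6 example
    print(f"Heat load: {load:,.0f} BTU/hr")                          # ~54,900 BTU/hr
    print(f"Cooling required: {cooling_tons(load):.2f} tons")        # ~4.58 tons
```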

The increasing heat loads created by the latest server systems require more aggressive cooling strategies than the traditional open-area approach. Such strategies include the following:
- Free cooling
- Air containment
- Closed-loop cooling

Figure 3 identifies cooling strategies and implementations based on kilowatts per rack and on server density per rack.

Figure 3: Cooling strategies based on server density (nodes per rack) and power (kW per rack). As density and power increase, the appropriate strategy progresses from traditional open-area cooling, to supplemented data center cooling, to cold/hot-aisle containment, to closed-loop cooling, and ultimately to chassis/component-level cooling and future cooling technologies.

Free cooling strategies

Free cooling strategies use the ambient outside temperature to remove heat from inside a data center. You can implement free cooling as either a primary or a secondary system. Free cooling systems take advantage of regional climate conditions for more efficient operation, but they still include mechanical and electrical equipment. Free cooling is generally applied using one of two methods: air-side economization or water-side economization.

Air-side economization

Air-side economization uses a system of supply and return fans, filters, and dampers that maintains positive air pressure with respect to the IT equipment exhaust, and positive data center air pressure with respect to the outside. Air-side economization may also include supplemental cooling coils that you can switch in and out of the circuit as needed for changes in climate. You can implement air-side economization as either a direct or an indirect system.

In direct air-side economization, outside air is filtered, stabilized for relative humidity, and then delivered to the ITE racks (Figure 4). The ITE rack layout and exhaust venting are similar to a hot-aisle containment strategy.

Figure 4: Air-side economization (direct method), showing the intake filters/fans, the ITE racks, and the exhaust path to the outside (room requirement: air circulation system built into the facility)

Air-side economization using the direct method is most applicable in cool regions where outside air requires little or no refrigeration.

With indirect air-side economization, outside air cools data center air through an air-to-air heat exchanger. This method uses two air circulation circuits: one for outside air and one for inside air. Outside air is collected, cooled, and then used to cool re-circulated indoor air in a heat exchanger that is common to both circuits. The two circuits share the heat exchanger, but indoor and outside air flows are segregated. Even though it requires two fan systems, indirect air-side economization can be more energy efficient and may be the preferred method for warmer climates.
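As a rough illustration of how a direct air-side economizer (described above) decides when outside air can be used, here is a minimal Python sketch. The supply, return, and dew-point limits are assumptions chosen for the example rather than values from this brief or from HP, and a real system would also handle filtration, humidification, and pressure control.

```python
# Minimal sketch of air-side economizer mode selection (direct method).
# Thresholds are illustrative assumptions, loosely based on typical supply-air targets.

from dataclasses import dataclass
from enum import Enum

class CoolingMode(Enum):
    FULL_FREE_COOLING = "outside air only"
    PARTIAL_ECONOMIZATION = "outside air plus supplemental coil"
    MECHANICAL_ONLY = "dampers closed, mechanical cooling"

@dataclass
class OutdoorConditions:
    dry_bulb_c: float
    dew_point_c: float

def select_mode(outdoor: OutdoorConditions,
                supply_setpoint_c: float = 24.0,   # assumed supply-air target
                return_air_c: float = 35.0,        # assumed hot-aisle return temperature
                max_dew_point_c: float = 15.0) -> CoolingMode:
    """Pick an economizer mode from outdoor conditions (simplified)."""
    if outdoor.dew_point_c > max_dew_point_c:
        return CoolingMode.MECHANICAL_ONLY          # too humid to bring outside air in directly
    if outdoor.dry_bulb_c <= supply_setpoint_c:
        return CoolingMode.FULL_FREE_COOLING        # outside air is cold enough on its own
    if outdoor.dry_bulb_c < return_air_c:
        return CoolingMode.PARTIAL_ECONOMIZATION    # outside air still cooler than return air
    return CoolingMode.MECHANICAL_ONLY

print(select_mode(OutdoorConditions(dry_bulb_c=10.0, dew_point_c=5.0)))
```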

Water-side economization

Water-side economization uses a cooling tower as either the primary or a supplementary means of cooling the chilled water supply for an air conditioning system. In a typical data center application (Figure 5), a cooling tower cools the water in the condenser circuit, which in turn cools the chilled water in a common heat exchanger. The purpose of the external heat exchanger is to take load off the chiller, which can consume significant energy when the cooling load is high.

Figure 5: Water-side free cooling, showing the ITE racks (cooling load), the CRAH, the chilled-water and condenser pumps, the common heat exchanger, and the cooling tower

Figure 5 shows water-side free cooling for a data center using cold-aisle containment. Water-side free cooling can also be used to supplement other cooling configurations.

Benefits and disadvantages of free air cooling

By using outside air to avoid or limit mechanical refrigeration, free-air cooling can significantly reduce a data center's energy requirements. However, free-air cooling systems require full integration with the facility systems and must be carefully designed and adjusted for the data center's regional location.
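The feasibility of the water-side approach described above can be estimated from the outdoor wet-bulb temperature plus the cooling tower and heat exchanger approach temperatures. The sketch below is a minimal illustration; the approach values and setpoints are assumptions for the example, not HP or manufacturer data.

```python
# Minimal sketch of a water-side economizer feasibility check: free cooling can carry
# the load when wet-bulb + tower approach + heat-exchanger approach stays at or below
# the required chilled-water supply temperature. Approach values are assumed.

def achievable_chw_supply_c(wet_bulb_c: float,
                            tower_approach_c: float = 4.0,      # assumed cooling tower approach
                            hx_approach_c: float = 1.5) -> float:  # assumed plate HX approach
    """Coldest chilled-water supply temperature the free-cooling loop can produce."""
    return wet_bulb_c + tower_approach_c + hx_approach_c

def economizer_mode(wet_bulb_c: float, chw_setpoint_c: float, chw_return_c: float) -> str:
    achievable = achievable_chw_supply_c(wet_bulb_c)
    if achievable <= chw_setpoint_c:
        return "full free cooling (chiller off)"
    if achievable < chw_return_c:
        return "partial free cooling (tower pre-cools, chiller trims)"
    return "mechanical cooling only"

# Example: 4 degC wet bulb, 10 degC chilled-water setpoint, 16 degC return water.
print(economizer_mode(wet_bulb_c=4.0, chw_setpoint_c=10.0, chw_return_c=16.0))
```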

Air containment strategies

Air containment strategies separate cold supply air from warm return air to maximize air handler efficiency. Selecting an air containment strategy generally depends on whether there is a raised floor or dropped ceiling, and on the ability to meet fire codes. You can use one of two containment strategies:
- Cold-aisle containment
- Hot-aisle containment

Cold-aisle containment

In a cold-aisle containment system, two ITE rows face front-to-front, with cold air coming up through the vented floor tiles of the common aisle. The cold air is isolated from the rest of the room air with curtains and/or Plexiglas panels (Figure 6). This ensures that only cold air is drawn into the ITE racks. The warm exhaust air exits the rear of the racks and is collected conventionally by the CRAHs.

Figure 6: Cold-aisle containment strategy (basic room requirement: raised floor)

Cold-aisle containment is generally the easiest strategy to implement in a data center. There are no restrictions on cable trays above the racks, and IT systems are easily scalable. However, because much of the data center room sits at the hot exhaust temperature, cool air supplied to keep the room comfortable for personnel mixes with the warm air from the ITE racks, lowering the temperature of the air returned to the CRAH (a simple mixing estimate follows at the end of this section). A low return-air temperature can reduce cooling system efficiency, causing the system to use shorter compressor cycles and/or re-adjust the chilled-water flow more often.

If you retrofit an existing data center with an air containment strategy, you must adhere to fire codes. Fire suppression systems must have access to the face of the ITE racks.
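To see why mixing lowers the CRAH return temperature, a simple flow-weighted average is enough. The sketch below is illustrative only; the flow fractions and temperatures are assumptions, not measurements from this brief.

```python
# Simple mixed-air estimate showing how bypass/room air lowers the return temperature
# at the CRAH under cold-aisle containment. Inputs are assumed values for illustration.

def mixed_return_temp_c(exhaust_temp_c: float, exhaust_flow: float,
                        bypass_temp_c: float, bypass_flow: float) -> float:
    """Flow-weighted mixed-air temperature (assumes roughly constant air density)."""
    total = exhaust_flow + bypass_flow
    return (exhaust_temp_c * exhaust_flow + bypass_temp_c * bypass_flow) / total

# 35 degC rack exhaust mixed with 20% bypass air at 22 degC drops the CRAH return to ~32.4 degC.
print(f"{mixed_return_temp_c(35.0, 0.8, 22.0, 0.2):.1f} degC")
```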

Hot-aisle containment

In a hot-aisle containment system, two ITE rows are positioned back-to-back. As illustrated in Figure 7, the warm air exhausts into a common aisle area that is isolated from the rest of the room by doors and vertical ducts. The warm exhaust air is directed up into a dropped-ceiling plenum and back to the CRAH.

Figure 7: Hot-aisle containment with dedicated ductwork (basic room requirement: dropped ceiling)

A variation of hot-aisle containment uses a vertical air duct that vents exhaust air from the rear of an ITE rack to a dropped-ceiling air plenum. The HP 10K-G2 Series 42U Rack Air Duct Kit (Figure 8) can be the answer for an individual rack, or for environments where full hot-aisle containment is not practical from a maintenance perspective.

Figure 8: Hot-aisle containment with the HP 10K-G2 Series 42U Rack Air Duct Kit (air duct mounted between the ceiling grid and a rack with the rear extension kit installed)

Hot-aisle containment eliminates the mixing of hot and cold air, ensuring that only exhaust air returns to the CRAH. This allows the air conditioning system to operate more efficiently than with cold-aisle containment. The disadvantage of hot-aisle containment, particularly in facilities without raised flooring, is the lack of cold-air pressurization to help direct air to the front of the IT racks.

Closed-loop cooling systems

Closed-loop systems are designed specifically for cooling IT equipment. They are the best solution for high-density systems consuming many kilowatts of power. These systems have separate cool air distribution and warm air return paths that are isolated from the open room air. Closed-loop systems typically use heat exchangers that use chilled water for removing heat created by IT equipment (Figure 9).

Figure 9: Closed-loop cooling system (basic requirement: chilled water supply)

Since they are self-contained, closed-loop cooling systems offer flexibility and are adaptable to a wide range of locations and environments. Closed-loop cooling systems can also accommodate a wide range of server and power densities. We offer two closed-loop cooling systems:
- HP Modular Cooling System
- HP Performance-Optimized Datacenter

HP Modular Cooling System

The HP Modular Cooling System (MCS) Generation 2 (G2) is a closed-loop system contained within a modified HP 10000 Series G2 rack. Using a chilled water supply and integrating a heat exchanger, the HP MCS G2 can cool IT equipment consuming up to 35 kW. The system is entirely self-contained.

The HP MCS G2 supports the front-to-back cooling principle. It evenly distributes cold supply air at the front of each rack so components receive adequate supply air regardless of their position or density. Warm exhaust air collects in the rear of the rack. Fans direct it into the heat exchanger, where it is cooled and re-circulated to the front of the rack. Condensation flows through a discharge tube to a tray integrated in the base assembly. The HP MCS G2 can be configured to cool 35 kW of equipment in a single rack or two racks of 17.5 kW each (Figure 10).

Figure 10: HP MCS G2 system (single-rack and dual-rack configurations)

The HP MCS G2 includes an Automatic Emergency Door Release Kit designed to open the front and rear doors in case of a sudden temperature increase. The open doors let ambient air cool the equipment. Flexible installation capabilities allow you to place the HP MCS G2 in locations not originally intended for IT equipment or in older data centers not designed for high-density, high-kilowatt systems.

HP MCS chilled water requirements

There are three potential sources of chilled water for the HP MCS G2:
- Direct connection to the building's chilled water system
- A dedicated chilled water system
- A water-to-water heat exchanger unit connected to a chilled water system or building water system

The HP MCS G2 connects directly to the facility's chilled water supply. When it is necessary to isolate the fluid supply/return loop from the main building water system, we recommend using a separate water-to-water heat exchanger. The heat exchanger provides easier control and monitoring of water quality. It also lets you maintain a higher water temperature to reduce condensation.
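The condensation point mentioned above can be checked against the room dew point: if the water supply temperature is at or below the dew point, piping and coil surfaces will sweat. The sketch below uses the standard Magnus approximation for dew point; the example room conditions and water temperature are assumptions for illustration, not HP requirements.

```python
# Condensation check for a chilled-water loop: water at or below the room dew point
# will cause condensation on piping and coils. Dew point via the Magnus approximation.

import math

def dew_point_c(dry_bulb_c: float, relative_humidity_pct: float) -> float:
    """Approximate dew point temperature using the Magnus formula."""
    a, b = 17.62, 243.12
    gamma = (a * dry_bulb_c) / (b + dry_bulb_c) + math.log(relative_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

def condensation_risk(water_supply_c: float, room_dry_bulb_c: float, room_rh_pct: float) -> bool:
    return water_supply_c <= dew_point_c(room_dry_bulb_c, room_rh_pct)

# At 24 degC and 50% RH the dew point is about 12.9 degC, so a 10 degC water supply
# would sweat; raising the supply temperature above ~13 degC avoids condensation.
print(dew_point_c(24.0, 50.0), condensation_risk(10.0, 24.0, 50.0))
```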

HP MCS cooling requirements

The HP MCS G2 adds minimal heat to the room because most of the heat generated inside the cabinet is removed through the chilled water loop. It exchanges a small amount of air with the room during normal operation. Depending on the room temperature, rack power consumption, and system configuration, up to 10 percent of the total internal heat load may pass into the room. For a fully loaded 35 kW rack, for example, that is up to 3.5 kW (roughly 12,000 BTU/hr, or about one ton of cooling) that the room-level system must still handle.

In anticipation of future heat loads, isolated-loop chilled water piping should be designed and installed to support the following:
- Specific heat load increments of 17.5 or 35 kW
- The specific number of HP Modular Cooling Systems per row or loop
- Other site build-out planning parameters

As cooling, rack space, and equipment density requirements increase, additional HP MCS G2 systems can be quick-coupled into the isolated chilled water system. For detailed information, refer to the links provided in the "For more information" section at the end of this document.

HP Performance-Optimized Datacenter

The HP Performance-Optimized Datacenter (POD) (Figure 11) is a containerized data center with power, cooling, and IT equipment preinstalled. It has a complete closed-loop cooling system. HP PODs are available as 20-foot or 40-foot containers. The 20-foot HP POD 2000c can contain up to ten 50U, 19-inch industry-standard racks for a maximum power capacity of 290 kilowatts non-redundant or 145 kilowatts redundant.

Figure 11: HP POD 2000c

The 40-foot maximum-density HP POD 4000c can contain twenty-two 50U, 19-inch industry-standard racks for a maximum power capacity of 600 kilowatts non-redundant or 380 kilowatts redundant. With close-coupled temperature control, you can use higher chilled water supply temperatures, reducing the power consumed to produce the chilled water.
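Chilled-water capacity for close-coupled systems such as the MCS and POD follows the basic relationship Q = ṁ × cp × ΔT between heat load, water flow, and water temperature rise. The minimal Python sketch below applies it to the POD 2000c's 290 kW maximum load and the 120 gpm chilled-water flow listed in the requirements that follow; the computed temperature rise is an estimate for illustration, not an HP specification.

```python
# Rough chilled-water sizing check: Q = m_dot * cp * deltaT.
# 120 gpm and 290 kW come from the POD 2000c figures in the text; everything else
# (water properties, the implied temperature rise) is an illustrative estimate.

WATER_DENSITY_KG_PER_L = 1.0      # approximate
WATER_CP_J_PER_KG_K = 4186        # specific heat of water
LITERS_PER_GALLON = 3.785

def heat_removed_kw(flow_gpm: float, delta_t_c: float) -> float:
    """Heat removal (kW) for a given water flow and temperature rise."""
    kg_per_s = flow_gpm * LITERS_PER_GALLON / 60.0 * WATER_DENSITY_KG_PER_L
    return kg_per_s * WATER_CP_J_PER_KG_K * delta_t_c / 1000.0

def required_delta_t_c(flow_gpm: float, load_kw: float) -> float:
    """Water temperature rise needed to carry a given heat load at a given flow."""
    kg_per_s = flow_gpm * LITERS_PER_GALLON / 60.0 * WATER_DENSITY_KG_PER_L
    return load_kw * 1000.0 / (kg_per_s * WATER_CP_J_PER_KG_K)

if __name__ == "__main__":
    dt = required_delta_t_c(flow_gpm=120, load_kw=290)   # POD 2000c example
    print(f"Implied water temperature rise: {dt:.1f} degC ({dt * 1.8:.1f} degF)")
    print(f"Check: {heat_removed_kw(120, dt):.0f} kW removed at that rise")
```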

The HP POD supports HP or any third-party IT equipment that conforms to 19-inch industry-standard mounting and front-to-back air flow. The only requirements at the installation location are as follows:
- Utility power: 2 x 480 V/3-phase delta, 400 A, 50-60 Hz
- Communications
- Chilled water:
  - Flow rate: 120 gpm (HP POD 2000c); 240 gpm (HP POD 4000c)
  - Inlet temperature: 55°F to 75°F (12°C to 24°C)

HP PODs can be installed inside existing buildings or outdoors. They can be used to expand an existing data center or can become the basis for a new one.

Choosing the best cooling strategy

We do not recommend a single cooling strategy as the best for all data center environments. The optimum cooling strategy depends on factors such as facility characteristics, room layout, power consumed per rack, and server density.

Cooling decisions based on facility characteristics

Table 1 lists facility characteristics and the cooling strategies most easily adapted to each.

Table 1. Cooling strategies based on facility characteristics

Facility characteristics              Cooling strategy most easily adapted
No raised floor or dropped ceiling    Free cooling or closed-loop system
Raised floor only                     Cold-aisle containment
Dropped ceiling only                  Hot-aisle containment
Raised floor and dropped ceiling      Depends on room layout of ITE racks/rows

Cooling decisions based on room layout

If the facility characteristics allow several cooling options, the rack layout can suggest which strategy to use. Figure 12 compares two data center layouts, each with four rows of racks. Figure 12A shows two pairs of rows, with each pair facing each other, creating two cold aisles and three hot aisles. This layout lends itself to a cold-aisle containment strategy, since only two aisles require containment. Figure 12B shows a layout with only one pair of rows facing each other, resulting in three cold aisles and two hot aisles. In this case, a hot-aisle containment strategy might be preferred, since only two aisles require containment.

Figure 12: Cooling strategies based on room layout (top view). A: Layout suggesting cold-aisle containment. B: Layout suggesting hot-aisle containment.

Cooling decisions based on server density/power per rack

Your room characteristics and data center layout can suggest a specific cooling strategy, but equipment density and power consumption ultimately determine the best choice. We can make some general assumptions, illustrated in the sketch that follows this list:
- Traditional data center cooling is adequate for racks using up to 10 kW.
- Racks using 15 kW or more will likely require some form of containment strategy.
- Closed-loop cooling accommodates the widest range of server/power densities.
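Putting Table 1 and these rules of thumb together, strategy selection can be sketched as a simple decision function. The 10 kW and 15 kW thresholds and the facility mapping come from this brief; the function itself, its tie-breaking behavior, and the "supplemented" middle band are illustrative assumptions, not an HP sizing tool.

```python
# Illustrative cooling-strategy selector combining the per-rack power guidelines
# with the facility characteristics from Table 1. A sketch, not a sizing tool.

def suggest_cooling_strategy(kw_per_rack: float,
                             raised_floor: bool,
                             dropped_ceiling: bool) -> str:
    if kw_per_rack <= 10:
        return "traditional open-area cooling"
    if kw_per_rack < 15:
        return "supplemented open-area cooling (spot cooling, airflow management)"
    # 15 kW or more: some form of containment, chosen from what the facility supports
    if raised_floor and not dropped_ceiling:
        return "cold-aisle containment"
    if dropped_ceiling and not raised_floor:
        return "hot-aisle containment"
    if raised_floor and dropped_ceiling:
        return "cold- or hot-aisle containment, depending on row layout"
    return "free cooling or closed-loop cooling (no raised floor or dropped ceiling)"

print(suggest_cooling_strategy(20, raised_floor=True, dropped_ceiling=False))
```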

Managing data center cooling

High-density data centers require close monitoring and management to achieve and maintain efficient and economical cooling. We offer two solutions, HP Systems Insight Manager and HP Environmental Edge, to manage and monitor cooling status. These tools can be used separately or together as a comprehensive management solution.

Managing ITE cooling with HP Systems Insight Manager

HP Systems Insight Manager (SIM), a total management solution for data centers, lets you closely monitor the thermal status of all racks by monitoring sensors within the racks and server chassis. The resulting thermal status display (Figure 13) lets you identify possible cooling issues such as hot spots. You can then take action such as adjusting the cooling system or, much more conveniently, using HP SIM to control power consumption of servers in a specific location.

Figure 13: HP SIM screen showing the thermal status of IT racks

Managing the data center environment with HP Environmental Edge

HP Environmental Edge technology helps you monitor two critical areas of the data center: power and cooling. Using sensors with wireless radio modules and software running on an Environmental Edge server, HP Environmental Edge measures, compiles, and analyzes data center power and cooling data. By presenting a thermal map of the data center environment (Figure 14), HP Environmental Edge technology lets you adjust for potential infrastructure issues to make the most of data center capacity and reduce cooling costs.

Figure 14: HP Environmental Edge screen showing the thermal map of a data center

Conclusion

As data centers require more efficient cooling solutions, we at HP work diligently to provide solutions to meet current and future needs. Since conditions vary from one data center to another, we provide a full range of solutions that you can choose from to cool your computing environment efficiently.

For more information

For additional information, refer to the resources listed below.

HP Rack and Rack Options
http://h18004.www1.hp.com/products/servers/proliantstorage/racks/index.html

HP 10000 G2 Rack Air Duct Kit
http://h18004.www1.hp.com/products/servers/proliantstorage/rack-options/rack-air-duct/index.html

HP Modular Cooling System
http://h10010.www1.hp.com/wwpc/us/en/sm/wf05a/3447589-3447589-3446285-3446371-3461917-3657806.html

HP Performance Optimized Data Center
http://h18004.www1.hp.com/products/servers/solutions/datacentersolutions/pod/index.html?jumpid=reg_r1002_usen

HP Systems Insight Manager
http://h18013.www1.hp.com/products/servers/management/hpsim/index.html?jumpid=go/hpsim

HP Environmental Edge
http://h18004.www1.hp.com/products/servers/solutions/datacentersolutions/environmentaledge/index.html?jumpid=reg_r1002_USEN

Call to action

Send comments about this paper to TechCom@HP.com

Copyright 2010 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice. The only warranties for HP products and services are set forth in the express warranty statements accompanying such products and services. Nothing herein should be construed as constituting an additional warranty. HP shall not be liable for technical or editorial errors or omissions contained herein.

TC100904TB, September 2010