Energy Impact of Increased Server Inlet Temperature




By David Moss, Dell, Inc., and John H. Bean, Jr., APC
White Paper #138

Executive Summary

The quest for efficiency improvement raises questions regarding the optimal air temperature for data centers. The ASHRAE TC-9.9 committee has recently adopted an extension of the recommended thermal envelope for server inlet temperature and humidity. A popular hypothesis suggests that total energy demand should diminish as server inlet temperatures increase. This paper tests that hypothesis by developing a composite power consumption baseline for a mixture of servers as a function of inlet temperature and applying this data to a variety of cooling architectures. The goal is to find the optimal temperature range where the combined IT and cooling load is minimized. The data presented is based on actual scaled testing of different cooling systems subjected to the simulated composite server behavior. The testing revealed a complex interaction between server power and total data center energy consumption in which energy savings are realized within a temperature sweet spot. This sweet spot varies by equipment, containment solution, and other factors.

Introduction

Recent pressures on power availability and energy cost have placed a new emphasis on data center efficiency. In response to growing data center energy consumption, the ASHRAE TC9.9 committee has developed an expanded recommended thermal operating envelope for data centers. It is anticipated that increasing data center temperatures will reduce the amount of energy needed to transport heat out of the building, thereby allowing a reduction in the energy consumed by the data center cooling infrastructure. However, the dynamic nature of IT equipment cooling fans may diminish or even negate the cooling system gains. Server fans typically respond with increased airflow as the inlet temperature to the server reaches or exceeds 25 C (77 F), consequently increasing server energy consumption. This paper explores the power demand reduction in the cooling infrastructure against the potential increase in IT power. The metric of interest is the net effect on total power, including both the cooling infrastructure and the server.

Server Baseline

Testing for the server / IT equipment was performed in the Dell server thermal lab in Round Rock, Texas. An Instek model GPM-8212 power meter was used for all power measurements, which were made in a large thermal chamber (Figure 1) capable of producing constant temperatures over a wide range. The IT equipment was powered at low line (110 volts), so the measured power values are higher than they would have been at high line (208 volts). A mixture of IT server equipment was used, each type exhibiting different cooling fan behavior.

Figure 1 Server chamber

IT server fan behavior typically falls into three categories. Some products have constant-speed fans, so system power varies little over a range of inlet temperatures. Other systems vary flow rates either in discrete steps or as a continuously variable response over a range of ambient temperatures, configurations, and/or utilization.
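The fan-behavior categories described above can be sketched as simple airflow-response curves. This is an illustrative model only; the step points and slopes below are assumptions chosen for the example, not measured vendor behavior (the 25 C threshold echoes the typical response point noted above).

```python
def constant_fan(inlet_c: float) -> float:
    """Constant-speed fans: airflow (% of max) never changes with inlet temp."""
    return 60.0

def stepped_fan(inlet_c: float) -> float:
    """Discrete-step control: airflow rises in stages as the inlet warms.
    Step temperatures and levels are assumed values for illustration."""
    if inlet_c < 25:
        return 40.0
    elif inlet_c < 30:
        return 70.0
    return 100.0

def variable_fan(inlet_c: float) -> float:
    """Continuously variable control: smooth ramp above a threshold.
    The 6 %-per-degree slope is an assumption for illustration."""
    if inlet_c <= 25:
        return 40.0
    return min(100.0, 40.0 + 6.0 * (inlet_c - 25.0))

for t in (20, 27, 35):
    print(t, constant_fan(t), stepped_fan(t), variable_fan(t))
```

The server power consequences follow directly: the constant-speed profile yields nearly flat power versus inlet temperature, while the stepped and variable profiles drive power up as fans spin faster.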
Three different server systems were chosen to model the mixed IT behavior of a typical facility. Although these systems were chosen for expediency, they represent a good variance in the types of fan control. Of the three systems, two were relatively lightly configured and were monitored during a nominal stress test. The third was much more heavily configured and running a cluster-type benchmark. For this reason, the base power level for this system, Server L, is much higher than for the other two.

Type L server behavior

Server L exhibits a smooth power response to increasing ambient temperature, as shown in Figure 2. This response is largely due to a continuously variable fan speed response. The power consumption increased more than 80 watts over the temperature window from 17 C (62 F) to 35 C (95 F). This server type was also tested with the fan speeds held constant, resulting in system power increases of less than 0.9 watts per C (0.5 watts per F). This additional power is presumably due to increases in semiconductor leakage current. The Type L system was heavily configured and running a high-stress load. It is typical of a mainstream tier 1 server with elaborate algorithms to control fan speed.

Figure 2 Type "L" server power vs. inlet temperature

Type S server behavior

Server S exhibits a stepwise power increase over the same inlet temperature range, which is typical of products with somewhat less sophisticated fan control algorithms. This system had a typical configuration with a moderate load (Figure 3).

Figure 3 Type "S" server power vs. inlet temperature
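The fixed-fan measurement allows a rough split of Server L's power rise into leakage and fan contributions. A back-of-the-envelope sketch using only the figures quoted above, treating "more than 80 watts" and "less than 0.9 watts per C" as point values:

```python
T_LOW_C, T_HIGH_C = 17.0, 35.0       # test window from the text
TOTAL_RISE_W = 80.0                  # "more than 80 watts" with fans active
LEAKAGE_SLOPE_W_PER_C = 0.9          # "less than 0.9 watts per C" with fans pinned

# Upper bound on the leakage-driven rise over the full window.
leakage_rise_w = LEAKAGE_SLOPE_W_PER_C * (T_HIGH_C - T_LOW_C)

# Whatever remains of the total rise must come from fan work.
fan_rise_w = TOTAL_RISE_W - leakage_rise_w

print(f"leakage-driven rise: <= {leakage_rise_w:.1f} W")
print(f"fan-driven rise:     >= {fan_rise_w:.1f} W")
```

By this estimate roughly 16 W of the rise is attributable to leakage and at least 64 W to fan power, consistent with the paper's point that fan behavior dominates the IT response to temperature.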

Type X server behavior

Server X is a system with constant-speed fans. The variance seen here is almost within measurement uncertainty. The general trend of slightly increasing power in Figure 4 may also be due to increases in leakage current at higher temperatures. This system, typical of many white-box servers, exhibits near-constant power over a range of inlet temperatures. It is actually a chassis containing multiple servers, so the near-constant 202 watts is representative of a single system. This system had a typical configuration with a light load.

Figure 4 Type "X" server power vs. inlet temperature

Composite server behavior

The three types of systems were combined in a manner representing a typical mix of the three types of equipment. Since the goal of this study was to find the optimal temperature range where the combined IT and cooling power consumption is minimized, it is important to portray a typical mix of constant versus varying power equipment. The mix of L systems was arbitrarily set to 50% since it represents behavior typical of a tier 1 server. The other two systems were set at 25% each. When combined proportionally as just described, the composite IT response to temperature is as shown in Figure 5. This behavior was replicated by heat / airflow load simulators at APC for the facility portion of the tests.
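The 50/25/25 blend amounts to a weighted average of the three per-type power curves. The curves below are rough stand-ins for illustration: Type L is a linear ramp through the approximate range shown in Figure 2, Type S uses hypothetical step values (the paper does not tabulate them), and Type X is held at the 202 watts cited above.

```python
def power_L(t_c: float) -> float:
    """Continuously variable fans: linear ramp over the roughly
    360-440 W span of Figure 2 (approximate values)."""
    return 360.0 + (440.0 - 360.0) * (t_c - 17.0) / (35.0 - 17.0)

def power_S(t_c: float) -> float:
    """Stepped fans: hypothetical step values for illustration."""
    return 250.0 if t_c < 25.0 else 280.0

def power_X(t_c: float) -> float:
    """Constant-speed fans: near-constant 202 W per the text."""
    return 202.0

# 50% Type L, 25% Type S, 25% Type X, as described above.
MIX = [(power_L, 0.50), (power_S, 0.25), (power_X, 0.25)]

def composite_power(t_c: float) -> float:
    """Weighted per-server average power for the mixed population."""
    return sum(f(t_c) * w for f, w in MIX)

print(composite_power(17.0), composite_power(35.0))
```

With these stand-in curves the composite rises from about 293 W to about 340 W per server across the test window, showing how the Type L behavior dominates the blended response.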

Figure 5 Composite server power vs. inlet temperature

Cooling Architectures Considered

This study considered three distinctive row-level cooling methods: two chilled water (CW) versions and one integral compressor direct expansion (DX) cooling system. The CW solution was investigated in both an open-row configuration and a Hot Aisle Containment System (HACS). The three cooling configurations considered in this paper are:

1. CW InRow RC cooling w/ HACS (delta T and supply temp control)
2. CW InRow RC cooling, open row (rack inlet and supply temp control)
3. DX InRow RD 10kW cooling w/ HACS (delta T and supply temp control)

Each configuration was tested using a quantity of three APC InRow cooling modules with a nominal cooling capacity of 20kW each for the CW RCs (ACRC100) and 11kW each for the DX RDs (ACRD101). Each cooling module has the form factor of a standard IT rack at half the nominal width. This physical configuration allows for easy integration into the IT row environment. The IT load for CW RC testing was varied from 28kW at the lower ambient to 32kW at the upper ambient. The upper IT load for the DX RD 10kW was limited to 30kW. The scale of this testing was limited to a subset of a typical data center, but it is representative of actual large-scale implementations. The server simulators were adjusted to mimic the composite response, in power and airflow, of the three server classes benchmarked at the various operating temperatures. Hot Aisle Containment System (HACS) testing was conducted with a fan speed preference of medium, maintaining a constant temperature delta of 11 C (20 F) across the cooling modules. Cold aisle temperatures for each of the three test scenarios were 17, 18, 21, 23, 25, 27, 29, and 31 C (63, 65, 70, 73, 77, 81, 84, and 88 F), with chilled water supply temperatures of 4, 6, 8, 10, 12, 14, 16, and 18 C (39, 43, 46, 50, 54, 57, 61, and 64 F) respectively for the chilled water cooling solutions. The chilled water

temperature was adjusted to maintain a similar temperature difference between the supply chilled water and the supply airflow, allowing chiller plant efficiency gains at higher cold aisle temperatures. The water-cooled DX system had its condenser water temperature low enough to allow the compressor discharge pressure to be held within the intended operating range. Each test condition was run for a period of 2.5 to 4 hours after stabilization, with energy consumption normalized over this period to kW-hr/hr. The cluster of racks and cooling modules was thermally isolated from the larger room environment by means of a temporary insulated wall system that ran full height from floor to ceiling. The surrounding room dewpoint temperature was set to 13 C (56 F) with a tolerance of ±2.8 C (5 F) for all testing, and the room was held at a dry-bulb temperature of 22 C ± 1.7 C (72 F ± 3 F). A separate air handler with dehumidification / humidification was used for this purpose. Power consumption of the air handler used for dewpoint and surrounding room temperature control was neither monitored nor considered as part of the evaluation. During the execution of the three test scenarios the cooling coils remained dry, producing no measurable condensate. The composite server load for power and airflow over the ambient range was mimicked by a quantity of eight (8) IT server simulators. The server simulators are housed in 10U rack-mounted chassis with a maximum power dissipation of 5,750 watts and maximum airflow of 316 L/s (670 CFM) each. The simulators had calibrated airflow adjustment along with stepped control of power settings at an adjustment resolution of 250 watts. Figure 6 shows a typical installation of one of the server simulators. Blanking panels above the simulator were removed for the purpose of the photo; all empty U spaces had blanking panels installed during actual testing.

Figure 6 Server simulator

Figure 7 illustrates the power monitoring instrumentation used for this testing.
All measurements were captured using a Square D PowerLogic meter (model PM-210) with data collected over a Modbus interface. The power meter measurements captured included voltage, amps, power factor, and totalized energy consumption. The following points/loads were separately monitored for power/energy consumption throughout the test duration:

1. Chiller
2. InRow Cooler(s) DX
3. InRow Cooler(s) CW
4. IT Load
5. CW Pump

Figure 7 Test power measurement points

Test Results

Each of the three configurations previously mentioned had similar test sequences executed to evaluate electrical power/energy requirements at the eight different dry-bulb temperature test conditions. The InRow CW cooling solution had the supply chilled water temperature reset as a function of the cold aisle air temperature under evaluation. The data collected during execution of each test sequence was sampled at an interval of 30 seconds. All data was stored and imported into an Excel spreadsheet for analysis. The chilled water system testing was supported by a nonstandard chiller configuration. The chiller plant was significantly oversized for the load and was buffered by a large storage tank on the primary water loop. Full design chilled water flow was continuously circulated between the chiller and the storage tank. The primary tank and loop were held at a temperature several degrees Fahrenheit below the secondary loop. Secondary loop circulation was provided by a separate circulating pump with VFD control. This loop provided the nominal flow rate needed by the InRow cooling equipment. Temperature regulation of the secondary loop supply was accomplished by regulated mixing between the chiller primary loop and the secondary loop. Mixing was modulated by an electronically controlled actuator as needed to maintain the desired secondary loop supply chilled water temperature. This particular chiller configuration contributed to a reduction of the

coefficient of performance (COP) from what would typically be expected. While the COP was reduced, the resulting shape of the curve was not affected. The COP is defined as the cooling capacity provided divided by the power consumed by the cooling equipment. For example, a cooling capacity of 30kW provided while consuming 10kW of power (chiller, pumps, air handlers) would represent a COP of 3. Cooling systems with a higher COP are more efficient than systems with a lower value.

CW RC InRow with HACS

Figure 8 illustrates the second-order polynomial curve fits for the cooling energy consumption (blue) and total combined (IT + cooling) energy consumption (red) for RC CW InRow cooling with HACS. It is notable that the rate of reduction in cooling energy required as a function of dry-bulb temperature begins to taper off as the cold aisle temperature increases past 29 C (84 F). While the COP for the chiller continued to improve beyond this temperature, the total heat load increased as a function of increasing server airflow/power. The combined effect of the diminishing reduction in cooling energy needed and the increasing IT energy needed gives a minimal energy rate at 25 C to 27 C (77 F to 81 F). This suggests that total energy required does not continue to improve as a function of increased cold aisle setpoints beyond this temperature range for this particular system.

Figure 8 RC cooling solution total electrical demand (IT + cooling system) with containment
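The sweet-spot identification amounts to fitting a second-order polynomial to the total (IT + cooling) energy samples and locating its vertex. The sketch below uses hypothetical data points shaped like the Figure 8 total-energy curve, since the measured values are not tabulated in this paper:

```python
import numpy as np

# Hypothetical (temperature, total kW-hr/hr) samples shaped like the
# total-energy curve of Figure 8; not the actual measurements.
temps_c = np.array([17.0, 19.0, 21.0, 23.0, 25.0, 27.0, 29.0, 31.0])
total_kw = np.array([45.8, 44.9, 44.2, 43.7, 43.4, 43.3, 43.5, 44.0])

# Second-order polynomial fit, as used for the curves in Figures 8-10.
a, b, c = np.polyfit(temps_c, total_kw, 2)

# For an upward-opening parabola a*t^2 + b*t + c, the minimum sits at
# the vertex t = -b / (2a).
sweet_spot_c = -b / (2.0 * a)
print(f"minimum total energy near {sweet_spot_c:.1f} C")
```

With these sample points the vertex lands in the mid-20s C, illustrating how the fitted curve turns an eight-point test sequence into a single optimal-setpoint estimate.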

Chilled water InRow without containment

Figure 9 illustrates the second-order polynomial curve fits for the cooling energy consumption (blue) and total combined (IT + cooling) energy consumption (red) for RC CW InRow cooling in the open-row (no HACS) configuration. It is notable that the reduction in cooling energy required as a function of increasing cold aisle dry-bulb temperature reverses as the temperature increases past 29 C (84 F), at which point cooling energy actually begins to increase. While the COP for the chiller continued to improve, the total energy for the chiller, along with the row cooling energy, increased as a function of increasing server airflow and power, thereby requiring more energy per unit of time from the cooling solution at higher temperatures. The difference in behavior between the HACS solution and the open row (no containment) is attributed to a more aggressive cooler airflow response in the open-row solution. While the point of cooling system energy increase lagged the point of increase for total combined (IT + cooling) energy, the net effect is a rather steep increase in total combined energy at about 27 C (81 F). In this case total energy consumption is more or less equal between 33 C (92 F) and 19 C (67 F). The combined effect of the cooling energy needed and the increasing IT energy needed gives a minimal energy rate at 27 C (81 F). This suggests that total energy required would not continue to improve as a function of increased cold aisle setpoints beyond this temperature for this particular system.

Figure 9 RC cooling solution total electrical demand (IT + cooling system) no containment

DX InRow with HACS

Figure 10 illustrates the second-order polynomial curve fits for the cooling energy consumption (blue) and total combined (IT + cooling) energy consumption (red) for RD DX InRow cooling with HACS. In this scenario it is notable that the cooling COP is significantly better than for either chilled water solution. This is due to the nonstandard chiller configuration discussed at the beginning of the Test Results section. Consistent with the other contained hot aisle testing, the energy improvement of the cooling system began to diminish as the temperature increased above 29 C (84 F). Again, the row cooling had to increase cooler airflow to match the increased flow rate of the IT equipment. Also, because the cooling represented a smaller contribution to the total energy needed in this scenario, the point at which combined energy consumption increased occurred at a lower temperature. With this particular scenario the point of optimal efficiency, the minimum combined energy consumption, is at 24 C (76 F).

Figure 10 RD 10kW cooling solution total electrical demand (IT + cooling system) with containment

A note on choosing IT based on thermal performance

Since the energy associated with the cooling solution in most cases decreases with increasing operating temperature, it might be tempting to choose IT equipment that has very little variance in power at higher temperatures. But if the equipment flow rate doesn't scale up at high temperatures, it doesn't scale down at lower temperatures either. Fan selection is done to achieve proper internal temperatures at elevated ambient temperatures; the fans have to be capable of handling the upper end. Flow scales down at lower temperatures

because it doesn't need higher flow rates to achieve acceptable temperatures. A piece of IT equipment that doesn't scale cannot take advantage of low energy use at low temperatures. If at all possible, the IT vendor should give guidance on how much fan power the equipment uses over a broad range of temperatures.

Considerations for System Design

It is important to note that this study and analysis focus on the inlet air temperature to the IT equipment. For a row-based air conditioner this is nearly the same as the supply air temperature from the air conditioning system; however, for a typical room-based cooling system the supply air temperature will be much cooler than the inlet air temperature of the IT equipment. For example, in a typical room cooling system the supply temperature might be 13 C (55 F) while the IT inlet temperature is on the order of 21 C (70 F). This difference is due to air mixing in a room-based cooling system and represents a major negative impact on the efficiency and capacity of the air conditioner. Changes to the system that reduce air mixing, such as row-oriented cooling or containment, will reduce this temperature difference and will always improve cooling system efficiency, though not necessarily the efficiency of the facility. Besides identifying the point of minimal energy, the data presented in this paper suggests the relative magnitude of the energy advantage of operation at higher temperatures. It is important to note that not all data centers can immediately take advantage of increased operational temperatures. Historically, room-level cooling results in a disparity in delivered temperatures. Because of mixing, hot spots, where delivered temperatures are significantly above the supply temperature, are common among a small percentage of IT equipment. An attempt to raise the temperature can put the systems located in hot spots at risk.
This is typically much less of a challenge with row-level cooling and is typically eliminated when containment is applied to any cooling system. To take advantage of the energy benefits of increased-temperature operation, the data center may have to consider the type of cooling system and/or airflow management techniques such as containment.

Conclusions

Due to variable flow rates in IT equipment, there is truly a sweet spot of minimal energy use, and the answer is not necessarily that higher is better. In the three scenarios studied, the lowest energy use occurred between 24 C (76 F) and 27 C (81 F). The trigger in each case is the IT fan power increasing and exceeding the incremental decreases in the energy required to cool. Although not a comment about the sweet spot, it is obvious from comparing the total energy of Figures 8 and 9 that the addition of containment saved about 10%. The DX solution in this experiment actually ran more efficiently than the two chilled water tests due to the nonstandard chiller configuration discussed at the beginning of the Test Results section. With the

cooling solution playing less of a role in the overall energy use, its relative contribution tended to shift the sweet spot to a lower temperature, since the IT contribution was more predominant. Facilities with greater efficiencies would tend to follow this pattern. In the quest for the highest-efficiency cooling system, it is imperative to operate at the proper temperature. The pursuit of this goal should include consideration of the energy expended in the chilling process and the IT fan power, and it should also consider the best cooling solution and airflow management techniques to get there. Supply air temperatures should be raised, but only after considering the implications for every piece of IT equipment. If a data center is successful in raising its delivered temperatures, there are large savings in energy at the chiller, potential savings in moisture control, and a potential increase in the number of hours that economizer modes can be used.

About the authors

David Moss has more than 26 years of experience in electronics packaging, including 8 years in defense electronics and 18 years at Dell. He has designed products from notebooks to servers and holds more than 20 of Dell's patents, primarily in the area of thermal engineering. In the early 2000s he transitioned from server system architecture to data center strategy, and he currently works for the VP of Datacenter Infrastructure helping to define Dell's power/cooling strategy. He is a voting member of the ASHRAE datacom technical committee, TC9.9, and a participating member of The Green Grid. John Bean Jr. is the Director of Innovation for Racks Cooling Solutions at American Power Conversion. Previously John was the Worldwide Engineering Manager for Cooling Solutions at APC, developing several new product platforms and establishing engineering and laboratory facilities in both the USA and Denmark.
Before joining APC, John was the Engineering Manager for other companies involved in the development and manufacture of mission-critical cooling solutions.