
Data Center Operating Temperature: The Sweet Spot

A Dell Technical White Paper
Dell Data Center Infrastructure
David L. Moss

THIS WHITE PAPER IS FOR INFORMATIONAL PURPOSES ONLY, AND MAY CONTAIN TYPOGRAPHICAL ERRORS AND TECHNICAL INACCURACIES. THE CONTENT IS PROVIDED "AS IS", WITHOUT EXPRESS OR IMPLIED WARRANTIES OF ANY KIND.

© 2011 Dell Inc. All rights reserved. Reproduction of this material in any manner whatsoever without the express written permission of Dell Inc. is strictly forbidden. For more information, contact Dell. Dell, the DELL logo, and the DELL badge are trademarks of Dell Inc. APC is a registered trademark of American Power Conversion Corporation. Other trademarks and trade names may be used in this document to refer to either the entities claiming the marks and names or their products. Dell Inc. disclaims any proprietary interest in trademarks and trade names other than its own.

June 2011

Introduction

Customers frequently ask Dell, "What is the best temperature to run your servers?" When we first published our conclusions in 2009, we found that typical facilities conserve the most energy when running between 75°F and 80°F (24°C and 27°C). With improvements in IT equipment design, recent analysis suggests the range may have shifted a few degrees higher, warranting an update to our original paper.[1]

The original paper focused solely on compressor-based facilities. With increasing interest in economizers, this new paper also discusses a sweet spot operating temperature, or the lack thereof, for non-compressor-based facilities.

Operating temperature has been discussed at length in both The Green Grid and the ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers) datacom technical group. ASHRAE has recently raised its recommended operating temperatures by several degrees and has created new IT equipment classes specifically intended for chiller-less operation.[2]

Our initial theory was that temperature-induced increases in IT fan power might negate the facility energy advantages of running compressor-based cooling at a higher temperature set point. With this theory in mind, we approached our colleagues at APC (American Power Conversion) to engage in the discussion. What we found confirmed our assumption: there is a sweet spot operating temperature for compressor-based cooling in your data center.

Data Center Temperature

Air conditioning systems are commonly controlled by monitoring and maintaining a specific return air temperature, so discussions of the room temperature or set point are often really about the return temperature. That is not the temperature that is critical for IT equipment. The only temperature that matters to the IT equipment is its inlet temperature. To be perfectly clear, when we discuss the ideal operating temperature, we are referring to the temperature at the inlet of the IT equipment.

Hypothesis

IT equipment is actually quite efficient at exhausting its own heat when delivered a low-to-moderate inlet temperature. At temperatures approaching its 95°F (35°C) design limit, internal fans spin fast and move large amounts of air to maintain acceptable component temperatures. At a moderate 75°F (24°C), the fans spin down, moving relatively little air and consuming little energy. IT fan speeds generally bottom out around 75°F (24°C); without selectively turning fans off, air consumption is about as low as it will go at this moderate temperature. When this paper was originally published, a bank of fans consumed about 10 watts at moderate temperatures and spun up to five times that power or more to cool at the highest allowable temperature. Today's Dell servers have improved on the low end and drastically reduced fan power at the upper end: a typical bank of fans now uses about 8 watts at moderate temperatures and merely doubles that power when spinning up to cool at 95°F (35°C).
[1] Data Center Operating Temperature: What Does Dell Recommend?; David Moss; 2009: http://content.dell.com/us/en/corp/d/business~solutions~whitepapers~en/documents~dci-data-center-Operating-Temperature-Dell-Recommendation.pdf.aspx
[2] 2011 Thermal Guidelines for Data Processing Environments: Expanded Data Center Classes and Usage Guidance; ASHRAE TC 9.9; 2011: http://tc99.ashraetcs.org/documents/ashrae%20whitepaper%20-%202011%20thermal%20Guidelines%20for%20Data%20Processing%20Environments.pdf
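The fan behavior described above can be captured in a small model. The sketch below is illustrative only: the endpoint figures (about 8 W per fan bank at moderate inlet temperatures, roughly doubling by 95°F/35°C) come from this paper, but the linear ramp between them and all names are our own assumptions, not Dell firmware behavior.

```python
# Illustrative fan-power curve for one server fan bank. The endpoints
# follow the figures quoted in this paper; the linear ramp is assumed.

def fan_bank_power_watts(inlet_f: float,
                         idle_watts: float = 8.0,    # ~8 W at or below ~75 F
                         max_watts: float = 16.0,    # ~2x at the 95 F limit
                         ramp_start_f: float = 75.0,
                         ramp_end_f: float = 95.0) -> float:
    """Approximate fan-bank power draw at a given inlet temperature."""
    if inlet_f <= ramp_start_f:
        return idle_watts                 # fan speeds have bottomed out
    if inlet_f >= ramp_end_f:
        return max_watts                  # fans at full speed
    frac = (inlet_f - ramp_start_f) / (ramp_end_f - ramp_start_f)
    return idle_watts + frac * (max_watts - idle_watts)

for t in (70, 75, 80, 85, 90, 95):
    print(f"{t} F -> {fan_bank_power_watts(t):.1f} W")
```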

Prior to this experiment, we hypothesized a temperature point somewhere between the moderate 75°F (24°C) and the high 95°F (35°C) at which the increase in fan energy might exceed the energy saved in the chilling process. We set out to test this assumption.

Initial Testing

Dell conducted testing in our controlled thermal chambers, measuring energy and airflow consumption for several types of servers over a broad range of inlet temperatures. The combined power measurements are shown in Figure 1.

Figure 1. Composite Server Power

To explore the connection between server power and cooling power, APC also conducted a set of tests.

APC Facility Testing

The measured server characteristics were recreated in an APC test facility capable of measuring facility power and cooling energy use.[3] APC sectioned off a portion of their data center test facility to isolate the test from the remainder of the room, essentially building a little room within the lab. Racked simulator units reproduced the power and airflow characteristics of the servers, and the cooling supplied to the test room was adjusted to match. Chilled water and supply air temperatures were varied to deliver a range of server inlet temperatures similar to those in the server tests run at Dell. As expected, facility cooling energy decreased as the operating temperature increased, but IT power began to rise at a temperature in the mid-70s (°F). When combined, the total energy (IT power plus cooling) hit a minimum between 75°F and 80°F (24°C and 27°C), as shown in Figure 2 and Figure 3.

[3] Energy Impact of Increased Server Inlet Temperature; David Moss, John H. Bean, Jr.; 2008: http://content.dell.com/us/en/enterprise/d/business~solutions~whitepapers~en/documents~dci-energy-impactof-increased-inlet-temp.pdf.aspx

Figure 2. Comparison of IT Power and Cooling Power

Graphed over the wide power range, the trend is subtle. When isolated to the range of the combined load, the trend becomes clearer, as shown in Figure 3.

Figure 3. IT Power Added to Facility Cooling Power

Although not specifically graphed, the fans in the air handling units also sped up as the IT fans spun up. Looking at the combination of diminishing cooling power and increasing IT and facility fan power in Figure 3, a range of lowest energy use becomes evident: the optimum falls within 75°F to 79°F (24°C to 26°C). The IT equipment used in these tests is now more than four years old; superimposing current-generation hardware data suggests that improved server thermal design may have shifted the sweet spot to just above 80°F (27°C).
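The minimization behind Figure 3 can be sketched in a few lines. Both curves below are invented for illustration and are not the measured Dell or APC data: IT power is modeled as flat until about 75°F with a cubic fan ramp above it (fan power scales roughly with the cube of fan speed), and compressor cooling power falls off with diminishing returns as the set point rises. With these made-up coefficients, the minimum happens to land just above 80°F, consistent with the updated estimate above.

```python
# Toy sweet-spot model: combined IT + cooling power vs. inlet temperature.
# Shapes and coefficients are illustrative assumptions, not test data.

def it_power_kw(t_f: float) -> float:
    """IT load: flat at low inlet temps, cubic fan ramp above ~75 F."""
    return 100.0 + 0.002 * max(0.0, t_f - 75.0) ** 3

def cooling_power_kw(t_f: float) -> float:
    """Compressor cooling: big savings early, flattening at higher set points."""
    return 20.0 + 500.0 / (t_f - 40.0)

temps = [55.0 + 0.5 * i for i in range(81)]          # 55.0 F .. 95.0 F
total = {t: it_power_kw(t) + cooling_power_kw(t) for t in temps}
sweet_spot = min(total, key=total.get)
print(f"Minimum combined power near {sweet_spot:.1f} F "
      f"({total[sweet_spot]:.1f} kW)")
```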

Economized Facilities

Figure 2 suggests a substantial savings in facility cooling energy when shifting IT operating temperature from a cold 55°F (13°C) to a moderate 75°F (24°C) or 80°F (27°C), but the remaining cooling energy is still significant. In an economized facility, even more energy can be saved. In a compressor-based facility there is energy associated with moving air and moving fluids, but the largest share is attributed to the refrigeration cycle. An economizer bypasses the refrigeration cycle and makes use of cooler outside air. We will concentrate solely on air economizers in this discussion.

An air economizer is typically designed into the facility up front. It is a system of controls, dampers, and fans that switches the facility, entirely or in part, from compressor-based cooling to bringing in fresh air to cool the IT equipment. Compressor energy goes away, and facility fan energy typically increases in order to draw in and exhaust the fresh air. For the same IT inlet temperature, energy use during economization is a step function lower than that of compressor-based cooling, even at its optimal point.

Just as in the compressor-based facility, as IT inlet temperatures rise past the mid-70s (°F), IT fan power starts increasing, and the facility's air complement must grow as well. In theory, there is an operating point, similar to the sweet spot, at which the combined fan power of the IT equipment and the facility rises to a level comparable to running the chiller at moderate temperatures. If the data from the APC test is an indicator, this extrapolated point would probably occur somewhere above 100°F (38°C), and it would only be reached if IT fan speeds continued to increase at that high temperature. Some IT equipment may hit maximum fan speed before reaching this crossover temperature, so you might not see it happen with today's 95°F (35°C) equipment. You might hit the crossover point with tomorrow's equipment, which complies with the new ASHRAE classifications for extended environments. Therefore, there really is no sweet spot temperature while economizing; just follow the ambient up to the temperature you are comfortable with.

Most economized facilities are not that aggressive, however; they have chillers and tend to operate within ASHRAE's recommended range, but they can still be quite efficient. Economizing to the upper end of ASHRAE's recommendation of 81°F (27°C) would yield economization for more than 90% of the year in many locations. That is good coverage, and there is still more room for economization between 81°F (27°C) and the upper allowable IT limit.

Cold outside temperatures are generally not an issue in an economized facility. The economizer system is typically built so that incoming cold air can be mixed, in a controlled way, with IT equipment exhaust to raise the temperature of the air delivered back to the IT equipment. This is done for several reasons. The comfort issue is obvious: freezing outside air can be mixed up to a comfortable level by reusing the IT equipment's waste heat. Humidity effects can be minimized by not bringing in as much outside air. Finally, you can actually lower facility fan energy by remixing: because economizer operation requires extra fan energy to pull air into and exhaust it from the facility, the more you mix, the less air must be drawn in from outside.
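The remixing arithmetic is a simple weighted average. The sketch below assumes the outside and exhaust streams mix mass-weighted with equal density and specific heat; the example temperatures are illustrative, not taken from the paper.

```python
# Cold-weather mixing: what fraction of the supply stream must be fresh
# outside air to hit a target supply temperature? Derived from
#   t_supply = f * t_outside + (1 - f) * t_exhaust
# under the equal-density, equal-specific-heat assumption.

def outside_air_fraction(t_outside_f: float,
                         t_exhaust_f: float,
                         t_supply_f: float) -> float:
    """Fraction of supply air drawn from outside; the rest is
    recirculated IT exhaust."""
    return (t_exhaust_f - t_supply_f) / (t_exhaust_f - t_outside_f)

# Example: 20 F outside, 95 F IT exhaust, 70 F target supply.
frac = outside_air_fraction(20.0, 95.0, 70.0)
print(f"Outside-air fraction: {frac:.0%}")   # ~33% fresh, ~67% recirculated
```

The lower this fraction, the less air the economizer fans must pull in and exhaust, which is the fan-energy benefit of remixing noted above.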
It might be tempting to mix when it is warm outside to lower this fan energy, but if the IT equipment fans have to spin up, the facility is burdened with producing more air, and both the facility fans and the economizer fans must work harder as well. The end result would be higher energy use and an extremely hot data center.
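Before turning to chiller-less operation, it may help to see how annual-coverage figures like the "more than 90% of the year" estimate above are produced: by counting hours in a site's weather record. The sketch below runs that count against a crude synthetic sine-wave year; the 60°F mean and swing amplitudes are invented, and a real study would use measured or TMY hourly weather data for the site.

```python
import math

# Synthetic year of hourly dry-bulb temperatures for a moderate climate:
# 60 F mean, +/-25 F seasonal swing, +/-10 F diurnal swing (all invented).
def synthetic_hourly_temps_f():
    for hour in range(8760):
        seasonal = 25.0 * math.sin(2.0 * math.pi * hour / 8760.0)
        diurnal = 10.0 * math.sin(2.0 * math.pi * (hour % 24) / 24.0)
        yield 60.0 + seasonal + diurnal

def economizer_coverage(limit_f: float) -> float:
    """Fraction of the year outside air alone stays at or below limit_f."""
    temps = list(synthetic_hourly_temps_f())
    return sum(t <= limit_f for t in temps) / len(temps)

for limit_f in (81.0, 95.0):
    print(f"Coverage at a {limit_f:.0f} F limit: "
          f"{economizer_coverage(limit_f):.1%}")
```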

There is a small but growing interest in chiller-less facilities. It is not hard to see that if 81°F (27°C) operation yields free cooling around 90% of the time, it would be desirable to extend that operation to the last 10% and avoid the capital cost of compressor-based cooling altogether. There are moderate climates in which you could capture that last 10% with current IT equipment designed to operate at 95°F (35°C). The opportunities would open up substantially if equipment could sustain operation in locales that cannot quite manage with 95°F (35°C) equipment, or that cannot handle the occasional hot year in an otherwise moderate climate.

There is likely a point at which a chiller-less facility with no additional means of cooling uses more energy than the same facility running a chiller; we think that point is above 100°F (38°C). Those operating chiller-less facilities do so primarily for the CapEx advantage, not for the opportunity to squeeze out the last few hours of cheaper cooling. Even if the crossover point is reached and the facility truly uses more energy turning fans than it would running a chiller, the tradeoff amounts to hundreds (or low thousands) of US dollars per MW of IT in missed OpEx, in exchange for saving millions per MW of CapEx on a chiller plant.

If you do not have an economizer in your current facility, you may in your next one. In 2010, ASHRAE Standard 90.1 removed the exemption that shielded data centers from many mandatory energy-saving requirements, including economizer use. As soon as local codes adopt the 2010 version, you may be required to install an economizer in your next facility or in a significant retrofit of a current one.

When it comes to economizers, you are likely to fall into one of these four groups:

- You plan to fully embrace a chiller-less strategy, including:
  o Enjoying large CapEx savings (not to mention maintenance savings as well)
  o Contending with the contamination potential (not overly difficult)
- You are unable to go completely chiller-less, but you decide to include a small amount of chilling capability, allowing you to:
  o Take the edge off the hottest of days
  o Close up your facility if necessary (for example, in an extreme, unexpected contamination event)
- You are a typical economizer user, who:
  o Operates conservatively
  o Still enjoys free cooling for the vast majority of the year, if operating up to ASHRAE's recommended limit
  o May experiment with going beyond ASHRAE's recommended limit (still saving more energy than turning on the chiller)
- You have a traditional data center and hopefully will not face a mandate for economizers, but you can still greatly improve energy use by operating at the sweet spot.

Summary

When using compressor-based means to cool a data center, the highest operating temperature may not be the most efficient. The test facility used in this paper showed that above a certain temperature, the incremental increase in IT power grew faster than the decrease in energy expended in the cooling process. Depending upon your IT equipment, the ideal operating temperature is somewhere in the upper 70s or lower 80s (°F).

No open data center is without some variance in delivered temperatures. Whether you are economizing or not, Dell highly recommends that you consider some form of containment to create the temperature consistency necessary to allow higher facility set points. If you do have an economizer, feel free to use it and ride the temperature up. You are only limited by your least-tolerant IT equipment.