How To Improve Energy Efficiency Through Raising Inlet Temperatures




Data Center Operating Cost Savings Realized by Air Flow Management and Increased Rack Inlet Temperatures

William Seeber, Stephen Seeber
Mid Atlantic Infrared Services, Inc., 5309 Mohican Road, Bethesda, MD 20816
301-320-2870, www.midatlanticinfrared.com

Saving energy has become a key goal of data center operators. Virtually all of the power consumed by computer room servers is converted to heat, which must be removed by Computer Room Air Conditioning (CRAC) units so that safe equipment temperature limits are not exceeded. Maximizing the efficiency with which heat is removed from the data center is therefore one of the critical steps to saving energy in data center operations.

This paper discusses the significant operating cost savings that can be realized simply by raising server rack inlet temperatures. However, raising rack inlet temperatures without adequate supply air flows will almost certainly lead to hot spots caused by recirculation. The paper illustrates why proper cooling air supply volumes are necessary for efficient server operation, describes both conventional and newly introduced technologies for measuring supply adequacy and establishing required cooling air volumes, and quantifies the energy and cost savings that can be realized through improved air flow and raised rack inlet temperatures.

Low Cost Efficiency Improvements Through Raised Inlet Temperatures

It is widely acknowledged that substantial energy savings can be achieved by raising server rack inlet temperatures. Note, however, that the total cooling load on the CRACs remains equal to the IT load regardless of the server inlet temperature: if a rack requires 9 kW for data processing operations, then 9 kW of heat must be removed, whatever the inlet temperature. The energy savings come from a different source. When server inlet temperatures rise, return air temperatures to the CRACs also rise.
The elevated return temperatures produce an increase in compressor or chiller Coefficient of Performance (COP). (COP is the amount of cooling produced by the compressor divided by the amount of power it consumes, i.e., kW of cooling per kW of power.) The COP improves because compressor capacity increases as the evaporator temperature (at the cooling coil in the CRAC) approaches the temperature of the condenser coil. The same amount of cooling is therefore provided (9 kW in the example above) with reduced power consumption: the increased compressor capacity means that air temperature set points are satisfied in less operating time, so the compressor stages down or shuts down for longer periods. Compressor power savings of 20% and more can be achieved while delivering the same amount of cooling.
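The relationship between COP and compressor power can be made concrete with a short calculation. This is an illustrative sketch: the 9 kW load comes from the paper's example, but the two COP values are assumed for illustration and are not figures from the paper.

```python
def compressor_power_kw(cooling_load_kw, cop):
    # COP = kW of cooling delivered per kW of compressor power, so power = load / COP
    return cooling_load_kw / cop

load_kw = 9.0                                  # rack heat load from the paper's example
p_before = compressor_power_kw(load_kw, 3.0)   # cooler return air (assumed COP)
p_after = compressor_power_kw(load_kw, 3.75)   # warmer return air (assumed COP)
savings = (p_before - p_after) / p_before
print(f"{p_before:.2f} kW -> {p_after:.2f} kW, {savings:.0%} compressor savings")
# -> 3.00 kW -> 2.40 kW, 20% compressor savings
```

With these assumed COPs, a 25% capacity gain from warmer return air yields the 20% compressor power reduction the paper cites, while the cooling delivered stays fixed at 9 kW.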

However, the benefits of higher rack inlet temperatures can be realized only if adequate cooling air volume is delivered to the rack inlets. When adequate air volume is not provided, the racks meet their cooling needs with ambient air in the data center rather than cooling air flowing from the raised-floor perforated tiles. Cooling from ambient air rather than from the raised-floor supply is known as hot air recirculation: ambient air (typically made up in part of hot rack exhaust air) is drawn into the cold aisles around the ends of the racks and over the tops of the racks. The result is elevated rack inlet temperatures, which can exceed ASHRAE guidelines.

When recirculation produces inlet hot spots, data center operators often respond by sub-cooling large portions of the data center to provide relief for localized heating. This is an expensive and often unnecessary response to the problem. When cooling air flows are inadequate, many schemes to improve energy efficiency, including raised inlet temperatures, will fail, because persistent inlet hot spots will frustrate any new operating design. Prior to implementing any program to improve cooling efficiency, adequate cooling air flows must be ensured.

How Can We Determine The Presence of Inadequate Cooling Air Volumes?

Inadequate cooling air volume is indicated by two phenomena: hot air recirculation and elevated inlet temperatures. We use two methods to identify these air flow deficiencies. Because hot air recirculation results in elevated rack inlet temperatures, we have developed techniques to join, or mosaic, thermal infrared images to produce temperature maps covering every square inch of rack inlet surface. With these images, under-supply or over-supply of cooling air is readily seen.
The infrared images also reveal recirculation. Further, the images permit us to quantify all areas above or below the ASHRAE recommended minimum and maximum inlet temperatures, and the temperature data can be used to calculate various rack inlet and outlet performance indices. Consider this example: the following photograph shows one row of server racks. Each of 10 racks contains 44 servers, for a total of 440 servers.

Now we can observe infrared mosaic images of two different sets of racks, below. The loads on these racks are similar: upper racks, 64 kW; lower racks, 69 kW. The upper racks have an average inlet temperature of 75.5°F; the lower racks, 66.8°F. The upper racks have 26% of inlet surface temperatures above the ASHRAE recommended maximum (80.6°F) and 1% below the ASHRAE recommended minimum (64.4°F). The lower racks have 0.03% of inlet surfaces above the ASHRAE maximum and 31% below the ASHRAE minimum.
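The band statistics above (the fraction of inlet surface below, within, and above the ASHRAE recommended range) are straightforward to compute once per-pixel inlet temperatures are available. A minimal sketch, using a short list of hypothetical readings in place of the actual infrared mosaic data:

```python
ASHRAE_MIN_F = 64.4  # recommended minimum inlet temperature (from the paper)
ASHRAE_MAX_F = 80.6  # recommended maximum inlet temperature

def inlet_band_fractions(temps_f):
    """Return the fractions of readings below, within, and above the ASHRAE band."""
    n = len(temps_f)
    below = sum(t < ASHRAE_MIN_F for t in temps_f) / n
    above = sum(t > ASHRAE_MAX_F for t in temps_f) / n
    return below, 1.0 - below - above, above

# hypothetical readings standing in for the per-pixel infrared mosaic data
readings = [62.0, 66.8, 71.3, 75.5, 79.0, 82.1, 84.7, 68.2]
below, within, above = inlet_band_fractions(readings)
print(f"below ASHRAE min: {below:.1%}, within: {within:.1%}, above max: {above:.1%}")
# -> below ASHRAE min: 12.5%, within: 62.5%, above max: 25.0%
```

In practice the same computation is run over every pixel of the mosaicked rack inlet surface, which is what yields figures such as "26% above maximum" for the upper racks.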

The thermal patterns in the upper row demonstrate that recirculation air is coming over the top of the racks, producing elevated inlet temperatures. The thermal patterns on the lower racks show that server cooling requirements are being met by cooling air from the perforated tiles. What accounts for the difference in rack performance? Cooling air supply volumes. The upper racks receive 4,878 CFM from the perforated tiles; the lower racks receive 5,948 CFM. The average cooling air temperature from the perforated tiles is 61.2°F for both sets of racks. Hot spots and recirculation are eliminated in the lower racks by the provision of sufficient cooling air flow from the perforated tiles.

However, this arrangement still wastes energy: the rack inlet temperatures are substantially cooler than necessary, as evidenced by the large portion of overcooled inlets (31% below the ASHRAE minimum). Increasing the supply temperature to the lower racks would accomplish the same cooling at a lower cost, as discussed later.

Our second tool for identifying inadequate air flow is our patent-pending Data Air Measurement and Mapping (DAMM) cart. This instrument produces a three-dimensional vector field drawing of the data center showing air flow direction, speed, and temperature. The image below demonstrates the recirculation seen in the upper infrared image: warm air flows from the hot aisle over the top of the rack and into the server inlets. The upper row in the DAMM image corresponds to the upper infrared mosaic. Red arrows indicate air temperatures above 80.6°F; arrow length corresponds to velocity.

How Do We Achieve Uniform Inlet Temperatures Within ASHRAE Guidelines?

Answer: achieve the proper supply air flows to meet the cooling requirements of the racks, then raise the inlet temperature. This is how it is done. During our data center assessment process, we gather measurements of rack power usage and air flow. From this data we calculate how much perforated tile air flow is required to satisfy actual or expected IT loads. Engineering calculations can determine the air flow required to achieve a specified heat transfer and temperature rise between rack inlets and outlets. However, room configuration, tile layout, CRAC placement, and CRAC performance all influence whether the required amount of air can actually be delivered to the rack inlets. These multiple interactions are too complex for simple engineering calculations. We therefore begin our analysis with the simple engineering calculations, but then turn to Computational Fluid Dynamics (CFD), supplemented with customized CRAC unit engineering models, to determine what can be achieved and the associated energy savings.

This process is illustrated below using CFD models. We constructed a small, simple data center containing one row of five racks that consumes 30 kW for data processing. The room contains a raised floor plenum with five supply grates (one in front of each rack). Air is returned to the CRAC unit via a ceiling plenum return. The CRAC unit, located in an adjacent room, is a down-flow DX unit with two compressor stages and 60 kW of capacity. The data center is shown in the image below.
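The "simple engineering calculation" for required airflow can be sketched with the standard sensible-heat relation for air in IP units, Q[BTU/hr] ≈ 1.08 × CFM × ΔT[°F], with 1 kW = 3412 BTU/hr. The paper does not state which relation or temperature rise it uses, so both are assumptions here:

```python
def required_cfm(it_load_kw, delta_t_f):
    """Airflow needed to absorb a sensible heat load at a given air temperature rise.

    Uses the standard IP-unit relation Q[BTU/hr] = 1.08 * CFM * dT[F],
    with 1 kW = 3412 BTU/hr.
    """
    return it_load_kw * 3412.0 / (1.08 * delta_t_f)

# 30 kW row from the paper's CFD model; the temperature rises are assumed values
for dt_f in (12.0, 20.0, 25.0):
    print(f"dT = {dt_f:4.1f} F -> {required_cfm(30.0, dt_f):,.0f} CFM")
```

At an assumed rise of about 12°F the relation gives roughly 7,900 CFM for the 30 kW row, close to the 8,000 CFM figure the paper's engineering calculations produce in Example 3.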

Example 1: Operation in alarm. In this case, we have a flow of 4,000 CFM with a supply air temperature of 64°F. The image below shows rack inlet temperatures on a four-color scale: red areas exceed the ASHRAE recommended maximum, and blue areas are below the ASHRAE recommended minimum. The average inlet temperature is 72.3°F. As is clear from this image, substantial inlet hot spots are present. Cold supply air reaches only the low-level rack inlets; upper-level server inlets are supplied with a combination of ambient and rack exhaust air. In this case, the data center operators must take action to eliminate their hot spots.

Example 2: Our next case illustrates a typical response to hot spots: dialing down air temperature setpoints (reducing the supply air temperature). Here we reduce the supply air temperature until all inlet surface temperatures are below the recommended ASHRAE maximum. This occurs at a supply air temperature of 58°F, a reduction of 6°F from our base case. All inlet temperatures are now below the ASHRAE recommended maximum, and the average inlet temperature is 66.1°F. This occurs because the inlet temperatures along the bottom are now so cold that the discharge air temperatures from the racks are all below 80.6°F. Hot air recirculation still occurs, but the recirculated air is now colder and hot spots no longer appear. In this case, the compressor COP is 3.3 and total annual CRAC power consumption is 83,012 kWh. Now, let us see what can be done to increase efficiency.

Example 3: Our engineering calculations indicate that this row requires a cooling air flow of 8,000 CFM. In this example, we provide 8,000 CFM at our base-case supply temperature of 64°F. The results are seen in the following image: the increased air flow has eliminated all hot spots, and recirculation is clearly reduced. However, nearly all inlet temperatures now run below the ASHRAE recommended minimum, and the average inlet temperature has dropped to 65.8°F. This is not an efficient solution, as we are substantially overcooling the servers. The CRAC unit's compressor COP has improved to 4.8, but CRAC blower power requirements have increased from 3,456 kWh to 21,462 kWh. Total annual power consumption is 76,224 kWh, an energy savings of 8.2% from Example 2.
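The sharp jump in blower energy is consistent with the fan affinity laws, under which fan power scales with roughly the cube of flow rate. A quick check against the figures reported in Example 3 (the cubic law is an idealization the paper does not invoke explicitly; real systems typically fall somewhat below it):

```python
def fan_power_ratio(cfm_new, cfm_old):
    # Fan affinity laws: shaft power scales with the cube of the flow rate
    return (cfm_new / cfm_old) ** 3

ideal = fan_power_ratio(8_000, 4_000)   # doubling flow -> up to 8x fan power
reported = 21_462 / 3_456               # blower kWh ratio reported in Example 3
print(f"ideal cubic ratio: {ideal:.1f}x, reported ratio: {reported:.1f}x")
# -> ideal cubic ratio: 8.0x, reported ratio: 6.2x
```

The reported ~6.2x increase sits below the 8x cubic bound but is the right order of magnitude, which is why doubling the airflow dominates the blower term even as compressor COP improves.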

Example 4: We now have sufficient air volume to minimize recirculation, so we can raise the supply air temperature further and realize the benefits of higher compressor COP, as seen in the image below. Here we achieve a supply temperature of 68°F. All inlet temperatures are within the recommended ASHRAE limits, and the average inlet temperature is 69.9°F. We find that raising the supply temperature further would produce hot spots. As we see in all of these images, elevated temperatures appear along the outer vertical edges and top center of the server row. This phenomenon occurs because the air pressure on the exhaust side of the rack is always greater than on the supply side, which means recirculation cannot be entirely eliminated. Nevertheless, our efficiency has improved: compressor COP is now 5.29 and total annual power consumption is 71,086 kWh, an energy savings of 14.4% from Example 2.
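The savings percentages quoted in Examples 3 and 4 can be cross-checked directly from the annual consumption figures the paper reports:

```python
# annual CRAC energy (kWh) reported in the paper's worked examples
annual_kwh = {
    "Example 2 (58F supply, 4,000 CFM)": 83_012,   # baseline: sub-cooled response
    "Example 3 (64F supply, 8,000 CFM)": 76_224,
    "Example 4 (68F supply, 8,000 CFM)": 71_086,
}

baseline = annual_kwh["Example 2 (58F supply, 4,000 CFM)"]
for case, kwh in annual_kwh.items():
    savings = (baseline - kwh) / baseline
    print(f"{case}: {kwh:,} kWh ({savings:.1%} savings vs. Example 2)")
```

This reproduces the 8.2% and 14.4% figures in the text, confirming that both are measured against the Example 2 baseline.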

Example 5: In Example 4 we found that the air pressure differentials in the room would cause hot spots along the edges of the row if we raised the supply temperature beyond 68°F. In this case, we install 2-foot-deep solid barriers along the top and side edges of the server row to mitigate the effects of the room's air pressure imbalances. The barriers and the results are illustrated below.

In this case, we have raised the supply temperature to 74°F. The average inlet temperature is 74.6°F, actually warmer than in our initial in-alarm example (72.3°F). Any further increase in supply temperature would produce hot spots. However, the compressor COP is now 5.29 and total annual CRAC power consumption is 68,892 kWh, a savings of 17% over Example 2. The cost to achieve this improvement is minimal: belts and pulleys for the CRAC (to alter fan speed) and some rigid barriers for the racks. In this example, significant energy savings from raising inlet temperatures were achieved despite a substantial increase in fan power. In systems that already provide adequate air flow, the energy savings from raising air temperature set points alone can be considerably larger.

The use of free cooling by means of outside air is receiving increased emphasis as an energy savings measure. The cooling air flow optimization achieved in this final example will enable free cooling to succeed at increased outdoor temperatures. The additional potential savings will be considered in a future paper.

Conclusions

This paper provides an example of the energy savings that can be readily achieved by means of higher server rack inlet temperatures. The savings can be significant and may be achieved at little cost. However, these savings cannot be safely achieved unless adequate cooling air flow volumes are provided from the perforated tiles. Our analytical approach evaluates the available cooling air flows and the racks' cooling air requirements. Our optimization service then uses CFD and CRAC unit engineering models to bring air supply and air demand into balance. Once balance is determined, we calculate the achievable setpoints and the impacts on energy consumption. This approach delivers the optimal use of existing capacity to provide the required cooling at the lowest operating cost.