How Does Your Data Center Measure Up? Energy Efficiency Metrics and Benchmarks for Data Center Infrastructure Systems



Paul Mathew, Ph.D., Staff Scientist
Steve Greenberg, P.E., Energy Management Engineer
Srirupa Ganguly, Research Associate
Dale Sartor, P.E., Applications Team Leader
William Tschudi, P.E., Program Manager

Environmental Energy Technologies Division
Lawrence Berkeley National Laboratory
April 2009

Disclaimer

This document was prepared as an account of work sponsored by the United States Government. While this document is believed to contain correct information, neither the United States Government nor any agency thereof, nor The Regents of the University of California, nor any of their employees, makes any warranty, express or implied, or assumes any legal responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by its trade name, trademark, manufacturer, or otherwise, does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof, or The Regents of the University of California. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof or The Regents of the University of California.

Acknowledgements

This work was supported by the New York State Energy Research and Development Authority (NYSERDA) and the Assistant Secretary for Energy Efficiency and Renewable Energy, Building Technologies Program, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.

HPAC Magazine, final version accepted for publication

1 Introduction

Data centers are among the most energy-intensive types of facilities, and they are growing dramatically in both size and intensity [EPA 2007]. As a result, in the last few years there has been increasing interest from stakeholders, ranging from data center managers to policy makers, in improving the energy efficiency of data centers, and several industry and government organizations have developed tools, guidelines, and training programs. There are many opportunities to reduce energy use in data centers, and benchmarking studies reveal a wide range of efficiency practices. Data center operators may not be aware of how efficient their facility is relative to peer facilities, even for the same levels of service. Benchmarking is an effective way to compare one facility to another, and also to track the performance of a given facility over time. Toward that end, this article presents the key metrics that facility managers can use to assess, track, and manage the efficiency of the infrastructure systems in data centers, and thereby identify potential efficiency actions. Most of the benchmarking data presented in this article are drawn from the data center benchmarking database at Lawrence Berkeley National Laboratory (LBNL). The database was developed from studies commissioned by the California Energy Commission, Pacific Gas and Electric Co., the U.S. Department of Energy, and the New York State Energy Research and Development Authority.
2 Data Center Infrastructure Efficiency (DCIE)

This metric is the ratio of the IT equipment energy use to the total data center energy use, and can be calculated for annual site energy, annual source energy, or electrical power:

DCIE_site = IT site energy use / Total site energy use
DCIE_source = IT source energy use / Total source energy use
DCIE_elecpower = IT electrical power / Total electrical power

Note that the total data center energy use is the sum of all energy used by the data center, including campus chilled water and steam if present. The DC Pro tool [DOE 2008] can be used to assess DCIE for site and source energy. DCIE provides an overall measure of infrastructure efficiency: lower values relative to the peer group suggest higher potential to improve the efficiency of the infrastructure systems (HVAC, power distribution, lights), and vice versa. Note that it is not a measure of IT efficiency; a data center with a high DCIE may still have major opportunities to reduce overall energy use through IT efficiency measures such as virtualization. It should also be noted that DCIE is influenced by climate and tier level, and the potential to improve it may therefore be limited by these factors. Some data center professionals prefer to use the inverse of DCIE, known as Power Usage Effectiveness (PUE), but both metrics serve the same purpose.

The benchmarking data in the LBNL database show a DCIE_elecpower range from just over 0.3 to 0.75. Some data centers are capable of achieving 0.9 or higher [Greenberg et al. 2009].

Figure 1. Data center infrastructure efficiency (IT power / total power) for data centers in the LBNL database. Note that these DCiE values are based on power, not energy.

3 Temperature and Humidity Ranges

The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) guidelines [ASHRAE 2008] provide a range of allowable and recommended supply temperatures and humidity at the inlet to the IT equipment. The recommended temperature range is 64F-80F, while the allowable range is 59F-90F. A low supply air temperature and a small temperature differential between supply and return typically indicate an opportunity to improve air management, raise the supply air temperature, and thereby reduce energy use. Strategies to improve air management include better isolation between cold and hot aisles using blanking panels and strip curtains, optimizing the configuration of supply diffusers and return grilles, better cable management, blocking gaps in floor tiles, etc.
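As a quick illustrative sketch (the function names are ours, not from any standard tool), the DCIE/PUE ratio and a check against the ASHRAE 2008 recommended inlet-temperature envelope can be computed from metered readings:

```python
ASHRAE_RECOMMENDED_F = (64.0, 80.0)   # 2008 recommended inlet range (F)
ASHRAE_ALLOWABLE_F = (59.0, 90.0)     # 2008 allowable inlet range (F)

def dcie(it_power_kw, total_power_kw):
    """DCIE: IT power / total power (the same ratio works for annual energy)."""
    if total_power_kw <= 0:
        raise ValueError("total power must be positive")
    return it_power_kw / total_power_kw

def pue(it_power_kw, total_power_kw):
    """Power Usage Effectiveness, the inverse of DCIE."""
    return 1.0 / dcie(it_power_kw, total_power_kw)

def classify_inlet_temp(temp_f):
    """Classify a server inlet temperature against the ASHRAE 2008 envelope."""
    lo_rec, hi_rec = ASHRAE_RECOMMENDED_F
    lo_all, hi_all = ASHRAE_ALLOWABLE_F
    if lo_rec <= temp_f <= hi_rec:
        return "recommended"
    if lo_all <= temp_f <= hi_all:
        return "allowable"
    return "out of range"

# Example: a facility drawing 1,000 kW total with 450 kW of IT load
print(round(dcie(450.0, 1000.0), 2))   # 0.45
print(round(pue(450.0, 1000.0), 2))    # 2.22
print(classify_inlet_temp(62.0))       # allowable
```

The example values are hypothetical; in practice the IT load would be metered at the UPS output or, preferably, at the rack power strips.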

Figure 2. Return air temperature setpoints and measured supply and return temperatures for data centers in the LBNL database show that many data centers are operated at lower temperatures than required.

The recommended humidity range runs from a lower end defined as a minimum dew point of 42F to an upper end of 60% relative humidity and a 59F dew point. The allowable range is 20%-80% relative humidity with a 63F maximum dew point. A small, tightly controlled relative humidity range suggests opportunities to reduce energy use, especially if there is active humidification and dehumidification. Centralized active control of the humidification units reduces conflicting operations between individual units, thereby improving energy efficiency and capacity.
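To evaluate measured conditions against these humidity limits, the dew point can be estimated from dry-bulb temperature and relative humidity, for example with the Magnus approximation (a sketch; the coefficients and function names are our choices, not part of the ASHRAE guideline):

```python
import math

def dew_point_f(temp_f, rh_percent):
    """Approximate dew point (F) from dry-bulb temperature (F) and RH (%),
    using the Magnus formula with commonly used coefficients."""
    a, b = 17.62, 243.12  # Magnus coefficients for water, roughly -45C to 60C
    t_c = (temp_f - 32.0) / 1.8
    gamma = math.log(rh_percent / 100.0) + a * t_c / (b + t_c)
    dp_c = b * gamma / (a - gamma)
    return dp_c * 1.8 + 32.0

def in_recommended_humidity(temp_f, rh_percent):
    """Check the ASHRAE 2008 recommended envelope: dew point between 42F
    and 59F, and relative humidity no higher than 60%."""
    dp = dew_point_f(temp_f, rh_percent)
    return 42.0 <= dp <= 59.0 and rh_percent <= 60.0

# Example: 72F return air at 45% RH (dew point is roughly 49-50F)
print(round(dew_point_f(72.0, 45.0), 1))
print(in_recommended_humidity(72.0, 45.0))
```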

Figure 3. Return air relative humidity setpoints and measured supply and return relative humidity for data centers in the LBNL database. The ASHRAE recommended lower limit (a 42F minimum dew point) is approximately equivalent to 25% to 45% RH.

Since temperature and humidity affect the reliability and life of IT equipment, any changes to air management or to temperature and humidity settings should be evaluated with metrics such as the Rack Cooling Index (RCI) [Herrlin 2005], which can be used to assess the thermal health of the IT equipment. Many data centers operate well without active humidity control. While humidity control is important for physical media such as tape storage, it is generally not critical for the rest of the data center equipment. Studies by LBNL and the Electrostatic Discharge Association suggest that humidity may not need to be as tightly controlled.

4 Return Temperature Index

This metric is a measure of the energy performance resulting from air management [Herrlin 2007]. The primary purpose of improving air management is to isolate hot and cold airstreams. This allows elevating both the supply and return temperatures and maximizing the difference between them, while keeping the inlet temperatures within ASHRAE recommendations; it also allows a reduction of the system air flow rate, enabling the HVAC equipment to operate more efficiently. The return temperature index (RTI) is the ratio of the air temperature rise across the HVAC system to the temperature rise across the IT equipment, and is ideal at 100%, wherein the return air temperature is the same as the temperature leaving the IT equipment. RTI is also a measure of the excess or deficit of supply air to the server equipment: a value of less than 100% indicates that some of the supply air is bypassing the racks, while a value greater than 100% indicates recirculation of air from the hot aisle. The RTI value can be brought close to the ideal (100%) by improving air management.
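The calculation above can be sketched as follows (variable names and example temperatures are ours, following Herrlin's definition of RTI):

```python
def rti_percent(return_f, supply_f, it_outlet_f, it_inlet_f):
    """Return Temperature Index: temperature rise across the HVAC system
    divided by the rise across the IT equipment, times 100.
    100% is ideal; <100% indicates bypass air, >100% recirculation."""
    delta_hvac = return_f - supply_f
    delta_it = it_outlet_f - it_inlet_f
    if delta_it <= 0:
        raise ValueError("IT equipment delta-T must be positive")
    return 100.0 * delta_hvac / delta_it

# Example: supply 65F, return 80F, IT inlet 67F, IT outlet 87F
print(round(rti_percent(80.0, 65.0, 87.0, 67.0)))  # 75 -> bypass air
```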

Figure 4. Benchmarks for the Return Temperature Index: values near 100% indicate a low potential efficiency opportunity, while values well below or well above 100% indicate a medium-to-high opportunity.

5 UPS Load Factor

This metric is the ratio of the load on the uninterruptible power supply (UPS) to the design value of its capacity. It provides a measure of UPS system over-sizing and redundancy; a higher value indicates a more efficiently loaded system. UPS load factors below 0.5 may indicate an opportunity for efficiency improvements, although the extent of the opportunity is highly dependent on the required redundancy level. The load factor can be improved by several means, including the following:

- Shut down some UPS modules when the redundancy level exceeds N+1 or 2N
- Install a scalable/modular UPS
- Install a smaller UPS sized to fit the present load
- Transfer loads between UPS modules to maximize the load factor per active UPS

Figure 5. UPS load factor for data centers in the LBNL database
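A minimal sketch of the load-factor calculation across a set of UPS modules (the ratings and loads below are hypothetical):

```python
def ups_load_factor(load_kw, rated_capacity_kw):
    """UPS load factor: measured load divided by rated capacity."""
    if rated_capacity_kw <= 0:
        raise ValueError("rated capacity must be positive")
    return load_kw / rated_capacity_kw

# Hypothetical 2N system: two 500 kW modules, each carrying 150 kW
modules = [(150.0, 500.0), (150.0, 500.0)]
for i, (load, cap) in enumerate(modules, 1):
    lf = ups_load_factor(load, cap)
    flag = "review" if lf < 0.5 else "ok"
    print(f"UPS {i}: load factor {lf:.2f} ({flag})")
```

A 0.30 load factor would be flagged for review here, but as noted above, whether it can actually be raised depends on the required redundancy level.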

6 UPS System Efficiency

This metric is the ratio of the UPS output power to the UPS input power. UPS efficiency varies with load factor, and therefore the benchmark for this metric depends on the load factor of the UPS system. At load factors below 40%, the system can be highly inefficient due to no-load losses. Figure 6 shows the range of UPS efficiencies from factory measurements of different topologies, and Figure 7 shows the UPS efficiencies for data centers in the LBNL database. These measurements, taken several years ago, illustrate that efficiencies vary considerably; manufacturers claim that improved efficiencies are available today. When selecting UPS systems, it is important to evaluate performance over the expected loading range. Selecting more efficient UPS systems, especially ones that perform well at the expected load factors (e.g., below 40%), improves energy savings. For non-critical IT loads, bypassing the UPS system using factory-supplied hardware and controls may be an option. Reducing the level of redundancy by using modular UPS systems also improves efficiency.

Figure 6. Range of UPS system efficiencies from factory measurements of different topologies
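UPS efficiency itself is straightforward to compute from metered input and output power; a sketch with illustrative readings (not taken from the LBNL database):

```python
def ups_efficiency(output_kw, input_kw):
    """UPS efficiency: output power delivered to the IT load divided by
    input power drawn from the distribution system. Always below 1.0;
    the difference is lost as heat in the UPS."""
    if input_kw <= 0:
        raise ValueError("input power must be positive")
    if output_kw > input_kw:
        raise ValueError("output cannot exceed input")
    return output_kw / input_kw

# Illustrative readings at a roughly 35% load factor
print(round(ups_efficiency(175.0, 200.0), 3))  # 0.875, i.e. 87.5%
```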

Figure 7. UPS efficiency for data centers in the LBNL database

7 Cooling System Efficiency

The key metrics and benchmarks for evaluating the efficiency of cooling systems in data centers are no different from those typically used in other commercial buildings. These include chiller plant efficiency (kW/ton), pumping efficiency (hp/gpm), etc. Since these are well documented elsewhere, they are not discussed further here. Figure 8 shows the cooling plant efficiency for LBNL benchmarked data centers. Based on data from the LBNL database, 0.8 kW/ton could be considered a good-practice benchmark and 0.6 kW/ton a better-practice benchmark.
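Chiller plant kW/ton can be estimated from plant power and chilled-water measurements; a sketch using the common water-side approximation (the example values are assumed):

```python
def cooling_tons(flow_gpm, delta_t_f):
    """Cooling delivered by a chilled-water loop, in tons.
    Uses the common approximation Q[Btu/h] = 500 * gpm * delta-T[F],
    valid for water near standard density and specific heat;
    1 ton of cooling = 12,000 Btu/h."""
    return 500.0 * flow_gpm * delta_t_f / 12000.0

def plant_kw_per_ton(total_plant_kw, flow_gpm, delta_t_f):
    """Chiller plant efficiency: total plant power (chillers, pumps,
    cooling towers) divided by tons of cooling delivered."""
    tons = cooling_tons(flow_gpm, delta_t_f)
    if tons <= 0:
        raise ValueError("cooling load must be positive")
    return total_plant_kw / tons

# Example: 480 kW of plant power, 1,200 gpm at a 10F chilled-water delta-T
print(round(cooling_tons(1200.0, 10.0)))                # 500 tons
print(round(plant_kw_per_ton(480.0, 1200.0, 10.0), 2))  # 0.96 kW/ton
```

At 0.96 kW/ton, this hypothetical plant would fall short of the 0.8 kW/ton good-practice benchmark cited above.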

Figure 8. Cooling plant efficiency (kW/ton) for LBNL benchmarked data centers

8 Air Economizer Utilization

This metric characterizes the extent to which an air-side economizer system is being used to provide free cooling. It is defined as the percentage of hours in a year that the economizer system can be in full operation (i.e., without any cooling being provided by the chiller plant). The number of hours that the air economizer is actually utilized can be compared to the maximum possible for the climate in which the data center is located, which can be determined from simulation analysis. As a point of reference, Figure 9 shows results from simulation analysis for four different climate conditions. The Green Grid has developed a tool to estimate savings from air- and water-side free cooling [TGG 2009], though its assumptions differ from those used for the results presented in Figure 9.
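As a simplified sketch (a real analysis would use full simulation with humidity limits and equipment constraints, as in the studies cited above), the utilization potential can be approximated by counting the hours when outdoor air alone can meet the supply setpoint:

```python
def economizer_utilization_pct(hourly_outdoor_f, max_outdoor_f=65.0):
    """Percent of hours when the outdoor dry-bulb temperature is low
    enough for full air-side economization. max_outdoor_f is an assumed
    threshold a few degrees below the supply setpoint; a real study
    would also screen humidity and account for mixing and fan heat."""
    if not hourly_outdoor_f:
        raise ValueError("need at least one hourly reading")
    ok = sum(1 for t in hourly_outdoor_f if t <= max_outdoor_f)
    return 100.0 * ok / len(hourly_outdoor_f)

# Toy example with four hourly readings (a full year would have 8,760)
print(economizer_utilization_pct([55.0, 62.0, 70.0, 58.0]))  # 75.0
```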

Figure 9. Simulated air-side economizer utilization potential for four different locations. Data source: Syska Hennessy 2007

9 Summary and Outlook for Productivity Metrics

This article presented key metrics that data center operators can use to track the efficiency of their infrastructure systems. The system-level metrics in particular help operators identify potential efficiency actions. The article also presented data from the LBNL benchmarking database, which show a wide range of efficiency across the surveyed data centers. DCIE is gaining increasing acceptance as a metric for overall infrastructure efficiency, and can be computed in terms of site energy, source energy, or electrical load. However, DCIE does not address the efficiency of IT equipment. Organizations such as The Green Grid are working to develop productivity metrics (e.g., Data Center energy Productivity, DCeP) that will characterize the work done per unit of energy. The challenge is to categorize the different kinds of work done in a data center and to identify appropriate ways to measure them. As these metrics become available, they will complement the infrastructure metrics described in this article.

10 References

ASHRAE 2008. 2008 ASHRAE Environmental Guidelines for Datacom Equipment - Expanding the Recommended Environmental Envelope. ASHRAE Datacom Series. American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc.

DOE 2008. Data Center Profiling Tool (DC Pro). http://dcpro.ppc.com/

EPA 2007. Report to Congress on Server and Data Center Energy Efficiency, Public Law 109-431. U.S. Environmental Protection Agency ENERGY STAR Program.

Greenberg, S., Khanna, A., and Tschudi, W. 2009. "High Performance Computing with High Efficiency." ASHRAE Transactions TRNS-00232-2008. In press; presented at the ASHRAE Annual Conference, Louisville, KY, June 2009.

Herrlin, M.K. 2005. Rack Cooling Effectiveness in Data Centers and Telecom Central Offices: The Rack Cooling Index (RCI). ASHRAE Transactions 111(2):725-731.

Herrlin, M.K. 2007. Improved Data Center Energy Efficiency and Thermal Performance by Advanced Airflow Analysis. Digital Power Forum, San Francisco, CA, September 10-12, 2007. http://www.ancis.us/publications.html

Syska Hennessy 2007. The Use of Outside Air Economizers in Data Center Environments. White Paper 7. Syska Hennessy Group. http://www.syska.com/critical/knowledge/wp/wp_outsideair.asp

TGG 2009. Free Cooling Estimated Savings Calculation Tool. The Green Grid. http://cooling.thegreengrid.org/calc_index.html