Data Center 2020: Delivering high density in the Data Center; efficiently and reliably




March 2011

Review: Results of the Optimization Phase

In the first optimization phase of the data center, the researchers at T-Systems and Intel were able to demonstrate reductions in energy consumption. The measures were straightforward and implemented at reasonable cost. The improved energy efficiency is mainly due to two effects:

1. The airflow was optimized through a strict separation of hot and cold air, using cold-aisle containment and reduced leakage from the raised floor. This allowed a reduction in fan speed (with an even greater reduction in fan energy) in the room-level air handling units.

2. The supply air temperature to the raised floor (T1) was increased. Together with the cold-aisle containment, this allowed a simultaneous increase in chilled water temperature, which reduced the hours of chiller operation and increased the time the data center could operate in free-cooling mode.

The team optimized the PUE while keeping the server inlet air temperature well below the ASHRAE-recommended upper limit for IT inlet temperature (TR) of 27 °C. With these measures, the Data Center 2020 researchers were able to reduce the PUE from 1.8 to 1.4 while still maintaining a server inlet temperature of 22 °C.

Power Usage Effectiveness (PUE) measures the efficiency of the energy used in the data center: it is the ratio of the total energy used in the data center to the energy used by the IT equipment. Energy consumption at data centers can be reduced using methods that are easy to implement.
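To make the PUE arithmetic concrete, the short sketch below simply applies the definition above. The 80 kW IT load is the figure quoted later in this paper for the optimization phase; the derived overhead values are illustrative, not measured results from the study.

```python
# Minimal sketch of the PUE definition (total facility energy / IT energy).
# Assumption: 80 kW IT load, taken from the optimization-phase figure quoted
# later in this paper; the overhead numbers derived here are illustrative.
def facility_power(it_power_kw: float, pue: float) -> float:
    """Total data center power implied by a given PUE and IT load."""
    return it_power_kw * pue

it_load_kw = 80.0
before = facility_power(it_load_kw, 1.8)   # 144 kW total at PUE 1.8
after = facility_power(it_load_kw, 1.4)    # 112 kW total at PUE 1.4
print(f"Non-IT overhead: {before - it_load_kw:.0f} kW -> {after - it_load_kw:.0f} kW")
```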

The following two figures illustrate these changes.

[Figures 1 and 2: CRAC airflow diagrams. Figure 1 (initial state): air short circuits and raised-floor leakage, T1 = 18 °C, T2 = 24 °C, ΔT = 6 K, fan at 100%, underfloor pressure 16 Pa. Figure 2 (optimized): cold-aisle containment and sealed leakage, T1 = 21 °C, T2 = 38 °C, ΔT = 17 K, fan at 30%, underfloor pressure 4 Pa. Heat balance: Q = cv × m × ΔT.]

Figure 1 shows the initial temperature values (typical in standard data centers today), with air leakage and no strict separation of cold and hot air. The three key temperatures are:

T1 = supply air to the raised floor (18 °C in the initial configuration)
TR = IT inlet temperature (initially 22 °C)
T2 = CRAC return air temperature (initially 24 °C)

The fan speed in the room cooling unit (CRAC in Figure 1) is at 100%, and its ΔT (return minus supply temperature) is 6 K.

Figure 2 shows the impressive improvements once the raised-floor leakage is corrected and the cold-aisle containment is added. The IT inlet temperature (TR) remains constant at 22 °C. Sealing the raised floor and clearly separating cold and warm air raises the pressure under the raised floor as well as the ΔT across the CRAC, which in turn allows a reduced airflow volume and lower fan speeds. Because the same heat load is now carried by less air, each unit of airflow absorbs more heat and the return air temperature (T2) rises from 24 °C to 38 °C. In addition, the supply air temperature under the raised floor (T1) can be increased from 18 °C to 21 °C. The room air handler (CRAC in Figure 2) consumes far less energy with the fan at 30% speed (initially 100%); after this initial optimization phase the fan motor uses roughly 90% less energy. Combined with the CRAC ΔT being raised from 6 K to 17 K, the cooling unit operates much more efficiently than before the changes.
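As a rough illustration of the heat balance behind the ΔT increase (written on the figures as Q = cv × m × ΔT), the sketch below works through the numbers under assumed values: the 80 kW heat load matches the optimized IT load quoted later in the paper, the mass flow is purely illustrative, and the specific heat used is that of air at constant pressure rather than cv. The fan energy saving quoted above follows from the fan affinity law, which is illustrated in a second sketch after the next section.

```python
# Minimal sketch, not measured data: the heat balance behind the dT increase
# (written on the figures as Q = cv x m x dT). Assumptions: 80 kW heat load
# (the optimized IT load quoted later in the paper), an illustrative full-speed
# mass flow of ~13.3 kg/s, and cp of air ~= 1.005 kJ/(kg*K) as the specific heat.
CP_AIR = 1.005  # kJ/(kg*K)

def delta_t(heat_kw: float, mass_flow_kg_s: float) -> float:
    """Air temperature rise for a given heat load and air mass flow."""
    return heat_kw / (CP_AIR * mass_flow_kg_s)

full_flow = 13.3                      # kg/s: gives a dT of about 6 K at 80 kW
reduced_flow = full_flow * 6 / 17     # flow required for a 17 K dT at the same load
print(round(delta_t(80, full_flow), 1), round(delta_t(80, reduced_flow), 1))  # ~6.0 K, ~17.0 K
```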

Next Steps: Increasing the energy density to 22 kW/rack

In the first optimization phase, rack density began at 5 kW/rack and was increased to 10 kW/rack, raising the total data center IT load from 40 kW to 80 kW and doubling the total connected load. This increased rack density contributed to the overall PUE improvement of 30%. To understand how well the data center infrastructure scales to higher rack densities, and to verify that the same level of efficiency is retained in the partial-load range, the team increased densities to 22 kW/rack. The IT inlet temperature (TR) remained constant at 22 °C. The team chose two scenarios:

In the first scenario, a single room cooling unit was operated with a chilled water temperature of 8 °C. With the 22 kW rack density, the PUE was reduced to 1.32.

In the second scenario, two room cooling units were operated with a chilled water temperature of 16 °C and correspondingly reduced fan speeds to cool the data center at a rack density of 10 kW/rack. Reducing the airflow from each unit to half (50%) also reduces the energy consumption: from the fan affinity laws, running two fans at half speed requires only about one quarter of the power needed to run one fan at full speed, as the sketch at the end of this section illustrates. The warmer chilled water temperature also reduces the use of the chiller, allowing more hours of indirect free cooling and further reducing the total energy consumption. This allowed the DC2020 team to reduce the PUE even further, to 1.23.

In conclusion, energy densities of 22 kW/rack can be reached with standard techniques. As the density increases, the overall result is a flattening of the PUE curve, which asymptotically approaches a limit. Densities of 22 kW/rack are rarely achieved in practice, and only in some specific cases (HPC or high-density blade applications); the lower thermal loads routinely dealt with in existing data centers will therefore result in slightly higher PUEs. In the construction or design of a new data center, however, high densities and optimum conditions can be built in.
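The fan power figures quoted above follow from the ideal fan affinity law (power proportional to the cube of speed). The sketch below only reproduces that arithmetic; it assumes the ideal cube law, which overstates real savings somewhat (the paper reports roughly 90% measured savings at 30% speed).

```python
# Minimal sketch of the ideal fan affinity law (power ~ speed^3) behind two
# statements above: the ~90% fan energy saving at 30% speed, and the two-CRAC
# scenario in which two fans at half speed need about a quarter of the power
# of one fan at full speed. Ideal law only; measured savings are somewhat lower.
def ideal_fan_power_fraction(speed_fraction: float) -> float:
    """Fan power as a fraction of full-speed power, ideal cube law."""
    return speed_fraction ** 3

print(ideal_fan_power_fraction(0.30))        # ~0.027: more than 90% saving at 30% speed
print(2 * ideal_fan_power_fraction(0.5))     # 0.25: two fans at half speed vs. one at full speed
```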
Energy Density and Availability

The optimization of the data center towards higher density must also take into account the availability and reliability of the servers. The data center operator must always balance the benefits of higher density against the risk of a shorter response time in the event of a cooling system failure. For the most critical data centers, energy efficiency must not come at the expense of high availability. To gain information on the optimal balance, the DataCenter 2020 test lab was operated in various failure modes (loss of chiller, pumps, and room cooling unit fans). The tests covered different energy densities and spatial configurations, as well as the impact of different room temperatures during the failure.

Method: The server inlet temperature (TR) in the DataCenter 2020 was controlled to 22 °C. Today's servers generally have a maximum inlet temperature (TR) of 35 °C; below this temperature the manufacturers will warrant the reliability of the server. The team then simulated a power failure that took the entire cooling system offline. The servers continued to run on the UPS, releasing heat into the space, and as a result the server inlet temperatures (TR) rose.

[Figure 3: PUE versus rack density (5.5 to 22 kW/rack); series: PUE with 8 °C water supply and 1 CRAC, and PUE with 16 °C water supply and 2 CRACs (10 kW/rack).]

Generally, the cooling equipment is connected to emergency generators, which will supply electricity to the data center. After a shutdown, however, chillers take some time to restart, and further time is needed before the entire data center returns to appropriate operating temperatures. The red line in Figure 4 shows the critical period (about 5 minutes), after which servers may be expected to fail or shut down due to overheating (inlet temperatures above 35 °C).
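To give a feel for why these response times are measured in minutes, and why the ceiling-height results discussed below turn out to be only a second-order effect, here is a back-of-the-envelope sketch of how quickly the room air alone would heat up under the full IT load. All of the numbers in it (room volume, air properties, heat load) are illustrative assumptions, not values from the DataCenter 2020 lab. The point is simply that the air by itself buys very little time, so other thermal mass (raised floor, walls, IT hardware) and imperfect mixing must account for most of the measured critical period, which is consistent with the observation below that a roughly 40% larger air volume does not yield a correspondingly longer reaction time.

```python
# Back-of-the-envelope sketch with illustrative assumptions only (not DataCenter
# 2020 measurements): time for the room air alone to warm from 22 C to 35 C under
# the full IT load, ignoring all other thermal mass and assuming perfect mixing.
AIR_DENSITY = 1.2   # kg/m^3
CP_AIR = 1.005      # kJ/(kg*K)

def air_only_heatup_seconds(room_volume_m3: float, heat_kw: float, dt_k: float) -> float:
    """Seconds to raise the room air temperature by dt_k at the given heat load."""
    air_mass_kg = AIR_DENSITY * room_volume_m3
    return air_mass_kg * CP_AIR * dt_k / heat_kw

# Assumed 200 m^3 room vs. a ~40% larger volume, 80 kW load, 13 K rise (22 C -> 35 C):
print(round(air_only_heatup_seconds(200, 80, 13)))   # ~39 s
print(round(air_only_heatup_seconds(280, 80, 13)))   # ~55 s, still well short of the ~5 min measured
```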

[Figure 4: Response time until a 35 °C cold-aisle temperature is reached, by room configuration (Status Quo, Step 1, Step 2, Step 3, Cold-Aisle Enclosure) at 5.5, 9.8, and 17.5 kW/rack.]

The measurements showed that with reduced leakage and cold-aisle containment (to ensure separation of the warm and cold air), the energy density, or IT load per rack, can be roughly three times that of a standard data center with the same reliability. In this specific example, the data center operating at 17.5 kW/rack could run longer without the cooling system returning than a standard data center with 5.5 kW racks. Thus, a relatively inexpensive cold-aisle containment supporting 17.5 kW/rack could replace the need for a comparatively expensive UPS for some cooling outages (obviously the enclosure and the UPS are not an even trade-off in all respects). In this case the enclosure allowed roughly three times the duration of the open hot-aisle/cold-aisle configuration before the 35 °C limit was reached.

Influence of Ceiling Height

T-Systems and Intel have also discussed in the literature the effect that ceiling height may have on room temperatures in the event of a cooling system failure. The popular theory is that a higher ceiling increases the cold air volume, slowing the rate of temperature increase and allowing more time before IT systems shut down due to high temperatures.

[Figure 5: Response time until a 35 °C cold-aisle temperature is reached at 5.5 kW/rack, for room heights of 2.7 m and 3.7 m, by room configuration (Status Quo, Step 1, Step 2, Step 3, Cold-Aisle Enclosure).]

As Figure 5 shows, the higher ceiling had some beneficial effect, but not as significant as was expected. It can only be confirmed as a second-order effect: a 40% larger volume does not provide a correspondingly longer reaction time. At higher rack densities the effect is even smaller, with the curves closer together. In the case of cold-aisle containment, the ceiling height has no significant impact. It remains to be seen what the results are when the DataCenter 2020 is reconfigured with hot-aisle containment.

Conclusion and Outlook

Our second phase of research in the DataCenter 2020 has shown that energy densities of greater than 20 kW/rack can be achieved reliably with available standard air-cooling technology. Methodical application of the measures from the first White Paper provided a means to achieve significant optimizations. Now, with the cold-aisle containment, we see the ability to reach higher densities and to obtain a better response in various failure scenarios.

In the DataCenter 2020, the researchers will repeat the above measurements in the next few weeks with hot-aisle containment to see how the results compare with the cold-aisle containment. Furthermore, they will focus on IT capability per unit of energy density and its effect on the data center's total energy consumption. Total energy consumption is just as important as PUE, and PUE is not the only indicator of improved energy efficiency in data centers: interestingly, the PUE value will rise when the IT itself becomes more efficient and consumes less power. For this reason the researchers are also considering IT performance as a measure of efficiency.

For further information: www.datacenter2020.com

Copyright 2010 Intel Corporation and T-Systems International GmbH. All rights reserved.