Airflow Simulation Solves Data Centre Cooling Problem



The owner's initial design for a data centre in China placed 40 equipment racks filled with blade servers in three rows along the length of a 47-foot by 32-foot room. The owners contacted American Power Conversion (APC), West Kingston, Rhode Island, for a quote on computer room air conditioners (CRACs) to cool the data centre. Because the layout packed a high number of servers into a relatively limited space, APC recommended that, instead of using a raised floor to distribute cooling air, the CRACs be located within the rows of equipment racks to improve cooling efficiency. While preparing the quote, APC engineers simulated the initial design and discovered that the failure of a single CRAC would cause temperatures to rise above 90°F, close to the point at which equipment would begin shutting down. The engineers simulated a number of alternative designs and determined that adding one more row of equipment and one more CRAC would keep the room at safe temperatures even after the failure of a CRAC.

The owner recognized from the beginning the potential for cooling problems in the new data centre. The design made extensive use of blade servers, which greatly increase the amount of computing power that can be packed into a given amount of space but at the same time generate much more heat than traditional servers. A standard server cabinet dissipates on the order of 2 to 3 kilowatts, while vendors are now designing blade servers that can demand over 20 kW of cooling per rack. The initial design used the traditional raised-floor approach for distributing cooling air to the data centre. The drawback of the raised-floor approach is that the sources of cooling air are located far from the equipment that needs to be cooled. This creates inevitable inefficiencies in moving the air to where it is needed, and raises the possibility that significant amounts of cool air will never reach the servers that require cooling. The simplest way to address this challenge is to add more air-conditioning capacity, but this is expensive and usually does not solve the problem, which is one of air distribution.

Improving on the initial design

Ben Steinberg, senior applications engineer for APC, who was assigned to create the proposal, felt that the initial design could be improved by eliminating the raised floor and deploying CRACs within the rows of equipment. The advantage of this approach is that it can deliver the cooling air to where it is needed with much smaller losses. Steinberg used hand calculations to determine that one of the company's NetworkAIR IR 40 kW in-row precision air conditioning units positioned in each row should be able to handle the data centre's cooling requirements. But Steinberg was far from done. In this type of application, the biggest challenge is usually ensuring that the data centre will continue to operate despite losing an air conditioning unit.
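The hand calculation can be sketched as a simple capacity check. The sketch below is illustrative only: the 3 kW per-rack load is an assumed figure (the article gives only the 2 to 3 kW range for standard cabinets), and raw capacity says nothing about air distribution, which is exactly why a CFD analysis was still needed.

```python
# Back-of-envelope cooling capacity check (illustrative; the per-rack
# load is an assumption, not a figure from the APC study).
RACKS = 40
KW_PER_RACK = 3.0          # assumed average rack dissipation
CRAC_CAPACITY_KW = 40.0    # NetworkAIR IR rated cooling capacity

total_load_kw = RACKS * KW_PER_RACK   # 120 kW of heat to remove

def capacity_ok(n_cracs):
    """True if n_cracs units can absorb the full IT load."""
    return n_cracs * CRAC_CAPACITY_KW >= total_load_kw

print(total_load_kw)    # 120.0
print(capacity_ok(3))   # True: three 40 kW units cover 120 kW
print(capacity_ok(2))   # False: with one unit lost, capacity falls short
```

Note that even when aggregate capacity is sufficient, air can fail to reach the racks that need it, so this kind of check is only a starting point.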
"It's almost impossible to determine if redundancy truly exists just by looking at the design, so I made the decision to use computational fluid dynamics (CFD) to simulate the heat generation, air flow, and heat removal in the room," Steinberg said. CFD can calculate and graphically illustrate complete airflow patterns, including velocities and the distributions of variables such as pressure and temperature. As part of the analysis, the user can change the layout of the building or the operating conditions, observe the effect of the changes on the airflow patterns and temperature distribution, and see how they affect cooling performance. Engineers can thus quickly evaluate the performance of alternative equipment configurations. APC uses FloVENT CFD software from Mentor Graphics to analyze and optimize data centre cooling configurations. "FloVENT is designed specifically for modeling heating and cooling applications, so it is both easier to use and more powerful than general-purpose CFD codes when evaluating data centre cooling," Steinberg said. "FloVENT also has a team of support engineers that provide excellent support because they have a very good understanding of data centre cooling issues."

Simulating cooling performance of the data centre

Steinberg worked from the computer-aided design drawing of the data centre provided by the owner. His basic approach was to configure the four aisles created by the three rows of equipment as alternating hot and cold aisles. Starting from one of the 47-foot walls, he positioned the servers and CRACs to make the successive aisles cold, hot, cold, and hot. He positioned the rack-mounted servers so that the rears of the servers blow hot air into the hot aisles while the fronts draw in cool air from the cold aisles. He put one CRAC in each row, positioned so that it draws in hot air from a hot aisle and emits cool air to a cold aisle. This approach optimizes cooling efficiency by minimizing the opportunity for hot air to mix with cold.

[Figure: The original design provided by the customer]

Steinberg constructed a box representing the room and created cubes representing each rack of equipment. The owner provided the make and model number of each server, and Steinberg obtained technical specifications from the manufacturers' web sites to determine their power consumption and airflow. He modeled the cooling units by entering their airflow rates and inlet and outlet temperatures. He then ran a steady-state analysis of the data centre with all CRAC units operating. The CFD simulation provided the temperatures, airflows, and pressures at all areas in the room. This information not only made it possible to determine the cooling performance of the design but also helped explain the reasons behind that performance. As expected, the analysis showed that the CRAC units were easily able to cool the data centre. Steinberg then moved on to the more challenging part of the analysis: he removed the CRACs from the model one at a time and reran the simulation. The results showed that the design was able to maintain acceptable temperatures with the CRAC in row 1 or row 2 out of operation, because whichever unit remained in operation was able to keep hot aisle 2 cool. On the other hand, when the CRAC in row 3 was removed from the simulation, temperatures quickly reached the unacceptable 90°F level, primarily because no CRAC unit remained to remove heat from hot aisle 4.
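The failure sweep described above amounts to an N-1 check: remove each CRAC in turn and see whether every hot aisle still has a unit serving it. The toy model below captures only this adjacency logic; the aisle-to-CRAC mapping is inferred from the article, not taken from the actual FloVENT model, which resolves the full airflow and temperature fields.

```python
# Toy N-1 redundancy sweep over the three-row layout. Only hot aisles
# 2 and 4 receive server exhaust; aisle 2 sits between rows 1 and 2,
# while aisle 4 is served solely by the CRAC in row 3.
original_design = {
    "hot aisle 2": {"CRAC row 1", "CRAC row 2"},
    "hot aisle 4": {"CRAC row 3"},
}

def single_failure_gaps(aisle_map):
    """For each failed CRAC, list hot aisles left with no serving unit."""
    all_cracs = set().union(*aisle_map.values())
    gaps = {}
    for failed in sorted(all_cracs):
        uncovered = [aisle for aisle, cracs in aisle_map.items()
                     if not (cracs - {failed})]
        if uncovered:
            gaps[failed] = uncovered
    return gaps

print(single_failure_gaps(original_design))
# {'CRAC row 3': ['hot aisle 4']} -> losing row 3's unit strands aisle 4
```

This reproduces the qualitative finding: rows 1 and 2 back each other up on hot aisle 2, but hot aisle 4 has no redundancy.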

Developing a new design that solved the problem

[Figure: A new design developed by APC]

Steinberg went back to the customer and showed them the problem with the original design. The customer was very pleased that APC had gone beyond simply responding to the request for proposal and had used simulation to evaluate whether the proposed solution would actually provide reliable computing performance. The data centre owner's executives stated that the facility's critical nature made it essential to maintain redundancy. Steinberg suggested adding a fourth row of equipment with a fourth CRAC unit. He pointed out that the fourth row would give each hot aisle one CRAC unit on each side, providing redundancy in case one unit failed. He also noted that the fourth row would make it possible to expand the data centre in the future without changing the cooling configuration. He further recommended the use of blanking panels in unused vertical space in the rack enclosures to prevent hot server exhaust from taking a shortcut back to the equipment intakes.

To confirm this recommendation, Steinberg modified the FloVENT model to add the fourth row of equipment and the fourth CRAC. He simulated the new design with all CRAC units operating, and then once with each of the CRAC units removed in turn. The results showed that regardless of which CRAC was removed, temperatures remained at safe levels of between 75°F and 80°F throughout the data centre. Steinberg showed these results to the owner, who decided to go with the new design and purchase the four CRACs from APC. "It would have been extremely costly to install the equipment in the data centre, run tests, and then discover that it would not properly cool the servers in the event of an equipment failure," Steinberg said. "But it would have been far more costly for the customer to have its data centre go down because of cooling problems. This helps explain why we run simulations for many of our proposals. Simulation provides a fast, relatively inexpensive, and accurate method of evaluating data centre cooling performance without going to the time and expense required to actually install and test the equipment."
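The cooling units in such a model are characterized by their airflow rate and inlet and outlet temperatures, quantities linked by the standard sensible-heat relation Q = ρ · V̇ · cp · ΔT. The sanity check below uses textbook air properties; the 11 K (about 20°F) air-side temperature rise is an assumed figure for illustration, not a value from the article.

```python
# Sensible-heat sanity check for one CRAC: how much airflow is needed
# to carry a given heat load at a given air temperature rise?
RHO_AIR = 1.2     # kg/m^3, air density near room temperature
CP_AIR = 1005.0   # J/(kg*K), specific heat of air

def airflow_for_load(q_watts, delta_t_k):
    """Volumetric airflow (m^3/s) needed to absorb q_watts at delta_t_k."""
    return q_watts / (RHO_AIR * CP_AIR * delta_t_k)

flow = airflow_for_load(40_000, 11.0)  # one 40 kW unit, assumed 11 K rise
print(round(flow, 2), "m^3/s")         # 3.02 m^3/s, roughly 6400 CFM
```

Numbers of this magnitude are why in-row placement helps: moving several cubic metres of air per second across a large raised-floor plenum loses far more than moving it a few feet within the row.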