Data Center Airflow Management Retrofit




FEDERAL ENERGY MANAGEMENT PROGRAM
Technology Case Study Bulletin: September 2010

Figure 1: Data center CFD model of return airflow short circuit (Courtesy: ANCIS)
Figure 2: Data center CFD model, airflow cross section (Courtesy: ANCIS)

As data center energy densities, measured in power use per square foot, increase, energy savings for cooling can be realized by optimizing airflow pathways within the data center. This is especially important in existing data centers with typical under-floor air distribution, where under-floor dimensions, obstructions, and leakage constrain airflow. Fortunately, airflow capacity can be improved significantly in most data centers, as described in the airflow management overview below. In addition, a generalized air management approach is provided, listing measures to improve data center airflow.

This case study bulletin presents air management improvements that were retrofitted into an older legacy data center at Lawrence Berkeley National Laboratory (LBNL). Particular airflow improvements, performance results, and benefits are reviewed. Finally, a series of lessons learned during the retrofit project at LBNL is presented.

Airflow Management Overview

Airflow retrofits can increase data center energy efficiency by freeing up stranded airflow and cooling capacity and making it available for future needs. Effective implementation requires information technology (IT) staff, in-house facilities technicians, and engineering consultants working collaboratively. Together they can identify airflow deficiencies, develop solutions, and implement fixes and upgrades.

Identifying airflow deficiencies

Data center operators can benefit from airflow visualization models created using computational fluid dynamics (CFD). Understanding airflow within data centers is complicated because under-floor air distribution and unguided pathways often result in unintended turbulence and vortices, which limit cooling capacity and waste energy. A CFD model of the data center can help identify detrimental airflow short circuits, mixing, imbalances, and hotspots. Examples of CFD modeling at LBNL are provided in Figures 1 and 2.

Airflow visualization can also be achieved using wireless monitoring sensor technology. Visualization is provided by real-time thermal mapping software that uses temperature and pressure data from properly deployed wireless sensors. The thermal map gives continuous feedback to the data center operator, initially to optimize floor tile outlet size and location and subsequently to identify operational problems. Wireless sensors are a lower-cost alternative to hard-wired sensors and can be relocated easily when a data center is modified. A companion case study technical bulletin is available regarding the application of wireless sensors at LBNL; see the DOE Wireless Sensors Improve Data Center Efficiency Technology Case Study Bulletin.

Another visualization technique is thermal imaging. This technique uses infrared thermography to provide immediate visual feedback on performance deficiencies. However, thermal imaging cannot be used for continuous monitoring.
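To illustrate how monitoring data feeds this kind of visualization, the short Python sketch below flags racks whose inlet temperatures drift outside the ASHRAE recommended range. It is a minimal example only: the CSV file name and column names are hypothetical, and the thresholds are the general ASHRAE recommended IT inlet limits rather than LBNL settings.

    import csv

    # ASHRAE-recommended IT inlet air temperature range (18-27 C, i.e., 64.4-80.6 F).
    RECOMMENDED_MIN_F = 64.4
    RECOMMENDED_MAX_F = 80.6

    def find_hotspots(path="inlet_temps.csv"):
        """Flag racks whose measured inlet temperature is outside the recommended range."""
        flagged = []
        with open(path, newline="") as f:
            # Hypothetical export format: columns sensor_id, rack, inlet_temp_f.
            for row in csv.DictReader(f):
                temp = float(row["inlet_temp_f"])
                if temp < RECOMMENDED_MIN_F or temp > RECOMMENDED_MAX_F:
                    flagged.append((row["rack"], temp))
        return sorted(flagged, key=lambda item: item[1], reverse=True)

    if __name__ == "__main__":
        for rack, temp in find_hotspots():
            print(f"Rack {rack}: inlet {temp:.1f} F is outside the recommended range")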
Developing solutions

Many airflow deficiencies can be expected in legacy data centers that have gone through numerous modifications, rearrangements, and refreshes of server equipment. Air leaks, obstructions, poorly located perforated tiles, cable penetrations, and missing blanking plates can cause poor air distribution that can often be remedied with low-cost solutions. In certain situations, customized solutions with more extensive construction costs can still have acceptable payback periods because they free up existing cooling capacity.

Implementing fixes and upgrades

Generally, fixing "what's broke" makes the most energy and economic sense in legacy data centers. The challenge is achieving this goal without having to rebuild the entire data center, though a rebuild can sometimes be the best economic solution. Investing retrofit funds in passive components, such as sealing leaks under the floor, repairing ductwork, replacing floor tiles, and adding blanking plates, is usually a beneficial and cost-effective solution.

Generalized Approach to Airflow Improvement

Each of the following measures will contribute to increasing the cooling capacity of your existing data center systems and may help avoid the complexity of installing new, additional cooling capacity.

Identify common airflow problems
- Hot spots
- Leaks
- Mixing
- Recirculation
- Short circuiting
- Obstructions
- Disordered pathways
- Turbulence
- Reverse flows

Fundamental Repairs
Complete basic maintenance before initiating more complex repairs.
- Remove or relocate as many under-floor obstructions to airflow as practical.
- Fix under-floor air leaks, primarily at cable and pipe penetrations and under racks.
- Install missing blanking plates and side panels in server racks.
- Relocate tiles to balance under-floor flow.
- Fix or replace leaking or broken floor tiles.

Develop Retrofit Solutions
- Determine chiller and cooling tower system capacity.
- Study airflow patterns with a visualization tool or method.
- Develop designs to optimize airflow circulation.
- Consider establishing a return air path with an overhead plenum.
- Add or connect ductwork from each computer room air conditioner (CRAC) unit air intake to the overhead plenum.
- Include isolation dampers in the air intake ductwork.
- Prepare servers for possible downtime.

Commissioning
Thoroughly commission the air management installation, including:
- Confirm point-to-point connections of temperature sensors.
- Ensure airflow leakage around floor tiles is minimized.
- Verify server airflow temperature at IT equipment inlets.
- Check air temperature at server inlets and outlets for differential temperature (ΔT); an airflow sizing sketch follows these lists.
- Confirm isolation damper operation at CRAC air inlets (returns).
- Check for leaks at cable penetrations in the floor.
- Review and test new control sequences for CRAC units.

Additional Measures
Operators of data centers using under-floor air distribution should consider and implement additional appropriate energy-efficiency measures, including:
- Increasing the data center setpoint temperature.
- Optimizing control coordination by installing an energy monitoring and control system (EMCS).
- Disabling humidification systems, or broadening their control range, to avoid unintended simultaneous humidification and dehumidification.
- Installing barriers, e.g., curtains, for hot/cold aisle isolation.
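The ΔT commissioning check above rests on the standard sensible-heat relationship q[BTU/h] = 1.08 x CFM x ΔT(°F). The Python sketch below uses illustrative numbers rather than LBNL measurements to show how much cold air a rack needs; if the nearby tiles deliver much less, the balance arrives as recirculated hot air.

    # Sensible-heat airflow estimate: CFM = (kW x 3412 BTU/h per kW) / (1.08 x dT in F).

    def required_cfm(it_load_kw: float, delta_t_f: float) -> float:
        """Cold-air flow needed to remove it_load_kw at a server temperature rise of delta_t_f."""
        btu_per_hr = it_load_kw * 3412.0
        return btu_per_hr / (1.08 * delta_t_f)

    if __name__ == "__main__":
        # Example: a 5 kW rack with a 20 F air temperature rise needs roughly 790 CFM.
        print(round(required_cfm(5.0, 20.0)))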
Retrofit Experience at LBNL

At LBNL, many common airflow problems existed and were exacerbated by airflow deficiencies particular to this 40-year-old data center. Existing perforated tiles were inefficiently arranged, causing airflow from under-floor outlets to be misdirected. In addition, low volume flow rates were noted in certain sections of the data center, resulting in hotspots (see Figure 3). Notably, optimization of supply tile outlet size (known as tile free-hole area) and location was enhanced by using wireless temperature monitoring devices[1] and associated visualization software.

In this data center, hot air short circuiting through the racks was widespread. To limit this, LBNL installed server rack blanking plates and recommends using side panels as well. In addition, LBNL used brush-type seals around cabling to mitigate this air-leak pathway.

The combination of air management best practices implemented at LBNL improved airflow and under-floor air pressure.

Figure 3: Before the retrofit, auxiliary fans were used to cool hot spots

[1] See the Technical Case Study Bulletin on wireless sensor monitoring at URL.

Cooling Arrangement and Problems Identified at LBNL

- Cooling was provided by seven down-flow CRAC units with under-floor supply.
- Supplementary cooling was also supplied overhead, mixing with hot server return air.
- Server racks were missing blanking panels at inlets and between racks.
- Too many perforated floor tiles lowered under-floor pressure, limiting distribution (a tile airflow sketch follows this section).
- Airflow mixing and short circuiting created hotspots.
- CRAC units were simultaneously humidifying and dehumidifying, wasting energy.

Retrofit Solutions

Completed basic repairs:
- Patched under-floor leaks.
- Added blanking panels in racks.
- Sealed cable penetrations in the floor.

Implemented main retrofits:
- Converted the false ceiling to a hot-air return plenum: the ceiling plenum became the return airflow pathway, CRAC intakes were extended into the new overhead plenum, and CRAC units were ducted to the ceiling plenum (see Figure 4).
- Rebalanced air distribution: removed supply outlets in hot aisles and reduced floor tile openings in cold aisles.
- Redirected supplementary cooling to the under floor: relocated the existing overhead supply to the under-floor plenum (see Figures 5 and 6).
- Installed isolation curtains: added strip curtains to hot aisles (see Figure 7).

Figure 4: Return duct from plenum to CRAC unit
Figure 5: Airflow before retrofit (red arrows = hot; blue arrows = cold)
Figure 6: Revised airflow with return plenum (red arrows = hot; blue arrows = cold)
Figure 7: Isolation air curtains with fusible links
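The link between tile count and under-floor pressure noted above can be approximated with the orifice relation: tile flow scales with free-hole area and the square root of plenum static pressure. The Python sketch below uses typical coefficients and tile dimensions as assumptions; it is not derived from LBNL measurements.

    from math import sqrt

    AIR_DENSITY = 1.2        # kg/m^3, room-temperature air
    DISCHARGE_COEFF = 0.65   # typical value for perforated-plate openings

    def tile_cfm(free_area_m2: float, plenum_pa: float) -> float:
        """Approximate airflow through one perforated tile at a given plenum static pressure."""
        velocity = sqrt(2.0 * plenum_pa / AIR_DENSITY)    # air speed through the openings, m/s
        flow_m3s = DISCHARGE_COEFF * free_area_m2 * velocity
        return flow_m3s * 2118.9                          # convert m^3/s to CFM

    if __name__ == "__main__":
        free_area = 0.093   # roughly a 25%-open 2 ft x 2 ft tile
        print(round(tile_cfm(free_area, 12.5)))   # ~580 CFM at 12.5 Pa (0.05 in. w.g.)
        print(round(tile_cfm(free_area, 6.25)))   # ~410 CFM if excess tiles or leaks halve the pressure

Because flow falls with the square root of plenum pressure, every unnecessary tile or leak reduces the air available at the racks that need it most.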

Retrofit Results

Improved return airflow
A return plenum was created from an existing overhead ceiling, which substantially improved hot air return to the CRAC unit intakes.

Eliminated short-circuit airflow
Airflow short circuits were eliminated by adding ductwork at the top of each CRAC unit to connect it to the newly configured hot-air return plenum in the data center's ceiling.

Reduced mixing
Cold and hot airflow mixing was significantly reduced by redirecting cooling air from an existing overhead supply system to the under-floor cooling plenum already in use by the CRAC units. Strip curtains improved isolation of the cold and hot airstreams.

Optimized under-floor air distribution
Under-floor air distribution was examined and optimized for pressure and flow distribution. Installation of many wireless temperature and pressure sensors aided the optimization process (refer to the DOE Wireless Sensors Improve Data Center Efficiency Technology Case Study Bulletin). The following improvements were made to the tile configuration (see Figure 8):
- 27 perforated floor tiles removed.
- 30 floor tiles converted from high- to low-flow.
- 4 floor tiles converted from low- to high-flow.

Figure 8: Floor tiles before and after retrofit

Retrofit Benefits

The airflow management retrofit at LBNL resulted in many benefits, including some surprises such as freeing up 60 tons of stranded cooling capacity. This excess capacity allowed 30 tons of CRAC capacity to be used immediately for cooling new servers and the remaining 30 tons to provide redundancy. The following benefits were realized:
- Released 60 tons of stranded CRAC capacity initially.
- Increased cooling capacity by 21% (~150 kW).
- Decreased fan energy by 8%.
- Raised CRAC unit setpoints 3°F warmer due to more effective cooling.
- Eliminated most hot spots.
- Turned off one 15-ton unit for stand-by operation.
- Created redundancy with one extra 15-ton unit, on line but operating at minimum.
- Extended the life of the existing data center infrastructure.

Figure 9 shows a five-month history of improved under-floor airflow. Note the significant performance improvement resulting from CRAC maintenance that included tightening loose supply fan belts.

Retrofit Costs

The airflow management retrofit at LBNL corrected numerous air distribution deficiencies that may not be found in all data centers. The costs covered not only improvements to an unusual airflow arrangement but also repairs to CRAC units, fixing dampers, moving chilled water pipes, and relocating fire sprinklers. Expenditures amounted to approximately $61,000, plus $4,100 for the airflow isolation curtains, in this 5,300-square-foot data center, which is slightly more than $12 per square foot. Depending on how the benefits listed above are valued, the airflow management retrofit results in a simple payback of 2 to 6 years.
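The cost figures above reduce to simple arithmetic, sketched below in Python. The per-square-foot value reproduces the numbers quoted in this bulletin; the annual savings shown are back-calculated from the stated 2-to-6-year payback range rather than separately reported values.

    RETROFIT_COST = 61_000 + 4_100   # retrofit work plus isolation curtains, in dollars
    FLOOR_AREA_SQFT = 5_300

    def simple_payback_years(annual_savings: float) -> float:
        """Simple payback: total cost divided by the annual value placed on the benefits."""
        return RETROFIT_COST / annual_savings

    if __name__ == "__main__":
        print(f"${RETROFIT_COST / FLOOR_AREA_SQFT:.2f} per sq ft")   # about $12.28
        # Paybacks of 2 and 6 years imply valuing the combined benefits at
        # roughly $32,600 and $10,900 per year, respectively.
        print(round(simple_payback_years(32_600), 1))   # ~2 years
        print(round(simple_payback_years(10_850), 1))   # ~6 years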

Lessons Learned

The demonstration project at LBNL provided lessons learned that are applicable to many existing data centers. Most data centers will reduce energy use by implementing similar air management projects.

Maintain Airflow Devices
Regular preventative maintenance, inspection, and tune-ups are highly effective in reducing energy waste in data centers. A sizeable increase in airflow resulted from basic maintenance; in particular, LBNL facilities personnel found CRAC fan belts to be loose and slipping.

Manage Energy With Metering
The LBNL demonstration project clearly validated the old energy-use axiom: "you cannot manage energy without monitoring energy." Adequate metering and monitoring are essential to providing reliable energy data and trends to sustain the performance of a data center.

Supervise Performance with EMCS
An EMCS, Building Automation System (BAS), or other monitoring system should be used to gather air temperatures and to develop trend information prior to installing air management retrofits. Generally, an EMCS with centralized, automated, direct digital control can coordinate the energy use of cooling systems such as CRAC units, maximizing data center performance (an illustrative staging sketch follows the rack cooling list below). For example, one beneficial result of the airflow improvements at LBNL was a relatively large amount of excess cooling capacity created within the data center. The newly found capacity required coordinating existing CRAC unit operation to allow shutdown of redundant CRAC units. This was achieved by manual control of the existing CRAC units, as no centralized control system was available. Standard CRAC return-air control methods, which are beyond the scope of this bulletin, can present a challenging situation.

Use Advanced Wireless Monitoring Devices
LBNL employed a wireless monitoring system to maximize energy savings. In conjunction with an EMCS, wireless monitoring devices can significantly enhance data collection at a much reduced cost compared to hard-wired sensors and devices. At LBNL, monitoring temperatures and under-floor air pressures helped to identify improvement strategies.

Optimize Rack Cooling Effectiveness
Data center operators should consider the following items to maximize rack cooling effectiveness:
- Match under-floor airflow to IT equipment needs.
- Locate higher density racks near CRAC units; verify airflow effectiveness.
- Locate servers to minimize vertical and horizontal empty space in racks.
- Consolidate cable penetrations to reduce leaks.
- Load racks bottom first in under-floor distribution systems.
- Use blanking plates and panels.
- Eliminate floor openings in hot aisles.
- Establish hot and cold airstream isolation.
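As a purely illustrative example of the EMCS coordination discussed under "Supervise Performance with EMCS" (the LBNL units were in fact controlled manually), the Python sketch below stages CRAC units to the measured IT load while keeping one unit on line for redundancy. The ~15-ton unit size follows this bulletin; the logic itself is an assumption, not the LBNL control sequence.

    import math

    CRAC_KW = 52.8        # ~15 tons of cooling per unit (1 ton of cooling ~ 3.52 kW)
    REDUNDANT_UNITS = 1   # keep one extra unit running (N+1 redundancy)

    def crac_units_to_run(it_load_kw: float, total_units: int = 7) -> int:
        """Number of CRAC units a supervisory EMCS would keep on line for the measured load."""
        needed = math.ceil(it_load_kw / CRAC_KW) + REDUNDANT_UNITS
        return min(needed, total_units)

    if __name__ == "__main__":
        print(crac_units_to_run(250.0))   # 250 kW of IT load -> 6 units (5 plus 1 spare)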
References

Hydeman, M. Lawrence Berkeley National Laboratory (LBNL) Case Study of Building 50B, Room 1275 Data Center from 2007 to 2009. Taylor Engineering, LLC. December 2009.

Acknowledgements

Bulletin Primary Author:
Geoffrey C. Bell, P.E., 510.486.4626, gcbell@lbl.gov

Reviewers and Contributors:
Paul Mathew, Ph.D., 510.486.5116, pamathew@lbl.gov
William Tschudi, P.E., 510.495.2417, wftschudi@lbl.gov
David H. Edgar, Information Technology Division, M.S. 50E-101, 510.486.7803, dhedgar@lbl.gov

For more information on FEMP:
Will Lintner, P.E., Federal Energy Management Program, U.S. Department of Energy, 1000 Independence Ave., S.W., Washington, D.C. 20585-0121, 202.586.3120, william.lintner@ee.doe.gov

EERE Information Center: 1-877-EERE-INFO (1-877-337-3463), www.eere.energy.gov/informationcenter

DOE/Technical Case Study Bulletin, September 2010. Printed with a renewable-source ink on paper containing at least 50% wastepaper, including 10% post-consumer waste. CSO 19820