Energy Efficiency Opportunities in Federal High Performance Computing Data Centers
Prepared for the U.S. Department of Energy Federal Energy Management Program
By Lawrence Berkeley National Laboratory
Rod Mahdavi, P.E., LEED AP
September 2013
Contacts

Rod Mahdavi, P.E., LEED AP
Lawrence Berkeley National Laboratory
(510)

For more information on FEMP:
Will Lintner, P.E.
Federal Energy Management Program
U.S. Department of Energy
(202)
Contents

Executive Summary
Overview
Assessment Process
Challenges
EEMs for HPC Data Centers
    Air Management Adjustment Package
    Cooling Retrofit Package
    Generator Block Heater Modification Package
    Full Lighting Retrofit Package
    Chilled Water Plant Package
    Additional Opportunities
Lessons Learned
Next Steps
List of Figures

Figure 1. Data Center Thermal Map, After Containment
Figure 2. Data Center Thermal Map, After Raised Temperature
Figure 3. Hoods for Power Supply Unit Air Intake
List of Tables

Table 1. HPC Sites Energy/GHG Savings
Table 2. Computers and Cooling Types in HPC Sites
Table 3. EEM Packages Status for HPC Data Centers
Table 4. Summary of Power Losses and PUE in DOD HPC Data Centers
Executive Summary

The U.S. Department of Energy's Office of Energy Efficiency and Renewable Energy (EERE) Federal Energy Management Program (FEMP) assessed the energy use at six Department of Defense (DOD) High Performance Computing (HPC) data centers. Table 1 provides a breakdown of the potential energy and cost savings and average payback periods for each data center. The total energy saved was estimated at more than 8,000 megawatt-hours (MWh), with an annual greenhouse gas (GHG) emission reduction of 7,500 tons. Energy cost savings of approximately one million dollars per year were found to be possible through energy efficiency measures (EEMs) with an average payback period of less than two years.

The individual data centers contained a variety of IT systems and cooling systems. Server rack power densities were measured from 1 kW (non-HPC racks) to 25 kW. In a conventional data center with rows of server racks, separating the rows into hot and cold aisles and then containing the hot or the cold aisle can substantially reduce cooling energy. This arrangement does not fit some of the HPC data centers well because of their integrated local cooling and different cooling air paths. The most effective EEM was increasing the data center temperature, which made it possible to save cooling energy by raising the chilled water supply temperature.

Site      Payback (years)   Annual Energy Savings (MWh)   Annual GHG Emission Reduction (tons)
Site 1    …                 …                             …
Site 2A   2.5               3,060                         2,750
Site 2B   2                 2,520                         2,300
Site 3    …                 …                             …
Site 4    …                 …                             …
Site 5    …                 …                             …

Table 1. HPC Sites Energy/GHG Savings
Overview

The DOD Office of High Performance Computing Management Program and FEMP jointly funded assessments by personnel from Lawrence Berkeley National Laboratory (LBNL), following an earlier success in identifying EEMs at an HPC data center in Hawaii. This case study includes the results of the assessments, lessons learned, and recommended EEMs.

Table 2 shows the IT equipment for each site with the corresponding cooling systems. High power density is the most common feature of this equipment: average power per rack ranged from 10 kW to 25 kW. Another common feature is the use of 480 V power without intervening transformation. Because each transformation incurs power loss and heat rejection, eliminating several power transformations reduced the need to remove the generated heat. Many of the systems also have self-contained cooling using local heat exchangers, in the form of refrigerant cooling or rear door heat exchangers. The advantage of this feature is that heat is removed very close to the source, enabling heat transfer at higher temperatures. This by itself can make cooling very efficient, since very cold chilled water is not needed. The result is more hours of air-side or water-side economizer operation and fewer compressor operating hours. Most HPC server racks also differ from usual server racks in the way cooling air flows through them. An example is the Cray unit, with cooling air flowing from the bottom to the top rather than horizontally.

Site      IT System             Cooling Type
Site 1    Cray XE6              Refrigerant cooled, using local water-cooled refrigerant pumps
          SGI Altix 4700        Rear door heat exchanger, using central chilled water
          SGI Altix Ice 8200    Air cooled
Site 2A   Cray XT5              Air cooled with integrated fan, bottom air intake, top exhaust
Site 2B   SGI Altix Ice 8200    Air cooled, ready for rear door heat exchanger using central chilled water
          Linux Woodcrest       Air cooled
          Cray XE6              Refrigerant cooled, using local water-cooled refrigerant pumps
Site 3    Cray XT3              Air cooled with integrated fan, bottom air intake, top exhaust
          Cray XT4              Air cooled with integrated fan, bottom air intake, top exhaust
          SGI Altix Ice 8200    Rear door heat exchanger, using central chilled water
          IBM Cluster 1600 P5   Air cooled
Site 4    Cray XT5              Air cooled with integrated fan, bottom air intake, top exhaust
          IBM Power 6           Air cooled
Site 5    Dell PowerEdge M610   Air cooled

Table 2. Computers and Cooling Types in HPC Sites
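To illustrate why eliminating transformations matters, the short sketch below compares a distribution chain with several conversion stages against direct 480 V distribution. The stage efficiencies and IT load are illustrative assumptions, not measurements from the assessed sites.

```python
# Illustrative sketch: cascaded electrical distribution losses.
# Efficiency values below are assumptions for illustration only.

def utility_draw(it_load_kw, stage_efficiencies):
    """Power drawn from the utility to deliver it_load_kw to the IT equipment."""
    supplied = it_load_kw
    for eff in stage_efficiencies:
        supplied /= eff  # each stage dissipates (1 - eff) of its throughput as heat
    return supplied

it_load = 1000.0  # assumed IT load, kW

# Conventional chain: building transformer -> UPS -> PDU transformer
conventional = utility_draw(it_load, [0.98, 0.94, 0.97])
# 480 V direct to the racks: a single transformation
direct = utility_draw(it_load, [0.98])

print(f"Conventional chain draw: {conventional:.0f} kW")
print(f"480 V direct draw:       {direct:.0f} kW")
print(f"Avoided loss (which is also avoided cooling load): {conventional - direct:.0f} kW")
```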
Assessment Process

The first step of the assessment was to baseline the environmental conditions in the HPC data centers against the ASHRAE recommended and allowable thermal guidelines. LBNL installed a wireless monitoring system, with environmental sensor points placed throughout each data center: temperature sensors at the supply and return of all computer room air handlers (CRAHs) and/or computer room air conditioners (CRACs). Supply thermal nodes were installed under the raised floor just in front of the unit. Return nodes were placed over the intake filter or strapped to the beam on top of the intake chimney. For CRAH/CRAC units with chimneys extended to the data center ceiling, thermal nodes were placed over the ceiling tile very close to the intake. Humidity was measured by the same thermal nodes. Temperature sensors were placed on the front of the rack near the top, at the middle, and near the base. Other temperature sensors were installed on the back of the rack, in the sub-floor, and on every third rack in a row. Pressure sensors were located throughout the data center to measure the air pressure differential between the sub-floor supply plenum and the room. The same approach was used for data centers with no raised floor by installing the sensors in the ductwork. The wireless sensor network continuously sampled the data center environmental conditions and reported at five-minute intervals.

Power measurements (where possible), power readings from equipment (switchgear, UPS, PDU, etc.), and power usage estimates facilitated the calculation of power usage effectiveness (PUE). Power use information is critical to the assessment. IT power should be measured as close as possible to the IT equipment. More recent power supply units measure power and can communicate measurements to a central monitoring system. If this is not possible, then communicated or manually read PDU power, or UPS power at a higher level in the chain, can be taken as the IT power usage. Loss in the electrical power chain can be measured, or it can be estimated by considering how efficiently the UPS units are loaded. Lighting power usage can be estimated by counting the fixtures or based on watts per square foot. Cooling load measurement, including chiller power usage, can be a challenge; if reading from the panel is not possible, estimation based on theoretical plant efficiency can work. While a spot measurement at CRAH units with constant-speed fans is sufficient to estimate power use, CRAC units are better monitored continuously because of the variable power usage of their compressors.
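As a minimal sketch of how these measured and estimated loads roll up into PUE (total facility power divided by IT power), assuming the component loads have already been obtained as described above; the values are placeholders, not site data:

```python
# Minimal PUE roll-up sketch. All load values (kW) are placeholders,
# not measurements from the assessed sites.

it_power = 1000.0            # measured as close to the IT equipment as possible
electrical_dist_loss = 80.0  # UPS/PDU/transformer losses, measured or estimated
cooling = 350.0              # chiller plant share serving the data center
fans = 90.0                  # CRAH/CRAC fan power
lighting_and_other = 30.0    # counted fixtures or a W/sqft estimate

total_facility_power = (it_power + electrical_dist_loss + cooling
                        + fans + lighting_and_other)
pue = total_facility_power / it_power
print(f"PUE = {pue:.2f}")  # 1.55 with these placeholder numbers
```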
The next step was to analyze the data and present the results of the assessment. The collected data provided empirical measures of recirculation and bypass air mixing, and of cooling system efficiency. The assessment established each data center's baseline energy utilization and identified the EEMs and their potential energy savings. In some of the centers, a few low-cost EEMs were completed and their impacts were observed in real time. For instance, Figure 1 is the thermal map of one of the data centers that was assessed. The map shows the impact of partially contained hot aisles. The major power usage comes from the two rows on the left. The containment helped to isolate hot air to some extent.

Figure 1. Data Center Thermal Map, After Containment

Figure 2 shows the impact of raised CRAH supply air temperature. The result is a more uniform and higher temperature within the data center space. Chilled water temperature was then increased, which reduced energy usage by the chiller plant. In this case, the IT equipment cooling air intake was at the front of the rack and the exhaust at the back.

Figure 2. Data Center Thermal Map, After Raised Temperature

Challenges

The LBNL technical team was challenged by these energy assessments, as many of the HPC data centers employed different cooling types and configurations in the same room. For example,
while one area used the building HVAC system, another system had rack-based air cooling using refrigerant coils, and a third had rear door heat exchangers using the central chilled water system.

The air intake to the IT equipment posed another challenge. While in conventional server racks air enters horizontally at the face of the rack and exits from the back, in some HPC systems (e.g., Cray) air enters from the bottom (under the raised floor) and exits at the top. With vertical air displacement, containment of aisles by itself does not affect cooling energy use. The same holds true with in-row and in-rack cooling systems, or rear door heat exchangers, since the exhaust air can be as cold as the intake air. This cold air mixes with the hot air, reducing the temperature difference across the air handler's coils, which results in inefficiencies in the cooling systems.

EEMs for HPC Data Centers

For the presentation of results, LBNL personnel grouped the EEMs into packages categorized by cost and simple payback. Typical EEMs covered in each package, with rough cost estimates, are described as follows.

Air Management Adjustment Package

- Seal all floor leaks,
- Rearrange the perforated floor tiles, locating them only in cold aisles for conventional front-to-back airflow, with solid tiles everywhere else,
- Contain hot air to avoid mixing with cold air,
- Seal spaces between and within racks,
- Raise the supply air temperature (SAT),
- Disable humidity controls and humidifiers,
- Control humidity on makeup air only, and
- Turn off unneeded CRAH units.

The typical cost of this package is $100/kW of IT power. A typical simple payback is about 1 year.

Cooling Retrofit Package

- Install variable speed drives for air handler fans and control fan speed by air plenum differential pressure,
- Install ducting from the air handler units to the ceiling to allow hot aisle exhaust to travel through the ceiling space back to the computer room air handling units,
- Convert computer room air handler air temperature control to rack inlet air temperature control, and
- Raise the chilled water supply temperature, saving energy through better chiller efficiency.
The typical cost of this package is $180/kW of IT power. A typical simple payback is 2.5 years.

Generator Block Heater Modification Package

- Equip the generator's block heater with thermostat control, and
- Reduce the temperature set point for the block heater.

The typical cost of this package is $10/kW of IT power. A typical simple payback is 2.5 years.

Full Lighting Retrofit Package

- Reposition light fixtures from above racks to above aisles,
- Reduce lighting, and
- Install occupancy sensors to control fixtures.

The typical cost of this package is $15/kW of IT power. A typical simple payback is 3 years.

Chilled Water Plant Package

- Install a water-side economizer,
- Investigate with the HPC equipment manufacturer whether the chilled water supply temperature to their heat exchanger can be increased,
- Run two chilled water loops, one for equipment with lower chilled water temperature requirements and the other for equipment with rear door heat exchangers,
- Install VFDs on pumps, and
- Purchase high-efficiency chillers and motors if a renovation or capacity increase is planned.

The typical cost of this package is $400/kW of IT power. A typical simple payback is 3 years. (A sketch combining these package costs and paybacks for a sample IT load follows the next subsection.)

Additional Opportunities

The use of water cooling can lead to major reductions in power use due to the higher heat-carrying capacity of liquids in comparison to air. Higher temperature water can also be used for cooling, saving additional energy. This strategy works very well with rear door heat exchangers, and it can be integrated with the use of cooling towers or dry coolers to provide cooling water directly, bypassing compressor cooling. Refrigerant cooling systems can be specified to operate using higher water temperatures than currently required. There is already equipment that operates at 65°F. This will increase compressor-less cooling hours.
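To make the $/kW costs and paybacks above concrete, the following sketch combines the five packages for a hypothetical site. The IT load is an assumption, and the annual savings are back-computed from the quoted cost and simple payback figures; none of this is data from the assessed sites.

```python
# Illustrative combination of EEM packages for a hypothetical 1 MW IT load.
# Costs ($/kW) and simple paybacks (years) are the typical figures quoted above;
# annual savings are back-computed as cost / payback.

IT_LOAD_KW = 1000.0  # assumed site IT load

packages = {
    # name: (cost in $/kW of IT power, typical simple payback in years)
    "Air Management Adjustment": (100.0, 1.0),
    "Cooling Retrofit": (180.0, 2.5),
    "Generator Block Heater": (10.0, 2.5),
    "Full Lighting Retrofit": (15.0, 3.0),
    "Chilled Water Plant": (400.0, 3.0),
}

total_cost = 0.0
total_annual_savings = 0.0
for name, (cost_per_kw, payback_years) in packages.items():
    cost = cost_per_kw * IT_LOAD_KW
    annual_savings = cost / payback_years  # simple payback definition
    total_cost += cost
    total_annual_savings += annual_savings
    print(f"{name:28s} cost ${cost:>9,.0f}   saves ${annual_savings:>9,.0f}/yr")

print(f"Combined simple payback: {total_cost / total_annual_savings:.1f} years")
```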
Site      Recommended Packages
Site 1    Air Management Adjustment, Cooling Retrofit, Lighting Retrofit, Chilled Water Plant
Site 2A   Air Management Adjustment, Cooling Retrofit, EG Block Heater, Lighting Retrofit, Chilled Water Plant
Site 2B   Air Management Adjustment, Cooling Retrofit, EG Block Heater, Lighting Retrofit, Chilled Water Plant
Site 3    Air Management Adjustment, Cooling Retrofit, EG Block Heater, Lighting Retrofit, Chilled Water Plant
Site 4    Air Management Adjustment, Cooling Retrofit, EG Block Heater, Lighting Retrofit, Chilled Water Plant
Site 5    Air Management Adjustment, Cooling Retrofit, EG Block Heater, Lighting Retrofit, Chilled Water Plant

Table 3. EEM Packages Status for HPC Data Centers

Implementation status for the individual packages at the time of the study ranged from potential to implemented.

Lessons Learned

The main barrier to increasing the supply air temperature was the maximum temperature requirements of the IT equipment and refrigerant pumping systems. In one of the sites, the refrigerant pumping system required chilled water supply temperatures of 45°F. Granular monitoring enables temperature measurements at the server level, allowing some simpler low-cost EEMs to be implemented during the assessment without interrupting data center operation. Chilled water supply temperature setpoint optimization can result in large energy savings, especially for cooling systems that utilize air-cooled chillers.
Some conventional approaches, such as sealing the floor, are applicable to HPC data centers, but some EEMs commonly applied to enterprise data centers, such as hot/cold aisle isolation, are not suitable for HPC data centers where the racks are cooled internally by refrigerant coils or rear door heat exchangers.

Hoods were installed, as shown in Figure 3, to direct cold air to Cray power supply units, an example of a site engineer's remedy to prevent mixing of cold and hot air. The same approach will work in similar situations where IT equipment has an unusual configuration. In this case, the airflow from the bottom of the racks did not reach the power supply units, so they required additional airflow from the back of the cabinet. Installing perforated tiles alone would have caused mixing of cold and hot air, but hoods installed on top of the perforated tiles direct air from under the raised floor to the cabinet. To avoid releasing cold supply air into the aisle, the remainder of each perforated tile that was exposed to the room was blanked off.

Figure 3. Hoods for Power Supply Unit Air Intake

Openings within the HPC racks create problems with overheating of cores when the data center temperature is increased to save energy. Installing internal air dams to prevent recirculation of air within the racks helped to address this problem.
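A minimal sketch of how granular, rack-level monitoring can guard a supply-temperature increase, assuming the ASHRAE recommended inlet envelope of 18-27°C (64.4-80.6°F); the sensor readings below are placeholders:

```python
# Sketch: flag rack-inlet sensors that leave the ASHRAE recommended envelope
# as the supply air temperature is raised. Readings below are placeholders.

ASHRAE_RECOMMENDED_F = (64.4, 80.6)  # 18-27 degrees C

# (rack, position) -> inlet temperature in degrees F, e.g. from the wireless nodes
inlet_temps_f = {
    ("rack-03", "top"): 81.5,
    ("rack-03", "mid"): 76.2,
    ("rack-03", "base"): 68.9,
    ("rack-06", "top"): 79.8,
}

low, high = ASHRAE_RECOMMENDED_F
for (rack, position), temp in sorted(inlet_temps_f.items()):
    if not low <= temp <= high:
        print(f"{rack} {position}: {temp:.1f} F is outside the recommended range "
              f"({low}-{high} F); hold or lower the SAT setpoint")
```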
Site      IT Load (W/sqft)   IT Load (kW)   Elec. Dist. Loss (kW)   Cooling Load (kW)   Fan Load (kW)   Other Users (kW)   Current PUE   PUE
Site 1    …                  …              …                       …                   …               …                  …             …
Site 2A   180                1,…            …                       …                   …               …                  …             …
Site 2B   …                  …              …                       …                   …               …                  …             …
Site 3    …                  …              …                       …                   …               …                  …             …
Site 4    …                  …              …                       …                   …               …                  …             …
Site 5    …                  …              …                       …                   …               …                  …             …

Table 4. Summary of Power Losses and PUE in DOD HPC Data Centers

Next Steps

In order to implement the remaining EEMs, FEMP recommends an investment-grade assessment by a firm experienced in data center efficiency improvements. If agency funds are not available for implementing the EEMs, then private sector financing mechanisms such as energy savings performance contracts (ESPCs) or utility energy service contracts (UESCs) may be appropriate, considering the attractive payback periods and the magnitude of savings. FEMP can assist in exploring such opportunities.

In addition, the LBNL experts recommend installing metering and monitoring systems, especially with a comprehensive dashboard to present the environmental data and the energy efficiency related data, including the power use by different components and systems. The dashboard will allow the operators to make real-time changes to optimize energy efficiency. The LBNL experts also recommend that future purchases of IT equipment include a preference for water-cooled systems. At a minimum, the IT equipment should be capable of operating at an air intake temperature above 90°F, operate at a high voltage (480 V is preferred), and contain variable speed server fans controlled by the server core temperature.
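As a gesture at what such a dashboard computes, here is a sketch that turns interval power samples into a rolling PUE trend; the window size, field names, and sample values are assumptions:

```python
# Sketch: rolling PUE from interval power samples, the kind of series a
# metering dashboard would plot. Sample values are placeholders.
from collections import deque

WINDOW = 288  # number of five-minute samples in 24 hours

it_kw_window = deque(maxlen=WINDOW)
total_kw_window = deque(maxlen=WINDOW)

def record_sample(it_kw: float, total_kw: float) -> float:
    """Add one interval sample and return the 24-hour rolling PUE."""
    it_kw_window.append(it_kw)
    total_kw_window.append(total_kw)
    return sum(total_kw_window) / sum(it_kw_window)

# e.g., feed samples as they arrive from the meters
print(f"rolling PUE = {record_sample(1000.0, 1520.0):.2f}")
print(f"rolling PUE = {record_sample(990.0, 1500.0):.2f}")
```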
DOE/EE-0971  September 2013

Printed with a renewable-source ink on paper containing at least 50% wastepaper, including 10% post-consumer waste.