Case Study: Opportunities to Improve Energy Efficiency in Three Federal Data Centers
Prepared for the U.S. Department of Energy's Federal Energy Management Program

Prepared by Lawrence Berkeley National Laboratory
Rod Mahdavi, P.E., LEED AP

May 2014
Contacts

Rod Mahdavi
Lawrence Berkeley National Laboratory
1 Cyclotron Road
Berkeley, CA
Phone: (510)
[email protected]

For more information on the Federal Energy Management Program, please contact:

William Lintner
Federal Energy Management Program
U.S. Department of Energy, EE-2L
1000 Independence Ave., S.W.
Washington, D.C.
Phone: (202)
[email protected]
Abbreviations and Acronyms

BMS     Building Management System
BTU     British Thermal Unit
CRAH    Computer Room Air Handler
DC      Data Center
EEM     Energy Efficiency Measure
Genset  Standby Generator
GHG     Greenhouse Gas
IT      Information Technology
kW      Kilowatt
LBNL    Lawrence Berkeley National Laboratory
MWh     Megawatt-hours
PDU     Power Distribution Unit
PUE     Power Usage Effectiveness
UPS     Uninterruptible Power Supply
Contents

Executive Summary
Introduction
Assessment Process
Observations
Analysis
Challenges
Energy Efficiency Measures for the Federal Data Centers
Forecasted Results
Lessons Learned
Next Steps and Recommendations

List of Figures

Figure 1. Power Usage in the Data Centers
Figure 2. DC1 Hot Aisle Containment
Figure 3. DC3 Openings in Racks
Figure 4. DC3 PDU Open Skirts
Figure 5. Air Recirculation Under the Racks
Figure 6. DC2 Lighting and Perforated Tiles
Figure 7. Data Center EEMs
Figure 8. Power Usage Savings Based on Implementation of EEMs
Figure 9. Electricity Flows in Data Centers

List of Tables

Table 1. Data Centers Potential Energy, GHG, and Cost Savings
Table 2. Summary of Power, Losses, Current and Potential PUE
Executive Summary

Lawrence Berkeley National Laboratory (LBNL) evaluated three data centers for potential energy efficiency improvements over the course of one summer. Geographic location, facility layout, data center spatial configuration, Information Technology (IT) power supply and demand, and mechanical, electrical, and cooling system equipment varied among the three sites. The data centers also contained a variety of new and old infrastructure support equipment; dynamic, growing IT platforms; and thousands of computing, storage, and data transport servers and devices.

Table 1 summarizes the results of the three data center (DC) assessments, including potential energy savings, potential greenhouse gas (GHG) emission reductions, and average payback periods for completing the recommended projects at each data center. Total estimated potential energy savings were 11,500 megawatt-hours (MWh), equivalent to an annual GHG emission reduction of 10,800 tons. In addition to the reduced fuel consumption and GHG emissions, annual energy cost savings are approximately $700,000. All of these savings are attainable by implementing the LBNL-identified energy efficiency measures (EEMs), which have an average payback period of approximately 2 years.

LBNL concluded that annual cost savings can be achieved by aligning IT racks and equipment rows into hot and cold aisles. Containing the hot aisle airflow yields substantial reductions in cooling energy expenditure, along with increased efficiency and lower costs. In DC1, most of the hot aisles were contained; in DC3, high-density racks were enclosed and hot air was exhausted through chimneys directly to the ceiling plenum. The most widespread and significant findings involved raising the data center supply air temperatures and increasing use of waterside economizers.
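The executive summary's totals can be reproduced under an assumed grid emission factor and electricity rate. Both constants in the sketch below are back-calculated from the report's own figures (10,800 tons and roughly $700,000), not independently sourced values:

```python
# Reproducing the executive summary's totals. The emission factor and
# electricity rate are assumptions back-calculated from the report's
# own figures, not published constants.

annual_savings_mwh = 11_500                 # total for the three sites

emission_factor_t_per_mwh = 0.94            # assumed grid emission factor
electricity_rate_per_kwh = 0.061            # assumed blended utility rate

ghg_tons = annual_savings_mwh * emission_factor_t_per_mwh
cost_savings = annual_savings_mwh * 1_000 * electricity_rate_per_kwh

print(f"GHG reduction: {ghg_tons:,.0f} tons/yr")   # close to the reported 10,800
print(f"Cost savings:  ${cost_savings:,.0f}/yr")   # close to the reported $700,000
```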
Increasing the room temperature also reduces the need for cooling from the computer room air handlers (CRAHs), allowing many of them to be turned off. In addition, sufficient cooling could be provided with a higher chilled-water supply temperature, reducing the hours of compressor-based cooling.

Site   Estimated Payback (yr)   Annual Energy Saving (MWh)   Annual GHG Reduction (ton)   Cost Savings   EEM Implementation Cost
DC1    –                        2,300                        3,000                        $200,000       $380,000
DC2    –                        2,900                        1,800                        $115,000       $276,000
DC3    2                        6,300                        6,000                        $385,000       $770,000

Table 1. Data Centers Potential Energy, GHG, and Cost Savings
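Simple payback for each site follows directly from Table 1's cost columns: implementation cost divided by annual cost savings. A minimal sketch using the table's values:

```python
# Simple payback for each site, derived from Table 1's own columns:
# payback = EEM implementation cost / annual cost savings.
sites = {
    "DC1": {"annual_savings": 200_000, "eem_cost": 380_000},
    "DC2": {"annual_savings": 115_000, "eem_cost": 276_000},
    "DC3": {"annual_savings": 385_000, "eem_cost": 770_000},
}

for name, s in sites.items():
    payback_years = s["eem_cost"] / s["annual_savings"]
    print(f"{name}: simple payback = {payback_years:.1f} years")
```

The per-site results (1.9, 2.4, and 2.0 years) are consistent with the report's stated average payback of approximately 2 years.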
Introduction

The Office of Energy Efficiency and Renewable Energy tasked Lawrence Berkeley National Laboratory (LBNL) with measuring energy usage and total building energy consumption. LBNL also evaluated the data center and IT systems and devices, and the building systems that automate and regulate energy management and affect energy consumption. The task included a request for recommendations to monitor and improve energy consumption, to balance and optimize equipment, and to develop a plan to avoid erosion of the energy and cost savings over time.

This case study presents the results of the three data center assessments, lessons learned, and recommended EEMs. Table 2 identifies IT equipment power density, end-use power breakdown, and power usage effectiveness (PUE) for the three data centers during the evaluation period, along with the potential PUE achievable through the recommended energy efficiency measures.

Sites  Current IT Load (W/sqft)  Current IT Load (kW)  Elec. Dist. Loss (kW)  Cooling Load (kW)  Fan Load (kW)  Other Users (kW)  Total DC (kW)  Current PUE  Potential PUE
DC1    33                        –                     –                      –                  –              –                 –              –            –
DC2    –                         –                     –                      –                  –              –                 –              –            –
DC3    62                        –                     –                      –                  –              –                 –              –            –

Table 2. Summary of Power, Losses, Current and Potential PUE

Figure 1 illustrates the power-use relationships among the three data centers during the summer evaluation period: DC1 consumed 33%, DC2 14%, and DC3 53% of the overall power.

Figure 1. Power Usage in the Data Centers (MWh by end use for DC1, DC2, and DC3: IT load, UPS loss, PDU/transformer loss, cooling, cooling fans, lighting, and standby generator)
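The PUE values in Table 2 follow the standard definition: total data center power divided by IT power. A minimal sketch, with hypothetical component readings chosen purely for illustration:

```python
# Sketch of the PUE definition used in Table 2: total facility power
# divided by IT power. The component readings below are hypothetical.

def compute_pue(it_kw, dist_loss_kw, cooling_kw, fan_kw, other_kw):
    """Power Usage Effectiveness = total data center power / IT power."""
    total_kw = it_kw + dist_loss_kw + cooling_kw + fan_kw + other_kw
    return total_kw / it_kw

pue = compute_pue(it_kw=1000, dist_loss_kw=80, cooling_kw=350,
                  fan_kw=120, other_kw=50)
print(f"PUE = {pue:.2f}")  # → PUE = 1.60
```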
Assessment Process

LBNL used a portable wireless monitoring kit to baseline the environmental conditions (temperature, humidity, and airflow ranges) in the data centers. Where the kit was used, the sensors and points of measurement were as follows:

- Temperature sensors were installed at the supply and return of all CRAHs. No intrusive sampling of devices occurred inside racks; exterior placements were made on the front of the IT racks (top, middle, and bottom), on the back of each rack (top and middle), and in the sub-floor space. The same thermal nodes also measured humidity.
- Pressure sensors were installed to measure sub-floor pressure.
- Power measurements were obtained from equipment displays, such as those on switchgear, uninterruptible power supply (UPS) units, and power distribution units, and were also taken at selected electrical panels using power-logging devices. Where a direct power measurement was not possible, power usage was conservatively estimated for the PUE calculation.
- Loss in the electrical power chain was either measured or estimated from the efficiency of the UPS units at their load factor.
- Lighting power usage was calculated by counting the fixtures.

Determining cooling power, including chiller power usage, was a challenge. Assessors used BTU and power meters; when direct power measurements or control-panel readings were not available, calculations based on theoretical plant efficiency were applied. Manufacturer specifications provided sufficient information for CRAHs with constant-speed fans, but building management system (BMS) log reports, available for DC3, were more useful.

Observations

The data centers' stakeholders were concerned about high energy use. In one center, DC2, the agency had installed low-cost hot aisle containment and manually logged the temperature on every rack weekly.
DC1 had some hot aisles contained, as illustrated in Figure 2. This data center also had a rack cooling system for its higher power density racks. DC2 had two of its twelve CRAHs turned off.

Figure 2. DC1 Hot Aisle Containment
DC3 had installed a wireless monitoring system. This center also had two chilled water loops, with the warmer loop serving the CRAHs, and a waterside economizer. Unfortunately, the economizer was not integrated into the chilled water loop, which minimized its active hours.

Efficiency measures associated with air management included closing openings in the floor and between and within racks (Figure 3), and closing openings around the power distribution units (PDUs) by applying solid skirts to stop the escape of cool air into the room (Figure 4).

Figure 3. DC3 Openings in Racks
Figure 4. DC3 PDU Open Skirts

When inefficient airflow was observed, recommendations were provided to optimize equipment use and to improve and balance the airflow. Some efficiencies were gained immediately through quick fixes, such as rearranging 98 floor tiles at one location. Hot air recirculation through the bottom of the racks was another issue, as illustrated in Figure 5: supply air from the perforated tile was 62°F, yet the depicted temperature sensor read 75°F just above the frame. Figure 6 illustrates excessive lighting in DC2, where perforated tiles were also installed in areas other than in front of the racks (the cold aisle).

Figure 5. Air Recirculation Under the Racks
Figure 6. DC2 Lighting and Perforated Tiles
Analysis

The collected data included empirical measurements of recirculation and bypass air mixing, and of cooling system efficiency. The assessment established each data center's baseline energy utilization and identified relevant EEMs and their potential energy savings. Operational procedures were also examined; some actions were taken during the assessment itself, and their impacts were observed in real time. For instance, in DC3, 35% of the CRAHs were turned off with no impact on operations and no considerable change in IT equipment air intake temperatures. The actual annual saving associated with turning off these CRAHs was 1,300 MWh, or $75,000.

Challenges

Site and data center observations occurred during a two-week period. The LBNL assessment team was initially challenged by security requirements and restricted site access, because the assessors did not hold the appropriate security clearances. With the assistance of site security personnel, however, access was not a major impediment to evaluating the infrastructure or the data centers; broad cooperation and the use of security escorts allowed steady daily progress. A second challenge involved the potential risk of disrupting mission-critical activities; by working with the host as a team, these obstacles also subsided.

Energy Efficiency Measures for the Federal Data Centers

Several EEMs were common and applicable at each data center. These measures, along with rough cost estimates, are described in Figure 7 (next page).

Estimated Savings for the Three Data Centers

Figure 8 illustrates the estimated power savings achievable in the three data centers if the recommended EEMs are fully implemented. The waterside economizer will have the most beneficial impact in DC1 and DC2, while fan power savings can be optimized in DC3.
Although the total savings scale with data center IT load or square footage, each specific EEM has a different level of impact.
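The DC3 result reported under Analysis (1,300 MWh and $75,000 per year from turning off 35% of the CRAHs) can be sanity-checked with simple arithmetic. The aggregate fan draw below is an assumed, illustrative figure; only the MWh and dollar values come from the text:

```python
# Back-of-envelope check of the DC3 CRAH shutdown savings. The 148 kW
# aggregate fan draw is an assumption chosen for illustration; the
# 1,300 MWh and $75,000 figures are from the Analysis section.
HOURS_PER_YEAR = 8760
implied_rate = 75_000 / 1_300_000           # ~$0.058/kWh blended rate

crah_fan_draw_kw = 148                      # assumed draw of the units turned off
annual_mwh = crah_fan_draw_kw * HOURS_PER_YEAR / 1_000
annual_cost = annual_mwh * 1_000 * implied_rate

print(f"~{annual_mwh:,.0f} MWh/yr, ~${annual_cost:,.0f}/yr")
```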
Improve UPS Load Factor
- Take redundant UPS units offline, transferring their load to the online units to increase the load factor.
Typical cost of this package: $30–$70/kW of IT power. Typical payback: 1.9 years.

Chilled Water Plant
- Install and integrate a waterside economizer.
Typical cost of this package: $130–$360/kW of IT power. Typical payback: 2.3 years.

UPS Room Cooling and Genset Block Heater
- Minimize cooling by widening the temperature range.
- Minimize cooling and fan energy by running the CRAHs with VFDs.
- Minimize standby generator block heater energy use.
Typical cost of this package: $20–$40/kW of IT power. Typical simple payback: 2.5 years.

Full Lighting Retrofit
- Reposition light fixtures from above the racks to above the aisles.
- Reduce lighting levels.
- Install occupancy sensors to control fixtures; install multiple circuits for large areas.
Typical cost of this package: $2–$6/sf of data center. Typical payback: around 2.2 years.

Air Management Adjustment
- Seal all floor leaks, including those from floor-mounted electrical panels.
- Rearrange the perforated floor tiles, locating them only in cold aisles.
- Contain hot air to avoid mixing with cold air, as was done in one center.
- Utilize contained racks with exhaust chimneys, as was done in one of the centers.
- Seal spaces between and within racks.
- Raise the supply air temperature (SAT).
- Install variable frequency drives (VFDs) for the CRAH fans and control fan speed by air plenum pressure.
- Convert CRAH return air temperature control to rack inlet air temperature control.
- Raise the chilled water supply temperature, saving energy through better chiller efficiency.
Typical cost of this package: $80–$220/kW of IT power. Typical payback: around 2 years.

Figure 7. Data Center EEMs
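One reason the CRAH VFD measure in Figure 7 pays back quickly is the fan affinity law: fan power scales roughly with the cube of fan speed, so even a modest slowdown cuts fan energy sharply. A minimal illustration:

```python
# Fan affinity law sketch: fan power scales roughly with the cube of
# speed, which is why VFD retrofits on constant-speed CRAH fans save
# so much energy. Numbers are illustrative only.

def fan_power_fraction(speed_fraction):
    """Approximate fan power relative to full speed (affinity law)."""
    return speed_fraction ** 3

# Running fans at 70% speed draws about a third of nominal power:
print(f"{fan_power_fraction(0.70):.2%}")  # → 34.30%
```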
Figure 8. Power Usage Savings Based on Implementation of EEMs (potential power saving in kW for DC1, DC2, and DC3, by measure: taking UPS units offline to increase load factor; UPS room cooling and miscellaneous; waterside economizer; lighting improvements; and sealing, containment, and VFDs on CRAHs)

Lessons Learned

The main barrier to increasing the supply air temperature is the maximum IT equipment temperature limit specified by the vendors. Rack-level monitoring enables real-time temperature measurement at the server level, allowing EEMs to be implemented without concern about interrupting data center operations; it also provides real-time visualization and enables corrective action when needed. Optimizing the chilled water supply temperature set point can yield additional large energy savings. For DC1, LBNL recommended water-cooled chillers over air-cooled chillers.

Next Steps and Recommendations

To implement the remaining EEMs, LBNL recommends that an investment-grade verification be performed, for a favorable return on investment, by a design/build contractor experienced in data center efficiency improvements. Opportunities also exist in spaces beyond the data centers; these can be combined with the data center EEMs to make a more attractive case for energy savings
performance contracts. In addition, it is recommended that metering and monitoring systems be installed, along with a comprehensive dashboard presenting the environmental and energy efficiency data, including the end-use power breakdown across the IT components, the power chain, and the infrastructure. Figure 9 illustrates the suggested metering locations; multiple meters may be required in certain areas because of the electrical distribution layout.

The use of direct liquid cooling to the chip, together with operation at increased temperatures (the upper end of the ASHRAE TC 9.9 (2011) thermal guidelines), can lead to major reductions in energy use, because liquids carry heat far more effectively than air. This should be planned into future IT equipment procurement.

Figure 9. Electricity Flows in Data Centers (metering points "M" shown on the HVAC system; on lights and office space; and on the UPS, PDU, and computer rack chain fed at 480 V from the local distribution lines through the ATS, with backup diesel generators. UPS = uninterruptible power supply; PDU = power distribution unit)

The dashboard will assist data center operators and facility managers in making real-time changes and optimizing energy efficiency. A dashboard is a visual display of the most important information needed to achieve one or more objectives, consolidated and arranged on one or more screens so that the information relevant to each user can be observed at a glance; in other words, a dashboard provides quick access to actionable visual data. The recommended dashboard:

- Displays the most important performance indicators and measures being monitored; these are usually user defined, user friendly, and easy to understand. Displayed content includes various kinds of charts and measured or calculated values, mostly presented graphically.
- Provides information for all stakeholders.
- Supports interactivity: filtering, drilling down, and customizing screens to meet stakeholders' needs.
- Stores data and generates reports on various aspects of energy use, as needed or defined by the stakeholders.

LBNL also recommends that future purchases of IT equipment and cooling systems include a preference for water-cooled systems. If air-cooled systems are procured, at a minimum the IT equipment should be capable of operating at more than a 90°F supply air intake temperature (ASHRAE class A2). Regardless of cooling option, the servers should be powered at a high voltage (480 V is preferred) and equipped with variable-speed fans controlled by the server core temperature. For new data centers, the use of direct current in lieu of alternating current is recommended, since the power losses from multiple conversions and inversions are avoided.
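The ASHRAE class A2 criterion above reduces to a simple range check. The allowable ranges in this sketch are from ASHRAE's 2011 thermal guidelines as best recalled here; verify them against the current edition before using them in procurement language:

```python
# Range check for the ASHRAE class A2 criterion mentioned above. The
# allowable ranges below are as best recalled from ASHRAE's 2011
# thermal guidelines; verify against the current edition.
ALLOWABLE_F = {
    "A1": (59.0, 89.6),   # 15-32 °C
    "A2": (50.0, 95.0),   # 10-35 °C
    "A3": (41.0, 104.0),  # 5-40 °C
    "A4": (41.0, 113.0),  # 5-45 °C
}

def intake_ok(temp_f, ashrae_class="A2"):
    """True if the IT intake temperature is within the class's allowable range."""
    low, high = ALLOWABLE_F[ashrae_class]
    return low <= temp_f <= high

print(intake_ok(90.0))        # → True: 90°F is within the A2 allowable range
print(intake_ok(90.0, "A1"))  # → False: exceeds the A1 allowable maximum
```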
DOE/EE May 2014

Printed with a renewable-source ink on paper containing at least 50% wastepaper, including 10% post-consumer waste.
How to Build a Data Centre Cooling Budget Ian Cathcart Chatsworth Products Topics We ll Cover Availability objectives Space and Load planning Equipment and design options Using CFD to evaluate options
Unified Physical Infrastructure SM (UPI) Strategies for Smart Data Centers
Unified Physical Infrastructure SM (UPI) Strategies for Smart Data Centers Deploying a Vertical Exhaust System www.panduit.com WP-09 September 2009 Introduction Business management applications and rich
Energy Efficiency Best Practice Guide Data Centre and IT Facilities
2 Energy Efficiency Best Practice Guide Data Centre and IT Facilities Best Practice Guide Pumping Systems Contents Medium-sized data centres energy efficiency 3 1 Introduction 4 2 The business benefits
Unified Physical Infrastructure (UPI) Strategies for Thermal Management
Unified Physical Infrastructure (UPI) Strategies for Thermal Management The Importance of Air Sealing Grommets to Improving Smart www.panduit.com WP-04 August 2008 Introduction One of the core issues affecting
Data Center Industry Leaders Reach Agreement on Guiding Principles for Energy Efficiency Metrics
On January 13, 2010, 7x24 Exchange Chairman Robert Cassiliano and Vice President David Schirmacher met in Washington, DC with representatives from the EPA, the DOE and 7 leading industry organizations
CIBSE ASHRAE Group. Data Centre Energy Efficiency: Who, What, Why, When, Where & How
CIBSE ASHRAE Group Data Centre Energy Efficiency: Who, What, Why, When, Where & How Presenters Don Beaty PE, FASHRAE Founder, President & Managing Director DLB Associates Consulting Engineers Paul Finch
Data Center 2020: Delivering high density in the Data Center; efficiently and reliably
Data Center 2020: Delivering high density in the Data Center; efficiently and reliably March 2011 Powered by Data Center 2020: Delivering high density in the Data Center; efficiently and reliably Review:
How To Improve Energy Efficiency In A Data Center
Google s Green Data Centers: Network POP Case Study Table of Contents Introduction... 2 Best practices: Measuring. performance, optimizing air flow,. and turning up the thermostat... 2...Best Practice
Data Center Technology: Physical Infrastructure
Data Center Technology: Physical Infrastructure IT Trends Affecting New Technologies and Energy Efficiency Imperatives in the Data Center Hisham Elzahhar Regional Enterprise & System Manager, Schneider
Reducing Room-Level Bypass Airflow Creates Opportunities to Improve Cooling Capacity and Operating Costs
WHITE PAPER Reducing Room-Level Bypass Airflow Creates Opportunities to Improve Cooling Capacity and Operating Costs By Lars Strong, P.E., Upsite Technologies, Inc. 505.798.000 upsite.com Reducing Room-Level
Benefits of Cold Aisle Containment During Cooling Failure
Benefits of Cold Aisle Containment During Cooling Failure Introduction Data centers are mission-critical facilities that require constant operation because they are at the core of the customer-business
Data Center Infrastructure Management (DCIM) Utility Energy Efficiency Opportunities Discussion & Demos July 2, 2013
Data Center Infrastructure Management (DCIM) Utility Energy Efficiency Opportunities Discussion & Demos July 2, 2013 Ted Brown Sr. Energy Management Analyst Data Center / IT Energy Efficiency [email protected]
Best Practices. for the EU Code of Conduct on Data Centres. Version 1.0.0 First Release Release Public
Best Practices for the EU Code of Conduct on Data Centres Version 1.0.0 First Release Release Public 1 Introduction This document is a companion to the EU Code of Conduct on Data Centres v0.9. This document
Energy Efficiency and Availability Management in Consolidated Data Centers
Energy Efficiency and Availability Management in Consolidated Data Centers Abstract The Federal Data Center Consolidation Initiative (FDCCI) was driven by the recognition that growth in the number of Federal
Recommendations for Measuring and Reporting Overall Data Center Efficiency
Recommendations for Measuring and Reporting Overall Data Center Efficiency Version 2 Measuring PUE for Data Centers 17 May 2011 Table of Contents 1 Introduction... 1 1.1 Purpose Recommendations for Measuring
Choosing Close-Coupled IT Cooling Solutions
W H I T E P A P E R Choosing Close-Coupled IT Cooling Solutions Smart Strategies for Small to Mid-Size Data Centers Executive Summary As high-density IT equipment becomes the new normal, the amount of
Great Lakes Data Room Case Study
Great Lakes Data Room Case Study WeRackYourWorld.com Problem: During a warm summer period in 2008, Great Lakes experienced a network outage due to a switch failure in the network enclosure. After an equipment
Increasing Data Center Efficiency through Metering and Monitoring Power Usage
White Paper Intel Information Technology Computer Manufacturing Data Center Efficiency Increasing Data Center Efficiency through Metering and Monitoring Power Usage To increase data center energy efficiency
Data Centre Energy Efficiency Operating for Optimisation Robert M Pe / Sept. 20, 2012 National Energy Efficiency Conference Singapore
Data Centre Energy Efficiency Operating for Optimisation Robert M Pe / Sept. 20, 2012 National Energy Efficiency Conference Singapore Introduction Agenda Introduction Overview of Data Centres DC Operational
The Data Center Dilemma - A Case Study
Second Place: Industrial Facilities or Processes, Existing The TCS data center makes use of white cabinets and reflective surfaces as one method to reduce power for its lights and cooling. Data Center
Analysis of data centre cooling energy efficiency
Analysis of data centre cooling energy efficiency An analysis of the distribution of energy overheads in the data centre and the relationship between economiser hours and chiller efficiency Liam Newcombe
How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions
Intel Intelligent Power Management Intel How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Power and cooling savings through the use of Intel
IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT
DATA CENTER RESOURCES WHITE PAPER IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT BY: STEVE HAMBRUCH EXECUTIVE SUMMARY Data centers have experienced explosive growth in the last decade.
Data Center Energy Efficient Cooling Control Demonstration PIR 10 052. (formerly Federspiel Controls)
Data Center Energy Efficient Cooling Control Demonstration PIR 10 052 (formerly Federspiel Controls) Background 2009 PIER sponsored project resulted in the installation of intelligent supervisory control
Cooling Audit for Identifying Potential Cooling Problems in Data Centers
Cooling Audit for Identifying Potential Cooling Problems in Data Centers By Kevin Dunlap White Paper #40 Revision 2 Executive Summary The compaction of information technology equipment and simultaneous
Reducing Data Center Energy Consumption
Reducing Data Center Energy Consumption By John Judge, Member ASHRAE; Jack Pouchet, Anand Ekbote, and Sachin Dixit Rising data center energy consumption and increasing energy costs have combined to elevate
APC APPLICATION NOTE #92
#92 Best Practices for Designing Data Centers with the InfraStruXure InRow RC By John Niemann Abstract The InfraStruXure InRow RC is designed to provide cooling at the row and rack level of a data center
White Paper. Data Center Containment Cooling Strategies. Abstract WHITE PAPER EC9001. Geist Updated August 2010
White Paper Data Center Containment Cooling Strategies WHITE PAPER EC9001 Geist Updated August 2010 Abstract Deployment of high density IT equipment into data center infrastructure is now a common occurrence
Liquid Cooling Solutions for DATA CENTERS - R.M.IYENGAR BLUESTAR LIMITED.
Liquid Cooling Solutions for DATA CENTERS - R.M.IYENGAR BLUESTAR LIMITED. Presentation Goals & Outline Power Density Where we have been- where we are now - where we are going Limitations of Air Cooling
Environmental Data Center Management and Monitoring
2013 Raritan Inc. Table of Contents Introduction Page 3 Sensor Design Considerations Page 3 Temperature and Humidity Sensors Page 4 Airflow Sensor Page 6 Differential Air Pressure Sensor Page 6 Water Sensor
