ENERGY RECOVERY SYSTEMS FOR THE EFFICIENT COOLING OF DATA CENTERS USING ABSORPTION CHILLERS AND RENEWABLE ENERGY RESOURCES
Florea CHIRIAC, Victor CHIRIAC, Alexandru ŞERBAN
UNIVERSITY POLITEHNICA BUCHAREST, Romania

Rezumat: Studiul dezvoltă modelul analitic al unui sistem de recuperare a energiei pentru răcirea unui centru de date folosind resurse de energie regenerabile. Sistemul de răcire constă într-un răcitor cu absorbţie, acţionat de energia termică recuperată din componentele centrului de date şi, adiţional, din energia solară acumulată. Această lucrare include informaţii generale despre centrele de date şi o descriere detaliată a sistemului de recuperare de energie propus. Cuvinte cheie: centru de date, răcitor cu absorbţie, energie solară.

Abstract: The study develops the analytical model of an energy recovery system for the cooling of a data center using renewable energy resources. The cooling system consists of an absorption chiller driven by the thermal energy recovered from the data center components, supplemented by additional solar energy. The paper includes general information on data centers and a detailed description of the proposed energy recovery system. Keywords: data centers, absorption chiller, solar energy.

1. INTRODUCTION

Several studies have focused on the need to recover the thermal energy dissipated in data centers, given their large power consumption and the associated power losses. It is generally accepted across the industry that large, power-hungry data centers lack efficient energy recovery systems. However, most of the prior art [1-7] and associated studies have focused on the design optimization and analysis of these data centers, without targeting the recovery of the energy lost to the ambient. The present study fills this gap and proposes an efficient energy recovery system that uses the energy lost by the data center to cool the equipment and to provide an optimal microclimate.
When the residual energy losses from the data center are not sufficient to activate the cooling system, supplemental energy from alternative sources is suggested, mainly solar energy when available; geothermal energy and other alternative energy sources can likewise serve the same purpose. In principle, the heat dissipated to the ambient by the data center racks is used to vaporize a cooling agent, thereby activating an absorption LiBr-water refrigeration unit that chills the water which, in turn, cools the data center environment. When the thermal energy received from the data center servers is not sufficient to activate the refrigeration unit, a dual system is proposed in which the water needed by the absorption chiller is heated using solar energy or any other available regenerative heat source.

The current study provides a solution for the optimal design of the data center/server cooling system, together with the appropriate selection and functional analysis of the absorption chiller and the complementary heat exchangers required by the overall system. The efficiency of the system is defined in terms of the Coefficient of Performance (COP), the ratio between the energy recovered from the system and the total power dissipated by the system; COP values can reach 0.33 for a typical system.

1.1. Data Center Cooling - General Information

The cooling infrastructure is a significant part of a data center. The complex combination of chillers, compressors, and air handlers creates the optimal computing environment, ensuring the longevity of the servers installed within.

TERMOTEHNICA 1/2011
The EPA's oft-cited 2007 report predicted that data center energy consumption, if left unchecked, would reach 100 billion kWh by 2011, with a corresponding energy bill of $7.4 billion. This conclusion is not based strictly on Moore's law or on the need for greater bandwidth. In light of this news, the industry is turning a critical eye towards cooling, recognizing both the inefficiencies of current approaches and the improvements possible through new technologies. This information is designed to assist the data center professionals who must balance the growing demand for computing power with the pressure to reduce energy consumption.

Until recently, no standard measurement existed for data center efficiency. The Green Grid, a consortium promoting responsible energy use within critical facilities, has successfully introduced two new terms: Power Usage Effectiveness (PUE) and Data Center Infrastructure Efficiency (DCiE).

1.2. Power Usage Effectiveness (PUE)

PUE is derived by dividing the total incoming power by the IT equipment load. The total incoming power includes, in addition to the IT load, the data center's electrical and mechanical support systems such as chillers, air conditioners, fans, and power delivery equipment. Lower results are better, as they indicate that more of the incoming power is consumed by IT equipment instead of the intermediary support equipment. Cooling can be a major player in the PUE measurement. Consider the following diagram [2][3], where the combination of the chiller, humidifier, and CRAC consumes 45% of the total energy coming into the facility.

Fig. 1. Where does the energy go? (Source: The Green Grid)

The Uptime Institute approximates an industry average PUE of 2.5. There are no tiers or rankings associated with the values, but PUE allows facilities to benchmark, measure, and improve their efficiency over time.
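The arithmetic behind the two Green Grid metrics can be sketched in a few lines. The 1000 kW facility total below is a hypothetical figure; only the ~30% IT share follows the Fig. 1 breakdown.

```python
# Sketch of the PUE / DCiE arithmetic. The 1000 kW facility total is
# hypothetical; the ~30% IT share follows the Fig. 1 breakdown.
def pue(total_facility_kw, it_load_kw):
    """Power Usage Effectiveness: total incoming power / IT equipment load."""
    return total_facility_kw / it_load_kw

def dcie_percent(total_facility_kw, it_load_kw):
    """Data Center Infrastructure Efficiency: the inverse of PUE, as a percent."""
    return 100.0 * it_load_kw / total_facility_kw

total_kw, it_kw = 1000.0, 300.0
print(round(pue(total_kw, it_kw), 2))           # 3.33, worse than the 2.5 average
print(round(dcie_percent(total_kw, it_kw), 1))  # 30.0 (%)
```

Note that the two metrics always multiply to 100, which is a quick consistency check when both are reported.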
Companies with large-scale data center operations, like Google and Microsoft, have published their PUE. In 2008, Google had an average PUE of 1.21 across their six company data centers, and Microsoft's new Chicago facility has likewise calculated an average annual PUE.

1.3. Data Center Infrastructure Efficiency (DCiE)

DCiE is the inverse of PUE: (Total IT Power / Total Facility Power) x 100%. DCiE presents a quick snapshot of the amount of energy consumed by the IT equipment. To examine the relationship between PUE and DCiE: "A DCiE value of 33% (equivalent to a PUE of 3.0) suggests that the IT equipment consumes 33% of the power in the data center."

ASHRAE released their "2008 ASHRAE Environmental Guidelines for Datacom Equipment", which expanded the recommended environmental envelope as follows:

Table 1. ASHRAE recommended environmental envelope

Version   Temperature                       Humidity
2004      20 C (68 F) to 25 C (77 F)        40% RH to 55% RH
2008      18 C (64.4 F) to 27 C (80.6 F)    5.5 C DP (41.9 F) to 60% RH & 15 C DP (59 F)

The cooling systems are categorized into air-cooled and liquid-cooled systems. The definitions of these categories are:

Air cooling: conditioned air is supplied to the inlets of the rack/cabinet for convection cooling of the heat rejected by the components of the electronic equipment within the rack. Within the rack, the transport of heat from the actual source component (e.g., CPU) can be either liquid or air based, but the heat rejection medium from the rack to the terminal cooling device outside of the rack is air.

Liquid cooling: conditioned liquid (e.g., water, refrigerant, etc., usually above the dew point) is channeled to the actual heat-producing electronic components and used to transport heat from those components; the heat is then rejected via a heat exchanger (air-to-liquid or liquid-to-liquid) or
extended to the cooling terminal device outside of the rack.

1.4. Air Cooling Overview

Air cooling is the most common means of cooling electronic equipment within datacom rooms. Chilled air is delivered to the air intakes of the electronic equipment through underfloor, overhead, or local air distribution systems. Current industry guidelines recommend that electronic equipment be deployed in a hot-aisle/cold-aisle configuration, as illustrated in Fig. 2 (ASHRAE TC). On each side of the cold aisle, electronic equipment is placed with its intakes (fronts) facing the cold aisle. The chilled air is drawn into the intake side of the electronic equipment and exhausted from the rear of the equipment into the hot aisle. In an underfloor distribution system, chilled air is distributed via a raised floor plenum and introduced into the room through perforated floor tiles (Fig. 3) and other openings in the raised floor (i.e., cable cutouts).

Fig. 2. Hot-aisle/cold-aisle cooling principle.

Fig. 3. Raised floor implementation using baffles to limit hot-aisle/cold-aisle mixing.
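The chilled-air supply in such a layout can be estimated with a simple sensible-heat balance, Q = m_dot x cp x dT. The 26 kW rack load matches the design example later in the paper; the 13 K temperature rise across the equipment is an assumption for illustration only, not a value from the paper.

```python
# Illustrative airflow sizing for a hot-aisle/cold-aisle layout:
# Q = m_dot * cp * (T_hot - T_cold). The 26 kW rack load follows the
# paper's design example; the 13 K air temperature rise is assumed.
CP_AIR = 1.005      # specific heat of air, kJ/(kg*K)
RHO_AIR = 1.2       # air density near 20 C, kg/m^3

def airflow_m3_s(q_kw, delta_t_k):
    """Volumetric airflow [m^3/s] needed to absorb q_kw across delta_t_k."""
    mass_flow = q_kw / (CP_AIR * delta_t_k)   # kg/s
    return mass_flow / RHO_AIR

print(round(airflow_m3_s(26.0, 13.0), 2))     # ~1.66 m^3/s per rack
```

A smaller allowed temperature rise directly increases the required airflow, which is one reason limiting hot-aisle/cold-aisle mixing (Fig. 3) matters for efficiency.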
1.5. Liquid-Cooled Computer Equipment

Most computers are cooled with forced air. With increased microprocessor power densities and rack heat loads, some equipment requires liquid cooling to keep the equipment within the environmental specifications required by manufacturers. The liquids considered for cooling electronic equipment are water, Fluorinert(TM), or refrigerant. Manufacturers normally supply the cooling system as part of the computer equipment, with the liquid loop internal to the equipment. The transfer of heat from the liquid-cooled computer system to the environment housing the racks takes place through a liquid-to-water or water/glycol heat exchanger.

Fig. 4. Internal liquid cooling loop restricted within rack extent.

Figures 5 and 6 depict the two possible liquid-cooled systems. Figure 4 shows a liquid loop internal to the rack, where the exchange of heat with the room occurs through a liquid-to-air heat exchanger. In this case the rack appears to the client as an air-cooled rack and is classified as an air-cooled system; it is included here to show the evolution toward liquid-cooled systems. Figure 5 depicts a similar liquid loop internal to the rack, used to cool the electronics within it, but in this case the heat exchange is with a liquid-to-chilled-water heat exchanger. Typically, the liquid circulating within the rack is maintained above the dew point to eliminate any condensation concerns. Figure 6 depicts a design very similar to Figure 5, but with some of the primary liquid loop components housed outside the rack to permit more space within the rack for electronic components.

Fig. 5. Internal liquid cooling loop extended to liquid-cooled external modular cooling unit.
Fig. 6. Internal liquid cooling loop within rack extents, liquid cooling loop external to racks.

1.6. Liquid Coolants for Computer Equipment

The liquid loops for cooling the electronics shown in Figures 4, 5, and 6 [1] are typically of three types: a) Fluorinerts(TM) (fluorocarbon liquids); b) water (or a water/glycol mix); c) refrigerants (pumped and vapor compression).

2. PROPOSED ENERGY RECOVERY SYSTEM

As seen in Figure 1, only 30% of the total data center power is used to drive the server activity, while the remainder is lost as heat to the surroundings. Therefore, in this paper we propose to increase the power efficiency of the system (DCiE) by recycling the lost energy and using it primarily to cool the system equipment. The main energy recycling occurs at the IT level, by recovering the power dissipated by the server racks and using it to drive an absorption refrigeration plant that prepares the water necessary for data center cooling. The energy in the data center ambient air can be recycled as well, but this occurs at a lower temperature (and exergy level), so it cannot provide the water heated above 75 C needed to activate the absorption chiller. Figure 7 shows the schematic of the IT energy recycling system and the hot water preparation process for the absorption chiller, which in turn prepares the chilled water needed to cool the environment (climatization). The system components are specified below.

Fig. 7. Heat recovery system cooling the servers, with the absorption LiBr/H2O chiller driven by this energy to prepare cold water for the conditioning of the data center.
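The claimed efficiency gain can be illustrated with the DCiE definition: if the recovered heat drives the chiller, it displaces support power the facility would otherwise draw from the grid, so the same IT load is served by a smaller facility total. The numbers below only mirror the ~33% to ~60% figures quoted later in the paper; the 1000 kW facility and the displaced-power split are hypothetical, not the paper's exact accounting.

```python
# Illustrative DCiE improvement from recycling server heat. The facility
# size and displaced support power are hypothetical; only the ~33% -> ~60%
# endpoints mirror the values claimed in the paper.
def dcie_percent(it_kw, facility_kw):
    """DCiE = IT power / total facility power, as a percentage."""
    return 100.0 * it_kw / facility_kw

it = 330.0             # kW of IT load, ~33% of a 1000 kW facility
facility = 1000.0      # kW drawn before recovery (hypothetical)
print(round(dcie_percent(it, facility), 1))               # 33.0 (%)

displaced = 450.0      # kW of support power displaced by the recovery system
print(round(dcie_percent(it, facility - displaced), 1))   # 60.0 (%)
```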
The system operates as follows: heat is taken from the data center servers by vaporizing a refrigerant (more details in the next section) and is transferred in a heat exchanger, where the condensing refrigerant heats water above 75 C. The warm water enters the absorption chiller generator, and the chiller prepares the cold water that enters the Air Handling Unit (AHU), which provides air conditioning for the data center environment. When the heat coming from the servers is not enough for the process, additional thermal energy from the solar collectors (SC) is employed; this energy is stored in tank R and then distributed to the absorption chiller generator to supplement the energy needed to activate the chiller. The system has a cooling tower, which rejects the heat dissipated by the chiller absorber and condenser units. This heat can be reused for other practical purposes, such as AHU air heating or the preparation of utility/sanitary hot water.

2.1. Energy Balance for the Proposed Data Center Cooling Energy Recovery System

The component thermal powers are defined as follows:
- QS = server thermal power, kW;
- QSC = solar thermal power, kW;
- QG = thermal power of generator G (of chiller MF), kW;
- QE = cooling power needed to cool the air, kW.

The chiller MF cooling power is thus:

QE = COP x QG,

where COP is the chiller Coefficient of Performance, with a value of ~ 0.6.
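The balance can be checked end-to-end with a short numeric sketch. The 52 kW server load, the COP of 0.6, the solar share of half the generator duty, and the 80/85 C generator water temperatures are all taken from the design calculation in the paper; the symbol QSC for the solar contribution follows the definitions above.

```python
# Energy balance for the recovery system (QE = COP*QG, QTR = QG + QE),
# plus a sensible-heat check of the hot-water loop feeding generator G.
# Numbers follow the paper's design case; QSC (solar) is taken as 0.5*QG.
COP = 0.6
CP_WATER = 4.186                  # specific heat of water, kJ/(kg*K)

def energy_balance(qs_kw):
    qg = qs_kw                    # generator duty ~ server dissipation
    qe = COP * qg                 # chiller refrigeration power
    qsc = 0.5 * qg                # assumed solar backup share
    qtr = qg + qe                 # heat rejected in the cooling tower
    return {"QG": qg, "QE": qe, "QSC": qsc, "QTR": qtr, "QAHU": qe}

def hot_water_flow(q_kw, t_in_c, t_out_c):
    """Water mass flow [kg/s] carrying q_kw across the generator loop."""
    return q_kw / (CP_WATER * (t_out_c - t_in_c))

balance = energy_balance(52.0)
print({k: round(v, 1) for k, v in balance.items()})
# {'QG': 52.0, 'QE': 31.2, 'QSC': 26.0, 'QTR': 83.2, 'QAHU': 31.2}
print(round(hot_water_flow(52.0, 80.0, 85.0), 2))   # 2.48 kg/s (~2.5 in the text)
```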
QTR, the heat rejected in the cooling tower, is:

QTR = QC + QA = QG + QE,

where:
- QC = the thermal power of the condenser in the chiller unit;
- QA = the thermal power of the absorber in the chiller unit.

The thermal power of the air cooling coil in the Air Handling Unit is QAHU = QE.

For a specific data center configuration with a known thermal power dissipated by the servers (QS = 52 kW), the thermal powers of the other components are:
- the thermal power of generator G: QG ≈ QS = 52 kW;
- the refrigeration power: QE = 0.6 x 52 = 31.2 kW;
- the thermal power of the solar collector system, assumed to be about half of the generator duty: QSC = 0.5 x QG = 26 kW;
- the thermal power of the cooling tower: QTR = QG + QE = 83.2 kW;
- the thermal power in the AHU coil: QAHU = 31.2 kW.

Referring to Figure 1, with these thermal powers the data center infrastructure efficiency increases from ~ 33% to ~ 60%.

3. DETAILS ON SERVER HEAT RECOVERY SYSTEMS

The cooling refrigerant is chosen such that it does not interact chemically with the electronic chips and packages to be cooled. The refrigerant absorbs the heat from the components and vaporizes, then heats up the water as it condenses in the heat rejection process. Two solutions are proposed: 1) the refrigerant is recirculated over the surface of the server plates as a film, using a pump (Fig. 8); and 2) the refrigerant circulates over the servers naturally, using a thermosyphon (Fig. 9). In both scenarios the racks are mounted vertically, unlike the typical horizontal position used in air-cooled systems.

The first solution, with a pump (Fig. 8), includes: data center DC, server plates S, liquid pump P, liquid distributor D, and closed server room SR. The racks are placed in a room separated from the rest of the data center equipment. The liquid refrigerant collected below this encapsulated room is recirculated through distributor D via pump P over the server plates in the form of a film.
The refrigerant partially vaporizes by absorbing the heat from the servers; the vapors rise towards condenser C, and the remaining liquid is collected at the bottom of the encapsulated room. The vapors condense on the surface of the water-carrying tubes; the water heats up and is then distributed to the generator of the absorption chiller system. The water is recirculated via a 3-way throttle device until it reaches the temperature required for the solution boiling in the generator. The liquid refrigerant from condenser C returns to the system through distributor D. The power dissipated by the solution pump is negligible compared to the energy consumption inside the data center.

Fig. 8. First heat recovery solution.

Fig. 9. Second method for cooling servers.

3.1. Design Example for Option 1 (Fig. 8)

A smaller data center is chosen, having 2 racks, each with 60 server blades. Each server includes 2 CPU processors, each dissipating 105 W at about 90 C. The overall thermal power dissipated by each rack is about 26 kW, so both racks dissipate about 52 kW. Each server blade measures 0.3 x 0.7 m and is about 1 mm thick. The racks are placed vertically, with an overall height of about 1.5 m.

The thermal power of the condenser is 52 kW. R123 is the chosen refrigerant, having a lower saturation pressure at the phase-change temperatures than other refrigerants. The vaporizing and condensing temperatures are To ≈ Tc = 90 C, with a saturation pressure Ps = 6.4 bar. With a latent heat of vaporization of about 140 kJ/kg, the refrigerant mass flow rate needed to prepare the hot water is ṁ = 0.37 kg/s. We choose chiller generator inlet/outlet temperatures of 80/85 C, as seen also in Fig. 8; the hot water mass flow rate in this case is ṁw = 2.5 kg/s. The refrigerant volumetric flow rate at 90 C is V̇ = 0.3 x 10^-3 m³/s. We double the
volumetric flow rate for the refrigerant recirculated by the pump, part of which will vaporize while the rest returns as liquid. The refrigerant pump is therefore chosen for a volumetric flow rate of V̇t = 0.6 x 10^-3 m³/s at a pressure of 10 MPa. This refrigerant flow rate provides satisfactory 0.5 mm coolant films on the server blades.

Fig. 9 presents the second solution, with the refrigerant circulated via a thermosyphon; the component details are provided in the legend. The refrigerant vapors rise along the server blades due to buoyancy, then condense on the surface of condenser C. There are two ways to achieve the upward motion of the refrigerant: one uses a custom-built heat tube surrounding each server blade, and the other shapes the server blades such that the condensate can follow a separate return path.

The first solution (with recirculation pump) thus has several advantages over the second, such as design simplicity, the low power consumption required to drive the pump, and system reliability. The system will be built and tested in order to confirm these hypotheses.

4. CONCLUSIONS

The paper presents a novel solution for recovering the thermal energy dissipated by the data center server blades, using it to prepare the hot water that activates a LiBr-H2O absorption chiller, which in turn prepares the cold water necessary to cool the data center environment. When the energy dissipated by the data center is not enough to cover the input needed by the absorption cooling process, additional energy is provided by a solar system or another renewable energy system. The heat dissipated by the server blades is absorbed by a liquid refrigerant, which vaporizes and then condenses, releasing heat to warm the water needed by the absorption chiller. The refrigerant is circulated via a pump, a solution superior to the thermosyphon.
The pump circulates twice the refrigerant flow rate, creating a 0.5 mm film on the surface of the blades, part of which participates in the vaporizing process. The proposed energy recovery system leads to a COP of ~ 0.6, double the 0.33 value of the system without recovery. The current study is purely theoretical, enabling a better understanding of the pros and cons of building such a recovery system to improve efficiency; the next step is to build an experimental setup to validate the proposed solution.

NOMENCLATURE

IT - Information Technology
CPU - Central Processing Unit
CRAC - Computer Room Air Conditioning
PCB - Printed Circuit Board
PDU - Power Distribution Unit
RAM - Random Access Memory
UPS - Uninterruptible Power Supply

REFERENCES

[1] ASHRAE Datacom Series, 2nd edition: Thermal Guidelines for Data Processing Environments; Power Trends and Cooling Applications; Design Considerations for Datacom Equipment Centers.
[2] Cappuccio, D. (2008). Creating Energy-Efficient, Low-Cost, High-Performance Data Centers. Gartner Data Center Conference, p. 4, Las Vegas.
[3] EPA (2007, August 2). EPA Report to Congress on Server and Data Center Energy Efficiency.
[4] McGuckin, P. (2008). Taming the Data Center Energy Beast. Gartner Data Center Conference, p. 5, Las Vegas.
[5] Sullivan, R. (2002). Alternating Cold and Hot Aisles Provides More Reliable Cooling for Server Farms.
[6] The Green Grid (2008, October 21). Seven Strategies To Improve Data Center Cooling Efficiency.
[7] The Green Grid (2009). The Green Grid: Home.
[8] Herold, K.E., Radermacher, R., Klein, S.A., Absorption Chillers and Heat Pumps, CRC Press.
[9] Chiriac, F., Chiriac, V. (2010). An Overview and Comparison of Various Refrigeration Methods for Microelectronics Cooling. NATO Conference, Bucharest, 2010.
[10] Chiriac, F., Gavriliuc, R., Dumitrescu, R.,
Hybrid Ammonia-Water Absorption System of Small and Medium Capacity, Proceedings of ASME IMECE '03, Washington, D.C.
[11] Chiriac, F., Ilie, A., Dumitrescu, R. (2003). Ammonia Condensation Heat Transfer in Air-Cooled Mesochannel Heat Exchangers, Proceedings of ASME IMECE, Washington, D.C.
[12] Minea, V., Chiriac, F. (2006). Hybrid Absorption Heat Pump with Ammonia/Water Mixture - Some Design Guidelines and District Heating Application, International Journal of Refrigeration, 29, Elsevier.
UNIT 2 REFRIGERATION CYCLE
UNIT 2 REFRIGERATION CYCLE Refrigeration Cycle Structure 2. Introduction Objectives 2.2 Vapour Compression Cycle 2.2. Simple Vapour Compression Refrigeration Cycle 2.2.2 Theoretical Vapour Compression
Case Study: Innovative Energy Efficiency Approaches in NOAA s Environmental Security Computing Center in Fairmont, West Virginia
Case Study: Innovative Energy Efficiency Approaches in NOAA s Environmental Security Computing Center in Fairmont, West Virginia Prepared for the U.S. Department of Energy s Federal Energy Management Program
DATA CENTER COOLING INNOVATIVE COOLING TECHNOLOGIES FOR YOUR DATA CENTER
DATA CENTER COOLING INNOVATIVE COOLING TECHNOLOGIES FOR YOUR DATA CENTER DATA CENTERS 2009 IT Emissions = Aviation Industry Emissions Nations Largest Commercial Consumers of Electric Power Greenpeace estimates
Recommendations for Measuring and Reporting Overall Data Center Efficiency
Recommendations for Measuring and Reporting Overall Data Center Efficiency Version 1 Measuring PUE at Dedicated Data Centers 15 July 2010 Table of Contents 1 Introduction... 1 1.1 Purpose Recommendations
7 Best Practices for Increasing Efficiency, Availability and Capacity. XXXX XXXXXXXX Liebert North America
7 Best Practices for Increasing Efficiency, Availability and Capacity XXXX XXXXXXXX Liebert North America Emerson Network Power: The global leader in enabling Business-Critical Continuity Automatic Transfer
Recommendations for Measuring and Reporting Overall Data Center Efficiency
Recommendations for Measuring and Reporting Overall Data Center Efficiency Version 2 Measuring PUE for Data Centers 17 May 2011 Table of Contents 1 Introduction... 1 1.1 Purpose Recommendations for Measuring
Data Center Rack Level Cooling Utilizing Water-Cooled, Passive Rear Door Heat Exchangers (RDHx) as a Cost Effective Alternative to CRAH Air Cooling
Data Center Rack Level Cooling Utilizing Water-Cooled, Passive Rear Door Heat Exchangers (RDHx) as a Cost Effective Alternative to CRAH Air Cooling Joshua Grimshaw Director of Engineering, Nova Corporation
Guide to Minimizing Compressor-based Cooling in Data Centers
Guide to Minimizing Compressor-based Cooling in Data Centers Prepared for the U.S. Department of Energy Federal Energy Management Program By: Lawrence Berkeley National Laboratory Author: William Tschudi
- White Paper - Data Centre Cooling. Best Practice
- White Paper - Data Centre Cooling Best Practice Release 2, April 2008 Contents INTRODUCTION... 3 1. AIR FLOW LEAKAGE... 3 2. PERFORATED TILES: NUMBER AND OPENING FACTOR... 4 3. PERFORATED TILES: WITH
Defining Quality. Building Comfort. Precision. Air Conditioning
Defining Quality. Building Comfort. Precision Air Conditioning Today s technology rooms require precise, stable environments in order for sensitive electronics to operate optimally. Standard comfort air
Data Center Industry Leaders Reach Agreement on Guiding Principles for Energy Efficiency Metrics
On January 13, 2010, 7x24 Exchange Chairman Robert Cassiliano and Vice President David Schirmacher met in Washington, DC with representatives from the EPA, the DOE and 7 leading industry organizations
AisleLok Modular Containment vs. Legacy Containment: A Comparative CFD Study of IT Inlet Temperatures and Fan Energy Savings
WH I TE PAPE R AisleLok Modular Containment vs. : A Comparative CFD Study of IT Inlet Temperatures and Fan Energy Savings By Bruce Long, Upsite Technologies, Inc. Lars Strong, P.E., Upsite Technologies,
Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001
Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001 Presented by: Liberty Engineering, LLP 1609 Connecticut Avenue
Benefits of Cold Aisle Containment During Cooling Failure
Benefits of Cold Aisle Containment During Cooling Failure Introduction Data centers are mission-critical facilities that require constant operation because they are at the core of the customer-business
The Different Types of Air Conditioning Equipment for IT Environments
The Different Types of Air Conditioning Equipment for IT Environments By Tony Evans White Paper #59 Executive Summary Cooling equipment for an IT environment can be implemented in 10 basic configurations.
Alcatel-Lucent Modular Cooling Solution
T E C H N O L O G Y W H I T E P A P E R Alcatel-Lucent Modular Cooling Solution Redundancy test results for pumped, two-phase modular cooling system Current heat exchange methods for cooling data centers
Element D Services Heating, Ventilating, and Air Conditioning
PART 1 - GENERAL 1.01 OVERVIEW A. This section supplements Design Guideline Element D3041 on air handling distribution with specific criteria for projects involving design of a Data Center spaces B. Refer
Direct Fresh Air Free Cooling of Data Centres
White Paper Introduction There have been many different cooling systems deployed in Data Centres in the past to maintain an acceptable environment for the equipment and for Data Centre operatives. The
International Telecommunication Union SERIES L: CONSTRUCTION, INSTALLATION AND PROTECTION OF TELECOMMUNICATION CABLES IN PUBLIC NETWORKS
International Telecommunication Union ITU-T TELECOMMUNICATION STANDARDIZATION SECTOR OF ITU Technical Paper (13 December 2013) SERIES L: CONSTRUCTION, INSTALLATION AND PROTECTION OF TELECOMMUNICATION CABLES
White Paper. Data Center Containment Cooling Strategies. Abstract WHITE PAPER EC9001. Geist Updated August 2010
White Paper Data Center Containment Cooling Strategies WHITE PAPER EC9001 Geist Updated August 2010 Abstract Deployment of high density IT equipment into data center infrastructure is now a common occurrence
APC APPLICATION NOTE #112
#112 Best Practices for Deploying the InfraStruXure InRow SC By David Roden Abstract The InfraStruXure InRow SC (ACSC100 and ACSC101) is a self-contained air conditioner for server rooms and wiring closets.
Benefits of. Air Flow Management. Data Center
Benefits of Passive Air Flow Management in the Data Center Learning Objectives At the end of this program, participants will be able to: Readily identify if opportunities i where networking equipment
DESIGN ASPECT OF ENERGY EFFICIENT DATA CENTER
DESIGN ASPECT OF ENERGY EFFICIENT DATA CENTER Omendra K. Govind 1, Annu Govind 2 1 Advanced Level Telecom Training Centre (ALTTC), BSNL, Ghaziabad,(India) 2 Research Scholar, Department of Electrical Engineering,
Data Center Environments
Data Center Environments ASHRAE s Evolving Thermal Guidelines By Robin A. Steinbrecher, Member ASHRAE; and Roger Schmidt, Ph.D., Member ASHRAE Over the last decade, data centers housing large numbers of
Reducing the Annual Cost of a Telecommunications Data Center
Applied Math Modeling White Paper Reducing the Annual Cost of a Telecommunications Data Center By Paul Bemis and Liz Marshall, Applied Math Modeling Inc., Concord, NH March, 2011 Introduction The facilities
Energy Efficiency and Green Data Centers. Overview of Recommendations ITU-T L.1300 and ITU-T L.1310
Energy Efficiency and Green Data Centers Overview of Recommendations ITU-T L.1300 and ITU-T L.1310 Paolo Gemma Chairman of Working Party 3 of ITU-T Study Group 5 International Telecommunication Union Agenda
Cooling Audit for Identifying Potential Cooling Problems in Data Centers
Cooling Audit for Identifying Potential Cooling Problems in Data Centers By Kevin Dunlap White Paper #40 Revision 2 Executive Summary The compaction of information technology equipment and simultaneous
Energy Efficiency Best Practice Guide Data Centre and IT Facilities
2 Energy Efficiency Best Practice Guide Data Centre and IT Facilities Best Practice Guide Pumping Systems Contents Medium-sized data centres energy efficiency 3 1 Introduction 4 2 The business benefits
How to Meet 24 by Forever Cooling Demands of your Data Center
W h i t e P a p e r How to Meet 24 by Forever Cooling Demands of your Data Center Three critical aspects that are important to the operation of computer facilities are matching IT expectations with the
Typical Air Cooled Data Centre Compared to Iceotope s Liquid Cooled Solution
Typical Air Cooled Data Centre Compared to Iceotope s Liquid Cooled Solution White Paper July 2013 Yong Quiang Chi University of Leeds Contents Background of the Work... 3 The Design of the Iceotope Liquid
Energy Performance Optimization of Server Room HVAC System
International Journal of Thermal Technologies E-ISSN 2277 4114 2 INPRESSCO, All Rights Reserved Available at http://inpressco.com/category/ijtt/ Research Article Manoj Jadhav * and Pramod Chaudhari Department
Case Study: Opportunities to Improve Energy Efficiency in Three Federal Data Centers
Case Study: Opportunities to Improve Energy Efficiency in Three Federal Data Centers Prepared for the U.S. Department of Energy s Federal Energy Management Program Prepared By Lawrence Berkeley National
Cooling Capacity Factor (CCF) Reveals Stranded Capacity and Data Center Cost Savings
WHITE PAPER Cooling Capacity Factor (CCF) Reveals Stranded Capacity and Data Center Cost Savings By Lars Strong, P.E., Upsite Technologies, Inc. Kenneth G. Brill, Upsite Technologies, Inc. 505.798.0200
Office of the Government Chief Information Officer. Green Data Centre Practices
Office of the Government Chief Information Officer Green Data Centre Practices Version : 2.0 April 2013 The Government of the Hong Kong Special Administrative Region The contents of this document remain
Data Center Power Consumption
Data Center Power Consumption A new look at a growing problem Fact - Data center power density up 10x in the last 10 years 2.1 kw/rack (1992); 14 kw/rack (2007) Racks are not fully populated due to power/cooling
IT White Paper MANAGING EXTREME HEAT: COOLING STRATEGIES FOR HIGH-DENSITY SYSTEMS
IT White Paper MANAGING EXTREME HEAT: COOLING STRATEGIES FOR HIGH-DENSITY SYSTEMS SUMMARY As computer manufacturers pack more and more processing power into smaller packages, the challenge of data center
Energy Efficiency in New and Existing Data Centers-Where the Opportunities May Lie
Energy Efficiency in New and Existing Data Centers-Where the Opportunities May Lie Mukesh Khattar, Energy Director, Oracle National Energy Efficiency Technology Summit, Sep. 26, 2012, Portland, OR Agenda
Green Data Centers development Best Practices
Draft Recommendation ITU-T L.DC Green Data Centers development Best Practices Summary This recommendation is a contribution for the reduction of negative climate impact of Data Centers.
Unified Physical Infrastructure (UPI) Strategies for Thermal Management
Unified Physical Infrastructure (UPI) Strategies for Thermal Management The Importance of Air Sealing Grommets to Improving Smart www.panduit.com WP-04 August 2008 Introduction One of the core issues affecting
BCA-IDA Green Mark for Existing Data Centres Version EDC/1.0
BCA-IDA Green Mark for Existing Data Centres Version EDC/1.0 To achieve GREEN MARK Award Pre-requisite Requirement All relevant pre-requisite requirements for the specific Green Mark Rating are to be complied
DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no significant differences
DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no significant differences November 2011 Powered by DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no
Recommendation For Incorporating Data Center Specific Sustainability Best Practices into EO13514 Implementation
Recommendation For Incorporating Data Center Specific Sustainability Best Practices into EO13514 Implementation Version 1.0 - November, 2011 Incorporating Data Center-Specific Sustainability Measures into
How High Temperature Data Centers and Intel Technologies Decrease Operating Costs
Intel Intelligent Management How High Temperature Data Centers and Intel Technologies Decrease Operating Costs and cooling savings through the use of Intel s Platforms and Intelligent Management features
News in Data Center Cooling
News in Data Center Cooling Wednesday, 8th May 2013, 16:00h Benjamin Petschke, Director Export - Products Stulz GmbH News in Data Center Cooling Almost any News in Data Center Cooling is about increase
Introduction to Datacenters & the Cloud
Introduction to Datacenters & the Cloud Datacenter Design Alex M. Hurd Clarkson Open Source Institute March 24, 2015 Alex M. Hurd (COSI) Introduction to Datacenters & the Cloud March 24, 2015 1 / 14 Overview
A Comparative Study of Various High Density Data Center Cooling Technologies. A Thesis Presented. Kwok Wu. The Graduate School
A Comparative Study of Various High Density Data Center Cooling Technologies A Thesis Presented by Kwok Wu to The Graduate School in Partial Fulfillment of the Requirements for the Degree of Master of
IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT
DATA CENTER RESOURCES WHITE PAPER IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT BY: STEVE HAMBRUCH EXECUTIVE SUMMARY Data centers have experienced explosive growth in the last decade.
European Code of Conduct for Data Centre. Presentation of the Awards 2016
European Code of Conduct for Data Centre Presentation of the Awards 2016 Paolo Bertoldi European Commission DG JRC Institute for Energy and Transport 1 The Code of Conduct for Data Centre Results European
HPC TCO: Cooling and Computer Room Efficiency
HPC TCO: Cooling and Computer Room Efficiency 1 Route Plan Motivation (Why do we care?) HPC Building Blocks: Compuer Hardware (What s inside my dataroom? What needs to be cooled?) HPC Building Blocks:
Energy Efficiency in the Data Center
Energy Efficiency in the Data Center Maria Sansigre and Jaume Salom Energy Efficieny Area Thermal Energy and Building Performance Group Technical Report: IREC-TR-00001 Title: Energy Efficiency in the Data
NetApp Data Center Design
NetApp Data Center Design Presentation for SVLG Data Center Efficiency Summit November 5, 2014 Mark Skiff, PE, CEM, CFM Director, IT Data Center Solutions Data Center Design Evolution Capacity Build Cost
Green Building Handbook for South Africa Chapter: Heating, Ventilation and Cooling Luke Osburn CSIR Built Environment
Green Building Handbook for South Africa Chapter: Heating, Ventilation and Cooling Luke Osburn CSIR Built Environment The heating, ventilation and cooling loads of typical commercial office space can range
Optimization of energy consumption in HPC centers: Energy Efficiency Project of Castile and Leon Supercomputing Center - FCSCL
European Association for the Development of Renewable Energies, Environment and Power Quality (EA4EPQ) International Conference on Renewable Energies and Power Quality (ICREPQ 12) Santiago de Compostela
