Energy Recovery Systems for the Efficient Cooling of Data Centers using Absorption Chillers and Renewable Energy Resources
Energy Recovery Systems for the Efficient Cooling of Data Centers using Absorption Chillers and Renewable Energy Resources

ALEXANDRU SERBAN, VICTOR CHIRIAC, FLOREA CHIRIAC, GABRIEL NASTASE
Building Services Department, Transylvania University of Brasov, Turnului Street, No. 5, Brasov, ROMANIA

Abstract: - The study develops the analytical model of an energy recovery system for cooling a data center using renewable energy resources. The cooling system consists of an absorption chiller driven by the thermal energy recovered from the data center components, supplemented by solar energy. The paper includes general information on data centers and a detailed description of the proposed energy recovery system. Additionally, a novel design is developed using the absorption chiller and alternative renewable energy, including solar energy. The conclusions highlight the efficiency of the system in comparison to classical solutions.

Key-Words: - data centers, solar energy, absorption chiller, air conditioning

1 Nomenclature
IT - Information Technology
CPU - Central Processing Unit
CRAC - Computer Room Air Conditioning
PCB - Printed Circuit Board
PDU - Power Distribution Unit
RAM - Random Access Memory
UPS - Uninterruptible Power Supply

2 Introduction
Several studies have focused on the need to recover the thermal energy dissipated in data centers, given their large power consumption and associated power losses. It is generally accepted across the industry that large, power-hungry data centers lack efficient energy recovery systems. However, most of the prior art [1-7] and associated studies have focused on the design optimization and analysis of these data centers, without targeting the recovery of the energy lost to the ambient. The present study fills this gap and proposes an efficient energy recovery system that uses the energy lost by the data center to cool the equipment and to provide an optimal micro-climate.
When the residual energy losses from the data centers are not sufficient to activate the cooling systems, supplemental energy from alternative sources is used, mainly solar energy, when available. Similarly, geothermal energy and other alternative energy sources can be used for the same purpose. In principle, the heat dissipated to the ambient by the data center racks is used to vaporize the cooling agent, thus activating the absorption LiBr-water refrigeration unit and chilling the water which in turn is used to cool the data center environment. When the thermal energy received from the data center's servers is not sufficient to activate the refrigeration unit, a dual system is proposed to heat the water needed for the absorption chiller, using solar energy or any other available regenerative heat source.

The current study provides a solution for the optimal design of the data center/server cooling system, the appropriate selection and the functional analysis of the absorption chiller, together with the complementary heat exchangers required by the overall system. The efficiency of the system is defined in terms of the Coefficient of Performance (COP), the ratio between the energy recovered from the system and the total power dissipated by the system. The COP values can reach 0.33 for a typical system.

2.1 Data Center Cooling - General Information
The cooling infrastructure is a significant part of a data center. The complex connection of chillers, compressors and air handlers creates the optimal
computing environment, ensuring the longevity of the servers installed within. The EPA's oft-cited 2007 report predicted that data center energy consumption, if left unchecked, would reach 100 billion kWh by 2011, with a corresponding energy bill of $7.4 billion. This conclusion is not driven strictly by Moore's law or by the need for greater bandwidth. In light of this news, the industry is turning a critical eye towards cooling, recognizing both the inefficiencies of current approaches and the improvements possible through new technologies. This information is intended to assist data center professionals who must balance the growing demand for computing power with the pressure to reduce energy consumption.

Until recently, no standard measurement existed for data center efficiency. The Green Grid, a consortium promoting responsible energy use within critical facilities, has introduced two new terms: Power Usage Effectiveness (PUE) and Data Center Infrastructure Efficiency (DCiE).

Power Usage Effectiveness (PUE)
PUE is derived by dividing the total incoming power by the IT equipment load. The total incoming power includes, in addition to the IT load, the data center's electrical and mechanical support systems such as chillers, air conditioners, fans, and power delivery equipment. Lower results are better, as they indicate that more of the incoming power is consumed by IT equipment instead of the intermediary support equipment. Cooling can be a major contributor to the PUE measurement. Consider the diagram in Fig. 1 [2], [3], where the combination of the chiller, humidifier, and CRAC consumes 45% of the total energy coming into the facility. The Uptime Institute approximates an industry average PUE of 2.5. There are no tiers or rankings associated with the values, but PUE allows facilities to benchmark, measure, and improve their efficiency over time. Companies with large-scale data center operations, like Google and Microsoft, have published their PUE.
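As a quick numerical illustration of the two Green Grid metrics (a minimal sketch; the function names are ours, not from The Green Grid):

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total incoming power divided by IT load.
    Lower is better; 1.0 would mean all power reaches the IT equipment."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

def dcie_percent(total_facility_kw: float, it_load_kw: float) -> float:
    """Data Center infrastructure Efficiency: the inverse of PUE, in percent."""
    return 100.0 * it_load_kw / total_facility_kw

# A facility matching the Uptime Institute industry average (PUE ~ 2.5):
print(pue(250.0, 100.0))           # 2.5
print(dcie_percent(250.0, 100.0))  # 40.0
```

Note that a DCiE of 33% corresponds to a PUE of 3.0, matching the relationship quoted in the text.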
In 2008, Google reported an average PUE of 1.21 across their six company data centers. Microsoft's new Chicago facility also calculated and published an average annual PUE.

Data Center Infrastructure Efficiency (DCiE)
DCiE is the inverse of PUE: (Total IT Power / Total Facility Power) x 100%. DCiE presents a quick snapshot of the amount of energy consumed by the IT equipment. To examine the relationship between PUE and DCiE: "A DCiE value of 33% (equivalent to a PUE of 3.0) suggests that the IT equipment consumes 33% of the power in the data center."

ASHRAE released their "2008 ASHRAE Environmental Guidelines for Datacom Equipment", which expanded the recommended environmental envelope as follows:

              2004 Version                      2008 Version
Temperature   20 °C (68 °F) to 25 °C (77 °F)    18 °C (64.4 °F) to 27 °C (80.6 °F)
Humidity      40% RH to 55% RH                  5.5 °C DP (41.9 °F) to 60% RH & 15 °C DP (59 °F DP)
Table 1. ASHRAE Environmental Guidelines for Datacom Equipment

Fig. 1. Where does the energy go? (Source: The Green Grid)

Cooling systems are categorized into air-cooled and liquid-cooled systems. The definitions of these categories are:
Air Cooling - Conditioned air is supplied to the inlets of the rack/cabinet for convection cooling of the heat rejected by the components of the electronic equipment within the rack. Within the rack, the transport of heat from the actual source component (e.g., CPU) can be either liquid or air based, but the heat rejection medium from the rack to the terminal cooling device outside of the rack is air.
Liquid Cooling - Conditioned liquid (e.g., water, refrigerant, etc., usually above dew point) is
channeled to the actual heat-producing electronic equipment components and used to transport heat from that component; the heat is rejected via a heat exchanger (air-to-liquid or liquid-to-liquid) or extended to the cooling terminal device outside of the rack.

3 Air Cooling Overview
Air cooling is the most common means of cooling electronic equipment within datacom rooms. Chilled air is delivered to the air intakes of the electronic equipment through underfloor, overhead, or local air distribution systems. Current industry guidelines recommend that electronic equipment be deployed in a hot-aisle/cold-aisle configuration, as illustrated in Figure 2 (ASHRAE TC). On each side of the cold aisle, electronic equipment is placed with the intakes (fronts) facing the cold aisle. The chilled air is drawn into the intake side of the electronic equipment and is exhausted from the rear of the equipment into the hot aisle.

Fig. 2. Hot-aisle/cold-aisle cooling principle.

In an underfloor distribution system, chilled air is distributed via a raised floor plenum and is introduced into the room through perforated floor tiles (Figure 3) and other openings in the raised floor (i.e., cable cutouts).

4 Liquid-Cooled Computer Equipment
Most computers are cooled with forced air. With the increased microprocessor power densities and rack heat loads, some equipment requires liquid cooling to maintain the equipment within the environmental specifications required by manufacturers. The liquids considered for cooling electronic equipment are water, Fluorinert™, or refrigerant. Manufacturers normally supply the cooling system as part of the computer equipment, and the liquid loop is internal to the equipment. The transfer of heat from the liquid-cooled computer system to the environment housing the racks takes place through a liquid-to-water or water/glycol heat exchanger.

Figure 4. Internal liquid cooling loop restricted within rack extent.

Fig. 3.
Raised floor implementation using baffles to limit hot-aisle/cold-aisle mixing.

Figures 5 and 6 depict the two possible liquid-cooled systems. Figure 4 shows a liquid loop internal to the rack where the exchange of heat with the room occurs through a liquid-to-air heat exchanger. In this case the rack appears to the client as an air-cooled rack and is classified as an air-cooled system; it is included here to show the evolution towards liquid-cooled systems. Figure 5 depicts a similar liquid loop internal to the rack used to cool the electronics within the rack, but in this case the heat exchange is with a liquid-to-chilled-water heat exchanger. Typically the liquid circulating within the rack is maintained above dew point to eliminate any condensation concerns. Figure 6 depicts a design very similar to Figure 5, but with some of the primary liquid loop components housed outside the rack to free more space within the rack for electronic components. The liquid loops for cooling the electronics shown in Figures 4, 5, and 6 [1] are typically of three types: a) Fluorinerts™ (fluorocarbon liquids); b) water (or water/glycol mix); c) refrigerants (pumped and vapor compression).
Figure 5. Internal liquid cooling loop extended to liquid-cooled external modular cooling unit.

Figure 6. Internal liquid cooling loop within rack extents, liquid cooling loop external to racks.

5 Proposed Energy Recovery System
As seen in Figure 1, only 30% of the total data center power is used to drive the server activity, while the remainder is lost as heat to the surroundings. Therefore, in this paper we propose to increase the power efficiency of the system (DCiE) by recycling the lost energy and using it primarily to cool the system equipment. The main energy recycling occurs at the IT level, by recovering the server rack dissipated power and using it to drive an absorption refrigeration plant that prepares the water necessary for the data center cooling. Additionally, the energy from the data center ambient air is recycled as well, yet this happens at a lower temperature (and exergy level), so it cannot produce the over 75 °C water needed to activate the absorption chiller.

Figure 7 shows the schematic of the IT energy recycling system and the hot water preparation process for the absorption chiller, which further prepares the chilled water needed to cool the environment (acclimatization). The system components are specified below. The system operates as follows: heat is taken from the data center servers by vaporizing a refrigerant (more details in the next section) and transferred in a heat exchanger, where the condensing refrigerant heats water to over 75 °C. The warm water enters the absorption chiller generator, which prepares the cold water that then enters the Air Handling Unit (AHU) to provide air conditioning for the data center environment.

Figure 7. Heat Recovery System: cooling the servers and driving the LiBr/H2O Absorption Chiller with the recovered energy to prepare cold water for conditioning the Data Center.
Legend: G - Generator; C - Condenser; V - Evaporator; A - Absorber; DC - Data Center; AC - Lithium-Bromide/Water Absorption Chiller; SC - Solar Collectors; WS - Warm Water Storage; AHU - Air Handling Unit; CT - Cooling Tower.

When the heat coming from the servers is not enough for the process, additional thermal energy from the Solar Collectors (SC) is employed. This energy is stored in the warm water storage tank (WS), then distributed to the absorption chiller generator to supplement the energy needed to activate the chiller. The system has a Cooling Tower which rejects the heat dissipated by the chiller's absorber and condenser units. This heat can be reused for other practical purposes, such as heating the AHU air or preparing utility/sanitary water.

5.1 Energy Balance for the Proposed Data Center Cooling Energy Recovery System
The component powers are defined as follows:
QS - server thermal power, kW;
QSC - solar thermal power, kW;
QG - thermal power of generator G (of chiller MF), kW;
QAHU - power needed to cool the air, kW.
The generator thermal power is:
QG = QS + QSC (1)
The chiller MF cooling power is thus:
QE = COP x QG (2)
where COP is the chiller Coefficient of Performance, with a value of ~0.6. QTR is the heat rejected in the Cooling Tower:
QTR = QC + QA = QE + QG (3)
where:
QC = the thermal power of the condenser in the chiller unit;
QA = the thermal power of the absorber in the chiller unit;
QAHU = the thermal power of the cooling air unit in the Air Handling Unit, which is:
QAHU = QE (4)
For a specific data center configuration with a known thermal power dissipated by the servers (QS = 52 kW), the thermal powers of the other components are calculated as:
- the thermal power of the generator G: QG ≈ QS = 52 kW;
- the refrigeration power: QE = 0.6 x 52 = 31.2 kW;
- the thermal power of the solar collector system, assumed to be about half of that of the generator G: QSC = 0.5 x QG = 26 kW;
- the thermal power of the Cooling Tower: QTR = 31.2 + 52 = 83.2 kW;
- the thermal power in the AHU battery: QAHU = 31.2 kW.
Referring to Figure 1, with the thermal powers calculated above, the data center infrastructure efficiency (DCiE) increases from ~33% to ~60%.

5.2 Details on the Server Heat Recovery Systems
The cooling refrigerant is chosen such that it will not interact chemically with the electronic chips and packages to be cooled. The refrigerant absorbs the heat from the components and vaporizes, then heats up the water as it condenses in the heat dissipation process. Two solutions are proposed: 1) the refrigerant is re-circulated over the surface of the servers using a pump (Figure 8); 2) the refrigerant circulates over the servers in a natural way, using a thermosyphon (Figure 9). For both scenarios the racks are mounted in a vertical position, unlike the typical horizontal position used for air-cooled systems.

Figure 8. First proposal for heat recovery from servers, with pump circulation.

The first solution, with pumps (Figure 8), includes: Data Center - DC, Server Plates - S, Liquid Pump - P, Liquid Distributor - D, Closed Server Room - SR.
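The energy balance of Section 5.1 can be checked numerically; a minimal sketch using the paper's values (QS = 52 kW, COP = 0.6; the variable names are ours):

```python
COP = 0.6     # absorption chiller coefficient of performance
Q_S = 52.0    # server thermal power, kW (given)

Q_G = Q_S            # generator power, taken approximately equal to the server power
Q_E = COP * Q_G      # refrigeration power, Eq. (2): 31.2 kW
Q_SC = 0.5 * Q_G     # assumed solar collector contribution: 26 kW
Q_TR = Q_E + Q_G     # cooling tower load, Eq. (3): 83.2 kW
Q_AHU = Q_E          # AHU cooling battery load, Eq. (4): 31.2 kW

print(Q_E, Q_SC, Q_TR, Q_AHU)
```

The cooling-tower load reproduces the paper's 83.2 kW figure, confirming that Eq. (3) is a simple energy balance over the chiller.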
The racks are placed in a room separated from the rest of the data center equipment. Below this encapsulated room, the liquid refrigerant is collected and re-circulated by pump P through the distributor D, flowing over the server plates in the form of a film. The refrigerant partially vaporizes by absorbing the heat from the servers; the vapors rise towards condenser C and the remaining liquid is collected at the bottom of the encapsulated room. The vapors condense on the surface of the water tubes; the water heats up and is distributed to the generator of the absorption chiller system. The water is recirculated via a 3-way throttle device until it reaches the temperature required for the solution boiling in the generator. The liquid refrigerant from condenser C returns to the system through distributor D. The power dissipated by the solution pump is negligible compared to the energy consumption inside the data center.

5.3 Data Center Option 1 Design Example
A smaller data center is chosen, having 2 racks, each with 60 server blades. Each server includes 2 CPU processors, each dissipating 105 W at about 90 °C. The overall thermal power dissipated by each rack is about 26 kW, thus both racks dissipate about 52 kW. Each server plate is 0.3 x 0.7 square meters and about 1 mm thick. The racks are placed vertically, forming an overall height of about 1.5 meters.

The thermal power of the condenser is 52 kW. R123 is the chosen refrigerant, having a lower saturation pressure at the phase-change temperatures compared to other refrigerants. The vaporizing and condensing temperatures are To ≈ Tc = 90 °C, at a saturation pressure Ps = 6.4 bar. The latent heat of vaporization is about 140 kJ/kg; the refrigerant mass flow rate needed to prepare the hot water is therefore ṁ = 0.37 kg/s. We choose the inlet/outlet chiller generator temperatures at 80/85 °C, as also seen in Figure 8; the hot water mass flow rate is in this case ṁw = 2.5 kg/s. The refrigerant volumetric flow rate at 90 °C is V̇ = 0.3x10⁻³ m³/s [4.8 m³/h]. We double the volumetric flow rate for the refrigerant recirculated by the pump, part of which will vaporize while the rest returns as liquid; the refrigerant pump is therefore chosen for a volumetric flow rate of V̇t = 0.6x10⁻³ m³/s. This refrigerant flow rate provides satisfactory 0.5 mm coolant films on the server blades.

Figure 9 presents the second solution, with the refrigerant circulated via a thermosyphon; the component details are provided in the legend. The refrigerant vapors rise along the server blades due to buoyancy, then condense on the surface of condenser C. There are two ways to achieve the upward motion of the refrigerant: one uses a custom-built heat tube surrounding each server blade; the second is to build the server blades such that the condensate can follow a separate return path.

Figure 9. Second proposal for heat recovery from servers, with thermosyphon.

The second solution, with a thermosyphon (Figure 9), includes: Data Center - DC, Server Plates - S, Liquid Pump - P, Liquid Distributor - D, Condenser - C, Liquid Tube with Distributor - LT, Closed Server Room - SR.
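The flow rates in the Section 5.3 design example follow from simple energy balances; a sketch in which the R123 latent heat (~140 kJ/kg near 90 °C, consistent with the quoted 0.37 kg/s) and the water specific heat are assumed property values, not figures from the paper:

```python
Q = 52.0         # condenser thermal power, kW
H_FG = 140.0     # assumed latent heat of R123 near 90 C, kJ/kg
CP_W = 4.186     # specific heat of water, kJ/(kg*K)
T_IN, T_OUT = 80.0, 85.0   # generator water inlet/outlet temperatures, C

m_ref = Q / H_FG                   # refrigerant mass flow rate: ~0.37 kg/s
m_w = Q / (CP_W * (T_OUT - T_IN))  # hot water mass flow rate: ~2.5 kg/s

print(round(m_ref, 2), round(m_w, 2))  # 0.37 2.48
```

Both results match the paper's quoted 0.37 kg/s of refrigerant and ~2.5 kg/s of generator hot water for the 80/85 °C temperature lift.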
Hence, the first solution (with recirculation pump) has several advantages over the second one, such as design simplicity, lower power consumption required to drive the pump, and higher system reliability. The system will be built and tested in order to confirm these hypotheses.

6 Conclusions
The paper presents a novel solution for recovering the thermal energy dissipated by data center server blades, using it to prepare the hot water that activates a LiBr-H2O absorption chiller, which in turn prepares the cold water necessary to cool the data center environment. When the energy dissipated by the data center is not enough to cover the input necessary for the absorption cooling process, additional energy is provided by a solar system or another renewable energy system. The proposed energy recovery system leads to a COP of ~0.6, double the 0.33 value of a system without recovery. The current study is purely theoretical, enabling a better understanding of the pros and cons of building such a recovery system to improve efficiency; the next step is to build an experimental setup to validate the proposed solution.

References:
[1] ASHRAE Datacom Series CD, 2nd Edition: Thermal Guidelines for Data Processing Environments; Power Trends and Cooling Applications; Design Considerations for Datacom Equipment Centers.
[2] Cappuccio, D. (2008). Creating Energy-Efficient, Low-Cost, High Performance Data Centers. Gartner Data Center Conference, p. 4, Las Vegas.
[3] EPA. (2007, August 2). EPA Report to Congress on Server and Data Center Energy Efficiency.
[4] McGuckin, P. (2008). Taming the Data Center Energy Beast. Gartner Data Center Conference, p. 5, Las Vegas.
[5] Sullivan, R. (2002). Alternating Cold and Hot Aisles Provides More Reliable Cooling for Server Farms.
[6] The Green Grid. (2008, October 21). Seven Strategies To Improve Data Center Cooling Efficiency.
[7] The Green Grid. (2009). The Green Grid: Home.
T E C H N O L O G Y W H I T E P A P E R Alcatel-Lucent Modular Cooling Solution Redundancy test results for pumped, two-phase modular cooling system Current heat exchange methods for cooling data centers
GUIDE TO ICT SERVER ROOM ENERGY EFFICIENCY. Public Sector ICT Special Working Group
GUIDE TO ICT SERVER ROOM ENERGY EFFICIENCY Public Sector ICT Special Working Group SERVER ROOM ENERGY EFFICIENCY This guide is one of a suite of documents that aims to provide guidance on ICT energy efficiency.
Rittal White Paper 508: Economized Data Center Cooling Defining Methods & Implementation Practices By: Daniel Kennedy
Rittal White Paper 508: Economized Data Center Cooling Defining Methods & Implementation Practices By: Daniel Kennedy Executive Summary Data center owners and operators are in constant pursuit of methods
DESIGN ASPECT OF ENERGY EFFICIENT DATA CENTER
DESIGN ASPECT OF ENERGY EFFICIENT DATA CENTER Omendra K. Govind 1, Annu Govind 2 1 Advanced Level Telecom Training Centre (ALTTC), BSNL, Ghaziabad,(India) 2 Research Scholar, Department of Electrical Engineering,
The CEETHERM Data Center Laboratory
The CEETHERM Data Center Laboratory A Platform for Transformative Research on Green Data Centers Yogendra Joshi and Pramod Kumar G.W. Woodruff School of Mechanical Engineering Georgia Institute of Technology
DATA CENTER COOLING INNOVATIVE COOLING TECHNOLOGIES FOR YOUR DATA CENTER
DATA CENTER COOLING INNOVATIVE COOLING TECHNOLOGIES FOR YOUR DATA CENTER DATA CENTERS 2009 IT Emissions = Aviation Industry Emissions Nations Largest Commercial Consumers of Electric Power Greenpeace estimates
Recommendations for Measuring and Reporting Overall Data Center Efficiency
Recommendations for Measuring and Reporting Overall Data Center Efficiency Version 2 Measuring PUE for Data Centers 17 May 2011 Table of Contents 1 Introduction... 1 1.1 Purpose Recommendations for Measuring
Energy Efficiency and Green Data Centers. Overview of Recommendations ITU-T L.1300 and ITU-T L.1310
Energy Efficiency and Green Data Centers Overview of Recommendations ITU-T L.1300 and ITU-T L.1310 Paolo Gemma Chairman of Working Party 3 of ITU-T Study Group 5 International Telecommunication Union Agenda
Data center upgrade proposal. (phase one)
Data center upgrade proposal (phase one) Executive Summary Great Lakes began a recent dialogue with a customer regarding current operations and the potential for performance improvement within the The
Benefits of. Air Flow Management. Data Center
Benefits of Passive Air Flow Management in the Data Center Learning Objectives At the end of this program, participants will be able to: Readily identify if opportunities i where networking equipment
Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001
Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001 Presented by: Liberty Engineering, LLP 1609 Connecticut Avenue
Case Study: Innovative Energy Efficiency Approaches in NOAA s Environmental Security Computing Center in Fairmont, West Virginia
Case Study: Innovative Energy Efficiency Approaches in NOAA s Environmental Security Computing Center in Fairmont, West Virginia Prepared for the U.S. Department of Energy s Federal Energy Management Program
Reducing the Annual Cost of a Telecommunications Data Center
Applied Math Modeling White Paper Reducing the Annual Cost of a Telecommunications Data Center By Paul Bemis and Liz Marshall, Applied Math Modeling Inc., Concord, NH March, 2011 Introduction The facilities
Office of the Government Chief Information Officer. Green Data Centre Practices
Office of the Government Chief Information Officer Green Data Centre Practices Version : 2.0 April 2013 The Government of the Hong Kong Special Administrative Region The contents of this document remain
Energy Performance Optimization of Server Room HVAC System
International Journal of Thermal Technologies E-ISSN 2277 4114 2 INPRESSCO, All Rights Reserved Available at http://inpressco.com/category/ijtt/ Research Article Manoj Jadhav * and Pramod Chaudhari Department
International Telecommunication Union SERIES L: CONSTRUCTION, INSTALLATION AND PROTECTION OF TELECOMMUNICATION CABLES IN PUBLIC NETWORKS
International Telecommunication Union ITU-T TELECOMMUNICATION STANDARDIZATION SECTOR OF ITU Technical Paper (13 December 2013) SERIES L: CONSTRUCTION, INSTALLATION AND PROTECTION OF TELECOMMUNICATION CABLES
Recommendations for Measuring and Reporting Overall Data Center Efficiency
Recommendations for Measuring and Reporting Overall Data Center Efficiency Version 1 Measuring PUE at Dedicated Data Centers 15 July 2010 Table of Contents 1 Introduction... 1 1.1 Purpose Recommendations
Defining Quality. Building Comfort. Precision. Air Conditioning
Defining Quality. Building Comfort. Precision Air Conditioning Today s technology rooms require precise, stable environments in order for sensitive electronics to operate optimally. Standard comfort air
Cooling Audit for Identifying Potential Cooling Problems in Data Centers
Cooling Audit for Identifying Potential Cooling Problems in Data Centers By Kevin Dunlap White Paper #40 Revision 2 Executive Summary The compaction of information technology equipment and simultaneous
Data Center Industry Leaders Reach Agreement on Guiding Principles for Energy Efficiency Metrics
On January 13, 2010, 7x24 Exchange Chairman Robert Cassiliano and Vice President David Schirmacher met in Washington, DC with representatives from the EPA, the DOE and 7 leading industry organizations
The Different Types of Air Conditioning Equipment for IT Environments
The Different Types of Air Conditioning Equipment for IT Environments By Tony Evans White Paper #59 Executive Summary Cooling equipment for an IT environment can be implemented in 10 basic configurations.
Data Center Rack Level Cooling Utilizing Water-Cooled, Passive Rear Door Heat Exchangers (RDHx) as a Cost Effective Alternative to CRAH Air Cooling
Data Center Rack Level Cooling Utilizing Water-Cooled, Passive Rear Door Heat Exchangers (RDHx) as a Cost Effective Alternative to CRAH Air Cooling Joshua Grimshaw Director of Engineering, Nova Corporation
Guide to Minimizing Compressor-based Cooling in Data Centers
Guide to Minimizing Compressor-based Cooling in Data Centers Prepared for the U.S. Department of Energy Federal Energy Management Program By: Lawrence Berkeley National Laboratory Author: William Tschudi
News in Data Center Cooling
News in Data Center Cooling Wednesday, 8th May 2013, 16:00h Benjamin Petschke, Director Export - Products Stulz GmbH News in Data Center Cooling Almost any News in Data Center Cooling is about increase
FEAR Model - Review Cell Module Block
F.E.A.R. Returns: 5 Steps Toward Data Center Efficiency Today Objectives Benefits for ITS professionals Brief review of the FEAR data center Define data center efficiency Link efficiency to the FEAR data
How to Meet 24 by Forever Cooling Demands of your Data Center
W h i t e P a p e r How to Meet 24 by Forever Cooling Demands of your Data Center Three critical aspects that are important to the operation of computer facilities are matching IT expectations with the
Data Center Environments
Data Center Environments ASHRAE s Evolving Thermal Guidelines By Robin A. Steinbrecher, Member ASHRAE; and Roger Schmidt, Ph.D., Member ASHRAE Over the last decade, data centers housing large numbers of
Hot Air Isolation Cools High-Density Data Centers By: Ian Seaton, Technology Marketing Manager, Chatsworth Products, Inc.
Business Management Magazine Winter 2008 Hot Air Isolation Cools High-Density Data Centers By: Ian Seaton, Technology Marketing Manager, Chatsworth Products, Inc. Contrary to some beliefs, air is quite
Case Study: Opportunities to Improve Energy Efficiency in Three Federal Data Centers
Case Study: Opportunities to Improve Energy Efficiency in Three Federal Data Centers Prepared for the U.S. Department of Energy s Federal Energy Management Program Prepared By Lawrence Berkeley National
Center Thermal Management Can Live Together
Network Management age e and Data Center Thermal Management Can Live Together Ian Seaton Chatsworth Products, Inc. Benefits of Hot Air Isolation 1. Eliminates reliance on thermostat 2. Eliminates hot spots
Glossary of Heating, Ventilation and Air Conditioning Terms
Glossary of Heating, Ventilation and Air Conditioning Terms Air Change: Unlike re-circulated air, this is the total air required to completely replace the air in a room or building. Air Conditioner: Equipment
How To Run A Data Center Efficiently
A White Paper from the Experts in Business-Critical Continuity TM Data Center Cooling Assessments What They Can Do for You Executive Summary Managing data centers and IT facilities is becoming increasingly
Combining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency
A White Paper from the Experts in Business-Critical Continuity TM Combining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency Executive Summary Energy efficiency
IT@Intel. Thermal Storage System Provides Emergency Data Center Cooling
White Paper Intel Information Technology Computer Manufacturing Thermal Management Thermal Storage System Provides Emergency Data Center Cooling Intel IT implemented a low-cost thermal storage system that
Energy Efficiency Best Practice Guide Data Centre and IT Facilities
2 Energy Efficiency Best Practice Guide Data Centre and IT Facilities Best Practice Guide Pumping Systems Contents Medium-sized data centres energy efficiency 3 1 Introduction 4 2 The business benefits
Introduction to Datacenters & the Cloud
Introduction to Datacenters & the Cloud Datacenter Design Alex M. Hurd Clarkson Open Source Institute March 24, 2015 Alex M. Hurd (COSI) Introduction to Datacenters & the Cloud March 24, 2015 1 / 14 Overview
A Comparative Study of Various High Density Data Center Cooling Technologies. A Thesis Presented. Kwok Wu. The Graduate School
A Comparative Study of Various High Density Data Center Cooling Technologies A Thesis Presented by Kwok Wu to The Graduate School in Partial Fulfillment of the Requirements for the Degree of Master of
Creating Data Center Efficiencies Using Closed-Loop Design Brent Goren, Data Center Consultant
RESEARCH UNDERWRITER WHITE PAPER LEAN, CLEAN & GREEN Wright Line Creating Data Center Efficiencies Using Closed-Loop Design Brent Goren, Data Center Consultant Currently 60 percent of the cool air that
IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT
DATA CENTER RESOURCES WHITE PAPER IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT BY: STEVE HAMBRUCH EXECUTIVE SUMMARY Data centers have experienced explosive growth in the last decade.
Green Data Centers development Best Practices
Draft Recommendation ITU-T L.DC Green Data Centers development Best Practices Summary This recommendation is a contribution for the reduction of negative climate impact of Data Centers.
NetApp Data Center Design
NetApp Data Center Design Presentation for SVLG Data Center Efficiency Summit November 5, 2014 Mark Skiff, PE, CEM, CFM Director, IT Data Center Solutions Data Center Design Evolution Capacity Build Cost
HPC TCO: Cooling and Computer Room Efficiency
HPC TCO: Cooling and Computer Room Efficiency 1 Route Plan Motivation (Why do we care?) HPC Building Blocks: Compuer Hardware (What s inside my dataroom? What needs to be cooled?) HPC Building Blocks:
Optimization of energy consumption in HPC centers: Energy Efficiency Project of Castile and Leon Supercomputing Center - FCSCL
European Association for the Development of Renewable Energies, Environment and Power Quality (EA4EPQ) International Conference on Renewable Energies and Power Quality (ICREPQ 12) Santiago de Compostela
Unified Physical Infrastructure (UPI) Strategies for Thermal Management
Unified Physical Infrastructure (UPI) Strategies for Thermal Management The Importance of Air Sealing Grommets to Improving Smart www.panduit.com WP-04 August 2008 Introduction One of the core issues affecting
Presentation Outline. Common Terms / Concepts HVAC Building Blocks. Links. Plant Level Building Blocks. Air Distribution Building Blocks
Presentation Outline Common Terms / Concepts HVAC Building Blocks Plant Level Building Blocks Description / Application Data Green opportunities Selection Criteria Air Distribution Building Blocks same
An Overview of Solar Assisted Air-Conditioning System Application in Small Office Buildings in Malaysia
An Overview of Solar Assisted Air-Conditioning System Application in Small Office Buildings in Malaysia LIM CHIN HAW 1 *, KAMARUZZAMAN SOPIAN 2, YUSOF SULAIMAN 3 Solar Energy Research Institute, University
Energy Efficient Server Room and Data Centre Cooling. Computer Room Evaporative Cooler. EcoCooling
Energy Efficient Server Room and Data Centre Cooling Computer Room Evaporative Cooler EcoCooling EcoCooling CREC Computer Room Evaporative Cooler Reduce your cooling costs by over 90% Did you know? An
