Challenges In Intelligent Management Of Power And Cooling Towards Sustainable Data Centre
S. Luong 1*, K. Liu 2, James Robey 3
1 Technologies for Sustainable Built Environments, University of Reading, United Kingdom
2 Informatics Research Centre, Henley Business School, University of Reading, United Kingdom
3 Capgemini UK, Woking, Surrey
*Corresponding author: t.s.luong@student.reading.ac.uk

ABSTRACT
Power consumption in data centres has been rising rapidly, and on current trends this will inevitably cause major challenges for both data centre designers and operators. More than a decade ago, a standard server cabinet housed one or two computers using single-core processors and drawing a low amount of power (around 5 kW per cabinet), so the energy required to provide conditioned air was not a major concern for data centre operators. In contrast, in 2011 a server cabinet of similar dimensions can house up to 128 blade servers, making it far more densely populated and therefore drawing more power and dissipating more heat per square metre. Due to this increase in heat dissipation, data centre cooling systems are at risk of being unable to provide adequate cooling. Both the increased density of computer hardware and the resulting cooling demands are having a huge impact on power consumption. This paper examines the challenges data centres face regarding power and cooling management. It also makes reference to the environmental requirements and specifications recommended for organisations based on best practices. We then look at one current practice for building a sustainable state-of-the-art data centre, and finally present a recommended approach to reducing power consumption in cooling systems.
Keywords: Data centres, sustainability, power, cooling, intelligent management
1. INTRODUCTION
Data centres around the world are encountering power, cooling, space and environmental issues while supporting the growth of their organisations. Due to the demand for more processing capability and storage space, data centres are on the verge of running out of floor space and of consuming more electrical power than can be supplied. As IT equipment becomes more compact, it draws more electrical power per square foot. Data centres consume large amounts of electricity: the total is estimated at between 1.5 and 3 per cent of all electricity generated. In addition, the cost of powering and cooling IT equipment for three years is equivalent to 1.5 times the cost of purchasing the server hardware (Brill, 2007). According to Scheihing (2009), powering data centres in the United States costs $4.5 billion, a figure predicted to grow by 12% per year. And, according to the Code of Conduct on Data Centres
Energy Efficiency (European Commission, 2008), data centres in Western Europe had consumed 56 TWh of electricity. If data centre power consumption follows the predicted unsustainable trend, there will be a serious power shortage, as energy suppliers will not be able to cope with such demand. The management of power and cooling towards a sustainable data centre is a combination of best practices, hardware and software. Best practices include following standards and guidelines and using the best approaches; hardware involves the selection of the IT equipment installed in the data centre; and software comprises the technologies used for the management of power and cooling. There needs to be a balance between the three categories in order to achieve an energy-efficient and sustainable data centre. For example, a data centre professional might design a new data centre to include the best practices, but must also consider that there could be potential for the data centre to use heat reclamation technology that is further enhanced by a cheap software solution. In this paper, we discuss the challenges of power and cooling in section 2 and examine the environmental requirements and specifications in section 3, exploring how the requirements have changed over time to help data centres reduce power consumption and better manage their cooling systems. Section 4 describes how a state-of-the-art data centre utilises best practices and standards to construct the leading sustainable data centre. Section 5 presents a recommended approach to the intelligent management of power and cooling, and section 6 concludes the paper.
2. THE CHALLENGE OF POWER & COOLING
There has been a dramatic increase in the number of organisations adopting environmentally sustainable policies in order to be environmentally friendly and ultimately reduce their operating costs.
One of the key factors always taken into consideration when an organisation implements such a strategy is reducing IT operating costs. Data centres rely on power availability and transmission capability, both of which are being affected by the increasing demand for more computing power. In the United States, the energy used by data centres more than doubled between 2000 and 2006, as illustrated in Figure 1 below.
Figure 1 Past and Projected Electricity Use by Data Centres in the U.S. (U.S. Environmental Protection Agency, 2007)
Figure 1 also shows predicted scenarios so that comparisons can be made on the future of energy use. On the historical trend, data centre energy usage in 2000 was approximately 30 billion kWh/year, and by 2011 it was predicted to exceed 120 billion kWh/year. Clearly the Historical Trend scenario is highly unsustainable, due to the cost of maintaining data centre operations and the lack of power stations to provide enough electricity. Organisations would want to achieve the Best Practice or State-of-the-Art scenario, that is, to significantly reduce their energy consumption. There are many challenges a data centre operator will constantly encounter because of ever-changing technology. Some of the common challenges include new system deployments or upgrades, scalability and life cycle costs. To elaborate on the challenge of deploying new systems or upgrading them: in the past, when demand was not high, it was very easy to commission new servers and add more CRAC (computer room air conditioning) units inside the data centre, and the energy cost of powering and cooling servers was not a major issue for organisations. However, today's IT equipment is considerably more compact and delivers greater performance than a decade ago, causing power density to increase dramatically. Because technology is always advancing, a data centre should be designed to adapt and scale to accommodate these changes. But realistically it is difficult to determine the number of years a data centre should be designed for, knowing that in future high performance computing will always need more energy. The major challenge for data centre operators is how to cool densely packed IT equipment. A decade ago, a server rack's design power was approximately 5 kW; this is expected to rise to 37 kW. The more power each rack consumes, the more heat is generated.
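To make the scale of the cooling problem concrete, the sensible heat equation relates a rack's heat load to the airflow a cooling system must deliver. The sketch below is illustrative only: the air density, specific heat and the 12 K supply-to-return temperature rise are assumed values, not figures from this paper.

```python
# Airflow needed to remove a rack's heat load, from the sensible heat
# equation Q = rho * cp * V * dT, where rho is air density, cp the
# specific heat of air, V the volumetric flow and dT the temperature
# rise of the air across the rack.
RHO_AIR = 1.2    # kg/m^3, air at roughly 20 C (assumed)
CP_AIR = 1005.0  # J/(kg*K) (assumed)

def required_airflow_m3s(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) needed to remove heat_load_w watts
    at a given air temperature rise delta_t_k across the rack."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_k)

def m3s_to_cfm(v_m3s: float) -> float:
    """Convert m^3/s to cubic feet per minute."""
    return v_m3s * 2118.88

# A legacy 5 kW rack versus a densely packed 37 kW rack, both with an
# assumed 12 K air temperature rise:
for load_w in (5_000, 37_000):
    v = required_airflow_m3s(load_w, 12.0)
    print(f"{load_w / 1000:.0f} kW rack: {v:.2f} m^3/s ({m3s_to_cfm(v):.0f} CFM)")
```

The roughly sevenfold jump in required airflow between the two racks is why the raised-floor plenum and tile placement constraints discussed below set a hard upper boundary on air cooling.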
Essentially, even with investments in advanced cooling solutions, there has to be a trade-off between maximising the floor space with many low-performance servers and having fewer, more powerful servers so that less floor space is used. The technical challenge is how to deliver cooling to, or remove heat from, the IT equipment. We will consider the two main solutions and the technical challenges involved in air and water cooling. Air cooling involves cooling the entire facility and server platform with air. This normally involves a direct expansion (DX) chiller, an air-side economiser and fans, all of which increase the energy consumption of a data centre. Some of this equipment, such as an air-side economiser to exhaust hot air out of the data centre, may be costly and difficult to retrofit; it often does not offer a good return on investment if the initial data centre design did not include it. On the other hand, if a data centre was designed to maximise the equipment's usage, it can be very efficient. The other limitation affecting air cooling and airflow to the server racks is the practical constraints of a data centre's existing design. For example, if a data centre has a raised floor plenum height of 24 inches with an under-floor cabling system causing obstruction, these will set an upper boundary on the air cooling. It is difficult to determine this upper boundary due to the lack of real-time airflow measurements, so data centre operators instead estimate it from operational experience. As for cooling the actual server racks, manufacturers typically provide airflow data to inform data centre operators how much cooling a server rack needs. Ultimately, the ability to optimise a data centre's capacity to provide cool air and reduce power consumption comes down to how it is configured, i.e.
using a hot aisle/cold aisle configuration (Wang, 2006), following best practices on the placement and proximity of perforated tiles, and using blanking plates and cut-outs to prevent hot and cold air mixing. To further optimise air cooling capacity, containment doors and roofs can be used for either the hot or the cold aisle (U.S. Department of Energy, 2011). But all of these enhancements come at additional cost and complication. The alternative, water cooling, aims to bring cooled liquid closer to the server racks. Water is a significantly better thermal conductor than air (Engineering Toolbox, 2011), so it should be more effective at heat removal. There are several variants of liquid cooling, ranging from chip- or rack-level cooling to liquid-cooled doors positioned behind the racks. Chip- and rack-level liquid cooling are clearly the best solutions for removing heat, especially from the CPU, but this type of solution is highly complex and expensive; it would not be considered mainstream, but suits situations where many high-performance server racks have to be densely packed together. Regardless of which liquid-cooled variant is used, there still needs to be some airflow to cool the other components on the server platform and for the actual heat exchange between the server rack and the liquid cooling system. Liquid cooling solutions can be retrofitted fairly simply into data centres that have a raised floor plenum, but there is the potential risk of the fluid evaporating or leaking onto the IT equipment and causing serious physical damage, and with all liquid cooling solutions there is the additional cost of installing a leak detection system. There are many discussions (Ellsworth, et al., 2008; Sharma, et al., 2009) around the efficiency of liquid cooling, since even basic air cooling involves a heat exchange using liquid in the chillers; water cooling simply brings that heat exchange closer to the racks to better cool the servers. But, as mentioned previously, there still needs to be some airflow to cool the entire server platform.
3. ENVIRONMENTAL REQUIREMENTS & SPECIFICATIONS
Before 2002, the thermal design of IT equipment was rather complex due to the lack of standards and environmental specifications. Equipment vendors manufactured their products to their own specifications, so data centres generally had to accommodate varying environmental requirements, which made cooling and heat removal more challenging. Post-2002, data centres continue to accommodate products from different vendors. A technical committee formed by the American Society of Heating, Refrigeration and Air-Conditioning Engineers published Thermal Guidelines for Data Processing Environments (ASHRAE, 2004), which maintains the Equipment Environmental Specifications, a recommended operating envelope that is now a widely accepted standard for manufacturing IT equipment for data centres. This ensured that participating vendors manufactured their products within the specified design envelope, alleviating some of the cooling challenges data centres were facing.
Furthermore, in order to help data centres reduce their energy consumption, the Equipment Environmental Specifications have been revised, as shown in Table 1.

                        2004 Version     2008 Version
High End Temperature    25°C (77°F)      27°C (80.6°F)
Low End Temperature     20°C (68°F)      18°C (64.4°F)
High End Moisture       55% RH           60% RH & 15°C DP (59°F)
Low End Moisture        40% RH           5.5°C DP (41.9°F)

Table 1 Comparison of the 2004 and 2008 recommended operating envelopes (RH = relative humidity, DP = dew point)

In 2004, the temperature and humidity boundaries for IT equipment were first defined, allowing manufacturers to test their products within that design envelope to ensure the equipment operates reliably within those boundaries, since high operating temperatures may cause hardware failure or reduced reliability. In 2008, the recommended operating envelope was revised to help reduce power consumption. Although this does not ensure optimum energy efficiency, it does offer a window of opportunity to save energy compared with the 2004 version. This can be
achieved by lowering the air optimiser fan speeds or the output of the DX (direct expansion) chilling units to allow the ambient temperature in the data centre to be raised. Improvements made to IT equipment, such as variable fan speed, give data centres the flexibility to operate at the recommended temperatures. The temperature of the components on the IT equipment is affected by the ambient temperature and tracks it closely. For example, in a scenario where the equipment uses a constant fan speed at maximum power, an inlet ambient temperature of 17°C would result in a component temperature of 40°C, and an inlet ambient temperature of 38°C in a component temperature of 60°C. Considering that in this scenario 60°C is the limit of the allowable temperature range for reliable operation, a variable fan speed can run at a slow rate at lower temperatures to save energy. Between inlet ambient temperatures of 16°C and 25°C the fan runs at its low rate and the component temperature remains at 60°C, but beyond 25°C the fan rate increases to maintain a constant component temperature of 60°C, therefore not affecting the reliability of the components. Overall, a variable fan speed saves energy across the 16°C to 25°C operating range in comparison with a constant fan speed. As with temperature, raising the boundary limits for relative humidity could affect the performance and reliability of components (ASHRAE, 2008). High relative humidity causes conductive anodic filament failure on printed circuit boards; in combination with common atmospheric contaminants it also causes hygroscopic corrosion. Low relative humidity encourages electrostatic discharge (ESD), as very high voltages can build up in very dry environments. In severe instances, ESD can damage sensitive electronics, causing hardware failures.
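The variable fan speed behaviour in the scenario above can be sketched as a simple control curve. This is a hand-built illustration of the idea, not a vendor algorithm: the low fan rate of 0.3, the linear ramp between the 25°C knee and a 38°C full-speed inlet temperature, and the cubic fan-power law are all assumptions.

```python
# A minimal sketch of variable fan speed control: the fan idles at a
# low rate while the component can be held at its reliability limit,
# and speeds up linearly above a knee inlet temperature.
LOW_RATE = 0.3       # fan rate as a fraction of maximum (assumed)
MAX_RATE = 1.0
T_KNEE = 25.0        # inlet temp (C) above which the fan speeds up
T_FULL_SPEED = 38.0  # inlet temp (C) at which the fan reaches maximum

def fan_rate(inlet_c: float) -> float:
    """Fan rate for a given inlet ambient temperature, ramping
    linearly between the knee and the full-speed inlet temperature."""
    if inlet_c <= T_KNEE:
        return LOW_RATE
    if inlet_c >= T_FULL_SPEED:
        return MAX_RATE
    frac = (inlet_c - T_KNEE) / (T_FULL_SPEED - T_KNEE)
    return LOW_RATE + frac * (MAX_RATE - LOW_RATE)

def relative_fan_power(inlet_c: float) -> float:
    """Fan power scales roughly with the cube of fan speed, which is
    where the saving in the low-rate band comes from."""
    return fan_rate(inlet_c) ** 3
```

Under the cubic law, running at 30% speed in the 16°C to 25°C band consumes under 3% of full-speed fan power, which illustrates why the revised envelope creates a real energy-saving opportunity.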
Following the standards and guidelines initially helped data centres tackle some of the cooling challenges, but as power density increased over the years the recommended operating temperature guidelines were revised to help reduce power consumption. As IT vendors continue to improve their manufacturing processes and produce more durable equipment, it is expected that the Equipment Environmental Specifications will be revised again, so that data centres can further improve their energy efficiency and have more flexibility in how they accommodate IT equipment.
4. THE STATE-OF-THE-ART COOLING AND POWER REDUCTION
The efficiency of a data centre is measured by the Power Usage Effectiveness (PUE) metric (The Green Grid, 2007). PUE is the ratio of the total power consumed by a data centre to the power consumed by the IT equipment. The theoretical minimum PUE is 1.0, meaning that for every watt of IT power no additional watt is consumed for cooling or for distributing power to the IT equipment. The report to the U.S. Congress (U.S. Environmental Protection Agency, 2007) listed four categories of data centre efficiency, as shown in Table 2 below.

Scenario    Current Trends    Improved Operations    Best Practices    State-of-the-Art
PUE
Table 2 Data Centre Efficiency Targets (U.S. Environmental Protection Agency, 2007)

In July 2010, the industry average data centre PUE was 2.5 (Miller, 2010). During 2010, the IT firm Capgemini constructed a new, highly resilient (Uptime Institute Tier 3 certified) data centre called Merlin (Capgemini, 2010), which achieved a PUE of 1.11, largely driven by its use of fresh air cooling within a modular approach. Other factors contributing to the industry-leading PUE and sustainability credentials included its location, which enabled the use of fresh-air cooling, and its use of renewable and reusable components. Merlin can accommodate up to 12 modular data halls within a 3,000 m2 brown-field warehouse, each module housing up to 104 racks. Figure 4 illustrates the floor plan of one of the data centre modules. The design follows a front-to-back configuration to ensure the separation of hot and cold air. Rather than using an under-floor plenum to supply cold air and letting hot air rise naturally to the ceiling, the module is on a single floor with the cold air supply contained and fed directly to the front of the cabinets, while the hot air is ducted directly away to the air optimiser or exhausted out of the module.
Essentially, both cold air and hot air are contained and ducted directly to and from the IT equipment to maximise the module's potential for delivering cool air and removing heat at the source. Air temperature, humidity and velocity sensors are located in each cold aisle and connected to a building management system (BMS). The cold aisles have doors with louvres, which the BMS sets to control how much cold air is supplied to each of the four cold aisles. Ultimately, energy is saved through the use of 12 variable speed fans adjusted by the BMS to meet the servers' cooling demands. A climate-control module positioned on the left side of the data hall is managed by the BMS and supplies air at the right temperature; because this is a modular data centre, another such module could, if required, be attached to the right side to supply cold air.
Figure 4 Merlin data centre module floor plan (Capgemini, 2010)
Merlin's use of free fresh air cooling, carefully channelled through the IT equipment, is critical to achieving its low PUE.
5. RECOMMENDED COOLING METHODOLOGIES
As mentioned in the introduction, achieving a sustainable data centre is a combination of best practices, hardware and software. Data centre professionals design the infrastructure based on best practices, guidelines and standards, but ultimately for the purpose of distributing power to the IT equipment and providing an environment specifically for cooling high-performance computers. Clearly, there have been major advancements in all three categories: best practices now include the use of free cooling to supply cold air and photovoltaic panels for generating electricity on-site; hardware offers variable speed fans on server components and water-cooled heat sinks pre-built onto processors for quick and easy installation; and intelligent BMS software provides climate control. In future work, we will examine the optimal density of server racks and how it affects the whole cooling process, energy consumption, total cost of ownership and data centre design in general. This paper has discussed the challenges of cooling a data centre and the standards and practices involved, which raises two research questions: how closely should server racks be installed together in both air and water cooling solutions, and would organisations be willing to upgrade to more powerful servers if they knew this would help tackle the challenge of power and cooling in data centres and thus reduce their total cost of ownership?
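Since the approaches above are ultimately judged by the PUE metric of section 4, it is worth making its arithmetic explicit: PUE is total facility power divided by IT power, so the overhead of cooling and power distribution is directly visible in the ratio. A minimal sketch follows; the kilowatt loads are hypothetical values chosen only to reproduce the two PUE figures quoted in this paper.

```python
# PUE (The Green Grid, 2007): total facility power over IT power.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness; 1.0 is the theoretical minimum,
    meaning zero cooling and power-distribution overhead."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical loads reproducing the figures quoted in this paper:
print(pue(2500.0, 1000.0))  # 2.5  -- 2010 industry average (Miller, 2010)
print(pue(1110.0, 1000.0))  # 1.11 -- Merlin (Capgemini, 2010)
```

Read this way, the gap between 2.5 and 1.11 means the average 2010 facility spent 1.5 W on overhead per watt of IT load, where Merlin spent 0.11 W.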
We will also research the use of intelligent agents to improve power distribution and cold air delivery to the IT equipment. There is a great deal of research into the use of intelligent agents or multi-agent systems for building control, with the emphasis on minimising energy consumption while providing a comfortable environment for human users (Duangsuwan & Liu, 2009; Wang et al., 2010). However, there appears to be a gap in this field: intelligent agents have not been applied to industrial buildings that accommodate only IT equipment rather than human users. There is great potential for building control in data centres to be optimised or made intelligent, giving it the ability to learn to improve power consumption and cooling systems. The next steps will require a survey of why data centres have not explored the use of intelligent agents, and of how intelligent agents could be a possible solution to the ongoing challenge of managing power and cooling.
6. CONCLUSION
Data centres are becoming increasingly challenging to maintain due to the increasing density of IT equipment and the difficulties of heat dissipation and heat removal at the source. Organisations are usually constrained by their existing data centre facilities and are therefore looking for cost-effective solutions to ensure their IT services can be maintained at lower cost, using less energy and minimising carbon emissions. Designing a sustainable data centre is a combination of best practices, hardware and software. The business challenge across the data centre industry is keeping energy expenses and carbon emissions low whilst maintaining a reliable service and meeting increasing demands for computing power. As technology continues to decrease in size, more performance will be packed more densely into each server rack, requiring more energy and generating more heat.
Both air and water cooling solutions have technical challenges and limitations. When trying to improve the cooling system of an existing data centre, there is the risk of the equipment not performing to expectations due to incompatibility with the data centre design. The existing infrastructure and the initial design choices (e.g. the height of the under-floor plenum) will already have determined the upper boundary (in terms of heat dissipation) and the lifespan of the data centre. It then becomes a question of how long before the IT equipment reaches the data centre's upper boundary, at which point the facility can no longer deliver sufficient cooling, forcing the organisation to consider building an additional data centre. The environmental specifications for data centres were first defined in 2004, when data centre designers and IT vendors voluntarily participated in using the standard to help reduce the challenge of power and cooling. In 2008, the specifications were revised. IT hardware vendors are now manufacturing more durable hardware, which allows data centre operators to
raise the ambient temperature and humidity levels inside their data centres, enabling reductions in power consumption and carbon emissions. It is expected the specifications will be revised again in the near future as IT vendors manufacture more durable hardware able to handle hotter environments. The state-of-the-art data centre constructed by Capgemini has achieved a world-leading PUE of 1.11 through the innovative design of a modular data centre combined with best practice air handling enabling free fresh air cooling. Data centres continue to improve through various best practices and approaches, and through being selective about the IT equipment deployed. Future work will involve researching the optimal density of server racks and the use of intelligent agents for building control in data centres.

REFERENCES
ASHRAE, 2004, Thermal Guidelines for Data Processing Environments, ASHRAE Publication.
ASHRAE, 2008, 2008 ASHRAE Environmental Guidelines for Datacom Equipment: Expanding the Recommended Environmental Envelope.
Brill, K. G., 2007, Data Center Energy Efficiency and Productivity, White Paper, The Uptime Institute, Inc., The Uptime Institute Symposium 2007: The Invisible Crisis in the Data Center: How IT Performance is Driving the Economic Meltdown of Moore's Law.
Capgemini, 2010, A Closer Look at Merlin: Technical specifications for the world's most sustainable data centre, Capgemini UK.
Duangsuwan, J. & Liu, K., 2009, Normative Multi-Agent System for Intelligent Building Control, 2009 Pacific-Asia Conference on Knowledge Engineering and Software Engineering, IEEE.
Ellsworth, M. J., et al., 2008, The Evolution of Water Cooling for IBM Large Server Systems: Back to the Future, 11th Intersociety Conference on Thermal and Thermomechanical Phenomena in Electronic Systems, ITHERM 2008.
European Commission, 2008, Code of Conduct on Data Centres Energy Efficiency, Version 1.0.
Miller, R., 2010, How A Good PUE Can Save 10 Megawatts, Data Center Knowledge, 13th September 2010, [online] Available at: datacenterknowledge.com [accessed: 29/5/2011].
Scheihing, P., 2009, DOE Data Center Energy Efficiency Program, U.S. Department of Energy, Energy Efficiency and Renewable Energy.
Sharma, R., et al., 2009, Water efficiency management in datacenters: Metrics and methodology, IEEE International Symposium on Sustainable Systems and Technology 2009, ISSST 09.
The Green Grid, 2007, The Green Grid Data Center Power Efficiency Metrics: PUE and DCiE, Technical Committee White Paper.
U.S. Environmental Protection Agency, 2007, Report to Congress on Server and Data Center Energy Efficiency, Public Law, ENERGY STAR Program, August 2, 2007.
Wang, D., 2006, Cooling Challenges and Best Practices for High Density Data and Telecommunication Centers, Proceedings of HDP 06, IEEE.
Wang, Z., et al., 2010, Multi-Agent Control System with Intelligent Optimization for Smart and Energy-Efficient Buildings, IECON 2010, Annual Conference of the IEEE Industrial Electronics Society, IEEE.
Wang, Z., et al., 2010, Multi-Agent Intelligent Controller Design for Smart and Sustainable Buildings, IEEE Systems Conference 2010, IEEE.
Engineering Toolbox, 2011, Thermal Conductivity of some common Materials and Gases, The Engineering ToolBox, [online] [accessed: 29/5/2011].
More informationEducation Evolution: Scalable Server Rooms George Lantouris Client Relationship Manager (Education) May 2009
Education Evolution: Scalable Server Rooms George Lantouris Client Relationship Manager (Education) May 2009 Agenda Overview - Network Critical Physical Infrastructure Cooling issues in the Server Room
More informationData Center 2020: Delivering high density in the Data Center; efficiently and reliably
Data Center 2020: Delivering high density in the Data Center; efficiently and reliably March 2011 Powered by Data Center 2020: Delivering high density in the Data Center; efficiently and reliably Review:
More informationSustainable Data Centres Approaches and Challenges. * Corresponding author: t.s.luong@student.reading.ac.uk
Proceedings of Conference: TSBE EngD Conference, TSBE Centre, University of Reading, Whiteknights Campus, RG6 6AF, 6 th July 2010. http://www.reading.ac.uk/tsbe/ Sustainable Data Centres Approaches and
More information80% of. 50% of. 30% of. 67% of enterprises. Environmental and business demands are converging
Delivering the Green Datacenter Jerome Riboulon, HP 2008 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice 2 24 Abril, 2008 Environmental and
More informationThe European Programme for Energy Efficiency in Data Centres: The Code of Conduct
The European Programme for Energy Efficiency in Data Centres: The Code of Conduct Paolo Bertoldi European Commission DG JRC Institute for Energy and Transport 1 Why Data Centres? Continuing demand for
More informationBest Practices for Wire-free Environmental Monitoring in the Data Center
White Paper Best Practices for Wire-free Environmental Monitoring in the Data Center April 2012 Introduction Monitoring for environmental threats in the data center is not a new concept. Since the beginning
More informationFree Cooling in Data Centers. John Speck, RCDD, DCDC JFC Solutions
Free Cooling in Data Centers John Speck, RCDD, DCDC JFC Solutions Why this topic Many data center projects or retrofits do not have a comprehensive analyses of systems power consumption completed in the
More informationFive Strategies for Cutting Data Center Energy Costs Through Enhanced Cooling Efficiency
Five Strategies for Cutting Data Center Energy Costs Through Enhanced Cooling Efficiency A White Paper from the Experts in Business-Critical Continuity TM Executive Summary As electricity prices and IT
More informationEnergy Efficiency Best Practice Guide Data Centre and IT Facilities
2 Energy Efficiency Best Practice Guide Data Centre and IT Facilities Best Practice Guide Pumping Systems Contents Medium-sized data centres energy efficiency 3 1 Introduction 4 2 The business benefits
More informationData Centre Energy Efficiency Operating for Optimisation Robert M Pe / Sept. 20, 2012 National Energy Efficiency Conference Singapore
Data Centre Energy Efficiency Operating for Optimisation Robert M Pe / Sept. 20, 2012 National Energy Efficiency Conference Singapore Introduction Agenda Introduction Overview of Data Centres DC Operational
More informationCooling Capacity Factor (CCF) Reveals Stranded Capacity and Data Center Cost Savings
WHITE PAPER Cooling Capacity Factor (CCF) Reveals Stranded Capacity and Data Center Cost Savings By Lars Strong, P.E., Upsite Technologies, Inc. Kenneth G. Brill, Upsite Technologies, Inc. 505.798.0200
More informationBRUNS-PAK Presents MARK S. EVANKO, Principal
BRUNS-PAK Presents MARK S. EVANKO, Principal Data Centers of the Future and the Impact of High Density Computing on Facility Infrastructures - Trends, Air-Flow, Green/LEED, Cost, and Schedule Considerations
More informationEnergy Recovery Systems for the Efficient Cooling of Data Centers using Absorption Chillers and Renewable Energy Resources
Energy Recovery Systems for the Efficient Cooling of Data Centers using Absorption Chillers and Renewable Energy Resources ALEXANDRU SERBAN, VICTOR CHIRIAC, FLOREA CHIRIAC, GABRIEL NASTASE Building Services
More informationHow High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions
Intel Intelligent Power Management Intel How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Power and cooling savings through the use of Intel
More informationOffice of the Government Chief Information Officer. Green Data Centre Practices
Office of the Government Chief Information Officer Green Data Centre Practices Version : 2.0 April 2013 The Government of the Hong Kong Special Administrative Region The contents of this document remain
More informationAir, Fluid Flow, and Thermal Simulation of Data Centers with Autodesk Revit 2013 and Autodesk BIM 360
Autodesk Revit 2013 Autodesk BIM 360 Air, Fluid Flow, and Thermal Simulation of Data Centers with Autodesk Revit 2013 and Autodesk BIM 360 Data centers consume approximately 200 terawatt hours of energy
More informationDirect Fresh Air Free Cooling of Data Centres
White Paper Introduction There have been many different cooling systems deployed in Data Centres in the past to maintain an acceptable environment for the equipment and for Data Centre operatives. The
More informationIMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT
DATA CENTER RESOURCES WHITE PAPER IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT BY: STEVE HAMBRUCH EXECUTIVE SUMMARY Data centers have experienced explosive growth in the last decade.
More informationEnergy Efficiency Opportunities in Federal High Performance Computing Data Centers
Energy Efficiency Opportunities in Federal High Performance Computing Data Centers Prepared for the U.S. Department of Energy Federal Energy Management Program By Lawrence Berkeley National Laboratory
More informationHow Does Your Data Center Measure Up? Energy Efficiency Metrics and Benchmarks for Data Center Infrastructure Systems
How Does Your Data Center Measure Up? Energy Efficiency Metrics and Benchmarks for Data Center Infrastructure Systems Paul Mathew, Ph.D., Staff Scientist Steve Greenberg, P.E., Energy Management Engineer
More informationExcool Indirect Adiabatic and Evaporative Data Centre Cooling. World s Leading Indirect Adiabatic and Evaporative Data Centre Cooling
Excool Indirect Adiabatic and Evaporative Data Centre Cooling World s Leading Indirect Adiabatic and Evaporative Data Centre Cooling Indirect Adiabatic and Evaporative Data Centre Cooling Solutions by
More informationData Center Energy Profiler Questions Checklist
Data Center Energy Profiler Questions Checklist Step 1 Case Name Date Center Company State/Region County Floor Area Data Center Space Floor Area Non Data Center Space Floor Area Data Center Support Space
More informationTechnical Systems. Helen Bedford
Technical Systems in Data Centres Helen Bedford Technical Systems in Data Centres Overview Cooling IT and Peripheral Equipment Security: Access and Control Building Management System (BMS) Data Centre
More informationServer Room Thermal Assessment
PREPARED FOR CUSTOMER Server Room Thermal Assessment Analysis of Server Room COMMERCIAL IN CONFIDENCE MAY 2011 Contents 1 Document Information... 3 2 Executive Summary... 4 2.1 Recommendation Summary...
More informationIncreasing Energ y Efficiency In Data Centers
The following article was published in ASHRAE Journal, December 2007. Copyright 2007 American Society of Heating, Refrigerating and Air- Conditioning Engineers, Inc. It is presented for educational purposes
More informationDataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no significant differences
DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no significant differences November 2011 Powered by DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no
More informationEnabling an agile Data Centre in a (Fr)agile market
Enabling an agile Data Centre in a (Fr)agile market Phil Dodsworth Director, Data Centre Solutions 2008 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without
More informationAIRAH Presentation April 30 th 2014
Data Centre Cooling Strategies The Changing Landscape.. AIRAH Presentation April 30 th 2014 More Connections. More Devices. High Expectations 2 State of the Data Center, Emerson Network Power Redefining
More informationCarbonDecisions. The green data centre. Why becoming a green data centre makes good business sense
CarbonDecisions The green data centre Why becoming a green data centre makes good business sense Contents What is a green data centre? Why being a green data centre makes good business sense 5 steps to
More informationCombining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency
A White Paper from the Experts in Business-Critical Continuity TM Combining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency Executive Summary Energy efficiency
More informationData Centers: How Does It Affect My Building s Energy Use and What Can I Do?
Data Centers: How Does It Affect My Building s Energy Use and What Can I Do? 1 Thank you for attending today s session! Please let us know your name and/or location when you sign in We ask everyone to
More informationManaging Cooling Capacity & Redundancy In Data Centers Today
Managing Cooling Capacity & Redundancy In Data Centers Today About AdaptivCOOL 15+ Years Thermal & Airflow Expertise Global Presence U.S., India, Japan, China Standards & Compliances: ISO 9001:2008 RoHS
More informationCURBING THE COST OF DATA CENTER COOLING. Charles B. Kensky, PE, LEED AP BD+C, CEA Executive Vice President Bala Consulting Engineers
CURBING THE COST OF DATA CENTER COOLING Charles B. Kensky, PE, LEED AP BD+C, CEA Executive Vice President Bala Consulting Engineers OVERVIEW Compare Cooling Strategies in Free- Standing and In-Building
More informationAPC APPLICATION NOTE #112
#112 Best Practices for Deploying the InfraStruXure InRow SC By David Roden Abstract The InfraStruXure InRow SC (ACSC100 and ACSC101) is a self-contained air conditioner for server rooms and wiring closets.
More informationVISIT 2010 Fujitsu Forum Europe 0
VISIT 2010 Fujitsu Forum Europe 0 Virtualization & Automation Room 13a Shaping tomorrow with you. Green Data Center Services Martin Provoost Director, Data Centers and Networks, Fujitsu UK & Ireland 14:00
More informationImproving Data Center Energy Efficiency Through Environmental Optimization
Improving Data Center Energy Efficiency Through Environmental Optimization How Fine-Tuning Humidity, Airflows, and Temperature Dramatically Cuts Cooling Costs William Seeber Stephen Seeber Mid Atlantic
More informationRaising Inlet Temperatures in the Data Centre
Raising Inlet Temperatures in the Data Centre White paper of a round table debate by industry experts on this key topic. Hosted by Keysource with participants including the Uptime Institute. Sophistication
More informationContainment Solutions
Containment Solutions Improve cooling efficiency Separate hot and cold air streams Configured to meet density requirements Suitable for commercial and SCEC endorsed cabinets Available for retrofit to SRA
More informationThe Effect of Data Centre Environment on IT Reliability & Energy Consumption
The Effect of Data Centre Environment on IT Reliability & Energy Consumption Steve Strutt EMEA Technical Work Group Member IBM The Green Grid EMEA Technical Forum 2011 Agenda History of IT environmental
More informationOptimum Climate Control For Datacenter - Case Study. T. Prabu March 17 th 2009
Optimum Climate Control For Datacenter - Case Study T. Prabu March 17 th 2009 Agenda 2 About EDEC (Emerson) Facility Data Center Details Design Considerations & Challenges Layout Design CFD Analysis Of
More informationHigh Density Data Centers Fraught with Peril. Richard A. Greco, Principal EYP Mission Critical Facilities, Inc.
High Density Data Centers Fraught with Peril Richard A. Greco, Principal EYP Mission Critical Facilities, Inc. Microprocessors Trends Reprinted with the permission of The Uptime Institute from a white
More informationOptimizing Network Performance through PASSIVE AIR FLOW MANAGEMENT IN THE DATA CENTER
Optimizing Network Performance through PASSIVE AIR FLOW MANAGEMENT IN THE DATA CENTER Lylette Macdonald, RCDD Legrand Ortronics BICSI Baltimore 2011 Agenda: Discuss passive thermal management at the Rack
More informationEnergy and Cost Analysis of Rittal Corporation Liquid Cooled Package
Energy and Cost Analysis of Rittal Corporation Liquid Cooled Package Munther Salim, Ph.D. Yury Lui, PE, CEM, LEED AP eyp mission critical facilities, 200 west adams street, suite 2750, Chicago, il 60606
More information2010 Best Practices. for the EU Code of Conduct on Data Centres. Version 2.0.0 Release Public Review Public
2010 Best Practices for the EU Code of Conduct on Data Centres Version 2.0.0 Release Public Review Public 1 Document Information 1.1 Version History Version 1 Description Version Updates Date 2.0.0 2010
More informationEnergy Efficiency and Availability Management in Consolidated Data Centers
Energy Efficiency and Availability Management in Consolidated Data Centers Abstract The Federal Data Center Consolidation Initiative (FDCCI) was driven by the recognition that growth in the number of Federal
More informationData Center Design Guide featuring Water-Side Economizer Solutions. with Dynamic Economizer Cooling
Data Center Design Guide featuring Water-Side Economizer Solutions with Dynamic Economizer Cooling Presenter: Jason Koo, P.Eng Sr. Field Applications Engineer STULZ Air Technology Systems jkoo@stulz ats.com
More informationReducing the Annual Cost of a Telecommunications Data Center
Applied Math Modeling White Paper Reducing the Annual Cost of a Telecommunications Data Center By Paul Bemis and Liz Marshall, Applied Math Modeling Inc., Concord, NH March, 2011 Introduction The facilities
More informationThe Benefits of Supply Air Temperature Control in the Data Centre
Executive Summary: Controlling the temperature in a data centre is critical to achieving maximum uptime and efficiency, but is it being controlled in the correct place? Whilst data centre layouts have
More information- White Paper - Data Centre Cooling. Best Practice
- White Paper - Data Centre Cooling Best Practice Release 2, April 2008 Contents INTRODUCTION... 3 1. AIR FLOW LEAKAGE... 3 2. PERFORATED TILES: NUMBER AND OPENING FACTOR... 4 3. PERFORATED TILES: WITH
More informationThe CEETHERM Data Center Laboratory
The CEETHERM Data Center Laboratory A Platform for Transformative Research on Green Data Centers Yogendra Joshi and Pramod Kumar G.W. Woodruff School of Mechanical Engineering Georgia Institute of Technology
More informationData Center Components Overview
Data Center Components Overview Power Power Outside Transformer Takes grid power and transforms it from 113KV to 480V Utility (grid) power Supply of high voltage power to the Data Center Electrical Room
More informationChoosing Close-Coupled IT Cooling Solutions
W H I T E P A P E R Choosing Close-Coupled IT Cooling Solutions Smart Strategies for Small to Mid-Size Data Centers Executive Summary As high-density IT equipment becomes the new normal, the amount of
More informationUniversity of St Andrews. Energy Efficient Data Centre Cooling
Energy Efficient Data Centre Cooling St. Andrews and Elsewhere Richard Lumb Consultant Engineer Future-Tech Energy Efficient Data Centre Cooling There is no one single best cooling solution for all Data
More information2008 ASHRAE Environmental Guidelines for Datacom Equipment -Expanding the Recommended Environmental Envelope-
2008 ASHRAE Environmental Guidelines for Datacom Equipment -Expanding the Recommended Environmental Envelope- Overview: The current recommended environmental envelope for IT Equipment is listed in Table
More informationBest Practices for Wire-free Environmental Monitoring in the Data Center
White Paper 11800 Ridge Parkway Broomfiled, CO 80021 1-800-638-2638 http://www.42u.com sales@42u.com Best Practices for Wire-free Environmental Monitoring in the Data Center Introduction Monitoring for
More informationThe 5th Greater Pearl River Delta Conference - Smart Management System in Building Facilities for Sustainability of Low Carbon Environment
The 5th Greater Pearl River Delta Conference - Smart Management System in Building Facilities for Sustainability of Low Carbon Environment Deployment of Advanced Technologies for Carbon Efficiency in Process
More information5 Reasons. Environment Sensors are used in all Modern Data Centers
5 Reasons Environment Sensors are used in all Modern Data Centers TABLE OF CONTENTS 5 Reasons Environmental Sensors are used in all Modern Data Centers Save on Cooling by Confidently Raising Data Center
More informationOverview of Green Energy Strategies and Techniques for Modern Data Centers
Overview of Green Energy Strategies and Techniques for Modern Data Centers Introduction Data centers ensure the operation of critical business IT equipment including servers, networking and storage devices.
More informationVerizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001
Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001 Presented by: Liberty Engineering, LLP 1609 Connecticut Avenue
More informationCIBSE ASHRAE Group. Data Centre Energy Efficiency: Who, What, Why, When, Where & How
CIBSE ASHRAE Group Data Centre Energy Efficiency: Who, What, Why, When, Where & How Presenters Don Beaty PE, FASHRAE Founder, President & Managing Director DLB Associates Consulting Engineers Paul Finch
More informationData Center Cooling & Air Flow Management. Arnold Murphy, CDCEP, CDCAP March 3, 2015
Data Center Cooling & Air Flow Management Arnold Murphy, CDCEP, CDCAP March 3, 2015 Strategic Clean Technology Inc Focus on improving cooling and air flow management to achieve energy cost savings and
More informationWhite Paper Rack climate control in data centres
White Paper Rack climate control in data centres Contents Contents...2 List of illustrations... 3 Executive summary...4 Introduction...5 Objectives and requirements...6 Room climate control with the CRAC
More informationWireless Sensor Technology for Data Centers
Wireless Sensor Technology for Data Centers Richard Oberg and Ed Sanchez, Sacramento Municipal Utility District Patricia Nealon, Synapsense Corporation ABSTRACT This paper describes the development and
More informationData Center & IT Infrastructure Optimization. Trends & Best Practices. Mickey Iqbal - IBM Distinguished Engineer. IBM Global Technology Services
Data Center & IT Infrastructure Optimization Trends & Best Practices Mickey Iqbal - IBM Distinguished Engineer IBM Global Technology Services IT Organizations are Challenged by a Set of Operational Issues
More informationHow To Improve Energy Efficiency Through Raising Inlet Temperatures
Data Center Operating Cost Savings Realized by Air Flow Management and Increased Rack Inlet Temperatures William Seeber Stephen Seeber Mid Atlantic Infrared Services, Inc. 5309 Mohican Road Bethesda, MD
More informationServer room guide helps energy managers reduce server consumption
Server room guide helps energy managers reduce server consumption Jan Viegand Viegand Maagøe Nr. Farimagsgade 37 1364 Copenhagen K Denmark jv@viegandmaagoe.dk Keywords servers, guidelines, server rooms,
More informationMinimising Data Centre Total Cost of Ownership Through Energy Efficiency Analysis
Minimising Data Centre Total Cost of Ownership Through Energy Efficiency Analysis Sophia Flucker CEng MIMechE Ing Dr Robert Tozer MSc MBA PhD CEng MCIBSE MASHRAE Operational Intelligence Ltd. info@dc-oi.com
More informationUnified Physical Infrastructure (UPI) Strategies for Thermal Management
Unified Physical Infrastructure (UPI) Strategies for Thermal Management The Importance of Air Sealing Grommets to Improving Smart www.panduit.com WP-04 August 2008 Introduction One of the core issues affecting
More informationEMC PERSPECTIVE Managing Energy Efficiency in the Data Center
EMC PERSPECTIVE Managing Energy Efficiency in the Data Center A tactical approach can deliver immediate problem resolution plus long-term savings and efficiencies Managing Energy Efficiency in the Data
More informationGreen Data Centers development Best Practices
Draft Recommendation ITU-T L.DC Green Data Centers development Best Practices Summary This recommendation is a contribution for the reduction of negative climate impact of Data Centers.
More informationCooling Audit for Identifying Potential Cooling Problems in Data Centers
Cooling Audit for Identifying Potential Cooling Problems in Data Centers By Kevin Dunlap White Paper #40 Revision 2 Executive Summary The compaction of information technology equipment and simultaneous
More informationCreating Data Center Efficiencies Using Closed-Loop Design Brent Goren, Data Center Consultant
RESEARCH UNDERWRITER WHITE PAPER LEAN, CLEAN & GREEN Wright Line Creating Data Center Efficiencies Using Closed-Loop Design Brent Goren, Data Center Consultant Currently 60 percent of the cool air that
More informationBlade Server & Data Room Cooling Specialists
SURVEY I DESIGN I MANUFACTURE I INSTALL I COMMISSION I SERVICE SERVERCOOL An Eaton-Williams Group Brand Blade Server & Data Room Cooling Specialists Manufactured in the UK SERVERCOOL Cooling IT Cooling
More information