DataCenter 2020: first results for energy-optimization at existing data centers
White paper, July.
Introduction

Environmentally friendly and sustainable IT is a hot topic given current climate change and ecological challenges. According to Gartner's market researchers, IT equipment is responsible for two percent of all CO2 emissions worldwide, almost equal to the amount of carbon dioxide generated by air traffic. With rising energy costs added to the mix, more and more companies are paying greater heed to their consumption of IT resources and ensuring that the design of their IT infrastructure is as environmentally friendly ("green") as possible. The objective is to use energy resources more efficiently while also reducing CO2 emissions. Much of the focus lies on maximizing energy efficiency, that is, on optimizing the amount of energy used per unit of output.

Those responsible for IT in companies and data centers face major challenges. On the one hand, the number of IT applications and the volume of data to be stored are increasing rapidly. This requires greater processing power, and therefore more server hardware. In addition to operational electricity costs, there is also the added expense of cooling. According to analysts at IDC, for every dollar that the CIO spends on a new server, another 50 cents is spent on electricity and cooling. The most important tool for curbing power consumption and improving the climate footprint of data centers is higher energy efficiency. Needless to say, IT's commitment to the environment is not limited to lower power consumption; energy efficiency is, however, a fundamental prerequisite for an environmentally friendly data center. This poses the question: how can the existing infrastructure (cooling, power, space, etc.)
use the latest technology with maximum efficiency, that is, how can the best possible ratio of power consumption to required performance be achieved? This is the subject of the technology partnership between Intel and T-Systems in the context of the DataCenter 2020 project. The two companies are working together on solutions for the industrialization and automation of ICT services, with the aim of making these market-ready with maximum efficiency and cost effectiveness. The nucleus of this collaboration on energy efficiency is a test laboratory at the Euroindustriepark in Munich. The aim of the Intel and T-Systems laboratory is to produce measured values that can serve as a starting point for optimizing existing data centers and making them more energy efficient. These findings will also be used to create a model for the data center of the future.
Partnership between Intel and T-Systems

The technology partnership between Intel and T-Systems centers on essential questions regarding the use of technology today and in the near future. Together, the two companies want to develop future-proof ICT solutions that use energy efficiently, save costs and pave the way for new ICT services. To increase the pace of innovation throughout the IT industry, Intel and T-Systems are making the results of their collaboration, and the recommended courses of action, publicly available. The partnership focuses on DataCenter 2020 for the optimization of existing data centers and for joint research on the data center of the future.

The collaboration between Intel and T-Systems yields new knowledge and methods as part of a holistic, end-to-end approach. Both partners place enormous value on the practicability and actual application of these findings, since existing data centers already harbor huge energy-saving potential. Procedures such as arranging racks in warm and cold aisles or containment (housing) concepts are standard nowadays, but there is still clearly room for improvement. At DataCenter 2020, Intel and T-Systems are developing a realistic roadmap that details how the energy efficiency of data centers can be greatly increased.

This white paper describes the experimental environment of DataCenter 2020 and presents the first research results from the test lab for energy optimization at existing data centers. To date, the experts from Intel and T-Systems have focused on two measures: through the strict separation of cold and warm air, they were able to reduce the fan speed of the forced air cooling devices; and by increasing the room temperature, they were able to prolong the time in which free cooling, which depends on the outside temperature, can be used.
DataCenter 2020: the test lab at Munich's Euroindustriepark

The test lab is equipped with racks of servers as well as the latest in energy, air-conditioning, measuring and control technology. Numerous data points record parameters such as humidity, room temperature, the temperature difference between incoming and outgoing air, processor load, and fan speeds. The most important instrument is the electricity meter. In addition, the raised floor houses a smoke generator for monitoring airflow: the smoke makes the direction and speed of the airflow visible. At the same time, the engineers can identify short circuits in the air supply, i.e. detect permeable areas (leakage air) and places into which air must not flow. At the data center, airflow plays a vital part in climate control.

The engineers carry out various tests, for instance on cold aisle containment. Here, the forced air cooling device channels cold air through the raised floor to cool the servers, and the cold air is kept strictly separate from the warm air. For a given room temperature, humidity and CPU load, the testers can then optimize parameters such as the cold water supply or the fan speed. In addition to cold aisle containment, the partners examine systems with warm aisle containment (where warm air is extracted and then cooled). They test the efficiency of liquid-cooled racks and the properties of forced air cooling devices based on different technologies. To simulate data centers with different ceiling heights, the test lab employs a lift slab with a variable height of up to 3.7 meters; the tests conducted so far used one fixed ceiling height. The primary objective is to determine which solution offers the data center the best energy efficiency, and under which conditions. To achieve this, the engineers examine the entire process chain, from energy supply through to consumption.
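The cost of leakage air can be illustrated with a small calculation: if a fraction of the supplied cold air escapes through unsealed openings before it reaches the servers, the forced air cooling devices must move correspondingly more air. The figures below are illustrative assumptions, not measurements from the test lab.

```python
def required_supply_airflow(server_airflow_m3h: float, leakage_fraction: float) -> float:
    """Airflow the cooling devices must deliver when a share of it
    leaks out of the raised floor before reaching the cold aisle."""
    if not 0 <= leakage_fraction < 1:
        raise ValueError("leakage fraction must be in [0, 1)")
    return server_airflow_m3h / (1.0 - leakage_fraction)

# If the servers need 10,000 m³/h and 30 % of the supply leaks away,
# the CRAC fans must move about 14,286 m³/h.
print(round(required_supply_airflow(10_000, 0.30)))
```

Sealing the leaks lets the same server airflow be met with less fan work, which is exactly the lever exploited in optimization phase I below.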
Besides improved air-conditioning technology, much focus is placed on the use of energy-saving IT, for instance in server and processor technology.

PUE value as a measurement for energy efficiency

To measure energy efficiency at DataCenter 2020, T-Systems and Intel use the Power Usage Effectiveness (PUE) industry standard defined by The Green Grid organization. This value measures how much of the consumed energy is actually converted into processing power: PUE is the ratio of total facility power consumption to IT equipment power consumption, so lower values are better. At present, the average PUE value at existing data centers is around 1.9.

IT equipment power consumption is the amount of power used by all IT devices in the data center, including servers and other computers, storage devices and network systems, switches, monitors, and other peripheral and telecommunications devices. Total facility power includes, in addition to the power consumption of the IT, the electricity for the infrastructure that supports the IT operations. This comprises systems such as UPS, switchgear, batteries, cooling systems, pumps and lighting. Total facility power is easy to read off the electricity meter.
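The PUE definition above translates directly into a one-line calculation. The kilowatt figures here are illustrative:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by
    IT equipment power. An ideal facility approaches 1.0."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# 72 kW at the facility meter for 40 kW of IT load gives PUE 1.8
print(pue(72.0, 40.0))
```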
Starting point: today's standard data center

In conventionally designed data centers, the server cooling system eats up around half of all the energy consumed. This high level of consumption stems from a number of inefficiencies. At DataCenter 2020, the researchers from Intel and T-Systems produced their first measurements by simulating the conditions at current up-and-running data centers, with all their faults. The PUE value resulting from this environment was approximately 1.8. The following conditions were defined for the measurements:

- Starting load: 74 servers in 8 racks at 5 kW/rack, i.e. an IT load of 40 kW.
- Chiller mode: yearly average for the Munich data center site.
- Leakage in the raised floor, racks and cable feedthroughs causes thermal short circuits.

Background: the forced air cooling device draws warm air in, cools it down and blows it back into the raised floor at the appropriate temperature. Cooling is achieved through an internal heat exchanger that is supplied with water from chillers located outside the building. Inside the chiller, a compression refrigeration circuit raises the temperature of the refrigerant far enough for its heat to be rejected to the ambient air. This consumes a massive amount of energy. At lower ambient temperatures, the coolant can instead be chilled directly by the outside air, without the compression stage. This process is referred to as indirect free cooling, and it is far more efficient than cooling with a compression chiller.

Since PUE represents an annual mean value, the experts at DataCenter 2020 developed a mathematical yearly model for the water chiller so as to eliminate non-representative snapshot readings. They also incorporated the average temperatures for the Munich area into the model. As a result, the temperature distribution over the year can be used to forecast the potential for indirect free cooling in the annual mean. Under these conditions, inefficiencies included, the measurements gave a PUE value of approximately 1.8.
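The yearly chiller model is not published in detail here, but its principle can be sketched: weight the efficiency of chiller operation and of indirect free cooling by the share of the year in which each mode runs, derived from local temperature statistics. Everything below (the monthly temperatures, the per-mode PUE values, the free-cooling threshold) is an illustrative assumption, not data from the project.

```python
# Illustrative monthly mean temperatures for a Munich-like climate (°C)
MONTHLY_AVG_C = [-1, 1, 5, 9, 14, 17, 19, 18, 14, 9, 3, 0]

def annual_mean_pue(pue_chiller: float, pue_free: float, free_limit_c: float) -> float:
    """Annual PUE as a mix of chiller months and free-cooling months,
    weighted by how often the outside air is cold enough."""
    free_months = sum(1 for t in MONTHLY_AVG_C if t <= free_limit_c)
    share = free_months / len(MONTHLY_AVG_C)
    return share * pue_free + (1 - share) * pue_chiller

# With these assumed figures, 5 of 12 months fall at or below 5 °C
print(round(annual_mean_pue(2.0, 1.4, 5.0), 2))
```

A real model would use hourly temperature distributions and partial-load chiller curves, but the weighting idea is the same.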
Optimization phase I: separating cold and warm air

Having simulated the conditions in a conventionally designed data center, the experts at DataCenter 2020 began the first optimization phase. This phase ended with a PUE value of 1.48. It focused on the forced separation of cold and warm air, in three steps.

Background: at the data center, airflow plays a vital part in climate control. It is essential to clearly separate cold air from warm air and to avoid unnecessary airstreams within the data center. Take the example of a raised floor: in many data centers, a raised floor is used to deliver the cold air for cooling under high pressure; it is also generally used for routing power cables. The raised floor must be expertly sealed to avoid so-called leakage air, which causes cold and warm air to mix, leading to thermal short circuits. Leakage must be prevented in the racks and cable feedthroughs too.

The starting conditions for this phase were as follows:

- The speed of the forced air cooling devices is set to the maximum (100 percent) to ensure that, despite leakages in the raised floor, enough air reaches the servers in the cold aisle.
- The IT load is limited to approximately 5 kW/rack (8 racks with a total of 74 servers are in use), or 2 kW/m².
- The inlet air temperature in the raised floor is set to a conventionally low value, resulting in a correspondingly cool server intake temperature.

1. Sealing of leakages

The first step taken by the experts was to make the compartments airtight, thereby preventing air from escaping unnecessarily. They eliminated leakage air by sealing the raised floor (for example, where power cables were fed through) and by inserting blanking panels in the racks (between the rack units that house the servers). Sealing the leakages had no immediate impact on the PUE, since all devices were still running as before. However, the fact that air could no longer escape unnecessarily led to a considerable rise in air pressure within the raised floor.
This forms the precondition for the next step.
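The reason fan speed is such a powerful lever is the fan affinity laws: airflow scales linearly with speed, but shaft power scales with the cube of speed, so even a moderate speed reduction cuts fan power sharply. The rated power below is a hypothetical example, not a device from the test lab.

```python
def fan_power_kw(rated_power_kw: float, speed_fraction: float) -> float:
    """Fan affinity law: power drawn scales with the cube of fan speed."""
    return rated_power_kw * speed_fraction ** 3

# A fan rated at 10 kW at full speed needs only about a fifth of that at 60 %
print(round(fan_power_kw(10.0, 0.60), 2))  # 2.16 kW
```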
2. Adjustment of fan speed in the forced air cooling device

The experts at DataCenter 2020 were now able to reduce the pressure in the raised floor to the minimum required, while ensuring that the intake temperature remained adequate over the full height of the racks. They achieved this by lowering the fan speed in the forced air cooling device (CRAC). Since the fans were now rotating much more slowly and therefore using less energy, the PUE value decreased from 1.8 to 1.55. Because the energy savings in this phase (separating the airflows) are realized by adjusting the fan speed, this fan-speed optimization is repeated in all subsequent steps, even where it is not specifically mentioned.

3. Gradual improvement of the cold and warm air isolation

The separation of cold and warm air was then made stricter to eliminate the remaining thermal short circuits. Doors were fitted at the ends of the aisles, preventing cold and warm air from mixing around the rack sides at the beginning and end of each aisle. In addition, replacing the usual perforated tiles (38 % opening) in the cold aisle with grating tiles (98 % opening) reduced the aerodynamic resistance, since air flows more easily through the larger openings; this allowed a further decrease in the fan speed of the forced air cooling device. Lastly, the cold aisle ceiling was closed to prevent the cold and warm air from mixing across the top of the racks. Following this third step of the first optimization phase, the PUE value sank once more, from 1.55 to 1.48.

Optimization phase II: increasing the inlet temperature

The second optimization phase focused on increasing the inlet temperature.
In principle, a room temperature in the low twenties Celsius is ideal for humans, but the same does not apply to servers. Many data centers fail to use energy efficiently simply because they are too cold: today, many servers tolerate ambient temperatures of up to 35 degrees Celsius, while the air intake temperature in most data centers is kept far lower. If the intake air does not need to be cooled as much, the air-conditioning system uses less energy. Therefore: if the inlet temperature is increased by raising the coolant temperature (that is, the water supply temperature), there is less need for forced cooling generated, for example, by compressors. Forced air cooling devices feature a heat exchanger that is supplied with water from chillers located outside the building. Inside the chiller, a compression refrigeration circuit raises the temperature of the refrigerant far enough for its heat to be rejected to the ambient air. At lower ambient temperatures, the coolant can be chilled directly by the outside air, without a chiller; this is indirect free cooling. The higher the temperature of the cooling agent and the lower the outside temperature, the less energy is required. The efficiency of the chiller also increases with every degree of room temperature. The experts at DataCenter 2020 therefore had to clarify the following question: what is the ideal water supply temperature?
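The trade-off behind this question can be sketched numerically: the warmer the water supply, the more hours per year the outside air is cold enough for indirect free cooling, given some approach temperature that the heat exchangers need between outside air and water. The approach value and the toy temperature trace below are assumptions for illustration only.

```python
APPROACH_K = 6.0  # assumed temperature difference needed, outside air to water

def free_cooling_share(hourly_temps_c, water_supply_c, approach_k=APPROACH_K):
    """Fraction of the samples in which outside air can chill the water
    directly, i.e. lies at or below supply temperature minus approach."""
    cold = sum(1 for t in hourly_temps_c if t <= water_supply_c - approach_k)
    return cold / len(hourly_temps_c)

# Toy monthly trace: the free-cooling share grows as the water supply
# temperature is raised.
temps = [-5, 0, 4, 8, 12, 16, 20, 24, 16, 10, 4, 0]
for supply in (8.0, 14.0, 24.0):
    print(supply, free_cooling_share(temps, supply))
```

With real hourly weather data the same calculation yields the annual free-cooling hours that feed into the yearly PUE model described earlier.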
4. Reaching the limits of the existing cooling infrastructure

With standard chillers, the water supply temperature is limited to 14 °C. However, raising this limit would reduce the need for forced cooling, and indirect free cooling would feature in the annual mean for longer. In a test, the experts at DataCenter 2020 introduced an air cooling device whose fans are driven by EC motors (electronically commutated direct-current motors) and raised the water supply temperature from 8 °C to 14 °C, with the coolant flow fixed (valve set manually to 75 %). EC motors require around 30 percent less power than conventional AC motors. This setup allowed the experts to reduce the PUE value further, from 1.48 to 1.43.

5. Increasing the inlet temperature in line with current ASHRAE recommendations

The next step taken by the DataCenter 2020 team was to raise the water supply temperature from 14 °C to 24 °C, as per the recommendations of ASHRAE, corresponding to a server intake temperature of 27 °C. The PUE value thus dropped to an optimal 1.4.

Background: ASHRAE is the American Society of Heating, Refrigerating and Air-Conditioning Engineers. The four-volume ASHRAE handbook is a reference guide to air-conditioning technology, with a new volume published every year. ASHRAE also publishes standards and guidelines in the area of air conditioning, which are referred to in building regulations.

6. Reaching the server's thermal limits

Since the specifications of most servers today allow intake temperatures of up to 35 °C, the researchers' next step was to reach this limit: they raised the water supply temperature to a maximum of 34 °C, bringing the inlet temperature to 35 °C.
However, this made the server fans run faster and use more energy, so in this experiment the PUE value increased again slightly.

7. Increasing the IT load to 10 kW/rack, or 4 kW/m²

In the last test, the experts at DataCenter 2020 doubled the IT load from around 5 kW/rack, or 2 kW/m² (the initial load), to around 10 kW/rack, or 4 kW/m². Since this doubles the IT equipment power from 40 kW to 80 kW, the total facility power increases too, but not proportionally. This experiment achieved another improvement in efficiency and a PUE value of 1.3.
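The improvement from doubling the IT load can be reproduced with a simple overhead model: part of the facility overhead is fixed (lighting, standing losses) while part scales with the IT load (fans, pumps), so a higher load amortizes the fixed share over more IT kilowatts. The split between fixed and variable overhead below is assumed for illustration and is not the project's measured breakdown.

```python
def pue_at_load(it_kw: float, fixed_overhead_kw: float, variable_ratio: float) -> float:
    """PUE when facility overhead = fixed part + a share proportional
    to the IT load."""
    overhead_kw = fixed_overhead_kw + variable_ratio * it_kw
    return (it_kw + overhead_kw) / it_kw

# With an assumed 8 kW fixed overhead and 20 % variable overhead,
# doubling the IT load from 40 kW to 80 kW lowers the PUE.
print(pue_at_load(40.0, 8.0, 0.2))
print(pue_at_load(80.0, 8.0, 0.2))
```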
Summary

Energy consumption at data centers can be reduced using methods that are easy to implement. To lower the PUE value, the researchers from Intel and T-Systems experimented with different scenarios at DataCenter 2020. The improved energy efficiency at existing data centers is due mainly to the following two measures:

1. The strict separation of cold and warm air and the resulting optimization of airflow, which allows the fan speed of the forced air cooling devices to be reduced. This result forms the basis for all further steps and can be implemented with comparatively cost-efficient measures.

2. Increasing the room temperature, i.e. the inlet temperature. This measure shortens the time spent on forced cooling and lengthens the time available for indirect free cooling. The experts achieved the best result at a server intake temperature of 27 °C, in accordance with ASHRAE's recommendations.

This necessitates a detailed inspection of the existing infrastructure and buildings, so that they can be used as effectively as possible within the available design options. Location, power supply and customer focus are also important criteria for a holistic assessment of existing data centers.

Outlook

T-Systems and Intel will apply the initial findings from the research conducted at DataCenter 2020 in their own data center optimization projects. In the next step of the project, the experts will examine to what extent the data center infrastructure can be controlled by the IT load, so that energy consumption is also optimized in the partial-load operating range. The long-term objective is infrastructure on demand, whereby the data center infrastructure supplies the servers with exactly the amount of cooling output they need at any given moment.

Copyright Intel Corporation and T-Systems International GmbH. All rights reserved.
By implementing straightforward changes, the energy consumption of an existing data center can be reduced quickly and at moderate cost. To achieve this, the individual measures must be expertly coordinated and stringently implemented. The integrated approach described here reduces operating costs where the load remains constant, or frees up space for an increase in IT capacity where total energy consumption remains constant. To achieve maximum efficiency, the experts at DataCenter 2020 recommend always running the available cooling capacity and power supply at maximum utilization.
Combining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency
A White Paper from the Experts in Business-Critical Continuity TM Combining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency Executive Summary Energy efficiency
How to Build a Data Centre Cooling Budget. Ian Cathcart
How to Build a Data Centre Cooling Budget Ian Cathcart Chatsworth Products Topics We ll Cover Availability objectives Space and Load planning Equipment and design options Using CFD to evaluate options
International Telecommunication Union SERIES L: CONSTRUCTION, INSTALLATION AND PROTECTION OF TELECOMMUNICATION CABLES IN PUBLIC NETWORKS
International Telecommunication Union ITU-T TELECOMMUNICATION STANDARDIZATION SECTOR OF ITU Technical Paper (13 December 2013) SERIES L: CONSTRUCTION, INSTALLATION AND PROTECTION OF TELECOMMUNICATION CABLES
ICT and the Green Data Centre
ICT and the Green Data Centre Scott McConnell Sales Manager c/o Tanya Duncan MD Interxion Ireland Green Data Centres Our Responsibility Data centre greenhouse gas emissions are projected to quadruple by
1. Data Centre Environment Why focus on Data Centres? Hitachi Eco-Friendly Data Centre Solutions
Hitachi Eco-Friendly Data Centre Solutions Why are efficient data centres a necessity and how can they be delivered 24 th September 2009 1 1. Data Centre Environment Why focus on Data Centres? Why are
Data center upgrade proposal. (phase one)
Data center upgrade proposal (phase one) Executive Summary Great Lakes began a recent dialogue with a customer regarding current operations and the potential for performance improvement within the The
Analysis of data centre cooling energy efficiency
Analysis of data centre cooling energy efficiency An analysis of the distribution of energy overheads in the data centre and the relationship between economiser hours and chiller efficiency Liam Newcombe
AEGIS DATA CENTER SERVICES POWER AND COOLING ANALYSIS SERVICE SUMMARY
AEGIS DATA CENTER SERVICES POWER AND COOLING ANALYSIS SERVICE SUMMARY The Aegis Services Power and Assessment Service provides an assessment and analysis of your data center facility and critical physical
Data Center Design Guide featuring Water-Side Economizer Solutions. with Dynamic Economizer Cooling
Data Center Design Guide featuring Water-Side Economizer Solutions with Dynamic Economizer Cooling Presenter: Jason Koo, P.Eng Sr. Field Applications Engineer STULZ Air Technology Systems jkoo@stulz ats.com
Alcatel-Lucent Modular Cooling Solution
T E C H N O L O G Y W H I T E P A P E R Alcatel-Lucent Modular Cooling Solution Redundancy test results for pumped, two-phase modular cooling system Current heat exchange methods for cooling data centers
Environmental Data Center Management and Monitoring
2013 Raritan Inc. Table of Contents Introduction Page 3 Sensor Design Considerations Page 3 Temperature and Humidity Sensors Page 4 Airflow Sensor Page 6 Differential Air Pressure Sensor Page 6 Water Sensor
7 Best Practices for Increasing Efficiency, Availability and Capacity. XXXX XXXXXXXX Liebert North America
7 Best Practices for Increasing Efficiency, Availability and Capacity XXXX XXXXXXXX Liebert North America Emerson Network Power: The global leader in enabling Business-Critical Continuity Automatic Transfer
Server Room Thermal Assessment
PREPARED FOR CUSTOMER Server Room Thermal Assessment Analysis of Server Room COMMERCIAL IN CONFIDENCE MAY 2011 Contents 1 Document Information... 3 2 Executive Summary... 4 2.1 Recommendation Summary...
Improving Data Center Energy Efficiency Through Environmental Optimization
Improving Data Center Energy Efficiency Through Environmental Optimization How Fine-Tuning Humidity, Airflows, and Temperature Dramatically Cuts Cooling Costs William Seeber Stephen Seeber Mid Atlantic
Drives and motors. A guide to using variable-speed drives and motors in data centres
Drives motors A guide to using variable-speed drives motors in data centres The power behind the data Private public businesses, from banks to supermarkets, to telecommunications companies internet providers
Energy Efficiency in New and Existing Data Centers-Where the Opportunities May Lie
Energy Efficiency in New and Existing Data Centers-Where the Opportunities May Lie Mukesh Khattar, Energy Director, Oracle National Energy Efficiency Technology Summit, Sep. 26, 2012, Portland, OR Agenda
APC APPLICATION NOTE #112
#112 Best Practices for Deploying the InfraStruXure InRow SC By David Roden Abstract The InfraStruXure InRow SC (ACSC100 and ACSC101) is a self-contained air conditioner for server rooms and wiring closets.
How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions
Intel Intelligent Power Management Intel How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Power savings through the use of Intel s intelligent
Energy Recovery Systems for the Efficient Cooling of Data Centers using Absorption Chillers and Renewable Energy Resources
Energy Recovery Systems for the Efficient Cooling of Data Centers using Absorption Chillers and Renewable Energy Resources ALEXANDRU SERBAN, VICTOR CHIRIAC, FLOREA CHIRIAC, GABRIEL NASTASE Building Services
IT White Paper MANAGING EXTREME HEAT: COOLING STRATEGIES FOR HIGH-DENSITY SYSTEMS
IT White Paper MANAGING EXTREME HEAT: COOLING STRATEGIES FOR HIGH-DENSITY SYSTEMS SUMMARY As computer manufacturers pack more and more processing power into smaller packages, the challenge of data center
Optimum Climate Control For Datacenter - Case Study. T. Prabu March 17 th 2009
Optimum Climate Control For Datacenter - Case Study T. Prabu March 17 th 2009 Agenda 2 About EDEC (Emerson) Facility Data Center Details Design Considerations & Challenges Layout Design CFD Analysis Of
Data Centers: How Does It Affect My Building s Energy Use and What Can I Do?
Data Centers: How Does It Affect My Building s Energy Use and What Can I Do? 1 Thank you for attending today s session! Please let us know your name and/or location when you sign in We ask everyone to
Optimizing Network Performance through PASSIVE AIR FLOW MANAGEMENT IN THE DATA CENTER
Optimizing Network Performance through PASSIVE AIR FLOW MANAGEMENT IN THE DATA CENTER Lylette Macdonald, RCDD Legrand Ortronics BICSI Baltimore 2011 Agenda: Discuss passive thermal management at the Rack
Recommendations for Measuring and Reporting Overall Data Center Efficiency
Recommendations for Measuring and Reporting Overall Data Center Efficiency Version 2 Measuring PUE for Data Centers 17 May 2011 Table of Contents 1 Introduction... 1 1.1 Purpose Recommendations for Measuring
Greening Commercial Data Centres
Greening Commercial Data Centres Fresh air cooling giving a PUE of 1.2 in a colocation environment Greater efficiency and greater resilience Adjustable Overhead Supply allows variation in rack cooling
Recommendation For Incorporating Data Center Specific Sustainability Best Practices into EO13514 Implementation
Recommendation For Incorporating Data Center Specific Sustainability Best Practices into EO13514 Implementation Version 1.0 - November, 2011 Incorporating Data Center-Specific Sustainability Measures into
Chiller-less Facilities: They May Be Closer Than You Think
Chiller-less Facilities: They May Be Closer Than You Think A Dell Technical White Paper Learn more at Dell.com/PowerEdge/Rack David Moss Jon Fitch Paul Artman THIS WHITE PAPER IS FOR INFORMATIONAL PURPOSES
Unified Physical Infrastructure SM (UPI) Strategies for Smart Data Centers
Unified Physical Infrastructure SM (UPI) Strategies for Smart Data Centers Deploying a Vertical Exhaust System www.panduit.com WP-09 September 2009 Introduction Business management applications and rich
GREEN FIELD DATA CENTER DESIGN WATER COOLING FOR MAXIMUM EFFICIENCY. Shlomo Novotny, Vice President and Chief Technology Officer, Vette Corp.
GREEN FIELD DATA CENTER DESIGN WATER COOLING FOR MAXIMUM EFFICIENCY Shlomo Novotny, Vice President and Chief Technology Officer, Vette Corp. Overview Data centers are an ever growing part of our economy.
Bytes and BTUs: Holistic Approaches to Data Center Energy Efficiency. Steve Hammond NREL
Bytes and BTUs: Holistic Approaches to Data Center Energy Efficiency NREL 1 National Renewable Energy Laboratory Presentation Road Map A Holistic Approach to Efficiency: Power, Packaging, Cooling, Integration
Liquid Cooling Solutions for DATA CENTERS - R.M.IYENGAR BLUESTAR LIMITED.
Liquid Cooling Solutions for DATA CENTERS - R.M.IYENGAR BLUESTAR LIMITED. Presentation Goals & Outline Power Density Where we have been- where we are now - where we are going Limitations of Air Cooling
Cooling Audit for Identifying Potential Cooling Problems in Data Centers
Cooling Audit for Identifying Potential Cooling Problems in Data Centers By Kevin Dunlap White Paper #40 Revision 2 Executive Summary The compaction of information technology equipment and simultaneous
Case Study: Opportunities to Improve Energy Efficiency in Three Federal Data Centers
Case Study: Opportunities to Improve Energy Efficiency in Three Federal Data Centers Prepared for the U.S. Department of Energy s Federal Energy Management Program Prepared By Lawrence Berkeley National
HPC TCO: Cooling and Computer Room Efficiency
HPC TCO: Cooling and Computer Room Efficiency 1 Route Plan Motivation (Why do we care?) HPC Building Blocks: Compuer Hardware (What s inside my dataroom? What needs to be cooled?) HPC Building Blocks:
THE GREEN DATA CENTER
GREEN IT THE GREEN DATA CENTER WHERE ECOLOGY MEETS ECONOMY We truly live in an information age. Data Centers serve a very important purpose they provide the global community with nearly unlimited access
Unified Physical Infrastructure (UPI) Strategies for Thermal Management
Unified Physical Infrastructure (UPI) Strategies for Thermal Management The Importance of Air Sealing Grommets to Improving Smart www.panduit.com WP-04 August 2008 Introduction One of the core issues affecting
Data Center Industry Leaders Reach Agreement on Guiding Principles for Energy Efficiency Metrics
On January 13, 2010, 7x24 Exchange Chairman Robert Cassiliano and Vice President David Schirmacher met in Washington, DC with representatives from the EPA, the DOE and 7 leading industry organizations
Energy Efficiency Best Practice Guide Data Centre and IT Facilities
2 Energy Efficiency Best Practice Guide Data Centre and IT Facilities Best Practice Guide Pumping Systems Contents Medium-sized data centres energy efficiency 3 1 Introduction 4 2 The business benefits
The Effect of Data Centre Environment on IT Reliability & Energy Consumption
The Effect of Data Centre Environment on IT Reliability & Energy Consumption Steve Strutt EMEA Technical Work Group Member IBM The Green Grid EMEA Technical Forum 2011 Agenda History of IT environmental
Questions to be responded to by the firm submitting the application. Why do you think this project should receive an award? How does it demonstrate:
Questions to be responded to by the firm submitting the application Why do you think this project should receive an award? How does it demonstrate: innovation, quality, and professional excellence transparency
RiMatrix S Make easy.
RiMatrix S Make easy. 2 RiMatrix S Rittal The System. The whole is more than the sum of its parts. The same is true of Rittal The System. With this in mind, we have bundled our innovative enclosure, power
Data Center Equipment Power Trends
Green field data center design 11 Jan 2010 by Shlomo Novotny Shlomo Novotny, Vice President and Chief Technology Officer, Vette Corp. explores water cooling for maximum efficiency - Part 1 Overview Data
CIBSE ASHRAE Group. Data Centre Energy Efficiency: Who, What, Why, When, Where & How
CIBSE ASHRAE Group Data Centre Energy Efficiency: Who, What, Why, When, Where & How Presenters Don Beaty PE, FASHRAE Founder, President & Managing Director DLB Associates Consulting Engineers Paul Finch
EHDC. Cooling system for Data Center and server cabinets.
EHDC Cooling system for Data Center and server cabinets. EHDC PEN LP EHDC L is a High Density Cooling module for an pen Loop configuration. It is ideally installed with Enoc Systems cold corridor containment
