How to Save Over 60% on Cooling Energy Needed by a Data Center? The Roadmap from Traditional to Optimized Cooling

Abstract

Data center managers are increasingly in search of methods and technologies for reducing facility energy consumption while at the same time optimizing cooling efficiency. A variety of solutions is available to meet these requests and satisfy customers' increasingly demanding needs. This white paper is designed to demonstrate how data center cooling efficiency can be enhanced by replacing currently installed cooling units with advanced technology units. In the development of this analysis, several cooling solutions have been taken into consideration. The evaluation covers the elements of a general cooling product portfolio, demonstrating how the system's energy performance can be progressively improved with the adoption of certain technologies, up to the most efficient solutions currently available on the market.

For analysis purposes, two data centers located in Europe (one in Frankfurt and the other in London) have been used as reference sites. The two data centers are structurally identical: each consists of a 300 kW installation requiring N+1 cooling redundancy and adopts a Direct Expansion (DX) solution. The only difference lies in their annual temperature profiles. The two cases illustrate how cooling efficiency can be increased regardless of the impact external temperatures have on data centers. Moreover, despite the widespread assumption that Direct Expansion technology is associated with higher energy consumption than chilled-water systems, the analysis demonstrates how the latest cooling technologies have dramatically reduced that gap.

Introduction

The existing installations of both the Frankfurt and London data center models consist of Direct Expansion units. These data centers operate according to a traditional cooling approach, where air conditioning temperature and humidity are controlled on the return air path with a set point of 22°C and a relative humidity (RH) of 50%. The six conditioning units initially installed incorporate standard scroll compressors with R407C refrigerant, as well as traditional fixed-speed fans driven by AC motors. The nominal capacity of each unit is 60 kW; thus five units are running, with an additional unit in stand-by for redundancy purposes. The annual consumption of these systems amounts to 1,253,260 kWh and 1,254,400 kWh for London and Frankfurt, respectively. The pPUE (or partial PUE, which represents only the mechanical cooling segment of the annual PUE figure) observed in these data centers is 1.4, contributing to an overall PUE of 1.6.

1. First Solution: Technology Upgrade

To provide a solution with increased efficiency, the first recommended enhancement to the installed conditioning units is to upgrade the AC fans to EC (variable speed) fans and to substitute the original refrigerant with R410A. This immediately leads to an energy saving of 5% (60,000 kWh per year), largely attributable to the difference in consumption between AC and EC fan technology. The owners of both the London and Frankfurt data centers can choose to replace the existing units with units of the same technology, incurring the same annual running costs. An alternative, and more efficient, solution is to implement the latest fan and compressor technologies available. The latter option reduces annual running costs, providing a minimized pay-back time thanks to highly efficient technological components. Table 1 compares the economic and energy figures of these solutions for the Frankfurt data center.

Table 1. Economic and energetic comparison of AC vs EC fan solutions for the Frankfurt data center.

As shown in Table 1, the most efficient solution is the EC Fan Air Cooled alternative. This solution also reduces the CO₂ emissions of the entire data center by saving 5% of the cooling energy, with the actual reduction depending upon the national power grid fuel mix. If the data center manager decides to continue with the standard solution, they should nevertheless consider compensating for the extra CO₂ impact. The manager could, for example, draw on a renewable power source, such as solar panels, to make up for the extra energy consumed. In such a scenario, to produce the same amount of energy, the data center would need roughly 280 m² of solar panels (more than the equivalent of one tennis court, which is 261 m²).
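As a rough cross-check, the solar-panel equivalence above follows from the assumption listed in the Appendix (22 m² of panels in Southern Europe producing about 4,650 kWh per year). The short Python sketch below reproduces the arithmetic; the constants and function name are illustrative and not taken from any Emerson tool.

```python
# Rough cross-check of the solar-panel equivalence used in this paper.
# Assumption from the Appendix: 22 m^2 of panels in Southern Europe
# produce about 4,650 kWh per year.
PANEL_AREA_M2 = 22.0
PANEL_YIELD_KWH_PER_YEAR = 4650.0
TENNIS_COURT_M2 = 261.0  # figure quoted in the text

def solar_area_for(kwh_per_year: float) -> float:
    """Panel area (m^2) needed to generate a given annual energy."""
    return kwh_per_year / PANEL_YIELD_KWH_PER_YEAR * PANEL_AREA_M2

saving_kwh = 60_000  # 5% saving from the EC fan upgrade
area = solar_area_for(saving_kwh)
print(f"{area:.0f} m^2 of panels "
      f"(~{area / TENNIS_COURT_M2:.1f} tennis courts)")
# -> roughly 284 m^2, i.e. the ~280 m^2 / "more than one tennis court"
#    quoted in the text
```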

2. Second Solution: Indirect Freecooling

A second proposal to further enhance the currently installed systems in Frankfurt and London is to use indirect freecooling units, in which an internal water coil is connected to an external dry cooler. This arrangement increases energy savings by exploiting the lower outside air temperatures of the cooler seasons: the external air cools the water circuit, and the water in turn cools the internal air, while keeping the internal and external air streams separate. This technology guarantees energy savings while preserving system availability. Here, savings amount to 28% of the annual energy consumption registered in both the Frankfurt and London data centers, as illustrated in the table below.

Solution            | Capital [%] | Energy [€/year] | Savings [€/year] | Payback Time [years] | TCO over 2 years [€/kW]
EC Fan Freecooling  | 145%        | 89,656          | 35,610           | 1.18                 | 1,048

Table 2. Economic and energetic comparison of different cooling solutions for the Frankfurt and London data centers.

Repeating the comparison between the energy saved and the alternative generation that would be required to produce the same number of kWh, in this case the customer would need the equivalent of six tennis courts of solar panels.

3. Third Solution: Direct Freecooling

Although direct freecooling is becoming increasingly common in data center applications, its use is still limited compared to other applications, such as shelters or cabins, mainly because of humidity control. Direct freecooling is constrained not only by external temperatures but, in particular, by humidity levels. Air can absorb a quantity of water vapor that depends on its temperature: at a given temperature there is a maximum number of grams of vapor per kilogram of dry air that can be held. Beyond that limit, additional vapor can no longer be absorbed and condenses as liquid water. The quantity of vapor actually contained in the air, in grams per kilogram, is the absolute humidity. Comparing the absolute humidity with the maximum the air can hold at that temperature gives the relative humidity. From a physics point of view, relative humidity is the ratio of the partial pressure of water vapor in an air/water mixture to the saturation vapor pressure of water at the same temperature. For example, at 18°C the maximum humidity content is 12.89 g/kg, while at 18°C and 50% RH the absolute humidity is 6.38 g/kg. The higher the air temperature, the greater the amount of vapor the air can hold. These relationships are commonly represented on the psychrometric chart, which plots the different temperature and humidity conditions at sea-level pressure.

This theory also applies to fresh-air freecooling operation. During the European winter season the air is colder, so its maximum humidity content can be extremely low. For example, at 5°C the maximum humidity content is already below the ASHRAE recommended limit¹ and, by way of comparison, corresponds to the same absolute humidity as air at 24°C and 28% relative humidity.

¹ ASHRAE: 2011 Thermal Guidelines for Data Processing Environments
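To make the relationship between temperature, relative humidity and absolute humidity concrete, the sketch below computes the humidity ratio (grams of vapor per kilogram of dry air) from dry-bulb temperature and RH using the Magnus approximation for saturation vapor pressure. The formula and its constants are a standard textbook approximation rather than anything taken from this paper; the values it returns agree with the examples in the text to within rounding.

```python
import math

def saturation_vapor_pressure_pa(t_celsius: float) -> float:
    """Saturation vapor pressure over water (Pa), Magnus approximation."""
    return 611.2 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def humidity_ratio_g_per_kg(t_celsius: float, rh_percent: float,
                            pressure_pa: float = 101_325.0) -> float:
    """Grams of water vapor per kilogram of dry air at sea-level pressure."""
    p_vapor = rh_percent / 100.0 * saturation_vapor_pressure_pa(t_celsius)
    return 1000.0 * 0.622 * p_vapor / (pressure_pa - p_vapor)

# Reproduce the examples discussed in the text:
print(humidity_ratio_g_per_kg(18, 100))  # ~12.9 g/kg, saturation at 18 C
print(humidity_ratio_g_per_kg(18, 50))   # ~6.4 g/kg, 18 C at 50% RH
print(humidity_ratio_g_per_kg(5, 100))   # ~5.4 g/kg, saturation at 5 C
print(humidity_ratio_g_per_kg(24, 28))   # ~5.2 g/kg, 24 C at 28% RH
```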

When such cold, dry winter air is brought into the data center, a humidifier (a high energy consumption component) is needed to compensate. During the European wet seasons, such as spring and autumn, the risk is the opposite: the air entering the data center requires dehumidification. With an external temperature of 15°C and 100% RH, with elements of fog, the incoming air would still exceed the upper humidity control limit once warmed to the room set point and would therefore need to be dehumidified. Efficient freecooling software allows the customer to choose whether or not to use freecooling operation, based on external weather conditions and energy consumption levels.

The savings achievable with direct freecooling operation are closely linked to the data center humidity range limits. For a standard control band of +/-10% relative humidity around the standard set point of 24°C and 50% RH, savings make up only 12% of the annual consumption. Details are described in the table below.

Solution            | Capital [%] | Energy [€/year] | Savings [€/year] | Payback Time [years] | TCO over 2 years [€/kW]
EC Fan Freecooling  | 145%        | 89,656          | 35,610           | 1.18                 | 1,048
EC Fan Economizer   | 127%        | 109,732         | 15,534           | 1.61                 | 1,126

Table 3. Economic and energetic comparison of different cooling solutions for the Frankfurt data center.

In order to reap real benefits from the direct freecooling application, it is recommended that the customer extend the humidity control limits to those defined by ASHRAE in its recommended and allowable working zone specifications².

Equipment Environmental Specifications - Product Operations:
Class | Dry Bulb Temperature (°C) | Humidity Range, non-Condensing | Max Dew Point (°C) | Max Elevation (m) | Max Rate of Change (°C/hr)
Recommended, A1 to A4 | 18 to 27 | 5.5°C DP to 60% RH and 15°C DP | - | - | -
(The recommended range applies to all A classes; individual data centers can choose to expand this range based upon the analysis described in this document.)
Allowable, A1 | 15 to 35 | 20% to 80% RH | 17 | 3050 | 5/20
Allowable, A2 | 15 to 35 | 20% to 80% RH | 21 | 3050 | 5/20
Allowable, A3 | 5 to 40 | -12°C DP & 8% RH to 85% RH | 24 | 3050 | 5/20
Allowable, A4 | 5 to 45 | -12°C DP & 8% RH to 90% RH | 24 | 3050 | 5/20
Allowable, B  | 5 to 35 | 8% RH to 80% RH | 28 | 3050 | NA
Allowable, C  | 5 to 40 | 8% RH to 80% RH | 28 | 3050 | NA

Equipment Environmental Specifications - Product Power Off:
Class | Dry Bulb Temperature (°C) | Relative Humidity (%) | Max Dew Point (°C)
A1 | 5 to 45 | 8 to 80 | 27
A2 | 5 to 45 | 8 to 80 | 27
A3 | 5 to 45 | 8 to 85 | 27
A4 | 5 to 45 | 8 to 90 | 27
B  | 5 to 45 | 8 to 80 | 29
C  | 5 to 45 | 8 to 80 | 29

Table 4. ASHRAE: 2011 Thermal Guidelines for Data Processing Environments.

² ASHRAE: 2011 Thermal Guidelines for Data Processing Environments
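The text notes that freecooling control software decides, from outside conditions, whether external air should be brought in. The sketch below is a simplified, hypothetical illustration of such a decision against the ASHRAE A1-A4 recommended envelope from Table 4 (18-27°C dry bulb, dew point between 5.5°C and 15°C, RH not above 60%); it is not the control logic of any specific product, and the function names are invented for illustration.

```python
import math

def dew_point_c(t_celsius: float, rh_percent: float) -> float:
    """Dew point (C) from dry-bulb temperature and RH, Magnus approximation."""
    gamma = math.log(rh_percent / 100.0) + 17.62 * t_celsius / (243.12 + t_celsius)
    return 243.12 * gamma / (17.62 - gamma)

def in_ashrae_recommended_zone(t_celsius: float, rh_percent: float) -> bool:
    """ASHRAE A1-A4 recommended envelope quoted in Table 4:
    18-27 C dry bulb, dew point between 5.5 C and 15 C, RH <= 60%."""
    dp = dew_point_c(t_celsius, rh_percent)
    return 18.0 <= t_celsius <= 27.0 and 5.5 <= dp <= 15.0 and rh_percent <= 60.0

def use_direct_freecooling(outside_t: float, outside_rh: float) -> bool:
    """Hypothetical decision: admit outside air only when it already satisfies
    the recommended envelope (a real controller would also consider air mixing,
    economizer energy use and supply temperature targets)."""
    return in_ashrae_recommended_zone(outside_t, outside_rh)

print(use_direct_freecooling(20, 50))   # True: mild, moderately dry day
print(use_direct_freecooling(5, 90))    # False: winter air, too cold and too dry
print(use_direct_freecooling(15, 100))  # False: the foggy-day example above
```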

With these working zone extensions, total savings can reach a remarkable 33% (€41,774/year) for the Frankfurt-based data center. London, which enjoys a friendlier climate as far as direct freecooling applications are concerned, can reach savings of up to 47%. Details of these savings are shown below in Table 5.

Solution                   | Capital [%] | Energy [€/year] | Savings [€/year] | Payback Time [years] | TCO over 2 years [€/kW]
EC Fan Freecooling         | 145%        | 89,656          | 35,610           | 1.18                 | 1,048
EC Fan Economizer          | 127%        | 109,732         | 15,534           | 1.61                 | 1,126
EC Fan Extreme Economizer  | 127%        | 83,492          | 41,774           | 0.60                 | 951

Table 5. Economic and energetic comparison of different cooling solutions for the Frankfurt data center.

This solution has been developed specifically for data center applications and uses external air only when temperature and humidity conditions are suitable for the specific installation.

4. Fourth Solution: Variable Cooling Capacity and Latest Technologies

To further enhance energy savings while maintaining the separation between internal and external environments, the optimal solution is one that uses an efficient modulating cooling technology. The most efficient technology that can be implemented here is the digital scroll compressor combined with an EC fan and an Electronic Expansion Valve³. The use of this superior cooling technology reduces the cooling system energy consumption of both the Frankfurt and London data centers by almost half, making the return on investment visible in only a few months' time. Details of the savings are illustrated below.

Solution                   | Capital [%] | Energy [€/year] | Savings [€/year] | Payback Time [years] | TCO over 2 years [€/kW]
EC Fan Freecooling         | 145%        | 89,656          | 35,610           | 1.18                 | 1,048
EC Fan Economizer          | 127%        | 109,732         | 15,534           | 1.61                 | 1,126
EC Fan Extreme Economizer  | 127%        | 83,492          | 41,774           | 0.60                 | 951
Digital Air Cooled         | 128%        | 66,710          | 58,556           | 0.45                 | 843

Table 6. Economic and energetic comparison of different cooling solutions for the Frankfurt data center.

Repeating the previous calculation in terms of solar panels, the equivalent of 10 tennis courts would be required to produce the same level of energy savings.

³ For more details on the savings resulting from a Digital Scroll compressor, see the white paper "Liebert HPM Digital Offers More, Requires Less - Liebert HPM: using the latest industry technologies to reduce energy consumption of your data center by 50%".
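The percentage savings quoted in the text can be cross-checked directly from the preceding tables: for every row, annual energy plus annual savings gives the same implied baseline cooling cost of roughly €125,266 per year. A minimal sketch of that arithmetic (the dictionary simply restates the table figures in EUR/year for the Frankfurt site):

```python
# Quick cross-check of the savings percentages implied by the tables.
solutions = {
    "EC Fan Freecooling":        (89_656, 35_610),
    "EC Fan Economizer":         (109_732, 15_534),
    "EC Fan Extreme Economizer": (83_492, 41_774),
    "Digital Air Cooled":        (66_710, 58_556),
}

for name, (energy, savings) in solutions.items():
    baseline = energy + savings                # ~125,266 EUR/year in every row
    print(f"{name:28s} baseline {baseline:,} EUR/yr, "
          f"saving {savings / baseline:.0%}")
# -> roughly 28%, 12%, 33% and 47%, matching the 28%, 12%, 33% and
#    "almost half" figures quoted in the text.
```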

The energy saved also reduces CO₂ emissions, to an extent that depends upon the fuel mix of the national power grid. On average, producing the same amount of energy in Europe would emit 267,000 kg of CO₂ per year. To put this into perspective, the CO₂ emitted by European cars can be used as a benchmark: the mean distance travelled per person per year in Europe is 12,000 km, and a small car emits around 119 g of CO₂ per km. Using the most efficient units, equipped with superior cooling technologies, to cool a 300 kW data center therefore saves the same amount of CO₂ as taking 188 city cars in the European Union off the road.

5. Fifth Solution: Reducing Energy Consumption and Optimizing the Installation Investment

To dramatically reduce energy consumption and truly optimize the installation investment, the data center manager must choose a solution that adjusts the cooling capacity to the servers' actual needs. This solution includes separating the hot and cold areas using cold or hot aisle containment, which allows the cooling units to operate at higher air temperatures, increasing both capacity and efficiency. Such a solution combines the latest cooling unit (i.e. modulating compressor technology with digital scroll, EC fan and Electronic Expansion Valve) with the best data center application controls and an optimized distribution of temperature and air. The ideal solution consists of precise control of temperature, humidity and air flow rate at the server level, ensuring exactly the airflow the servers need to guarantee maximum lifetime and the highest reliability. This solution is known as SmartAisle™ control for cold aisle containment; it brings together the most advanced technologies on the market (the digital scroll air cooled modulating compressor, the EC fan and the Electronic Expansion Valve), all controlled in the most efficient way. The overall result is a reduction in power consumption of more than 60%, again independently of the European reference site, with savings of up to 793,000 kWh (see Table 7 for details).

Solution                         | Capital [%] | Energy [€/year] | Savings [€/year] | Payback Time [years] | TCO over 2 years [€/kW]
EC Fan Freecooling               | 145%        | 89,656          | 35,610           | 1.18                 | 1,048
EC Fan Economizer                | 127%        | 109,732         | 15,534           | 1.61                 | 1,126
EC Fan Extreme Economizer        | 127%        | 83,492          | 41,774           | 0.60                 | 951
Digital Air Cooled               | 128%        | 66,710          | 58,556           | 0.45                 | 843
SmartAisle™ + Digital Air Cooled | 154%        | 45,944          | 79,322           | 0.63                 | 785

Table 7. Economic and energetic comparison of different cooling solutions for the Frankfurt data center.

Producing the same amount of energy with solar panels would require a surface area of around 3,800 m² (10% larger than the Colosseum). Generating the same amount of electricity from the European grid would emit 372,710 kg of CO₂, so the saving is equivalent to taking 261 city cars in Europe off the road.
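The car equivalences above follow from the two figures given in the text (12,000 km travelled per year and roughly 119 g of CO₂ per km for a small car). A minimal sketch of the arithmetic, with illustrative names:

```python
# Cross-check of the "city cars off the road" equivalences used in the text.
KM_PER_CAR_PER_YEAR = 12_000        # mean annual distance quoted in the text
CO2_KG_PER_KM = 0.119               # ~119 g CO2/km for a small car
CO2_KG_PER_CAR_PER_YEAR = KM_PER_CAR_PER_YEAR * CO2_KG_PER_KM  # ~1,428 kg

def cars_equivalent(co2_kg_per_year: float) -> float:
    """How many average city cars emit the given yearly amount of CO2."""
    return co2_kg_per_year / CO2_KG_PER_CAR_PER_YEAR

print(cars_equivalent(267_000))   # ~187 cars, vs. 188 quoted for Solution 4
print(cars_equivalent(372_710))   # ~261 cars, as quoted for Solution 5
```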

Conclusion

Cooling a data center can be achieved in numerous ways, ranging from traditional technologies to increasingly efficient means of addressing the same cooling needs with the latest industry-leading technologies. The SmartAisle™ cold aisle containment control logic, together with the most efficient technologies developed for a modulated cooling unit (Digital Scroll, Electronic Expansion Valve and EC fan), allows a cost saving of more than 60%. Furthermore, this solution provides a return on investment in only a few months, as shown in the graph below comparing the savings achieved by the Frankfurt and London data centers. SmartAisle™ maximizes energy savings while avoiding the issues that can arise with solutions in which the external environment is in direct contact with the internal data room. The achievable energy savings also support environmental sustainability, with a reduction in CO₂ emissions of approximately 370,000 kg/year.

The traditional claim that Direct Expansion cooling solutions are high-energy consuming has thus been proven inaccurate. The results of the analysis demonstrate that even city-center data centers not equipped with chilled-water cooling can achieve a pPUE of 1.15, substantially lower than the initial pPUE of 1.4 observed with traditional cooling solutions.

Graph 1. Comparison of the energy savings obtained with the different cooling solutions: annual energy cost (€/year) for Frankfurt and London across No EC Fan Air Cooled, EC Fan Air Cooled, EC Fan Freecooler, EC Fan Economizer, Extreme EC Fan Economizer, Digital Air Cooled, and Digital Air Cooled with SmartAisle™, with step-by-step savings of 5%, 15%, 28%, 40%, 47% and 63% relative to the baseline.

APPENDIX

Calculation Assumptions:
Energy cost: 0.1 €/kWh
300 kW data center; set point 22°C +/- 1°C; RH 50% +/- 10%
Annual energy consumption has been calculated with Emerson Network Power's calculation program, using the annual temperature profiles of Frankfurt and London.
CO₂ per kWh for Europe: 0.59 kg CO₂/kWh - Reference: IEA (International Energy Agency)
Solar panel production (Southern Europe): 4,650 kWh/year with 22 m² of solar panels

Bibliography
http://www.iea.org/

Ensuring The High Availability Of Mission-Critical Data And Applications.

About Emerson Network Power
Emerson Network Power, a business of Emerson (NYSE:EMR), delivers software, hardware and services that maximize availability, capacity and efficiency for data centers, healthcare and industrial facilities. A trusted industry leader in smart infrastructure technologies, Emerson Network Power provides innovative data center infrastructure management solutions that bridge the gap between IT and facility management and deliver efficiency and uncompromised availability regardless of capacity demands. Our solutions are supported globally by local Emerson Network Power service technicians. Learn more about Emerson Network Power products and services at www.emersonnetworkpower.eu

Locations

Emerson Network Power Global Headquarters
1050 Dearborn Drive, P.O. Box 29186
Columbus, OH 43229, USA
Tel: +1 614 8880246

Emerson Network Power Thermal Management EMEA
Via Leonardo Da Vinci, 16/18, Zona Industriale Tognana
35028 Piove di Sacco (PD), Italy
Tel: +39 049 9719 111
Fax: +39 049 5841 257
ThermalManagement.NetworkPower.Eu@Emerson.com

Emerson Network Power United Kingdom
George Curl Way
Southampton SO18 2RY, UK
Tel: +44 (0)23 8061 0311
Fax: +44 (0)23 8061 0852

Globe Park, Fourth Avenue
Marlow, Bucks SL7 1YG
Tel: +44 1628 403200
Fax: +44 1628 403203
Uk.Enquiries@Emerson.com

EmersonNetworkPower.eu

While every precaution has been taken to ensure accuracy and completeness herein, Emerson assumes no responsibility, and disclaims all liability, for damages resulting from use of this information or for any errors or omissions. Specifications subject to change without notice.

Emerson. Consider it Solved., SmartAisle™ and Emerson Network Power are trademarks of Emerson Electric Co. or one of its affiliated companies. © 2014 Emerson Electric Co.