
Technical Systems in Data Centres Helen Bedford

Technical Systems in Data Centres: Overview
- Cooling
- IT and Peripheral Equipment
- Security: Access and Control
- Building Management System (BMS)
- Data Centre Infrastructure Management (DCIM)
- Overall Data Centre
- Modular Data Centre Solutions
- Power: Provision and Back-Up
- Fire Detection and Suppression

Recent Changes to Cooling Requirements Have Resulted in New Technology Adoption

Power Usage Effectiveness (PUE) is a measure used to understand the energy efficiency of Data Centres:

PUE = (IT Power + DC Power) / IT Power

At the moment the average PUE in many Data Centres ranges from 1.8 to 2.5; however, the most recent deployments, such as the Capgemini Data Centre in Swindon (Merlin), claim a PUE of 1.08.

As cooling can account for as much as 30% of operating costs, the choice of cooling system installed can directly impact PUE and operating costs after project deployment. Traditionally, however, the main priority in critical managed facilities was security and stability of operations with low downtime risk. As IT equipment was more likely to suffer faults and failures at higher temperatures, cooling systems in UK Data Centres provided temperatures of ~18-20°C, with inlet temperatures sometimes falling as low as 12°C.

Temperature and humidity ranges were relaxed in the latest ASHRAE TC 9.9 guidelines to reflect technological changes in server equipment, which is now able to operate effectively at higher levels.

At the same time, back-up options should be implemented to ensure that data centres stay operational under all conditions, including prolonged power cuts and main cooling system failures (DX is the most common back-up option in the UK).

Another consideration is that, if the air temperature is raised to 25°C, servers will start to aggressively increase air flow, counteracting the energy savings achieved in cooling through increased server fan power consumption.

Hot and cold aisle containment is a popular solution, providing a cost-effective way to increase energy efficiency by separating the air streams through the installation of air barriers and blanking panels. Hot aisle containment is more popular in new Data Centre builds, while cold aisle containment is the most popular option in retrofits.

Raising inlet and return temperatures is promoted as an effective way to improve PUE, and allows the direct free cooling method to be successfully deployed in certain regions of the UK (mainly in the North).

ASHRAE TC 9.9 guidelines expanded the allowable humidity and temperature corridors for Data Centre applications (Source: HP 2012, ASHRAE 2011, Green Grid).

ASHRAE 2004 and 2008 Guidelines: Comparison

              2004 Recommended   2004 Allowable   2008 Recommended      2008 Allowable
Temperature   20-25°C            15-32°C          18-27°C               10-35°C
Humidity      40-55% RH          20-80% RH        5.5°C DP to 60% RH    20-80% RH

In 2011, the recommended range stayed at the 2008 level, with allowable temperatures for class A1 being 15-32°C and extended for the other classes to as wide as 5-45°C; the allowable humidity range is 8%-90%.
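To make the PUE arithmetic concrete, the short Python sketch below computes PUE from IT and non-IT power draws. It is a minimal illustration of the formula above; the figures used are illustrative, not measurements from any of the facilities named.

```python
def pue(it_power_kw: float, dc_power_kw: float) -> float:
    """Power Usage Effectiveness, per the slide's formula.

    dc_power_kw is the non-IT overhead (cooling, power distribution
    losses, lighting), so PUE = (IT Power + DC Power) / IT Power.
    """
    if it_power_kw <= 0:
        raise ValueError("IT power must be positive")
    return (it_power_kw + dc_power_kw) / it_power_kw

# Illustrative figures (assumed, not measured):
print(pue(1000, 1200))  # 2.2  -> typical legacy site in the 1.8-2.5 range
print(pue(1000, 80))    # 1.08 -> the level claimed for state-of-the-art builds
```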

Most Popular Cooling Solutions in Data Centres in the UK

As a rule of thumb, a data centre with 50+ racks is most likely to go down the chilled water route, whereas those with fewer than 10 racks tend mostly to use DX condensers (the sketch after this overview codifies this heuristic). Thus, DX condensers are more popular for smaller Data Centres, while chillers are mostly suited to large Data Centres due to their applicability and cost levels. DX condensers in smaller Data Centres are installed in double capacity to cater for redundancy.

Water cooling towers are not commonly used in Data Centres in the UK, mainly owing to the risk associated with legionella.

It is estimated that 70% of IT equipment in the UK is now cooled using hot or cold aisle containment, with hot aisle containment used mainly in the new-build Data Centre segment. Cold aisle containment is a very cost-effective option for brown-field Data Centres; hot aisle containment is a less preferable option for shared Data Centre space. Air and pressure barriers (blanking panels) allow for the implementation of aisle containment and are relatively cheap.

Most Common Cooling Systems Overview (Source: ASHRAE, 2011)
- Chilled Water: heat rejection via cooling tower or dry cooler; mechanical cooling via chiller (air-cooled or water-cooled); terminal equipment: CRAH (chilled water).
- Direct Expansion (DX): heat rejection via condenser or condensing unit; mechanical cooling within the CRAC (DX), CRAC (water/glycol) or self-contained AC/rooftop unit; terminal equipment: air handler (DX).

[Images: Hot Aisle Containment and Cold Aisle Containment, courtesy http://www.42u.com/cooling]
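The rack-count rule of thumb above is easy to express in code. The sketch below illustrates that heuristic only: the 10- and 50-rack thresholds and the doubled DX capacity come straight from the slide, while the function name and wording are assumptions of mine.

```python
def suggest_cooling(racks: int) -> str:
    """Rule-of-thumb cooling selection from the slide.

    <10 racks  -> DX condensers, installed in double capacity
                  to cater for redundancy.
    50+ racks  -> chilled water.
    In between -> either can work; a site-specific study is needed.
    """
    if racks < 10:
        return "DX condensers (install double capacity for redundancy)"
    if racks >= 50:
        return "Chilled water (chillers + CRAH units)"
    return "Borderline: compare DX vs chilled water on cost and resilience"

print(suggest_cooling(6))    # small server room -> DX
print(suggest_cooling(120))  # large facility    -> chilled water
```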

Special Features: Cooling Overview (1)

CRAC (Computer Room Air Conditioning)
- Still the most common type of cooling system; provides cooling via a raised floor or an overhead duct distribution system.
- CRAC units contain a cooling coil, a supply air fan and a return air filter; some also have refrigerant compressors, a humidifier, a reheat coil and a free-cooling coil.
- Can be combined either with DX condensers or water-cooled chillers.
- The cost of CRAC units with fans, a centralised ventilation/dehumidification system and separate humidifiers makes them one of the most cost-effective solutions for a new Data Centre facility.
- Adding variable controls and EC fans provides the necessary option for energy efficiency and, alongside other possible measures such as raising supply and return air temperatures, improves PUE in a new Data Centre in a cost-effective way.
- European players: Emerson, Schneider, GEA Denco, Stulz. [Image source: Uniflair]

Direct Expansion (DX) Condensing Units
- Ideal for small Data Centres or as a system providing additional redundancy.
- Perform better in harsh environmental conditions, so are more reliable; normally in the UK they accompany evaporative cooling as a back-up option in case of failure, both in traditional and modular designs.
- Not applicable to large Data Centres due to impracticality.
- Require an increased level of maintenance due to the multiple pieces of equipment (fans, controls, compressors), and can have high levels of energy consumption.
- Players: GEA Denco, Climaveneta, APC, EdenAir. [Image source: Airedale]

Chillers
- Chillers account for a large portion of the overall cooling system cost in new Data Centres, especially larger ones.
- High-efficiency multi-scroll machines with free cooling are nibbling at the lower end; above 500-600 kW, screw and Turbocor compressors dominate.
- Adiabatic cooling is more relevant to southern Europe, although it has been rolled out in some UK projects, mainly in the educational sector.
- It is estimated that the total value of chillers sold into the UK data centre market is 9% of the total value of the UK chiller market; this amounted to £14m in 2011.
- The trend in the market is that free cooling is being made available in smaller machinery; you need the right water temperature to make free cooling a viable option.
- Main leaders: Airedale, Liebert Hiross, Climaveneta and Uniflair.

Cooling Towers
- The cost of maintaining cooling towers is reported to be too high and puts a lot of organisations off installing the equipment, along with the old chestnut of legionella.
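As a quick sanity check on the chiller figures above, the 9% share and £14m value together imply the size of the overall UK chiller market; the snippet below is just that arithmetic, under the assumption that both figures refer to the same year.

```python
dc_chiller_sales_gbp_m = 14  # UK data centre chiller sales, 2011 (slide figure)
dc_share = 0.09              # data centre share of the UK chiller market

total_uk_chiller_market = dc_chiller_sales_gbp_m / dc_share
print(f"Implied total UK chiller market: ~£{total_uk_chiller_market:.0f}m")  # ~£156m
```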

Special Features: Cooling Overview (2)

Evaporative (or Adiabatic) Cooling
- Not a very new method, but it is gaining popularity.
- Utilises evaporation as a means of cooling down air or water temperatures; can be direct or indirect.
- The humidity of the air inside the data centre can be increased if water is sprayed directly into the air stream entering the Data Centre.
- There are some concerns over the applicability and reliability of evaporative cooling across the wider range of UK temperatures, and potential concerns over legionella.
- Its application can have geographic limitations in the UK (especially in coastal areas with increased levels of humidity).
- Needs maintenance every 24 hours with double units; water consumption can be high and expensive.

Direct Free Cooling
- Allows ambient air to enter the room, with heat exchangers folded away (see the changeover sketch after this overview).
- Provides for a wider range of allowable temperatures and humidity, and allows operators to capitalise on the new ASHRAE temperature guidelines.
- If the temperature goes above 18°C, the DX condenser works to bring the temperature down.
- CAPEX costs are reduced compared to indirect cooling, and energy consumption is lower.
- Believed to be acceptable in London due to the climate. [Image source: Stulz]

Close Coupled (In-Rack and In-Row)
- Open-loop solutions bring the heat transfer closer to the equipment rack but are not completely independent of the room and interact with its air. Main types: in-row and in-line air conditioners, Rear Door Heat Exchangers (fairly popular in the UK) and Overhead Heat Exchangers (more rarely applied).
- Closed-loop solutions are mainly in-rack solutions that are independent of the air stream in the room: the AC is applied to the server rack and completely sealed. This allows for direct cooling of the servers rather than the whole room, and provides for scalability and modularity.
- Players: Rittal, APC, Emerson, Hitachi. [Image sources: Hitachi (In-Row), USystems]

Liquid Cooling
- Rather innovative; requires a complete change of mind-set to be adopted, especially from risk-averse IT managers, and is thus highly unlikely to gain much traction in the next 3-5 years.
- Mainly refers to submersion in the engineered liquid Novec 1230; other liquids are also possible, either water directly or mineral oil.
- Marginal deployment in the UK, undergoing some trials and minor installations in the academic sector.
- More likely to be accepted as the norm at a micro (semiconductor) level, if successfully developed and deployed.
- Delivers higher cooling capability and lowers PUE. [Image source: Iceotope]
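The direct free cooling behaviour described above is essentially a changeover control: use ambient air while it is cool enough, and fall back to mechanical (DX) cooling above the threshold. The sketch below illustrates that logic only; the 18°C setpoint comes from the slide, while the function and mode names are illustrative assumptions.

```python
FREE_COOLING_MAX_AMBIENT_C = 18.0  # changeover threshold quoted on the slide

def cooling_mode(ambient_temp_c: float) -> str:
    """Pick the cooling mode for a direct free cooling plant.

    Below the threshold, ambient air is ducted straight into the room
    (heat exchangers folded away); above it, the DX condenser runs to
    bring the supply temperature back down.
    """
    if ambient_temp_c <= FREE_COOLING_MAX_AMBIENT_C:
        return "direct free cooling"
    return "DX mechanical cooling"

for t in (9.5, 17.0, 24.0):
    print(f"{t:>5.1f} C ambient -> {cooling_mode(t)}")
```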

Other Technical Systems in Data Centres in the UK Helen Bedford

Other Technical Systems: Total Market

The estimated total value of M&E systems for new Data Centres in the UK (excluding DCIM applications) is around £105 million.

[Chart: Estimated market size for M&E systems in new Data Centres in the UK (addressable market), £m: Power 65, Cooling 27, with BMS, Security and Fire making up the remainder (chart labels: 5, 3.5 and 4)]

The efficiency of power systems is very high and unlikely to change in a way that dramatically affects the industry as a whole. The introduction of generators feeding into the grid, as well as heat recovery, could increase potential spending on M&E systems in the Data Centre segment; however, overall growth is expected to remain at a fairly conservative level of around 3-4%. A marginal dip could be expected in 2013 due to the delay of some projects into 2014.

The Power segment represents the majority of the value, due to the large expense of generators, transformers and other power provision in Data Centres to cope with increased IT load requirements. Cooling is another big segment, estimated at around £27 million in 2012 for new Data Centres in the UK.

Fire detection is estimated to be worth around £1.7 million, with fire suppression nearly twice that at £3.3 million. Fire suppression is likely to remain a cost-saving option and is unlikely to grow dramatically in value.

The BMS sector might receive further expansion to incorporate energy efficiency monitoring and integration with DCIM systems.
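The segment figures above should roughly reconcile with the £105m headline; the snippet below checks that sum. It treats the three small chart labels as unordered, since the chart residue does not make the BMS/Security/Fire mapping unambiguous.

```python
power, cooling = 65, 27       # £m, from the slide
small_segments = [5, 4, 3.5]  # BMS, Security and Fire, in some order (chart labels)

total = power + cooling + sum(small_segments)
print(f"Segment total: ~£{total}m vs the ~£105m headline")  # £104.5m

# Fire alone reconciles with the text:
# detection ~£1.7m + suppression ~£3.3m ~= £5m
print(1.7 + 3.3)
```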

Major Categories of Gas Fire Suppression in Data Centres in the UK

Heat Absorption
- DuPont FM-200 is the most common fire suppression system in this application. The main characteristics behind its popularity: it is relatively cheap to install and carries a moderate recharge cost, and it is widely available; even in periods of supply shortage, sales were not greatly affected and end users were not driven away.
- In contrast, Novec 1230 has a far longer shelf life, up to 30 years, but cannot compete on a larger scale with FM-200, as it is more expensive to buy and recharge and more complicated to handle and transport. In the longer term, it might be the preferable option.

Inert Gas Agent
- Argonite (Chubb) is electrically non-conductive and is similar to air in density. Main disadvantages: relatively slow flame extinguishing; high-pressure storage; large volume of gas required, limiting space in the Technical Area.
- Inergen releases gas under high pressure, adjusting the atmosphere so that fire can no longer survive. Main disadvantages: cannot be modular in design; the high-pressure method requires excessive storage, in addition to long and complicated high-pressure pipework.

Oxygen Removal (Hypoxic Atmosphere)
- This type of fire suppression has increased in usage in recent years, especially becoming more popular in modular Data Centres.
- It is based on the notion of keeping oxygen below the level required for fire ignition but at a level safe for staff operating in the Data Centre. The typical solution in this category is DeOx by DigiPlex: the oxygen level is reduced to 15%, below the 16.2% required for fire to sustain itself. This low level of oxygen still allows staff to carry out essential maintenance work if needed, and is equivalent to the oxygen conditions in high mountain areas.

Water mist and other types of fire suppression are less commonly used in the UK than in the US, where water mist is highly utilised; however, some colocation providers (e.g. Digital Realty) are using this method as well.
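The hypoxic approach reduces to keeping the oxygen concentration below an ignition threshold while staff can still work. The sketch below illustrates that check; the 16.2% threshold and the 15% operating level come from the slide, while the function and constant names are assumptions.

```python
FIRE_SUSTAIN_O2_PCT = 16.2  # fire cannot sustain itself below this (slide figure)
OPERATING_O2_PCT = 15.0     # DeOx-style setpoint quoted on the slide

def suppression_effective(measured_o2_pct: float) -> bool:
    """True if the room atmosphere is hypoxic enough to prevent ignition."""
    return measured_o2_pct < FIRE_SUSTAIN_O2_PCT

for o2 in (20.9, 16.5, OPERATING_O2_PCT):
    state = "fire prevented" if suppression_effective(o2) else "fire can sustain"
    print(f"O2 {o2:>4.1f}% -> {state}")
```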