
The Use of Fresh Air in the Data Centre

White paper of a round table debate by industry experts on this key topic. Hosted by Keysource with participants including the Uptime Institute.

Sophistication Through Simplicity

The white paper shares the findings of a round table consisting of the industry's leading experts and commentators. Representatives from the Uptime Institute, Operational Intelligence, Colt, Fujitsu, Vocalink, Norland Managed Services and The 451 Group joined Keysource for the round table. The event was designed to generate healthy debate, based on real experiences, on one of the industry's key topics. This is one of two white papers produced from the round table discussion.

Executive Summary

There is very little to choose between the ongoing operational efficiency (lowest annualised PUE*) of a direct fresh air system and an optimised re-circulating indirect system.

Generally the CAPEX cost is greater for a direct fresh air system than for a re-circulating system, because its operational risk means it needs full mechanical backup.

Space requirements depend on site-specific variables and the type of solution chosen, but direct fresh air systems often require more integration with the physical building, increasing complexity and cost compared with a system that utilises chilled water. It is common to integrate direct fresh air into a modular data centre, which largely overcomes this challenge.

There is an ever-increasing drive to remove the need for mechanical cooling, such as chillers, entirely. Depending on a customer's operational requirements and risk analysis, this should be achievable with either a direct or an indirect system. The main consideration with a direct system is whether a customer can afford to turn off the facility due to external factors other than temperature, which are out of their control.

There is no one-size-fits-all solution. The key is to weigh up a number of variables, such as climate, temperature, humidity and location, along with the nature of an organisation's business and the type of equipment it will house.

There will always be facilities and operators that will not adopt a chillerless environment, with resilience being of far greater importance than energy efficiency or any other consideration. These organisations will become a smaller percentage over time, and we are likely to see a considerable change in the approach to data centre design over the next ten years.

An often forgotten consideration is that peak PUE, or maximum power demand, can have a huge impact on CAPEX and wasted capacity. Mains power must be reserved to run mechanical systems even if they operate only occasionally or during testing, and electrical infrastructure and standby power must be oversized accordingly. The choice of design can affect peak demand dramatically.

An overriding theme, whatever system is used, is that if you rely on direct fresh air and do not have full backup, you must accept that at some point you may have to shut down the data centre.

* Note: Power usage effectiveness (PUE) is the industry-recognised metric for data centre infrastructure energy efficiency, developed by The Green Grid Association, and is the ratio of total facility energy to IT equipment energy. Keysource frequently uses the PUE L2,YC measure for client facilities: a Category 2 measurement of PUE (as defined by The Green Grid) based on 12 months of total kWh consumption.
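To make the footnote concrete, the short sketch below works through an annualised PUE in the spirit of the L2,YC measure described above, alongside the peak monthly PUE that drives the sizing of electrical infrastructure and standby power. It is a minimal illustration only: the monthly kWh figures and the 1,000 kW IT load are invented for the example, not Keysource data.

```python
# Illustrative sketch only: annualised (12-month) PUE versus peak monthly PUE.
# All monthly kWh figures below are invented for the example.

total_facility_kwh = [410_000, 395_000, 400_000, 380_000, 375_000, 390_000,
                      420_000, 430_000, 405_000, 385_000, 390_000, 400_000]
it_equipment_kwh   = [280_000, 275_000, 278_000, 272_000, 270_000, 274_000,
                      276_000, 277_000, 275_000, 273_000, 274_000, 276_000]

# Annualised PUE (in the spirit of PUE L2,YC): total yearly facility energy
# divided by total yearly IT equipment energy.
annualised_pue = sum(total_facility_kwh) / sum(it_equipment_kwh)

# Peak monthly PUE: the worst single month, which drives the sizing of
# electrical infrastructure and standby power rather than the energy bill.
monthly_pue = [f / i for f, i in zip(total_facility_kwh, it_equipment_kwh)]
peak_pue = max(monthly_pue)

print(f"Annualised PUE: {annualised_pue:.2f}")
print(f"Peak monthly PUE: {peak_pue:.2f}")

# Peak demand is what mains capacity and standby plant must be reserved for;
# with an assumed 1,000 kW IT load:
it_load_kw = 1_000
print(f"Peak facility demand to size for: {it_load_kw * peak_pue:,.0f} kW")
```

The gap between the annualised and peak figures is the reserved, rarely used capacity that the executive summary identifies as a hidden CAPEX cost.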

Key Areas of Discussion

There are two types of free cooling system:

Direct: filtered fresh air is supplied directly into the data centre.

Indirect: a plate heat exchanger, or something similar, is used to achieve a fully re-circulating system.

It is almost impossible to achieve a one-fit solution, as there are too many parameters, such as location, temperature, humidity, resilience and risk factors.

Fresh air drivers: direct or indirect

It is possible to analyse and compare the different types of cooling technology on a general and site-by-site basis, but when comparing two separate sites it is important to take many different factors into consideration. Whilst it is not easy to have one common perspective, given the range of variables such as location and climate, it is important that there are measures in place to reduce energy use regardless of the design used.

For the UK or colder climates, energy usage is practically the same for a direct system and a well-designed, optimised indirect cooling solution, so the decision often comes down to other considerations. In warmer climates an indirect system is advantageous given the need to deal with high humidity.

The main issue raised with direct cooling systems was the effect of contaminants in the air and their impact on the equipment housed in the data centre. It was generally agreed that these systems do provide very good levels of filtration. However, in many locations - such as the middle of a seaport, an industrial estate or other areas of high contamination - it is not sensible to use direct fresh air, due to the amount of filtration needed and the risk of damage from corrosion or particles. In many ordinary locations, however, it would be possible to use a direct fresh air system and still have a very clean data centre.

From an initial CAPEX perspective, direct fresh air systems were in the main marginally more expensive when full mechanical backup is required. This is partly due to the additional equipment and often due to the additional construction and integration into the physical building for the required airflow in and out. With an indirect system the free cooling and any mechanical systems can be integrated into one system, so there is far greater potential to achieve a totally chillerless solution. In this configuration the main consideration is external ambient temperature, rather than air contamination in normal or extreme conditions (e.g. fire).

A viewpoint discussed was whether the perception of direct fresh air as a green technology has driven a plethora of new and recently completed projects over the last few years, partly and in some cases entirely, without the risk being effectively evaluated. Furthermore, it was asked whether a 'green rush' syndrome was effectively blinding some organisations and preventing them from developing solutions that met their real needs, rather than chasing a perceived quick win. This is not always the case, and there are some excellent examples of well-designed systems, but there was broad agreement that some were simply looking for a tick in the box ('What is the PUE? Has it got fresh air cooling?') and failing to come up with a more considered solution, i.e. one with the necessary resilience and total operational performance now and throughout the facility's lifecycle.
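As a rough way of making the direct-versus-indirect comparison above tangible, the sketch below estimates annual free-cooling hours for each approach from hourly ambient temperatures. Everything in it is an assumption for illustration: the randomly generated temperature series, the 24°C supply setpoint and the 3°C heat-exchanger approach temperature would all be replaced by site climate data and manufacturer figures in a real comparison.

```python
import random

# Minimal sketch: estimate annual free-cooling hours for a direct system and
# an indirect (heat exchanger) system. The ambient temperature series and all
# thresholds are assumptions for illustration, not design figures.

random.seed(1)
# Crude stand-in for 8,760 hourly ambient dry-bulb temperatures (deg C),
# loosely representative of a cool climate.
ambient = [random.gauss(11, 6) for _ in range(8760)]

supply_setpoint_c = 24.0   # assumed maximum supply air temperature to the IT space
indirect_approach_c = 3.0  # assumed approach temperature across the heat exchanger

# Direct: free cooling is available whenever ambient air (after filtration and
# any mixing) is at or below the supply setpoint.
direct_hours = sum(1 for t in ambient if t <= supply_setpoint_c)

# Indirect: the heat exchanger adds an approach penalty, so ambient must be
# cooler by that margin before mechanical cooling can be held off.
indirect_hours = sum(1 for t in ambient if t + indirect_approach_c <= supply_setpoint_c)

print(f"Direct free-cooling hours:   {direct_hours} of {len(ambient)}")
print(f"Indirect free-cooling hours: {indirect_hours} of {len(ambient)}")
```

Raising the supply setpoint (i.e. the server inlet temperature) increases both counts, which is the effect on free-cooling hours noted below.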
Without all of the facts around risk, peak PUE, part-load performance and the difficulty of comparing different systems, the decision is often based simply on what an owner or operator wants.

It was agreed that both direct and indirect systems provide the ability to raise server inlet temperatures because of the way the air is controlled; for either approach this normally gives additional hours of free cooling, holding off the need for any supplementary mechanical cooling.

Calculating the risks

Some concerns were raised about how risk is calculated when opting for fresh air. It was suggested that a risk profile is often developed for individual data centres, but there is no standard approach because the impact is so different for each organisation. There was a feeling that such a process could be modelled, but it is fairly complex, so people simply do not undertake the exercise. Moreover, the general view was that risk modelling on pure direct fresh air, without a backup, does not really matter: if you need the data centre to stay up, you would not run direct fresh air without the necessary backup systems. The decision to use fresh air therefore depends very much on the application and on how an organisation calculates the risk.

There was also agreement with the figures provided by the Uptime Institute that human error accounts for around 75% of data centre downtime. On that basis, increased complexity can make a facility harder to manage and mistakes more likely. The decisions around design therefore need to factor in the simplicity of normal and failure operation, as well as how the system is to be maintained. This is likely to be a major factor in the availability of the overall system.
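As an indication of how such a risk profile could be modelled, the sketch below combines assumed external events with an assumed cost of downtime to give an expected annual cost of running direct fresh air without mechanical backup. Every event, frequency, duration and cost figure here is hypothetical; a real assessment would use site-specific and business-specific data.

```python
# A deliberately simple sketch of the kind of risk model discussed above:
# expected annual cost of running direct fresh air without mechanical backup.
# All events, frequencies, durations and costs are invented for illustration.

events = [
    # (description, expected occurrences per year, forced-shutdown hours per event)
    ("External smoke / fire nearby",       0.2, 4.0),
    ("Extreme temperature excursion",      1.0, 6.0),
    ("Airborne contamination (dust, etc)", 0.5, 8.0),
]

cost_per_hour_of_downtime = 50_000.0  # assumed business impact, GBP per hour

expected_annual_cost = sum(freq * hours * cost_per_hour_of_downtime
                           for _, freq, hours in events)

print(f"Expected annual downtime cost without backup: GBP {expected_annual_cost:,.0f}")
# Compare this against the annualised CAPEX and energy cost of full mechanical
# backup to judge whether the exposure is acceptable for the business.
```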

It is widely recognised that different organisations have very different approaches to risk and how they factor it into their decisions:

Large single-application businesses with multiple facilities can accommodate downtime in a given location, as they can serve their customer base seamlessly from another location.

Organisations with high-volume, critical applications or transactions cannot afford to lose any services, due to the financial and operational implications.

It was agreed that even organisations in the same industry can take very different approaches to data centre strategy and design, both when building their own data centres and when outsourcing to a third party. Regardless of approach, there was an overriding feeling that the calculation of risk is not becoming more sophisticated. Different people within the same business have different attitudes to risk, but few were aware of anyone actually calculating it. However, there is thought to be greater use of FMECA (Failure Mode, Effects and Criticality Analysis), which does provide an indication of what will happen if a certain set of events occurs.

Design considerations

Following the agreement that direct fresh air is going to require some sort of backup system in order to meet availability and customer risk requirements, it is worth considering what benefits might exist in opting for either a direct or an indirect design. Partly because of the different solutions in these two areas, and partly because there are other variables on a site-specific basis, there are not many clear benefits either way, but there are a few considerations:

An organisation requiring high availability is unlikely to install a direct fresh air system without 100% backup on the mechanical cooling. The risks associated with the unknowns of what could happen outside, however infrequent, are generally outside the operator's control.

Indirect systems pose little or no risk from external pollutants and contaminants.

Indirect systems do not require integration into the building fabric, whereas a direct system often needs large ducts or modifications to the shell. This can increase complexity and cost, if, given space or building height constraints, it is achievable at all.

Direct systems often require more humidification control, depending on which environmental ranges are to be met.

Most efficient systems include some form of adiabatic cooling. With direct systems there is often a reliance on the water to provide capacity rather than simply to improve efficiency. In this case there is a much greater reliance on water for normal operation and to maintain availability, which can lead to the need for water storage or other measures. The metric of WUE (Water Usage Effectiveness) needs to be considered (see the sketch after this list).

Many data centre facilities have been built with very inefficient cooling solutions. In such cases direct fresh air provides an excellent opportunity to retrofit and run as the primary method of cooling, with the existing inefficient systems as backup. As the backup system is already in place, this is often a very affordable option with a clear ROI.

One of the biggest advantages of an indirect system is the potential for zero refrigeration. Half of the USA could take this route, and even places people would never consider, such as Madrid or even Dubai. This inevitably requires the use of, and reliance on, large quantities of water, as well as acceptance of increased server inlet temperatures during warmer periods.

Density of IT equipment does not make any difference to the choice of direct or indirect design. It is the control of air and the method of air delivery within the space that dictates capacity and air volume requirements. There may be additional considerations for how backup systems and the control strategy for switching cooling methods work in high-density environments, because of the risk of rapid thermal rise over very short periods, but this is down to each individual design.
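A minimal sketch of the WUE point raised in the considerations above: WUE, as defined by The Green Grid, is annual site water usage divided by annual IT equipment energy, expressed in litres per kWh. The water, energy, evaporation-rate and autonomy figures below are assumptions for illustration only.

```python
# Minimal sketch of the WUE metric: annual site water usage divided by annual
# IT equipment energy (litres per kWh, as defined by The Green Grid).
# All figures are invented for illustration.

annual_water_litres = 6_500_000      # assumed water used by adiabatic/evaporative cooling
annual_it_energy_kwh = 3_300_000     # assumed 12-month IT equipment consumption

wue = annual_water_litres / annual_it_energy_kwh
print(f"WUE: {wue:.2f} L/kWh")

# A system that relies on water for capacity (not just efficiency) also needs
# stored water to ride through a supply failure; a simple sizing check:
peak_water_use_litres_per_hour = 2_000   # assumed peak evaporation rate
autonomy_hours = 24                      # assumed required autonomy
storage_litres = peak_water_use_litres_per_hour * autonomy_hours
print(f"Indicative on-site water storage: {storage_litres:,} litres")
```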

The chillerless data centre

The final area of discussion looked at the feasibility of chillerless data centre designs. It is recognised that this type of facility already exists, but they are not the norm and are often in climate-specific locations. The organisations operating them also have different profiles for the services they deliver and include the likes of Facebook, Google and Yahoo! In many of these cases, whilst they can operate most if not all of the year without chillers, they can also be shut down for a period of time, unlike most data centres in the world.

Moving away from single-application facilities, other areas without chillers include specialist HPC (High Performance Computing) systems, which may have low availability requirements and consume large amounts of power, and facilities such as switch sites housing telecoms equipment.

Generally the view was that there is a significant challenge, and no great desire, for a typical end user or commercial operator to move to this type of design in the UK and most of mainland Europe. Moving forward, there was agreement that chillerless solutions will start to become more widely used, especially where an indirect approach is taken and operators push the upper limits of server inlet temperatures. However, this is not going to happen quickly where tight Service Level Agreements are in place or the cost of downtime is high. Most likely a hybrid system will become the norm, with a focus on reducing the peak PUE combined with reducing the resilience of mechanical systems where they are only used for short periods of time.

If you would like to discuss any of the points raised in this white paper, please contact marketing at Keysource and we will be happy to put you in touch with the right specialist. You may also be interested in the second white paper from the round table, Raising Inlet Temperatures in the Data Centre.

We would like to thank: Phil Collerton - Managing Director EMEA at the Uptime Institute; Alfonso Aranda - Consultant at the Uptime Institute; Luke Neville - Senior Technical Lead, Design Authority, Data Centre Services (DCS) at Colt; Andy Lawrence - Research Director at The 451 Group; Jim McGregor - Head of Engineering and Data Centre Management at Vocalink; and Mike West - Managing Director of Keysource; as well as representatives from Operational Intelligence, Norland Managed Services and Fujitsu, for joining us and sharing their expertise.

Keysource focuses on the full cycle of infrastructure technology in the data centre through consultation, design, build, ongoing management and optimisation. When high-performing, resilient environments are needed to meet the latest demands of high-density IT and cloud technology, Keysource is trusted to simplify the delivery of innovative data centre and integrated management solutions. We deploy the latest IT technology and reduce power consumption and operating costs for our customers through our own high-performance cooling solution, ecofris.

T: +44 (0)845 2043333 E: enquiries@keysource.co.uk W: www.keysource.co.uk

Keysource has delivered award-winning data centres for some of the most established and innovative brands and organisations in the UK and globally. Our openness, honesty and vendor neutrality mean you'll get a straight, uncomplicated answer and the right solution for your business.

The Use of Fresh Air in the Data Centre white paper is published by Keysource (Copyright Keysource Ltd). All rights reserved. No part of this publication may be reproduced, copied or transmitted in any form or by any means, or stored in a retrieval system, without the prior permission of Keysource, except as permitted by the provisions of the Copyright, Designs and Patents Act 1988 and related legislation. Applications for permission to reproduce all or part of the copyright material should be made to Keysource Ltd, North Heath Estate, Horsham, West Sussex, RH12 5QE. Although the greatest care has been taken in the preparation and compilation of The Use of Fresh Air in the Data Centre white paper, no liability or responsibility of any kind (to the extent permitted by law), including responsibility for negligence, is accepted by Keysource or its agents. Keysource has reported the views and opinions of participants of the round table and does not endorse these views unless specifically stated. All information is believed correct at the time of publication, but there is no assurance or guarantee with respect to its accuracy.
