Greening Commercial Data Centres




- Fresh air cooling giving a PUE of 1.2 in a colocation environment
- Greater efficiency and greater resilience
- Adjustable overhead supply allows variation in rack cooling requirements
- Successfully cools high density racks

Leonard Kay, Custodian Data Centre
Leonard@CustodianDC.com
www.custodiandc.com

Introduction

Custodian Data Centre is based in Maidstone Television Studios in Kent. We are holders of the Green Data Centre award, which is particularly significant because it's in a colocation environment, where we have to meet a wide range of power requirements. "Green" is a slightly overused term, but what we can offer is a practical, efficient and resilient option for regular data centres. It isn't rocket science, it is common sense, and it gives us an annualised PUE figure of sub 1.2.

How did Custodian come about?

The Television Studios have been broadcasting live television since the early 80s. A very different industry, but the back end of live TV has some striking similarities with the requirements of a data centre. There is an obvious always-on requirement for redundancy and failover, which is built into all elements of the infrastructure. The site has switchable twin grid feeds from different sections of the grid, 3 substations on site, and UPS- and generator-backed systems. The on-site fuel tanks can supply enough fuel for the generators to run for 240 hours.

There have always been comms rooms and small data centres on site. A growing demand for colocation, and a need to diversify to make use of the site's excess power, resulted in the birth of Custodian. Initially a standard, small (300 sq ft) data centre was built. A growing number of requests for colocation resulted in further investigation into the industry, which showed the site was perfect:

- It had an excess of power, hard to come by elsewhere

- The building had 1,000s of sq ft of convertible space, with an easily adaptable column and slab structure
- The existing power and fibre duct infrastructure was already in place
- It was just 35 miles from Docklands, on J7 of the M20
- Carriers were requesting to POP the site
- It already had 24-hour manned security, 30,000 sq ft of office space for DR, ample parking, a restaurant and a host of other facilities

So the decision was made to build a full commercial, proof-of-concept data floor. From the very outset, we knew that in a commercial environment our margins would be largely dictated by our efficiency.

How we developed our Data Centre

There are a large number of factors to consider when constructing a data centre, and we tried to take a holistic, future-proofed, realistic view across them all. Although our cooling system is the subject of this paper, its development and design is of course intertwined with other areas of our infrastructure:

Power: the most efficient, modular UPS, offering N+1 and delivering dual feeds as standard to all racks. Power was a weakness of many competitors: either they couldn't get it, their legacy infrastructure couldn't supply it, or they couldn't cool the equipment if they did supply it. We designed our floor to take an average power requirement of 16 Amps per rack, with 64 Amps available for both power and cooling.

Fire systems: we use Argonite gas suppression, to our knowledge the most environmentally friendly solution. The gas suppression system had to be taken into account when considering fresh air cooling, as the room has to go closed loop, otherwise the gas will escape.
As we approached this project, it became clear that we had a number of advantages:

- The opportunity to use the latest technology
- We could take a fresh approach, with no brand to risk
- We had total flexibility within our own building
- A large amount of power was available

Fresh Air versus Closed Loop

With regard to the cooling system, we interviewed a number of leading data centre industry suppliers, but we also brought in non-industry air conditioning specialists. Although there is talk of fresh air cooling everywhere now, the industry suppliers were still pointing us at closed loop systems with traditional CRAC units. I think it's a case of "we've always done it this way". When we looked at fresh air cooling, we were actually told that there would be issues with pollution and humidity, and that the filters were expensive and would cost a fortune to replace, which would offset any savings. But our non-industry people told us this wouldn't be the case, as it has proved.

The simple fact was that we were in an area where the outside temperature is below 21 degrees C for almost 95% of the year. So if we could use outside air for cooling, we would only need to operate our chillers for around 5% of the year.

Advantages of Fresh Air Cooling

- Lower power requirements, hence sub-1.2 PUE
- Greater reliability and higher resilience. It cannot superheat like a closed system; our worst-case scenario is multiple chiller failure on a hot day, when we bring in 28 degrees C air from outside.
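To put the 95% figure in context, a quick back-of-the-envelope calculation (a sketch using the quoted percentage, not measured chiller run-time data) shows how few chiller hours that leaves in a year:

```python
HOURS_PER_YEAR = 8760  # 365 days x 24 hours

# The site reports outside air below 21 degrees C for ~95% of the year,
# so mechanical chilling is only needed for the remaining ~5%.
free_cooling_fraction = 0.95
chiller_hours = HOURS_PER_YEAR * (1 - free_cooling_fraction)
print(round(chiller_hours))  # ~438 hours of chiller operation per year
```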

Overhead Supply versus Under-floor

The next major decision was the method of supplying the air. The traditional method, as you know, is under-floor, which again was recommended to us. This is where the non-industry air conditioning people really came in: why try to push cold air up? We were fortunate to have a lot of headroom, which allowed us to run the ducting. The area we used had a raised floor, but the calculations showed that it would need to be significantly deeper to give the airflow needed for 16 Amp+ racks. We also needed our system to be modular and variable, to allow for the requirements of different clients. Under-floor systems struggle to cope with variations in rack heat load, giving hot spots.

Advantages of Overhead Supply

- Delivers the amount of air where you need it; under-floor is one zone
- Cold air falls, giving uniform temperature
- No obstructions or heating from power cables
- No leakage from tiles
- Enables balanced air flow, with variable fans and adjustable vents to match equipment needs

So the result of all of the research and design was our fresh air, high capacity, overhead, tuneable cooling system. As a system it is more reliable, more efficient and more flexible than traditional systems, and works as follows (see Diagram 1):

Diagram 1.

- Outside air is brought in and mixed as required to give the correct temperature.
- The fresh air is filtered to controlled atmosphere levels. The grade of filters was chosen to give a balance between the level of filtration and the amount of power used by the fans.
- The fans have variable drive and are UPS powered, which gives a greater degree of resilience.
- Air then enters the ducting and is fed via adjustable vents in front of the racks. This allows us to increase air flow for hotter racks.
- As the cold air is separated from the contained hot aisles, it falls to give uniform temperatures down the front of the racks.
- Air is drawn through the equipment and into the contained hot aisle.
- It is then drawn up into the ceiling plenum, where it is used to re-mix if the incoming air is too cool, extracted outside, or fed back into the building for heating in winter.
- Atomizers introduce moisture into the warm mixing air when relative humidity drops.
- When the outside temperature approaches 21 degrees, first the hot air mix is reduced, then the fan speeds are brought up. Finally the chillers are brought online, but only to bring the temperature of the outside air down by the necessary amount.
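The staging described above, with the chillers strictly the last resort, can be sketched as a simple control policy. This is a hypothetical illustration: the setpoint and the staging margin are assumptions, not Custodian's actual BMS logic.

```python
def cooling_stages(outside_temp_c, supply_setpoint_c=21.0, fan_margin_c=2.0):
    """Return the cooling actions, staged cheapest-first, for a given
    outside air temperature. Thresholds are illustrative assumptions."""
    stages = []
    if outside_temp_c < supply_setpoint_c:
        # Outside air is cooler than the setpoint: blend in hot-aisle
        # return air to warm the supply up to the setpoint. No chillers.
        stages.append("mix hot-aisle return air down to setpoint")
    else:
        # Stage 1: stop re-mixing warm return air.
        stages.append("reduce hot air mix to zero")
        # Stage 2: raise the variable-drive fan speeds to move more air.
        stages.append("increase fan speed")
        if outside_temp_c > supply_setpoint_c + fan_margin_c:
            # Stage 3 (last resort): chillers trim the outside air down
            # to the setpoint, and no further.
            trim = outside_temp_c - supply_setpoint_c
            stages.append(f"chillers trim {trim:.1f} C")
    return stages

print(cooling_stages(15.0))  # free cooling only
print(cooling_stages(24.0))  # all three stages engaged
```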

Based on enthalpy, the system can go closed loop if it decides that less energy will be used cooling the hot air already in the data centre than removing moisture from the outside air on a humid, hot day. The system may also need to go closed loop if there is an outside air pollution problem (e.g. an ash cloud). In the event of fire, the system goes closed loop to avoid the suppression gas going out of the vents.

Image: overhead ducting during construction

- Shows one of 5 interconnected ducting legs, interlinked for resilience. Each leg has its own AHU and chiller.
- It shows the adjustable outlets which take the air through ceiling vents to the racks.
- The system includes gas dampers, which are deployed in case of fire.

Image: racks showing hot aisle containment

- Shows a simple solution for hot aisle containment which allows for variable rack heights, should customers want to use their own racks.
- The curtains have since been brought forward to further prevent air recirculation.
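The enthalpy decision above amounts to comparing the specific enthalpy of the two candidate air streams and conditioning whichever is cheaper. The sketch below uses the standard psychrometric approximation for moist-air enthalpy; the decision rule and the sample humidity ratios are illustrative assumptions, not the actual control implementation.

```python
def moist_air_enthalpy(temp_c, humidity_ratio):
    """Approximate specific enthalpy of moist air in kJ per kg of dry air,
    using the standard approximation h = 1.006*T + w*(2501 + 1.86*T),
    where w is the humidity ratio (kg water vapour per kg dry air)."""
    return 1.006 * temp_c + humidity_ratio * (2501.0 + 1.86 * temp_c)

def prefer_closed_loop(outside_t, outside_w, return_t, return_w):
    """Go closed loop when the hot-aisle return air carries less total
    heat (sensible + latent) than the outside air, i.e. lower enthalpy."""
    return (moist_air_enthalpy(return_t, return_w)
            < moist_air_enthalpy(outside_t, outside_w))

# A humid 28 C day versus dry 35 C hot-aisle return air: the dry return
# air is cheaper to cool despite being hotter, so go closed loop.
print(prefer_closed_loop(28.0, 0.018, 35.0, 0.008))
```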

Air Handling Unit

Five of these are deployed, one on each leg of the overhead ducting. The air is drawn in at the base, mixed if necessary with hot air (in the mixing chamber), and pulled up through the filters.

Colocation: pleasing all of the people, all of the time

As a colocation provider we needed a system that was totally flexible and modular:

- Each client has their own needs
- We can't predict the requirements of new clients
- We get to set the SLA parameters: ours has an upper limit of 26 degrees C
- The cooling system has to be variable to deal with hot racks
- Clients are required to use blanking plates

PUE

Our PUE was calculated in accordance with the recommendations of the Data Center Metrics Coordination Taskforce, sponsored by 7x24 Exchange, ASHRAE, The Green Grid, the Silicon Valley Leadership Group, the U.S. Department of Energy Save Energy Now Program, the U.S. Environmental Protection Agency's ENERGY STAR Program, the United States Green Building Council, and the Uptime Institute. We appreciate that PUE is a snapshot measurement and has its weaknesses. For example, it can be lowered by increasing the accepted room temperature, which makes the server fans work harder, moving the load onto the other side of the formula.
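As a reminder, PUE is simply total facility energy divided by IT equipment energy, so a sub-1.2 figure means less than 0.2 kWh of overhead per kWh of IT load. The arithmetic is sketched below with invented illustrative figures, not Custodian's measurements:

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power Usage Effectiveness: total facility energy over IT energy.
    A PUE of 1.0 would mean every kWh goes to the IT load."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative annual figures: 1,000 MWh of IT load plus 190 MWh for
# cooling fans, UPS losses and lighting gives an annualised PUE of 1.19.
print(pue(1190.0, 1000.0))
```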

Green Data Centre award

We won the award because the judges saw that we were passionate about efficiency, and were actively doing something about it in a real-world environment. Our facility is proof that there is a simple, proven solution to one of the most obvious areas of data centre energy consumption: inefficient cooling. Our system of using excess heat from the data centre in the winter to heat our building added to our case.

To efficiency and beyond

We are constantly working on our efficiencies, and have made a series of improvements. For example, our hot aisle containment curtains were allowing hot air to be drawn into the cold aisle, so they have been moved forward on the racks and modified. We are also looking at fresh air cooling the hot air plenum to reduce heat pick-up from the ducting; so far this has meant increased fan speeds to counteract the temperature rise.

Visiting Custodian

We are happy to arrange tours of the facility, or to answer further questions regarding our infrastructure.