Energy Efficient Data Centre at Imperial College M. Okan Kibaroglu IT Production Services Manager Imperial College London 3 March 2009

Contents
- Recognising the Wider Issue
- Role of IT / Actions at Imperial
- Data Centre actions based on Burton Group recommendations
- Data Centre at Imperial
- CO2 Cooling / How does it work?
- Nearly three years in operation: Reliability / Issues
- Analysis of Efficiency
- Conclusion

Recognising the Wider Issue: College-Wide Initiatives
An active approach through a College-wide programme:
- Upgrading heating plants and insulating heating pipe-work halves the amount of CO2 released
- Centralising individual A/C units using more efficient absorption chillers
- Phased introduction of a 10% voltage reduction, saving 500 tonnes of CO2 each year
- Improved metering of power across the campus
- Energy from renewable resources: warm and cool water boreholes
- Focusing on recycling: food miles, further paper waste, reuse of building rubble, transport policies, ecology, etc.
Imperial College London is a world-leading science-based university whose reputation for excellence in teaching and research attracts students (11,000) and staff (6,000) of the highest international quality.

Role of IT: What has been done so far?
- Use of LCD monitors / CRT replacement initiative
- Power saving activated on PCs
- Recycling of PCs
- Remote working: provision of voice/tele-conferencing, increased management support for home working, OCS infrastructure made available for on-line conferencing
- Distributed unified printing facility: number of desktop printers reduced; managed idle power-off enhances savings
- New network edge switches are the most efficient in their category
- New equipment complies with the green policy guidelines of the College
- Data Centre initiatives: centralised hosting of group servers

Burton Group Recommendations for Data Centres
1. Employ high-density options: blade servers, virtualisation
   - 500 blades for HPC, plus 40 for Business Systems so far
   - Virtualisation policy: virtual clients by default
2. Consolidate storage space
   - College SAN in place for seven years
   - Storage virtualisation
3. Energy-efficient hardware such as multi-core CPUs
   - Standard server hardware
4. Hot aisle / cold aisle layout
   - New Data Centre implemented this way
   - Recently also implemented a boxed cold aisle
5. Use liquid cooling
   - Efficient, because cooling activity is localised and enclosed
   - The volume of CO2 to move around is much less

New Data Centre in 2006
- HEFCE funding for a new e-Science suite for high-density / high-performance computers
- Obvious candidate: liquid cooling
- Implemented between 2004 and 2006
CO2 Mission-Critical Cooling
- Designed/tested in summer 2005
- Installed October to December 2005
- Server occupation April to August 2006
Basic characteristics:
- 300 kW roof-mounted air-cooled condenser
- 15 x 20 kW rack-mounted coolers
- Designed for 1 MW server capacity, 750 square metres, 30 CO2 + 150 standard cabinets
- Also included standard water-cooled chillers at 2 x 520 kW capacity, using efficient screw compressors running in active/passive mode
[Plant schematic: Turbocor compressors, expansion valve, CO2 H.P. receiver, CO2 liquid pumps, CO2 circuit, plate-and-shell CO2 condenser, wet return]

CO2 Cooling Units [photo: blade-populated racks]

CO2 Racks: Design Basis
- Closed system: CO2 circulating at 14 °C, above the dew point, so no condensation
- Air drawn into the cabinets by the blade fans; inlet temperature is ambient
- Server air discharge temperature 35/40 °C
- Discharged air drawn through the cooler; heat load absorbed in the heat exchanger by partial vaporisation of liquid CO2
- Air discharge temperature back to ambient
- Controlled access
- 3-speed fans, N+1, hot-swappable, with UPS back-up
- Off-rack temperature display; various displays, indicators, alarms (fan failure, CO2 discharge, BMS connection)
[Diagram: air into cabinet at 22 °C, server discharge at 35/40 °C, air out of cabinet at 22 °C]
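The "partial vaporisation" step on this slide can be put into rough numbers. A minimal sketch, assuming a latent heat of vaporisation of about 170 kJ/kg for CO2 near 14 °C (an assumed figure for illustration, not stated in the presentation):

```python
# Sketch: liquid CO2 mass flow needed to absorb one rack's heat load
# purely by partial vaporisation. The slide gives the 20 kW rack cooler
# rating and the 14 C circulation temperature; the latent heat value
# below is an assumption, not a figure from the presentation.

RACK_LOAD_KW = 20.0             # rack-mounted cooler rating from the slide
LATENT_HEAT_KJ_PER_KG = 170.0   # assumed h_fg for CO2 near 14 C

def co2_mass_flow(load_kw: float, h_fg: float = LATENT_HEAT_KJ_PER_KG) -> float:
    """Mass flow (kg/s) if the whole load is taken up as latent heat."""
    return load_kw / h_fg

flow = co2_mass_flow(RACK_LOAD_KW)
print(f"~{flow:.3f} kg/s of liquid CO2 per 20 kW rack")  # ~0.118 kg/s
```

The small mass flow is one reason the slide can say the volume of CO2 to move around is much less than with air or chilled water.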

CO2 Cabinet Access
- Access via rear hinged door
- Server heat load: ~60/70% discharged into the cooler, ~30/40% absorbed into the aisle

Nearly Three Years in Operation: Reliability / Issues
- Remote monitoring by TROX used to detect and resolve issues
- One of the four pumps was replaced without disrupting the service (N+1)
- London plane trees dusting the cooler inlets with their seeds caused a shutdown; the inlets are regularly combed out during allergy season
- One major failure caused an abrupt increase of ambient temperature; the fans continued keeping air flowing
- A couple of door leaks over three years, detected through the drop of internal CO2 pressure; not serious enough to trigger the CO2 alarm; racks were isolated and doors fixed without disruption to the rest of the system
"The system has proved extremely reliable and support from Star has been good." Steve Lawlor, Data Centre Manager

Chillers: Power Consumption Compared
- CHW power consumption ranges between 150 and 200 kW; overall average 175 kW (±15%)
- CO2 power consumption ranges between 25 and 50 kW; overall average 35 kW (±30-40%)

Overall Power Consumption and DCiE
[Chart: total power input to the DC (rooms total, Room 1, Room 2) and DCiE, plotted monthly from 18/07/2008 to 18/02/2009]
Total UPS power consumption of the DC is around 67 kW
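DCiE, the metric charted on this slide, is the fraction of total facility power that reaches the IT equipment (PUE is its reciprocal). A minimal sketch with illustrative placeholder numbers, not readings taken from the chart:

```python
# DCiE (Data Centre infrastructure Efficiency):
#   DCiE = IT equipment power / total facility power
# PUE is simply 1 / DCiE. The inputs below are hypothetical.

def dcie(it_power_kw: float, total_facility_power_kw: float) -> float:
    """Fraction of facility power delivered to the IT load."""
    return it_power_kw / total_facility_power_kw

it_kw, total_kw = 600.0, 900.0  # illustrative loads, not from the slide
d = dcie(it_kw, total_kw)
print(f"DCiE = {d:.2f}, PUE = {1 / d:.2f}")  # DCiE = 0.67, PUE = 1.50
```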

Energy Analysis: Chilled Water vs. CO2
System comparison:
- System 1: 75% CO2 + 25% CHW CRAC
- System 2: 100% CHW CRAC
Assumptions: 24x7x365 peak equipment loads; solar and fabric gains omitted; humidifier loads excluded.

  System duty (kW)   System 1: CO2 (kW power / kW cooling)   System 2: Chilled water (kW power / kW cooling)
  250                0.369                                    0.506
  500                0.344                                    0.496
  750                0.326                                    0.499
  1000               0.327                                    0.502

Data courtesy: hurleypalmerflatt. This suggests around 30% better efficiency with CO2 cooling versus chilled water.
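The quoted ~30% figure can be recomputed from the table itself. The sketch below averages the CO2-to-chilled-water saving across the four duty points:

```python
# Recomputing the CO2 advantage from the hurleypalmerflatt comparison:
# specific power consumption (kW power per kW cooling) at each duty point.

duties = [250, 500, 750, 1000]          # system duty, kW cooling
co2_kw = [0.369, 0.344, 0.326, 0.327]   # System 1: CO2
chw_kw = [0.506, 0.496, 0.499, 0.502]   # System 2: chilled water

# Fractional saving of the CO2 system at each duty point
savings = [1 - c / w for c, w in zip(co2_kw, chw_kw)]
avg = sum(savings) / len(savings)
print(f"average saving: {avg:.1%}")  # average saving: 31.8%
```

Averaging the four points gives roughly 32%, consistent with the slide's "around 30% better efficiency" claim.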

Cooling Efficiency of CO2 at Imperial
Efficiency calculated as cooling power per server kW. The water-cooled chiller serves 13 CRAC units in total.
- Traditional cooling only, Room 2 (8 CRAC units): cooling power = 175 x (8/13) = 108 kW; efficiency = 108 / 200 = 0.54
- CO2-coupled cooling, Room 1 (5 CRAC units): cooling power = 175 x (5/13) = 67 kW; efficiency = (67 + 35) / 300 = 0.34
This works out to around 40% better efficiency.
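The slide's arithmetic can be reproduced directly. The sketch below apportions the 175 kW chilled-water plant across the 13 CRAC units as the slide does:

```python
# Reproducing the slide's cooling-efficiency calculation: cooling power
# per server kW for the traditional room vs. the CO2-coupled room.

CHW_PLANT_KW = 175.0   # average chilled-water plant consumption
TOTAL_CRACS = 13
CO2_PLANT_KW = 35.0    # average CO2 system consumption

# Room 2: traditional cooling only, 8 CRAC units, 200 kW of servers
room2_cooling_kw = CHW_PLANT_KW * 8 / TOTAL_CRACS   # ~108 kW
room2_eff = room2_cooling_kw / 200.0                # ~0.54

# Room 1: CO2-coupled cooling, 5 CRAC units plus the CO2 plant, 300 kW of servers
room1_cooling_kw = CHW_PLANT_KW * 5 / TOTAL_CRACS   # ~67 kW
room1_eff = (room1_cooling_kw + CO2_PLANT_KW) / 300.0  # ~0.34

print(f"Room 2: {room2_eff:.2f} kW cooling per server kW")
print(f"Room 1: {room1_eff:.2f} kW cooling per server kW")
print(f"improvement: {1 - room1_eff / room2_eff:.0%}")  # improvement: 37%
```

The exact ratio comes out near 37%, which the slide rounds to "around 40%".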

Conclusions
- Space savings: server density roughly doubled (about half the space)
- Power savings: used in conjunction with a chilled water system, bringing about 25-45% better efficiency
- Good reliability record
- Design should be carefully thought through for mission-critical systems: 1+1 redundancy would be expensive; consider the mix of production / non-production

Questions.

Enclosed Cold Aisle