Capgemini UK Infrastructure Outsourcing




Together. Free your energies

Capgemini UK Infrastructure Outsourcing
Project Arthur: Improved Data Centre Efficiency
Paul Feeney, Capgemini
Tony Mills, SPIE Matthew Hall

Project Arthur

The Toltec data centre in the West Country was fast approaching its 1990s design capacity limitations. The infrastructure therefore required upgrading to meet existing and future customer demands, as well as to maintain and improve its resilience and efficiency. Arthur is the £9m project to refit Capgemini's existing Toltec data centre in Bristol with 'Merlin' and other new technology in order to achieve significant advances in efficiency, sustainability, capacity and flexibility. The 9-month project was completed in October 2011, and measured impacts (including a PUE reduction from 1.88 to 1.45) are already meeting the ambitious objectives that were set. Our environmental performance is a key dimension of this programme, with corporate environmental objectives publicly set in 2008 and ISO 14001 certification achieved for our UK business in 2009.

Reasons for Refurbishment

Before Arthur, measurement of energy consumption was cumulative, so it was not possible even to distinguish between office and data centre loads, or between the IT equipment and the underlying items of basic plant supporting the IT. To understand our power consumption fully and to adhere to The Green Grid (TGG) metrics, electrical separation and measurement was identified and implemented as part of the upgrade programme. We also adopted 'enclosed cold aisle' principles, with elevated return air temperatures and pressure-based fan speed controls, to minimise energy consumption. In addition, we developed a top-down cooling principle, whereby the energy consumed in rejecting heat was biased heavily towards our energy-efficient external fans.
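A minimal sketch of how pressure-based fan speed control of this kind can work is below. The setpoint, gain, limits and function names are illustrative assumptions, not values taken from the project; only the +8/12 Pa cold-aisle pressure range quoted later in the deck informs the choice of setpoint.

```python
# Illustrative sketch of pressure-based CRAC fan speed control.
# Setpoint, gain and speed limits are assumptions for illustration,
# not the values used at Toltec.

PRESSURE_SETPOINT_PA = 10.0          # target cold-aisle over-pressure (deck quotes +8/12 Pa)
KP = 2.0                             # proportional gain: % fan speed per Pa of error
MIN_SPEED, MAX_SPEED = 30.0, 100.0   # fan speed limits (%)

def fan_speed(measured_pressure_pa: float, current_speed: float) -> float:
    """Nudge fan speed so cold-aisle pressure tracks the setpoint.

    If aisle pressure sags below the setpoint (racks drawing more air),
    speed the fans up; if it rises above, slow them down.
    """
    error = PRESSURE_SETPOINT_PA - measured_pressure_pa
    new_speed = current_speed + KP * error
    return max(MIN_SPEED, min(MAX_SPEED, new_speed))

# Example: aisle pressure sags to 8 Pa, so the controller raises fan speed.
print(fan_speed(8.0, 70.0))  # -> 74.0 (%)
```

Controlling on aisle pressure rather than temperature alone means the fans deliver only as much air as the IT load is actually drawing, which is where much of the fan energy saving comes from.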

What we set out to achieve:
- Reduce Toltec's infrastructure energy consumption and CO2 emissions in line with corporate objectives: 20% by 2014.
- Ensure the infrastructure can support the planned growth in new business: 750 W/m² to 1000 W/m².
- Increase infrastructure resiliency to comply fully with Tier 3 requirements: fault-tolerant infrastructure.
- Reduce energy bills and thus enable Capgemini to offer more cost-competitive services to clients: removal of 2 × 900 kW chillers.
- Enable synchronous connectivity with the new Merlin data centre in Swindon: dark fibre completion.

Operationally Live, Mission Critical: Business Reasons Why Arthur Should Win, No. 1

Toltec is a live, operational data centre, with thousands of servers running 24x7x365, serving hundreds of clients from government departments to major multinationals. Unplanned downtime to services was not permitted at any point during the project. Mainframes, SANs, tape and patch: any shape, any size.

Reuse, Refit, Reorganise: Reasons Why Arthur Should Win, No. 2

Efficient data centres are not always brand-new projects. Arthur achieved the successful transformation of a live data centre from an inefficient legacy facility to a modern, efficient one. This should inspire future projects by demonstrating what can be achieved without the need for an expensive, risky IT migration to a new location. Nitrogen pipe freezes (200 mm) × 84; mechanical hot taps × 14.

Top-Down Thinking: Reasons Why Arthur Should Win, No. 3

Driving greater efficiency through the use of fan speed as well as temperature control, with reduced peak ambient return water temperature (adiabatic water-spray assisted). Re-use of this proven Merlin technology enabled significant cost savings.
- Free cooling for approximately 2300 hrs p.a. (27%)
- Mixed-mode cooling for approximately 6360 hrs p.a. (72%)
- Full compressor run for approximately 100 hrs p.a. (~1%)
- Free cooling achievable with return condenser water up to 14.5 °C
- Return air from data hall 36 °C; supply air into floor void 23 °C; fans at 100% all year
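A minimal sketch of the mode-selection logic these figures imply is below. The 14.5 °C free-cooling threshold is the one quoted above; the mixed-mode upper bound and the function itself are illustrative assumptions, not details from the Toltec design.

```python
# Illustrative sketch of free / mixed / compressor cooling mode selection.
# FREE_COOLING_MAX_C comes from the slide; MIXED_MODE_MAX_C is an
# assumed figure for illustration only.

FREE_COOLING_MAX_C = 14.5  # free cooling possible up to this return condenser water temp
MIXED_MODE_MAX_C = 24.0    # assumed: above this, compressors carry the full load

def cooling_mode(return_condenser_water_c: float) -> str:
    """Pick the cooling mode from the return condenser water temperature."""
    if return_condenser_water_c <= FREE_COOLING_MAX_C:
        return "free"        # dry coolers only, no compressors (~2300 hrs p.a.)
    if return_condenser_water_c <= MIXED_MODE_MAX_C:
        return "mixed"       # partial compressor assistance (~6360 hrs p.a.)
    return "compressor"      # full compressor run (~100 hrs p.a.)

for temp_c in (10.0, 18.0, 27.0):
    print(temp_c, cooling_mode(temp_c))
```

Note that the three annual durations quoted above sum to 8760 hours, a full year, which is why the compressors end up running for only around 1% of the time.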

Operationally Complex, Client Challenging: Reasons Why Arthur Should Win, No. 4

The sheer scale and complexity of the project involved re-engineering 1000 m² of existing IT floor space into hot and cold aisles to enable energy efficiencies, and moving thousands of servers and hundreds of racks, alongside a new 1.0 MW electrical upgrade and complex system engineering integration and test. Cold aisles: temperature sensors; pressure sensors at +8 Pa/12 Pa; supply temperature at 24 °C; return temperature at 36 °C.

Best Practices
- Energy efficiency to be improved by 20%: achieved!
- No batteries: achieved!
- Green Grid annual PUE of 1.5 or better when fully loaded, and from day 1: achieved!
- Cooling load of 1000 W/m², with the ability to expand to 2000 W/m² without disruption to service: achieved!
- Maximum use of free/mixed cooling, at least 99% of yearly operation: achieved!
- Concurrent maintainability, no breaks to service for standard maintenance activities: achieved to date!
- Significant (20%+) cut in water use: achieved!
- Environmental certification to ISO 14001: achieved!
- Metering and energy measurement to meet the highest EEC requirement for accuracy: achieved!
- Redundancy: N+N cooling solution, N+N power solution, N+1 BMS.
- Load efficiency: flywheel UPS, thermal load balancing.
- Cost minimisation: by maximising free cooling within the CRAC units and running the dry cooler fans at 100%.
- Management: new BMS monitoring and reporting on all critical plant; electrical cost separation of IT and non-IT load (via metering).

Design & Technical Challenges vs Requirements

[Chart: 'Cooling Upgrade & Cold Aisle Impact' plots total kWh used by the data centre against total kWh used by the technical floor (cooling, lighting, losses etc.) from January 2010 to November 2012, marking the change-over point from old to new cooling and the completion of cold aisle no. 9.]

[Chart: 'Cost Reduction Profile' plots total monthly IT costs (cooling, lighting, losses etc.) over the same period; completion of cold aisle no. 9 brought an estimated monthly cost saving of £17k.]

Advice to another organisation planning a similar facility
- Strong governance in place to ensure effective business representation, communication and input to the project.
- A rigorously applied change process to ensure appropriate impact assessment and remove the risk from change.
- An iterative migration review process to identify and drive out risks from the project.
- Appointing a prime contractor who embraced our requirement for service continuity and who worked with Capgemini from design through to final IST, ensuring we met our goals without disruption to our business or our clients.
- There are a lot of efficiencies to be gained in existing data centres, largely by improving cooling and air flow management in a systematic way.

CAPGEMINI TOLTEC DATA CENTRE
- Free cooling period (no compressors running, dry cooler fans at 100%), approximately 2300 hrs p.a.: approximately 494,040 kWh energy saving per annum, equivalent to 227,259 kg of CO2 emissions saved (based on an average of 0.46 kg CO2/kWh).
- Mixed-mode cooling period (partial compressors running, dry cooler fans at 100%), approximately 6300 hrs p.a.: approximately 165,810 kWh energy saving per annum.
- Full compressor run: approximately 100 hrs per year (~1% p.a.).
- Total power consumption from compressors: approximately 1,237,518 kWh per annum.
- Power consumption from dry cooler fans at 100% load: 956,592 kWh per annum.
- Approximate energy saved: 1,383,474 kWh per annum.
- Approximate CO2 emissions saved: 636,398 kg per annum.
Data provided by Hurley Palmer Flatt.
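The CO2 figures follow directly from the quoted 0.46 kg CO2/kWh grid factor; a quick arithmetic check:

```python
# Check the CO2 savings against the quoted 0.46 kg CO2/kWh grid factor.
CO2_PER_KWH = 0.46  # kg CO2 per kWh, as quoted on the slide

free_cooling_kwh = 494_040   # annual saving during free cooling
total_saved_kwh = 1_383_474  # total annual energy saving

print(round(free_cooling_kwh * CO2_PER_KWH))  # 227258 kg, ~ the 227,259 kg quoted
print(round(total_saved_kwh * CO2_PER_KWH))   # 636398 kg, matching the quoted figure
```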

Work done
- Replace old chillers with new dry coolers (1800 kW)
- Replace old CRAC upflow units with new downflow units (1300 kW)
- Install new HV switchgear plus 2 × transformers (TXF)
- Install new switchboard
- Install 2 new 625 kVA UPS
- Install 3 new 1800 kVA generators
- Replace the existing fire system
- Replace the existing BMS system
- Remove redundant legacy equipment
- 84 pipe freezes on live systems
- 14 hot pipe connections
- Re-align/turn around over 250 customer racks
- Restack over 100 servers
- Install over 500 m of cold aisles
- IST test the new electrical system
- IST test the new mechanical system
- IST test the new fire control system
- Remove legacy cause & effect plant shutdowns
- Re-align existing lighting and install new lighting
- Replace 500 floor grilles
- Relocate data cabling and tray works
- Migrate across dual cooling systems
- Implement pressure controls

PUE reduction
2010: 2.0 MW site load (500 kW cooling, 200 kW electrical, 200 kW office), 1060 kW IT load. PUE = 2000/1060 = 1.88.
2012: site load estimated at 1600 kW (300 kW cooling, 100 kW office). PUE = 1600/1060 ≈ 1.51.
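PUE is total facility power divided by IT equipment power (the Green Grid definition); a minimal check of the slide's own figures:

```python
# PUE = total facility power / IT equipment power (Green Grid definition).
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

print(round(pue(2000, 1060), 2))  # 2010: 1.89 (the slide quotes 1.88, truncating)
print(round(pue(1600, 1060), 2))  # 2012: 1.51
```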

Summary, Q and A