Introducing UNSW's R1 Data Centre




Introducing UNSW's R1 Data Centre, R1-DC. Our new high-density data centre. Right here, right now.

- Welcome to our newest UNSW Data Centre, R1-DC.

R1 location on the UNSW Randwick campus. Keeping it in the family: managing our real costs.
- Located in the historic tramways building on the Randwick campus
- Recycles an old building and minimises footprint and environmental impact
- Opportunity for a solar connection to the north-facing roof
- Convenient but distinct from the Kensington Campus
[Photos: Randwick Campus showing the new R1 Data Centre; front entrance to the new R1 Data Centre; the old Randwick Tram Depot with the R1 building in the background]

Two halls: separate zones, shared mechanical infrastructure.
- West Hall is 350 m2 with room for two PODs totalling 60 racks, plus space for 15 racks of supporting infrastructure (UPS, disk and tape).
- East Hall is 300 m2 for expansion, with room for 60 additional racks plus space for 15 racks of supporting infrastructure (UPS, disk and tape).
- The West Hall is currently fitted out. Air and electrical capacity is sufficient to expand into the East Hall with the addition of another chiller, some water pumps and the East Hall switchboard.
- An increase beyond 1 MW requires an additional feed from the existing campus substation, and a generator for redundant capability.
[Floor plan: R1 Data Centre showing the West Hall, East Hall, generators, chillers and possible UNSW-BCP expansion]

The bigger picture: everything designed for future modular expansion.
[Diagram: West Hall, East Hall, generators, chillers, expansion space and possible UNSW-BCP expansion]

R1-DC water mist fire suppression: inert gas, no CFCs, clean air, clean conscience.
- POD-based, nitrogen-fired water mist solution coupled to VESDA fire detection (Pod 1 only on Day One, with capacity to supply 3 additional PODs).
- Discharges in pulses after fire detection, before total release if the fire is not initially suppressed.
- 99.5% recovery of servers and 99.95% recovery of storage after discharge.
[Photo: Mark Fisher, Chief IT Architect, with the nitrogen-discharge water mist fire suppression system]

Monitoring systems together: fire, electrical, mechanical, IT, security.
- VESDA fire detection: fire sensors coupled to fire suppression and alarm systems
- Samsung video cameras: connected to both Security and IT systems
- APC In-Row Cooling: connected to APC InfraStruXure monitoring
- BMS: manages mechanical and electrical plant (generators, chillers, pumps and power)
- APC InfraStruXure integrated IT monitoring: manages all the POD-based IT equipment, including sensors, cameras and power rails
- Security: connects through to Campus security

Campus connections to R1-DC: two hearts beating as one.
- Private connection on the UNSW fibre network: dual 48-pair dark (private) fibre connections.
- Separate routes: 1. Wansey Road, Alison Road, William Street, King Street; 2. Anzac Parade, Moore Park, King Street.
- Separate north and south connections to the building.
- Dedicated MDF room: the fibre connections meet in the R1 MDF room, used for the data network and the storage network.
- Creates one virtual data centre across the two campuses.
[Map: fibre routes from the Kensington Campus to the new R1 Data Centre]

Power from the grid: 1 MW initially, with more available.
- Grid supply (1,000 kVA): a single, dedicated mains substation supplies 1,000 kVA to R1. This is sufficient for both East and West halls at 7.5 kW per rack (a rough budget check is sketched after this slide).
- Main switchboard (1,000 kVA) delivers: IT services (700 kVA) and mechanical services (300 kVA).
- Further power can be added by utilising spare capacity on the other 1,500 kVA Randwick campus substation.
- Increasing IT power usage beyond 700 kVA in the West Hall, or expanding into the East Hall, requires an additional feed from the 1,500 kVA Randwick campus substation. This would be either an additional line 60 m under the drive, or taken from the old sub-mains board now servicing only the top floor of R1 (a preferable and cheaper solution). Energy Australia has already been asked by Facilities Management to free up power on the campus substation.
[Photo: Richard Smith, Production Services Manager, and Jennifer Hogan, Dell Customer Manager, in front of the main switchboard in the new R1 Data Centre]
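As a rough sanity check on the West Hall figures above, the minimal Python sketch below tallies the rack load against the IT feed. All numbers are the design targets quoted in this document; it is illustrative only, not a load study.

    # Rough power-budget check for the West Hall, using figures quoted on this page.
    # Illustrative only; rack counts and per-rack averages are design targets, not measurements.
    RACKS_WEST_HALL = 60          # two 30-rack PODs
    KW_PER_RACK = 7.5             # initial average design load per rack
    IT_UPS_KW = 500               # IT UPS capacity (475 kW in N+1 configuration)
    UPS_PER_POD_KW = 225          # UPS allocation per 30-rack POD

    it_load_kw = RACKS_WEST_HALL * KW_PER_RACK   # 450 kW across the hall
    pod_load_kw = 30 * KW_PER_RACK               # 225 kW per POD, matching the UPS allocation

    print(f"West Hall IT load at design average: {it_load_kw} kW "
          f"(fits within the {IT_UPS_KW} kW IT UPS)")
    print(f"Per-POD load: {pod_load_kw} kW vs {UPS_PER_POD_KW} kW allocated per POD")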

Uninterruptible power: from all or nothing to just what you need.
- APC Symmetra scalable modular UPS systems provide near 100% efficiency, drawing only the power required at the time.
- The 500 kW (475 kW in N+1 configuration) IT UPS receives dual power supplies from the main switchboard. It supplies power to POD 1, and will be available for POD 2 when installed, at 225 kW per POD. For a 30-rack POD this gives an average of 7.5 kW per rack.
- IT UPS (500 kVA): 60 West Hall IT racks at 7.5 kW per rack.
- Mechanical services UPS (80 kVA): for generators, lights and pumps.
- Efficient UPS technology: with the latest modular UPS technology, for every $100,000 of power that comes in, the UPS delivers more than $99,000 worth of power to the IT equipment. With traditional UPS technology, every $100,000 of power supplied yields at best $87,000 worth of power for the IT equipment (compared in the sketch after this slide).
[Photos: Andrew Munsie, Data Centre Manager, beside the IT UPS management console; Sudeep Joshi, Infrastructure Design & Capacity Manager, beside the mechanical UPS battery bank in the new R1 Data Centre]
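A small illustration of the efficiency claim above. The efficiencies here are inferred from the dollar figures quoted on the slide (roughly 99% delivered for the modular UPS versus 87% for a traditional UPS); they are not separately sourced.

    # Compare power delivered to IT equipment per $100,000 of input power,
    # using the efficiency figures implied by the slide above (illustrative only).
    INPUT_DOLLARS = 100_000
    MODULAR_UPS_EFFICIENCY = 0.99      # ">$99,000 worth of power" delivered
    TRADITIONAL_UPS_EFFICIENCY = 0.87  # "at best $87,000 worth of power" delivered

    for name, eff in [("Modular (APC Symmetra)", MODULAR_UPS_EFFICIENCY),
                      ("Traditional", TRADITIONAL_UPS_EFFICIENCY)]:
        delivered = INPUT_DOLLARS * eff
        print(f"{name}: ${delivered:,.0f} delivered, "
              f"${INPUT_DOLLARS - delivered:,.0f} lost as heat")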

R1-DC diesel generator backup: N+1, guaranteed to start.
- Generators: there are 3 x 450 kW generators to cover mains outages.
- In the event of a power failure: 1. All three generators start up to provide power. 2. Once the load has been sensed, redundant generators shut down to standby mode. 3. Should the active generator fail, those in standby mode start up and step 2 repeats. (A simple model of this sequence is sketched after this slide.)
- Once mains power has been restored, the generators sense load shedding and shut down.
- These generators provide sufficient capacity to also cover the East Hall (N+1) when it is developed.
- Each generator holds sufficient fuel to run at full load for a minimum of 12 hours.
[Photos: three 450 kVA generators at the rear of the new R1 Data Centre; James Leeper, Faculty IT Director, with mechanical engineer Owen Nugent, in front of one of the three 450 kVA generators]
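The start-up and failover sequence above reads as a simple state machine. The sketch below is a hypothetical Python model of that sequence, written only to illustrate the three steps; it is not the actual generator control logic.

    # Illustrative model of the N+1 generator sequence described above: all three
    # generators start on mains failure, redundant units drop to standby once the
    # load is sensed, and standby units restart if the active generator fails.
    # This is a teaching sketch, not the actual generator control logic.
    def on_mains_failure(generators):
        # Step 1: every generator starts up to provide power.
        return {g: "running" for g in generators}

    def on_load_sensed(states, keep_active):
        # Step 2: once the load is sensed, redundant generators go to standby.
        return {g: ("running" if g == keep_active else "standby") for g in states}

    def on_active_failure(states, failed):
        # Step 3: the failed unit drops out, standby units start, and step 2 repeats.
        states = {g: ("running" if s == "standby" else s) for g, s in states.items()}
        states[failed] = "failed"
        return states

    gens = ["G1", "G2", "G3"]                  # the three 450 kW units
    states = on_mains_failure(gens)
    states = on_load_sensed(states, keep_active="G1")
    states = on_active_failure(states, failed="G1")
    print(states)                              # {'G1': 'failed', 'G2': 'running', 'G3': 'running'}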

Circulation: from above, through the ground, to overhead.
- Simplified cooling systems independently link pumps through manifolds to the in-row cooling units.
- Water flows from the chillers through pumps to distribution units for each POD, and then by individual pipes to each in-row cooling unit.
- There are no cooling towers, which can waste up to 10,000 litres of water per day (based on a comparable data centre using cooling towers).
[Photos: Project Manager Nick Stace and Data Centre Manager Andrew Munsie alongside two of the four CDU water distribution units in the West Hall; side view on the platform of the two 500 kW McQuay chillers; pumps and distribution pipes in the plant room of the new R1 data centre]

Air-cooled chillers: not a drop of water wasted.
- Chillers: there are 2 x 500 kW chillers in an N+1 configuration on a raised platform at the south side of the building.
- To expand the second hall beyond 500 kW, an additional chiller can be added outside the second hall and connected to the existing pumping station.
- The chillers supply chilled water to the air handling unit that provides conditioned air to the data centre, and to each of the in-row coolers built into the PODs.
- In the unlikely event of a double chiller failure, a separate tank holding 7,200 litres of chilled water enables cooling to continue while a graceful shutdown of all systems is carried out (a rough ride-through estimate is sketched after this slide).
[Photo: Toby Harrison, Infrastructure Project Manager, alongside one of the two 500 kW McQuay air-cooled chillers at the rear of the new R1 data centre]
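To put the 7,200-litre buffer tank in context, the sketch below estimates how long stored chilled water could absorb the hall's heat load. The allowable water temperature rise (8 C) and the 450 kW load are assumptions for illustration only; the actual ride-through time depends on the real load and the system design.

    # Rough estimate of cooling ride-through from the 7,200 L chilled-water tank.
    # The temperature rise and heat load below are assumptions for illustration only.
    TANK_LITRES = 7200
    WATER_HEAT_CAPACITY_KJ_PER_KG_K = 4.186
    ASSUMED_TEMP_RISE_K = 8.0       # assumed allowable chilled-water temperature rise
    ASSUMED_HEAT_LOAD_KW = 450.0    # assumed West Hall design load (60 racks x 7.5 kW)

    stored_energy_kj = TANK_LITRES * WATER_HEAT_CAPACITY_KJ_PER_KG_K * ASSUMED_TEMP_RISE_K
    ride_through_s = stored_energy_kj / ASSUMED_HEAT_LOAD_KW   # kJ / (kJ/s) = seconds
    print(f"~{ride_through_s / 60:.0f} minutes of cooling to cover a graceful shutdown")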

R1-DC PODs: each POD is a data centre on its own.
- APC POD technology: two PODs are planned in each hall, with only one 30-rack POD initially in the West Hall.
- Each rack is designed to be supplied and cooled at an initial average of 7.5 kW per rack, with POD capacity to cool up to 25 kW per rack, meaning an initial 500 kW over the entire hall.
- Each POD has a central hot aisle where the hot air is contained, then recirculated and cooled by the in-row cooling systems.
[Photos: Jennifer Hogan, Dell Account Manager, outside POD A in the West Hall; Joe Repici, Senior Installation Manager, inside POD A showing hot-aisle containment; schematic of a 10-rack APC POD showing hot-aisle containment as used in the new R1 data centre]

Cooling the rack, not the room: in-row cooling with no false floor.
- In-row cooling systems can cool up to 30 kW per rack; Hall A averages 7.5 kW per rack.
- Modules work together to manage the cooling across the entire POD, with N+1 redundancy spread across the POD (see the capacity check sketched after this slide).
- Design: the APC InfraStruXure management system manages power usage, location of equipment, moves/adds/changes, video from connected cameras, and in-rack temperature at front and rear.
[Photos: an in-row cooling unit and a rack in the West Hall of the new R1 data centre]
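The N+1 statement above amounts to a simple capacity condition: with any single in-row cooler out of service, the remaining units must still cover the POD's heat load. In the sketch below, the cooler count and per-unit capacity are assumed values for illustration; only the per-rack figures come from this page.

    # Illustrative N+1 check for in-row cooling across a 30-rack POD.
    # Cooler count and per-unit capacity are assumed values, not the R1 design figures.
    RACKS_PER_POD = 30
    AVG_KW_PER_RACK = 7.5            # Hall A average from this page
    COOLERS_PER_POD = 10             # assumed number of in-row cooling units
    KW_PER_COOLER = 30.0             # assumed cooling capacity per unit

    heat_load_kw = RACKS_PER_POD * AVG_KW_PER_RACK                      # 225 kW
    capacity_with_one_failed = (COOLERS_PER_POD - 1) * KW_PER_COOLER    # 270 kW
    print(f"POD heat load: {heat_load_kw} kW; "
          f"capacity with one cooler failed: {capacity_with_one_failed} kW; "
          f"N+1 satisfied: {capacity_with_one_failed >= heat_load_kw}")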

R1-DC switching: unified storage with data on the network.
- New unified communications: Cisco Nexus 7000 & 5020 switches integrate the storage and data networks.
- Saves money: lower cost per connection.
- Improves performance: greater resilience and increased reliability.
- Smaller footprint, higher density.
- Faster implementation.
- Faster speed and greater volume/bandwidth.
- Less power consumption.
[Photos: an installation engineer cabling the Ampnet fibre distribution; Greg Sawyer, Communications Manager, alongside Cisco Nexus 7000 & 5000 switches as used in the new R1 data centre]

Cabling: everything overhead, and fibre to the rack.
- Overhead services: overhead piping and cabling, with fibre to the rack where possible, giving visibility and simplicity without restricting access or airflow.
- Fibre to the rack: fibre patching means speedy installation, speedy connections and fewer firehoses of copper cable.
- No raised floor: with all cabling visible overhead, there is no cable mess under the floor and no loss of cooling efficiency from opening up a raised floor to work on the cooling.
- Moves, adds and changes, a reality in any data centre, are managed without the need for disruptive sub-floor work.
[Photos: overhead structured cabling of fibre and minimal UTP; overhead cabling and water distribution above POD A in the West Hall of the new R1 data centre]

R1-DC green initiatives: concern, consideration, design, implementation, management.
- Re-use of existing building
- Minimised building works: no false floors
- Responsible building methods
- Higher utilisation of processor farms
- Greater utilisation of shared storage
- No water wasted on cooling
- Cooling the equipment directly, not entire rooms
- Insulated rooms without windows
- Balanced cooling across PODs
- Hot-aisle containment: no mixing of hot and cold air
- Shorter paths to cooling
- Smaller temperature range between hot and cold air
- Higher-density, efficient servers
- Less power usage overall
- More efficient ratio of electrical load to IT load
- Energy-efficient UPS systems

Results: getting it done, with a little help from your friends.
- New energy-efficient high-density data centre design & consulting: Canberra Data Centres
- POD-based in-row cooling infrastructure: APC InfraStruXure
- Construction and building project management: UNIPORT (with UNSW Facilities Management & IT at UNSW)
- Chassis servers and chassis switches: Dell M1000, M905/605, Dell/Cisco 3130G
- Data centre cabling design and implementation: UNSW IT Infrastructure
- Integration and project management: UNSW IT Infrastructure
- Unified switching, in-rack & POD (FC, FCoE & Ethernet): Cisco Nexus 7000, 5000
- Consolidation environments: Microsoft Windows 2003/8; VMware ESX, SRM, LCM

Trends: clouds, density, virtualisation, consolidation, environment, data centre convergence. It's a journey, not a destination.
- In-row cooling
- Space
- Switching
- Consolidation
- High density
- Virtualisation