
DATA CENTRES: UNDERSTANDING THE ISSUES
TECHNICAL ARTICLE
Molex Premise Networks

EXECUTIVE SUMMARY

The term data centre usually conjures up an image of a high-tech IT environment, about the size of a football pitch, full of equipment from a vast array of vendors, with enough air conditioning to keep the occupants of the penguin enclosure in a zoo happy. This technical article discusses issues around data centre technologies and concepts.

Definition of a Data Centre

The term data centre usually conjures up an image of a high-tech IT environment, about the size of a football pitch, full of equipment from a vast array of vendors, with enough air conditioning to keep the occupants of the penguin enclosure in a zoo happy. In fact, the term data centre applies to any space specifically allocated for cabinets or frames housing network equipment that is a source of services to other network devices (typically distributed over generic cabling), a receiver of services from an external telecommunications network (such as a PABX or ADSL connection), or a source of services to an external network (typified by data hosting facilities). We also generally assume a data centre to be a multi-client environment, with the services needed to maintain that environment provided by a third party. The term is, however, equally applicable to the main communications room within an end-user's internal network. In other words, a data centre may be a server room, an equipment room or a co-location facility.

The Rise of the Data Centre

Modern businesses are highly dependent on these IT facilities, and care needs to be taken to ensure that the environment created to house them supports their continued trouble-free operation. In recent years there has been a growth in managed data centres. Whilst this may be an outsourcing move to save money, I believe it is more likely that network owners are increasingly aware of their businesses' IT dependency but lack the budget or the cross-discipline expertise to cater for it in-house. Catering for that dependency requires co-ordination between a number of technical services, including electrical, HVAC, networking, fire detection and suppression, and the wide area telecommunications service provider.
The technologies and concepts examined in this article are equally applicable to both single-client and multi-client environments, making it easy to see why the latter is an attractive option for a busy network manager with limited resources.

Applicable Standards

Currently there are no standards specifically written to govern cabling implementation in these spaces, although EN 50173 Part 5 is on its way to fill this void. Using the familiar media of optical fibre and balanced copper cabling, a hierarchical approach is employed, similar to BS(EN) 50173 Part 2, Generic cabling for office premises. Within the data centre we will see Main Distributors (perhaps accommodating a core switch), supporting Zone Distributors (possibly a workgroup switch or cross-connect supporting an aisle of racks), connecting to Equipment Outlets (expected to be located within the equipment racks). There will also be the addition of an External Network Interface and its associated cabling, for connection to the outside world and the service provider's network. None of this should prove problematic to a competent installation organisation, and it is bound to be the subject of further articles after the document is published.

Key Issues for the Network Operator

Regardless of the size of the data centre, there are three common priorities shared by its owners, operators and occupiers: flexibility, accessibility and reliability.

Flexibility

A flexible infrastructure enables a wide range of equipment types to be accommodated, and the hierarchical approach to data centre cabling that EN 50173 Part 5 is proposing looks set to deliver that. In a client-owned and -operated data centre there is often more control over the equipment deployed, the inherent flexibility being used as a safeguard against future developments in technology or a change of IT strategy. Within a co-location or hosting facility, however, flexibility is a prerequisite for day-to-day operation. Client equipment can arrive at short notice and require immediate connection and commissioning to keep any downtime to a minimum and to ensure that service level agreements are met.

The requirement to deploy a wide variety of equipment within the data centre can have an effect on the physical elements of the building. The load limit of the floor may need to be considered, whether that is the structural floor or any access floor system attached to it. The average server rack is unlikely to present any problems, but large uninterruptible power supplies and storage area network silos may need special consideration. This is clearly best addressed during the design stages of the building, not the fit-out.

Accessibility

Access is clearly important. The equipment will not install and configure itself, so personnel access will be needed for this scheduled event.
This has a knock-on effect on the sizing of the data centre, as space will be needed for the technicians to work and to manoeuvre equipment. BS(EN) 50174 is of some help here, advising that a 1.2-metre clearance around cabinets is required. Ideally, data centre sizing should then simply be a case of working out how many racks you have and adding the required clearance to their overall dimensions. Access for equipment is also an issue, with the size of doors, proximity to lifts and access to loading bays all needing consideration. Many installers will have encountered this problem, with an 800mm cabinet not fitting through a 2'6" door. Simply specifying double doors goes a long way towards solving this sort of issue, and again this is much easier and cheaper at the data centre design stage, rather than when equipment is being installed!

Having decided that the equipment within the data centre is mission-critical to the business, the case of unscheduled downtime from equipment failure also needs consideration. It is clear that in the event of a failure the deployment of technicians will be required, so access to the site must be arranged, often in compliance with strict security policies. Obviously, access to the equipment and its associated cabling will also be required, which is likely to involve opening cabinets and raising floor tiles. It is important to understand that whilst this enables repair work to be undertaken, it also exposes other equipment (either directly or via a shared cabling infrastructure) to the technician, and the threat of human error cannot be ignored. Procedures need to be in place to minimise this risk.

Reliability

Network availability is of crucial importance to the operation of the data centre. It underpins its very existence. It is important to realise, however, that the need for accessibility and flexibility can have a negative effect on reliability. Ultimate accessibility can lead to a lack of control of personnel on site, making the data centre prone to human error. Ultimate flexibility can require multiple connections in a circuit to allow for reconfiguration; having more connections builds in more points of failure. It would seem a balance needs to be struck, and a method needs to be found for quantifying the likely reliability of a data centre based upon its constituent parts. The four tiers described by the Uptime Institute in the USA are such a system, and are worth investigating further. Recognising both the impracticality and the extremely high cost of achieving the holy grail of 99.999% (five-nines) availability, the Institute instead proposes four more attainable levels, as follows.
Tier I - Single path for power and cooling distribution, no redundant components, 99.671% availability.
Tier II - Single path for power and cooling distribution, redundant components, 99.741% availability.
Tier III - Multiple power and cooling distribution paths, but only one path active, redundant components, concurrently maintainable, 99.982% availability.
Tier IV - Multiple active power and cooling distribution paths, redundant components, fault tolerant, 99.995% availability.

It is easy to see how the Tier IV approach can deliver more resilience than a Tier I installation, demonstrated by its anticipated unscheduled downtime of under 27 minutes per annum, compared to Tier I at nearly 29 hours, but this comes at significant cost. It is also worth noting that in its N+1 configuration, during a failure a Tier IV data centre offers only Tier II reliability. Designing to N+2 or N+N fixes this, but is generally cost-prohibitive. An expedited repair would therefore be in order, and this is where Intelligent Infrastructure Management (IIM) systems become invaluable. Network management tools have been available for many years to monitor data traffic, but the cabling has always been a blind spot. Now we are not only able to detect an unauthorised disconnection, but can relate it to its physical location within the data centre and inform the appropriate helpdesk services of the failure so that remedial action can be scheduled.

Associated Building Services

Before looking at the other services required within the data centre, it is worth pointing out what should not be there. No services, other than those used in the data centre, should pass through it, reducing the risk of flooding or contamination of the equipment environment. Whilst services such as access control, CCTV and lighting are required, the key services are air conditioning, power, and fire detection and suppression.

Air Conditioning

The aim of temperature control within a data centre is to cool the cabinet-mounted equipment, keeping it within its operating temperature range, not to cool the air outside it. Enclosed cabinets can inhibit the free flow of chilled air, enabling hot-spots to develop, so some clever design is required to overcome this. The concept of hot and cold aisles is often adopted: adjacent bays of cabinets are configured with their equipment facing forward towards a chilled air outlet in the aisle between them. The cold air is then drawn by the equipment fans through the rack and exits as warm air to the rear, with the resulting convection currents accelerating the whole process, thus creating alternating hot and cold aisles.
How effective this is can depend upon the specific equipment, which will affect the airflow. Open frames can help here, as can forced ventilation with fan trays. Ideally, any HVAC system should also create a slight positive pressure, to help reduce the ingress of dust and other contaminants into the room.

Power

A generous allocation of mains power will be required for any data centre. The exact power requirement will depend upon a number of factors, not just the consumption of the network equipment installed. Lighting will need to be factored in. A significant load is likely to be drawn by the HVAC plant. Even on standby, a UPS will draw a load. The level of resilience within the data centre will also have an impact: quite simply, the more equipment that has been provided as a back-up (be that network equipment or back-up HVAC and lighting), the more power will be required to cater for it, and in turn the higher the running costs.
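To make the interplay of these factors concrete, the sketch below estimates a first-pass power budget. All of the figures (per-rack load, HVAC factor, UPS overhead, N+N duplication) are illustrative assumptions for the sake of example, not recommendations from this article:

```python
# Rough first-pass data centre power budget (all figures are illustrative
# assumptions, not design guidance).

RACK_LOAD_KW = 4.0   # assumed average IT load per rack
LIGHTING_KW = 3.0    # assumed lighting load for the room
HVAC_FACTOR = 0.5    # assumed: cooling draws ~0.5 W per W of IT load
UPS_OVERHEAD = 0.10  # assumed: ~10% UPS losses and standby draw
REDUNDANCY = 2       # N+N: power and cooling paths fully duplicated

def power_budget_kw(racks: int) -> float:
    """Estimate the installed supply capacity needed, including redundant plant."""
    it_load = racks * RACK_LOAD_KW
    hvac = it_load * HVAC_FACTOR
    base = (it_load + hvac) * (1 + UPS_OVERHEAD) + LIGHTING_KW
    return base * REDUNDANCY  # duplicated paths double the installed capacity

print(f"20 racks -> {power_budget_kw(20):.0f} kW installed capacity")
```

Even this crude model shows how resilience drives cost: doubling the distribution paths doubles the installed capacity that must be provisioned, before a single extra server is added.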

In the design of the power distribution system, it is not uncommon to have two feeds to each cabinet. In turn these may be fed from different distribution boards, or even from different feeds from the energy company. Several general-purpose power outlets, to cater for the occasional power tool or vacuum cleaner, should also be provided. These should be clearly labelled, and never used to power rack-mounted equipment; the risk of inadvertent disconnection is unacceptably high.

Fire Detection and Suppression

Although seldom mandated by fire regulations, non-corrosive gas suppression systems are often desirable where high-value equipment is located. Where a more conventional sprinkler system is used, a dry-pipe system is desirable, whereby a POC (product of combustion) detector needs to be tripped to prime the pipes, in addition to an individual sprinkler head bursting due to heat build-up. This goes a long way towards preventing the accidental discharge of the system, and the flooding that goes with it. The very minimum that should be considered is wire cages around the sprinkler heads if a wet-pipe system is installed.
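The annual downtime figures quoted earlier for the Uptime Institute tiers follow directly from their availability percentages. As a quick sanity check (a minimal sketch; the percentages are those from the tier list above):

```python
# Annual unscheduled downtime implied by an availability percentage.

HOURS_PER_YEAR = 365 * 24  # 8760

def downtime_hours(availability_pct: float) -> float:
    """Hours per year during which service is unavailable."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

# Availability figures from the Uptime Institute tier descriptions above.
for tier, pct in [("I", 99.671), ("II", 99.741), ("III", 99.982), ("IV", 99.995)]:
    print(f"Tier {tier:>3}: {downtime_hours(pct):6.2f} hours/year")
```

Tier I works out at nearly 29 hours per year and Tier IV at under half an hour, matching the figures quoted earlier.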

CONCLUSION

Clearly, careful design is required in the planning of a data centre. Whilst it is possible to achieve almost total network availability, this comes at a price too high for most to pay. Managing the client's expectations from the outset is crucial if an acceptable compromise is to be reached. Ultimately, the question to ask is: how valuable is your data?

© 2013 Molex www.molexpn.com