New University Data Center




New University Data Center Operating Conditions
Claudio Pini
DRAFT FOR DISCUSSION
Version 0.6, Aug 2011

Document Revision History

Version  Date           Author(s)     Description of Change
0.1      June 15, 2010  Claudio Pini  Created
0.2      June 19, 2010  Claudio Pini  Updates based on UDC WG feedback
0.3      June 21, 2010  Claudio Pini  Updates based on D. Shorthouse, K. Richter & T. Yerex
0.4      June 22, 2010  Claudio Pini  New network section and clarification on charges
0.5      June 2011      Claudio Pini  Updates from UDC
0.6      Aug 2011       Claudio Pini  New template

Table of Contents
I. Background
II. Summary of design and operating conditions
   a. Cooling
   b. Raised Floor
   c. Racks
   d. Access & Security
   e. Work Space
   f. UPS & Generator
   g. Communication
   h. Governance

I. Background

The University Data Centre Committee, working since 2007, has recommended that the University undertake the construction of a new 2 MW facility on campus to house research servers. The recommendation is to construct a 5,000 sq ft data centre as part of the Pharmaceutical Sciences building. This new facility is expected to be ready for occupancy in September 2012. The UDC will be filled gradually as demand grows and older sites around campus are decommissioned.

II. Summary of design and operating conditions

a. Cooling

65 F (18 C) water will be provided to each rack, with the option of converting the entire facility to chilled water if required in the future. Racks should have rear door heat exchangers or another cooling solution that uses the provided water. A ceiling plenum (4') will be available for future cooling options if needed. <INSERT LINK TO COOLING DESIGN SPECS>

b. Raised Floor

The space under the raised floor is intended only for the cooling system's piping. All cabling infrastructure should run above the racks.

c. Racks

All equipment must be rackable into the standard racks provided (the preferred solution) or come in racks that can adapt to the cooling, electrical and seismic infrastructure of the UDC. <INSERT LINK TO RACKS SPECS>

Individual servers will be hosted in the standard racks provided, which are fitted with rear door heat exchangers. New pre-racked equipment will be required to come with rear door heat exchangers or a similar mechanism to dissipate heat.
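As a rough illustration of what the rear-door heat exchangers and the 18 C water supply imply, the sketch below estimates the water flow a single rack would need. The 20 kW rack load and 8 C water temperature rise are illustrative assumptions only; they are not figures from this document or its design specs.

```python
# Sketch: chilled-water flow needed by one rear-door heat exchanger.
# Assumed figures: 20 kW rack load, 8 C rise across the exchanger
# (18 C supply from the document, hypothetical 26 C return).

WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K)
WATER_DENSITY = 998.0         # kg/m^3, near room temperature

def water_flow_lpm(rack_load_kw: float, delta_t_c: float) -> float:
    """Litres per minute of water needed to absorb rack_load_kw
    of heat with a temperature rise of delta_t_c."""
    mass_flow = rack_load_kw * 1000.0 / (WATER_SPECIFIC_HEAT * delta_t_c)  # kg/s
    volume_flow = mass_flow / WATER_DENSITY  # m^3/s
    return volume_flow * 1000.0 * 60.0       # L/min

# Example: a 20 kW rack with an 8 C temperature rise
print(round(water_flow_lpm(20.0, 8.0), 1))  # -> 35.9 L/min
```

The point of the sketch is only that per-rack flow requirements scale linearly with heat load and inversely with the allowed temperature rise, which is why higher-density racks need either more flow or a larger return-temperature delta.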

For existing pre-racked equipment that is intended to move to the new data center, UBC IT will work with the rack owners to determine the best cooling solution (moving equipment into existing racks with heat exchangers, or adapting the existing racks to a similar solution).

d. Access & Security

The facility will be monitored 24/7 from the IT Operations Centre, and IT staff and researchers will have 24-hour access to their equipment. Access to the main room will be controlled via access cards or a similar solution. Physical access to equipment can be restricted with locked racks and individual alarm systems if needed. Rack owners requiring additional security will need to cover the cost of the specific solution. Access to the network room will be limited to UBC IT authorized personnel.

e. Work Space

IT staff and researchers will have access to a staging room to work on their equipment. The staging room will include workbenches with network drops, and a limited amount of storage space will be available for tools and spare parts.

f. UPS & Generator

The new data center will have UPS and generator capacity to provide uninterruptible power to the small percentage of equipment that we anticipate will require it. Space and power distribution will be designed so that, if required, additional UPS and generator capacity can eventually be added to support up to 100% of the equipment. Existing racks that are connected to UPS and are scheduled to move to the new data center will be able to use the UDC's uninterruptible power free of charge. After initial UDC occupancy, there will be an additional one-time charge, based on power needs, for users who require UPS and generator power. To maximize power efficiency and minimize rack space utilization, all UPS units will be located inside the UPS room; the use of individual UPS units will not be permitted.

g. Communication

Plans and procedures will be in place to ensure proper and timely communication about scheduled outages or any work that may impact users of the data center. Researchers and users will have full visibility into monthly, quarterly and yearly maintenance window plans. The scheduling and coordination of utility interruptions will require a collaborative effort between all impacted parties. Researchers and users will be required to provide contact information so they can be reached in the case of unscheduled outages or other emergencies.

h. Governance

UBC IT will continue to work with representatives of the UBC community during the UDC design phase to ensure that the UDC's design meets the University's needs as broadly as possible. After completion, a users group should be formed to regularly review existing processes, generate requests and provide feedback to UBC IT to update and adjust governance policies and procedures. This group should also be involved in defining the policies and procedures for managing long-term space consumption.

UBC IT will have control over the facility's design and the management of its space, power, cooling and network, and will be responsible for operational planning (e.g. fire alarm testing).

Rules and regulations about proper use of the facility will be in place to protect all researchers' equipment.

UBC IT will designate a person or team to serve as the point of contact for researchers who want to use the facility. This person or team will work with each researcher to plan the installation of new equipment.