High Density Data Centers Fraught with Peril. Richard A. Greco, Principal EYP Mission Critical Facilities, Inc.




Microprocessor Trends. Reprinted with the permission of The Uptime Institute from a white paper titled Heat Density Trends in Data Processing, Computer Systems and Telecommunications Equipment, Version 1.1

Power Consumption Will Continue to Increase. More transistors and higher clock speeds result in more power consumption, which in turn leads to greater heat production. Projections show power consumption increasing by 2X every four years.
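The slide's doubling projection can be put into a quick formula. The sketch below is ours, not from the presentation; the 3.5 kW starting load and eight-year horizon are illustrative assumptions.

```python
# Sketch of the slide's projection: power consumption doubling every
# four years, i.e. P(t) = P0 * 2^(t/4).
def projected_power_kw(initial_kw: float, years: float) -> float:
    """Project power assuming consumption doubles every four years."""
    return initial_kw * 2 ** (years / 4)

# A 3.5 kW cabinet today would be projected to draw 14 kW in eight years.
print(projected_power_kw(3.5, 8))  # -> 14.0
```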

Power Consumption. The fact is that the majority of servers available today have high power consumption.

Technology Impact on 42U Enclosure:

                         5U        3U        2U        1U        3U Blade
Total Load (kW)        3.0-7.3   3.2-5.5   3.2-12.6  7.9-12.0  6.0-14.0
Albatros Server DS/5U  1.3 kW    -         -         -         -
IBM X Series 360       -         0.37 kW   -         -         -
Compaq Proliant DL380  -         -         0.4 kW    -         -
IBM X Series 300       -         -         -         0.4 kW    -
Sun Fire B1600         -         -         -         -         1.0 kW
Number of Outlets      8-16      14-28     21-42     42        14-28

Cooling Microprocessors. Within three to five years, researchers at Intel Corp., Hewlett-Packard Co., and IBM predict, computer makers will have to move beyond fans and adopt cooling mechanisms such as radiators and liquid cooling systems to avoid potential overheating.

How Will These Trends Impact the Data Center? High density data centers: faster chips, more heat, more boxes. Low to medium density data centers with hot spots: faster chips, more heat, but a smaller total number of boxes. Unknown effect.

What is a High Density Data Center? A high compute density data center of today is characterized as one consisting of thousands of racks with multiple computing units. The heat dissipation from a rack containing such computing units exceeds 10 kW. Source: Thermal Considerations in Cooling Large Scale High Compute Density Data Center, Patel et al.

What EYP MCF Believes is the Practical Limit to a High Density Data Center. A high density data center is one in which the power to the raised floor and the computing equipment exceeds 150 watts/sf over the entire raised floor. This is equivalent to approximately 3.5 kW/cabinet in an efficiently arranged data center. This is the point where traditional all-air data centers begin to have significant limitations.
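The two density figures on this slide imply a floor area per cabinet. The derivation below is ours; the ~23 sf/cabinet result is not stated on the slide but follows directly from 150 W/sf and 3.5 kW/cabinet.

```python
# Implied floor area per cabinet at the stated density limit.
watts_per_sf = 150.0      # whole-floor density from the slide
kw_per_cabinet = 3.5      # equivalent cabinet load from the slide

sf_per_cabinet = kw_per_cabinet * 1000 / watts_per_sf
print(round(sf_per_cabinet, 1))  # -> 23.3 sf of raised floor per cabinet
```

This ~23 sf includes the cabinet footprint plus its share of aisles and service clearances, which is why "efficiently arranged" matters to the equivalence.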

Cooling Strategies for the Conventional Data Center

Example of a Traditional All Air Data Center at 150 W/sf
Typical 15,000 sf; 620 24x36 cabinets @ 3.5 kW/cabinet; 2,170 kW cooling required
Cabinets on hot aisle/cold aisle, 16 ft aisle to aisle
23 or 24 CRACs total (layout dependent) at 110 kW each (375.3 MBH), 3 redundant
620 perforated tiles at 530 CFM each, minimum required
(24) 200 kW PDUs (2N configuration)
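As a back-of-the-envelope check on this slide's figures, the sketch below (ours, not from the presentation) totals the load, sizes the CRAC count, and derives per-tile airflow from the standard sensible-heat relation q[BTU/h] ≈ 1.08 × CFM × ΔT[°F]. The ~21°F air-side ΔT is an assumption inferred from the slide's 530 CFM figure, not stated in the deck.

```python
# Check the traditional 150 W/sf example's numbers.
cabinets = 620
kw_per_cabinet = 3.5

total_kw = cabinets * kw_per_cabinet
print(total_kw)  # -> 2170.0 kW, matching the slide

crac_capacity_kw = 110.0
cracs_needed = -(-total_kw // crac_capacity_kw)  # ceiling division
print(cracs_needed)  # -> 20.0 to carry the load, plus 3 redundant = 23 total

# Standard sensible-heat airflow relation: q[BTU/h] = 1.08 * CFM * dT[F].
delta_t_f = 21.0  # assumed air-side temperature rise (inferred, not stated)
btu_per_hr = total_kw * 3412
total_cfm = btu_per_hr / (1.08 * delta_t_f)
print(round(total_cfm / cabinets))  # -> ~527 CFM per perforated tile
```

With one perforated tile per cabinet, this lands within rounding of the slide's 530 CFM minimum; a larger assumed ΔT would reduce the required airflow proportionally.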

Two Approaches to Cooling the Data Center. We use Computational Fluid Dynamics (CFD) to determine the preferred arrangement. Option 1: CRACs on opposite walls. Option 2: CRACs in the middle and on one wall.

Option 1: CRACs On Opposite Walls, 150 W/sf
Typical 15,000 sf data center
620 cabinets @ 3.5 kW/cabinet
(24) 110.2 kW CRAC units
(24) 225 kVA PDUs

Front In, Back Out, Hot Aisle / Cold Aisle Configuration for Cabinet. Air is drawn through the front of the cabinet (blue) and discharged through the back (red). Supply air is 55°F; return air is 85°F.

Option 1: CRACs On Opposite Walls, 3 ft Raised Floor

Option 1: CRACs On Opposite Walls, 3 ft Raised Floor, 2 Units Failed

Option 2: CRACs in Middle and On One Wall, 150 W/sf
Typical 15,000 sf data center
620 cabinets @ 3.5 kW/cabinet
(23) 110.2 kW CRAC units
(24) 225 kVA PDUs

Option 2: CRACs in Middle and On One Wall, 3 ft Raised Floor

Cooling Strategies for High Density Data Centers

High Density Data Center, 240 W/sf
Typical 15,000 sf (100 ft by 150 ft)
462 cabinets @ 7.9 kW/cabinet; 3,650 kW cooling
462 air grates @ 1,200 CFM each
(40) 110.2 kW CRAC units
(32) 225 kVA PDUs (2N)
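The high-density example can be cross-checked the same way as the 150 W/sf case, this time inverting the sensible-heat relation: given 7.9 kW/cabinet and one 1,200 CFM air grate per cabinet, what air-side temperature rise does the design imply? This check is ours, not from the presentation.

```python
# Cross-check the 240 W/sf example and its implied air-side delta-T.
cabinets = 462
kw_per_cabinet = 7.9
cfm_per_grate = 1200.0

total_kw = cabinets * kw_per_cabinet
print(round(total_kw))  # -> 3650 kW, matching the slide's cooling figure

# Invert q[BTU/h] = 1.08 * CFM * dT[F] to solve for dT.
delta_t_f = kw_per_cabinet * 3412 / (1.08 * cfm_per_grate)
print(round(delta_t_f, 1))  # -> ~20.8 F implied air-side temperature rise
```

The implied ~21°F rise is consistent with the 55°F supply / 85°F return temperatures shown elsewhere in the deck once some bypass air mixing into the return is accounted for.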

High Density Data Center, 4 Rows of CRACs, 5 ft Raised Floor

High Density Data Center, 4 Rows of CRACs, 5 ft Raised Floor, 3 Units Failed

To Overcome Airflow Variations, Use Fan Powered Cabinets. This ensures airflow to the cabinet regardless of the pressure in the floor and the airflow through the air grates; however, re-circulation will occur if sufficient air from under the floor is not delivered. Cabinet fans supplement server fans.

Fan Powered Cabinets: Front In, Back Out, Footprint Vented, Fan Assisted. Air is drawn through the front of the cabinet (blue) and discharged through the back (red). Cabinet fan discharge air is 85°F; raised floor supply air is 55°F.

Fan Powered Cabinets: Front In, Back Out, Fan Assisted. Air is drawn through the front of the cabinet (blue) and discharged through the back (red). Cabinet fan discharge air is 85°F; raised floor supply air is 55°F.

Fan Powered Cabinets: Front In, Top Out, Fan Assisted. Air is drawn through the front of the cabinet (blue) and discharged through ductwork (red). Cabinet fan discharge air is 85°F into the ceiling return-air plenum; raised floor supply air is 55°F.

Fan Powered Cabinets: Front In, Top Out, Footprint Vented, Fan Assisted. Air is drawn through the bottom of the cabinet (blue) and discharged through the top (red). Cabinet fan discharge air is 85°F; raised floor supply air is 55°F.

Fan Powered Cabinets: Front In, Top Out, Footprint Vented, Forced Ventilated. Air is drawn through the bottom of the cabinet (blue) and discharged through ductwork (red). Cabinet fan discharge air is 85°F; raised floor supply air is 55°F.

High Density Data Center without CRACs, 325 W/sf
Typical 11,200 sf high density data center
462 cabinets @ 7.9 kW/cabinet
462 cabinet coolers
(32) 225 kVA PDUs

Water Cooled Cabinet: Front In, Back Out. Air is drawn through the front of the cabinet (blue), passes over a cooling coil driven by cabinet fans, and is discharged through the back (blue). Discharge air is 72°F. Supply and return piping run under the raised floor.

Water Cooled Cabinet: Footprint Vented. The fan and cooling coil are located under the cabinet; cool air enters through the cabinet footprint. Supply and return piping run under the raised floor, above the building floor.

Refrigerant Cooled Cabinet. Air is drawn through the front of the cabinet (blue) and discharged through the back (red); discharge air is 85°F. 45°F water from the chilled water plant serves a heat exchanger, and a pump circulates 60°F refrigerant under the raised floor to the cabinet.

Refrigerant Cooled Cabinet: Front In, Back Out. Air is drawn through the front of the cabinet (blue) and discharged through the back (red); discharge air is 85°F. Supply air from the overhead fan coil is 55°F, with piping to the heat exchanger.

Inherent Problems with High Densities: Air Distribution Problems. Inadequate raised floor height and excessive openings in perforated tiles can result in high velocity airflows. Bypassing of air within cabinets. Re-circulation of air outside of cabinets. Today users report a greater number of server failures in the top 1/3 of cabinets.

Inherent Problems with High Densities. Cabinet fans generally have a single power supply; cabinet fan power should come from a separate PDU with redundancy. Identical cabinets may be required. The UPS system must be larger or separated to incorporate fan power (up to 350 W/cabinet); for our example of 462 cabinets, that is 162 kW of fan power.
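The 162 kW figure follows directly from the slide's per-cabinet fan allowance; the short check below is ours.

```python
# Fan power overhead for the 462-cabinet high-density example,
# at the slide's stated maximum of 350 W of fan power per cabinet.
cabinets = 462
fan_watts_per_cabinet = 350

total_fan_kw = cabinets * fan_watts_per_cabinet / 1000
print(total_fan_kw)  # -> 161.7 kW, the slide's ~162 kW
```

That overhead is roughly 4% of the example's 3,650 kW IT cooling load, all of which must be carried by the UPS if the fans are to ride through utility transitions.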

Inherent Problems with High Densities. What about the rate of change at the microprocessor level during transitions from utility to generators? To my knowledge this has not been addressed by the industry. Is uninterruptible cooling mandatory? If so, what are the implications for costs, reliability, redundancy, and maintainability?

Other Considerations. Numerous additional monitoring points. Addition of cooling liquids in the data center environment. Fire zones. Noise. Seismic. Computational Fluid Dynamic modeling is required for design, and recalculation is required when hardware changes.

Other Considerations. It will be mandatory to have close coordination between the facilities department and IT hardware planners; this does not exist today. Generally higher risks with lower margins for error.

Prognostications. Late 1990s requirements of 100 watts per square foot of average power now exceed 200 watts per square foot, and will double again by 2004. META Report, August 2, 2001, by Rich Evans

Prognostications. In fact, the 2004 400-watts-per-square-foot requirement will again double by 2008, so designers must work this into longer-range facilities plans so that space can be fully used, as opposed to 40 percent to 50 percent utilization because of insufficient power and cooling. META Report, August 2, 2001, by Rich Evans

Prognostications. Data centers utilize much less than 50% of their physical and power infrastructure. Source: Cost of Overbuilding, by Domenic Alcaro

Prognostications. Current estimates of data center power requirements are greatly overstated because they are based on criteria that incorporate oversized, redundant systems and several safety factors. Source: Data Center Power Requirements: Measurements from Silicon Valley, Mitchell-Jackson, Koomey, Nordman, Blazek

Prognostications. Furthermore, most estimates assume data centers are filled to capacity; for the most part, these numbers are unsubstantiated. Source: Data Center Power Requirements: Measurements from Silicon Valley, Mitchell-Jackson, Koomey, Nordman, Blazek

So What's Real? Today's data centers incorporate significant amounts of legacy equipment, with slow change-out due to the economy. Many data centers have significant amounts of available space on the raised floor.

Current Load Densities

DATA CENTER TYPE       WATTS/SF    KW/CABINET
Enterprise             20 to 60    0.5 to 1.5
Internet/Co-location   40 to 90    1.0 to 2.0
Managed Services       60 to 110   1.5 to 2.5
High Density           150+        3.5+

My View. Most projections of watts per square foot are overstated; millions of dollars will be spent on unused infrastructure. There are no longer real drivers for compaction. Generally there is ample raised floor available.

My View. High density data centers will not become a reality. Low/medium density data centers will have hot spots due to high density computing equipment.

My View of What to Do. Manage your hardware installations: don't buy IT gear that is problematic, and don't build yourself a problem. Existing air delivery systems work. Highly engineered systems are expensive to build and operate, and they significantly reduce reliability, redundancy, and flexibility.

High Density Data Centers Richard A. Greco, Principal EYP Mission Critical Facilities, Inc. www.eypmcf.com