Power and Cooling for Ultra-High Density Racks and Blade Servers (White Paper #46)

Introduction: The Problem
- The average rack in a typical data center draws under 2 kW
- Dense deployment of blade servers (10-20 kW per rack) would greatly exceed the power and cooling capability of the typical data center
- Providing 10-20 kW of cooling per rack is technically infeasible using conventional methods

Introduction: The Solution
- There are practical strategies for installing, powering, and cooling high-density racks, either singly or in groups
- Some of these strategies challenge prevailing industry thinking on high-density deployment

Introduction: The Risk
- Wrong choices in designing for high density can increase the Total Cost of Ownership (TCO) of the network-critical physical infrastructure (NCPI) many times over

Introduction: The Surprise
- Extreme compaction of data centers (over 6 kW per rack) creates the need for extreme cooling infrastructure, which:
  - Negates the space savings of high-density IT equipment
  - Increases data center TCO

High Density's Challenge to Conventional Cooling

The Cooling Challenge: an 18 kW rack requires 18 kW of power in and 18 kW of cooling out.

The Cooling Challenge: an 18 kW rack requires roughly 2500 cfm of cool supply air and produces 2500 cfm of hot exhaust air.

The Cooling Challenge: when less than 2500 cfm is supplied, the rack draws exhaust air from itself or from neighboring racks.

The Cooling Challenge (for this 18 kW rack):
- Supply 2500 cfm of cool air to the rack
- Remove 2500 cfm of hot exhaust air from the rack
- Keep the hot exhaust air away from the equipment intake
- Provide all these functions in a redundant and uninterrupted manner
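The 2500 cfm figure follows from the standard HVAC sensible-heat relation for air. A minimal sketch of the arithmetic; the assumed front-to-back exhaust temperature rise of about 22°F (12°C) is my assumption, not a number from the paper:

```python
def required_cfm(power_kw, delta_t_f):
    """Airflow (cfm) needed to remove a sensible heat load from air.

    Uses the common rule of thumb BTU/hr = 1.08 * CFM * deltaT(F),
    with 1 kW = 3412 BTU/hr.
    """
    btu_per_hr = power_kw * 3412
    return btu_per_hr / (1.08 * delta_t_f)

# An 18 kW rack whose servers raise the air ~22 F (12 C) front to back:
print(round(required_cfm(18, 22)))  # about 2585 cfm, roughly the slide's 2500 cfm
```

A higher server exhaust temperature rise reduces the airflow required, which is why tightly engineered high-density gear can get by with somewhat less air per kW.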

CHALLENGE #1: Supply 2500 cfm of Cool Air to the 18 kW Rack

Typical Raised-Floor Airflow: one vented tile per rack, each supplying about 300 cfm.

Perforated Tiles?
- An 18 kW rack would require 8 perforated tiles
- Aisle width would need to be substantially increased
- Spacing between racks would need to be substantially increased
- Perforated tiles cannot cool an 18 kW rack in a typical data center

Floor Tile Cooling Ability [chart: rack power (kW) that can be cooled by one tile vs. tile airflow, 0-1000 cfm (0-472 L/s), with curves for blade servers and standard IT equipment]
- Typical capability: up to about 300 cfm per perforated tile
- 300-500 cfm ("with effort", perforated tile): requires careful raised-floor design, careful CRAC placement, and control of under-floor obstacles (pipes/wiring)
- 500-700 cfm ("extreme", grate tile): additionally requires grate-type tiles
- Beyond that: impractical

Increased Floor Depth? Airflow variation across tiles decreases as floor plenum depth increases.

Increased Floor Depth? [chart: maximum % tile airflow variation (0-350%) vs. floor plenum depth, 1-6 ft (0.30-1.83 m), for 25% open (perforated) and 56% open (grate) tiles]
- With grate-type tiles, airflow in some cases reverses
- Variation remains large even for a very deep plenum

Grate-Type Tiles?
- Grate-type tiles dramatically alter under-floor pressure gradients, making cooling non-uniform and unpredictable
- Grate-type tiles in one area affect airflow in neighboring areas
- Large airflow variations when using grate-type tiles mean some locations will NOT receive enough cooling
- Even if an extreme cooling design could overcome these large airflow variations, it would still take 3-4 grate-type tiles to cool one 18 kW rack

Conventional Cooling Won't Work: a conventional data center layout with one vented tile per rack simply cannot cool racks over approximately 6 kW per rack over a sustained area.

CHALLENGE #2: Remove 2500 cfm of Hot Air From the 18 kW Rack

Removing Heat: an 18 kW rack produces 2500 cfm of hot exhaust. Three ways to remove the heat:
- Through the room
- Through a duct
- Through a ceiling plenum

Removing Heat: The Ideal
- Hot exhaust air from equipment is taken directly back to the cooling system
- Unobstructed, direct return path
- No chance to mix with surrounding air
- No chance to be drawn into equipment intakes
- Note the velocities involved: 2500 cfm through a 12-inch round duct moves at 35 mph (1180 L/s through a 30 cm round duct moves at 56 km/h)
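The duct-velocity figure can be checked with simple geometry, using only the slide's own numbers:

```python
import math

def duct_velocity_mph(cfm, diameter_in):
    """Mean air velocity in a round duct, in mph."""
    area_ft2 = math.pi * (diameter_in / 12 / 2) ** 2  # duct cross-section in ft^2
    ft_per_min = cfm / area_ft2                       # bulk velocity
    return ft_per_min * 60 / 5280                     # ft/min -> miles per hour

print(round(duct_velocity_mph(2500, 12)))  # about 36 mph; the slide rounds to 35
```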

Removing Heat: Typical Methods Used in Data Centers
- High, open ceiling with bulk air return at a central high point
- Return ductwork
- Suspended ceiling plenum
- Bulk air return across the room, under a ceiling just a few feet above the racks
These methods present high-density design challenges due to high air velocity.

Removing Heat: more than 400 cfm (189 L/s) of airflow per rack, either supply or return, over a sustained area requires specialized engineering to ensure performance and redundancy.

CHALLENGE #3: Keep the 18 kW Rack's Hot Exhaust Air Away From the Equipment Intake

Preventing Recirculation
- The shortest path for air to reach the equipment intake is recirculation of the equipment's own exhaust
- In high-density environments, high airflow velocities become subject to resistance in ductwork, which degrades airflow patterns
- Supply and return paths must dominate airflow near potential recirculation paths to keep equipment from ingesting its own hot exhaust

Cooling the data center

CHALLENGE #4: Provide Cooling in a Redundant and Uninterrupted Manner
- Redundant: cooling must continue during downtime of a CRAC unit
- Uninterrupted: cooling must continue during failover to generator backup

Redundant Cooling: Conventional Solution
- Multiple CRAC units feed a shared raised floor or overhead plenum
- The plenum is assumed to sum all CRAC outputs and provide consistent pressure throughout
- The system is designed to meet airflow and cooling requirements with any one CRAC unit off

Redundant Cooling: High-Density Challenge
- As airflow in the cooling plenum increases, the fundamental assumptions behind the shared-plenum system begin to break down
- With one CRAC unit off, local airflow velocities in the plenum are radically altered
- Airflow at an individual tile may even reverse, drawing air down into the floor via the Venturi effect
- Under fault conditions, cooling operation becomes unpredictable

Uninterrupted Cooling: Conventional Solution
- The conventional system puts CRACs on the generator, not on the UPS
- The temperature rise during a 5-20 second generator startup is acceptable: about 1°C (1.8°F)

Uninterrupted Cooling: High-Density Challenge
- In a high-density environment, the temperature rise during an uncooled 5-20 second generator startup can be 8-30°C (14-54°F)
- CRAC units may additionally have up to a 5-minute settle time after a power outage before they can be restarted
- At high power density, CRAC fans and pumps (in some cases, entire CRAC units) must be on the UPS to provide continuous cooling during generator startup
- CRACs on UPS are a major cost driver and a major barrier to high-density deployment
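The magnitude of that rise can be sanity-checked with a simple energy balance. This is a rough sketch: the 10 m³ of air effectively serving each rack is an assumed figure (real values depend on room geometry and recirculation), not a number from the paper:

```python
AIR_DENSITY = 1.2  # kg/m^3, air at room conditions
AIR_CP = 1006.0    # J/(kg*K), specific heat of air

def temp_rise_c(power_kw, air_volume_m3, seconds):
    """Temperature rise of a fixed air volume absorbing a heat load, no cooling."""
    energy_j = power_kw * 1000 * seconds
    air_mass_kg = AIR_DENSITY * air_volume_m3
    return energy_j / (air_mass_kg * AIR_CP)

# 18 kW rack, ~10 m^3 of local air (assumed), uncooled for 5 and 20 seconds:
print(round(temp_rise_c(18, 10, 5), 1))   # ~7.5 C
print(round(temp_rise_c(18, 10, 20), 1))  # ~29.8 C
```

Under these assumptions the 5-20 second window yields roughly a 7-30°C rise, consistent with the slide's 8-30°C range.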

Five Strategies That Work

5 Strategies for Deploying High-Density Racks and Servers
1. Design the room for peak rack density, or design the room BELOW peak rack density and:
2. Provide supplemental cooling for high-density racks
3. Establish rules allowing interspersed high-density racks to borrow cooling from adjacent racks
4. Spread high-density equipment out among multiple racks
5. Create a separate, highly cooled area for high-density equipment

Strategy #1: Design Room for Peak Rack Density
- Handles a wide range of future high-density scenarios
- Requires very complex analysis and engineering
- Capital and operating cost up to 4x that of alternative methods
- Risk of extreme underutilization of expensive infrastructure
- For rare and extreme cases: large farms of high-density equipment and limited space

Strategy #2: Supplemental Cooling for High-Density Racks. Types of supplemental cooling:
- Specialty floor tiles or fans to boost the cool air supply to the rack
- Specialty return ducts or fans to scavenge hot exhaust from racks for return to the CRAC
- Special racks or rack-mounted cooling devices that provide cooling directly to the rack

Strategy #2: Supplemental Cooling, Considerations
- Provides high density when and where needed
- Defers the capital cost of upgrading to high-density infrastructure
- High efficiency
- Optimal use of floor space
- Limited to about 10 kW per rack
- The room must be designed in advance to allow it

Strategy #2: Supplemental Cooling, Applications
- New construction or renovation
- Mixed environments
- When the location of high-density equipment is not known in advance

Rack Air Distribution Unit: Product Design Benefits
- Dual fans: increase airflow from top to bottom of the rack
- A-B power input feeds: redundant power, maximizes uptime
- Filtered, conditioned air: increases equipment life by supplying cool, clean air
- Independent fan control switch: varies the amount of airflow to equipment
- Air filter: removes airborne particles from the rack
- Raised-floor duct: allows air to be pulled into the rack directly from the raised floor

Rack Air Distribution Unit: Airflow Diagram. Conditioned room air is:
- Pulled in from underneath the raised floor
- Delivered from the bottom to the top of the rack by dual fans
- Drawn in by the IT equipment
Providing cooler air to the rack:
- Better cooling for IT equipment, reducing heat-related failures
- Extends the life of equipment in the rack

ADU Air Flow

Rack Air Removal Unit: Airflow Diagram
- Fans pull in rack equipment exhaust air; high-powered fans overcome cable impedance
- A ducted exhaust system (optional) delivers hot air to the plenum, keeping hot air from mixing with room air
- Proper airflow through the enclosure is ensured
- Cool inlet air moves freely to equipment in the rack

ARU Air Flow

ARU with Fully Ducted Return

Strategy #3: Borrow Cooling From Adjacent Racks
- A common and effective strategy in typical data centers
- Unused cooling capacity from neighboring racks can support up to 3x design density
- Uses rules for locating high-density racks to avoid creating hot spots
- Compliance can be verified by monitoring power consumption at the rack level

Strategy #3: Borrowed Cooling, Considerations
- No new equipment needed; essentially free in many cases
- High-density racks are limited to about 2x average density
- Requires more floor space than the supplemental-cooling strategy (lower density)
- Requires enforcement of deployment rules

Strategy #3: Borrowed Cooling, Applications
- For existing data centers, when high-density equipment is a small fraction of the total load

Strategy #3: Borrowed Cooling, Example of Deployment Rules
START: new proposed load. Add up the rack's existing power plus the new load.
1. Does the new rack power exceed the average cooling power? If NO, the new load may be deployed.
2. If YES: does either adjacent rack exceed the average cooling power? If YES, split up the load or try a different location.
3. If NO: is the rack at the end of a row?
   - Mid-row: does the average of the new rack and its 2 adjacent racks exceed the average cooling power?
   - End of row: does the average of the new rack and its one adjacent rack exceed the average cooling power?
4. If the applicable average is exceeded, the new load may not be deployed (split up the load or try a different location); otherwise it may be deployed.

Strategy #3: Borrowed Cooling, Deployment Rules in Short. If the proposed rack averaged with its neighbors (only one neighbor if at the end of a row) does not exceed the design average power, AND neither neighbor is already a high-density rack, then it's OK to deploy the load there.
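The deployment rules can be sketched as a small function. The function name and the kW convention are illustrative, not from the paper:

```python
def may_deploy(new_rack_kw, neighbor_kws, avg_cooling_kw):
    """Borrowed-cooling deployment check, a sketch of the Strategy #3 rules.

    new_rack_kw    -- rack's existing power plus the proposed new load
    neighbor_kws   -- adjacent rack powers: two entries, or one at a row end
    avg_cooling_kw -- the room's design average cooling power per rack
    """
    if new_rack_kw <= avg_cooling_kw:
        return True                             # within design average: OK anywhere
    if any(n > avg_cooling_kw for n in neighbor_kws):
        return False                            # a neighbor is already high-density
    group = [new_rack_kw] + list(neighbor_kws)
    # the borrowed capacity must cover the excess: group average within design
    return sum(group) / len(group) <= avg_cooling_kw

# 6 kW rack between lightly loaded neighbors, 3 kW design average:
print(may_deploy(6.0, [1.0, 1.5], 3.0))  # True: (6+1+1.5)/3 = 2.83 kW <= 3 kW
print(may_deploy(6.0, [3.5, 1.0], 3.0))  # False: one neighbor already exceeds avg
```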

Strategy #4: Split Up High-Density Equipment
- The most popular solution
- High-density equipment is spread out among many racks
- No rack exceeds the design power density
- Predictable cooling performance

Strategy #4: Splitting Up Equipment, Considerations
- No new equipment needed; essentially free in many cases
- High-density equipment must be spread out even more than in the borrowing strategy
- Uses more floor space than full high-density racks
- Can cause data cabling problems
- Empty vertical space must be filled with blanking panels to prevent in-rack recirculation of hot exhaust air

Strategy #4: Splitting Up Equipment, Applications
- For existing data centers, when high-density equipment is a small fraction of the total load

Strategy #4: Blanking Panels. Server inlet temperatures (side view, top of rack to bottom), before and after installing blanking panels:
  BEFORE         AFTER
  90°F (32°C)    79°F (26°C)
  80°F (27°C)    73°F (23°C)
  95°F (35°C)    73°F (23°C)
  83°F (28°C)    73°F (23°C)
  72°F (22°C)    72°F (22°C)
  70°F (21°C)    70°F (21°C)

Air Flow with no blanking panels

Strategy #4: Blanking Panels. Blanking panels block internal recirculation of hot exhaust air, lowering server inlet temperatures.

Air Flow with blanking panels

Strategy #4: Blanking Panels. Snap-in blanking panel.

Strategy #5: Dedicated High-Density Area
- Supports maximum-density racks
- Doesn't require spreading out high-density equipment
- Optimal floor space utilization
- New technologies can deliver predictable, highly efficient cooling

Strategy #5: Dedicated High-Density Area, Considerations
- Requires prior knowledge of the number of high-density racks
- The high-density area must be planned in advance and space reserved for it
- Requires the ability to segregate high-density equipment

Strategy #5: Dedicated High-Density Area, Applications
- New construction or renovations
- Densities of 10-25 kW per rack
- High-density co-location

Strategy #5: Dedicated High-Density Area, Power/Cooling Technology
- All exhaust air is captured within a hot-aisle chamber and neutralized to ambient temperature before being returned to the room
- Equipment racks take in ambient air from the front
- Integral rack air conditioner
- Integral rack power distribution system (UPS is optional)
- Door access to the hot aisle and the rear of the IT equipment
- Can operate on a hard floor or a raised floor

ISX High Density Air Flow Pattern

Summary and Recommendations: Seven Elements of an Optimal Cooling Strategy

Optimal Cooling Strategy, Element 1: Ignore the physical size of equipment and focus on functionality per watt consumed.
- Minimizes area and TCO

Optimal Cooling Strategy, Element 2: Design the system to permit later installation of supplemental cooling devices.
- Allows future supplemental cooling equipment to be added where and when needed, on a live data center, in the face of uncertain future requirements

Optimal Cooling Strategy, Element 3: Choose a room average power density between 40 and 100 W/ft² (0.4-1.1 kW/m²).
- Avoids waste due to oversizing
- Practical for most new designs: 80 W/ft² (0.9 kW/m²), or about 2.8 kW per rack
- Keeps both routine and redundant performance predictable
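The unit relationships on this slide can be checked directly; the roughly 35 ft² of floor area allocated per rack is implied by the slide's own numbers (2.8 kW / 80 W/ft²), not stated explicitly:

```python
FT2_PER_M2 = 10.7639  # square feet per square meter

def w_ft2_to_kw_m2(w_per_ft2):
    """Convert a power density from W/ft^2 to kW/m^2."""
    return w_per_ft2 * FT2_PER_M2 / 1000

print(round(w_ft2_to_kw_m2(80), 2))  # 0.86 kW/m^2 (the slide rounds to 0.9)
print(round(80 * 35 / 1000, 1))      # 2.8 kW per rack at ~35 ft^2 of floor per rack
```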

Optimal Cooling Strategy, Element 4: If the fraction of high-density loads is high and predictable, establish high-density areas of 100-400 W/ft² (1.1-4.3 kW/m², roughly 3-14 kW per rack).
- Requires planning ahead for specially equipped areas
- Adds significant cost, time, and complexity
- These areas do not use raised-floor cooling

Optimal Cooling Strategy, Element 5: Establish policies/rules for allowable rack power based on location and adjacent loads.
- Reduces hot spots
- Ensures cooling redundancy
- Increases system cooling efficiency and reduces electrical consumption
- More sophisticated rules and monitoring can enable even higher power density

Optimal Cooling Strategy, Element 6: Use supplemental cooling devices where indicated.
- Boosts local cooling capacity up to 3x the room design value to accommodate high-density equipment

Optimal Cooling Strategy, Element 7: Split up equipment that cannot be installed within the rules.
- Lowest-cost, lowest-risk option
- Consumes considerable space if high-density loads are more than a small fraction of the total
- Chosen as the primary strategy by users without significant area constraints

Questions?