BRUNS-PAK Presents MARK S. EVANKO, Principal




BRUNS-PAK Presents MARK S. EVANKO, Principal Data Centers of the Future and the Impact of High Density Computing on Facility Infrastructures - Trends, Air-Flow, Green/LEED, Cost, and Schedule Considerations Friday, July 13, 2007

Agenda
I. High Density Computing Equipment and the Thermodynamic Evolution
II. High Density Impacts to Data Center Costs
III. Thermodynamic Model Impacts to the Data Center Designs/Retrofits
IV. Questions and Answers 2

Part I High Density Computing Equipment and the Thermodynamic Evolution 3

High Density Computing Equipment and the Thermodynamic Evolution
The Data Center Solution Components (chant!):
A. Computer Hardware
B. Computer Software
C. Telecommunications (Data/Tele)
D. Facility Infrastructure
The Recent Data Center History of the World:
The early watts per sq. ft. metric, 1980/1990
The high density discussion/announcement associated with blade server technology, 2000 4

High Density Computing Equipment and the Thermodynamic Evolution
The Data Center Facility Infrastructure Transformation:
Discussion/evolution from capacity (watts per sq. ft.) to distribution (cfm/static)
The retraction of Uptime Institute Spring 2006 Tier Ratings
The evolution of facility infrastructure reliability vs. data processing uptime
The role of thermodynamic (CFD) modeling
One size/standard facility infrastructure metric does not fit all!
GREEN/LEED considerations: A) EPA draft, April 23, 2007, server and data center efficiencies 5

High Density Computing Equipment and the Thermodynamic Evolution
Data center facility infrastructure summary reliability rankings:
(1) Unreliable: Shared building power and cooling; no generator
(2) Partially Isolated, Unreliable: Dedicated power system; shared cooling system; unconditioned power; non-redundant air conditioning; no generator
(3) Isolated Unreliable: Dedicated power and cooling systems; unconditioned power; non-redundant dedicated air conditioning units; no generator
(4) Isolated Conditioned: Dedicated power and cooling systems; conditioned power; non-redundant dedicated A/C units; no generator
(5) Isolated Improved: Dedicated power and cooling systems; uninterruptible power system; non-redundant dedicated A/C units; no generator
(6) Isolated, Mostly Reliable: Dedicated power and cooling systems; uninterruptible power system; redundant dedicated A/C units; no generator
(7) Reliable: Dedicated power and cooling systems; uninterruptible power system; redundant dedicated A/C units; generator
(8) Reliable Redundant: Dedicated power and cooling systems; redundant UPS systems; redundant dedicated A/C units; redundant generators
(9) Ultra-Reliable: Redundant power train; redundant cooling system; redundant UPS systems; redundant dedicated A/C units; redundant generator systems; redundant fuel system
(10) State of the Art: Redundant power train; redundant cooling system; redundant UPS systems; redundant dedicated A/C units; redundant generator systems; redundant fuel system; site hardened for weather and geographic exposures; location minimizes exposure to jurisdictional closure from hazardous spill, terrorism, or similar risks. 6

High Density Computing Equipment and the Thermodynamic Evolution Uptime Institute tier availability figures: Tier I 99.671%; Tier II 99.749%; Tier III 99.982%; Tier IV 99.995% 7
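
The tier percentages above translate directly into allowed downtime per year; a minimal sketch of that arithmetic (8,760 hours per year; the `downtime_hours` helper is illustrative, not from the presentation):

```python
# Convert tier availability percentages into allowed downtime per year.
# The tier figures come from the slide above; the conversion is simple
# arithmetic against 8,760 hours per year.
HOURS_PER_YEAR = 8760

tiers = {"Tier I": 99.671, "Tier II": 99.749,
         "Tier III": 99.982, "Tier IV": 99.995}

def downtime_hours(availability_pct: float) -> float:
    """Allowed downtime in hours/year for a given availability %."""
    return (1 - availability_pct / 100) * HOURS_PER_YEAR

for tier, pct in tiers.items():
    print(f"{tier}: {pct}% -> {downtime_hours(pct):.1f} h/yr")
```

Tier I's 99.671% works out to roughly 28.8 hours of allowed downtime a year, while Tier IV's 99.995% allows well under an hour, which is why the facility cost differences discussed in Part II are so large.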

High Density Computing Equipment and the Thermodynamic Evolution
A. Historical Data Center Facility Infrastructure Baseline:
Mainframe and mini computer equipment platforms
Older technology DASD
Tape and tape storage
Gross sq. ft. sample densities: 20 to 45 watts/sq. ft. computer equipment
8 to 24 inch raised floor
Level 7 reliability
Building office space generally sufficient for structural load and/or simply modified
7'6" to 9'0" distance from raised floor to underside of suspended ceiling
Bus/tag cable dams in underfloor 8

High Density Computing Equipment and the Thermodynamic Evolution
B. The Evolution of the Server, the Blade Server, and the Super Server:
Stand alone
Rack mounted
Population growth: 1U vs. 2U vs. 3U
Blade servers: the density of servers per rack, 1 kW to 24/41/72++ kW
The super server solution announced: 3,000 lb. devices over a footprint of approximately 11 sq. ft. with a 30 kW demand and 67 kBTU of heat rejection! Chilled water
Potential announcements by computer equipment manufacturers of 50/60/70+ kW stand alone computer equipment devices. Announced 2007. 9
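
A quick back-of-envelope sketch of the densities implied by the super-server figures on this slide (3,000 lb. and 30 kW over roughly 11 sq. ft.); the 3,412 BTU/h-per-kW conversion factor is standard, everything else is taken from the slide:

```python
# Back-of-envelope densities for the "super server" described above:
# ~3,000 lb and ~30 kW over roughly 11 sq. ft. of footprint.
BTU_HR_PER_KW = 3412  # 1 kW of electrical load ~ 3,412 BTU/h of heat

footprint_sqft = 11
power_kw = 30
weight_lb = 3000

power_density = power_kw * 1000 / footprint_sqft   # W per sq. ft.
floor_loading = weight_lb / footprint_sqft         # lb per sq. ft.
heat_btu_hr = power_kw * BTU_HR_PER_KW             # total heat rejection

print(f"{power_density:.0f} W/sq.ft., {floor_loading:.0f} lb/sq.ft., "
      f"{heat_btu_hr / 1000:.0f} kBTU/h")
```

This puts the device near 2,700 W/sq. ft. and over 270 lb./sq. ft., consistent with the structural-loading and watts-per-sq.-ft. warnings on the next slide. Note that 30 kW corresponds to about 102 kBTU/h of total heat; the slide's 67 kBTU figure would be consistent with part of the load being rejected to chilled water rather than room air, though the presentation does not break that down.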

High Density Computing Equipment and the Thermodynamic Evolution
C. The Facility Infrastructure Impact Potential. NOTE: Each client must have their short/long-term computer equipment growth projection modeled independently. There are no standard guidelines. WARNING!
100, 400, 700, 1000, 2000+ watts per sq. ft. as interpreted from the computer equipment floor plan
12 to 48 inch raised floor height
Level 7, 8, 9, 10 infrastructures depending on disaster recovery/mirroring plan. WARNING: Medical community with patient care applications.
Structural loading exceeding 100, 300, 500 lbs. per sq. ft.
Existing multi-floor building structures do not have sufficient floor-to-underside-of-deck clearances
Short term problem avoided by the add of one in a 5,000 sq. ft. data center
Electrical/mechanical capacity on floor EXCEEDS present loads, however thermal problems cause shutdown 10

High Density Computing Equipment and the Thermodynamic Evolution
Rack Dissipation:
Air Side: 7/12 kW Nominal
Impact: Hot/Cold Aisles
Impact: Double Hot/Cold Aisles
Air Side Stretch: 4 to 22+ kW (CFD)
Water Cooling
Recirculation Warning: No Standard
CFD Models Strongly Recommended 11

Part II High Density Impacts to Data Center Costs 12

High Density Impacts to Data Center Costs
One of the Most Dangerous/Variable Factors in Designing/Building a Data Center in 2006 and Beyond:
1980/1990 rules of thumb in cost per sq. ft.: Danger!
The same sq. ft. data center (i.e., 5,000 sq. ft.), varying density, reliability, location, long-term growth projection, and scalability: $416/sq. ft. to $2,140/sq. ft.
The Throw-Away Data Center
The Gartner Mission Costs
The Scheduled Delivery? 13

High Density Impacts to Data Center Costs The Blade Server Summit Conference: discussions of manufacturers' hardware (4/3/2/1)... Facilities? 14

High Density Impacts to Data Center Costs
A. Data center facility infrastructure cost impacts:
Physical size (see 1-5 year computer equipment plan; CRITICAL)
Electrical/mechanical capacity
Reliability level (see 1-10 chart)
Expandability
Retrofit vs. new. WARNING!
Time allocated to complete the project
Location in the United States or Canada
Type of construction labor force
Support space costs 15

High Density Impacts to Data Center Costs
B. Summary data center facility infrastructure cost experiences:
Numerical Ranking | Size (sq. ft.) | Cost/sq. ft.
10 | 15,000 | $1,000 $
7-9 | 15,000 | $1,000 $
5-7 | 15,000 | $1,000 $
1-5 | 15,000 | $1,000 $
Office | --------- | $
NOTE: These cost experiences are not intended to be used as detailed budgets. * Electrical density cost differences per MW. 16

High Density Impacts to Data Center Costs
Schedule duration for typical projects:
1. Evaluations: to determine technical alternatives/costs/schedules associated with the type of data center. Duration: 10-14 weeks
2. Design/Engineering: detail drawings/specifications for the option selected in item 1. Duration: 8-22 weeks (excludes a building shell)
3. Permits: for local authorities to review/approve. Duration: allow 4 weeks
4. General Construction: a function of reliability, size, and location. Duration: 10-26 weeks (excludes a building shell)
5. Pre-Purchase/Long Lead Time Equipment: warning based on co-location/web hosting facilities. Duration: up to 46 weeks
6. Thermodynamic (CFD) Modeling Projects: stand alone, based on information technology. Duration: 10-16 weeks 17

High Density Impacts to Data Center Costs
Project Team (organization chart): Client; Client Project Team; Project Director; OEM Availability; Installation Planning; Hardware/Software Project Manager; Design/Engineering Project Manager; Construction Project Manager; Communications; OEM Hardware Support; Architectural; General; Data; Computer Equipment Planning; Computer Equipment Migration; Electrical; Mechanical/Fire Protection; Civil/Structural; Electrical/Mechanical; Voice; Relocation/Move; Software Planning; CADD Manager; Disaster Recovery 18

Part III Thermodynamic Model Impacts to the Data Center Designs/Retrofits 19

Base Model The Shell, Underfloor, and Raised Floor 20

Base Model The Shell, Floor, CRAC's, PDU's, and Equipment 21

Base Model Equipment Orientation The equipment orientation is shown with the intake side of the units in blue and the exhaust side in red. It can be seen that not all equipment is oriented in a hot and cold aisle configuration, and not all equipment within the same row is oriented in the same direction. There are some front-to-back oriented equipment racks, as noted earlier. These racks will have a tendency to pull in hot exhaust air from the rack in front of them. 22

Base Model Equipment Powers The equipment demand can be seen here in kW. 23

Base Model Floor Void Pressures The static pressure below the floor appears to be lacking in the majority of the data center. This most likely results from the large unsealed cable cutouts throughout the raised floor area. The highest static pressure is seen in the corner of the data center, where two (2) CRAC units operate with little load on them and very few perforated tiles or cable cutouts. 24

Base Model Floor Void Temperatures The floor void temperature varies based upon CRAC unit operation, as would be expected. The observation here is that the higher temperatures can be associated with the lack of a demand on these units. Since little cooling is required, the units tend to warm the supply air or pass the relatively warm return air into the supply plenum. 25

Base Model Perforated Tile Flow Rates The airflow through most of the perforated tiles is relatively low though uniform. This can be attributed to the majority of the air escaping through the cable cutouts. 26
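
The link between the weak underfloor pressure and the low tile flow can be sketched with the orifice relation Q = Cd · A · √(2ΔP/ρ). All parameters here (a 25%-open 2×2 ft tile, discharge coefficient 0.65, air density 1.2 kg/m³) are assumed typical values, not figures from this study:

```python
import math

# Sketch of the orifice relation Q = Cd * A_free * sqrt(2*dP/rho) for a
# raised-floor perforated tile. Tile openness, discharge coefficient,
# and air density are assumed typical values, not from the CFD study.
def tile_flow_cfm(dp_pa: float, open_frac: float = 0.25,
                  cd: float = 0.65, rho: float = 1.2) -> float:
    a_free = 0.372 * open_frac                 # 2x2 ft tile = 0.372 m^2
    q_m3s = cd * a_free * math.sqrt(2 * dp_pa / rho)
    return q_m3s * 2118.9                      # m^3/s -> CFM

# Flow rises only with the square root of plenum pressure:
for dp in (5, 12.5, 25):
    print(f"{dp:5.1f} Pa plenum -> {tile_flow_cfm(dp):.0f} CFM per tile")
```

Because flow scales with the square root of pressure, quadrupling plenum pressure only doubles tile flow; conversely, unsealed cutouts that bleed off plenum pressure starve every tile at once, which is consistent with the roughly doubled tile airflow reported later when the cutouts are filled.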

Base Model Top of Rack Temperatures The top of the equipment rack is typically where the highest intake temperatures are seen in the data center. This is a result of this part of the rack receiving less supply air from the underfloor plenum, the higher ambient temperatures at this level of the room, or the recirculation of hot air due to low ceiling heights. It can also be seen that some cold air is being returned to the CRAC units while warm air is being exhausted into the intake of other equipment. Some short cycling is occurring where perforated tile placement allows colder air to return to the CRAC units. 27

Base Model Maximum Equipment Inlet Temperatures The ASHRAE recommended inlet temperature for computer equipment is between 68 and 77 deg F. This temperature is exceeded in several areas, shown as green and yellow. These temperatures, however, are still within the allowable ASHRAE range of 59 to 90 deg F. 28
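
A minimal sketch of the classification the slide is applying, using the recommended (68-77 °F) and allowable (59-90 °F) ranges cited above; the `classify_inlet` helper is illustrative:

```python
# Classify rack inlet temperatures against the ASHRAE ranges cited on
# this slide: recommended 68-77 F, allowable 59-90 F.
def classify_inlet(temp_f: float) -> str:
    if 68 <= temp_f <= 77:
        return "recommended"
    if 59 <= temp_f <= 90:
        return "allowable"
    return "out of range"

for t in (72, 82, 93):
    print(f"{t} F inlet: {classify_inlet(t)}")
```

An 82 °F inlet, for example, is outside the recommended band but still allowable, which is exactly the green/yellow situation the slide describes.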

Base Model Potential Equipment Overheat This indicates which racks may be at risk of thermal failures due to high intake temperatures. The specific rack configurations may negate such risk if no devices are installed in the top of the rack or if devices such as patch panels are installed at the top of the racks. 29

Base Model CRAC Unit Cooling Demands Units shown in red are working almost at their rated capacity while several others are operating below their rated capacity. 30
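
The single-unit failure runs that follow are essentially probing the N+1 question: with any one CRAC out, can the remaining units carry the room load? A minimal static sketch of that check; the unit capacities and room load below are illustrative numbers, not taken from this study:

```python
# Static N+1 check: with the worst single CRAC failure, does remaining
# capacity still cover the load? Capacities and load are illustrative.
def survives_single_failure(capacities_kw: list[float],
                            load_kw: float) -> bool:
    # Worst case: lose the largest unit.
    return sum(capacities_kw) - max(capacities_kw) >= load_kw

units = [90.0] * 13            # thirteen nominally identical CRACs
print(survives_single_failure(units, load_kw=1000))  # 12*90 = 1080 >= 1000
```

As the CRAC #9 failure case below shows, this aggregate-capacity arithmetic is necessary but not sufficient: the room can have spare capacity on paper and still overheat locally, because air patterns keep the surviving units from delivering that capacity where it is needed. That is precisely what the CFD model adds over a spreadsheet check.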

CRAC #1 Failed Top of Rack Temperatures The overall temperature does not change significantly from the failure of CRAC #1. This unit is located in the Network Room and the installed transfer fans adequately support the ventilation needs of the space. 31

CRAC #1 Failed Maximum Equipment Inlet Temperatures The inlet temperatures actually decrease during this failure. This is because the CRAC units in the data center work harder to compensate for the additional heat being transferred from the Network Room. 32

CRAC #1 Failed CRAC Unit Cooling Demands A comparison can be made against slide 30 to see the changes in unit operation. The CRAC units adjust to the new demands as the dynamics of the room change. 33

CRAC #2 Failed Top of Rack Temperatures 34

CRAC #2 Failed Maximum Equipment Inlet Temperatures 35

CRAC #2 Failed CRAC Unit Cooling Demands 36

CRAC #3 Failed Top of Rack Temperatures 37

CRAC #3 Failed Maximum Equipment Inlet Temperatures 38

CRAC #3 Failed CRAC Unit Cooling Demands 39

CRAC #4 Failed Top of Rack Temperatures 40

CRAC #4 Failed Maximum Equipment Inlet Temperatures 41

CRAC #4 Failed CRAC Unit Cooling Demands 42

CRAC #5 Failed Top of Rack Temperatures 43

CRAC #5 Failed Maximum Equipment Inlet Temperatures 44

CRAC #5 Failed CRAC Unit Cooling Demands 45

CRAC #6 Failed Top of Rack Temperatures 46

CRAC #6 Failed Maximum Equipment Inlet Temperatures 47

CRAC #6 Failed CRAC Unit Cooling Demands 48

CRAC #7 Failed Top of Rack Temperatures 49

CRAC #7 Failed Maximum Equipment Inlet Temperatures 50

CRAC #7 Failed CRAC Unit Cooling Demands 51

CRAC #8 Failed Top of Rack Temperatures 52

CRAC #8 Failed Maximum Equipment Inlet Temperatures 53

CRAC #8 Failed CRAC Unit Cooling Demands 54

CRAC #9 Failed Top of Rack Temperatures The failure of this unit has the most significant effect in the data center. Several equipment racks have exceeded the recommended maximum intake temperatures due to the elevated ambient temperatures in this area of the room. Due to the air patterns in the room the adjacent CRAC units cannot compensate for the loss of this unit. 55

CRAC #9 Failed Maximum Equipment Inlet Temperatures 56

CRAC #9 Failed CRAC Unit Cooling Demands 57

CRAC #10 Failed Top of Rack Temperatures 58

CRAC #10 Failed Maximum Equipment Inlet Temperatures 59

CRAC #10 Failed CRAC Unit Cooling Demands 60

CRAC #12 Failed Top of Rack Temperatures 61

CRAC #12 Failed Maximum Equipment Inlet Temperatures 62

CRAC #12 Failed CRAC Unit Cooling Demands 63

CRAC #13 Failed Top of Rack Temperatures 64

CRAC #13 Failed Maximum Equipment Inlet Temperatures 65

CRAC #13 Failed CRAC Unit Cooling Demands 66

CRAC #9 Supplement Top of Rack Temperatures A supplemental unit is added in the area of CRAC #9 to compensate for the loss of that unit. While the new unit does help, it does not alleviate all of the potential problems. Plenum extensions were added to selected units to help reduce the short cycling that was encountered due to perforated tiles being located in the hot aisles relative to CRAC placement. 67

CRAC #9 Supplement Perforated Tile Flow Rates The recommendation to fill in all cable openings resulted in nearly doubling the overall air flow through the perforated tiles. 68

CRAC #9 Supplement Floor Void Pressures Filling the cable cutouts also made a significant improvement in the overall static pressures underfloor. 69

CRAC #9 Supplement Above Floor Air Patterns Hot air recirculation is still occurring in certain areas due to improper equipment layout. The management of hot air is the number one challenge in today's data centers. Proper hot/cold aisle configurations, as well as double hot/cold aisle layouts for high density areas, provide the most effective control of the hotter air. 70

Part IV Questions and Answers General Discussion Mark S. Evanko 888-704-1400 www.bruns-pak.com 71