BRUNS-PAK Presents MARK S. EVANKO, Principal Data Centers of the Future and the Impact of High Density Computing on Facility Infrastructures - Trends, Air-Flow, Green/LEED, Cost, and Schedule Considerations Friday, July 13, 2007
Agenda
I. High Density Computing Equipment and the Thermodynamic Evolution
II. High Density Impacts to Data Center Costs
III. Thermodynamic Model Impacts to the Data Center Designs/Retrofits
IV. Questions and Answers
2
Part I High Density Computing Equipment and the Thermodynamic Evolution 3
High Density Computing Equipment and the Thermodynamic Evolution

The Data Center Solution Components (chant!):
A. Computer Hardware
B. Computer Software
C. Telecommunications (Data/Tele)
D. Facility Infrastructure

The Recent Data Center History of the World:
- The early watts-per-sq.-ft. metric (1980/1990)
- The high density discussion/announcement associated with blade server technology (2000)
4
High Density Computing Equipment and the Thermodynamic Evolution

The Data Center Facility Infrastructure Transformation
- Discussion/evolution from capacity (watts per sq. ft.) to distribution (cfm/static)
- The retraction of the Uptime Institute Spring 2006 tier ratings
- The evolution of facility infrastructure reliability vs. data processing uptime
- The role of thermodynamic (CFD) modeling
- One size/standard facility infrastructure metric does not fit all!
- GREEN/LEED considerations: A) EPA draft of April 23, 2007, on server and data center efficiencies
5
High Density Computing Equipment and the Thermodynamic Evolution

Data center facility infrastructure summary reliability rankings:

(1) Unreliable: Shared building power and cooling; no generator
(2) Partially Isolated, Unreliable: Dedicated power system; shared cooling system; unconditioned power; non-redundant air conditioning; no generator
(3) Isolated, Unreliable: Dedicated power and cooling systems; unconditioned power; non-redundant dedicated air conditioning units; no generator
(4) Isolated, Conditioned: Dedicated power and cooling systems; conditioned power; non-redundant dedicated A/C units; no generator
(5) Isolated, Improved: Dedicated power and cooling systems; uninterruptible power system; non-redundant dedicated A/C units; no generator
(6) Isolated, Mostly Reliable: Dedicated power and cooling systems; uninterruptible power system; redundant dedicated A/C units; no generator
(7) Reliable: Dedicated power and cooling systems; uninterruptible power system; redundant dedicated A/C units; generator
(8) Reliable, Redundant: Dedicated power and cooling systems; redundant UPS systems; redundant dedicated A/C units; redundant generators
(9) Ultra-Reliable: Redundant power train; redundant cooling system; redundant UPS systems; redundant dedicated A/C units; redundant generator systems; redundant fuel system
(10) State of the Art: Redundant power train; redundant cooling system; redundant UPS systems; redundant dedicated A/C units; redundant generator systems; redundant fuel system; site hardened for weather and geographic exposures; location minimizes exposure to jurisdictional closure from hazardous spill, terrorism, or similar risks
6
High Density Computing Equipment and the Thermodynamic Evolution

Uptime Institute tier availability figures:
- Tier I: 99.671%
- Tier II: 99.749%
- Tier III: 99.982%
- Tier IV: 99.995%
7
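The tier percentages above translate directly into allowable downtime per year. A minimal sketch of that conversion, using the availability figures from the slide:

```python
# Annual downtime implied by each tier's availability percentage.
HOURS_PER_YEAR = 8760

TIERS = {"Tier I": 99.671, "Tier II": 99.749, "Tier III": 99.982, "Tier IV": 99.995}

def downtime_hours(availability_pct: float) -> float:
    """Hours of downtime per year implied by an availability percentage."""
    return (1 - availability_pct / 100) * HOURS_PER_YEAR

for tier, pct in TIERS.items():
    print(f"{tier} ({pct}%): {downtime_hours(pct):.1f} hours/year of downtime")
```

The spread is the point: Tier I allows roughly 29 hours per year of downtime, while Tier IV allows well under one hour, which is what drives the cost differences discussed in Part II.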
High Density Computing Equipment and the Thermodynamic Evolution

A. Historical Data Center Facility Infrastructure Baseline
- Mainframe and minicomputer equipment platforms
- Older-technology DASD
- Tape and tape storage
- Gross sq. ft. sample densities: 20 to 45 watts/sq. ft. of computer equipment
- 8- to 24-inch raised floor
- Level 7 reliability
- Building office space generally sufficient for structural load and/or simply modified
- 7'6" to 9'0" distance from raised floor to underside of suspended ceiling
- Bus/tag cable dams in the underfloor
8
High Density Computing Equipment and the Thermodynamic Evolution

B. The Evolution of the Server, the Blade Server, and the Super Server
- Stand-alone
- Rack-mounted
- Population growth: 1U vs. 2U vs. 3U
- Blade servers
- The density of servers per rack: 1 kW to 24/41/72++ kW
- The super server solution announced: 3,000 lb. devices over a footprint of approximately 11 sq. ft., with a 30 kW demand and 67 kBTU of heat rejection!
- Chilled water
- Potential announcements by computer equipment manufacturers of 50/60/70+ kW stand-alone computer equipment devices (announced 2007)
9
High Density Computing Equipment and the Thermodynamic Evolution

C. The Facility Infrastructure Impact Potential. NOTE: Each client must have their short-/long-term computer equipment growth projection modeled independently. There are no standard guidelines. WARNING!
- 100, 400, 700, 1,000, 2,000+ watts per sq. ft. as interpreted from the computer equipment floor plan
- 12- to 48-inch raised floor height
- Level 7, 8, 9, 10 infrastructures depending on the disaster recovery/mirroring plan. WARNING: Medical community with patient care applications.
- Structural loading exceeding 100, 300, 500 lbs. per sq. ft.
- Existing multi-floor building structures do not have sufficient floor-to-underside-of-deck clearances
- Short-term problem avoided by the addition of one unit in a 5,000 sq. ft. data center
- Electrical/mechanical capacity on the floor EXCEEDS present loads; however, thermal problems cause shutdown
10
High Density Computing Equipment and the Thermodynamic Evolution

Rack Dissipation
- Air side: 7/12 kW nominal
- Impact: hot/cold aisles
- Impact: double hot/cold aisles
- Air-side stretch: 4-22+ kW (CFD)
- Water cooling
- Recirculation warning: no standard
- CFD models strongly recommended
11
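The shift from capacity (watts/sq. ft.) to distribution (cfm/static) can be illustrated with the standard sensible-heat relation for air, CFM ≈ BTU/hr ÷ (1.08 × ΔT). A hedged sketch; the 20 °F supply-to-return ΔT is an assumption, and the rack powers simply echo the kW figures discussed in this section:

```python
# Airflow required to remove a rack's sensible heat at a given ΔT (°F).
BTU_PER_WATT = 3.412  # 1 W = 3.412 BTU/hr

def required_cfm(rack_kw: float, delta_t_f: float = 20.0) -> float:
    """CFM of supply air needed to absorb rack_kw of sensible heat."""
    q_btu_hr = rack_kw * 1000 * BTU_PER_WATT
    return q_btu_hr / (1.08 * delta_t_f)

for kw in (7, 12, 22, 30):
    print(f"{kw:>2} kW rack -> {required_cfm(kw):,.0f} CFM")
```

A 30 kW rack needs on the order of 4,700 CFM at a 20 °F ΔT, which is far more air than a few perforated tiles can deliver; this is why distribution, not raw capacity, becomes the limiting factor at high density.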
Part II High Density Impacts to Data Center Costs 12
High Density Impacts to Data Center Costs

One of the most dangerous/variable factors in designing/building a data center in 2006 and beyond:
- 1980/1990 rules of thumb in cost per sq. ft.: DANGER
- Same sq. ft. data center (i.e., 5,000 sq. ft.): vary the density, reliability, location, long-term growth projection, and scalability, and the cost ranges from $416/sq. ft. to $2,140/sq. ft.
- The throwaway data center
- The Gartner mission costs
- The scheduled delivery?
13
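For the 5,000 sq. ft. example, the quoted per-square-foot spread implies a roughly five-fold total-cost range for the same-size room; a quick check using the figures from the slide:

```python
# Total-cost spread for the same 5,000 sq. ft. room at the quoted $/sq. ft. extremes.
AREA_SQFT = 5000                  # example size from the slide
LOW_RATE, HIGH_RATE = 416, 2140   # $/sq. ft. range from the slide

low_total = AREA_SQFT * LOW_RATE
high_total = AREA_SQFT * HIGH_RATE
print(f"${low_total:,} to ${high_total:,} for the same floor area")
```

The same square footage can cost anywhere from about $2.1M to $10.7M depending on density, reliability level, location, and growth plan, which is why a single cost-per-sq.-ft. rule of thumb is dangerous.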
High Density Impacts to Data Center Costs

The Blade Server Summit Conference: discussions among manufacturers of hardware (4/3/2/1)... Facilities? 14
High Density Impacts to Data Center Costs

A. Data center facility infrastructure cost impacts
- Physical size (see the 1-5 year computer equipment plan: CRITICAL)
- Electrical/mechanical capacity
- Reliability level (see the 1-10 chart)
- Expandability
- Retrofit vs. new: WARNING!
- Time allocated to complete the project
- Location in the United States or Canada
- Type of construction labor force
- Support space costs
15
High Density Impacts to Data Center Costs

B. Summary data center facility infrastructure cost experiences:

Numerical Ranking | Size (sq. ft.) | Cost/sq. ft.
10                | 15,000         | $1,000
7-9               | 15,000         | $1,000
5-7               | 15,000         | $1,000
1-5               | 15,000         | $1,000
Office            | ---------      | $

NOTE: These cost experiences are not intended to be used as detailed budgets.
* Electrical density cost differences per MW.
16
High Density Impacts to Data Center Costs

Schedule duration for typical projects:

1. Evaluations: to determine technical alternatives/costs/schedules associated with the type of data center. Duration: 10-14 weeks.
2. Design/Engineering: detail drawings/specifications for the option selected in item 1. Duration: 8-22 weeks (excludes a building shell).
3. Permits: for local authorities to review/approve. Duration: allow 4 weeks.
4. General Construction: a function of reliability, size, and location. Duration: 10-26 weeks (excludes a building shell).
5. Pre-Purchase/Long-Lead-Time Equipment: warning based on co-location/web hosting facilities. Duration: up to 46 weeks.
6. Thermodynamic (CFD) Modeling Projects: stand-alone, based on information technology. Duration: 10-16 weeks.
17
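If items 1 through 4 run strictly in sequence, the durations above bracket the overall project timeline. A small sketch; treating long-lead equipment (item 5) as a parallel constraint rather than an additional sequential phase is an assumption, not something the schedule states:

```python
# Bracket the end-to-end duration of the sequential phases (items 1-4).
phases_weeks = {
    "Evaluations": (10, 14),
    "Design/Engineering": (8, 22),
    "Permits": (4, 4),
    "General Construction": (10, 26),
}

seq_min = sum(lo for lo, _ in phases_weeks.values())
seq_max = sum(hi for _, hi in phases_weeks.values())
print(f"Sequential phases: {seq_min}-{seq_max} weeks")

# Long-lead equipment (up to 46 weeks) can govern if it doesn't overlap enough.
LONG_LEAD_WEEKS = 46
print(f"Critical path is at least {max(seq_min, LONG_LEAD_WEEKS)} weeks "
      "if equipment procurement starts at project kickoff and governs")
```

Even the optimistic sequential case is roughly eight months; the pessimistic case exceeds a year, which is why the pre-purchase warning in item 5 matters.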
High Density Impacts to Data Center Costs

Project Team
- Client / Client Project Team / Project Director
- OEM Availability, Installation Planning, Hardware/Software Project Manager
- Design/Engineering Project Manager, Construction Project Manager
- Communications, OEM Hardware Support
- Architectural, General Data
- Computer Equipment Planning, Computer Equipment Migration
- Electrical, Mechanical/Fire Protection, Civil/Structural
- Electrical/Mechanical, Voice Relocation/Move
- Software Planning, CADD Manager, Disaster Recovery
18
Part III Thermodynamic Model Impacts to the Data Center Designs/Retrofits 19
Base Model The Shell, Underfloor, and Raised Floor 20
Base Model The Shell, Floor, CRAC's, PDU's, and Equipment 21
Base Model Equipment Orientation The equipment orientation is shown with the intake side of the units in blue and the exhaust side in red. Not all equipment is oriented in a hot/cold aisle configuration, and not all equipment within the same row is oriented in the same direction. There are some front-to-back oriented equipment racks, as noted earlier; these racks will tend to pull in hot exhaust air from the rack in front of them. 22
Base Model Equipment Powers The equipment demand can be seen here in kW. 23
Base Model Floor Void Pressures The static pressure below the floor is low across the majority of the data center, most likely a result of the large unsealed cable cutouts throughout the raised floor area. The highest static pressure occurs in the corner of the data center, where two (2) CRAC units operate with little load on them and where there are very few perforated tiles or cable cutouts. 24
Base Model Floor Void Temperatures The floor void temperature varies based upon CRAC unit operation, as would be expected. The observation here is that the higher temperatures can be associated with the lack of a demand on these units. Since little cooling is required, the units tend to warm the supply air or pass the relatively warm return air into the supply plenum. 25
Base Model Perforated Tile Flow Rates The airflow through most of the perforated tiles is relatively low though uniform. This can be attributed to the majority of the air escaping through the cable cutouts. 26
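The observation that unsealed cutouts starve the perforated tiles follows from orifice flow: delivery through any floor opening scales with its open area and with the square root of the underfloor static pressure (CFM ≈ Cd × A × 4005 × √ΔP, with ΔP in inches w.c. and A in sq. ft.). A sketch with illustrative numbers; the areas, pressure, and discharge coefficient are assumptions, not values from the model:

```python
import math

CD = 0.65  # sharp-edged orifice discharge coefficient (typical assumption)

def opening_cfm(open_area_sqft: float, static_in_wc: float) -> float:
    """Airflow through a floor opening at a given underfloor static pressure."""
    return CD * open_area_sqft * 4005 * math.sqrt(static_in_wc)

STATIC = 0.05  # inches w.c., illustrative plenum pressure
tile = opening_cfm(1.0, STATIC)     # 25%-open 2'x2' perforated tile (~1 sq. ft. open)
cutout = opening_cfm(0.75, STATIC)  # one unsealed 6"x18" cable cutout (0.75 sq. ft.)

print(f"Perforated tile: {tile:.0f} CFM; each unsealed cutout: {cutout:.0f} CFM")
# Many unsealed cutouts in parallel bleed off plenum pressure, and every tile's
# delivery then drops with the square root of that reduced pressure.
```

A single unsealed cutout can pass nearly as much air as a perforated tile, so dozens of them dominate the plenum and leave the tiles uniformly starved, which is exactly the pattern the model shows.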
Base Model Top of Rack Temperatures The top of the equipment rack is typically where the highest intake temperatures are seen in the data center. This is a result of this part of the rack receiving less supply air from the underfloor plenum, the higher ambient temperatures at this level of the room, or the recirculation of hot air due to low ceiling heights. It can also be seen that some cold air is being returned to the CRAC units while warm air is being exhausted into the intake of other equipment. Some short cycling is occurring where perforated tile placement allows colder air to return to the CRAC units. 27
Base Model Maximum Equipment Inlet Temperatures The ASHRAE recommended inlet temperature for computer equipment is between 68 and 77 deg F. This temperature is exceeded in several areas, shown in green and yellow. These temperatures, however, are still within the allowable ASHRAE range of 59 to 90 deg F. 28
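The two ASHRAE bands quoted above can be encoded as a simple classifier for inlet readings; the sample temperatures here are hypothetical, not taken from the model:

```python
# ASHRAE inlet-temperature bands quoted above: recommended 68-77 °F,
# allowable 59-90 °F.
RECOMMENDED = (68.0, 77.0)
ALLOWABLE = (59.0, 90.0)

def classify_inlet(temp_f: float) -> str:
    """Place a rack inlet temperature into the ASHRAE bands."""
    if RECOMMENDED[0] <= temp_f <= RECOMMENDED[1]:
        return "recommended"
    if ALLOWABLE[0] <= temp_f <= ALLOWABLE[1]:
        return "allowable"
    return "out of range"

for reading in (72.0, 85.0, 95.0):  # hypothetical top-of-rack readings
    print(f"{reading} F -> {classify_inlet(reading)}")
```

The green/yellow areas on the slide correspond to the "allowable" band: above recommended but not yet an outright violation.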
Base Model Potential Equipment Overheat This indicates which racks may be at risk of thermal failures due to high intake temperatures. The specific rack configurations may negate such risk if no devices are installed in the top of the rack or if devices such as patch panels are installed at the top of the racks. 29
Base Model CRAC Unit Cooling Demands Units shown in red are working almost at their rated capacity while several others are operating below their rated capacity. 30
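The single-unit failure runs that follow are essentially an N+1 check: with any one CRAC down, the remaining capacity must still cover the room load, and (as the CRAC #9 case later shows) the air must also be able to reach that load. A capacity-only sketch; the unit count, ratings, and load are hypothetical:

```python
# N+1 capacity check: does the room survive the loss of its largest unit?
# This ignores airflow distribution, which the CFD failure runs address.
def survives_any_single_failure(unit_kw: list[float], load_kw: float) -> bool:
    """True if cooling load is still covered after losing the largest CRAC."""
    return sum(unit_kw) - max(unit_kw) >= load_kw

units = [88.0] * 12  # hypothetical: twelve 88 kW (25-ton class) CRAC units
print(survives_any_single_failure(units, load_kw=900.0))   # 968 kW remain
print(survives_any_single_failure(units, load_kw=1000.0))  # capacity shortfall
```

Passing this arithmetic check is necessary but not sufficient: the CRAC #9 failure case demonstrates that adjacent units with spare capacity still cannot help if the room's air patterns keep their supply air away from the affected racks.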
CRAC #1 Failed Top of Rack Temperatures The overall temperature does not change significantly from the failure of CRAC #1. This unit is located in the Network Room and the installed transfer fans adequately support the ventilation needs of the space. 31
CRAC #1 Failed Maximum Equipment Inlet Temperatures The inlet temperatures actually decrease during this failure. This is because the CRAC units in the data center work harder to compensate for the additional heat being transferred from the Network Room. 32
CRAC #1 Failed CRAC Unit Cooling Demands A comparison can be made against slide 30 to see the changes in unit operation. The CRAC units adjust to the new demands as the dynamics of the room change. 33
CRAC #2 Failed Top of Rack Temperatures 34
CRAC #2 Failed Maximum Equipment Inlet Temperatures 35
CRAC #2 Failed CRAC Unit Cooling Demands 36
CRAC #3 Failed Top of Rack Temperatures 37
CRAC #3 Failed Maximum Equipment Inlet Temperatures 38
CRAC #3 Failed CRAC Unit Cooling Demands 39
CRAC #4 Failed Top of Rack Temperatures 40
CRAC #4 Failed Maximum Equipment Inlet Temperatures 41
CRAC #4 Failed CRAC Unit Cooling Demands 42
CRAC #5 Failed Top of Rack Temperatures 43
CRAC #5 Failed Maximum Equipment Inlet Temperatures 44
CRAC #5 Failed CRAC Unit Cooling Demands 45
CRAC #6 Failed Top of Rack Temperatures 46
CRAC #6 Failed Maximum Equipment Inlet Temperatures 47
CRAC #6 Failed CRAC Unit Cooling Demands 48
CRAC #7 Failed Top of Rack Temperatures 49
CRAC #7 Failed Maximum Equipment Inlet Temperatures 50
CRAC #7 Failed CRAC Unit Cooling Demands 51
CRAC #8 Failed Top of Rack Temperatures 52
CRAC #8 Failed Maximum Equipment Inlet Temperatures 53
CRAC #8 Failed CRAC Unit Cooling Demands 54
CRAC #9 Failed Top of Rack Temperatures The failure of this unit has the most significant effect in the data center. Several equipment racks have exceeded the recommended maximum intake temperatures due to the elevated ambient temperatures in this area of the room. Due to the air patterns in the room the adjacent CRAC units cannot compensate for the loss of this unit. 55
CRAC #9 Failed Maximum Equipment Inlet Temperatures 56
CRAC #9 Failed CRAC Unit Cooling Demands 57
CRAC #10 Failed Top of Rack Temperatures 58
CRAC #10 Failed Maximum Equipment Inlet Temperatures 59
CRAC #10 Failed CRAC Unit Cooling Demands 60
CRAC #12 Failed Top of Rack Temperatures 61
CRAC #12 Failed Maximum Equipment Inlet Temperatures 62
CRAC #12 Failed CRAC Unit Cooling Demands 63
CRAC #13 Failed Top of Rack Temperatures 64
CRAC #13 Failed Maximum Equipment Inlet Temperatures 65
CRAC #13 Failed CRAC Unit Cooling Demands 66
CRAC #9 Supplement Top of Rack Temperatures A supplemental unit is added in the area of CRAC #9 to compensate for the loss of that unit. While the new unit does help, it does not alleviate all of the potential problems. Plenum extensions were added to selected units to help reduce the short cycling encountered where perforated tiles were located in the hot aisles relative to CRAC placement. 67
CRAC #9 Supplement Perforated Tile Flow Rates The recommendation to fill in all cable openings resulted in nearly doubling the overall air flow through the perforated tiles. 68
CRAC #9 Supplement Floor Void Pressures Filling the cable cutouts also made a significant improvement in the overall static pressures underfloor. 69
CRAC #9 Supplement Above Floor Air Patterns Hot air recirculation still occurs in certain areas due to improper equipment layout. Managing hot air is the number-one challenge in today's data centers. Proper hot/cold aisle configurations, as well as double hot/cold aisle layouts for high density areas, provide the most effective control of the hotter air. 70
Part IV Questions and Answers General Discussion Mark S. Evanko 888-704-1400 www.bruns-pak.com 71