Improving Rack Cooling Performance Using Airflow Management Blanking Panels
White Paper 44, Revision 4
by Neil Rasmussen

Executive summary

Unused vertical space in open frame racks and rack enclosures creates an unrestricted recycling of hot air that causes equipment to heat up unnecessarily. The use of airflow management blanking panels can reduce this problem. This paper explains and quantifies the effects of airflow management blanking panels on cooling system performance.

Contents

Introduction
Exhaust air recirculation
Why aren't airflow management blanking panels commonly deployed?
Other contributing factors to improper airflow
Real world example
Conclusion
Resources
Appendix A

White papers are now part of the Schneider Electric white paper library produced by Schneider Electric's Data Center Science Center.
Introduction

Information technology equipment mounted in racks cools itself by drawing ambient air from the data center or network room. If the heated exhaust air is allowed to return to the air inlet of the equipment, an undesirable overheating condition may occur. Data centers and network rooms should be designed to prevent equipment from drawing heated air; this can be accomplished by widely used installation practices, or by using systems that are pre-engineered.

Within the rack itself, the possibility exists for hot exhaust air to be recycled into the equipment air intake. This is mainly caused when hot exhaust air returns above or below the equipment and back to the air intake. This phenomenon is not widely appreciated by users and is a primary cause of equipment overheating in actual data centers. This paper explains how this problem occurs, provides actual examples of the effect, and shows that this problem significantly compromises the cooling of equipment when it occurs. The benefit of using airflow management blanking panels to reduce the problem is explained and quantified.

Exhaust air recirculation

Overheating due to exhaust air recirculation, and the benefits of using airflow management blanking panels, are well recognized by IT equipment manufacturers. In fact, IT equipment manufacturers advise users of the need to use airflow management blanking panels. The following excerpt is taken from a Compaq server installation guide:

Blanking Panels
CAUTION: Always use blanking panels to fill empty vertical spaces in the rack to maintain proper airflow. Using a rack without blanking panels results in improper cooling that can lead to thermal damage. If any of the vertical space in the rack is not filled by components, the gaps between components cause a change in airflow through the rack and across the components. Cover these gaps with blanking panels to maintain proper airflow.

An example of how the air flows in a typical rack is provided in Figure 1.
In Figure 1A, the airflow without airflow management blanking panels installed is diagrammed. In Figure 1B, the installation of airflow management blanking panels changes the airflow.

Figures 1A and 1B: Diagrams of rack airflow (side views) showing the effect of airflow management blanking panels. 1A (left): without airflow management blanking panels. 1B (right): with airflow management blanking panels.
Note that when recirculation creates an overheating condition, and this recirculation is not eliminated, the only practical solution to the problem is to decrease the bulk air temperature supplied to the room to attempt to balance the effect. This reduces the efficiency of the air conditioning system, creates additional condensate (water) generation by the main air conditioning system, and creates a need for supplemental humidification. These consequences can result in significantly increased electricity costs, and they can make it uncomfortable for personnel in the data center.

Why aren't airflow management blanking panels commonly deployed?

Airflow management blanking panels are not commonly deployed because of two main factors. The first factor is a lack of knowledge. Many misunderstand the role an airflow management panel plays within the rack itself; some believe that blanking panels exist for aesthetic purposes only. This paper should serve to illustrate and clarify this issue.

The second factor is the difficulty of installation. Legacy screw-in airflow management blanking panels require up to four screws, four bushings, and four cage nuts to install. This takes time and adds difficulty to the process of deploying a rack. Human error is a serious issue when installing screw-in airflow management blanking panels, because small cage nuts, screws, and bushings are oftentimes dropped near production equipment, leading to potential downtime.

In addition, legacy screw-in airflow management blanking panels typically ship in kits of various U heights. For example, a kit might include 1, 2, 4, and 8U panels. This presents a challenge because not only does the correct amount of total U height need to be available, but also the right combination of panel sizes to fill the required spaces. These factors can slow down what is often a time-sensitive process when deploying or refreshing a data center.
Having airflow management blanking panels that snap in to any square-holed rack enclosure and install without tools significantly reduces the time and labor cost associated with installation. In addition, by standardizing on a panel size of 1U, racks can be populated easily, rather than dividing empty spaces among various-sized panels of 1, 2, 4, and 8U. For instance, if there was a 3U space that needed to be filled in a rack, and only two 2U panels remained, the space could not be filled with the material on hand. One would have to wait for an order of 1U panels to arrive before the installation could be completed. An example of a solution that meets these requirements is the Schneider Electric AR8136BLK airflow management panel, as seen in Figures 2A and 2B.

Figure 2A: Example of modular snap-in airflow management panel
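The panel-combination problem described above (a 3U gap that two 2U panels cannot fill exactly) is a small exact-sum check over the panels on hand. The following is a minimal Python sketch (the helper name `can_fill` is hypothetical, not from the paper) showing why a stock of mixed sizes can strand a gap while 1U panels cannot:

```python
from itertools import combinations_with_replacement

def can_fill(gap_u, stock):
    """Return a combination of on-hand panel heights (in U) that exactly
    fills a gap of gap_u rack units, or None if no combination works.
    `stock` lists the heights of the panels currently on hand."""
    for r in range(1, len(stock) + 1):
        for combo in combinations_with_replacement(sorted(set(stock)), r):
            # An exact fit must also respect how many of each size we have
            if sum(combo) == gap_u and all(
                combo.count(s) <= stock.count(s) for s in set(combo)
            ):
                return combo
    return None

print(can_fill(3, [2, 2]))      # None: the paper's stranded 3U gap
print(can_fill(3, [1, 1, 1]))   # (1, 1, 1): 1U panels tile any gap
```

Because every gap height is a whole number of rack units, standardizing on 1U panels makes this check succeed whenever enough total U of panels is on hand, which is the practical argument for the 1U standard.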
Figure 2B: Snap-in feature of airflow management panel

Consider the material and labor cost of installing airflow management blanking panels in a 100-rack data center, assuming an average of 10U of empty space in each rack (a total of 1000U of airflow management blanking panels). Table 1 compares the cost of installing 1U snap-in panels with the cost of installing legacy screw-in panels of various sizes. The material cost savings are on the order of 41%, and the labor cost savings are 97%, for a total cost savings of 48% when snap-in airflow management blanking panels are used.
Table 1: Airflow management panel cost analysis for a 100-rack data center

                                            1U snap-in   Variety pack   1U screw-in   2U screw-in   4U screw-in   8U screw-in
Typical blanking panel cost per U           $4.00        $4.67          $12.00        $7.25         $6.13         $4.00
Blanking panel cost for 1000U               $4,000       $4,670         $12,000       $7,250        $6,130        $4,000
Average install time per panel (seconds)    4.3          300            300           300           300           300
Time to install 1000U of panels (hours)     1.2          22.3           83.3          41.7          20.8          10.4
Install cost for 1000U at $25 / hour        $29.76       $558           $2,083        $1,042        $521          $260

Material cost
Total material cost for 1U snap-in blanking panels:       $4,000
Total average material cost for screw-in blanking panels: $6,810

Labor cost
Total labor cost using 1U snap-in blanking panels:        $29.76
Total average labor cost using screw-in blanking panels:  $893

Savings
Material cost savings using 1U snap-in blanking panels:   41.2%
Labor cost savings using 1U snap-in blanking panels:      96.7%

On average, 1U snap-in blanking panels install 30 times faster than legacy screw-in panels.

Analysis assumptions:
- 100-rack data center with an average of 10U of space to fill per rack, or 1000U of blanking panels required
- 3-minute install time for (42) 1U snap-in blanking panels
- 5-minute install time per legacy screw-in blanking panel
- Legacy blanking panels have 4 holes per panel
- Variety pack consists of 1U, 2U, 4U, and 8U blanking panels (1 of each); 67 kits are used for 1000U
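The savings figures in Table 1 follow directly from its stated assumptions, so they can be re-derived. Below is a minimal Python sketch of that arithmetic (variable names are mine; the inputs are the table's per-U costs, the $25/hour labor rate, 5 minutes per screw-in panel, and 3 minutes per 42 snap-in panels):

```python
LABOR_RATE = 25.0      # $/hour, from the analysis assumptions
TOTAL_U = 1000         # 100 racks x 10U of empty space per rack

# 1U snap-in: $4.00 per U; 3 minutes to install 42 panels
snap_material = 4.00 * TOTAL_U
snap_labor = (TOTAL_U / 42) * (3 / 60) * LABOR_RATE

# Legacy screw-in options: (cost per U, number of panels for 1000U).
# The variety pack uses 67 kits of four panels (1+2+4+8U) = 268 panels.
screw_options = [
    (4.67, 268),    # variety pack
    (12.00, 1000),  # 1U panels
    (7.25, 500),    # 2U panels
    (6.13, 250),    # 4U panels
    (4.00, 125),    # 8U panels
]
screw_material = [cost_u * TOTAL_U for cost_u, _ in screw_options]
screw_labor = [n * (5 / 60) * LABOR_RATE for _, n in screw_options]

avg_material = sum(screw_material) / len(screw_material)   # $6,810
avg_labor = sum(screw_labor) / len(screw_labor)            # ~$893

print(f"material savings: {1 - snap_material / avg_material:.1%}")  # ~41%
print(f"labor savings:    {1 - snap_labor / avg_labor:.1%}")        # ~97%
total = 1 - (snap_material + snap_labor) / (avg_material + avg_labor)
print(f"total savings:    {total:.1%}")                             # ~48%
```

Rounding aside, this reproduces the paper's 41% / 97% / 48% figures and the 30x install-speed ratio (about 1.2 hours versus an average of roughly 36 hours for the screw-in options).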
Other contributing factors to improper airflow

The absence of airflow management blanking panels in unused rack space is only one path that permits exhaust air recirculation. There are also types of equipment mounted in a rack that permit hot exhaust air to return to the front of the rack. In addition, some rack designs do not inherently separate intake and exhaust air. The key contributing factors to air leakage, and how to control them, are summarized in Table 2. This list can serve as a checklist for an audit of existing data centers, or for evaluating proposed data center and network room designs. Table 2 suggests that the selection of equipment such as racks and monitors should not be done without considering rack airflow. Implementation of the control principles summarized in Table 2 is essential to assuring an optimal and reliable rack cooling system.

Table 2: Factors contributing to overheating caused by air recirculation in the rack, along with methods to control them

Contributing factor: Unused vertical rack space
Consequence: Open area permits hot exhaust air to return to the equipment air intake, resulting in overheating
Control / checklist: Use airflow management blanking panels in all unused rack locations

Contributing factor: Rack rails inset from side of enclosure
Consequence: Open side area permits hot exhaust air to return to the equipment air intake, resulting in overheating
Control / checklist: Do not use 23 in (584 mm) racks with rails set to 19 in (483 mm); use racks which do not have open space between the rail and the side of the enclosure

Contributing factor: Monitors on shelves
Consequence: Open space around the monitor permits hot exhaust air to return to the equipment air intake, resulting in overheating
Control / checklist: Use thin flip-top LCD monitors; obtain rack-mount bezels for CRT monitors

Contributing factor: Tower servers on shelves
Consequence: Open space around the servers permits hot exhaust air to return to the equipment air intake, resulting in overheating
Control / checklist: Use rack-mount servers. Note: the power density of tower servers in the rack is very low, which reduces the magnitude of this problem

Contributing factor: Vertical rack space used to pass cables from front to back of rack
Consequence: Open space around the cables permits hot exhaust air to return to the equipment air intake, resulting in overheating
Control / checklist: Use airflow management blanking panels equipped with a flexible brush or shield that allows the cables to pass through while reducing air leakage

Contributing factor: Front or rear doors on the rack with restricted ventilation
Consequence: Airflow resistance of the doors creates a pressure gradient which amplifies all of the above effects
Control / checklist: Use fully perforated front and rear doors; do not use doors with glass, or doors with limited perforation

Contributing factor: Space between racks
Consequence: Open area permits hot exhaust air to return to the equipment air intake, resulting in overheating
Control / checklist: Bay racks together wherever possible
Real world example

The quantitative benefit of using airflow management blanking panels was evaluated by measuring the effect on an actual rack set up with servers in typical conditions. The conditions of this experiment are described in Appendix A. The reduction in the temperature rise of the server inlet air resulting from the installation of an airflow management panel is shown in Figure 3.

Figure 3: Effect of installation of airflow management panel on server air inlet temperature (side views). 3A (left): without airflow management blanking panels, server inlet temperatures range from 70 F (21 C) at the bottom of the rack to 95 F (35 C) just above the open space. 3B (right): with airflow management blanking panels, server inlet temperatures range from 70 F (21 C) to 73 F (23 C).

A summary of this data is presented in Table 3. The data show that the coolest servers are located at the bottom of the rack and are unaffected by the use of the airflow management panel. The hottest server is located just above the unused and open vertical rack space, and experiences an inlet temperature reduction of over 20 F (over 11 C) when the airflow management panel is installed.

Table 3: Experimental data showing the effect of airflow management blanking panels on server air inlet temperature

                  Without blanking panels   With blanking panels (same server)   Temperature difference
Hottest server    95 F (35 C)               73 F (23 C)                          22 F (12 C)
Coolest server    70 F (21 C)               70 F (21 C)                          0 F (0 C)

The controlled test represents a case where a number of high-density racks are located side by side in long rows. In practice, racks with high power density are often located near racks with low power density, and are frequently used in short rows. The temperature reduction effect of airflow management blanking panels is expected to be attenuated in these cases. To confirm this effect, temperature measurements were made in actual network rooms with rows of mixed power density and short rows.
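A note on the units in Table 3: the 22 F reduction corresponds to 12 C because a temperature difference converts by the 5/9 factor alone, with no 32-degree offset. A quick Python check:

```python
def delta_f_to_c(delta_f):
    """Convert a temperature *difference* from Fahrenheit to Celsius.
    Differences scale by 5/9; the 32-degree offset applies only to
    absolute temperatures, not to differences."""
    return delta_f * 5.0 / 9.0

hottest_without, hottest_with = 95, 73      # deg F, from Table 3
reduction_f = hottest_without - hottest_with
print(f"{reduction_f} F reduction = {delta_f_to_c(reduction_f):.1f} C")
# prints "22 F reduction = 12.2 C", which the paper rounds to 12 C
```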
Server inlet air temperature reductions resulting from the use of airflow management blanking panels to cover adjacent unused vertical rack space were observed in all cases. Actual measured temperature reductions varied from 5 F to 15 F (2.8 C to 8.3 C).
An understanding of the principle of air recirculation, combined with the experimental results, suggests that the following general conclusions can be drawn:

- Under real-world conditions, the use of airflow management blanking panels can reduce the operating temperature of IT equipment by 22 F (12 C).
- The benefit of using airflow management blanking panels is greatest for equipment located adjacent to and above the unused space covered by the airflow management panel.
- The use of airflow management blanking panels can reduce the incidents of overheating and hot-spot problems experienced in data centers and network rooms.
- When airflow management blanking panels are added, the same server inlet air temperature can be obtained with a higher air conditioner discharge temperature; this leads to less dehumidification and higher air conditioner efficiency.
- The guidance provided by equipment manufacturers to utilize airflow management blanking panels is appropriate.

Sealing cable cutouts

Cable cutouts in a raised-floor environment cause significant unwanted air leakage and should be sealed. This lost air, known as bypass airflow, contributes to IT equipment hot spots and cooling inefficiencies, and increases infrastructure costs. Many sites ignore unsealed floor openings and believe that inadequate cooling capacity is the problem. As a result, additional cooling units are purchased to address the overheating. In fact, those supplementary units may not be needed. One alternative that avoids the cost of additional cooling capacity is to seal the cable cutouts. The installation of raised-floor grommets both seals the air leaks and increases static pressure under the raised floor. This improves cool air delivery through the perforated floor tiles.
Conclusion

IT equipment in the rack environment can overheat if the heated exhaust air flows back into the air intake. There are a number of situations that can occur in a rack which permit or even encourage air recirculation and the resultant overheating. When a properly designed rack is used in conjunction with rack-mounted equipment, a primary cause of air recirculation is unoccupied rack space. The use of airflow management blanking panels to fill this unoccupied space eliminates the problem.

This paper provides a checklist of items that should be considered when designing a new data center or network room, and that can be used to perform an audit of an existing data center or network room. When these guidelines are followed, overheating due to recirculation can be significantly reduced, and the efficiency of the air conditioning system is improved.

About the author

Neil Rasmussen is a Senior VP of Innovation for Schneider Electric. He establishes the technology direction for the world's largest R&D budget devoted to power, cooling, and rack infrastructure for critical networks. Neil holds 19 patents related to high-efficiency and high-density data center power and cooling infrastructure, and has published over 50 white papers related to power and cooling systems, many published in more than 10 languages, most recently with a focus on the improvement of energy efficiency. He is an internationally recognized keynote speaker on the subject of high-efficiency data centers. Neil is currently working to advance the science of high-efficiency, high-density, scalable data center infrastructure solutions and is a principal architect of the APC InfraStruXure system. Prior to founding APC in 1981, Neil received his bachelor's and master's degrees in electrical engineering from MIT, where he did his thesis on the analysis of a 200 MW power supply for a tokamak fusion reactor.
From 1979 to 1981 he worked at MIT Lincoln Laboratories on flywheel energy storage systems and solar electric power systems.
Resources

White Paper Library: whitepapers.apc.com
TradeOff Tools: tools.apc.com

Contact us

For feedback and comments about the content of this white paper, contact the Data Center Science Center. If you are a customer and have questions specific to your data center project, contact your Schneider Electric representative.
Appendix A: description of experimental conditions

The purpose of the experiment was to create an environment similar to an actual data center. The experiment was conducted in a single rack, using thirty 1U server simulators. Each 1U server simulator consists of an actual 1U server chassis with power supply and fans, but with the CPU motherboard replaced by a resistive load. Each simulated server load was set up to draw 150 watts. The thirty server simulators were placed in a 42U APC NetShelter VX enclosure 42 inches (1067 mm) deep; the total load was 4.5 kW. The server simulators were arranged such that a single 11U space was left open approximately halfway up the enclosure. Inlet temperatures were monitored at every 7th U space, starting at the 2nd U and ending at the 41st U.

To model the presence of the experimental rack in a row of racks, it was assumed that every rack in the row would be identical and that the experimental rack would be located near the center of a long row. The air source is assumed to be a uniform line of raised-floor ventilated tiles in front of the rack. In this case, all horizontal air pressure gradient vectors between adjacent racks approximately cancel, and the lateral air motion between racks would be approximately zero. Furthermore, racks are assumed to exist in rows with alternating hot and cold aisles. Therefore, the air pressure gradients between adjacent rows approximately cancel, and the lateral air motion between rows is approximately zero across a line midway between the rows.

To simulate the previously described data center condition in the laboratory with a single rack, partitions were placed as shown in Figure A1. The partitions balance the air pressure gradients without the need to actually install and operate a large number of racks. The server inlet air temperatures were measured with an Agilent 34970A data logger using Type T thermocouples with a published accuracy of +/- 1.0 C.
The thermocouples were mounted in the air 2 inches in front of the air intake grille. Bulk air temperature was measured at the partition inlet opening and at the partition exhaust opening shown in Figure A1.

Figure A1: Experimental setup (top and side views showing the test partitions, the blanking panel, and the exhaust and intake openings)

The free air temperature at the inlet was 70 F (21 C) during the experiment. The bulk exhaust temperature was 95 F (35 C) during the experiment.
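As a rough cross-check of the appendix figures, the bulk airflow implied by the 4.5 kW load and the measured 25 F inlet-to-exhaust rise can be estimated with the common sensible-heat rule of thumb CFM ≈ 3.16 × W / ΔT(F). This approximation assumes sea-level air density and is not taken from the paper; a minimal sketch:

```python
def required_cfm(watts, delta_t_f):
    """Approximate the airflow (in CFM) needed to carry `watts` of heat
    at a given air temperature rise, using the sea-level rule of thumb
    CFM ~= 3.16 * W / dT(F); the 3.16 folds in air density, specific
    heat, and the watt-to-BTU/hr conversion."""
    return 3.16 * watts / delta_t_f

load_w = 30 * 150              # thirty 1U simulators at 150 W each = 4.5 kW
inlet_f, exhaust_f = 70, 95    # measured bulk temperatures
cfm = required_cfm(load_w, exhaust_f - inlet_f)
print(f"{load_w} W at a {exhaust_f - inlet_f} F rise implies about {cfm:.0f} CFM")
```

The result, on the order of 570 CFM through a single rack, illustrates how much air the equipment must move, and why a large open gap in the mounting space offers such an attractive low-resistance return path for exhaust air.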
Dimensions (HxDxW) Total Cabinet Area 89.96x 39.69 x 24 in/ 2,285 x 1,008.17 x 612.77 mm 95.47x 48 x 32 in/ 2,425 x 1,219.2 x 812.
1 Compaq 10000 Series Racks The 10000 series racks are a family of 1000mm deep racks which are available in 22U, 36U, 42U and 47U 600mm widths. Compaq 10000 Series Racks provides a common, integrated platform
Data Centre Infrastructure Assessment
Data Centre Infrastructure Assessment Prepared for: ACME INC SAMPLE REPORT Attention: Mr DC Manager This report represents a generic sample data centre audit for a generic customer by Focus Group Technologies.
How to Meet 24 by Forever Cooling Demands of your Data Center
W h i t e P a p e r How to Meet 24 by Forever Cooling Demands of your Data Center Three critical aspects that are important to the operation of computer facilities are matching IT expectations with the
Virtual Data Centre Design A blueprint for success
Virtual Data Centre Design A blueprint for success IT has become the back bone of every business. Advances in computing have resulted in economies of scale, allowing large companies to integrate business
Cabinets 101: Configuring A Network Or Server Cabinet
Cabinets 101: Configuring A Network Or Server Cabinet North, South and Central America White Paper May 2012 www.commscope.com Contents Background Information 3 What is a network or server cabinet? 3 What
Creating Data Center Efficiencies Using Closed-Loop Design Brent Goren, Data Center Consultant
RESEARCH UNDERWRITER WHITE PAPER LEAN, CLEAN & GREEN Wright Line Creating Data Center Efficiencies Using Closed-Loop Design Brent Goren, Data Center Consultant Currently 60 percent of the cool air that
A Scalable, Reconfigurable, and Efficient Data Center Power Distribution Architecture
A Scalable, Reconfigurable, and Efficient Data Center Power Distribution Architecture By Neil Rasmussen White Paper #129 Executive Summary Significant improvements in efficiency, power density, power monitoring,
Airflow Simulation Solves Data Centre Cooling Problem
Airflow Simulation Solves Data Centre Cooling Problem The owner s initial design for a data centre in China utilized 40 equipment racks filled with blade servers spread out in three rows along the length
Data center upgrade proposal. (phase one)
Data center upgrade proposal (phase one) Executive Summary Great Lakes began a recent dialogue with a customer regarding current operations and the potential for performance improvement within the The
Data Center Projects: Project Management
Data Center Projects: Project Management White Paper 141 Revision 1 by Neil Rasmussen and Suzanne Niles > Executive summary In data center design/build projects, flaws in project management and coordination
Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001
Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001 Presented by: Liberty Engineering, LLP 1609 Connecticut Avenue
Racks & Enclosures. Deliver. Design. Define. Three simple words that clearly describe how we approach a Data Center project.
Three simple words that clearly describe how we approach a Data Center project. Eclipse Racks & Enclosures DELIVERING THE RIGHT INFRASTRUCTURE FOR YOUR MISSION-CRITICAL NEEDS SERVER RACKS & NETWORKING
Benefits of Cold Aisle Containment During Cooling Failure
Benefits of Cold Aisle Containment During Cooling Failure Introduction Data centers are mission-critical facilities that require constant operation because they are at the core of the customer-business
How To Improve Energy Efficiency Through Raising Inlet Temperatures
Data Center Operating Cost Savings Realized by Air Flow Management and Increased Rack Inlet Temperatures William Seeber Stephen Seeber Mid Atlantic Infrared Services, Inc. 5309 Mohican Road Bethesda, MD
Access Server Rack Cabinet Compatibility Guide
Access Server Rack Cabinet Compatibility Guide A Guide to the Selection and Evaluation of Access Server Rack Cabinets for Compatibility and Use with Third Party Server Chassis Kalkenstraat 91-93 B-8800
Driving Data Center Efficiency Through the Adoption of Best Practices
Data Center Solutions 2008 Driving Data Center Efficiency Through the Adoption of Best Practices Data Center Solutions Driving Data Center Efficiency Through the Adoption of Best Practices Executive Summary
How To Run A Data Center Efficiently
A White Paper from the Experts in Business-Critical Continuity TM Data Center Cooling Assessments What They Can Do for You Executive Summary Managing data centers and IT facilities is becoming increasingly
Raised Floors vs Hard Floors for Data Center Applications
Raised Floors vs Hard Floors for Data Center Applications White Paper 19 Revision 3 by Neil Rasmussen Executive summary Raised floors were once a standard feature of data centers, but over time a steadily
APC APPLICATION NOTE #92
#92 Best Practices for Designing Data Centers with the InfraStruXure InRow RC By John Niemann Abstract The InfraStruXure InRow RC is designed to provide cooling at the row and rack level of a data center
HotLok Pass Through Blanking Panel
HotLok Pass Through Blanking Panel Part Numbers 1U - 10112 2U - 10113 Table of Contents Tools Required for Installation Page 2 Dimensions and Rack Requirements Page 2 Installation Procedures Page 3 Removal
Dynamic Power Variations in Data Centers and Network Rooms
Dynamic Power Variations in Data Centers and Network Rooms White Paper 43 Revision 3 by James Spitaels > Executive summary The power requirement required by data centers and network rooms varies on a minute
Cooling Capacity Factor (CCF) Reveals Stranded Capacity and Data Center Cost Savings
WHITE PAPER Cooling Capacity Factor (CCF) Reveals Stranded Capacity and Data Center Cost Savings By Kenneth G. Brill, Upsite Technologies, Inc. Lars Strong, P.E., Upsite Technologies, Inc. 505.798.0200
Environmental Data Center Management and Monitoring
2013 Raritan Inc. Table of Contents Introduction Page 3 Sensor Design Considerations Page 3 Temperature and Humidity Sensors Page 4 Airflow Sensor Page 6 Differential Air Pressure Sensor Page 6 Water Sensor
Great Lakes Data Room Case Study
Great Lakes Data Room Case Study WeRackYourWorld.com Problem: During a warm summer period in 2008, Great Lakes experienced a network outage due to a switch failure in the network enclosure. After an equipment
Cabinet V800 V800 CABINETS
V800 Cabinet Siemon s V800 cabinets provide a robust, cost-effective enclosure solution that provides valuable Zero-U space on each side of the equipment rails for cable management, PDU mounting or connectivity
Electrical Efficiency Measurement for Data Centers
Electrical Efficiency Measurement for Data Centers White Paper 154 Revision 2 by Neil Rasmussen Contents Click on a section to jump to it > Executive summary A closer look at data center 3 Data center
Benefits of. Air Flow Management. Data Center
Benefits of Passive Air Flow Management in the Data Center Learning Objectives At the end of this program, participants will be able to: Readily identify if opportunities i where networking equipment
Customized Server Cabinets
GOING GREEN WITH CUSTOMIZABLE CABINETS AND RACKS Designed to provide industry leading airflow, CyrusOne s cabinets and rack solutions promote optimal cooling and contribute to reduced power consumption.
Layer Zero. for the data center
Layer Zero for the data center Preparefor the Data centers are the foundations of enterprises, vital to the daily operation of the entire organization - they no longer simply store data. Virtualization,
Element D Services Heating, Ventilating, and Air Conditioning
PART 1 - GENERAL 1.01 OVERVIEW A. This section supplements Design Guideline Element D3041 on air handling distribution with specific criteria for projects involving design of a Data Center spaces B. Refer
Server Room Thermal Assessment
PREPARED FOR CUSTOMER Server Room Thermal Assessment Analysis of Server Room COMMERCIAL IN CONFIDENCE MAY 2011 Contents 1 Document Information... 3 2 Executive Summary... 4 2.1 Recommendation Summary...
Efficiency and Other Benefits of 208 Volt Over 120 Volt Input for IT Equipment
Efficiency and Other Benefits of 208 Volt Over 120 Volt Input for IT Equipment By Neil Rasmussen White Paper #27 Revision 2 Executive Summary Decisions made regarding the distribution of 208V or 120V power
Data Center Components Overview
Data Center Components Overview Power Power Outside Transformer Takes grid power and transforms it from 113KV to 480V Utility (grid) power Supply of high voltage power to the Data Center Electrical Room
Specification of Modular Data Center Architecture
Specification of Modular Data Center Architecture White Paper 160 Revision 1 by Neil Rasmussen > Executive summary There is a growing consensus that conventional legacy data center design will be superseded
BRUNS-PAK Presents MARK S. EVANKO, Principal
BRUNS-PAK Presents MARK S. EVANKO, Principal Data Centers of the Future and the Impact of High Density Computing on Facility Infrastructures - Trends, Air-Flow, Green/LEED, Cost, and Schedule Considerations
Airflow Management Solutions
Data Center Products Airflow Management Solutions Switch N to Eaton. Computational Fluid Dynamics (CFD) model of an optimized data center illustrating cold-aisle isolation and Eaton s Heat Containment
Energy Efficiency Opportunities in Federal High Performance Computing Data Centers
Energy Efficiency Opportunities in Federal High Performance Computing Data Centers Prepared for the U.S. Department of Energy Federal Energy Management Program By Lawrence Berkeley National Laboratory
Creating Order from Chaos in Data Centers and Server Rooms
Creating from in Data Centers and Server Rooms By Dennis Bouley White Paper #119 Executive Summary Data center professionals can rid themselves of messy racks, sub-standard under floor air distribution,
RACKMOUNT SOLUTIONS. 1/7/2013 AcoustiQuiet /UCoustic User Guide. The AcoustiQuiet /UCoustic User Guide provides useful
RACKMOUNT SOLUTIONS 1/7/2013 AcoustiQuiet /UCoustic User Guide The AcoustiQuiet /UCoustic User Guide provides useful hints, tips and advice to help maximize the thermal and acoustic properties of the cabinets.
DATA CENTER RACK SYSTEMS: KEY CONSIDERATIONS IN TODAY S HIGH-DENSITY ENVIRONMENTS WHITEPAPER
DATA CENTER RACK SYSTEMS: KEY CONSIDERATIONS IN TODAY S HIGH-DENSITY ENVIRONMENTS WHITEPAPER EXECUTIVE SUMMARY Data center racks were once viewed as simple platforms in which to neatly stack equipment.
RACK ENCLOSURES & ACCESSORIES
RACK ENCLOSURES & ACCESSORIES RACK ENCLOSURES NetshelterTM 1 NetShelter VX / NetShelter VS 2 Open Frame Racks 3 NetShelter ES 4 NetShelter WX Part Number Model Height Width Depth Color Application Doors
Selecting Rack-Mount Power Distribution Units For High-Efficiency Data Centers
C P I P O W E R M A N A G E M E N T WHITE PAPER Selecting Rack-Mount Power Distribution Units For High-Efficiency Data Centers By Anderson Hungria Sr. Product Manager of Power, Electronics & Software Published
Air, Fluid Flow, and Thermal Simulation of Data Centers with Autodesk Revit 2013 and Autodesk BIM 360
Autodesk Revit 2013 Autodesk BIM 360 Air, Fluid Flow, and Thermal Simulation of Data Centers with Autodesk Revit 2013 and Autodesk BIM 360 Data centers consume approximately 200 terawatt hours of energy
