Comparing Data Center & Computer Thermal Design


By Michael K. Patterson, Ph.D., P.E., Member ASHRAE; Robin Steinbrecher; and Steve Montgomery, Ph.D.

The cooling systems and thermal solutions for today's data centers and computers are designed by skilled mechanical engineers using advanced tools and methods. These engineers work in two different areas: those responsible for designing cooling for computers and servers, and those who design data center cooling. Unfortunately, each group often lacks an understanding of the other's methods and design goals. This can lead to non-optimal designs and problems in creating a successful, reliable, energy-efficient data processing environment. This article works to bridge that gap and provide insight into the parameters each engineer works with and the optimizations each goes through. A basic understanding of each role will help the counterpart in their designs, be it a data center or a server.

Server Design Focus

Thermal architects are given a range of information to begin designing the thermal solution. They know the thermal design power (TDP) and temperature specifications of each component (typically junction temperature, T_J, or case temperature, T_C). Using a processor as an example, Figure 1 shows a typical component assembly. The processor is specified with a maximum case temperature, T_C, which is used for design purposes. In this example, the design parameters are TDP = 103 W and T_C = 72°C. Given an ambient temperature specification T_A = 35°C, the required case-to-ambient thermal resistance would need to be equal to or lower than:

Psi_CA,required = (T_C - T_A)/TDP = (72 - 35)/103 = 0.36°C/W   (1)

Sometimes this value of Psi_CA is not feasible. One option to relieve the demands of a thermal solution with such a low thermal resistance is a higher T_C. Unfortunately, the trend for T_C continues to decline, because reductions in T_C result in higher performance, better reliability, and less power used.
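The arithmetic behind Equation 1 is simple enough to sketch in code. The function below is illustrative only; the values (TDP = 103 W, T_C = 72°C, T_A = 35°C) are the example figures from the text.

```python
def required_psi_ca(t_case_c, t_ambient_c, tdp_w):
    """Maximum allowable case-to-ambient thermal resistance, in C/W (Equation 1)."""
    return (t_case_c - t_ambient_c) / tdp_w

# Design parameters from the example above
psi = required_psi_ca(t_case_c=72.0, t_ambient_c=35.0, tdp_w=103.0)
print(round(psi, 2))  # 0.36 C/W
```

Any candidate thermal solution whose actual Psi_CA exceeds this value cannot hold T_C within specification at the worst-case ambient.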
Those advantages are worth obtaining, making the thermal challenge greater.

One of the first parameters discussed by the data center designer is the temperature rise through the servers, but this value is a secondary consideration, at best, in the server design. As seen in Equation 1, no consideration is given to chassis temperature rise. The thermal design is driven by maintaining component temperatures within specifications; the primary parameters are T_C, T_A, and Psi_CA,actual. The actual thermal resistance of the solution is driven by component selection, materials, configuration, and airflow volumes. Usually, the only time that chassis T_RISE is calculated is to ensure that exhaust temperatures stay within safety guidelines.

About the Authors: Michael K. Patterson, Ph.D., P.E., is thermal research engineer, platform initiatives and pathfinding, at Intel's Digital Enterprise Group in Hillsboro, Ore. Robin Steinbrecher is staff thermal architect with Intel's Server Products Group in DuPont, Wash. Steve Montgomery, Ph.D., is senior thermal architect at Intel's Power and Thermal Technologies Lab, Digital Enterprise Group, DuPont, Wash. (ASHRAE Journal, April.)

In addition to TDP and T_C, the engineer has several other targets, including:

Cost: Servers are sold into very competitive markets, and cost is a critical consideration. Today's budget for thermal solutions in servers is typically in the range of $50 to $75, depending on the number of processors and features. It is desirable to minimize this cost.

Weight: Current aluminum and copper heat sinks continue to grow in size and surface area to augment heat transfer. The increased weight of the heat sinks is a serious issue, as the processor package and motherboard must be made sufficiently robust to handle the resulting mechanical load.

Volumetric: The space inside a server is extremely valuable, especially as more computing power and capabilities are added. Using this space for heat sinks and fans does not add value for the customer.

Power: The total power required by servers is increasing and driving changes to the data center infrastructure. The server fans alone can use up to 10% of the server power, so reducing all power is a design goal.

Many components to cool: Ideally, sizing air movers to cool the highest power component would be sufficient to cool the remainder of the system. Unfortunately, this is rarely the case, and additional fans, heat sinks, and ducting are often required in the server.

Reliability: Operational continuity is vital to the success of the data center, so server reliability receives significant focus. For the thermal solution, the items most likely to fail are the air movers. These are typically made redundant to provide this increased reliability.
Redundancy results in oversizing of air-mover capability for normal operation, leading to further inefficiencies.

Acoustics: The volume of air required to cool today's servers often creates a noise problem such that hearing protection may be required. The area of acoustics is important enough to describe further.

Figure 1: Thermal resistance of a typical server thermal solution (heat sink, thermal interface material, processor package, and socket, with temperatures T_ambient, T_sink, and T_case).

Server Thermal Acoustic Management

As mentioned previously, the thermal engineer designing the cooling and control system must balance the need to cool all components in a system against the necessity of meeting acoustics requirements. To achieve this, server management (SM) monitors combinations of temperature sensors and component utilization to take action to maintain the server within specifications. Required air-mover speeds are determined through calculations performed by a baseboard management controller (BMC). The SM then acts to change the air-mover speeds to ensure that the components stay within specification. Consequently, the SM normally drives a server to be as quiet as possible while maximizing performance, keeping component temperatures within, but not over, their limits. In some instances, SM enables a customer to choose performance over acoustics; in these cases, air movers are driven to levels that prioritize thermal performance over noise. Acoustics specifications for computing equipment are specified at ambient temperatures of typically 23°C ± 2°C (73°F ± 4°F). Above this range, it is desirable, but not required, to have a quiet system. As a result, some systems attempt to maintain the quietest possible operation as a competitive advantage. Others sacrifice acoustics to reduce cost through the elimination of elaborate SM systems.
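As a hedged illustration of the kind of policy a BMC might implement (the function name and the linear-ramp details are assumptions for illustration, not any vendor's actual algorithm), fan speed can be held at a quiet minimum until a monitored temperature approaches its specification:

```python
def fan_duty(t_component_c, t_spec_c, ramp_band_c=10.0, min_duty=0.30):
    """Illustrative proportional fan policy: run at min_duty while the component
    is comfortably below spec, ramp linearly to 100% as it nears t_spec_c."""
    if t_component_c >= t_spec_c:
        return 1.0                      # at or above spec: full speed
    if t_component_c <= t_spec_c - ramp_band_c:
        return min_duty                 # cool enough: quietest allowed speed
    frac = (t_component_c - (t_spec_c - ramp_band_c)) / ramp_band_c
    return min_duty + frac * (1.0 - min_duty)

# With a T_C spec of 72 C: quiet at 55 C, halfway up the ramp at 67 C
print(fan_duty(55.0, 72.0))  # 0.3
print(fan_duty(72.0, 72.0))  # 1.0
```

The practical consequence for the data center is the one described in the text: cooler inlet air lets the policy sit at the low end of the ramp, so airflow drops and the per-server temperature rise grows.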
The data center designer must understand that, as a result of these SM schemes, the required airflow through a system is greatly reduced when room temperatures, or more specifically server inlet air temperatures, are held below 25°C (77°F). The temperature rise through a system may be relatively high as a result of that lower airflow. Typical systems are designed to deliver about 60% to 70% of their maximum flow in this lower inlet temperature environment.

Monitoring of temperature sensors is accomplished via on-die thermal diodes or discrete thermal sensors mounted on the printed circuit boards (PCBs). Component utilization monitoring is accomplished through activity measurement (e.g., memory throughput measurement by the chipset) or power measurement of individual voltage regulators. Either of these methods results in a calculation of component or subsystem power.

Data Center Design Focus

The data center designer faces a similar list of criteria for the design of the center, starting with a set of requirements that drive the design. These include:

Cost: The owner will have a set budget, and the designer must create a system within the cost limits. Capital dollars are the primary metric; however, good designs also consider the operational cost of running the systems needed to cool the data center. Combined, these comprise the total cost of ownership (TCO) for the cooling systems.

Equipment list: The most detailed information would include a list of the equipment in the space and how it will be racked together. This allows a determination of the total cooling load in the space, and of the airflow volume and distribution. Caution must be taken if the equipment list is used to develop the cooling load by summing the total connected load; this leads to over-design. The connected load, or maximum rating of the power supply, is always greater than the maximum heat dissipation possible from the sum of the components. Obtaining the thermal load generated by the equipment from the supplier is the only accurate way of determining the cooling requirements. Unfortunately, the equipment list is not always available, and the designer may be given only a cooling load per unit area and will need to design the systems based on that information. Sizing the cooling plant is straightforward when the total load is known, but the design of the air-handling system is not as simple.
Performance: The owner will define the ultimate performance of the space, generally given in terms of ambient temperature and relative humidity. Beaty and Davidson (Reference 2) discuss typical values of the space conditions and how these relate to classes of data centers. Performance also includes values for airflow distribution, total cooling, and percent outdoor air.

Reliability: The cooling system's reliability level is defined and factored into equipment selection and the layout of distribution systems. Setting the reliability of the data center cooling system requires an economic evaluation comparing the cost of the reliability against the cost of potential interruptions to center operations. The servers protect themselves in the event of a cooling failure, so the reliability of the cooling system should not be justified on the basis of equipment protection.

Data Center Background

Experience in data center layout and configuration is helpful to understanding the design issues. Consider two cases at the limits of data center arrangement and cooling configuration: 1. a single rack in a room, and 2. a fully populated room, with racks side by side in multiple rows. Case 2 assumes a hot-aisle/cold-aisle rack configuration, where the cold aisle is the server airflow inlet side containing the perforated tiles, and the hot aisle receives the back-to-back server outlets, discharging the warm air into the room. Hot-aisle/cold-aisle is the most prevalent configuration, as the arrangement prevents mixing of the inlet cooling air and the warm return air. The most common airflow configuration of individual servers is front-to-back, which works directly with the hot-aisle/cold-aisle concept, but it is not the only configuration.

Consider a rack of servers in a data processing environment. Typically, these racks are 42U high, where 1U = 44.5 mm (1.75 in.); a U is a commonly used unit for the height of electronics gear that can be rack mounted.
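The U arithmetic above can be checked directly; this is a throwaway illustration, not part of the article's method:

```python
RACK_U = 42     # standard rack height in U
U_MM = 44.5     # 1U = 44.5 mm (1.75 in.)

def rack_capacity(server_height_u, rack_u=RACK_U):
    """How many servers of a given U height fit in a standard rack."""
    return rack_u // server_height_u

print(rack_capacity(1))                 # 42 1U servers
print(rack_capacity(4))                 # 10 4U servers
print(round(RACK_U * U_MM / 1000, 2))   # usable rack height: 1.87 m
```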
The subject rack could hold 42 1U servers, or 10 4U servers, or other combinations of equipment, including power supplies, network hardware, and/or storage equipment. To consider the two limits, first take the described rack and place it by itself in a reasonably sized space with some cooling in place. The other limit occurs when this rack of equipment is placed in a data center where the rack is one of many similar racks in an aisle. The data center would have multiple aisles, generally configured front-to-front and back-to-back.

Common Misconceptions

A review of misconceptions illustrates the problems and challenges facing designers of data centers. During a recent design review of a data center cooling system, one of the engineers claimed that the servers were designed for a 20°C (36°F) T_RISE, inlet to outlet air temperature. This is not the case. It is possible that some servers, when driven at a given airflow and dissipating their nominal amount of power, may generate a 20°C (36°F) delta-T, but none were ever designed with that in mind. Recall the parameters discussed in the section on server design. Reducing Psi_CA can be accomplished by increasing airflow, but this also has a negative effect: more powerful air movers increase cost, use more space, are louder, and consume more energy. Increasing airflow beyond the minimum required is not a desirable tactic; in fact, reducing the airflow as much as possible is a benefit in the overall server design. Nowhere in that optimization problem, however, is the delta-T across the server considered.

Assuming a simple T_RISE leads to another set of problems, because it implies a fixed airflow rate. As discussed earlier, most servers monitor temperatures at different locations in the system and modulate airflow to keep the components within desired temperature limits. For example, a server in a well-designed data center, particularly if located low in the rack, will likely see a T_A of 20°C (68°F) or less.
However, the thermal solution in the server is normally designed to handle a T_A of 35°C (95°F). If the inlet temperature is at the lower value, the case temperature will be lower; much less airflow is then required, and if variable flow capability is built into the server, it will run quieter and consume less power. The server airflow (and hence T_RISE) will vary between the T_A = 20°C (68°F) and 35°C (95°F) cases, a variation described in ASHRAE's Thermal Guidelines for Data Processing Environments. That publication provides a detailed discussion of what data should be reported by the server manufacturer and in which configuration.

Another misconception is that the server exhaust airflow must be maintained below the server's ambient environmental specification. The outlet temperature of the server does not need to be below the allowed value for the environment (typically 35°C [95°F]).

Design Decisions

To understand the problems that can arise if the server design process is not fully understood, revisit the two cases introduced earlier. Consider the fully loaded rack in a space with no other equipment. If sufficient cooling is available in the room, the server thermal requirements will likely be satisfied. The servers will pull the required amount of air to cool themselves, primarily from the raised floor distribution but, if needed, from the sides and above the servers as well. It is reasonable to assume the room is well mixed by the server and room distribution airflow. There will likely be some variation of inlet temperature from the bottom of the rack to the top, but if sufficient space exists around the servers it is most likely not a concern. In this situation, not having the detailed server thermal report, as described in Reference 3, may not be problematic.

At the other limit, a rack is placed in a space that is fully populated with other server racks in a row. Another row sits across the cold aisle facing this row, and another sits back-to-back on the hot-aisle side. The floor space covered by a single rack and its associated cold-aisle and hot-aisle area is often called a work cell and generally covers a 1.5 m² (16 ft²) area.
The work cell comprises the 0.6 m × 0.6 m (2 ft × 2 ft) perforated tile in front, the area covered by the rack (~0.6 m × 1.3 m [~2 ft × 4.25 ft]), and the remaining uncovered solid floor tile on the hot-aisle side (Figure 2: the work cell is shown in orange).

Consider the airflow in and around the work cell. Each work cell needs to be able to exist as a stand-alone thermal zone. The airflow provided to the zone comes from the perforated tile, travels through the servers, and exhausts out the top-back of the work cell, where the hot aisle returns the warm air to the inlet of the room air handlers. The work cell cannot bring air into the front of the servers from the side, as this would remove air from another work cell and short that zone. No air should come in from the top either, as that would bring in air at a temperature well above the desired ambient and possibly above the specification value for T_A (typically 35°C [95°F]). Based on this concept of the work cell, it is clear that designers must know the airflow through the servers; otherwise they will not be able to adequately size the flow rate per floor tile. Conversely, if the airflow is not adequate, the server airflow will recirculate, causing problems for servers being fed the warmer air.

If the design basis of the data center includes the airflow rates of the servers, certain design decisions are needed. First, the design must provide enough total cooling capacity for the peak, matching the central plant to the load. Another question is at what temperature to deliver the supply air. Lowering this temperature can reduce the required fan size in the room cooling unit, but it can also be problematic, as the system, particularly in a high density data center, must provide the minimum (or nominal) airflow to all of the work cells. A variant of this strategy is to increase the delta-T: doing so allows a lower airflow rate to give the same total cooling capability.
This will yield lower capital costs, but if the airflow rate is too low, increasing the delta-T will cause recirculation. Also, if the supply temperature is too low, comfort and ergonomic issues could arise.

If the supplier has provided the right data, another decision must be made: should the system provide enough for the peak airflow, or just the typical? The peak airflow rate will occur when T_A = 35°C (95°F) and the typical when T_A = 20°C to 25°C (68°F to 77°F). Sizing the air-distribution equipment at the peak flow will result in a robust design with flexibility, but at a high cost. Another complication in sizing for the peak flow, particularly in dense data centers, is that it may prove difficult to move this airflow through the raised floor tiles, causing an imbalance or increased leakage elsewhere. Care must be taken to ensure the raised floor is of sufficient height and of an appropriate design for the higher airflows. If the nominal airflow rate is used as the design point, the design, installation, and operation (including floor tile selection for balancing the distribution) must be correct for the proper operation of the data center, but a cost savings potential exists. It is essential to perform some level of modeling to determine the right airflow. In this design, any time the servers ramp up to their peak airflow rate, the racks will recirculate warm air from the hot aisle to feed some server inlets. This occurs because each work cell has to satisfy its own airflow needs (its neighbors are also short of airflow), and if the servers need more air, they will get it by recirculating. Another way to visualize this is to consider the walls of symmetry around each work cell and recall that there is no flux across a symmetry boundary. The servers are designed to operate successfully at 35°C (95°F) inlet air temperatures, so if the prevalence of this recirculation is not too great, the design should be successful.
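The peak-versus-typical trade-off follows from the sensible-heat balance Q = rho · V · cp · delta-T: for a fixed heat load, airflow and temperature rise are inversely proportional. The sketch below uses a made-up 500 W server load for illustration; the constants are standard properties of air.

```python
RHO_AIR = 1.2      # kg/m^3, approximate air density near sea level
CP_AIR = 1005.0    # J/(kg*K), specific heat of air
M3S_TO_CFM = 2118.88

def airflow_cfm(power_w, delta_t_c):
    """Volumetric airflow needed to carry power_w away at a given air temperature rise."""
    return power_w / (RHO_AIR * CP_AIR * delta_t_c) * M3S_TO_CFM

# Hypothetical 500 W server: halving the airflow doubles the temperature rise
print(round(airflow_cfm(500.0, 10.0)))  # 88 cfm for a 10 C rise
print(round(airflow_cfm(500.0, 20.0)))  # 44 cfm for a 20 C rise
```

This is why a server's delta-T is an outcome of its fan control, not a fixed design value: as the SM throttles airflow at low inlet temperatures, the rise through the chassis climbs.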
If the detailed equipment list is unknown when the data center is being designed, the airflow may be chosen based on historical airflows for similarly loaded racks in data centers of the same load and use patterns. It is important to ensure the owner is aware of the airflow assumptions made and of any limits those assumptions place on equipment selection, particularly in light of the trend toward higher power density equipment. The airflow balancing and verification would then fall to a commissioning agent or the actual space owner. In either case, the airflow assumptions need to be made clear during the computer equipment installation and floor tile setup.

Discussions with a leading facility engineering company in Europe provide insight into an alternate design methodology for when the equipment list is not available. A German engineering society standard on data center design requires a fixed value of 28°C at 1.8 m (82°F at 6 ft) above the raised floor. This includes the hot aisle, and it ensures that if sufficient airflow is provided to the room, all servers will be maintained below the upper temperature limits even if recirculation occurs (Figure 3: rack recirculation problem in a full data center). Using this approach, it is reasonable to calculate the total airflow in a new design by assuming an inlet temperature of 20°C (68°F) (the low end of Thermal Guidelines), a discharge temperature of 35°C (95°F) (the maximum inlet temperature that should be fed to a server through recirculation), and the total cooling load of the room. A detailed design of the distribution is still required to ensure adequate airflow at all server cold aisles.

The Solution

The link for information and what is needed for successful design is well defined in Thermal Guidelines. Unfortunately, it is only now becoming part of server manufacturers' vocabulary. The data center designer needs average and peak heat loads and airflows for the equipment. The best option is to obtain the information from the supplier.
While testing is possible, particularly if the owner already has a data center with similar equipment, it is not a straightforward process, as the server inlet temperatures and workload can affect the airflow rate. Thermal Guidelines provides information about airflow measurement techniques. The methodology of the German standard also can be used, recognizing recirculation as a potential reality of the design and ensuring discharge temperatures are low enough to support continued computer operation. Finally, the worst but all-too-common way is to use a historical value for delta-T and calculate a cfm/kW based on it. In any case, the total heat load of the room and the airflow need to be carefully considered to ensure a successful design.

Effecting Change

The use of Thermal Guidelines has not yet been adopted by all server manufacturers, and the level of thermal information provided by the same manufacturer can even vary from product to product. During a recent specification review of several different servers, one company provided extensive airflow information, both nominal and peak, for its 1U server but gave no airflow information for the 4U server in the same product line. If data center operators and designers could convince their information technology sourcing managers to buy only servers that follow Thermal Guidelines (providing the needed information), the situation would rectify itself quickly. Obviously, that is not likely to happen, nor should it. On the other hand, those who own the problem of making the data center cooling work would help themselves by pointing out to the procurement decision-makers that they can have a high degree of confidence in their data center designs only for those servers that adhere to the new publication. As more customers ask for the information, more equipment suppliers will provide it.
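Both fallback methods described above, the German fixed-temperature approach and the historical cfm/kW shortcut, reduce to the same sensible-heat relation. A sketch, where the 300 kW room load is an assumed example rather than a figure from the article:

```python
RHO_AIR = 1.2      # kg/m^3
CP_AIR = 1005.0    # J/(kg*K)
M3S_TO_CFM = 2118.88

def room_airflow_cfm(total_load_kw, t_supply_c=20.0, t_discharge_c=35.0):
    """German-standard style: total room airflow from the cooling load, assuming
    a 20 C supply and a 35 C worst-acceptable recirculated inlet temperature."""
    m3_per_s = total_load_kw * 1000.0 / (RHO_AIR * CP_AIR * (t_discharge_c - t_supply_c))
    return m3_per_s * M3S_TO_CFM

def cfm_per_kw(delta_t_f):
    """Historical shortcut in I-P units, from Q[Btu/h] = 1.08 * cfm * dT[F]
    and 1 kW = 3412 Btu/h."""
    return 3412.0 / (1.08 * delta_t_f)

print(round(room_airflow_cfm(300.0)))  # hypothetical 300 kW room
print(round(cfm_per_kw(27.0)))         # historical 15 C (27 F) delta-T, cfm per kW
```

The two agree: 300 kW at a 15°C (27°F) room-level delta-T works out to roughly 117 cfm per kW either way.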
Summary

The information discussed here is intended to assist data center designers in understanding the process by which the thermal solution in a server is developed. Conversely, the server thermal architect can benefit from an understanding of the challenges in building a high density data center. Over time, equipment manufacturers will continue to make better use of Thermal Guidelines, which ultimately will allow more servers to be used in data centers, with better use of this expensive and scarce space.

References
1. Processor Spec Finder, Intel Xeon Processors.
2. Beaty, D. and T. Davidson. "New guideline for data center cooling." ASHRAE Journal 45(12).
3. Thermal Guidelines for Data Processing Environments. ASHRAE Special Publications.
4. Koplin, E.C. "Data center cooling." ASHRAE Journal 45(3).
5. Rouhana, H. Personal communication. Mechanical Engineer, M+W Zander Mission Critical Facilities, Stuttgart, Germany, November.
6. Verein Deutscher Ingenieure (VDI). Raumlufttechnische Anlagen für Datenverarbeitung. September.


More information

APC APPLICATION NOTE #92

APC APPLICATION NOTE #92 #92 Best Practices for Designing Data Centers with the InfraStruXure InRow RC By John Niemann Abstract The InfraStruXure InRow RC is designed to provide cooling at the row and rack level of a data center

More information

GREEN FIELD DATA CENTER DESIGN WATER COOLING FOR MAXIMUM EFFICIENCY. Shlomo Novotny, Vice President and Chief Technology Officer, Vette Corp.

GREEN FIELD DATA CENTER DESIGN WATER COOLING FOR MAXIMUM EFFICIENCY. Shlomo Novotny, Vice President and Chief Technology Officer, Vette Corp. GREEN FIELD DATA CENTER DESIGN WATER COOLING FOR MAXIMUM EFFICIENCY Shlomo Novotny, Vice President and Chief Technology Officer, Vette Corp. Overview Data centers are an ever growing part of our economy.

More information

Thermal Mass Availability for Cooling Data Centers during Power Shutdown

Thermal Mass Availability for Cooling Data Centers during Power Shutdown 2010 American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc. (www.ashrae.org). Published in ASHRAE Transactions (2010, vol 116, part 2). For personal use only. Additional reproduction,

More information

BICSInews. plus. july/august 2012. + Industrial-grade Infrastructure. + I Say Bonding, You Say Grounding + Data Center Containment.

BICSInews. plus. july/august 2012. + Industrial-grade Infrastructure. + I Say Bonding, You Say Grounding + Data Center Containment. BICSInews m a g a z i n e july/august 2012 Volume 33, Number 4 plus + Industrial-grade Infrastructure and Equipment + I Say Bonding, You Say Grounding + Data Center Containment & design deployment The

More information

How To Run A Data Center Efficiently

How To Run A Data Center Efficiently A White Paper from the Experts in Business-Critical Continuity TM Data Center Cooling Assessments What They Can Do for You Executive Summary Managing data centers and IT facilities is becoming increasingly

More information

Combining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency

Combining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency A White Paper from the Experts in Business-Critical Continuity TM Combining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency Executive Summary Energy efficiency

More information

Reducing Data Center Energy Consumption

Reducing Data Center Energy Consumption Reducing Data Center Energy Consumption By John Judge, Member ASHRAE; Jack Pouchet, Anand Ekbote, and Sachin Dixit Rising data center energy consumption and increasing energy costs have combined to elevate

More information

Data Center Components Overview

Data Center Components Overview Data Center Components Overview Power Power Outside Transformer Takes grid power and transforms it from 113KV to 480V Utility (grid) power Supply of high voltage power to the Data Center Electrical Room

More information

Education Evolution: Scalable Server Rooms George Lantouris Client Relationship Manager (Education) May 2009

Education Evolution: Scalable Server Rooms George Lantouris Client Relationship Manager (Education) May 2009 Education Evolution: Scalable Server Rooms George Lantouris Client Relationship Manager (Education) May 2009 Agenda Overview - Network Critical Physical Infrastructure Cooling issues in the Server Room

More information

Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001

Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001 Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001 Presented by: Liberty Engineering, LLP 1609 Connecticut Avenue

More information

Rittal Liquid Cooling Series

Rittal Liquid Cooling Series Rittal Liquid Cooling Series by Herb Villa White Paper 04 Copyright 2006 All rights reserved. Rittal GmbH & Co. KG Auf dem Stützelberg D-35745 Herborn Phone +49(0)2772 / 505-0 Fax +49(0)2772/505-2319 www.rittal.de

More information

How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions

How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Intel Intelligent Power Management Intel How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Power and cooling savings through the use of Intel

More information

Using Simulation to Improve Data Center Efficiency

Using Simulation to Improve Data Center Efficiency A WHITE PAPER FROM FUTURE FACILITIES INCORPORATED Using Simulation to Improve Data Center Efficiency Cooling Path Management for maximizing cooling system efficiency without sacrificing equipment resilience

More information

Benefits of. Air Flow Management. Data Center

Benefits of. Air Flow Management. Data Center Benefits of Passive Air Flow Management in the Data Center Learning Objectives At the end of this program, participants will be able to: Readily identify if opportunities i where networking equipment

More information

The New Data Center Cooling Paradigm The Tiered Approach

The New Data Center Cooling Paradigm The Tiered Approach Product Footprint - Heat Density Trends The New Data Center Cooling Paradigm The Tiered Approach Lennart Ståhl Amdahl, Cisco, Compaq, Cray, Dell, EMC, HP, IBM, Intel, Lucent, Motorola, Nokia, Nortel, Sun,

More information

AisleLok Modular Containment vs. Legacy Containment: A Comparative CFD Study of IT Inlet Temperatures and Fan Energy Savings

AisleLok Modular Containment vs. Legacy Containment: A Comparative CFD Study of IT Inlet Temperatures and Fan Energy Savings WH I TE PAPE R AisleLok Modular Containment vs. : A Comparative CFD Study of IT Inlet Temperatures and Fan Energy Savings By Bruce Long, Upsite Technologies, Inc. Lars Strong, P.E., Upsite Technologies,

More information

Power and Cooling for Ultra-High Density Racks and Blade Servers

Power and Cooling for Ultra-High Density Racks and Blade Servers Power and Cooling for Ultra-High Density Racks and Blade Servers White Paper #46 Introduction The Problem Average rack in a typical data center is under 2 kw Dense deployment of blade servers (10-20 kw

More information

The Benefits of Supply Air Temperature Control in the Data Centre

The Benefits of Supply Air Temperature Control in the Data Centre Executive Summary: Controlling the temperature in a data centre is critical to achieving maximum uptime and efficiency, but is it being controlled in the correct place? Whilst data centre layouts have

More information

Data Center Environments

Data Center Environments Data Center Environments ASHRAE s Evolving Thermal Guidelines By Robin A. Steinbrecher, Member ASHRAE; and Roger Schmidt, Ph.D., Member ASHRAE Over the last decade, data centers housing large numbers of

More information

USE OF FLOW NETWORK MODELING (FNM) FOR THE DESIGN OF AIR-COOLED SERVERS

USE OF FLOW NETWORK MODELING (FNM) FOR THE DESIGN OF AIR-COOLED SERVERS USE OF FLOW NETWORK MODELING (FNM) FOR THE DESIGN OF AIR-COOLED SERVERS Robin Steinbrecher Intel Corporation 2800 Center Drive Dupont, WA 98327 robin.steinbrecher@intel.com Amir Radmehr, Kanchan M. Kelkar,

More information

Benefits of Cold Aisle Containment During Cooling Failure

Benefits of Cold Aisle Containment During Cooling Failure Benefits of Cold Aisle Containment During Cooling Failure Introduction Data centers are mission-critical facilities that require constant operation because they are at the core of the customer-business

More information

Utilizing Temperature Monitoring to Increase Datacenter Cooling Efficiency

Utilizing Temperature Monitoring to Increase Datacenter Cooling Efficiency WHITE PAPER Utilizing Temperature Monitoring to Increase Datacenter Cooling Efficiency 20A Dunklee Road Bow, NH 03304 USA 2007. All rights reserved. Sensatronics is a registered trademark of. 1 Abstract

More information

Data Center Temperature Rise During a Cooling System Outage

Data Center Temperature Rise During a Cooling System Outage Data Center Temperature Rise During a Cooling System Outage White Paper 179 Revision 1 By Paul Lin Simon Zhang Jim VanGilder > Executive summary The data center architecture and its IT load significantly

More information

Prediction Is Better Than Cure CFD Simulation For Data Center Operation.

Prediction Is Better Than Cure CFD Simulation For Data Center Operation. Prediction Is Better Than Cure CFD Simulation For Data Center Operation. This paper was written to support/reflect a seminar presented at ASHRAE Winter meeting 2014, January 21 st, by, Future Facilities.

More information

Introducing AUDIT-BUDDY

Introducing AUDIT-BUDDY Introducing AUDIT-BUDDY Optimize the Data Center with AUDIT-BUDDY Executive Summary Proper temperature and humidity for the inlet air into the servers is essential to efficient data center operation. Several

More information

IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT

IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT DATA CENTER RESOURCES WHITE PAPER IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT BY: STEVE HAMBRUCH EXECUTIVE SUMMARY Data centers have experienced explosive growth in the last decade.

More information

Introducing AUDIT- BUDDY

Introducing AUDIT- BUDDY Introducing AUDIT- BUDDY Monitoring Temperature and Humidity for Greater Data Center Efficiency 202 Worcester Street, Unit 5, North Grafton, MA 01536 www.purkaylabs.com info@purkaylabs.com 1.774.261.4444

More information

Reducing Room-Level Bypass Airflow Creates Opportunities to Improve Cooling Capacity and Operating Costs

Reducing Room-Level Bypass Airflow Creates Opportunities to Improve Cooling Capacity and Operating Costs WHITE PAPER Reducing Room-Level Bypass Airflow Creates Opportunities to Improve Cooling Capacity and Operating Costs By Lars Strong, P.E., Upsite Technologies, Inc. 505.798.000 upsite.com Reducing Room-Level

More information

Effect of Rack Server Population on Temperatures in Data Centers

Effect of Rack Server Population on Temperatures in Data Centers Effect of Rack Server Population on Temperatures in Data Centers Rajat Ghosh, Vikneshan Sundaralingam, Yogendra Joshi G.W. Woodruff School of Mechanical Engineering Georgia Institute of Technology, Atlanta,

More information

Unique Airflow Visualization Techniques for the Design and Validation of Above-Plenum Data Center CFD Models

Unique Airflow Visualization Techniques for the Design and Validation of Above-Plenum Data Center CFD Models Unique Airflow Visualization Techniques for the Design and Validation of Above-Plenum Data Center CFD Models The MIT Faculty has made this article openly available. Please share how this access benefits

More information

White Paper. Data Center Containment Cooling Strategies. Abstract WHITE PAPER EC9001. Geist Updated August 2010

White Paper. Data Center Containment Cooling Strategies. Abstract WHITE PAPER EC9001. Geist Updated August 2010 White Paper Data Center Containment Cooling Strategies WHITE PAPER EC9001 Geist Updated August 2010 Abstract Deployment of high density IT equipment into data center infrastructure is now a common occurrence

More information

BRUNS-PAK Presents MARK S. EVANKO, Principal

BRUNS-PAK Presents MARK S. EVANKO, Principal BRUNS-PAK Presents MARK S. EVANKO, Principal Data Centers of the Future and the Impact of High Density Computing on Facility Infrastructures - Trends, Air-Flow, Green/LEED, Cost, and Schedule Considerations

More information

Virtual Data Centre Design A blueprint for success

Virtual Data Centre Design A blueprint for success Virtual Data Centre Design A blueprint for success IT has become the back bone of every business. Advances in computing have resulted in economies of scale, allowing large companies to integrate business

More information

Using Simulation to Improve Data Center Efficiency

Using Simulation to Improve Data Center Efficiency A WHITE PAPER FROM FUTURE FACILITIES INCORPORATED Using Simulation to Improve Data Center Efficiency Cooling Path Management for maximizing cooling system efficiency without sacrificing equipment resilience

More information

Managing Data Centre Heat Issues

Managing Data Centre Heat Issues Managing Data Centre Heat Issues Victor Banuelos Field Applications Engineer Chatsworth Products, Inc. 2010 Managing Data Centre Heat Issues Thermal trends in the data centre Hot Aisle / Cold Aisle design

More information

Supporting Cisco Switches In Hot Aisle/Cold Aisle Data Centers

Supporting Cisco Switches In Hot Aisle/Cold Aisle Data Centers CABINETS: ENCLOSED THERMAL MOUNTING MANAGEMENT SYSTEMS WHITE PAPER Supporting Cisco Switches In Hot Aisle/Cold Aisle Data Centers 800-834-4969 techsupport@chatsworth.com www.chatsworth.com All products

More information

Chiller-less Facilities: They May Be Closer Than You Think

Chiller-less Facilities: They May Be Closer Than You Think Chiller-less Facilities: They May Be Closer Than You Think A Dell Technical White Paper Learn more at Dell.com/PowerEdge/Rack David Moss Jon Fitch Paul Artman THIS WHITE PAPER IS FOR INFORMATIONAL PURPOSES

More information

Energy Efficiency Opportunities in Federal High Performance Computing Data Centers

Energy Efficiency Opportunities in Federal High Performance Computing Data Centers Energy Efficiency Opportunities in Federal High Performance Computing Data Centers Prepared for the U.S. Department of Energy Federal Energy Management Program By Lawrence Berkeley National Laboratory

More information

How Does Your Data Center Measure Up? Energy Efficiency Metrics and Benchmarks for Data Center Infrastructure Systems

How Does Your Data Center Measure Up? Energy Efficiency Metrics and Benchmarks for Data Center Infrastructure Systems How Does Your Data Center Measure Up? Energy Efficiency Metrics and Benchmarks for Data Center Infrastructure Systems Paul Mathew, Ph.D., Staff Scientist Steve Greenberg, P.E., Energy Management Engineer

More information

DataCenter 2020: first results for energy-optimization at existing data centers

DataCenter 2020: first results for energy-optimization at existing data centers DataCenter : first results for energy-optimization at existing data centers July Powered by WHITE PAPER: DataCenter DataCenter : first results for energy-optimization at existing data centers Introduction

More information

Data center TCO; a comparison of high-density and low-density spaces

Data center TCO; a comparison of high-density and low-density spaces White Paper Data center TCO; a comparison of high-density and low-density spaces M.K. Patterson, D.G. Costello, & P. F. Grimm Intel Corporation, Hillsboro, Oregon, USA M. Loeffler Intel Corporation, Santa

More information

HPPR Data Center Improvements Projects Towards Energy Efficiency (CFD Analysis, Monitoring, etc.)

HPPR Data Center Improvements Projects Towards Energy Efficiency (CFD Analysis, Monitoring, etc.) HPPR Data Center Improvements Projects Towards Energy Efficiency (CFD Analysis, Monitoring, etc.) Felipe Visbal / March 27, 2013 HPPR Data Center Technologies Team Services Data Centers: Thermal Assessments

More information

Challenges In Intelligent Management Of Power And Cooling Towards Sustainable Data Centre

Challenges In Intelligent Management Of Power And Cooling Towards Sustainable Data Centre Challenges In Intelligent Management Of Power And Cooling Towards Sustainable Data Centre S. Luong 1*, K. Liu 2, James Robey 3 1 Technologies for Sustainable Built Environments, University of Reading,

More information

Best Practices for Wire-free Environmental Monitoring in the Data Center

Best Practices for Wire-free Environmental Monitoring in the Data Center White Paper Best Practices for Wire-free Environmental Monitoring in the Data Center April 2012 Introduction Monitoring for environmental threats in the data center is not a new concept. Since the beginning

More information

How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions

How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Intel Intelligent Power Management Intel How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Power savings through the use of Intel s intelligent

More information

Technology Corporation

Technology Corporation 1 White Paper Meeting The Increased Demand For Efficient Computer Room Cooling Server Cooling Problems: An Overview As microprocessors and other electronic components in servers grow more powerful, they

More information

Data Center Temperature Rise During a Cooling System Outage

Data Center Temperature Rise During a Cooling System Outage Data Center Temperature Rise During a Cooling System Outage White Paper 179 Revision 0 By Paul Lin Simon Zhang Jim VanGilder > Executive summary The data center architecture and its IT load significantly

More information

2006 APC corporation. Cooling Solutions and Selling Strategies for Wiring Closets and Small IT Rooms

2006 APC corporation. Cooling Solutions and Selling Strategies for Wiring Closets and Small IT Rooms Cooling Solutions and Selling Strategies for Wiring Closets and Small IT Rooms Agenda Review of cooling challenge and strategies Solutions to deal with wiring closet cooling Opportunity and value Power

More information

Datacenter Efficiency

Datacenter Efficiency EXECUTIVE STRATEGY BRIEF Operating highly-efficient datacenters is imperative as more consumers and companies move to a cloud computing environment. With high energy costs and pressure to reduce carbon

More information

Cooling Audit for Identifying Potential Cooling Problems in Data Centers

Cooling Audit for Identifying Potential Cooling Problems in Data Centers Cooling Audit for Identifying Potential Cooling Problems in Data Centers By Kevin Dunlap White Paper #40 Revision 2 Executive Summary The compaction of information technology equipment and simultaneous

More information

Greening Commercial Data Centres

Greening Commercial Data Centres Greening Commercial Data Centres Fresh air cooling giving a PUE of 1.2 in a colocation environment Greater efficiency and greater resilience Adjustable Overhead Supply allows variation in rack cooling

More information

Data Center Power Consumption

Data Center Power Consumption Data Center Power Consumption A new look at a growing problem Fact - Data center power density up 10x in the last 10 years 2.1 kw/rack (1992); 14 kw/rack (2007) Racks are not fully populated due to power/cooling

More information

Environmental Data Center Management and Monitoring

Environmental Data Center Management and Monitoring 2013 Raritan Inc. Table of Contents Introduction Page 3 Sensor Design Considerations Page 3 Temperature and Humidity Sensors Page 4 Airflow Sensor Page 6 Differential Air Pressure Sensor Page 6 Water Sensor

More information

The State of Data Center Cooling

The State of Data Center Cooling White Paper Digital Enterprise Group The State of Data Center Cooling A review of current air and liquid cooling solutions. The challenges of increased data center density elevate the importance of the

More information

Data Center Energy Consumption

Data Center Energy Consumption Data Center Energy Consumption The digital revolution is here, and data is taking over. Human existence is being condensed, chronicled, and calculated, one bit at a time, in our servers and tapes. From

More information

Extend the Life of Your Data Center By Ian Seaton Global Technology Manager iseaton@chatsworth.com

Extend the Life of Your Data Center By Ian Seaton Global Technology Manager iseaton@chatsworth.com C P I C O N TA I N M E N T S O L U T I O N S WHITE PAPER Extend the Life of Your Data Center By Ian Seaton Global Technology Manager iseaton@chatsworth.com Published August 2013 800-834-4969 techsupport@chatsworth.com

More information

IT White Paper MANAGING EXTREME HEAT: COOLING STRATEGIES FOR HIGH-DENSITY SYSTEMS

IT White Paper MANAGING EXTREME HEAT: COOLING STRATEGIES FOR HIGH-DENSITY SYSTEMS IT White Paper MANAGING EXTREME HEAT: COOLING STRATEGIES FOR HIGH-DENSITY SYSTEMS SUMMARY As computer manufacturers pack more and more processing power into smaller packages, the challenge of data center

More information

Specialty Environment Design Mission Critical Facilities

Specialty Environment Design Mission Critical Facilities Brian M. Medina PE Associate Brett M. Griffin PE, LEED AP Vice President Environmental Systems Design, Inc. Mission Critical Facilities Specialty Environment Design Mission Critical Facilities March 25,

More information

AIR-SITE GROUP. White Paper. Green Equipment Room Practices

AIR-SITE GROUP. White Paper. Green Equipment Room Practices AIR-SITE GROUP White Paper Green Equipment Room Practices www.air-site.com Common practices to build a green equipment room 1 Introduction Air-Site (www.air-site.com) is a leading international provider

More information

The Effect of Data Centre Environment on IT Reliability & Energy Consumption

The Effect of Data Centre Environment on IT Reliability & Energy Consumption The Effect of Data Centre Environment on IT Reliability & Energy Consumption Steve Strutt EMEA Technical Work Group Member IBM The Green Grid EMEA Technical Forum 2011 Agenda History of IT environmental

More information

How To Improve Energy Efficiency In A Data Center

How To Improve Energy Efficiency In A Data Center Google s Green Data Centers: Network POP Case Study Table of Contents Introduction... 2 Best practices: Measuring. performance, optimizing air flow,. and turning up the thermostat... 2...Best Practice

More information

The Effect of Data Center Temperature on Energy Efficiency

The Effect of Data Center Temperature on Energy Efficiency The Effect of Data Center Temperature on Energy Efficiency Michael K Patterson Intel Corporation 2111 NE 25 th Avenue Hillsboro, Oregon, USA, 97124 Phone: (503) 712-3991 Email: michael.k.patterson@intel.com

More information

FNT EXPERT PAPER. // Data Center Efficiency AUTHOR. Using CFD to Optimize Cooling in Design and Operation. www.fntsoftware.com

FNT EXPERT PAPER. // Data Center Efficiency AUTHOR. Using CFD to Optimize Cooling in Design and Operation. www.fntsoftware.com FNT EXPERT PAPER AUTHOR Oliver Lindner Head of Business Line DCIM FNT GmbH // Data Center Efficiency Using CFD to Optimize Cooling in Design and Operation Energy is the biggest cost factor with the highest

More information

A Comparative Study of Various High Density Data Center Cooling Technologies. A Thesis Presented. Kwok Wu. The Graduate School

A Comparative Study of Various High Density Data Center Cooling Technologies. A Thesis Presented. Kwok Wu. The Graduate School A Comparative Study of Various High Density Data Center Cooling Technologies A Thesis Presented by Kwok Wu to The Graduate School in Partial Fulfillment of the Requirements for the Degree of Master of

More information

Server Platform Optimized for Data Centers

Server Platform Optimized for Data Centers Platform Optimized for Data Centers Franz-Josef Bathe Toshio Sugimoto Hideaki Maeda Teruhisa Taji Fujitsu began developing its industry-standard server series in the early 1990s under the name FM server

More information

Improving Rack Cooling Performance Using Airflow Management Blanking Panels

Improving Rack Cooling Performance Using Airflow Management Blanking Panels Improving Rack Cooling Performance Using Airflow Management Blanking Panels By Neil Rasmussen White Paper #44 Revision 3 Executive Summary Unused vertical space in open frame racks and rack enclosures

More information

In Row Cooling Options for High Density IT Applications

In Row Cooling Options for High Density IT Applications In Row Cooling Options for High Density IT Applications By Ramzi Y. Namek, PE, LEED AP, BD+C; Director of Engineering 1 Users wishing to deploy high density IT racks have several In Row Cooling solutions

More information

Power and Cooling Guidelines for Deploying IT in Colocation Data Centers

Power and Cooling Guidelines for Deploying IT in Colocation Data Centers Power and Cooling Guidelines for Deploying IT in Colocation Data Centers White Paper 173 Revision 0 by Paul Lin and Victor Avelar Executive summary Some prospective colocation data center tenants view

More information

Optimizing Network Performance through PASSIVE AIR FLOW MANAGEMENT IN THE DATA CENTER

Optimizing Network Performance through PASSIVE AIR FLOW MANAGEMENT IN THE DATA CENTER Optimizing Network Performance through PASSIVE AIR FLOW MANAGEMENT IN THE DATA CENTER Lylette Macdonald, RCDD Legrand Ortronics BICSI Baltimore 2011 Agenda: Discuss passive thermal management at the Rack

More information

Improving Data Center Efficiency with Rack or Row Cooling Devices:

Improving Data Center Efficiency with Rack or Row Cooling Devices: Improving Data Center Efficiency with Rack or Row Cooling Devices: Results of Chill-Off 2 Comparative Testing Introduction In new data center designs, capacity provisioning for ever-higher power densities

More information

Office of the Government Chief Information Officer. Green Data Centre Practices

Office of the Government Chief Information Officer. Green Data Centre Practices Office of the Government Chief Information Officer Green Data Centre Practices Version : 2.0 April 2013 The Government of the Hong Kong Special Administrative Region The contents of this document remain

More information

Managing Cooling Capacity & Redundancy In Data Centers Today

Managing Cooling Capacity & Redundancy In Data Centers Today Managing Cooling Capacity & Redundancy In Data Centers Today About AdaptivCOOL 15+ Years Thermal & Airflow Expertise Global Presence U.S., India, Japan, China Standards & Compliances: ISO 9001:2008 RoHS

More information

AEGIS DATA CENTER SERVICES POWER AND COOLING ANALYSIS SERVICE SUMMARY

AEGIS DATA CENTER SERVICES POWER AND COOLING ANALYSIS SERVICE SUMMARY AEGIS DATA CENTER SERVICES POWER AND COOLING ANALYSIS SERVICE SUMMARY The Aegis Services Power and Assessment Service provides an assessment and analysis of your data center facility and critical physical

More information

Impacts of Perforated Tile Open Areas on Airflow Uniformity and Air Management Performance in a Modular Data Center

Impacts of Perforated Tile Open Areas on Airflow Uniformity and Air Management Performance in a Modular Data Center Impacts of Perforated Tile Open Areas on Airflow Uniformity and Air Management Performance in a Modular Data Center Sang-Woo Ham 1, Hye-Won Dong 1, Jae-Weon Jeong 1,* 1 Division of Architectural Engineering,

More information