Typical Air Cooled Data Centre Compared to Iceotope's Liquid Cooled Solution
White Paper, July 2013
Yong Quiang Chi, University of Leeds
Contents
  Background of the Work
  The Design of the Iceotope Liquid Cooled System
  3M HFE Liquid Properties
  Natural Convection Principle
  Main Advantage (Thermal)
  Other Benefits
  Disadvantages
  Project Overview
  Case Study
  System Configuration
  Energy Cost and Temperature
Background of the Work

As higher performance and denser data centres are deployed, the temperature of computer components becomes critical and traditional air cooled methods are pushed to the limits of their ability to remove heat from the data centre. Under such circumstances a more advanced liquid cooled encapsulation technology has been deployed at the University of Leeds, which enables increased performance and density while also resolving the cooling issue by way of liquid cooling. Liquid cooling is thousands of times more effective at removing heat than air and allows all components to run room neutral. This unique and patented technology is called the Iceotope Solution and is proudly designed, engineered and manufactured in the UK by Iceotope.

Traditional air cooled data centres use chiller and air conditioning systems to achieve lower air temperatures in order to maintain the chip temperatures on the motherboard. With such a design an air cooled data centre can achieve higher computing density and performance, yet the fans and chillers account for extra energy consumption, resulting in a worse power usage effectiveness (PUE). To resolve the conflict between efficiency and performance in the data centre, liquid cooled solutions, similar to those deployed in the 1970s, are being reintroduced and are starting to be seen more often in modern retrofit and new build data centres. Designs such as backdoor water-cooled and waterblock solutions now appear regularly in high efficiency data centres and can bring the PUE down to 1.2 or lower. However, such liquid cooled solutions are still built around an air-cooled design, and there is still potential for liquid cooled designs to achieve even higher efficiencies. As a result, fully immersed liquid cooled solutions such as the liquid encapsulation solution offered by Iceotope have been introduced into the market to achieve the ultimate efficiency in data centre design.
With the motherboard fully encapsulated inside the Iceotope Module, which contains a dielectric coolant, the solution does not require air to transport heat. The solution is also modular and scalable, making it one of the most successful liquid cooled designs available in the data centre market today.

The Design of the Iceotope Liquid Cooled System

The design of the Iceotope Solution is based on the principle of liquid encapsulation, using 3M Novec Engineered Fluid as the dielectric coolant. Coolant motion in the primary stage relies on density driven natural convection. In this stage the Novec is in direct contact with all electronic components and heat sinks, and so extracts their heat to the cold plate; the other side of the cold plate makes contact with a water jacket, which is cast into the design of the Iceotope Module.

Figure 1 Iceotope Module (computer node) in detail
The rest of the Iceotope Solution features a gravity fed water circulation system that is thermally coupled to the outside environment. This water system can flexibly connect to other applications such as a local and/or building water supply, and also allows the option of heat reuse by way of hot water, which can heat the building or nearby facilities. The Iceotope installation at the University of Leeds reused the heat through two domestic radiators that heat a laboratory.

Figure 2 Iceotope Platform (cabinet) Diagram
Figure 3 Iceotope Solution Expanded View
3M HFE Liquid Properties

The first problem facing fully immersed liquid cooled systems is the choice of the liquid agent, which directly contacts the electronic parts in computers. Although water is one of the best liquids for heat transfer, it is also an electrical conductor and would not just damage but completely destroy a motherboard immediately on contact with a live circuit board. Historically, some liquid cooled solutions have used dielectric oils (i.e. mineral oils) as thermal agents. However, mineral oils are both poor thermal agents and poor-flowing fluids, and as such mineral oil based systems have remained at the concept stage for many years with little progress. In recent years 3M has developed a family of hydrofluoroether engineered liquids (Novec) that are suitable for electronics cleaning and thermal management. In 2009 Novec was tested by 3M and found to possess the required properties for computer liquid cooling applications. The Iceotope Solution is currently the only liquid encapsulated system with a market-ready product utilising Novec in its design.

Table 1 3M Novec Engineered Fluid Properties
  Molecular Weight:                    350
  Boiling Point (1):                   98 °C
  Freeze Point (1):                    -35 °C
  Density (2):                         1660 kg/m³
  Specific Heat Capacity:              1140 J/(kg·K)
  Thermal Conductivity:                0.069 W/(m·K)
  Dynamic Viscosity (2):               0.001176 Pa·s
  Kinematic Viscosity (2):             0.71 cSt
  Thermal Expansion Coefficient (2):   0.00145 1/K
(1) At 760 mmHg  (2) At 25 °C

Natural Convection Principle

Natural convection is a flow mechanism which usually occurs in a closed volume with a heating and a cooling source. When fluid in the closed volume is heated by the source its density changes: the heated (usually less dense) fluid rises to the top while the cooled (usually denser) fluid sinks to the bottom. This fluid motion usually forms a loop, such that the cooled fluid feeds back to the heat source and sustains the motion.
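The driving force described above can be estimated with the Boussinesq approximation: a temperature rise ΔT lowers the density by ρβΔT, and acting over the loop height this density difference supplies the pressure that circulates the coolant. A minimal sketch, using the Novec properties from Table 1 but with an illustrative temperature rise and loop height that are not taken from the paper:

```python
# Rough estimate of the buoyancy pressure available to drive natural
# convection, using the Boussinesq approximation:
#   delta_rho = rho * beta * delta_T
#   delta_p   = delta_rho * g * height
RHO_NOVEC = 1660.0    # kg/m^3, density from Table 1
BETA = 0.00145        # 1/K, thermal expansion coefficient from Table 1
G = 9.81              # m/s^2

def buoyancy_pressure(delta_t_k, loop_height_m, rho=RHO_NOVEC, beta=BETA):
    """Pressure difference (Pa) between the hot and cold legs of the loop."""
    return rho * beta * delta_t_k * G * loop_height_m

# Illustrative figures (not from the paper): a 20 K temperature rise over a
# 0.5 m tall module yields only a few hundred pascals of driving pressure.
print(f"{buoyancy_pressure(20.0, 0.5):.0f} Pa")  # -> 236 Pa
```

The small result illustrates why the convection loop is slow and why a dense, high-expansion fluid is preferred over air for this role.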
Figure 4 Diagram of Density Driven Natural Convection
By definition, a fluid with a higher thermal expansion coefficient produces a stronger natural convection effect. Novec has a thermal expansion coefficient of 0.00145 1/K, which is relatively high among common liquids.

Main Advantage (Thermal)

The primary reason for using a liquid cooled solution rather than a traditional air cooled solution is the heat transfer efficiency of liquid. Because of the high thermal conductivity and heat capacity that liquids usually possess, a pure liquid-to-liquid heat exchanger, with no air-to-liquid stage, can operate with a much lower delta temperature. The typical properties of liquids versus air can be seen in the table below:

Table 2 Thermal Properties
                                      Unit        Water       Air       Novec
  Density                             kg/m³       997.0       1.1839    1660
  Specific Heat Capacity              J/(kg·K)    4181.8      1005      1140
  Thermal Conductivity                W/(m·K)     0.6         0.025     0.069
  Kinematic Viscosity (2)             cSt         0.89        15.68     0.71
  Thermal Expansion Coefficient (2)   1/K         0.000257    0.0034    0.00145
  Specific heat per unit volume       J/(m³·K)    4,169,255   1,189.8   1,892,400
(1) At 760 mmHg  (2) At 25 °C

Gases also have a very low density (air is roughly 1/1000 the density of water), so driving gas (air) over an object to carry away a given amount of heat requires pushing a much larger volume at higher speed, and it is still slower than a liquid. Another advantage of using a liquid as a thermal medium, particularly one with a large thermal expansion coefficient, is the ability to use density driven natural convection to transport thermal energy rather than forced convection. As previously stated, the natural convection flow is driven by the density change of the fluid with temperature; it is essentially a true free cooling method, which can deliver heat energy over a distance without costing any extra energy.
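The "specific heat per unit volume" row in Table 2 is simply density multiplied by specific heat capacity, and it is the quantity that matters when comparing coolants volume for volume. A quick check of the table's values (the dictionary layout is purely illustrative):

```python
# Volumetric heat capacity (density x specific heat capacity) at 25 C,
# using the figures from Table 2. This determines how much heat a given
# volume of coolant carries per kelvin of temperature rise.
fluids = {
    # name: (density kg/m^3, specific heat J/(kg K))
    "water": (997.0, 4181.8),
    "air":   (1.1839, 1005.0),
    "novec": (1660.0, 1140.0),
}

vhc = {name: rho * shc for name, (rho, shc) in fluids.items()}  # J/(m^3 K)

for name, q in vhc.items():
    print(f"{name}: {q:,.0f} J/(m^3 K)")

# Per unit volume, Novec carries roughly 1,600 times more heat than air:
print(f"novec/air ratio: {vhc['novec'] / vhc['air']:.0f}")
```

This ratio is the arithmetic behind the claim that liquid cooling is thousands of times more effective than air at removing heat (for water the ratio is about 3,500).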
Generally speaking, all fluids (gas or liquid) with thermal expansion could offer natural convection; however, the driving force of natural convection is usually small, so the convection flow is slow. Air, with its low density, is therefore not ideal for delivering large amounts of heat at low speed, and for this reason Iceotope chose Novec for the design of their products.

Other Benefits

Along with the significant thermal advantage, there are other benefits to using liquid cooled systems. In the Iceotope design the primary coolant is held in a static state within a static seal (the encapsulation), so no motion or extra moving parts are required in the primary stage of the system. The Iceotope Solution deployed at the University of Leeds (500 kW) requires 64 pumps (N+N) in the server room with only 32 of them active, compared to a similar air cooled solution which usually has more than 1,000 fans spinning, and the noise level is almost zero.
Another benefit of liquid cooling is that all the motherboards are contained within sealed containers, creating fully controlled environments impervious to dust, humidity, pollution and vibration, and improving accessibility for the server room as well as for the modules (computer nodes) themselves. A final benefit, unique to Iceotope's design, is the option for heat recapture and reuse.

Disadvantages

Although liquid cooled systems have obvious thermal benefits compared to traditional air cooled systems, the major disadvantages of most liquid cooled solutions on the market today are sealing and the risk of liquid leakage. In most fully immersed liquid cooled designs the primary cooling stage uses forced convection, in other words pumping liquid through the first stage. The Iceotope Solution avoids motion in the primary loop by using natural convection instead of pumped coolant / forced convection, which improves reliability in deployment; water is still pumped through each module and the cabinet itself in the secondary loop. Another disadvantage of most liquid cooled systems is the internal pressure inside the system. Consider a typical 42U rack approximately 2 metres tall: the water column from top to bottom produces about 0.2 atm of gauge pressure. With the pumping force needed to overcome the head loss of the system, the actual pressure inside a liquid cooled system can reach +/- 0.5 atm gauge. Although 0.5 atm is not a high value for a general industrial water circuit, it is significantly higher than the roughly 0.0035 atm of air pressure which a powerful 120 mm turbine cooling fan can generate (Delta PFB1212UHE, 252 CFM, 45 W at 5500 RPM).

Project Overview

This project is based on a review of the Iceotope liquid cooled encapsulated computer server system as a data centre solution.
It is the first of its kind ready for the commercial market, and aims to provide high efficiency data centre solutions, with lower PUEs and full time free cooling, to data centre operators in any geographic location in the world. In 2012 the University of Leeds was the first user to fully deploy an Iceotope Solution beta system, comprising one cabinet with 11 computing modules and 2x2 PSU modules, inside a university lab in the Mechanical Engineering Building. The focus of the project was to compare a liquid cooled solution with an identical air cooled solution, based on experimental data acquired from the Iceotope Solution and from the university's existing HPC data centre. It is important to note that the Iceotope Solution is a single standalone supercomputer which is not fully populated, while the university's HPC data centre is a mixture of air cooled, backdoor water-cooled, old and new servers, so the two cannot be compared directly.

Table 3 Basic Configuration of the HPC Server and Iceotope Solution at the University of Leeds
  Real system:            HPC Centre at the University of Leeds | Iceotope Solution at the University of Leeds
  Capacity:               480 kW (240 kW x2) | 3 kW
  Computer system:        Mixture of Sun, Dell and Intel | SuperMicro
  CPU / GPU:              Intel / AMD / nVidia / others | Intel / AMD
  Rack cooling method:    84 kW (28 kW x3) Airedale backdoor water cooled + 170 kW conventional air cooled | Fully liquid cooled (Iceotope)
  External heat exhaust:  Airedale free cooling | Passive air cooled
Figure 5 Photo of HPC Servers in the University of Leeds' Server Room Figure 6 Photo of Iceotope Solution in the University of Leeds' Lab
Case Study

Assumptions have been made during this project in order to scale up the Iceotope system and make a fair comparison. In this project the fabricated liquid cooled and air cooled systems are based on an identical system, as noted in the table below.

Table 4 Fabricated System Configurations
  Fabricated system:      Air Cooled System | Liquid Cooled System
  Capacity:               480 kW (240 kW x2) | 480 kW (240 kW x2)
  Computer system:        SuperMicro X9D series | SuperMicro X9D series
  CPU / GPU:              Intel (E5-2620) | Intel (E5-2620)
  Rack cooling method:    Backdoor water cooled (Airedale OnRak 28 kW) | Fully liquid cooled (Iceotope)
  External heat exhaust:  Airedale free cooling | Airedale free cooling

In the table above the Iceotope Solution has been scaled up to match the size of the Leeds HPC system, assuming the use of the same motherboard and CPU (SuperMicro X9D with Intel E5-2620) and the Airedale Ultima Compact Free Cool heat exhaust solution. With these assumptions the only physical difference between the two systems is the computer and cabinet cooling method, which shows how much energy can be saved just by switching the air cooled portion of the data centre to a fully liquid cooled design.

System Configuration

The air cooled system in this project is based on a typical SuperMicro 2U rack design (CSE-217HQ-R1K62MB) with four 1U motherboards inside a 2U package. In such a configuration each computer node has a typical N+N redundant PSU layout: four computer boards to one 1+1 PSU pair.

Figure 7 SuperMicro 2027TR-HTRF+ 2U Rack System (CSE-217HQ-R1K62MB) Computer / PSU Node Layout
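The scale-up above can be sketched numerically. The per-board power figure and the board counts follow the configuration used later in the energy stack-up table (48 boards per rack across 32 racks); the intermediate variable names are purely illustrative:

```python
# Sketch of the scale-up arithmetic: how many boards and racks the ~480 kW
# facility implies. Counts follow the energy stack-up configuration:
# 4 x 1U boards per 2U chassis, 48 boards per rack, 32 racks.
BOARD_POWER_W = 220.0    # one SuperMicro X9D board with Intel E5-2620
BOARDS_PER_2U = 4        # four 1U motherboards in one 2U package
BOARDS_PER_RACK = 48     # i.e. twelve 2U chassis per rack
RACKS = 32

boards = RACKS * BOARDS_PER_RACK               # total compute boards
it_compute_kw = boards * BOARD_POWER_W / 1000  # compute portion of IT load

print(f"{boards} boards, {it_compute_kw:.2f} kW of compute load")
```

This reproduces the 1536-board, 337.92 kW compute figure that anchors the server-level rows of the energy comparison.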
Figure 8 500kW Air Cooled Data Centre Layout with Chillers, Air Conditioning and Rack Level Backdoor Water-cooling

The liquid cooled system is based on the Iceotope Solution, with 8 computer nodes and 2+2 PSU nodes at each rack level.

Figure 9 Iceotope Liquid Cooled System Computer / PSU Node Layout
Figure 10 500kW Fully Immersed Liquid Cooled Data Centre Layout

Although the air cooled and liquid cooled systems have little in common in their layouts, they share the same computer node to PSU ratio (4:1), which gives them identical topologies for the energy efficiency calculation.

Figure 11 SuperMicro Air Cooled Rack Layout (left) Compared to Iceotope Liquid Cooled Rack Layout

Energy Cost and Temperature

The energy calculation for a typical 10 kW rack in both the air cooled and liquid cooled configurations is demonstrated in the table below, assuming both designs are identical at the computing level.
Table 5 Energy Stack-up Data of an Air Cooled Data Centre Compared to a Liquid Cooled Data Centre
(Each row: type; unit load; number; air cooled energy cost | liquid cooled energy cost)

Server level:
  SuperMicro X9DTT / Intel Xeon E5-2620; 220 W, 201 Gflops; 48x32 = 1536; 337.92 kW, 308 Tflops | 337.92 kW, 308 Tflops
  System cooling fans (Nidec V80E12BS2); 23.4 W; 48x32 = 1536; 35.942 kW | N/a
  Storage (Intel SSD 330); 0.85 W; 48x32 = 1536; 1.306 kW | 1.306 kW
  PSU (SuperMicro PWS1K62); 7% loss x 880 W = 61.6 W; 24x32 = 768 (384 active); 15.342 kW | 15.342 kW
  PSU fan (Nidec R40W12BGCA); 15.8 W; 24x32 = 768 (384 active); 6.067 kW | 6.067 kW
  IT load total: 396.3 kW | 360.4 kW
  Gflops/W: 0.777 | 0.855

Rack level:
  Rack fan (Airedale LogiCool OnRak LOR6042UC028-0); 161 W; 32; 5.152 kW | N/a
  Pump (Grundfos Alpha 2L); 45 W; 2x32 = 64 (32 active); N/a | 1.44 kW
  Telco equipment (D-Link DGS-1210-48); 33.4 W; 2x32 = 64; 2.138 kW | 2.138 kW
  PDU (Avocent PM3000); 3.5% loss x 10.56 kW = 369.6 W; 32; 11.827 kW | 11.827 kW
  Rack component total: 19.117 kW | 15.405 kW
  Total load: 415.4 kW | 375.8 kW

Total facility power:
  UPS (APC Symmetra PX 250 kW); 4% loss x 250 kW = 10 kW; 2; 20 kW | 20 kW
  Chiller, compressors on (Airedale Ultima Compact UCFC250D-8/2); 79.9 kW; 2; 159.8 kW | N/a
  Chiller, free cooling mode (chiller off); 7.824 kW; 2; N/a | 16.848 kW
  CRACs / ventilation (Airedale AlpaCool DF25A / CUS8.5); 880 W; 20; 17.6 kW | N/a
  Facility component total: 197.4 kW | 36.848 kW
  Total facility load: 612.8 kW | 412.7 kW
  Total PUE: 1.475 | 1.098
  Gflops/W: 0.503 | 0.746

Notice: the energy data above is based on both systems running at full load.

The table above demonstrates that the liquid cooled solution can achieve a PUE of 1.098, compared with about 1.48 for the identical air cooled system. In actual fact, at the IT level and
cabinet level the liquid cooled solution does not gain a great deal in energy efficiency; however at the building level, where chillers and compressors can be switched off, the liquid cooled system saves a significant amount of energy. Most air conditioning units and/or chillers have an EER (Energy Efficiency Ratio) of about 3 at full load, which equates to an ideal chiller-limited PUE of no better than (1 + EER) / EER = 1.333. Dr Jon Summer, Senior Lecturer at the University of Leeds, notes that the HPC data centre at the university requires approximately 100 W of cooling power to cool a 250 W system, a PUE of 1.4. The reason a liquid cooled system can avoid chillers is that in a liquid encapsulated system the liquid-to-liquid heat exchangers can have extremely low delta temperatures from one stage to the next. It therefore does not require chillers and/or compressors to bring down the temperature and create a larger delta temperature.

Table 6 Temperature Stack-up Data of an Air Cooled Data Centre Compared to a Liquid Cooled Data Centre
(Each row: medium; temperature in / out)
                    Air cooled system               Liquid cooled system
  Ambient           Air; 25 °C                      Air; 25 °C
  Chiller           R407c / water; 19 °C / 13 °C    Water; 32 °C / 28 °C
  Building water    Water; 13 °C / 19 °C            Water; 28 °C / 32 °C
  Ventilation       Air / water; 22 °C              N/a
  Rack              Air; 25 °C / 45 °C              Water; 32 °C / 38 °C
  Computer node     Air; 60 °C                      Water 32 °C / 38 °C; HFE-7300 55 °C
  CPU               Air; 75 °C                      HFE-7300; 75 °C
  Max delta temperature:  63 °C                     50 °C

The table above shows the temperatures at each part of the data centre for both the air cooled and the liquid cooled solution. With the air cooled solution the maximum delta temperature is 13 °C greater than that of the liquid cooled solution; as a result it has to use chillers to generate that extra temperature gap.
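The headline efficiency figures above can be reproduced in a few lines; the load totals are taken directly from Table 5, and the chiller bound uses the EER relation quoted above (function names are illustrative):

```python
# PUE = total facility power / IT load, using the totals from Table 5
# (the table treats the rack-level total as the IT load), plus the
# ideal lower bound on PUE for any chiller-based design.
def pue(it_load_kw, facility_total_kw):
    return facility_total_kw / it_load_kw

def ideal_chiller_pue(eer):
    # Every watt of IT heat removed by a chiller costs 1/EER watts of
    # chiller power, so PUE >= (EER + 1) / EER.
    return (eer + 1.0) / eer

print(f"air cooled:    PUE = {pue(415.4, 612.8):.3f}")   # -> 1.475
print(f"liquid cooled: PUE = {pue(375.8, 412.7):.3f}")   # -> 1.098
print(f"EER 3 chiller bound: {ideal_chiller_pue(3.0):.3f}")  # -> 1.333
print(f"Leeds HPC (100 W cooling per 250 W IT): {(250 + 100) / 250:.1f}")
```

The gap between the 1.475 air cooled result and the 1.333 chiller-limited bound reflects the fans, CRACs and distribution losses that sit on top of the chiller itself.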