The Thermal Bus Opportunity: A Quantum Leap in Data Center Cooling Potential


2005, American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc. (www.ashrae.org). Reprinted by permission from ASHRAE Transactions, Volume 111, Part 2. The material may not be copied nor distributed in either paper or digital form without ASHRAE's permission.

DE

The Thermal Bus Opportunity: A Quantum Leap in Data Center Cooling Potential

Paul L. Leonard, PE, Member ASHRAE
A.L. (Fred) Phillips

ABSTRACT

This paper describes an attainable means of providing the required cooling to server-based data equipment, one that not only satisfies the increasing heat load density in datacomm facilities but does so at a reduced infrastructure system first cost and a reduced operating cost over the life of the facility. First, the paper discusses loop thermosyphon and thermal bus technology, in particular the testing and development that have been accomplished to apply this technology at the server cabinet level. Next, the impact of integrating this technology into the datacomm facility infrastructure is examined. First cost and operating cost comparisons are presented for the facility infrastructure required to support conventional air-cooled server equipment and for the infrastructure required to support components cooled via a thermal bus architecture with a liquid-cooled strategy integral to the facility. The analysis concludes that applying a liquid-cooled thermal bus system to high-density electronic components not only results in lower first cost and lower operating cost for the facility but also allows a greater density of equipment deployment than air-cooled systems, further maximizing the benefit.

1. Datacomm facility is inclusive terminology used in this paper for a data center, a communication facility, or a combination of both types of facilities.

INTRODUCTION

The transition of existing data centers has been well documented in recent years, from systems that were planned around mainframe equipment to today's datacomm 1 facility topologies that are densely deployed with air-cooled, server-based equipment organized in cabinets. Organization strategies that optimize raised-floor cooling airflow distribution (such as hot aisle/cold aisle) have been proposed and are being implemented across the industry. Through the use of software tools such as computational fluid dynamics modeling below and above the raised floor, conventional air-cooled server designs are being pushed to their limit, even as the desire to densify the layout from an information technology perspective continues to grow.

A path forward that allows a greater equipment deployment density in current legacy (or inherited) data centers is the application of loop thermosyphons and a cabinet-level thermal bus heat rejection architecture, which allows sealed-cabinet server cooling with heat rejection outside the server cabinet via a liquid loop. Increasing power densities at the central processing unit (CPU) level already require the use of heat pipes and even pumped liquid coolant systems to maintain required operating temperatures. These thermal transport systems typically extract a high heat flux at the chip and spread it to an extended fin structure elsewhere in the package to transfer the heat to air. With a little (and literal) out-of-the-box thinking, this thermal transport can be extended out of the box, where it can mate with a thermal bus built into the cabinet. With the thermal bus connected directly to facility liquid circuits, air is eliminated as a heat transport medium.

Since air is a relatively poor heat transport medium, its elimination leads to major reductions in system operating temperature differences. In most climates and conditions, this reduction is sufficient to allow the elimination of the refrigeration component of the facility cooling system; cooling tower heat rejection alone becomes sufficient to satisfy the increasing cooling loads in datacomm facilities.

Paul Leonard is an engineering design principal at Kling, Philadelphia, PA. Fred Phillips is a senior engineer with Thermacore, a subsidiary of Modine Manufacturing Company, Lancaster, PA.

As a result, the impact on the facility cooling infrastructure is nothing short of a new generation of systems to support data centers. In lieu of producing colder air below a raised floor via a central cooling plant, developing the best airflow distribution patterns below the raised floor via modeling, and deploying air-cooled data equipment to match raised-floor air distribution capacity, thermal bus technology allows the refrigeration component of the cooling system to be eliminated entirely and uses typical cooling tower heat rejection alone to satisfy datacomm facility cooling loads. Therefore, the first costs and the long-term energy costs are significantly lower than those of conventional designs, which are limited by the primary heat transfer medium (air).

THE THERMAL BUS CONCEPT

Introduction

Just as an electrical bus serves to distribute electrical power from the outside world to individual components, a thermal bus serves to collect waste heat from individual components and remove it to the outside world. An excellent overview of the thermal bus concept was presented at the 2004 ASHRAE Winter Meeting (Wilson et al. 2004). That paper explained that power densities are reaching the level where it will not be possible to drive enough air through a cabinet to provide adequate cooling, and it described the two-phase heat transport devices that make the thermal bus feasible. The present paper provides actual test results to confirm the thermal performance estimates in the Wilson paper and shows how the chassis-level thermal bus is a natural extension of existing cooling methods for server CPUs.

As implemented and analyzed, the thermal bus consists of three distinct portions. The first removes heat from significant point sources and transports that waste heat to an interface at the cabinet level. The cabinet level collects heat from the individual server chassis and transports it to an interface with the facility heat rejection system. The facility heat rejection system would extend liquid circuits to collect heat from the server cabinets and would transport this heat outside the building, where it would ultimately be rejected to ambient air. As implemented, the system would largely eliminate air as a heat transport medium and would lead to reductions in overall temperature difference sufficient to allow rejection to the outside air without the use of refrigeration.

Background

Many years ago, electronic components were adequately cooled by natural circulation of air over the component. As components became more powerful, heat sinks had to be added to the components to increase heat rejection area, and fans had to be added to the chassis to provide forced convection cooling. In recent years the power density has exceeded the capacity of a simple heat sink on the component cooled by an external fan. Complex heat sinks with integral fans have become common, as have heat pipe towers that greatly increase the fin area available to an individual CPU. Extreme applications are now going to pumped liquid cooling, which can handle CPU power levels of 200 W (682 Btu/h) at heat fluxes in excess of 250 W/cm² (5,503 Btu/h·in.²).

Figure 1 Heat pipe tower.
Figure 2 Pumped liquid cooling system in server.

An example of a heat pipe tower for server cooling is shown in Figure 1. This tower interfaces directly with a CPU chip in a server chassis and actually fastens the chip to its socket.
The heat pipe allows a much larger fin stack and much higher heat transfer than could be achieved with a heat sink attached directly to the chip. The tower has the fins located above the CPU, but variations of this concept transport heat to locations remote from the chip, where the fin stacks can be fitted and ventilated. Millions of these units have been placed in service. Figure 2 shows a pumped liquid system that can extract very high heat fluxes. Heat is extracted from the CPU chip and delivered to a liquid-to-air heat exchanger at a more convenient location. This pumped system is just beginning commercial operation for top-end applications where chips are being pushed to their operating limits.
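A quick unit check of the pumped liquid cooling figures quoted above is sketched below (Python). The conversions are standard; the implied contact area in the last line is an inference for illustration, not a value from the paper.

```python
# Unit checks for the CPU cooling figures quoted above (200 W at heat fluxes above 250 W/cm2).
W_TO_BTU_PER_H = 3.412
CM2_PER_IN2 = 6.4516

cpu_power_w = 200
heat_flux_w_per_cm2 = 250

print(f"{cpu_power_w} W = {cpu_power_w * W_TO_BTU_PER_H:.0f} Btu/h")            # ~682 Btu/h
print(f"{heat_flux_w_per_cm2} W/cm2 = "
      f"{heat_flux_w_per_cm2 * CM2_PER_IN2 * W_TO_BTU_PER_H:.0f} Btu/h per in2")  # ~5,500 Btu/h·in.2
# Inference only: the contact area implied by 200 W at 250 W/cm2 or more.
print(f"implied contact area: {cpu_power_w / heat_flux_w_per_cm2:.2f} cm2 or less")
```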

Figure 3 Server level/cabinet level integration.
Figure 4 Air-cooled thermal bus demo unit.

Heat Transport at the Server Chassis Level

A large percentage of existing servers already employ heritage heat transfer devices or systems. These interfaces are well established and proven. Implementing the thermal bus concept at the server chassis level is simply a matter of extending these devices or systems outside the box. The simplicity of making this change is especially apparent when referring to the pumped liquid system shown in Figure 2: implementing a thermal bus would simply make the liquid lines longer and replace the liquid-to-air heat exchanger with a suitable plate heat exchanger that could mate with a cold plate in the cabinet. No changes to the chassis circuitry, physical arrangement, or CPU thermal/mechanical interfaces would be necessary. The incremental cost to extend this system out of the box is marginal.

Heat Transport at the Server Cabinet Level

The thermal bus at the cabinet level consists of loop thermosyphons (LTS) built into the cabinet. The evaporators of the cabinet-level LTSs are vertically staggered to mate with the condensers of the server-level devices. The condensers of the cabinet-level LTSs are grouped at the top of the cabinet, where they are integrated with the facility heat rejection circuit. Figure 3 illustrates the general arrangement; the LTS condensers and the connection to the facility heat rejection system are not shown.

DEMONSTRATION THERMAL BUS TEST RESULTS

Two configurations of thermal buses have been demonstrated. This is, of course, in addition to the tens of thousands of tests that have been done over the years on the components making up the thermal bus.

Demonstration of Thermal Bus Rejecting to Air

Figure 4 shows the demonstration unit that will be exhibited. This chassis uses the vertical card configuration that is common for telecomm applications. Each card pack was rated for 240 W (818 Btu/h) as tested. The idea behind this design was to allow adequate cooling of the amplifiers on the cards by transporting the heat to a location above the chassis, where sufficient space existed for adequate fans and air-side heat exchangers. In a server application, the fin stacks would be replaced by a more compact vapor-to-liquid heat exchanger cooled by a facility heat rejection circuit.

Figure 5 shows the testing of the actual components connected into a system configuration. The numbers on the figure designate individual thermocouples (t/c's). T/c's #1-5 and #14-18 measure the load temperature, which would be the surface of the CPU or other concentrated heat source. The measured delta-T from the hottest t/c to the inlet air was 30.5°C (54.9°F). More than 12°C (21.6°F) of this consisted of the rise in air temperature through the condenser. The delta-T measured at the inlet of the condenser (t/c #38) was 14.4°C (25.9°F). For the server application discussed in this paper, a liquid-cooled heat exchanger would replace the air-cooled condenser. The liquid-cooled condenser would have a delta-T of less than 5°C (9°F). Adding this 5°C (9°F) to the 14.4°C (25.9°F) measured at the inlet of the air condenser gives less than 20°C (36°F) for the delta-T of the liquid-cooled system.
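The temperature budget above is simple addition; the short sketch below (Python, with the paper's measured values hard-coded) reproduces the sub-20°C estimate. The 5°C liquid-condenser figure is the paper's stated upper bound, not a measurement.

```python
# Sketch of the delta-T budget from the air-to-air demonstration test (values from the paper).
dt_load_to_inlet_air_c = 30.5        # hottest load thermocouple to inlet air (measured)
dt_air_rise_condenser_c = 12.0       # air temperature rise through the condenser (measured, ">12 C")
dt_load_to_condenser_inlet_c = 14.4  # load to condenser inlet, t/c #38 (measured)
dt_liquid_condenser_max_c = 5.0      # stated upper bound for a liquid-cooled condenser

dt_liquid_system_c = dt_load_to_condenser_inlet_c + dt_liquid_condenser_max_c
print(f"Estimated load-to-facility-coolant delta-T: {dt_liquid_system_c:.1f} C")  # 19.4 C, under 20 C
```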

Figure 5 Thermal bus to air test setup.

Demonstration of a 15 kW Liquid-Cooled Thermal Bus Cabinet

A prototype of a cabinet incorporating a thermal bus was built and tested for a major telecomm company. Since the design is proprietary to the telecomm company, no pictures or illustrations are included with this paper. This unit demonstrated a 20°C (36°F) delta-T from the base of the amplifiers being cooled to the inlet temperature of the coolant while operating at a cabinet power dissipation of 15 kW. Each loop thermosyphon was carrying more than 1,500 W. The heat flux at the interface to the amplifier was 50 W/cm² (1,100 Btu/h·in.²). This performance was accomplished with methanol working fluid, which is a much less capable fluid than water. The thermal bus was capable of transporting up to 25 kW as tested.

Conclusion from Test Results

Both prototype systems demonstrated a delta-T from the CPU to the facility-level coolant of 20°C (36°F) or less. The specification for the maximum case temperature of a Pentium 4 CPU is 70°C (158°F). For the prototype systems tested, the facility heat rejection circuit supply water temperature could therefore be as high as 50°C (122°F) and still meet the temperature specification for the CPU. This coolant temperature can be obtained without refrigeration. The impact on the design of datacomm facilities is described and analyzed below.

INTERFACE WITH FACILITY INFRASTRUCTURE

As a basis for the analysis, this section sets a typical equipment density for datacomm facilities as the basis for the infrastructure support system comparison. In response to the ever-increasing heat loads that must be supported by air cooling, ASHRAE (in cooperation with Technical Committee 9.9, Mission Critical Facilities) has developed the publication Thermal Guidelines for Data Processing Environments. This guideline organizes facility planning issues around air-cooled datacomm environments. The publication recognizes underfloor as well as overhead cooling solutions, and it has formalized the server cabinet layout to maximize the cooling capacity available from raised-floor air distribution. TC 9.9 has developed the terminology of aisle pitch as the distance from centerline of cold aisle to centerline of cold aisle. An eight-pitch layout signifies eight raised-floor tiles (24 in. x 24 in., 0.61 m x 0.61 m) from centerline of cold aisle to centerline of cold aisle. This philosophy typically allows the installation of 48 in. (1.22 m) deep server cabinets. Based on this metric, an expected server cabinet density over the rackable area of the raised floor is approximately 22 ft² (2.04 m²) per server rack.

For the base case of air-cooled servers, taking into account access aisles, downflow air-conditioning equipment, and power distribution equipment, this translates into an overall density of 30 ft² per server cabinet, or a total of 1,333 server cabinets over a 40,000 ft² (3,716 m²) facility. For the purposes of this analysis, at an average load of 75 W/ft² (256 Btu/h·ft²), the typical heat load per server cabinet is 2,250 W (7,677 Btu/h). Assuming an operating load of 220 W (750 Btu/h) for each server, this allows an average of 10 servers per cabinet; the arithmetic is sketched below. The heat load per square foot above is an average for plant equipment sizing; there are typically areas of the datacomm facility that are denser in heat load and some areas that are more lightly loaded. The server heat gain is assumed to be an operating heat rejection rather than a nameplate power supply rating.
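As a quick check on the density and temperature figures above, the following sketch works through the cabinet count, the per-cabinet heat load, the servers-per-cabinet average, and the maximum facility supply water temperature implied by the 20°C test results. All inputs are the paper's stated values; the conversions are plain arithmetic.

```python
# Worked check of the equipment-density and temperature figures quoted above.
facility_area_ft2 = 40_000        # raised-floor facility area
area_per_cabinet_ft2 = 30         # overall density including aisles, CRAC units, and PDUs
avg_load_w_per_ft2 = 75           # average plant-sizing load
server_load_w = 220               # assumed operating load per server
cpu_max_case_c = 70               # Pentium 4 maximum case temperature specification
tested_delta_t_c = 20             # CPU-to-facility-coolant delta-T from the prototypes

cabinets = facility_area_ft2 / area_per_cabinet_ft2              # ~1,333 cabinets
load_per_cabinet_w = avg_load_w_per_ft2 * area_per_cabinet_ft2   # 2,250 W per cabinet
servers_per_cabinet = load_per_cabinet_w / server_load_w         # ~10 servers per cabinet
max_supply_water_c = cpu_max_case_c - tested_delta_t_c           # 50 C facility supply water

print(f"{cabinets:.0f} cabinets, {load_per_cabinet_w} W/cabinet, "
      f"{servers_per_cabinet:.1f} servers/cabinet, "
      f"max facility supply water {max_supply_water_c} C")
```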
The above load metric over a nominal 40,000 ft² (3,716 m²) data center produces an operating peak cooling load of 1,000 tons (3,513 kW), including the ancillary cooling loads associated with the data center (lighting, fan heat, local power distribution unit [PDU] transformation heat losses, and uninterruptible power supply [UPS] transformation heat losses). For this magnitude of cooling load, a central chilled water system consisting of multiple lineups of central plant equipment is proposed as the base system to achieve the desired level of redundancy. As the basis of this analysis, it is assumed that all equipment for the base and alternative systems is required at an N+1 (Need+1) reliability level. Also, for first-cost estimating of the base and alternative systems, it is assumed that dual-path distribution of cooling and heat rejection services is maintained to serve the load. The services have been valued as segmented (valved) so that a section of the piping system can be maintained while service to the data center is preserved. Therefore, not only is equipment redundancy maintained, but two distribution paths are also maintained to serve the load.
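The 1,000-ton plant figure follows from the floor area, the 75 W/ft² average load, and an allowance for ancillary loads. The sketch below shows the conversion; the ancillary adder of roughly 17% is inferred here to reconcile the IT load with the stated plant total and is not a figure given in the paper.

```python
# Conversion of the stated load density to a plant cooling load.
KW_PER_TON = 3.517                 # 1 ton of refrigeration = 12,000 Btu/h = about 3.517 kW

area_ft2 = 40_000
it_density_w_per_ft2 = 75
ancillary_fraction = 0.17          # assumed adder: lighting, fan heat, PDU and UPS losses

it_load_kw = area_ft2 * it_density_w_per_ft2 / 1000        # 3,000 kW of IT load
plant_load_kw = it_load_kw * (1 + ancillary_fraction)       # about 3,510 kW with ancillaries
plant_load_tons = plant_load_kw / KW_PER_TON                # about 1,000 tons

print(f"IT load: {it_load_kw:.0f} kW; plant load: {plant_load_kw:.0f} kW "
      f"({plant_load_tons:.0f} tons)")
```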

Table 1. Conventional System Equipment Listing (Base System)

Operating Quantity | Equipment Description | Efficiency / Capacity
2 | 500 ton chillers with VSD compressors; 12°F (6.67°C) chilled water temperature difference; 10°F (5.55°C) condenser water temperature difference | Efficiency rated in kW/ton at 100% load with 85°F (29.4°C) entering condenser water temperature and at 100% load with 65°F (18.3°C) entering condenser water temperature
2 | Primary chilled water pumps | 1,000 gpm (63 L/s), 50 hp (37.3 kW)
2 | Condenser water pumps | 1,500 gpm (95 L/s), 40 hp (29.8 kW)
2 | Open cooling towers with two-speed fan motors, 10°F (5.55°C) water temperature difference | Nominal 625 tons each; 95°F (35°C) entering water, 85°F (29.4°C) leaving water, 78°F (25.5°C) outdoor wet bulb; 25 hp (18.65 kW) fan
- | 10 in. (250 mm) chilled water looped header serving chilled water generation, all welded construction | -
- | 10 in. (250 mm) chilled water loop serving data center air-handling units, all welded construction | -
40 | Nominal 30 ton (105.4 kW) (sensible) raised-floor-mounted chilled water computer room air-conditioning units | 10 hp (7.46 kW) supply fan
- | Electrical substation to support mechanical equipment | -
- | Standby power generation to support mechanical equipment | -

Refer to Figure 6 for the system configuration.

SYSTEM COMPARISON: CONVENTIONAL AIR-COOLED INFRASTRUCTURE AND THERMAL BUS INFRASTRUCTURE

This portion of the paper presents a comparison of the facility infrastructure required to support conventional air-cooled servers and the infrastructure required to support a thermal bus heat rejection system. A new electric centrifugal central chiller plant supporting a 40,000 ft² (3,716 m²) datacomm facility with conventional raised-floor air distribution and air-cooled servers is compared against a thermal bus heat rejection system with integrated liquid cooling, which requires only a typical cooling tower application as its heat rejection source. A new, fully optimized chiller installation is analyzed to show that even against a solution built with today's top-end technology, the operating cost savings of a thermal bus heat rejection system are dramatic. Thermal bus heat rejection architecture allows a fully sealed server cabinet configuration; therefore, not only does the majority of the refrigeration supporting the cooling infrastructure go away, but the current air distribution issues associated with densifying loads in data centers are all but eliminated.

Figure 6 Typical datacom facility cooling plant model.

CONVENTIONAL INFRASTRUCTURE TO SUPPORT AIR-COOLED SYSTEMS

Table 2. Alternative System Equipment Listing

Operating Quantity | Equipment Description | Efficiency / Capacity
2 | Open cooling towers with two-speed fan motors, 10°F (5.55°C) water temperature difference | Nominal 500 tons each; 95°F (35°C) entering water, 85°F (29.4°C) leaving water, 78°F (25.5°C) outdoor wet bulb; 25 hp (18.65 kW) fan
2 | Cooling tower water pumps | 1,200 gpm (75 L/s), 40 hp (29.8 kW)
2 | Plate and frame heat exchangers | 1,200 gpm; cooling tower side 85°F to 95°F, internal heat rejection loop side 98°F to 88°F
2 | Internal heat rejection loop pumps | 1,200 gpm (75 L/s), 50 hp (37.3 kW)
- | 10 in. (250 mm) heat rejection loop serving local thermal bus heat exchangers and data center air-handling units (with compressors), all welded construction | Serves thermal bus heat exchangers at 88°F (31.1°C) entering and 98°F (36.6°C) leaving the heat exchangers at full load
3 | Nominal 30 ton (105.4 kW) (sensible) raised-floor-mounted computer room air-conditioning units (with compressors) | 10 hp (7.46 kW) supply fan
- | Electrical substation to support mechanical equipment | -
- | Electrical standby power generation to support mechanical equipment | -

Refer to Figure 7 for the system configuration.

Figure 7 Modified datacom facility cooling plant model for thermal bus heat rejection architecture.

Based on the load densities described above, an optimized conventional cooling system would typically consist of variable-flow, primary-only electric centrifugal chillers (with variable-speed compressor drives) arranged to serve the load in 500 ton (1,757 kW) segments, with a spare chiller lineup (Table 1, Figure 6). Chillers with variable-speed compressor drives provide the most energy-efficient operation as loads ramp up in the data center; data center loads are highly constant but may ramp up over months rather than varying day to day. Chiller operation would be paired with a control program that drives down cooling tower water temperatures through the year for optimal efficiency. The ideal operating point (minimum energy usage) for the chillers is minimum condenser water inlet temperature (say 65°F, 18.3°C) with 100% loading on the compressor, which yields the chiller's best kW/ton performance. With a fully loaded chiller operating on a design day, efficiency is poorer because of the increased condenser water temperature. Operating costs are presented for operating the chillers throughout the year. A reduced operating cost is also presented for the base system, assuming the chiller plant is installed in an area of the country that can support a water-side economizer system.

First Costs. The first cost estimate of this system installed in the northeastern United States, based on a single-story facility and an adjacent chiller building, is $7,126,000. For a system with a water-side economizer, the first cost estimate is $7,246,000.

Operating Costs. The operating cost estimate of this system, using Newark, New Jersey, weather data and utility rates local to that area, is $1,690,000 per year, assuming the chillers operate throughout the year.

For a system with a water-side economizer, the operating costs are $1,412,300 per year.

Maintenance Costs. Typical central plant maintenance costs for a plant of this size have been assumed to be $26,250 per year. This is based on $25 per ton of installed refrigeration equipment, with a 70% factor applied because all equipment is assumed to be in one location. Refer to Appendix A for a first cost breakdown and a monthly operating cost summary of both the base and alternative systems.

ALTERNATIVE INFRASTRUCTURE TO SUPPORT LIQUID-COOLED THERMAL BUS SYSTEMS

Based on the data above, a heat rejection system (open cooling towers, heat exchangers, and pumps producing leaving cooling tower water temperatures of 85°F) can support the full data center heat rejection requirement via the thermal bus system and still maintain processor temperatures well within specifications (Table 2, Figure 7). A nominal number of computer room air-conditioning units remain to cool lighting and environmental loads.

First Costs. The first cost estimate of this system installed in the northeastern United States, based on a single-story facility, is $5,404,400. Included in this cost is the thermal bus heat rejection system at the cabinet level for 1,333 server cabinets at a cost of $1,200 per server cabinet. Each server cabinet has a connection to the heat rejection loop outside the cabinet. The heat rejection loop inside the building has not been valued as insulated, whereas in the base case the chilled water loop is insulated. Credited against this cost, relative to the base case, is the reduced cost of the electrical infrastructure and standby power generation supporting refrigeration systems. A smaller support building has also been assumed relative to the base case, as the chillers are eliminated.

Operating Costs. The estimated operating cost of this system, using Newark, New Jersey, weather data and utility rates local to that area, is $510,900 per year.

Maintenance Costs. Typical central plant maintenance costs for a plant of this size have been assumed to be $13,125 per year (proportioned from the previous estimate, as the chillers have been eliminated). Refer to Appendix A for a first cost breakdown and a monthly operating cost summary of both the base and alternative systems.

DISCUSSION OF SYSTEM FIRST COST AND OPERATING COST COMPARISON

For the cooling load requirement throughout the year, it has been assumed that the plant load of 75 W/ft² (256 Btu/h·ft²) operates continuously. The energy modeling method used for the base system was a spreadsheet method that applies 8,760 hours of outdoor weather data and, via modeling of the cooling tower output temperature (based on manufacturer's performance data), predicts the entering water temperature to a fully loaded chiller. The chiller manufacturer's full-load efficiency data were used, based on a variable-speed drive operating the compressor to match power requirements to lower tower water temperatures. The most efficient chiller system was proposed in order to present the most aggressive cost comparison between the base system and the alternative. Both systems were considered at an average full load continuously in order to concentrate on the plant operating cost differences. Chilled water and condenser water pumps were operated at continuous full flow for the base case. In the base case, the water-side economizer system operates below a temperature of 40°F (4.5°C).
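The paragraph above describes the spreadsheet method used for the base system. The following is a minimal sketch of that hourly approach: the cooling tower approach value and the chiller kW/ton curve are illustrative assumptions, since the manufacturer performance data used in the paper are not reproduced here, and the flat "weather" input stands in for the Newark, NJ hourly data.

```python
# Sketch of the 8,760-hour, spreadsheet-style model described above for the base (chiller) plant.
# The tower approach and the chiller efficiency curve below are placeholders, not the paper's data.

def tower_leaving_temp_f(wet_bulb_f, approach_f=7.0, min_leaving_f=65.0):
    """Cooling tower leaving-water temperature: wet bulb plus an assumed approach, floored."""
    return max(wet_bulb_f + approach_f, min_leaving_f)

def chiller_kw_per_ton(entering_condenser_f):
    """Assumed VSD chiller curve: better efficiency at lower entering condenser water temperature."""
    frac = min(max((entering_condenser_f - 65.0) / 20.0, 0.0), 1.0)  # 65 F to 85 F range
    return 0.35 + frac * (0.58 - 0.35)                                # placeholder kW/ton values

def annual_chiller_kwh(hourly_wet_bulbs_f, plant_load_tons=1000.0):
    """Sum chiller energy over one year of hourly wet-bulb data at constant full load."""
    kwh = 0.0
    for wb in hourly_wet_bulbs_f:
        ecwt = tower_leaving_temp_f(wb)
        kwh += plant_load_tons * chiller_kw_per_ton(ecwt)  # kW over one hour
    return kwh

# Example with a flat placeholder year rather than real Newark, NJ weather data:
print(f"{annual_chiller_kwh([60.0] * 8760):,.0f} kWh/yr (placeholder weather)")
```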
The energy modeling method used to calculate the alternate system's operating cost was again a spreadsheet method that applies 8,760 hours of outdoor weather data and, via modeling of the cooling tower output temperature (based on manufacturer's performance data), predicts the entering water temperature to the heat exchanger. The flow from the cooling tower to the heat exchanger was modeled as variable flow below 65°F (18.3°C). Heat exchanger loop pumps were operated at continuous full flow to the thermal bus heat exchangers.

The liquid-cooled heat exchanger serving the thermal bus system can be located in various places in the data center. Because of the physics of the loop thermosyphon (vapor rises and condensate returns by gravity), the liquid-cooled heat exchanger is located above the servers in the cabinet. The cabinet then connects to the facility like any other computer room air-conditioning unit in a conventional data center, that is, fed from a central water-based heat rejection system. Alternatively, the heat exchanger serving the thermal bus system can be located above the data center in an entirely dedicated mechanical space, which has the potential of eliminating the water piping from the data center. Although the example in this paper is water cooled, the thermal bus heat rejection also has the potential of being air cooled; the thermal bus heat exchanger could be a fin stack placed in an adequate airstream.

Not included in the cost analysis is a potential credit for deleting the raised floor in a typical data center. With the thermal bus system and a facility culture of managing cabling overhead in cable trays, as is done in some facilities, the raised floor can be deleted entirely, or at least reduced if cabling remains below the raised floor and the floor is no longer needed for full air-conditioning purposes. Also not included in the cost analysis is any reduction in data center area from the reduced number of raised-floor-mounted air-conditioning units. In the base case, 2,400 ft² (223 m²) of raised floor space is required for the footprint of and access to the downflow computer room air-conditioning units. In the alternative thermal bus system, a reduced quantity of computer room air-conditioning units has been allowed to condition ancillary cooling loads; approximately 10% of that area is required for these units.

The thermal bus system not only provides a two-thirds reduction in annual operating costs but also provides a 20 percent reduction in first cost.
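As a quick arithmetic check on the closing claim above, the first cost and operating cost estimates presented earlier can be compared directly. This is a minimal sketch; the dictionary values are simply the estimates quoted in the text.

```python
# Quick check of the cost comparison quoted above, using the paper's estimates.
first_cost = {"base": 7_126_000, "base with economizer": 7_246_000, "thermal bus": 5_404_400}
operating_cost = {"base": 1_690_000, "base with economizer": 1_412_300, "thermal bus": 510_900}

for name in ("base", "base with economizer"):
    op_saving = 1 - operating_cost["thermal bus"] / operating_cost[name]
    fc_saving = 1 - first_cost["thermal bus"] / first_cost[name]
    print(f"vs. {name}: operating cost {op_saving:.0%} lower, first cost {fc_saving:.0%} lower")
# Roughly 70% and 64% lower operating cost and about 24% to 25% lower first cost, consistent
# with the two-thirds operating cost reduction and the first cost reduction cited above.
```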

Figure 8 Comparison of typical datacom facility energy usage.

Table 3. Total Cost Comparison

Facility Support System Option | First Cost Summary | Yearly Costs (Operating + Maintenance)
Conventional cooling system, air-cooled servers | $7,126,000 | $1,716,300
Conventional cooling system with water-side economizer, air-cooled servers | $7,246,000 | $1,438,500
Alternative heat rejection system, thermal bus-cooled servers | $5,404,400 | $524,050

Refer to Figure 8 for a comparison of typical datacomm facility energy usage. As can be seen in the comparison, a conventional system optimized as described above still dedicates 27% of the facility's energy to the cooling system. With the alternative system, only 9% of the facility's energy is dedicated to the cooling system. This also translates into less electrical infrastructure required in terms of substation capacity and standby power generation capacity.

CONCLUSION

Application of thermal bus technology eliminates current issues surrounding the limitations of raised-floor air distribution, allows a denser deployment of data equipment in nontraditional data center spaces, and eliminates the refrigeration component of conventional data center cooling systems, resulting in reduced first cost and operating cost. The impact on data equipment manufacturers for coming generations of equipment is minimal, as their current thermal transport applications are literally extended out of the box to the thermal bus system. The thermal bus system represents an incremental cost to each server cabinet but an overall lower first cost for the heat rejection system. The thermal bus heat rejection system provides a liquid-cooled heat exchanger outside the server cabinet; this connection to a heat rejection loop is no different from any other computer room air-conditioning unit connection that currently exists in data centers. The thermal bus heat exchanger can also be located outside of the data center to eliminate water piping from the data center entirely.

The energy saving represented supports sustainable design concepts. The energy savings alone (64% to 69%, depending upon the application of a water-side economizer system) represent ten LEED points in the Energy and Atmosphere category. This paper advances the topic that was presented at the 2004 ASHRAE Winter Meeting (Wilson et al. 2004) and formalizes the application of thermal bus heat rejection into a data center cooling strategy for a new facility. Other papers will present strategies to integrate thermal bus technology into existing data centers transitioning from chilled water cooling systems to thermal bus heat rejection systems.

REFERENCES

Wilson, M., J.P. Wattelet, and K.W. Wert. 2004. A thermal bus system for cooling electronic components in high-density cabinets. ASHRAE Transactions 110(1).

Zuo, Z.J., L.R. Hoover, and A.I. Phillips. 2002. Advanced thermal architecture for cooling of high power electronics. IEEE Transactions on Components and Packaging Technologies 25(4).

Appendix A
Applied Thermal Bus Technology First Cost Summary
Base Case - Chiller Plant

Description | Qty | U/M | Unit $ | Total
Chiller Bldg Foundation | 2400 | ft² | $15 | $36,000
Chiller Bldg Superstructure | 2400 | ft² | $15 | $36,000
Chiller Bldg Roofing | 2400 | ft² | $10 | $24,000
Chiller Bldg Exterior Wall | 3276 | ft² | $28 | $90,090
Chiller Bldg Interior Partitions | 750 | ft² | $9 | $6,375
Chiller Bldg Painting | 1 | lpsm | $1,000 | $1,000
Chiller Bldg Roof Drains | 4 | ea | $3,500 | $14,000
Chiller Bldg Floor Drains | 7 | ea | $5,000 | $35,000
Chiller Bldg Fire Protection | 2400 | ft² | $3 | $7,200
625 Ton Cooling Tower | 3 | ea | $120,000 | $360,000
500 Ton Chiller w/ VSD | 3 | ea | $550,000 | $1,650,000
Condenser Water Pumps | 3 | ea | $60,000 | $180,000
Chilled Water Pumps | 3 | ea | $50,000 | $150,000
CRAC Units, 30 ton CHW | 40 | ea | $28,940 | $1,157,600
BAS | 1 | lpsm | $360,000 | $360,000
Condenser Water Piping | 582 | lnft | $611 | $355,602
Chilled Water Piping | 2000 | lnft | $455 | $910,000
Electrical Power Wiring | 2400 | ft² | $20 | $48,000
Equipment Electrical Connections | 1 | lpsm | $63,000 | $63,000
Additional Electrical Substation Capacity | 1 | lpsm | $600,000 | $600,000
Electrical Standby Power Generation Capacity | 1 | lpsm | $400,000 | $400,000
Building and Mechanical Subtotal | | | | $6,483,867
Commissioning Mechanical Equipment | | | | $104,928
Subtotal | | | | $6,588,795
Contingency | | | | $537,287
Total | | | | $7,126,082

Maintenance Costs* | 1500 | ton | $17.50 | $26,250
*Costs include reduction due to equipment being in same mechanical room.

Appendix A
Applied Thermal Bus Technology First Cost Summary
Base Case - Chiller Plant with Waterside Economizer

Description | Qty | U/M | Unit $ | Total
Chiller Bldg Foundation | 2400 | ft² | $15 | $36,000
Chiller Bldg Superstructure | 2400 | ft² | $15 | $36,000
Chiller Bldg Roofing | 2400 | ft² | $10 | $24,000
Chiller Bldg Exterior Wall | 3276 | ft² | $28 | $90,090
Chiller Bldg Interior Partitions | 750 | ft² | $9 | $6,375
Chiller Bldg Painting | 1 | lpsm | $1,000 | $1,000
Chiller Bldg Roof Drains | 4 | ea | $3,500 | $14,000
Chiller Bldg Floor Drains | 7 | ea | $5,000 | $35,000
Chiller Bldg Fire Protection | 2400 | ft² | $3 | $7,200
625 Ton Cooling Tower | 3 | ea | $120,000 | $360,000
500 Ton Chiller w/ VSD | 3 | ea | $550,000 | $1,650,000
Condenser Water Pumps | 3 | ea | $60,000 | $180,000
Chilled Water Pumps | 3 | ea | $50,000 | $150,000
CRAC Units, 30 ton CHW | 40 | ea | $28,940 | $1,157,600
BAS | 1 | lpsm | $360,000 | $360,000
Condenser Water Piping | 582 | lnft | $611 | $355,602
Chilled Water Piping | 2000 | lnft | $455 | $910,000
Waterside Economizer Heat Exchangers | 2 | ea | $50,000 | $100,000
Waterside Economizer Piping | 1 | lpsm | $20,000 | $20,000
Electrical Power Wiring | 2400 | ft² | $20 | $48,000
Equipment Electrical Connections | 1 | lpsm | $63,000 | $63,000
Additional Electrical Substation Capacity | 1 | lpsm | $600,000 | $600,000
Electrical Standby Power Generation Capacity | 1 | lpsm | $400,000 | $400,000
Building and Mechanical Subtotal | | | | $6,603,867
Commissioning Mechanical Equipment | | | | $104,928
Subtotal | | | | $6,708,795
Contingency | | | | $537,287
Total | | | | $7,246,082

Maintenance Costs* | 1500 | ton | $17.50 | $26,250
*Costs include reduction due to equipment being in same mechanical room.

Appendix A
Applied Thermal Bus Technology First Cost Summary
Alternate Case - Liquid-Cooled Thermal Bus System

Description | Qty | U/M | Unit $ | Total
Mechanical Bldg Foundation | 1600 | ft² | $15 | $24,000
Mechanical Bldg Superstructure | 1600 | ft² | $15 | $24,000
Mechanical Bldg Roofing | 1600 | ft² | $10 | $16,000
Mechanical Bldg Exterior Wall | 2184 | ft² | $28 | $60,060
Mechanical Bldg Interior Partitions | 500 | ft² | $9 | $4,250
Mechanical Bldg Painting | 1 | lpsm | $800 | $800
Mechanical Bldg Roof Drains | 3 | ea | $3,500 | $10,500
Mechanical Bldg Floor Drains | 9 | ea | $5,000 | $45,000
Mechanical Bldg Fire Protection | 1600 | ft² | $3 | $4,800
500 Ton Cooling Tower | 3 | ea | $120,000 | $360,000
Heat Exchangers | 3 | ea | $50,000 | $150,000
Condenser Water Pumps | 6 | ea | $48,000 | $288,000
CRAC Units, 30 ton Water Cooled | 3 | ea | $45,140 | $135,420
BAS | 1 | lpsm | $200,000 | $200,000
Condenser Water Piping | 10,073 | lnft | $203 | $2,048,949
Electrical Power Wiring | 1600 | ft² | $20 | $32,000
Equipment Electrical Connections | 1 | lpsm | $29,850 | $29,850
Cabinet Thermal Bus Heat Rejection System | 1333 | ea | $1,200 | $1,599,600
Building and Mechanical Subtotal | | | | $5,033,229
Commissioning Mechanical Equipment | | | | $34,003
Subtotal | | | | $5,067,232
Contingency | | | | $337,180
Total | | | | $5,404,412

Maintenance Costs* | 1500 | ton | $8.75 | $13,125
*Costs include reduction due to equipment being in same mechanical room.

Appendix A (continued): monthly operating cost summaries for the base and alternative systems.


Case Study: Innovative Energy Efficiency Approaches in NOAA s Environmental Security Computing Center in Fairmont, West Virginia Case Study: Innovative Energy Efficiency Approaches in NOAA s Environmental Security Computing Center in Fairmont, West Virginia Prepared for the U.S. Department of Energy s Federal Energy Management Program

More information

Data Centers WHAT S ONTHEHORIZON FOR NR HVAC IN TITLE 24 2013? SLIDE 1

Data Centers WHAT S ONTHEHORIZON FOR NR HVAC IN TITLE 24 2013? SLIDE 1 WHAT S ONTHEHORIZON FOR NR HVAC IN TITLE 24 2013? SLIDE 1 Data Center CASE Scope Existing Title 24 2008 Scope Current scope ( 100 T24-2008) exempts process space from many of the requirements, however

More information

Specialty Environment Design Mission Critical Facilities

Specialty Environment Design Mission Critical Facilities Brian M. Medina PE Associate Brett M. Griffin PE, LEED AP Vice President Environmental Systems Design, Inc. Mission Critical Facilities Specialty Environment Design Mission Critical Facilities March 25,

More information

Energy Efficiency Opportunities in Federal High Performance Computing Data Centers

Energy Efficiency Opportunities in Federal High Performance Computing Data Centers Energy Efficiency Opportunities in Federal High Performance Computing Data Centers Prepared for the U.S. Department of Energy Federal Energy Management Program By Lawrence Berkeley National Laboratory

More information

DataCenter 2020: first results for energy-optimization at existing data centers

DataCenter 2020: first results for energy-optimization at existing data centers DataCenter : first results for energy-optimization at existing data centers July Powered by WHITE PAPER: DataCenter DataCenter : first results for energy-optimization at existing data centers Introduction

More information

HOW TO CONDUCT ENERGY SAVINGS ANALYSIS IN A FACILITY VALUE ENGINEERING STUDY

HOW TO CONDUCT ENERGY SAVINGS ANALYSIS IN A FACILITY VALUE ENGINEERING STUDY HOW TO CONDUCT ENERGY SAVINGS ANALYSIS IN A FACILITY VALUE ENGINEERING STUDY Benson Kwong, CVS, PE, CEM, LEED AP, CCE envergie consulting, LLC Biography Benson Kwong is an independent consultant providing

More information

STULZ Water-Side Economizer Solutions

STULZ Water-Side Economizer Solutions STULZ Water-Side Economizer Solutions with STULZ Dynamic Economizer Cooling Optimized Cap-Ex and Minimized Op-Ex STULZ Data Center Design Guide Authors: Jason Derrick PE, David Joy Date: June 11, 2014

More information

How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions

How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Intel Intelligent Power Management Intel How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Power and cooling savings through the use of Intel

More information

Choosing Close-Coupled IT Cooling Solutions

Choosing Close-Coupled IT Cooling Solutions W H I T E P A P E R Choosing Close-Coupled IT Cooling Solutions Smart Strategies for Small to Mid-Size Data Centers Executive Summary As high-density IT equipment becomes the new normal, the amount of

More information

Benefits of Cold Aisle Containment During Cooling Failure

Benefits of Cold Aisle Containment During Cooling Failure Benefits of Cold Aisle Containment During Cooling Failure Introduction Data centers are mission-critical facilities that require constant operation because they are at the core of the customer-business

More information

Case Study: Opportunities to Improve Energy Efficiency in Three Federal Data Centers

Case Study: Opportunities to Improve Energy Efficiency in Three Federal Data Centers Case Study: Opportunities to Improve Energy Efficiency in Three Federal Data Centers Prepared for the U.S. Department of Energy s Federal Energy Management Program Prepared By Lawrence Berkeley National

More information

Data Center Environments

Data Center Environments Data Center Environments ASHRAE s Evolving Thermal Guidelines By Robin A. Steinbrecher, Member ASHRAE; and Roger Schmidt, Ph.D., Member ASHRAE Over the last decade, data centers housing large numbers of

More information

Cooling Small Server Rooms Can Be. - Jim Magallanes Computer Room Uptime: www.cruptime.com Uptime Racks: www.uptimeracks.com

Cooling Small Server Rooms Can Be. - Jim Magallanes Computer Room Uptime: www.cruptime.com Uptime Racks: www.uptimeracks.com Cooling Small Server Rooms Can Be Inexpensive, Efficient and Easy - Jim Magallanes Computer Room Uptime: www.cruptime.com Uptime Racks: www.uptimeracks.com Server Rooms Description & Heat Problem Trends

More information

Optimum Climate Control For Datacenter - Case Study. T. Prabu March 17 th 2009

Optimum Climate Control For Datacenter - Case Study. T. Prabu March 17 th 2009 Optimum Climate Control For Datacenter - Case Study T. Prabu March 17 th 2009 Agenda 2 About EDEC (Emerson) Facility Data Center Details Design Considerations & Challenges Layout Design CFD Analysis Of

More information

IT@Intel. Thermal Storage System Provides Emergency Data Center Cooling

IT@Intel. Thermal Storage System Provides Emergency Data Center Cooling White Paper Intel Information Technology Computer Manufacturing Thermal Management Thermal Storage System Provides Emergency Data Center Cooling Intel IT implemented a low-cost thermal storage system that

More information

How To Improve Energy Efficiency Through Raising Inlet Temperatures

How To Improve Energy Efficiency Through Raising Inlet Temperatures Data Center Operating Cost Savings Realized by Air Flow Management and Increased Rack Inlet Temperatures William Seeber Stephen Seeber Mid Atlantic Infrared Services, Inc. 5309 Mohican Road Bethesda, MD

More information

DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no significant differences

DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no significant differences DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no significant differences November 2011 Powered by DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no

More information

White Paper. Data Center Containment Cooling Strategies. Abstract WHITE PAPER EC9001. Geist Updated August 2010

White Paper. Data Center Containment Cooling Strategies. Abstract WHITE PAPER EC9001. Geist Updated August 2010 White Paper Data Center Containment Cooling Strategies WHITE PAPER EC9001 Geist Updated August 2010 Abstract Deployment of high density IT equipment into data center infrastructure is now a common occurrence

More information

PAYGO for Data Center -- Modular Infrastructure

PAYGO for Data Center -- Modular Infrastructure PAYGO for Data Center -- Modular Infrastructure Introduction The infrastructure costs for Data Centers are increasing at a significant pace. Construction costs and their environmental impact are also soaring.

More information

White Paper #10. Energy Efficiency in Computer Data Centers

White Paper #10. Energy Efficiency in Computer Data Centers s.doty 06-2015 White Paper #10 Energy Efficiency in Computer Data Centers Computer Data Centers use a lot of electricity in a small space, commonly ten times or more energy per SF compared to a regular

More information

BICSInews. plus. july/august 2012. + Industrial-grade Infrastructure. + I Say Bonding, You Say Grounding + Data Center Containment.

BICSInews. plus. july/august 2012. + Industrial-grade Infrastructure. + I Say Bonding, You Say Grounding + Data Center Containment. BICSInews m a g a z i n e july/august 2012 Volume 33, Number 4 plus + Industrial-grade Infrastructure and Equipment + I Say Bonding, You Say Grounding + Data Center Containment & design deployment The

More information

Blade Servers and Beyond: Adaptive Cooling for the Next Generation of IT Systems. A White Paper from the Experts in Business-Critical Continuity

Blade Servers and Beyond: Adaptive Cooling for the Next Generation of IT Systems. A White Paper from the Experts in Business-Critical Continuity Blade Servers and Beyond: Adaptive Cooling for the Next Generation of IT Systems A White Paper from the Experts in Business-Critical Continuity Executive Summary In enterprise data centers, designers face

More information

Trends in Data Center Design ASHRAE Leads the Way to Large Energy Savings

Trends in Data Center Design ASHRAE Leads the Way to Large Energy Savings Trends in Data Center Design ASHRAE Leads the Way to Large Energy Savings ASHRAE Conference, Denver Otto Van Geet, PE June 24, 2014 NREL/PR-6A40-58902 NREL is a national laboratory of the U.S. Department

More information

Benefits of Water-Cooled Systems vs. Air-Cooled Systems for Air-Conditioning Applications

Benefits of Water-Cooled Systems vs. Air-Cooled Systems for Air-Conditioning Applications Benefits of Water-Cooled Systems vs. Air-Cooled Systems for Air-Conditioning Applications Kavita A. Vallabhaneni U. S. Government commitment to reduce greenhouse gas emissions can have a significant impact

More information

Energy Efficient Data Centre at Imperial College. M. Okan Kibaroglu IT Production Services Manager Imperial College London.

Energy Efficient Data Centre at Imperial College. M. Okan Kibaroglu IT Production Services Manager Imperial College London. Energy Efficient Data Centre at Imperial College M. Okan Kibaroglu IT Production Services Manager Imperial College London 3 March 2009 Contents Recognising the Wider Issue Role of IT / Actions at Imperial

More information

AIR-SITE GROUP. White Paper. Green Equipment Room Practices

AIR-SITE GROUP. White Paper. Green Equipment Room Practices AIR-SITE GROUP White Paper Green Equipment Room Practices www.air-site.com Common practices to build a green equipment room 1 Introduction Air-Site (www.air-site.com) is a leading international provider

More information

How To Improve Energy Efficiency In A Data Center

How To Improve Energy Efficiency In A Data Center Google s Green Data Centers: Network POP Case Study Table of Contents Introduction... 2 Best practices: Measuring. performance, optimizing air flow,. and turning up the thermostat... 2...Best Practice

More information

Paul Oliver Sales Director. Paul Oliver, Airedale Tom Absalom, JCA

Paul Oliver Sales Director. Paul Oliver, Airedale Tom Absalom, JCA Paul Oliver Sales Director Paul Oliver, Airedale Tom Absalom, JCA Airedale an overview 25,000m 2 UK premier test centre 450 staff Global distribution A Modine company Modine centre of excellence Airedale

More information

HVAC: Cool Thermal Storage

HVAC: Cool Thermal Storage HVAC: Cool Thermal Storage Thermal storage systems offer building owners the potential for substantial operating cost savings by using offpeak electricity to produce chilled water or ice for use in cooling

More information

National Grid Your Partner in Energy Solutions

National Grid Your Partner in Energy Solutions National Grid Your Partner in Energy Solutions National Grid Webinar: Enhancing Reliability, Capacity and Capital Expenditure through Data Center Efficiency April 8, 2014 Presented by: Fran Boucher National

More information

Green Data Centre Design

Green Data Centre Design Green Data Centre Design A Holistic Approach Stantec Consulting Ltd. Aleks Milojkovic, P.Eng., RCDD, LEED AP Tommy Chiu, EIT, RCDD, LEED AP STANDARDS ENERGY EQUIPMENT MATERIALS EXAMPLES CONCLUSION STANDARDS

More information

HEAT RECOVERY FROM CHILLED WATER SYSTEMS. Applications for Heat Reclaim Chillers

HEAT RECOVERY FROM CHILLED WATER SYSTEMS. Applications for Heat Reclaim Chillers HEAT RECOVERY FROM CHILLED WATER SYSTEMS Applications for Heat Reclaim Chillers April 2008 TABLE OF CONTENTS INTRODUCTION... 3 WASTE HEAT SOURCES... 3,4 Capturing Sufficient Heat for Useful Purposes...

More information

Selecting Rack-Mount Power Distribution Units For High-Efficiency Data Centers

Selecting Rack-Mount Power Distribution Units For High-Efficiency Data Centers C P I P O W E R M A N A G E M E N T WHITE PAPER Selecting Rack-Mount Power Distribution Units For High-Efficiency Data Centers By Anderson Hungria Sr. Product Manager of Power, Electronics & Software Published

More information

Energy Impact of Increased Server Inlet Temperature

Energy Impact of Increased Server Inlet Temperature Energy Impact of Increased Server Inlet Temperature By: David Moss, Dell, Inc. John H. Bean, Jr., APC White Paper #138 Executive Summary The quest for efficiency improvement raises questions regarding

More information

Alcatel-Lucent Modular Cooling Solution

Alcatel-Lucent Modular Cooling Solution T E C H N O L O G Y W H I T E P A P E R Alcatel-Lucent Modular Cooling Solution Redundancy test results for pumped, two-phase modular cooling system Current heat exchange methods for cooling data centers

More information

VertiCool Space Saver 2 to 15 tons Water-Cooled and Chilled Water

VertiCool Space Saver 2 to 15 tons Water-Cooled and Chilled Water 2 to 15 tons Water-Cooled and Chilled Water Unique Solutions for Challenging HVAC Projects Your Choice for Limited Space Applications Water-Cooled Chilled Water The VertiCool Space Saver allows installation

More information