Reducing the Annual Cost of a Telecommunications Data Center

Applied Math Modeling White Paper
By Paul Bemis and Liz Marshall, Applied Math Modeling Inc., Concord, NH
March 2011

Introduction

The facilities managers for a large internet service provider have known for a while that one of their data centers is over-cooled. Over-cooling translates into unnecessary energy consumption and expense, so the managers knew that some changes to the data center were needed. Several options were possible, such as shutting down one or more of the cooling units. Many questions arose, however. For example, what would be the consequences of shutting down a CRAC? Would it be possible to shut down two? If so, which two? Could the supply temperatures be increased? To answer these questions, the operators decided to use computational fluid dynamics (CFD), a tool that uses airflow predictions to demonstrate how effectively the cooling air reaches, and removes heat from, the equipment in the room. Using CFD-based modeling techniques for quantifying the efficiency of the data center, different energy-saving strategies can be compared before physical changes to the room are made.

Problem Description

The CFD modeling is done using CoolSim software from Applied Math Modeling. The raised-floor data center is 4720 sq. ft. in size (80 ft x 59 ft) and makes use of a ceiling plenum return. The 2 ft supply plenum, 15 ft room height, and 5 ft ceiling plenum combine to form a space that is 22 ft high. (See Figure 1.)

Figure 1: Isometric view of the room geometry, showing the rack rows (pink and gray), CRACs (blue tops), perforated floor tiles (white) and overhead ceiling grilles (green)

Figure 2: Rack heat loads, which range from a low of 10 W (dark blue) to a high of 7.8 kW (red); while there are only two 10 W racks, there are several 50 W racks, which also appear dark blue

The data center contains ten rows of equipment with either 17 or 21 racks per row. The heat loads in the racks vary from 10 W up to 7.8 kW, as shown in Figure 2. The total IT heat load in the room is 363 kW, with a density of 76.9 W/sq. ft. Five Liebert DS105AU CRACs are positioned on each of two opposing walls, for a total of ten CRACs in the room. These direct expansion (DX) cooling units are controlled in zones, each of which consists of two opposing CRACs (Figure 3).

The data center was originally designed for a 1 MW heat load, but in its current use, the IT load is only about one-third of that value (363 kW). Assuming a 20 F temperature rise across all racks, 57,020 CFM of cooling air is needed for the present heat load. Each CRAC delivers 14,500 CFM, so with all ten CRACs operating, a total of 145,000 CFM is generated, which is almost three times the needed amount. Thus in normal operating mode, two zones are disabled, so that only six of the ten CRACs are in use. The disabled cooling zones, 2 and 4, are shown in Figure 3. For this configuration, the six active CRACs supply 87,000 CFM of cooling air, which is about 50% more than required for the heat load. The total cooling capacity of the six CRACs exceeds the heat load by approximately the same amount.
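The airflow requirement quoted above follows from a standard sensible-heat estimate. The short Python sketch below shows the arithmetic; the 1.08 constant is the common rule-of-thumb value and is an assumption here, since the paper does not state which constant it used, so the result is close to, but not exactly, the 57,020 CFM figure.

```python
# Airflow needed to remove a sensible heat load at a given rack temperature rise.
# Sketch only: assumes the common rule of thumb CFM = q[Btu/hr] / (1.08 * dT[F]).

BTU_PER_HR_PER_KW = 3412.14  # conversion from kW to Btu/hr

def required_cfm(heat_load_kw: float, delta_t_f: float) -> float:
    """Cooling airflow (CFM) needed to carry away a sensible load at a given rise."""
    q_btu_hr = heat_load_kw * BTU_PER_HR_PER_KW
    return q_btu_hr / (1.08 * delta_t_f)

if __name__ == "__main__":
    print(f"Required airflow:      {required_cfm(363, 20):>9,.0f} CFM")  # ~57,300 CFM
    print(f"Installed (10 CRACs):  {10 * 14_500:>9,} CFM")               # 145,000 CFM
    print(f"Normal mode (6 CRACs): {6 * 14_500:>9,} CFM")                # 87,000 CFM
```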

Measurements of supply air temperatures place the range between 50 F and 68 F. Return temperatures are also available for comparison with values predicted by the CFD simulation. Once a CFD model is created and validated, alternative energy optimization scenarios can be investigated, including disabling additional CRACs, hot or cold aisle containment, and modification of the CRAC cooling parameters. In this study, the first and third of these options will be considered.

Preliminary Results for the Baseline Case

The first CFD model created for this study (the baseline case) corresponds to the data center operating in normal mode, with six CRACs operational, as shown in Figure 3. Boundary conditions for the simulation include the heat load and flow rate associated with each rack and the supply temperature and flow rate associated with each CRAC. The measured supply and return temperatures are shown in Table 1, along with the predicted return temperatures from the CFD model. In all but one case, the predicted temperatures are below the measured values. Often, when the CFD model under-predicts the return temperature on every CRAC, it means that either the heat loads are under-represented or the CRAC flow rates are too high. In this data center, it could be one of these factors or a combination of both, but the effect is small, since the error is below 5% in all cases. A validation of the preliminary model, such as this, is an important step if modifications are to be made. A demonstration that the base model accurately captures the physics to within an acceptable margin of error means that it can be used to correctly predict trends if one or more changes are made.

Figure 3: Five cooling zones, each of which consists of a pair of opposing CRACs; under normal operating conditions, Zones 2 and 4 are shut down

Table 1: Measurements of supply and return temperature and predicted return temperature for the baseline case (columns: Zone #, CRAC #, Measured Supply Temperature (F), Measured Return Temperature (F), Predicted Return Temperature (F), Error (%)); the error in the predicted return temperature is under 5% for all CRACs

Contours of rack inlet temperature for the baseline case are shown in Figure 4. The temperatures all fall below the ASHRAE recommended maximum value of 80.6 F. The maximum rack inlet temperature is a good metric to follow when comparing cooling strategies. For an over-cooled data center, however, the minimum rack inlet temperature is also important to follow. According to the ASHRAE guidelines, the rack inlet temperature should not go below 64.4 F, although the allowed minimum value is 59 F. For the baseline case, at least half of the racks have inlet temperatures that are too cold.

Figure 4: Rack inlet temperatures for the baseline case, in which the CRACs in Zones 2 and 4 are inactive

Data Center Metrics

PUE and DCIE

A number of metrics have been defined in recent years that can be used to gauge the efficiency of a data center. Metrics can also be used to test whether changes to the data center bring about reduced (or increased) power demands. One of the most popular metrics is the Power Utilization Effectiveness, or PUE, defined as the ratio of total facility power to total IT power:

    PUE = Total Facility Power / Total IT Power    (1)

The total facility power includes that needed to run the CRACs (chillers and fans), IT equipment, battery backup systems, lighting, and any other heat-producing devices. Thus PUE is always greater than 1, but values that are close to 1 are better than those that are not. A typical value is 1.8, a good value is 1.4, and an excellent value is 1.2.

COP

The largest contributor to the total facility power is the cooling system, comprised of the heat exchangers (chillers, condensers and cooling fluid pumps, for example) and fans. The heat exchanger portion of the CRAC is a heat pump, whose job it is to move heat from one location (inside the room) to another (outside). Heat pumps are rated by their coefficient of performance, or COP. The COP is the ratio of the heat moved by the pump to the work done by the pump to perform this task. The work done by the pump encompasses the heat exchanger work and does not include the CRAC fans. The COP can also be expressed as a power ratio, making use of the rate at which heat is moved (in Watts, say) or work is done (again, in Watts):

    COP = Heat Moved / Work Done    (2)

In more practical terms, the COP is the ratio of the total room heat load to the power needed to run the chillers, condensers and other heat rejection equipment. For data center cooling equipment, COP values range from 2 to 5, with larger numbers corresponding to better heat pumps. Note that an alternative definition of COP could be made for the data center as a whole, rather than just for the heat rejection system. In this alternative definition, the work done would include the power used to run the CRAC fans. For the purposes of this paper, the traditional definition of COP is used.
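As a quick illustration of Eqs. (1) and (2), the sketch below codes the two ratios directly. The numbers passed in at the bottom are invented round values used only to show how the metrics behave; the paper's actual inputs are worked out in a later section.

```python
# Equations (1) and (2) written as code. The example inputs are hypothetical.

def pue(total_facility_power_kw: float, total_it_power_kw: float) -> float:
    """Power Utilization Effectiveness: total facility power over total IT power."""
    return total_facility_power_kw / total_it_power_kw

def cop(heat_moved_kw: float, work_done_kw: float) -> float:
    """Coefficient of performance of the heat rejection equipment: heat moved
    divided by the work done to move it (CRAC fan power excluded, per the text)."""
    return heat_moved_kw / work_done_kw

if __name__ == "__main__":
    print(f"PUE = {pue(700.0, 400.0):.2f}")  # 1.75: between 'typical' (1.8) and 'good' (1.4)
    print(f"COP = {cop(400.0, 150.0):.2f}")  # 2.67: inside the usual 2-5 range for DX units
```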

Return Temperature Index

The Return Temperature Index, a trademark of ANCIS Inc. (www.ancis.us), is a percentage based on the ratio of the total demand air flow rate to the total supply air flow rate:

    RTI = (Total Demand Air Flow Rate / Total Supply Air Flow Rate) x 100%    (3)

Alternatively, it can be computed using the ratio of the average temperature drop across the CRACs to the average temperature rise across the racks. In either case, a value of 100% indicates a perfectly balanced airflow configuration, where the supply equals the demand. Values of RTI < 100% indicate excess cooling airflow, so short-circuiting across the CRACs exists. Values of RTI > 100% indicate a deficit of cooling air, so there is recirculation from the rack exhausts to the rack inlets. It is best to have RTI values that are less than, but close to, 100%.

Rack Cooling Index

The Rack Cooling Index, a registered trademark of ANCIS Inc., is computed using the average number of degrees that the rack inlet temperature falls above (or below) the ASHRAE recommended temperature range (64.4 F to 80.6 F). One index is defined for temperatures above the range (RCI_HI) and another for temperatures below the range (RCI_LO). For the high side:

    RCI_HI = [1 - Σ_{i=1..n} (T_i - T_R_HI) / (N (T_A_HI - T_R_HI))] x 100%    (4)

where
    T_R_HI is the ASHRAE recommended maximum temperature (80.6 F)
    T_A_HI is the ASHRAE allowed maximum temperature (90 F)
    T_i is the maximum inlet temperature on the i-th rack
    n is the number of racks with T_i > T_R_HI
    N is the total number of racks in the sample

The index on the low side is similarly defined:

    RCI_LO = [1 - Σ_{i=1..n} (T_R_LO - T_i) / (N (T_R_LO - T_A_LO))] x 100%    (5)

where
    T_R_LO is the ASHRAE recommended minimum temperature (64.4 F)
    T_A_LO is the ASHRAE allowed minimum temperature (59 F)
    T_i is the minimum inlet temperature on the i-th rack
    n is the number of racks with T_i < T_R_LO
    N is the total number of racks in the sample

Ideally, no racks should be outside the recommended range, so the ideal value is 100% for both indices. Values between 90% and 100% are in the acceptable to good range, while values under 90% are considered poor.
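The three indices are easy to compute once the rack inlet temperatures and airflows are known. The sketch below implements Eqs. (3) through (5) using the ASHRAE limits given above; the five rack inlet temperatures in the example are made up for illustration, while the airflow values are the ones used later in this paper.

```python
# Sketch of Eqs. (3)-(5) using the ASHRAE recommended (64.4-80.6 F) and
# allowable (59-90 F) limits. The example inlet temperatures are hypothetical.

T_R_LO, T_R_HI = 64.4, 80.6   # ASHRAE recommended range (F)
T_A_LO, T_A_HI = 59.0, 90.0   # ASHRAE allowable range (F)

def rti(demand_cfm: float, supply_cfm: float) -> float:
    """Return Temperature Index (%): total demand airflow over total supply airflow."""
    return 100.0 * demand_cfm / supply_cfm

def rci_hi(max_inlet_temps_f: list) -> float:
    """Rack Cooling Index, high side (%), per Eq. (4)."""
    n_racks = len(max_inlet_temps_f)
    over = sum(t - T_R_HI for t in max_inlet_temps_f if t > T_R_HI)
    return 100.0 * (1.0 - over / (n_racks * (T_A_HI - T_R_HI)))

def rci_lo(min_inlet_temps_f: list) -> float:
    """Rack Cooling Index, low side (%), per Eq. (5)."""
    n_racks = len(min_inlet_temps_f)
    under = sum(T_R_LO - t for t in min_inlet_temps_f if t < T_R_LO)
    return 100.0 * (1.0 - under / (n_racks * (T_R_LO - T_A_LO)))

if __name__ == "__main__":
    inlets = [55.0, 58.0, 62.0, 70.0, 75.0]          # hypothetical rack inlet temps (F)
    print(f"RTI    = {rti(59_871, 87_000):5.1f} %")  # ~68.8%: more supply than demand
    print(f"RCI_HI = {rci_hi(inlets):5.1f} %")       # 100%: nothing above 80.6 F
    print(f"RCI_LO = {rci_lo(inlets):5.1f} %")       # well under 100%: racks too cold
```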

Metrics for the Baseline Data Center

Using the metrics defined above, the baseline data center configuration can now be evaluated using a combination of measurements and CFD results. Because the cooling system is controlled in five separate zones, the facility managers have been able to measure the electric power needed to run the heat rejection system (the CRAC power minus the fans). The measured value, 269.1 kW, is a snapshot of one day's power demand for the three normally functioning zones. They have also determined that each CRAC uses 8 kW to run its fan, so the total CRAC fan power is 48 kW. Combining these, the total measured cooling power is 269.1 + 48 = 317.1 kW.

The total rack heat load in the room is 363 kW, and this includes the PDUs, which are rack mounted. If an additional 5% of this value is assumed for support infrastructure (lights, etc.), the total IT heat load in the room is 363 x 1.05 = 381.2 kW. Taking the most conservative approach described above, the CRAC fan power will be included in the room heat load. Assuming that all of the CRAC fan power will eventually be converted to heat, the total room heat load becomes 381.2 + 48 = 429.2 kW.

The ratio of the total room heat load to the power needed to run the heat rejection system (269.1 kW) is the COP:

    COP = 429.2 / 269.1 = 1.59    (6)

This value is low, indicating that the data center could support more equipment for the amount of power being delivered to the cooling system. Alternatively, it suggests that shutting down one or more of the CRACs is an option to be considered.

To calculate the PUE, the total facility power is needed. This is simply the total cooling power (317.1 kW) plus the total room heat load (429.2 kW), or 746.3 kW. Dividing the total facility power by the total IT heat load (381.2 kW), the PUE is:

    PUE = 746.3 / 381.2 = 1.96    (7)

The return temperature index can be computed using the boundary conditions used for the CRACs and IT equipment. The total supply air flow from the CRACs is 87,000 CFM. The demand air flow from the IT load (363 kW) is 57,020 CFM. Assuming the additional 5% of heat load, the demand air flow should be adjusted by 5%, bringing the total to 59,871 CFM. The ratio of the adjusted demand air flow to the supply air flow is:

    RTI = (59,871 / 87,000) x 100% = 69%    (8)

Consistent with the earlier calculations, the RTI also suggests that the data center is overcooled.
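The arithmetic in this section can be collected into a few lines of Python. A minimal sketch, using only the measured and assumed values quoted above, reproduces Eqs. (6) through (8).

```python
# The baseline (Trial 0) arithmetic from this section, collected in one place.

heat_rejection_kw = 269.1                  # measured CRAC power minus fans, three zones
crac_fan_kw = 6 * 8.0                      # six active CRACs at 8 kW per fan = 48 kW
it_load_kw = 363.0                         # total rack heat load, PDUs included

total_it_kw = it_load_kw * 1.05                       # + 5% support infrastructure = 381.2
room_heat_kw = total_it_kw + crac_fan_kw              # fan power ends up as heat = 429.2
total_cooling_kw = heat_rejection_kw + crac_fan_kw    # 317.1
total_facility_kw = total_cooling_kw + room_heat_kw   # 746.3

cop = room_heat_kw / heat_rejection_kw     # Eq. (6): ~1.59
pue = total_facility_kw / total_it_kw      # Eq. (7): ~1.96

supply_cfm = 6 * 14_500                    # 87,000 CFM
demand_cfm = 57_020 * 1.05                 # 59,871 CFM after the 5% allowance
rti = 100.0 * demand_cfm / supply_cfm      # Eq. (8): ~69%

print(f"COP = {cop:.2f}, PUE = {pue:.2f}, RTI = {rti:.0f}%")
```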

The degree to which it is overcooled is indicated by the high and low rack cooling indices. An analysis of the CFD results using Eqs. (4) and (5) yields

    RCI_HI = 100%    (9)

and

    RCI_LO < 0%    (10)

A value of 100% for RCI_HI means that no racks have inlet temperatures above the recommended maximum value. A value less than 0 for RCI_LO indicates that the average number of degrees below the recommended minimum value is greater than the number of degrees between the recommended and allowable minimum values. In other words, the inlet temperatures on the whole are much too cold. The metrics calculated for the baseline case are summarized in Table 2.

Estimating the Baseline Data Center Costs

Before considering changes to the data center, the cost of running the facility in its present state is estimated. To determine the cost, the total facility power is needed along with the cost of electricity. Using 746.3 kW as the total facility power and $0.09 as the cost per kWh, the estimated annual cost of running the data center is about $588,300, which is within 10% of the actual cost. While this value is not based on the CFD analysis, a similar calculation can be done for proposed modifications to the data center. Thus while a CFD analysis can be used to judge the efficacy of each design, the companion energy calculation can be done to estimate the cost savings.

Modifying the Design

Disabling Zones

As a first step, each of the three active zones is disabled in a series of trials. These trials are solved concurrently on separate nodes at CoolSim's remote simulation facility (RSF) using the CRAC Failure Analysis model. Trial 1 has Zones 1, 2, and 4 disabled; Trial 2 has Zones 2, 3, and 4 disabled; and Trial 3 has Zones 2, 4, and 5 disabled. For each of these trials, the maximum rack inlet temperature is, at most, 75 F, well below the ASHRAE recommended value of 80.6 F. Trial 1 has the highest rack inlet temperature, and contours for all of the racks for this case are shown in Figure 5. Note that when the left two zones are shut down, the temperature on that side of the room increases. Pathlines of the supply air in the plenum (Figure 6) show that jets from the opposing CRACs collide and deflect the cooling air to the left side of the room, keeping the rack temperatures in range.

Figure 5: Contours of rack inlet temperature for Trial 1 of the baseline case, where Zones 1, 2, and 4 are shut down

Figure 6: Pathlines of supply air in the plenum for Trial 1 of the baseline case, where Zones 1, 2, and 4 are shut down

These trials illustrate that the simplest modification to the data center - shutting down one of the zones - will not adversely impact the equipment.
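Each configuration studied in these trials is compared on the same cost basis: total facility power times hours of operation times the electricity rate. A minimal sketch, assuming continuous operation (8,760 hours per year), lands within a fraction of a percent of the $588,300 baseline estimate quoted above.

```python
# Annual energy cost used to compare configurations.
# Assumes continuous operation at a flat $0.09/kWh rate, per the text.

RATE_USD_PER_KWH = 0.09
HOURS_PER_YEAR = 8760

def annual_cost(total_facility_kw: float) -> float:
    """Estimated annual electricity cost of running the facility."""
    return total_facility_kw * HOURS_PER_YEAR * RATE_USD_PER_KWH

if __name__ == "__main__":
    print(f"Baseline annual cost: ${annual_cost(746.3):,.0f}")  # about $588,500
```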

The data center metrics computed for Trial 1 show a great deal of improvement in energy efficiency and an associated cost savings. Because the amount of power needed to run the cooling system and CRAC fans is two-thirds of the earlier value, the total cooling power is reduced to about 211 kW and the COP is increased to 2.3. The total facility power is reduced to about 625 kW, leading to a decrease in the PUE to about 1.64. The return temperature index increases from 69% to 103%. Ideally, the RTI should be below 100%, but because an additional 5% of infrastructure equipment is included in the total heat load, the demand air flow rate is assumed to have a corresponding increase, which may be too much. (Additional heat from overhead lamps may be lost through the ceiling, for example.) The RCI_HI index remains at 100%, indicating that there are still no racks with temperatures above the recommended value. The RCI_LO index remains below 0, but only slightly. Thus while the rack inlet temperatures are not as cold as before, they are still colder than they need to be.

Owing to the drop in the total facility power, the cost to run the data center also drops. The new annual cost is estimated to be $492,400, representing a savings of about $95,900. These results are summarized in Table 2.

Increasing the Supply Temperatures

One of the dominant factors in reducing data center energy consumption is the air supply temperature. For every 1.8 F increase in supply air temperature, the efficiency of the heat pump improves by 3.5% (Design Considerations for Datacom Equipment Centers, Atlanta: ASHRAE, 2005). Further, by increasing the supply air temperature, the window of free cooling opens, since air-side or water-side economizers can be used on more days of the year. Economizers improve the efficiency of the cooling system by making use of the reservoir of outside air in the heat rejection process. If the temperature difference between the supply air and outside air is reduced, the chillers and condensers in the heat rejection system can be augmented or even replaced by economizers, resulting in huge gains in the COP.

Because the data center is initially overcooled, it is a prime candidate for an increased supply temperature. Thus, as a second modification, all of the supply temperatures are increased to 65 F.
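The 3.5% per 1.8 F rule quoted above is what drives the COP values reported in the remainder of this paper. A small sketch of that scaling, applied to the two baseline configurations, is shown below; the exact rounding differs slightly from the values in the text.

```python
# Roughly a 3.5% improvement in heat-pump efficiency per 1.8 F of supply-air
# temperature increase (ASHRAE, 2005), applied as a simple linear scaling.

def improved_cop(cop_baseline: float, delta_t_supply_f: float) -> float:
    """Scale a baseline COP by 3.5% per 1.8 F of supply temperature increase."""
    return cop_baseline * (1.0 + 0.035 * delta_t_supply_f / 1.8)

if __name__ == "__main__":
    # Trial 0 baseline: average supply 57 F, COP 1.59, raised to 65 F (+8 F).
    print(f"{improved_cop(1.59, 8.0):.2f}")  # ~1.84; the text rounds the change to 15%, giving 1.83
    # Trial 1 baseline: average supply 60 F, COP 2.3, raised to 68 F (+8 F).
    print(f"{improved_cop(2.30, 8.0):.2f}")  # 2.66, matching the value reported below
```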

Recall that in the original configuration, measured temperatures were used for the CRAC boundary conditions and all but two were below 60 F. Increasing all of the supply temperatures to 65 F should alleviate the problems suggested by the RCI_LO index and improve the COP, which will save a significant amount of power.

                                   Baseline Case    Baseline Case
                                   Trial 0          Trial 1
    IT Heat Load (kW)              363              363
    Total IT Heat Load (kW)        381.2            381.2
    CRAC Cooling Power (kW)        269.1            179.4
    CRAC Fan Power (kW)            48               32
    Total Room Heat Load (kW)      429.2            413.2
    Total Cooling Power (kW)       317.1            211.4
    Total Facility Power (kW)      746.3            624.6
    COP                            1.59             2.3
    PUE                            1.96             1.64
    Total Supply Air Flow (CFM)    87,000           58,000
    Total Demand Air Flow (CFM)    59,871           59,871
    RTI (%)                        69               103
    RCI_HI (%)                     100              100
    RCI_LO (%)                     <0               <0
    Cost of Electricity ($/kW-hr)  0.09             0.09
    Annual Cost ($)                588,300          492,400
    Savings ($)                                     95,900

Table 2: Data center metrics comparing Trials 0 and 1 for the baseline case, in which Zones 2 and 4 and Zones 1, 2, and 4 are shut down, respectively

Figure 7: Rack inlet temperatures corresponding to 65 F CRAC supply temperatures for Trial 0, where Zones 2 and 4 are disabled

To properly assess such a proposed change, a CFD analysis is needed to determine if hot spots will form, impacting the performance at the upper end of the recommended range. Contours of the rack inlet temperatures for Trial 0 of this scenario, with Zones 2 and 4 disabled, are shown in Figure 7. The minimum and maximum values for the contours are shown in the key on the left. Because the range (65 F to 78 F) falls within the ASHRAE recommended range (64.4 F to 80.6 F), all racks satisfy the condition and the RCI_HI and RCI_LO values are both 100%. The average supply temperature for the baseline case with only two zones disabled is 57 F. Increasing the average supply temperature to 65 F (an 8 F increase) corresponds to a 15% increase in the COP, so the new value for this configuration is about 1.83.

The previous analysis showed, however, that disabling an additional zone results in potential savings of about $95,000 a year. Thus a CRAC failure analysis should be done with the 65 F supply temperature boundary condition to make sure that the rack inlet temperatures aren't too high if one of the zones is disabled. In Figure 8, the rack inlet temperatures are shown for the trial where the maximum rack inlet temperature is highest. It is again Trial 1, in which Zones 1, 2, and 4 are disabled. Based on the maximum value shown in the figure, some of the racks have temperatures above the ASHRAE recommended maximum of 80.6 F. A calculation of RCI_HI supports this finding, with a value of 97.3%. RCI values between 95% and 100% are considered good for a data center. The value suggests that the average deviation in temperature above the recommended value is small, and this is indeed borne out by the detailed results. Indeed, all racks have inlet temperatures that are well below the ASHRAE allowable maximum value (90 F). As expected, RCI_LO has a value of 100%. With 60 F as the average supply temperature for Trial 1 in the baseline case, the increase in supply temperature for this case (5 F) corresponds to an increase in the COP to about 2.52.

Increasing the supply temperatures to 68 F results in RCI_HI and RCI_LO indices of 100% for Trial 0. Furthermore, the COP increases to about 1.93. For Trial 1, RCI_LO remains at 100%, but RCI_HI drops to 84%. Even so, none of the rack inlet temperatures goes above the ASHRAE allowable value. The COP increases to 2.66 for this scenario.

Figure 8: Rack inlet temperatures corresponding to 65 F CRAC supply temperatures for Trial 1, where Zones 1, 2, and 4 are disabled

The total facility power can be computed for each of these cases, and from it, the annual cost of running the data center. A summary of COP values and associated costs for the various trials discussed in this section is presented in Table 3. Comparison of the Trial 0 results shows that between $28,500 and $37,300 can be saved by increasing the supply temperatures. Comparison of the Trial 1 results shows that an additional $12,500 to $19,000 can be saved by increasing the supply temperatures when one more zone is disabled.

                                   Baseline         Supply 65 F      Supply 68 F
                                   Trial 0          Trial 0          Trial 0
    Average T_SUPPLY (F)           57               65               68
    COP                            1.59             1.83             1.93
    Total Facility Power (kW)      ~746             ~710             ~699
    Annual Cost ($)                588,300          559,800          551,000
    Savings ($)                                     28,500           37,300

                                   Baseline         Supply 65 F      Supply 68 F
                                   Trial 1          Trial 1          Trial 1
    Average T_SUPPLY (F)           60               65               68
    COP                            2.3              2.52             2.66
    Total Facility Power (kW)      ~625             ~609             ~600
    Annual Cost ($)                492,400          479,900          473,400
    Savings ($)                                     12,500           19,000

Table 3: A comparison of COP and predicted annual costs resulting from increased CRAC supply temperatures; savings of at least $28,000 can be achieved if 3 of the 5 zones are operational (Trial 0, top) and at least $12,000 if one additional zone is disabled (Trial 1, bottom)
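Putting the pieces together, the trends in Table 3 can be approximated from the COP alone: the heat rejection power is the room heat load divided by the COP, and the total facility power is the room heat load plus the cooling power (rejection plus fans). The sketch below uses the room heat loads and fan powers worked out earlier for Trials 0 and 1, together with the COP values above; treat the resulting dollar figures as approximate, since they match the table only to within about 1%.

```python
# Annual cost implied by an improved COP, for a fixed room heat load and fan power.
# A sketch of the paper's own bookkeeping, not an exact reproduction of Table 3.

RATE_USD_PER_KWH = 0.09
HOURS_PER_YEAR = 8760

def annual_cost_for(cop: float, room_heat_kw: float, fan_kw: float) -> float:
    """Annual cost for a configuration with the given COP, room heat load, and fan power."""
    rejection_kw = room_heat_kw / cop                    # heat pump work, per Eq. (2)
    facility_kw = room_heat_kw + rejection_kw + fan_kw   # room load + total cooling power
    return facility_kw * HOURS_PER_YEAR * RATE_USD_PER_KWH

if __name__ == "__main__":
    # Trial 0 (Zones 2 and 4 off): room heat 429.2 kW, six CRAC fans at 8 kW each.
    for label, cop in [("57 F", 1.59), ("65 F", 1.83), ("68 F", 1.93)]:
        print(f"Trial 0, {label} supply: ${annual_cost_for(cop, 429.2, 48.0):,.0f}")
    # Trial 1 (Zones 1, 2, and 4 off): room heat 413.2 kW, four CRAC fans at 8 kW each.
    for label, cop in [("60 F", 2.30), ("65 F", 2.52), ("68 F", 2.66)]:
        print(f"Trial 1, {label} supply: ${annual_cost_for(cop, 413.2, 32.0):,.0f}")
```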

Applying the savings computed in Tables 2 and 3, the annual cost of the data center could be cut by at least $110,000 by disabling one of the zones and increasing the supply temperature to 65 F.

Summary

Computational fluid dynamics and data center metrics have been used to study a data center for which a number of measurements were available. The ten CRACs in the room are controlled using five zones, with two CRACs in each zone. Because the heat load is less than the original planned value, the data center currently operates with only three of the five zones active. Even so, the normal operating configuration generates temperatures that are colder than needed. CFD was used to test alternative scenarios with additional zones disabled and with increased supply temperatures. For each of the design modifications, energy calculations were performed to estimate the total facility power usage and corresponding cost. The results of the studies show that one additional zone can be disabled and the supply temperatures can be raised slightly. With these changes, the rack inlet temperatures will remain well within the ASHRAE allowable temperature range and the annual cost of running the facility will be reduced by about $100,000.
