Australian Government Data Centre Strategy 2010-2025

Better Practice Guide: Data Centre Cooling

November 2013

Contents

1. Introduction
   Purpose
   Scope
   Policy Framework
   Related documents
2. Discussion
   Overview
   Fundamentals
   Common Concepts and Definitions
   Common Cooling Problems
   Improving Cooling Efficiency
   Potential Work Health Safety Issues
   Capacity Planning for Cooling Systems
   Operations Effectiveness and Continuous Improvement
   Optimising Cooling Systems
   Maintenance
   Sustainability
   Trends
   Conclusion
3. Better Practices
   Operations
   Planning
4. Conclusion
   Summary of Better Practices

1. Introduction

The purpose of this guide is to advise Australian Government agencies on ways to improve operations relating to data centre cooling. Many government functions are critically dependent upon information and communication technology (ICT) systems based in data centres. The principal purpose of the data centre's cooling systems is to provide conditioned air to the ICT equipment with the optimum mix of temperature, humidity and pressure. In larger, purpose-built data centres, the cooling system also cools the support equipment and the general office area.

Cooling is typically a lower priority for management attention. The ICT equipment and power are given more attention because when they fail, the impacts are more immediate and disruptive. However, extended cooling system failures can be more damaging than power failures. Further, inefficient or ineffective cooling is a major cause of waste in data centre operations and contributes to increased hardware failures. Historically, making cooling systems efficient has reduced data centre operating costs by up to 50 per cent. Ineffective cooling causes some ICT equipment to run at higher temperatures and so fail sooner. Sound operations practices are essential to efficient and effective data centre cooling. This guide on cooling forms part of a set of better practice guides for data centres.

Purpose

The cooling system is a major cost driver for data centres, and interacts with many other data centre systems. The intent of this guide is to assist managers to assess how well a cooling system meets their agency's needs, and to reduce the capital and operating costs relating to data centre cooling.

Scope

This guide addresses the operations and processes required to achieve better results from data centre cooling technology. It does not consider data centre cooling technology in detail, as information at this level is widely available, subject to rapid change, contentious and specialised. Industry will be able to supply agencies with advice and cooling technology to meet their specific needs. The discussion is restricted to cooling technology and operations relevant to data centres used by Australian Government agencies.

Policy Framework

The guide has been developed within the context of the Australian public sector's data centre policy framework. This framework applies to agencies subject to the Financial Management and Accountability Act 1997 (FMA Act). The data centre policy framework seeks financial, technical and environmental outcomes.

The Australian Government Data Centre Strategy (data centre strategy) describes actions that will avoid $1 billion in future data centre costs. The data centre facilities panel, established under the coordinated procurement policy, provides agencies with leased data centre facilities.

The Australian Government ICT Sustainability Plan describes actions that agencies are to take to improve environmental outcomes. The ICT sustainability plan sets goals for power consumption in data centres, which is the key factor driving the need for cooling.

The National Construction Code was created in 2011 by combining the Building Code of Australia and the Plumbing Code of Australia. The National Construction Code controls building design in Australia, and may be further modified by State Government and council regulations.

Technical Committee 9.9 (TC9.9) of the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) publishes guidance on data centre cooling.[1]

The Australian Refrigeration Council (ARC) is the organisation appointed by the Australian Government to accredit individuals to handle refrigerants and related chemicals.

The Australian Institute of Refrigeration, Air conditioning and Heating (AIRAH) is a specialist membership association for air conditioning, refrigeration, heating and ventilation professionals. AIRAH provides continuing professional development, accreditation programs and technical publications.

Related documents

Information about the data centre strategy, and DCOT targets and guidance, can be obtained from the Data Centre section ([email protected]).

The data centre better practice guides also cover:

- Power: the data centre infrastructure supplying power safely, reliably and efficiently to the ICT equipment and the supporting systems.
- Structure: the physical building design provides for movement of people and equipment through the site, floor loading capacity, and reticulation of cable, air and water. The design also complements the fire protection and security better practices.
- Data Centre Infrastructure Management: the system that monitors and reports the state of the data centre. Also known as the building management system.
- Fire protection: the detection and suppression systems that minimise the effect of fire on people and the equipment in the data centre.

[1] The reports of Technical Committee 9.9 are widely referenced.

- Security: the physical security arrangements for the data centre. This includes access controls, surveillance and logging throughout the building, as well as perimeter protection.
- Equipment racks: this guide brings together aspects of power, cooling, cabling, monitoring, fire protection, security and structural design to achieve optimum performance for the ICT equipment.
- Environment: this guide examines data centre sustainability, including packaging, electronic waste, water use and reducing greenhouse gas generation.

2. Discussion

Overview

This section discusses key concepts and practices relating to data centre cooling. The focus is on operations, as this is essential for efficient, reliable and effective performance of the information and communication technology (ICT) in the data centre. If more background is needed on cooling systems design, there is a vast amount of publicly available information, from the general to the very detailed. AIRAH provides material relating to Australia's general refrigeration industry. The Green Grid and ASHRAE TC9.9 are industry bodies that provide vendor-independent information relating to data centre cooling.

The refrigeration industry has been operating for over 150 years, and the air conditioning industry for over 100 years. As data centres are recent, designers first adapted other air conditioning systems to suit data centre needs. In the last 15 years, purpose-built equipment has been developed. The result is diversity and innovation: there is a range of designs, a choice of equipment makes and models, and constant innovation in cooling designs and technology.

Cooling a data centre is a continuing challenge. Data centres have dynamic thermal environments, which require regular monitoring and adjustment due to interactions between the ICT, power and cooling systems. External factors, such as the weather, time of day and seasons, also contribute to the challenge. The better practice is to ensure that the overhead costs due to cooling are minimised over the data centre's life.

Fundamentals

The basic physics of a data centre is that electricity is converted to heat and noise. The cooling system must transfer enough heat, quickly enough, from the ICT equipment to prevent equipment failures.

There are two types of air conditioning systems: comfort and precision. Comfort systems are designed for human use while precision systems are designed for ICT equipment. Comfort systems have a lower capital cost, but a higher operating cost, as they are less efficient than precision systems in cooling data centres.

Units

The widespread use of archaic units of measurement is due to the longevity of the refrigeration industry. As the core quantity the cooling system must handle is heat, this guide uses the SI unit of power, the watt. This allows comparison between the power consumed in the data centre and the cooling required. Agencies are encouraged to use the watt as the basic unit.

Heating, Cooling and Humidity

In a simplified model of data centre cooling there are three major elements: supply, transport and demand.

Figure 1: A Simplified Model for Data Centre Cooling (heat is rejected by the supply element, carried away by the transport element, and created by the demand element as power supplied to the ICT equipment becomes heat)

Supply: this creates the cooling. Common technologies are refrigeration, chillers, and use of the ambient temperature. Cooled air is very commonly used, due to overall cost and current ICT designs. Liquids are much more effective (10 to 80 times). Typically, smaller cooling systems use air only (with a sealed refrigerant) while larger cooling systems use a combination of air and liquid.

Transport: ensures that enough cooling is delivered soon enough, and enough heat is removed quickly enough, to maintain the ICT equipment at the correct temperature. Typically, air is used to deliver the cooling and carry away the heat at the equipment rack, and fans are used to move the air.

Demand: the sources that create the heat, mostly the ICT equipment. The other sources include the people who work in the data centre, the external ambient temperature that transfers into the data centre, and the other support systems. The cooling system itself generates heat. The uninterruptible power supply (UPS) also generates heat in standby and in operation, in particular its batteries.

The cooling system is also required to adjust the humidity of the air and to remove particles. Depending upon the climate at a data centre's location, moisture may need to be added or removed. Similarly, the types and amount of particles to be removed from the air are determined by the location and external events.

ICT Equipment Heat Generation

Each item of ICT hardware in the data centre needs an adequate supply of cooling air, and the heated air removed. Different types of equipment have different needs. Server and network ICT equipment depend on high performance computer chips. Most computer chips are able to generate enough heat to damage themselves and nearby components. Chips are designed to operate reliably at between 55°C and 70°C, with fans blowing cooling air over the chips to remove heat. Manufacturers specify the range of inlet air temperature needed for the equipment to operate reliably.
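How much cooling air is "adequate" for a given heat load can be estimated from the sensible heat relation, which links heat load, airflow and the temperature rise of the air across the equipment. The following sketch is illustrative only: the air properties are approximate, and the 10 kW rack and 12°C temperature rise are hypothetical examples rather than figures from this guide.

```python
# Minimal sketch (illustrative only): estimating the airflow an air-cooled
# system must deliver to carry a given heat load, using the sensible heat
# relation  P = rho * V * cp * dT.  Air properties are approximate sea-level
# values; the 10 kW rack and 12 C rise are hypothetical examples.

AIR_DENSITY = 1.2           # kg/m^3, approximate at about 20 C and sea level
AIR_SPECIFIC_HEAT = 1005.0  # J/(kg.K)

def required_airflow_m3s(heat_load_w: float, delta_t_c: float) -> float:
    """Volumetric airflow (m^3/s) needed to remove heat_load_w of heat with a
    temperature rise of delta_t_c across the ICT equipment."""
    return heat_load_w / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_c)

if __name__ == "__main__":
    # Hypothetical 10 kW rack with a 12 C rise between inlet and exhaust air.
    flow = required_airflow_m3s(10_000, 12.0)
    print(f"Required airflow: {flow:.2f} m^3/s ({flow * 3600:.0f} m^3/h)")
```

The same relation shows why reducing the heat load, or accepting a larger temperature rise across the equipment, reduces the airflow and hence the fan energy required.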

Other ICT devices, such as disk storage and tape drives, consume much less electricity and generate much less heat. Although these types of devices also have chips, they are designed to operate at lower temperatures.

Common Concepts and Definitions

This section illustrates commonly used terms and concepts in data centres operated by APS agencies. Note that this is not a comprehensive or definitive list; there are many other ways to implement cooling systems in data centres.

Figure 2 shows one type of data centre cooling system. A chiller receives water warmed in the data centre, chills the water and pumps it back to the data hall. The Computer Room Air Handler (CRAH) uses the chilled water to create cool air. The cool air is pushed into the under-floor plenum and up to the ICT racks. The ICT equipment is cooled by the air. The warm air is drawn back to the CRAH, to be cooled again.

Figure 2: Data centre cooling using a CRAH and chilled water (chiller, chilled water loop, CRAH, under-floor plenum and ICT rack)

Figure 3 shows another common type of cooling system. The Computer Room Air Conditioner (CRAC) creates cool air and pushes this into the under-floor plenum and up to the ICT racks. The ICT equipment is cooled by the air. The warm air is drawn back to the CRAC, to be cooled again.

Figure 3: Cooling system using a CRAC (exchanger, economizer, CRAC, under-floor plenum and ICT rack)

CRACs may use refrigerants (as shown in Figure 3) or chilled water (similar to Figure 2) to cool the air. The refrigerant is pumped to the exchanger to remove the excess heat, before being returned to the CRAC. A common technology for the exchanger is the cooling tower, which uses water vapour to cool and humidify external air as it is drawn in, while warm air is expelled to the outside air.
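Cooling units such as the CRAHs and CRACs described above are often rated in BTU per hour or tons of refrigeration. Consistent with the earlier recommendation to work in watts, the following sketch (illustrative only; the conversion factors are standard, and the example ratings are hypothetical) normalises such ratings to kilowatts so that they can be compared directly with the ICT load.

```python
# Minimal sketch (illustrative only): converting legacy cooling-capacity units
# to kilowatts, as recommended in the Units discussion. Conversion factors are
# standard; the example ratings below are hypothetical.

BTU_PER_HOUR_TO_WATTS = 0.2931       # 1 BTU/h is approximately 0.2931 W
TON_REFRIGERATION_TO_WATTS = 3517.0  # 1 ton of refrigeration is approximately 3.517 kW

def btu_per_hour_to_kw(btu_h: float) -> float:
    """Convert a cooling rating quoted in BTU/h to kilowatts."""
    return btu_h * BTU_PER_HOUR_TO_WATTS / 1000.0

def tons_to_kw(tons: float) -> float:
    """Convert a cooling rating quoted in tons of refrigeration to kilowatts."""
    return tons * TON_REFRIGERATION_TO_WATTS / 1000.0

if __name__ == "__main__":
    # Hypothetical examples: a 30 ton CRAC and a small 68,000 BTU/h unit.
    print(f"30 ton unit  = {tons_to_kw(30):.1f} kW of cooling")
    print(f"68,000 BTU/h = {btu_per_hour_to_kw(68_000):.1f} kW of cooling")
```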

Free air cooling is a technique of bringing air that has a suitably low temperature into the data centre, either directly to the ICT rack or indirectly to chill the water. This is also known as an economizer.

Figure 4 shows a direct exchange (DX) cooling system. The DX unit operates in the same way as a CRAC, and uses refrigerants for cooling. DX units may also have free air cooling.

Figure 4: DX cooling system (exchanger, DX unit, under-floor plenum and ICT rack)

Dedicated chilled water systems generate more cost-effective cooling for larger data centres. Smaller data centres will typically use refrigeration or share chilled water with the building air conditioning system.

The cooling system must also manage the humidity, the amount of water vapour in the air. The preferred range is 20% to 80% relative humidity. Below 20%, there is a risk of static electricity causing ICT equipment failure. Above 80%, there is a risk of condensation.[4]

A blanking panel is a solid plate put in a rack to prevent cool and warm air from mixing. Blanking panels are placed as required, and provide a 1 to 5 per cent improvement in efficiency.

The inlet air temperature is the temperature measured at the point where air enters a piece of ICT equipment. Knowing the inlet air temperature across all equipment is important when maximising the data centre's efficiency.

The examples in this section show an under-floor plenum, to reflect the design most commonly used in APS data centres. Data halls first began using raised floors (under-floor plenums) decades ago. Many recently built data centres use solid concrete slabs rather than raised floors. There are merits in both approaches.

Common Cooling Problems

The figure below illustrates many common problems in data centre cooling. The cooling air rises from the under-floor plenum for racks 1 and 2. However, rack 3 is drawing the warmed air from rack 2 directly into its equipment. This increases the rate of equipment failure in rack 3. Before entering rack 2, the cooling air mixes with warmed air from rack 1, so rack 2 is not cooled as effectively. In both racks 1 and 2, the warm air is moving back from the exhaust side to the intake side inside the racks. Blanking would stop this air mixing.

[4] Large data centres can have issues.

As well as the increased rate of equipment failure, the energy used to cool the air is wasted when the warm air leaving a rack is allowed to mix with the cool air before it enters the next rack.

Figure 5: Common cooling problems (cold, warm and hot air paths through racks 1, 2 and 3)

Another common problem occurs when two or more CRACs have misaligned target states. For example, if one CRAC is set for a relative humidity of 60% and another CRAC is set to 40%, then both battle to reach their preferred humidity level. As both CRACs will operate for far longer than necessary, electricity and water will be wasted.

Older CRACs use belts to drive the fans. The belts wear and shed dust throughout the data hall. This dust passes into the ICT equipment's cooling systems, reducing efficiency slightly. It is cost effective to replace belt drives, as the newer drives have lower operating costs due to lower power use and lower maintenance requirements.

Noise is acoustic energy, and can be used as a proxy measure of data centre efficiency. Higher levels of noise mean that more energy is being lost from various data centre components and converted to acoustic energy. Two key sources are fans and air-flow blockages. Fans are essential to air movement; however, lowering the inlet air temperature means the fans need to operate less frequently, or at lower power.

Another common issue is blockages in transporting the cool or hot air. Blockages mean that more pressure is needed to transport the air. Pressure is generated by the fans (for example, in the CRACs or CRAHs), which must use more power to create the greater pressure. Bundles of cables in the under-floor plenum, or cables forming a curtain in the equipment racks, are very common faults that are easily remedied. Ductwork with several left and right turns is another common issue; smooth curves minimise turbulence and allow for more efficient air movement.

Improving Cooling Efficiency

There are several simple, inexpensive techniques that have consistently improved cooling system performance by 10 to 40 per cent.

Hot / Cold Aisle

Hot aisle / cold aisle: aligning the ICT equipment in the racks so that all the cold air is drawn in from one side of the rack and expelled from the other side. The racks are then aligned in rows so that the hot air from two adjacent rows blows into a shared hot aisle, while in the alternate (cold) aisle the cold air is drawn into the racks on either side. Changing from randomly arranged equipment to this arrangement reduces cooling costs by 15% to 25%.

Figure 6: Hot aisle configuration

Hot / Cold Aisle Containment

Enclosing one of the hot or cold aisles gives greater efficiencies by further preventing hot and cold air from mixing. Cooling costs are reduced by another 10% over hot / cold aisle alignment.[5]

Figure 7: Cold or hot aisle containment

Hot or cold aisle containment is nearly always cost effective in data centres that have not implemented hot / cold aisle alignment, and purpose-built containment solutions are available. Containment can be retrofitted into existing data halls, using an inexpensive material such as plywood, MDF or heavy plastic. However, due care is necessary, for example to ensure that the fire suppression system will still operate as designed.

[5] Moving from random placement to hot / cold aisle containment means 25% to 35% reductions in cooling costs.
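As a rough illustration of what the percentages quoted above can mean in dollar terms, the sketch below multiplies an annual cooling energy figure by an electricity tariff and a percentage reduction. All input figures are hypothetical; agencies should substitute their own metered cooling energy and tariff.

```python
# Illustrative sketch only: rough annual saving from a percentage reduction in
# cooling energy. All input figures are hypothetical.

def annual_cooling_saving(cooling_kwh_per_year: float,
                          tariff_per_kwh: float,
                          reduction_fraction: float) -> float:
    """Dollar saving from reducing annual cooling energy by reduction_fraction."""
    return cooling_kwh_per_year * tariff_per_kwh * reduction_fraction

if __name__ == "__main__":
    cooling_kwh = 800_000   # hypothetical annual cooling energy, kWh
    tariff = 0.20           # hypothetical electricity tariff, dollars per kWh
    for pct in (0.15, 0.25, 0.35):   # reduction ranges quoted for aisle alignment and containment
        saving = annual_cooling_saving(cooling_kwh, tariff, pct)
        print(f"{pct:.0%} reduction: ${saving:,.0f} per year")
```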

Raising the Temperature

Since 2004, ASHRAE TC 9.9 has published guidance on the appropriate temperature of a data centre. Many organisations have reported major reductions in cooling costs by operating the data centre at temperatures in the mid-twenties. The most recent advice was released in October 2011, and subsequently incorporated into a book. The following table presents the recommended and allowable conditions for different classes of ICT equipment.

Recommended (applies to all A classes; individual data centres can choose to expand this range based upon the analysis described in the ASHRAE documents): dry-bulb temperature 18°C to 27°C; humidity 5.5°C dew point to 60% relative humidity and 15°C dew point.

Allowable (product operating):

Class A1: 15°C to 32°C dry-bulb; 20% to 80% relative humidity; maximum dew point 17°C.
Class A2: 10°C to 35°C dry-bulb; 20% to 80% relative humidity; maximum dew point 21°C.
Class A3: 5°C to 40°C dry-bulb; -12°C dew point and 8% relative humidity to 85% relative humidity; maximum dew point 24°C.
Class A4: 5°C to 45°C dry-bulb; -12°C dew point and 8% relative humidity to 90% relative humidity; maximum dew point 24°C.
Class B: 5°C to 35°C dry-bulb; 8% to 80% relative humidity; maximum dew point 28°C.
Class C: 5°C to 40°C dry-bulb; 8% to 80% relative humidity; maximum dew point 28°C.

The ASHRAE publication also specifies, for each class, the maximum elevation, the maximum rate of temperature change, and the conditions that apply when the product is powered off.

ASHRAE offers two cautions, around noise and operating life. It is not enough to raise the temperature of the air entering the data hall. The facilities and ICT staff must know the temperature of the air entering the ICT equipment, and how the equipment will respond. Beyond a certain temperature, any savings made in the cooling system will be lost as fans in the ICT equipment work harder and longer. ASHRAE also advise that raising the temperature does reduce operating life. However, for many types of ICT equipment the operating life is significantly longer than the economic life. Agencies should monitor the life of their ICT assets, including failure rates. Agencies should also discuss their plans with the ICT equipment manufacturer.

Potential Work Health Safety Issues

The cooling system can pose health risks to staff and members of the public. With planning, these risks can be treated and managed. The common risks are noise, bacteria and heat.

Heat is a possible risk once the data centre design includes zones in which the temperature is intended to reach over 35°C. Hot aisle containment systems can routinely have temperatures over 40°C. Staff must follow procedures to monitor their environmental temperature, including the duration of exposure. Hydration, among other mitigation steps, will be required.

A typical data centre is noisy, and this is a potential risk to be managed by data centre management, staff and visitors.[6] Australian work health and safety regulations identify two types of harmful noise.[7] Harmful noise can cause gradual hearing loss over a period of time, or be so loud that it causes immediate hearing loss. Hearing loss is permanent.

The exposure standard for noise is defined as an LAeq,8h of 85 dB(A) or an LC,peak of 140 dB(C). LAeq,8h means the eight-hour equivalent continuous A-weighted sound pressure level in decibels, referenced to 20 micropascals, determined in accordance with the relevant AS/NZS standard. This is related to the total amount of noise energy a person is exposed to in the course of their working day. An unacceptable risk of hearing loss occurs at LAeq,8h values above 85 dB(A). LC,peak means the C-weighted peak sound pressure level in decibels, referenced to 20 micropascals, determined in accordance with the relevant AS/NZS standard. It usually relates to loud, sudden noises such as a gunshot or hammering. LC,peak values above 140 dB(C) can cause immediate damage to hearing. Guidance and professional services are available to manage risks due to noise.[8]

Capacity Planning for Cooling Systems

Planning for cooling capacity requires taking several perspectives of the data centre. The data centre can be broken down into a set of volumes of space. The first volume is the ICT equipment in the rack. The second is groups of racks, or pods (if used). The third is the data hall, and the last is the whole data centre. For each volume, the key questions are:

- How much heat is being generated?
- How is the heat distributed in that volume of space?
- How much cooling can be delivered to that volume, and at what rate?

Figure 8 shows a simplified representation of the data centre power systems. These are the components that create the heat that requires cooling. Each component subsystem has its own cooling needs and characteristics.

[6] Safe Work Australia, Managing Noise and Preventing Hearing Loss in the Workplace: Code of Practice.
[7] SafeWork SA, Noise in the Workplace.
[8] Australian Hearing, Protecting Your Hearing.

Figure 8: Conceptual view of data centre power systems (local distribution lines to the building, main switchboard, backup generator, UPS, distribution boards, equipment racks and ICT equipment, office area, fire protection system, HVAC system, security system and Data Centre Infrastructure Management)

A simple approach to determine the total demand (a worked sketch follows at the end of this subsection) is:

- Measure the total data centre power used at the main switchboard. Most of this power will be converted to heat.
- If the UPS uses chemical batteries, then these create heat throughout their life, even if the UPS is not supplying power. This heat must be added to the demand.
- The backup generator will require cooling when operating. This must be included in the total demand. If, when operating on backup power, a substantial amount of the ICT equipment is turned off, then the cooling demand during backup operation will be correspondingly lower.
- If the office area uses the data centre cooling, then this must be included. People each generate roughly 100 W of heat.
- Add a margin for peak demand, such as hot weather.
- Add headroom for growth in demand. To control capital expenditure, the headroom should be decided more on the length of time needed for a capacity upgrade, and not on the life of the data centre.

Using the data centre power as measured at the main switchboard is preferable to adding up all the name plate power ratings of the ICT equipment. The ICT equipment name plate describes the maximum amount of cooling required. As the ICT equipment is usually operating at less than maximum power, using the sum of all the name plate ratings of the equipment will lead to over-provisioning of the cooling system. Headroom can then be created, based on expected growth.

The total cooling demand can then be allocated to the various halls and rooms. The number of CRACs (or other types of technology) to supply this cooling should be enough so that at least one CRAC can be shut down for maintenance while the full cooling capacity is delivered to the data hall. (This is known as N+1 redundancy.)

The equipment racks will house different classes of ICT equipment, and so the cooling needs will vary between racks. Some racks may use 20 kW of power, and so need 20 kW of cooling, while other racks use 1 kW. It is necessary to consider the upper and lower cooling needs to ensure the cooling is distributed appropriately.
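The following sketch illustrates the roll-up described in the list above. It is a simplified estimate, not a substitute for a proper capacity plan, and every figure in the example is hypothetical.

```python
# Minimal sketch (illustrative only): rolling up total cooling demand along
# the lines described above. All figures are hypothetical; agencies should
# use their own metered values. Generator cooling during backup operation is
# handled separately in practice.

def total_cooling_demand_kw(switchboard_load_kw: float,
                            ups_battery_heat_kw: float,
                            occupants: int,
                            office_load_kw: float,
                            peak_margin_fraction: float,
                            growth_headroom_fraction: float) -> float:
    """Estimate the cooling capacity (kW) to provision for a data centre."""
    base = switchboard_load_kw               # metered power; almost all becomes heat
    base += ups_battery_heat_kw              # batteries emit heat even on standby
    base += occupants * 0.1                  # roughly 0.1 kW of heat per person
    base += office_load_kw                   # only if the office shares the data centre cooling
    base *= (1 + peak_margin_fraction)       # margin for peaks such as hot weather
    base *= (1 + growth_headroom_fraction)   # headroom sized to the upgrade lead time
    return base

if __name__ == "__main__":
    demand = total_cooling_demand_kw(
        switchboard_load_kw=400.0,     # hypothetical metered load
        ups_battery_heat_kw=8.0,
        occupants=6,
        office_load_kw=15.0,
        peak_margin_fraction=0.10,
        growth_headroom_fraction=0.20,
    )
    print(f"Provision approximately {demand:.0f} kW of cooling")
```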

Operations Effectiveness and Continuous Improvement

A program to consistently improve operational effectiveness requires measurement and standard reporting. The level of investment is influenced by the data centre power bill and the state of the data centre. Data centres that currently follow none of the better practices may be able to reduce their power bill by over 50 per cent. Measurement and reporting will achieve these savings sooner.

Measurement

There are many options and possible measurement points in a data centre. Many types of data centre equipment now include thermal sensors and reporting. The precision and accuracy of these sensors is variable and should be checked. Agencies may choose to sample at various points in the data centre, and extrapolate for similar points. This approach reduces costs, at the expense of accuracy. Sampling may be unsuitable in data centres with a high rate of change, or when trialling higher data hall temperatures.

The following points should be considered for reporting:

- Inlet air temperature for each device.
- Exit point from the cooling unit.
- Base of the rack.
- Top of the rack.
- Return to the cooling unit.
- External ambient temperature.
- Chilled water loop.

For liquid cooling, this list needs to be extended to the transfer points from the liquid to the ICT equipment and back again.

A thermal imaging (FLIR) camera can be useful in identifying air temperature and movement. The camera captures small temperature gradients. This information can be used to find hot spots, as the basis for efficiency improvements and for removing causes of faults.

Data centres over a medium size, or those supplying critical services, should use an automated data collection and reporting product. The recording frequency can be as low as every 15 minutes, but ideally every 5 minutes. There must be two thermal alarms: a warning for when temperatures approach the point at which equipment may fail, and a second for when temperatures exceed equipment operating thresholds.

NABERS and PUE

Agencies that are planning to control data centre costs should use a consistent metric. The APS Data Centre Optimisation Target policy specifies the use of the Power Usage Effectiveness (PUE) metric, and sets a target range of 1.7 to 1.9. The National Australian Built Environment Rating System (NABERS) energy for data centres metric was launched in February 2013. NABERS should, over time, replace PUE for APS data centres. A rating of 3 to 3.5 stars is equivalent to DCOT's PUE target.

A key difference between NABERS and PUE is the ability to compare different data centres. NABERS is explicitly designed for the purpose of comparing different data centres. PUE is intended to be a metric for improving the efficiency of an individual data centre, not for comparing data centres.

Optimising Cooling Systems

The Plan, Do, Check, Act approach (Deming Cycle) is suitable for optimising cooling systems. The measurement and reporting systems can establish the baseline and report on the effect of the changes. Typical changes include:

- Reduce the heat to be removed by reducing the electricity used by the ICT equipment. There are many actions that can be taken, including virtualisation, consolidation, using modern equipment, and turning off idle hardware.
- Reduce the amount of energy needed to cool the air by using the external environment. Free air cooling can be retrofitted to most existing data centres. Common techniques are to draw in external air, and to pipe the chilled water through the external environment. Rarer examples include using rivers, seas and underground pipes.
- Reduce the energy used to move the air to and from the ICT equipment. One approach is to prevent cold and hot air from mixing; this can use blanking panels, containment and hot / cold aisle alignment. Another approach is to remove barriers, allowing the air to move more freely. A third approach is to use the fans less. Possible actions include using variable speed fans (replacing fixed speed fans), using a cycle of pushing very cold air into the data hall then turning the fans off and letting the hall warm up, and using larger, more efficient fans.

Once the effect of a trial has been measured, any beneficial changes can be made, the new baseline established and the next range of actions planned.

Maintenance

The efficiency and performance of the cooling system is tied to the maintenance regime. Skimping on routine maintenance usually incurs higher running costs and reduces the reliability of the data centre. At minimum, the routine maintenance regime should follow the manufacturer's specifications. Most modern air conditioning units provide historical information, which is useful in monitoring overall performance and when optimising cooling system performance.

Larger data centres, and those with more stringent reliability needs, are likely to find that preventive and active maintenance are necessary. Preventive (or predictive) maintenance is the scheduled replacement of components and units before they are expected to fail. This type of maintenance relies on advice from the manufacturer (which may change from time to time based on field experience) and on the performance of the systems in the data centre.
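As a simple illustration of the measurement and metric themes discussed earlier in this section, the following sketch computes PUE from metered facility and ICT power, and classifies inlet air temperature readings against the two alarm levels suggested under Measurement. The threshold values and readings are hypothetical; agencies should use the equipment manufacturer's specifications and their DCOT target.

```python
# Illustrative sketch only: PUE calculation and the two thermal alarm levels
# described in this section. Thresholds and readings are hypothetical.

def pue(total_facility_power_kw: float, ict_power_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by ICT power."""
    return total_facility_power_kw / ict_power_kw

def inlet_alarm(inlet_temp_c: float,
                warning_c: float = 30.0,    # hypothetical warning threshold
                failure_c: float = 35.0) -> str:  # hypothetical operating limit
    """Classify an inlet air temperature reading against the two alarm levels."""
    if inlet_temp_c >= failure_c:
        return "ALARM: operating threshold exceeded"
    if inlet_temp_c >= warning_c:
        return "WARNING: approaching operating threshold"
    return "OK"

if __name__ == "__main__":
    print(f"PUE = {pue(680.0, 400.0):.2f} (DCOT target range is 1.7 to 1.9)")
    for reading in (24.5, 31.2, 36.0):
        print(f"Inlet {reading:.1f} C -> {inlet_alarm(reading)}")
```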

Active maintenance involves replacing equipment once any warning signs are noticed. Active maintenance relies on detailed monitoring of all components, and on being able to set parameters for normal and abnormal operation.

Managers should note anecdotal evidence of 'car park servicing', a form of fraud in which maintenance work is claimed to have been done but has not. This risk can be minimised by escorting service providers, and by monitoring the operating history.

Sustainability

Two considerations for sustainable cooling operations are refrigerants and water.

Most refrigeration and air-conditioning equipment uses either ozone depleting or synthetic greenhouse gases, which are legally controlled in Australia. The Ozone Protection and Synthetic Greenhouse Gas Management Act 1989 (the Act) controls the manufacture, import and export of a range of ozone depleting substances and synthetic greenhouse gases. The import, export and manufacture of these 'controlled substances', and the import and manufacture of certain products containing or designed to contain some of these substances, is prohibited in Australia unless the correct licence or exemption is held.

The Australian Government ICT Sustainability Plan describes a control process which should be used for reducing water use in cooling systems. There is no specific target for water use. Some cooling technologies use significant quantities of water[9], and their use may be banned under extreme drought conditions. The Green Grid has developed the Water Usage Effectiveness (WUE) metric[10] to assist data centre operators to develop controls to manage their water use. Agencies should note that there are cooling system designs with closed water loops, which need only tens of litres of top-up water. Some data centres also use rainfall to improve their sustainability.

Trends

There are several trends that are likely to affect data centre cooling systems:

- Power efficiency: there is a steady reduction in the amount of power used in all classes of ICT equipment, even as price/performance improves. In some racks the amount of power needed will fall, meaning less cooling is required. LAN switches with copper interfaces are an example of this.
- Densification: there is a steady reduction in size for some classes of ICT equipment, notably servers and storage. Racks with servers are likely to consume more power. As the servers become physically smaller, more servers will fit into a rack. While each server uses less power, the greater number of servers means more power is needed.

[9] Evaporative cooling towers can use millions of litres of water per year.
[10] The Green Grid.

- Cloud computing: this is likely to slow down the rate of expansion of data centre ICT capacity, and may significantly reduce the ICT systems in the data centre.

Conclusion

Cooling is an essential overhead once the ICT equipment uses more than about 10 kW of power. Ideally, the cooling systems keep the temperature and humidity within a range that preserves the life of the ICT equipment. As cooling is typically the largest overhead, agencies should concentrate on making the cooling efficient. However, this challenging and complex work is a lower priority activity than ensuring the power supply and managing ICT moves and changes in the data centre. Key points are:

- Design and operate for the specific and the whole. Consider all aspects of the data centre when upgrading or tuning the cooling systems. Ensure that the consequences of changes on other data centre equipment, not only the ICT equipment, are considered. The design must consider the whole data centre, and not a subset. Extrapolating from the likely performance of the cooling system at a single rack is likely to produce errors. Instead, model the likely air flow for the entire data centre, then be sure to measure it.
- Air inlet temperature is a key metric: this is the temperature of the air cooling the ICT equipment. Being able to measure the air temperature at this point is central to understanding how well the cooling system is working.
- Change the ICT equipment, change the air flow. The cooling system behaviour will change as the hardware, racks, cables and so on change. This means monitoring and tuning the cooling system is a continual task.
- Raise the temperature, cautiously. Raising the data centre temperature to more than 24°C has proven effective in many sites for reducing energy use and saving money. However, the ASHRAE guidance clearly advises taking care when doing so. As the temperature rises, there may be changes in the air flow, resulting in new hot spots. As well, different makes and models of ICT equipment may require different temperature and humidity ranges to operate reliably. Agencies must confirm these details against the equipment manufacturer's specifications.
- When things go wrong. The operations, disaster recovery and business continuity plans need to explicitly consider minor and major failures, and the time needed to restart. In a major cooling failure, the ICT equipment can continue to warm the data centre air. Time will be needed to remove this additional heat once the cooling system restarts.
- Safety. Noise, heat and bacteria are all potential issues with a data centre cooling system. Good operating procedures and training will address these risks.
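To illustrate the "when things go wrong" point, the sketch below gives a very rough estimate of how quickly the data hall air warms after a total cooling failure. It assumes, simplistically, that all ICT power heats the room air and ignores the thermal mass of the equipment and building, so real rooms warm more slowly; the load and hall volume used are hypothetical.

```python
# Rough, illustrative estimate only: rate of air temperature rise in a data
# hall after a total cooling failure, assuming all ICT power heats the room
# air and ignoring the thermal mass of equipment, racks and the building.
# Real rooms warm more slowly than this, but the point stands: minutes matter.

AIR_DENSITY = 1.2           # kg/m^3 (approximate)
AIR_SPECIFIC_HEAT = 1005.0  # J/(kg.K)

def warming_rate_c_per_min(ict_load_kw: float, hall_volume_m3: float) -> float:
    """Approximate rate of air temperature rise in degrees C per minute."""
    air_mass_kg = AIR_DENSITY * hall_volume_m3
    return (ict_load_kw * 1000.0) / (air_mass_kg * AIR_SPECIFIC_HEAT) * 60.0

if __name__ == "__main__":
    # Hypothetical: 100 kW of ICT load in a 500 m^3 data hall.
    rate = warming_rate_c_per_min(100.0, 500.0)
    print(f"Air warms at roughly {rate:.1f} C per minute with no cooling")
```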

3. Better Practices

Operations

- The data hall has been arranged in hot and cold aisles, or in a containment solution.
- All obstructions to the air flow are removed. In particular, cables do not share air ducts or lie across the path that air is intended to follow. This includes to and from the data hall, within the racks, and under the raised floor.
- The temperature of the essential equipment in the data centre is monitored and recorded. The humidity of the data centre is monitored and recorded. All deviations from Recommended to Allowable (or worse) are analysed, corrected and reported (a classification sketch follows this list). The actions to keep the temperature and humidity at recommended levels are documented and practised.
- There is a routinely applied process for finding, investigating and, if needed, removing hot spots from the data centre. A thermal imaging (FLIR) camera may assist in this process.
- The noise levels in the data centre are monitored and reported. A noise management plan is operating.
- The agency has an energy efficiency target that involves the data centre's cooling system. The target may be based on PUE or NABERS. Progress towards this target is tracked and reported to the agency's executive monthly.
- There is routine maintenance of all cooling system elements, as per the manufacturer's requirements. The maintenance includes pipes and ducts as well as major equipment. There is a control plan in place to ensure that the maintenance has been performed adequately.
- The data centre is monitored for dust and other particles. Sources of particles, including unfiltered outside air, wearing belts and older floor tiles, are removed over time. Filter paper may be used as an interim measure to improve equipment reliability by removing particles.
- The disaster recovery and business continuity plans include the impacts and controls for the partial or complete failure of the cooling system. There are rehearsals of the limitation, bypass and recovery activities for cooling system operations.
- There is a plan for managing leaks and spills in the data centre. This plan is rehearsed from time to time.
- There is a method for ensuring that procedures are followed and documentation is maintained. This method may be based on ISO 9000 or another framework. There are training and/or communications processes in place to ensure staff know and follow procedures.
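One way to implement the deviation-tracking practice above is to classify each temperature and humidity reading against the ASHRAE recommended and allowable envelopes given in the Discussion section. The sketch below is a simplified illustration only: it uses the class A1 allowable range and does not evaluate the dew point limits in the full ASHRAE specification; the readings are hypothetical.

```python
# Illustrative sketch of the deviation-tracking practice: classify a reading
# against the ASHRAE recommended and class A1 allowable envelopes from the
# Discussion section. Simplified: dew point limits are not evaluated here.

def classify_reading(dry_bulb_c: float, relative_humidity: float) -> str:
    """Return 'recommended', 'allowable' or 'out of range' for a reading."""
    if 18.0 <= dry_bulb_c <= 27.0 and relative_humidity <= 60.0:
        return "recommended"
    if 15.0 <= dry_bulb_c <= 32.0 and 20.0 <= relative_humidity <= 80.0:
        return "allowable"          # ASHRAE class A1 allowable range
    return "out of range"

if __name__ == "__main__":
    # Hypothetical readings: (dry bulb in degrees C, relative humidity in %)
    for reading in [(22.0, 45.0), (29.5, 55.0), (34.0, 85.0)]:
        print(reading, "->", classify_reading(*reading))
```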

Planning

A plan is being followed for:

- Cooling systems asset replacement.
- Altering the capacity and distribution of the cooling.
- Noise management.

A plan exists for the actions following the failure of the cooling system. The length of time to restore the operating temperature is known and updated from time to time.

Agencies with larger data centres may conduct a computational fluid dynamics (CFD) analysis of the data centre from time to time.

All planning work involves the ICT, facilities and property teams. The plans are approved by the senior responsible officer, and included in agency funding.

Fundamental

- Measuring and reporting the power consumption of the cooling system.
- Measuring and reporting the inlet temperature of ICT equipment. Monitoring the outlet temperature of ICT equipment.
- Maintaining hot and cold aisle alignment for racks and ICT equipment. If hot/cold aisle containment has not been implemented, evaluate the business case for containment.
- The temperature of the inlet air has been raised to at least 22°C. There are active operations processes to raise the data centre temperature to reduce energy costs. The temperature range in which the cooling systems and the ICT systems use the least amount of energy has been identified.
- The water use by the cooling system is measured and reported. There are active operations processes to minimise water use.
- The work health and safety plans include noise management. There are plans to reduce noise. The relationship between raising temperature and noise levels is measured and used in operations and capacity planning.
- There is a capacity plan for the cooling system. Options for upgrading or replacing parts of the cooling system are documented.
- The cooling equipment is maintained according to the manufacturer's specifications. All works are inspected and verified as being conducted as required. The operating hours of key equipment (e.g. CRACs) are tracked. The cooling system is cleaned according to the manufacturer's specifications and government regulations.
- There is a plan to manage leaks and spills in the data centre.
- There is a plan, endorsed by senior management, for changing the cooling system's capacity.

4. Conclusion

Agencies that use better practices in their data centres can expect lower costs, better reliability and improved safety than otherwise. Implementing the better practices will give managers more information about data centre cooling, enabling better decisions. Overall, the data centre will be better aligned to the agency's strategic objectives and the total cost of ownership will be lower. Agencies will also find it simpler and easier to report against the mandatory objectives of the data centre strategy.

The key metric is avoided costs, that is, the costs that agencies did not incur as a result of improvements in their data centres. Capturing avoided costs is most effective when done by an agency in the context of a completed project that has validated the original business case.

Summary of Better Practices

Cooling is an essential overhead once the ICT equipment uses more than about 10 kW of power. Cooling is typically the largest data centre overhead, and agencies should ensure that the cooling system is efficient. Key points are:

- Design and operate for the specific and the whole. The design must consider the whole data centre, and how each piece of equipment is cooled. The likely air flow through the data centre should be modelled and measured routinely.
- Air inlet temperature is a key metric. Being able to measure the air temperature at this point is central to understanding how well the cooling system is working.
- Change the ICT equipment, change the air flow. The cooling system behaviour will change as the data centre configuration changes. Monitoring and tuning the cooling system is a continual task.
- Raise the temperature, cautiously. Raising the data centre temperature has proven effective in many sites for reducing energy use and so saving money. The ASHRAE guidance clearly advises taking care when doing so.
- When things go wrong. In a major cooling failure, the ICT equipment can continue to warm the data centre air. Time will be needed to remove this additional heat once the cooling system restarts.
- Safety. Noise, heat and bacteria are all potential issues with a data centre cooling system.


More information

Reducing Data Center Loads for a Large-Scale, Net Zero Office Building

Reducing Data Center Loads for a Large-Scale, Net Zero Office Building rsed Energy Efficiency & Renewable Energy FEDERAL ENERGY MANAGEMENT PROGRAM Reducing Data Center Loads for a Large-Scale, Net Zero Office Building Energy Efficiency & Renewable Energy Executive summary

More information

Managing Data Centre Heat Issues

Managing Data Centre Heat Issues Managing Data Centre Heat Issues Victor Banuelos Field Applications Engineer Chatsworth Products, Inc. 2010 Managing Data Centre Heat Issues Thermal trends in the data centre Hot Aisle / Cold Aisle design

More information

AEGIS DATA CENTER SERVICES POWER AND COOLING ANALYSIS SERVICE SUMMARY

AEGIS DATA CENTER SERVICES POWER AND COOLING ANALYSIS SERVICE SUMMARY AEGIS DATA CENTER SERVICES POWER AND COOLING ANALYSIS SERVICE SUMMARY The Aegis Services Power and Assessment Service provides an assessment and analysis of your data center facility and critical physical

More information

Re Engineering to a "Green" Data Center, with Measurable ROI

Re Engineering to a Green Data Center, with Measurable ROI Re Engineering to a "Green" Data Center, with Measurable ROI Alan Mamane CEO and Founder Agenda Data Center Energy Trends Benchmarking Efficiency Systematic Approach to Improve Energy Efficiency Best Practices

More information

Benefits of. Air Flow Management. Data Center

Benefits of. Air Flow Management. Data Center Benefits of Passive Air Flow Management in the Data Center Learning Objectives At the end of this program, participants will be able to: Readily identify if opportunities i where networking equipment

More information

University of St Andrews. Energy Efficient Data Centre Cooling

University of St Andrews. Energy Efficient Data Centre Cooling Energy Efficient Data Centre Cooling St. Andrews and Elsewhere Richard Lumb Consultant Engineer Future-Tech Energy Efficient Data Centre Cooling There is no one single best cooling solution for all Data

More information

ICT and the Green Data Centre

ICT and the Green Data Centre ICT and the Green Data Centre Scott McConnell Sales Manager c/o Tanya Duncan MD Interxion Ireland Green Data Centres Our Responsibility Data centre greenhouse gas emissions are projected to quadruple by

More information

Recommendations for Measuring and Reporting Overall Data Center Efficiency

Recommendations for Measuring and Reporting Overall Data Center Efficiency Recommendations for Measuring and Reporting Overall Data Center Efficiency Version 2 Measuring PUE for Data Centers 17 May 2011 Table of Contents 1 Introduction... 1 1.1 Purpose Recommendations for Measuring

More information

IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT

IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT DATA CENTER RESOURCES WHITE PAPER IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT BY: STEVE HAMBRUCH EXECUTIVE SUMMARY Data centers have experienced explosive growth in the last decade.

More information

Design Guide. Retrofitting Options For HVAC Systems In Live Performance Venues

Design Guide. Retrofitting Options For HVAC Systems In Live Performance Venues Design Guide Retrofitting Options For HVAC Systems In Live Performance Venues Heating, ventilation and air conditioning (HVAC) systems are major energy consumers in live performance venues. For this reason,

More information

BCA-IDA Green Mark for New Data Centres Version NDC/1.0

BCA-IDA Green Mark for New Data Centres Version NDC/1.0 BCA-IDA Green Mark for New Data Centres Version NDC/1.0 To achieve GREEN MARK Award Pre-requisite Requirement All relevant pre-requisite requirements for the specific Green Mark Rating are to be complied

More information

Office of the Government Chief Information Officer. Green Data Centre Practices

Office of the Government Chief Information Officer. Green Data Centre Practices Office of the Government Chief Information Officer Green Data Centre Practices Version : 2.0 April 2013 The Government of the Hong Kong Special Administrative Region The contents of this document remain

More information

Improving Data Center Energy Efficiency Through Environmental Optimization

Improving Data Center Energy Efficiency Through Environmental Optimization Improving Data Center Energy Efficiency Through Environmental Optimization How Fine-Tuning Humidity, Airflows, and Temperature Dramatically Cuts Cooling Costs William Seeber Stephen Seeber Mid Atlantic

More information

Data Center Cooling & Air Flow Management. Arnold Murphy, CDCEP, CDCAP March 3, 2015

Data Center Cooling & Air Flow Management. Arnold Murphy, CDCEP, CDCAP March 3, 2015 Data Center Cooling & Air Flow Management Arnold Murphy, CDCEP, CDCAP March 3, 2015 Strategic Clean Technology Inc Focus on improving cooling and air flow management to achieve energy cost savings and

More information

HEATING, VENTILATION & AIR CONDITIONING

HEATING, VENTILATION & AIR CONDITIONING HEATING, VENTILATION & AIR CONDITIONING as part of the Energy Efficiency Information Grants Program Heating and cooling can account for approximately 23 % of energy use in pubs and hotels 1. Reducing heating

More information

How To Improve Energy Efficiency Through Raising Inlet Temperatures

How To Improve Energy Efficiency Through Raising Inlet Temperatures Data Center Operating Cost Savings Realized by Air Flow Management and Increased Rack Inlet Temperatures William Seeber Stephen Seeber Mid Atlantic Infrared Services, Inc. 5309 Mohican Road Bethesda, MD

More information

White Paper #10. Energy Efficiency in Computer Data Centers

White Paper #10. Energy Efficiency in Computer Data Centers s.doty 06-2015 White Paper #10 Energy Efficiency in Computer Data Centers Computer Data Centers use a lot of electricity in a small space, commonly ten times or more energy per SF compared to a regular

More information

Heat Recovery from Data Centres Conference Designing Energy Efficient Data Centres

Heat Recovery from Data Centres Conference Designing Energy Efficient Data Centres What factors determine the energy efficiency of a data centre? Where is the energy used? Local Climate Data Hall Temperatures Chiller / DX Energy Condenser / Dry Cooler / Cooling Tower Energy Pump Energy

More information

Managing Cooling Capacity & Redundancy In Data Centers Today

Managing Cooling Capacity & Redundancy In Data Centers Today Managing Cooling Capacity & Redundancy In Data Centers Today About AdaptivCOOL 15+ Years Thermal & Airflow Expertise Global Presence U.S., India, Japan, China Standards & Compliances: ISO 9001:2008 RoHS

More information

AisleLok Modular Containment vs. Legacy Containment: A Comparative CFD Study of IT Inlet Temperatures and Fan Energy Savings

AisleLok Modular Containment vs. Legacy Containment: A Comparative CFD Study of IT Inlet Temperatures and Fan Energy Savings WH I TE PAPE R AisleLok Modular Containment vs. : A Comparative CFD Study of IT Inlet Temperatures and Fan Energy Savings By Bruce Long, Upsite Technologies, Inc. Lars Strong, P.E., Upsite Technologies,

More information

Choosing Close-Coupled IT Cooling Solutions

Choosing Close-Coupled IT Cooling Solutions W H I T E P A P E R Choosing Close-Coupled IT Cooling Solutions Smart Strategies for Small to Mid-Size Data Centers Executive Summary As high-density IT equipment becomes the new normal, the amount of

More information

Air, Fluid Flow, and Thermal Simulation of Data Centers with Autodesk Revit 2013 and Autodesk BIM 360

Air, Fluid Flow, and Thermal Simulation of Data Centers with Autodesk Revit 2013 and Autodesk BIM 360 Autodesk Revit 2013 Autodesk BIM 360 Air, Fluid Flow, and Thermal Simulation of Data Centers with Autodesk Revit 2013 and Autodesk BIM 360 Data centers consume approximately 200 terawatt hours of energy

More information

Data Centre Testing and Commissioning

Data Centre Testing and Commissioning Data Centre Testing and Commissioning What is Testing and Commissioning? Commissioning provides a systematic and rigorous set of tests tailored to suit the specific design. It is a process designed to

More information

Cooling Small Server Rooms Can Be. - Jim Magallanes Computer Room Uptime: www.cruptime.com Uptime Racks: www.uptimeracks.com

Cooling Small Server Rooms Can Be. - Jim Magallanes Computer Room Uptime: www.cruptime.com Uptime Racks: www.uptimeracks.com Cooling Small Server Rooms Can Be Inexpensive, Efficient and Easy - Jim Magallanes Computer Room Uptime: www.cruptime.com Uptime Racks: www.uptimeracks.com Server Rooms Description & Heat Problem Trends

More information

This 5 days training Course focuses on Best Practice Data Centre Design, Operation and Management leading to BICSI credits.

This 5 days training Course focuses on Best Practice Data Centre Design, Operation and Management leading to BICSI credits. This 5 days training Course focuses on Best Practice Data Centre Design, Operation and Management leading to BICSI credits. DCD, DCOM, DCE DESIGN IMPLEMENTATION BEST PRACTICE OPERATION & MANAGEMENT DATA

More information

AIR CONDITIONING EFFICIENCY F8 Energy eco-efficiency opportunities in Queensland Foundries

AIR CONDITIONING EFFICIENCY F8 Energy eco-efficiency opportunities in Queensland Foundries AIR CONDITIONING EFFICIENCY F8 Energy eco-efficiency opportunities in Queensland Foundries Hot tips and cool ideas to save energy and money! Air conditioning units or systems are often used by foundries

More information

Data Center Industry Leaders Reach Agreement on Guiding Principles for Energy Efficiency Metrics

Data Center Industry Leaders Reach Agreement on Guiding Principles for Energy Efficiency Metrics On January 13, 2010, 7x24 Exchange Chairman Robert Cassiliano and Vice President David Schirmacher met in Washington, DC with representatives from the EPA, the DOE and 7 leading industry organizations

More information

Data center upgrade proposal. (phase one)

Data center upgrade proposal. (phase one) Data center upgrade proposal (phase one) Executive Summary Great Lakes began a recent dialogue with a customer regarding current operations and the potential for performance improvement within the The

More information

Managing Power Usage with Energy Efficiency Metrics: The Available Me...

Managing Power Usage with Energy Efficiency Metrics: The Available Me... 1 of 5 9/1/2011 1:19 PM AUG 2011 Managing Power Usage with Energy Efficiency Metrics: The Available Metrics and How to Use Them Rate this item (1 Vote) font size Data centers consume an enormous amount

More information

AIA Provider: Colorado Green Building Guild Provider Number: 50111120. Speaker: Geoff Overland

AIA Provider: Colorado Green Building Guild Provider Number: 50111120. Speaker: Geoff Overland AIA Provider: Colorado Green Building Guild Provider Number: 50111120 Office Efficiency: Get Plugged In AIA Course Number: EW10.15.14 Speaker: Geoff Overland Credit(s) earned on completion of this course

More information

Data Center 2020: Delivering high density in the Data Center; efficiently and reliably

Data Center 2020: Delivering high density in the Data Center; efficiently and reliably Data Center 2020: Delivering high density in the Data Center; efficiently and reliably March 2011 Powered by Data Center 2020: Delivering high density in the Data Center; efficiently and reliably Review:

More information

Recommendations for Measuring and Reporting Overall Data Center Efficiency

Recommendations for Measuring and Reporting Overall Data Center Efficiency Recommendations for Measuring and Reporting Overall Data Center Efficiency Version 1 Measuring PUE at Dedicated Data Centers 15 July 2010 Table of Contents 1 Introduction... 1 1.1 Purpose Recommendations

More information

Data Center Rack Level Cooling Utilizing Water-Cooled, Passive Rear Door Heat Exchangers (RDHx) as a Cost Effective Alternative to CRAH Air Cooling

Data Center Rack Level Cooling Utilizing Water-Cooled, Passive Rear Door Heat Exchangers (RDHx) as a Cost Effective Alternative to CRAH Air Cooling Data Center Rack Level Cooling Utilizing Water-Cooled, Passive Rear Door Heat Exchangers (RDHx) as a Cost Effective Alternative to CRAH Air Cooling Joshua Grimshaw Director of Engineering, Nova Corporation

More information

2014 Best Practices. The EU Code of Conduct on Data Centres

2014 Best Practices. The EU Code of Conduct on Data Centres 2014 Best Practices The EU Code of Conduct on Data Centres 1 Document Information 1.1 Version History Version 1 Description Version Updates Date 5.0.1 2014 Review draft Comments from 2013 stakeholders

More information

Data Center Airflow Management Retrofit

Data Center Airflow Management Retrofit FEDERAL ENERGY MANAGEMENT PROGRAM Data Center Airflow Management Retrofit Technology Case Study Bulletin: September 2010 Figure 1 Figure 2 Figure 1: Data center CFD model of return airflow short circuit

More information

Blade Server & Data Room Cooling Specialists

Blade Server & Data Room Cooling Specialists SURVEY I DESIGN I MANUFACTURE I INSTALL I COMMISSION I SERVICE SERVERCOOL An Eaton-Williams Group Brand Blade Server & Data Room Cooling Specialists Manufactured in the UK SERVERCOOL Cooling IT Cooling

More information

An Introduction to Cold Aisle Containment Systems in the Data Centre

An Introduction to Cold Aisle Containment Systems in the Data Centre An Introduction to Cold Aisle Containment Systems in the Data Centre White Paper October 2010 By Zac Potts MEng Mechanical Engineer Sudlows October 2010 An Introduction to Cold Aisle Containment Systems

More information

STULZ Water-Side Economizer Solutions

STULZ Water-Side Economizer Solutions STULZ Water-Side Economizer Solutions with STULZ Dynamic Economizer Cooling Optimized Cap-Ex and Minimized Op-Ex STULZ Data Center Design Guide Authors: Jason Derrick PE, David Joy Date: June 11, 2014

More information

I-STUTE Project - WP2.3 Data Centre Cooling. Project Review Meeting 8, Loughborough University, 29 th June 2015

I-STUTE Project - WP2.3 Data Centre Cooling. Project Review Meeting 8, Loughborough University, 29 th June 2015 I-STUTE Project - WP2.3 Data Centre Cooling Project Review Meeting 8, Loughborough University, 29 th June 2015 Topics to be considered 1. Progress on project tasks 2. Development of data centre test facility

More information

Top 5 Trends in Data Center Energy Efficiency

Top 5 Trends in Data Center Energy Efficiency Top 5 Trends in Data Center Energy Efficiency By Todd Boucher, Principal Leading Edge Design Group 603.632.4507 @ledesigngroup Copyright 2012 Leading Edge Design Group www.ledesigngroup.com 1 In 2007,

More information

Data Center Precision Cooling: The Need For A Higher Level Of Service Expertise. A White Paper from the Experts in Business-Critical Continuity

Data Center Precision Cooling: The Need For A Higher Level Of Service Expertise. A White Paper from the Experts in Business-Critical Continuity Data Center Precision Cooling: The Need For A Higher Level Of Service Expertise A White Paper from the Experts in Business-Critical Continuity Executive Summary Today s data centers are changing rapidly,

More information

2014 Best Practices. for the EU Code of Conduct on Data Centres

2014 Best Practices. for the EU Code of Conduct on Data Centres EUROPEAN COMMISSION DIRECTORATE-GENERAL JRC JOINT RESEARCH CENTRE Institute for Energy and Transport Renewable Energies Unit 2014 Best Practices for the EU Code of Conduct on Data Centres 1 Document Information

More information

Alcatel-Lucent Modular Cooling Solution

Alcatel-Lucent Modular Cooling Solution T E C H N O L O G Y W H I T E P A P E R Alcatel-Lucent Modular Cooling Solution Redundancy test results for pumped, two-phase modular cooling system Current heat exchange methods for cooling data centers

More information

Data Center Facility Basics

Data Center Facility Basics Data Center Facility Basics Ofer Lior, Spring 2015 Challenges in Modern Data Centers Management, Spring 2015 1 Information provided in these slides is for educational purposes only Challenges in Modern

More information

How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions

How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Intel Intelligent Power Management Intel How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Power savings through the use of Intel s intelligent

More information

Guideline for Water and Energy Considerations During Federal Data Center Consolidations

Guideline for Water and Energy Considerations During Federal Data Center Consolidations Guideline for Water and Energy Considerations During Federal Data Center Consolidations Prepared for the U.S. Department of Energy Federal Energy Management Program By Lawrence Berkeley National Laboratory

More information

Benefits of Cold Aisle Containment During Cooling Failure

Benefits of Cold Aisle Containment During Cooling Failure Benefits of Cold Aisle Containment During Cooling Failure Introduction Data centers are mission-critical facilities that require constant operation because they are at the core of the customer-business

More information

BRUNS-PAK Presents MARK S. EVANKO, Principal

BRUNS-PAK Presents MARK S. EVANKO, Principal BRUNS-PAK Presents MARK S. EVANKO, Principal Data Centers of the Future and the Impact of High Density Computing on Facility Infrastructures - Trends, Air-Flow, Green/LEED, Cost, and Schedule Considerations

More information

DOE FEMP First Thursday Seminar. Learner Guide. Core Competency Areas Addressed in the Training. Energy/Sustainability Managers and Facility Managers

DOE FEMP First Thursday Seminar. Learner Guide. Core Competency Areas Addressed in the Training. Energy/Sustainability Managers and Facility Managers DOE FEMP First Thursday Seminar Achieving Energy Efficient Data Centers with New ASHRAE Thermal Guidelines Learner Guide Core Competency Areas Addressed in the Training Energy/Sustainability Managers and

More information

Site Preparation Management Co., Ltd. March 1st 2013 By Nara Nonnapha, ATD UPTIME

Site Preparation Management Co., Ltd. March 1st 2013 By Nara Nonnapha, ATD UPTIME Site Preparation Management Co., Ltd. March 1st 2013 By Nara Nonnapha, ATD UPTIME 2 AGENDA SESSION -1 What is Data Center? SESSION -2 Equipment and Components SESSION -3 Standard in designing Data Center

More information

Optimum Climate Control For Datacenter - Case Study. T. Prabu March 17 th 2009

Optimum Climate Control For Datacenter - Case Study. T. Prabu March 17 th 2009 Optimum Climate Control For Datacenter - Case Study T. Prabu March 17 th 2009 Agenda 2 About EDEC (Emerson) Facility Data Center Details Design Considerations & Challenges Layout Design CFD Analysis Of

More information