Rittal Liquid Cooling Series



Rittal Liquid Cooling Series
White Paper 04
by Herb Villa

Copyright 2006. All rights reserved.
Rittal GmbH & Co. KG
Auf dem Stützelberg
D-35745 Herborn
Phone +49(0)2772/505-0
Fax +49(0)2772/505-2319
www.rittal.de
www.rimatrix5.com

Contents

1 Executive summary
2 Alternative solutions for next generation data centers
3 A brief history
4 Liquid Cooling solutions
5 LCP design and implementation
6 The future is now
7 Conclusions
8 About the author

1 Executive summary

As heat densities continue to escalate, data center managers are seeking solutions for handling these extreme heat loads to protect their equipment and their facilities. This paper provides a brief history of data center cooling and explains the planning and steps necessary to implement liquid cooling with enclosure-based cabinet heat exchanger systems.

2 Alternative solutions for next generation data centers

Finding reliable solutions for removing excessive heat loads from data center enclosures is rapidly becoming the most critical issue facing the end user community. Data center managers are evaluating the most effective solutions for heat removal from equipment and facilities, and many are returning to liquid cooling. But locating the proper system can be a difficult process. Type "Liquid Cooling of Data Center Enclosures" into one of the more popular online search engines and be prepared to sift through over 43,000 entries. Refining that search to "Liquid Cooling of Data Center Enclosures White Papers" still requires a review of about 14,200 listings.

Liquid cooling solutions are water- or refrigerant-based cooling systems, either installed directly into enclosures or mounted in data center spaces. IT managers, however, are extremely reluctant to use water (or other liquids) in data center spaces because of the risk of hardware failure due to fluid leakage. With proper planning, implementation and service, a liquid cooling solution does not have to be the source of endless, sleepless nights.

3 A brief history

In terms of the facility, water has always been present. A variety of water-cooled systems have been part of data centers for many years and remain so today. Pipes carrying chilled water often run through data center spaces, not to mention water for sprinkler systems and AC cooling. Data centers have even been located in close proximity to other spaces that use water, such as cafeterias and labs.

In the 1960s and 1970s, the first data center facilities were designed to handle large mainframe systems. Chilled water, channeled through the mainframe hardware, was the only efficient way to remove the vast amount of heat generated by those systems. With far more efficient heat-absorbing properties than air, water was capable of removing 1,000 times as much heat. The supercomputer systems that followed were also cooled by liquid-based systems.

Then came the explosive growth of rack-mount servers, which improved network performance and capacity and made ambient air the primary cooling medium, originally drawing room air laterally through the chassis to remove heat. Installed in rack-mount server enclosures with front and rear perforated doors, these servers produced relatively low heat loads (about 2-4 kW) per cabinet, which did not pose a problem for existing climate control systems. Successive generations of servers reduced the size of individual chassis and increased processor performance, enabling users to install more units in the same enclosure, often doubling heat loads. With the introduction of 1U multiprocessor units (two to four processors) and blade server systems, heat loads per cabinet continue to increase. In the last seven years, heat loads have almost quadrupled, matching the exponential increase in deployed servers. Facility managers are being asked to provide cooling capacity for 15-20 kW loads, and even 30 kW configurations are being discussed.

While early sites were designed to support loads of 50 to 100 watts per square foot, today's facilities need to support up to 200 watts per square foot and more. Thinking of heat loads in watts per square foot of data center space becomes meaningless at these higher heat densities. A 15 kW load in a 24 inch wide by 40 inch deep enclosure (6.66 sq ft) yields a density of 2,252 watts/sq ft. Putting the same load in a larger cabinet (24 inches wide by 48 inches deep, 8 sq ft) reduces the density to only 1,875 watts/sq ft (see Figure 4), as the short calculation below illustrates. Clearly, traditional ambient air-cooled solutions cannot sufficiently handle these extreme heat loads. Even if the heat could be evacuated from the enclosures, that much heat in a limited amount of floor space would overwhelm CRAC capabilities, pointing once again to the need for liquid cooling solutions.
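The watts-per-square-foot arithmetic above is easy to reproduce. The short Python sketch below is purely illustrative; the helper name and rounding are not part of the Rittal paper:

```python
# Illustrative heat-density arithmetic for the cabinet footprints cited above.

def watts_per_sq_ft(load_kw: float, width_in: float, depth_in: float) -> float:
    """Convert a per-cabinet heat load (kW) to watts per square foot of footprint."""
    footprint_sq_ft = (width_in * depth_in) / 144.0  # 144 sq in per sq ft
    return load_kw * 1000.0 / footprint_sq_ft

print(round(watts_per_sq_ft(15, 24, 40)))  # 24" x 40" cabinet: ~2,250 W/sq ft
print(round(watts_per_sq_ft(15, 24, 48)))  # 24" x 48" cabinet: ~1,875 W/sq ft
```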

Figure 2

4 Liquid Cooling solutions

Liquid cooling solutions are available as liquid-to-chip and cabinet-level cooling units. At the processor level, the traditional air-cooled heat sink is replaced with a chip-sized heat exchanger (similar to a radiator). The cooling liquid, typically a dielectric solution, is directed through a tube to the heat exchanger, and the heated liquid is removed through a separate tube. A vertical distribution manifold provides connections for the fluid paths from each server, as well as hot and cold paths to a chiller system. These processor-based systems are usually provided directly by server manufacturers. They are not suitable for retrofitting and require significant advance planning prior to installation of the first server.

Figure 3

Rather than supplying chilled liquid directly to individual servers, enclosure-based cabinet heat exchanger systems (Liquid Cooling Package, LCP; see Figure 3) provide cold air to an entire enclosure. Building chilled water supplied to the system provides the cooling capacity, with a separate warm water return to the facility's chilled water plant. The LCP system acts as a cabinet-sized heat exchanger, with all the heat from the enclosure transferring to the water. Depending on the building chilled water inlet temperature and the temperature rise across the servers, an LCP system can cool 12 to 20 kW per cabinet with one unit, and up to 35 kW with two LCP units supplying cold air to a single enclosure (a rough water-side estimate of this capacity is sketched below). If such loads seem out of the realm of possibility, consider the following:

Figure 4

So even with current generation rack-mount and blade servers, heat loads of 15-20 kW can easily be reached. With these loads well over the capacity of standard data center cooling systems, an LCP system quickly becomes the only viable option.

In addition to removing greater amounts of heat from enclosures, LCP systems also benefit sites that are operating near, at or over their thermal capacity. As stated above, the heat from the enclosure is removed with the water; none (or at least very little) of it is rejected into the space. Consider the following hypothetical scenario: in a room that is close to maximum cooling capacity, several enclosures with low heat loads (3-5 kW per enclosure) are to be retrofitted with blade servers, increasing heat loads to 15-18 kW. The existing CRAC units cannot handle this extra heat. An LCP installation will accommodate the additional capacity and even reduce the heat load on the rest of the facility: even for one or two enclosures, the original 6-10 kW of heat from those cabinets is now carried away by the water loop, returning cooling capacity to the space.
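The per-cabinet capacities quoted above follow from the basic water-side heat balance Q = m_dot x c_p x delta_T. The sketch below is a minimal illustration of that relationship; the flow rate and temperature rise in the example are assumptions chosen for illustration, not Rittal specifications:

```python
# Minimal water-side heat balance: Q = m_dot * c_p * delta_T.
# The flow rate and temperature rise below are illustrative assumptions only.

WATER_CP_KJ_PER_KG_K = 4.186   # specific heat of water
WATER_DENSITY_KG_PER_L = 1.0   # approximate density of water

def lcp_cooling_kw(flow_l_per_min: float, water_delta_t_k: float) -> float:
    """Heat absorbed by the chilled-water loop, in kW."""
    mass_flow_kg_per_s = flow_l_per_min * WATER_DENSITY_KG_PER_L / 60.0
    return mass_flow_kg_per_s * WATER_CP_KJ_PER_KG_K * water_delta_t_k

# Example: roughly 43 l/min of chilled water warming by 5 K absorbs about 15 kW.
print(round(lcp_cooling_kw(43, 5), 1))
```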

5 LCP design and implementation

Great care must be taken to ensure that liquid-cooled enclosures can be deployed in existing and future data centers with minimal impact on existing facilities. These systems must, of course, be leak-proof, with dedicated control and monitoring systems. They must be reliable, expandable and flexible enough to allow easy reconfiguration in a dynamic data center space. Nonconductive, clean liquids should be used to eliminate the potential for equipment degradation and damage.

LCP systems do not have to be deployed throughout an entire data center. End users typically look to cluster their extreme heat load systems in only a small area of the center. There will always remain a need for enclosures supporting low-density installations, legacy equipment, and network and termination components; for these applications, the traditional air-cooled rack-mount server cabinet, with fully perforated doors front and rear, will meet requirements.

Building chilled water supply lines and warm water return lines should be installed to support these cooling systems. Supply and return lines can be installed to provide the level of redundancy required for the installation (see Figures 5 and 6). Sufficient floor space should be provided to allow for the extra footprint required by the LCP system, and enough power should be brought in to supply the increased electrical demands of these high-density installations. A simple planning check along these lines is sketched below.

Figure 5

Figure 6
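As a planning aid, the hypothetical sketch below turns the requirements from this section (and the clearance and depth figures given in the installation guidance that follows) into a simple readiness check. The field names and thresholds are illustrative assumptions, not Rittal planning rules:

```python
# Hypothetical pre-installation readiness check for an LCP retrofit.
# Thresholds mirror the figures discussed in this section; they are illustrative only.

from dataclasses import dataclass

@dataclass
class EnclosurePlan:
    depth_in: float                  # overall enclosure depth
    rear_clearance_in: float         # extra rear aisle clearance available
    redundant_water_loops: bool      # redundant chilled-water supply/return paths
    dedicated_power_circuit: bool    # separate electrical circuit for the LCP module

def lcp_readiness_issues(plan: EnclosurePlan) -> list:
    """Return a list of planning issues; an empty list means the basic checks pass."""
    issues = []
    if plan.depth_in < 40:
        issues.append("enclosure depth under 40 in: airflow to server intakes may be restricted")
    if plan.rear_clearance_in < 10:
        issues.append("less than ~10 in of extra rear clearance for a rear-door unit")
    if not plan.redundant_water_loops:
        issues.append("no redundant chilled-water supply/return path")
    if not plan.dedicated_power_circuit:
        issues.append("no separate electrical circuit reserved for the LCP module")
    return issues

print(lcp_readiness_issues(EnclosurePlan(42, 12, True, False)))
```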

To install LCP systems, design engineers must start with adequate floor space. For side-mount or rear-door systems, additional clearance must be provided so that components can be installed while still maintaining aisle space and accessibility. Side-mount systems require additional floor space alongside each enclosure. Rear-door systems need an additional 8-10 inches of aisle clearance and room to swing open a wider door, usually 30-32 inches in width. The overall enclosure depth should be at least 40 inches in order to install rack-mount servers with enough space to provide adequate airflow paths to server air intakes.

LCP enclosures should also be placed in close proximity to water supply and return lines, keeping additional piping runs as short as possible. With all physical plant systems in place, the task of installing LCP enclosures is relatively straightforward. If new pipe runs are required, they must not interfere with already installed connectivity or power cables, under-floor airflow paths, or cable routing systems (i.e. cable tray). LCP systems can be installed in non-raised-floor environments; however, to avoid any possible issue with dripping water, supply and return piping should be installed in raised floor spaces. Access to these spaces must facilitate installation of attachment points on the piping for connection to LCP systems, and additional floor tile cutouts will be required to feed connection hoses to the LCP modules. Separate electrical circuits should be available to power the LCP modules, and, if required, network connectivity points should be available to provide monitoring and control of the LCP system.

Finally, if LCP units are installed on existing data center enclosures, it may be necessary to retrofit solid doors (viewing or solid steel) and a solid roof to the enclosure. It will also be necessary to seal or close all cable entry penetrations into the enclosures to be liquid cooled. This is critical to ensure that all cold air is delivered to server air intakes and no warmed exhaust air is dissipated into data center spaces.

6 The future is now

Higher heat loads in individual enclosures and facilities demand alternatives to ambient air cooling, which is quickly reaching its heat removal limit. As liquid cooling gradually moves into the mainstream, data center managers must prepare to support liquid cooling solutions with careful consideration given to:

- Environmental considerations, including existing floor space, water supply and available power
- Equipment requirements to manage liquid at the cabinet or microprocessor level
- Cabinet and hardware selection that best supports liquid cooling
- Quantification of liquid cooling benefits in terms of component performance, data integrity, floor space savings, data center scalability and resulting ROI

With the proper education, planning and implementation, liquid cooling will provide efficient cooling for data centers, now and for years to come. With ever increasing heat loads looming on the horizon, liquid cooling is the best choice for data center heat removal needs.

7 Conclusions

With careful planning and implementation, liquid cooling cabinet heat exchangers (LCP) are an ideal solution for removing excessive heat loads from the data center. Among the considerations for implementing LCP systems listed above is the logistics of placing enclosures in close proximity to water supply and return lines. To avoid any potential dripping water, supply and return piping needs to be installed in under-floor spaces, and access floor cutouts are needed to connect hoses to the LCP modules. Data center managers also need to allow sufficient floor space for the LCP systems, as well as the additional power that will be required. They should also plan for adequate aisle space and accessibility to equipment after the liquid cooling systems are installed. Network connectivity points should also be available to provide monitoring and control of the LCP systems.

8 About the author

Herb Villa is the Field Technical Manager (Datacomm Group) for Rittal Corporation in the US. A mechanical engineer, he has more than 17 years of experience in the data communications industry. He has been involved in all aspects of system design, installation and support, which allows him to see data center projects from three vantage points: manufacturer, installer and supplier.