



SURVEY | DESIGN | MANUFACTURE | INSTALL | COMMISSION | SERVICE

SERVERCOOL
An Eaton-Williams Group Brand
Blade Server & Data Room Cooling Specialists
Manufactured in the UK

SERVERCOOL Cooling IT

Cooling mission-critical data centres presents a number of challenges. With the power and heat densities of blade servers continuing to rise, managing heat density and minimising the energy used for cooling are the highest priorities. Beyond that, the system must be future-proof, able to scale up as the data centre evolves. ServerCool delivers all of these benefits and more.

ServerCool removes heat from the server rack at source by means of a rear door heat exchanger (RDHx). The RDHx then discharges the heat to the building chilled water system via a Cooling Distribution Unit (CDU). ServerCool has no thermal impact on the white space and a minimal footprint. It was evaluated alongside competing systems by the Silicon Valley Leadership Group and judged the most energy efficient for medium to high heat density server racks; it has been shown to use just 20% of the energy consumed by conventional close control air conditioning systems.

The CDU is an all-important part of the system:

- It controls the flow of clean, treated water to one or more RDHxs at above dew point to ensure condensation-free operation.
- Water volume is limited to a few litres at controlled low pressure.
- Individual racks can be isolated without impacting others.
- N+1 reliability is built in, and two CDUs used in tandem can provide Tier 4 resilience.
- A plug-and-play manifold provides scalability and a flexible white space upgrade path.
- Only 5kW of power provides 260kW of cooling.

The CDU has an impeccable pedigree: it was initially developed for IBM to work with their Cool Blue and iDataPlex RDHxs, and Eaton-Williams is the only CDU manufacturer specified worldwide by IBM in their RDHx planning guides.

The complete solution for all your data centre cooling requirements:

- Rear Door Heat Exchangers (RDHx)
- Cooling Distribution Units (CDU)
- Cooling Distribution Unit Modules
- Close Control Air Conditioning Units
- Data Room Floor Grilles
- Surveys, Design, Install, Commissioning & Service

[Product images: floor grille, modular CDU, CDU module, server cabinets, rear door heat exchangers.]
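The headline efficiency figure is simple arithmetic: 5kW of electrical input moving 260kW of heat works out at under 0.02kW of power per kW of cooling at the CDU stage. A minimal sketch of that calculation, assuming the 5kW is the CDU's total electrical draw at rated duty (upstream chiller plant energy is not included):

```python
# Sketch: efficiency of the CDU stage, using only figures quoted above.
# Assumes the 5 kW is the CDU's total electrical draw at the rated
# 260 kW duty; chiller plant energy is not included.

cdu_power_kw = 5.0        # electrical input quoted for the CDU
cooling_duty_kw = 260.0   # rated cooling capacity

kw_per_kw_cooling = cdu_power_kw / cooling_duty_kw
print(f"CDU power per kW of cooling: {kw_per_kw_cooling:.3f} kW/kW")
# -> about 0.019 kW/kW, i.e. under 2% of the heat moved
```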

Removing Heat at Source with ServerCool: Up to 50kW of Cooling per Rack

Cooling server racks at source with ServerCool is the most energy- and space-efficient solution for reducing data centre costs.

[Diagram: chilled water supply to the CDU (Cooling Distribution Unit), which feeds the RDHx (Rear Door Heat Exchanger) on each rack.]

Rear Door Heat Exchanger Advantages

- The world's most energy-efficient cooling.
- The world's most space-efficient cooling.
- 100% sensible cooling: no condensate.
- Installs in minutes and is retrofittable.
- Most flexible upgrade path.
- No re-arrangement of enclosures required.
- Allows up to 5x the computer power versus air conditioning.
- Reduces cooling energy by 90% or more.
- Reduces space requirement by 80% or more.
- Reduces capital and operational TCO by 50%.
- Compatible with most enclosures and racks.
- Top feed or bottom feed connections.

How it works:

1. Ambient air enters the front of the cabinet.
2. Servers generate heat, with exit air at up to 55°C.
3. Exit air is cooled via the rear door heat exchanger and returned to the data centre at around ambient temperature.

[Thermal images: the rear of a server rack showing 32kW of heat dissipating into the data centre; the same rack with a Rear Door Heat Exchanger (RDHx) fitted, showing the heat neutralised at source.]

RDHx range:

- Passive RDHx: up to 30kW of cooling
- Microchannel MCHx: up to 20kW of cooling
- Active RDHx: up to 50kW of cooling
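Because the door provides 100% sensible cooling, the heat a rack rejects to its exhaust air follows the standard sensible-heat balance, Q = mass flow x specific heat x temperature rise. A minimal sketch with illustrative numbers (the 32kW figure above is the brochure's; the airflow and temperatures here are assumptions):

```python
# Sketch: sensible heat carried by rack exhaust air, Q = m_dot * cp * dT.
# Illustrative values only: the 32 kW thermal-image figure above is the
# brochure's, but the airflow and temperatures here are assumptions.

rho_air = 1.2      # kg/m^3, air density at room conditions
cp_air = 1.006     # kJ/(kg.K), specific heat of air
airflow_m3s = 2.2  # m^3/s through the rack (assumed)
t_in_c = 24.0      # front-of-rack inlet temperature, C (assumed)
t_out_c = 36.0     # rear-of-rack exhaust temperature, C (assumed)

q_kw = rho_air * airflow_m3s * cp_air * (t_out_c - t_in_c)
print(f"Heat removed: {q_kw:.1f} kW")  # ~32 kW for these values
```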

Silicon Valley Judges Passive Door the Best System

The RDHx and CDU were independently tested against the five leading data centre cooling solutions by the Silicon Valley Leadership Group at Lawrence Berkeley National Laboratory in California, to establish an efficiency rating over a range of heat densities up to a 12kW rack load. The results show the ServerCool system rated by far the most efficient cooling system at all heat densities.

[Chart: cooling power drawn (kW per kW of cooling, 0.35 to 2.11; equivalently kW per ton, 0.1 to 0.6) versus rack heat density (0 to 20kW per rack) for Typical CRAH (45°F supply), Liquid Rack (50-70°F supply, controlled air inlet temperature), In Row (45°F supply, controlled air inlet temperature), Overhead (45°F supply, controlled air inlet temperature) and Passive Door (50-70°F supply). The Passive Door (ServerCool) curve is the lowest, i.e. the most efficient, across the whole range.]

Water Cooled System Comparisons

Columns, left to right: Close Control Air Conditioning | Cold/Hot Aisle Containment | In-Row Cooling | Recirculating Rack Cooling | ServerCool RDHx/CDU

- Range of effective cooling (kW per rack): 0.5-3.0 | 3.0-12.0 | 5.0-15.0 | 8.0-18.0 | 5.0-30.0
- Energy per 150kW of cooling (kW): 14.0 | 12.0 | 10.5 | 10.5 | 2.8
- Approx. capital expenditure + energy cost for 150kW of cooling over 5 years: 144,000 | 168,000 | 144,000 | 156,000 | 87,600
- CO₂ emissions for 150kW of cooling over 5 years: 288,204kg | 247,032kg | 216,153kg | 216,153kg | 53,524kg
- Footprint per 150kW of cooling (m²): 3.35 | 3.35 | 2.25 | 2.0 | 0.4
- Upgrade path and flexibility: space needed | layout restrictions | racks need moving | racks need moving | easy retrofit
- Noise addition in data centre: high | high | very high | high | low
- Built-in redundancy of moving parts: no | no | no | no | yes
- Condensate-free operation: no | no | no | no | yes
- Low (<3 bar) operating pressure for safety: no | no | no | no | yes
- Consequence of failure or leak: large volume of building chilled water | loss of cold aisle pressure and capacity | large volume of building chilled water at rack | almost instant server inlet overheat/shutdown | small volume secondary water loop, no impact or shutdown

ServerCool can save up to 80% of the energy consumed in removing heat from the data centre.
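The table's 5-year CO₂ figures are consistent with continuous operation at the quoted electrical draw and a grid emission factor of 0.47 kg CO₂/kWh; that factor is inferred from the numbers, not stated in the brochure. A sketch of the arithmetic:

```python
# Sketch: reproducing the table's 5-year CO2 figures from the quoted
# electrical draw. The 0.47 kg CO2/kWh grid emission factor is inferred
# from the table's own numbers, not stated in the brochure.

HOURS = 8760 * 5  # five years of continuous operation

systems = {
    "Close Control Air Conditioning": 14.0,  # kW drawn per 150 kW cooling
    "Cold/Hot Aisle Containment": 12.0,
    "In-Row Cooling": 10.5,
    "Recirculating Rack Cooling": 10.5,
    "ServerCool RDHx/CDU": 2.8,
}

for name, draw_kw in systems.items():
    co2_kg = draw_kw * HOURS * 0.47
    print(f"{name}: ~{co2_kg:,.0f} kg CO2 over 5 years")
# The first four match the table (288,204 / 247,032 / 216,153 kg); the
# ServerCool row at 2.8 kW gives ~57,641 kg, slightly above the table's
# 53,524 kg, which corresponds to an average draw nearer 2.6 kW.
```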

ServerCool CDU

The Cooling Distribution Unit (CDU) is designed to provide cooling water, closely controlled and above dew point, for up to 20 rear door heat exchangers, and is capable of 260kW of cooling capacity. It is designed and approved to work with IBM's Cool Blue and iDataPlex RDHxs (Rear Door Heat Exchangers) and Blue Gene racks.

Why Use a CDU and Secondary Loop?

The CDU provides essential separation of the primary chilled water from the cooling water supplied to the racks and RDHxs, meaning:

- Low pressure, clean water in the rack environment.
- A low volume of water in the secondary loop, reducing risk.
- Water flow controlled to maintain optimum operation and minimise energy use.
- Water temperature controlled above dew point to maximise cooling and ensure condensate-free operation.
- 100% sensible cooling, reducing the need for humidification and saving energy.
- An upgrade path and flexibility without major M&E upgrade work: plug and play.
- Essential monitoring of the racks being cooled and of system health.
- System redundancy through the secondary manifold design.
- No potentially unreliable condensate pumps required anywhere in the system.

CDU Features

- Cooling capacities ranging from 20kW to 260kW.
- Full run/standby capability with redundant pumps.
- Optional internal manifold with leak-free quick-release couplings.
- Easy to install and retrofit.
- Dew point temperature control of cooling water.
- Auto-fill and bleed of coolant.
- Full alarm monitoring and connectivity: Modbus + SNMPv3.

[Diagram: CDU front and rear component callouts: primary control valve (3-way with manual override); controller display (user interface); ambient temperature and RH sensor; expansion vessel; pressure relief valve; plate heat exchanger; start/stop and fill pump switches; primary flow meter (if fitted); primary filter (if fitted); pump inverter drive; fill pump; water make-up container; secondary pumps (run and standby); secondary flow meter (if fitted); pump isolation valves (redundant pump units only); level sensor; secondary filter (if fitted); drain valves; wheels and adjustable feet; primary isolation valves.]
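Dew point control is the key to condensate-free operation: the CDU's ambient temperature and RH sensor lets it hold the secondary water supply above the room's dew point. A minimal sketch of the underlying calculation, using the Magnus approximation and an assumed safety margin (the brochure does not state the controller's internal method):

```python
import math

# Sketch: deriving a condensate-safe water setpoint from room conditions,
# as the CDU's ambient temperature/RH sensor makes possible. The Magnus
# formula and the 2 K margin are assumptions, not the controller's
# documented method.

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Magnus approximation of dew point for temperature in C, RH in %."""
    a, b = 17.62, 243.12
    gamma = (a * temp_c) / (b + temp_c) + math.log(rh_percent / 100.0)
    return (b * gamma) / (a - gamma)

room_temp_c = 27.0  # white space run warm, up to 27 C per this brochure
room_rh = 50.0      # assumed relative humidity
margin_k = 2.0      # assumed safety margin above dew point

dp = dew_point_c(room_temp_c, room_rh)
setpoint = dp + margin_k
print(f"Dew point {dp:.1f} C -> minimum water setpoint {setpoint:.1f} C")
# -> dew point ~15.7 C, so water supplied no colder than ~17.7 C
```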

ServerCool for Small Server Rooms

The CDU Module is designed to provide closely controlled cooling water for up to 3 RDHxs, with a total maximum cooling capacity of 20kW. It is the ideal system with which to start a planned upgrade path, and is particularly suitable where space is restricted.

CDU Module Features

- Cooling capacity up to 20kW.
- Full run/standby capability.
- Leak-free quick-release couplings.
- Easy to install and retrofit.
- Dew point temperature control of cooling water.
- Auto-fill and bleed of coolant.
- Full alarm monitoring and connectivity.

Small Server Room Cooling

The CDU Module combined with the RDHx is the perfect solution for a small IT installation. The module is installed within a standard rack, taking up just 6U of space and leaving room for the servers above. It provides up to 20kW of cooling and can be connected to a maximum of three RDHxs. As room temperature is not critical to cooling the servers, the rack can stand in the corner of a room or even in a cupboard.

Heavy Duty and Fan Assisted Floor Grilles

Our heavy duty floor grilles have been specifically designed to suit current industry requirements for raised access floors in data centres. Designed to replace a 600mm square tile, they are aesthetically pleasing air distribution devices that deliver cool air exactly where it is required, eliminating high temperatures and reducing the risk of equipment failure. The Model TJH Heavy/Extra Heavy duty floor grille complies with the relevant sections of the PSA MOB PF2 PS/SPU specification for Raised Access Floors (March 1992) and the IBM Management Design Guide for Raised Access Floors.

Our fan assisted floor grille is designed to boost airflow and cooling in areas with low or even negative floor pressure. The grille can direct airflow towards the rack inlet at up to 15° from vertical; this, together with speed control of the fans, ensures that cooling is maximised efficiently. With up to 5 times the cooling airflow delivery of a standard grille in a low floor pressure area, it is an ideal solution for hotspots and problem areas that lack effective cooling.

Flexibility, Scalability and Other Services

Minimising initial capex while building in the most efficient cooling system with upgrade potential is a challenge faced by many data centre and IT managers. The Eaton-Williams ServerCool system provides the answer, enabling system resilience, scalability, and efficiency of space and power to be built in from day one, with an excellent upgrade path. The system can then be grown rack by rack or row by row, simply by fitting and plugging additional RDHxs into a common under-floor or overhead manifold. New RDHxs can be fitted in as little as 15 minutes without downtime, and there is no need to re-position racks or use valuable additional space.

[Images: top feed and bottom feed manifold connections.]

Installing our unique and highly resilient plug-and-play manifold system, either overhead or under-floor, enables an upgrade path with low initial capex. CDUs and RDHxs can be added with minimal disruption as the IT load increases and additional cooling is required. The manifold is tested to 5x its working pressure.

Optimum Use of the White Space

The RDHx takes up minimal space and, if required, the CDUs can be installed outside the white space. Because the heat is removed at source, computer racks can be positioned close together.

Minimising Carbon Footprint

ServerCool will save you 40% or more compared with other systems, and the energy savings do not stop there. Because ServerCool cools at source, the white space can be allowed to operate at a higher temperature, up to 27°C, which saves more energy; and if you use Eaton-Williams Close Control products with high-efficiency fans and free cooling, you save even more.

Connection Options and Features

The CDU can be fitted with an internal manifold allowing connection of up to 8 racks with heat loads of up to 30kW each. Alternatively, flextails can be fitted, allowing multiple connections via an overhead or under-floor external manifold. Other optional features include primary and secondary filters, ATS, a primary flow meter and voltage options.

[Diagram: comms, primary filter option, primary flow option, secondary filter options and internal manifold option.]
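The scaling limits quoted here (260kW and up to 20 RDHxs per full-size CDU, up to 8 racks at up to 30kW on the internal manifold) lend themselves to a first-pass sizing check when planning an upgrade path. The sketch below uses those brochure figures, but the sizing logic itself is purely illustrative, not a ServerCool design rule:

```python
import math

# Sketch: first-pass CDU count for a row-by-row upgrade path.
# Capacity and connection limits are the figures quoted in this brochure;
# the sizing function itself is illustrative only.

CDU_CAPACITY_KW = 260.0  # full-size CDU cooling capacity
CDU_MAX_RDHX = 20        # RDHx connections per full-size CDU

def cdus_required(num_racks: int, kw_per_rack: float) -> int:
    """Minimum full-size CDUs for num_racks, ignoring N+1 redundancy."""
    by_capacity = math.ceil(num_racks * kw_per_rack / CDU_CAPACITY_KW)
    by_connections = math.ceil(num_racks / CDU_MAX_RDHX)
    return max(by_capacity, by_connections)

# Example: 24 racks at 15 kW each -> 360 kW total heat load
n = cdus_required(24, 15.0)
print(f"24 racks at 15 kW: {n} CDU(s), plus one more for N+1 if required")
# -> 2 CDUs, limited equally by capacity (360/260) and connections (24/20)
```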

Group Products and Services

We design, manufacture, install, commission and service your data room equipment and keep your system running at peak performance. In addition, we manufacture, service and supply a wide range of compatible equipment, including chillers, chilled water and DX close control air conditioning units, floor grilles, direct and indirect free cooling units, air handling units and humidifiers.

DRU Close Control Air Conditioning Unit
DRU data room air conditioning units are designed for high density load areas where a large cooling duty is required from a compact footprint, without compromising the critical application of data centre cooling. 80/100/120kW cooling capacities. Available in chilled water, DX, DX with water-cooled condenser, and glycol free cooling configurations. A full range of options and ancillaries is available, including an inbuilt humidifier for room humidity control.

IPAC Close Control Air Conditioning Unit
IPAC floor standing air conditioning units are available in chilled water, DX/air cooled, or DX with water-cooled condenser allowing indirect free cooling. 15 to 80kW cooling capacities are available in configurations including up-flow, down-flow or direct free cooling. A full range of options is available, including an inbuilt humidifier for room humidity control.

ebtx Unit
The ebtx self-contained DX air handler is designed specifically for communications networks and data centres. It primarily uses outside air for cooling, with DX available to deal with peak temperatures, and uses a fraction of the energy of conventional close control air conditioning units. It is available in four sizes: 15, 22, 30 and 45kW.

Chillers
Energy-efficient chillers with full run/standby (N+1) capability can be offered up to a capacity of 2000kW. These can also be fitted with free cooling capability and qualify for ECA (Enhanced Capital Allowance) under the Carbon Trust scheme.

Installation
Services include pre-fabricated manifold systems, significantly reducing installation time. Our managed services team can also design, plan, project manage and install the entire system on a turnkey basis, with first class product and system knowledge giving peace of mind and problem-free implementation.

Commissioning and Service
Commissioning and service capability ensures first class start-up services, technical support, after-sales service and care throughout the life cycle of the product.

SERVERCOOL
An Eaton-Williams Group Brand

We reserve the right to alter specifications without notice, and products will sometimes be supplied with modifications which do not strictly comply with the descriptions and illustrations in this publication.

Fircroft Way, Edenbridge, Kent TN8 6EZ, England
Tel: +44 (0)1732 866055  Fax: +44 (0)1732 865658
Email: servercool@eaton-williams.com
www.eaton-williams.com

0413132 SDRC /2.12/3