HPC TCO: Cooling and Computer Room Efficiency




Route Plan
Motivation (Why do we care?)
HPC Building Blocks: Computer Hardware (What's inside my data room? What needs to be cooled?)
HPC Building Blocks: Subsystems (What is around my data room and what do I need it for?)
Traditional Air Cooling (What we have been doing for decades)
Chillers (So where does all the heat go?)
Cooling Towers (Can't we do that for free?)
Warm Water Cooling (Well, that sounds like an oxymoron)

Route Plan (continued)
Liquid Submersion Cooling (taking the servers for a bath)
Modular Data Centers
Efficiency Metrics (where this gets boring): PUE, ERE, Space Utilization, Computation
Low vs. High Density DC (tight is better)
Liquid Cooling vs. Air Cooling (wet is better)

Motivation
If HPC systems are not constantly cooled, they overheat and, in the worst case, may be damaged. Most of the energy consumed by an HPC system or data center that is not used for computation is spent keeping the equipment at an appropriate temperature. The cooling system used in an HPC system has major implications for the design of the data room and the facility as a whole, and it significantly affects both capital and operational costs.

Motivation
Power consumption and density have been rising steadily and are expected to keep rising. Keeping density high is in our interest because it lets us operate very powerful machines with low space requirements. Achieving high density in a reliable and safe way is not a trivial task.

HPC Building Blocks: Computer Hardware
Current HPC systems are built from commodity server processors. Modular servers, such as 1U units or blades, are put into racks and interconnected through some kind of network fabric. The more servers/blades we put into a single rack, the higher the heat concentration. The layout of the data room depends greatly on how we plan to cool the system.

HPC Building Blocks: Subsystems
Electricity is supplied either from the grid or from an on-site generator and is conditioned on-site before delivery to the computer room. Central plant chillers provide a continuous supply of cold water for the computer room air-conditioning (CRAC) units. Additionally, beyond the rack switches, network connectivity must be provided to enable data transmission within the HPC system.

Traditional Air Cooling
The most common layout in today's data centers is repeating rows of racks placed side by side, with alternating cold aisles and hot aisles. Computer room air conditioning (CRAC) units pull the hot air across cooling coils fed with chilled water and distribute the cooled air beneath the raised floor.
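
To make the airflow implications concrete, here is a minimal sizing sketch based on the standard sensible-heat relation Q[BTU/hr] ≈ 1.08 × CFM × ΔT[°F]; the rack power and air-side temperature rise are assumptions chosen only for illustration:

```python
# Rough CRAC airflow sizing, assuming all rack heat is removed by the
# supply air (sensible heat only).  Values are illustrative assumptions.

def required_airflow_cfm(rack_power_w: float, delta_t_f: float) -> float:
    """Airflow (CFM) needed to absorb rack_power_w watts with a delta_t_f
    (deg F) rise between cold-aisle supply and hot-aisle return air.
    Uses Q[BTU/hr] = 1.08 * CFM * dT[F] and 1 W = 3.412 BTU/hr."""
    return rack_power_w * 3.412 / (1.08 * delta_t_f)

# A hypothetical 10 kW rack with a 20 F (~11 C) air-side temperature rise
print(f"{required_airflow_cfm(10_000, 20):.0f} CFM")   # ~1,580 CFM
```

A wider allowable temperature rise reduces the airflow the CRACs must move, which is part of the rationale for keeping hot and cold air streams separated.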

Traditional Air Cooling
[Figure: a CFD simulation by HP from 2005.] The computer room needs to be carefully designed, and possible equipment failures have to be taken into account. In high-density setups, computational fluid dynamics (CFD) simulations may be needed to achieve an efficient design.

Chillers
CRAC units usually need cold water or some other liquid coolant to exchange heat with the hot air. After the heat exchange, the water or coolant has to be cooled down again for reuse; this is usually done in a chiller. A chiller is a machine that removes heat from a liquid via a vapor-compression or absorption refrigeration cycle. Chillers can consume a lot of energy.
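
As a rough illustration of that energy cost, the compressor's electrical input is approximately the heat load divided by the chiller's coefficient of performance (COP); the load and COP values in this sketch are assumptions, not measurements:

```python
# Illustrative chiller energy estimate: electrical input ~= heat load / COP.
# Load and COP values are assumptions used only for illustration.

def chiller_power_kw(heat_load_kw: float, cop: float) -> float:
    """Electrical power the chiller draws to reject heat_load_kw of heat."""
    return heat_load_kw / cop

load_kw = 500.0                    # hypothetical computer-room heat load
for cop in (3.0, 5.0, 7.0):
    kw = chiller_power_kw(load_kw, cop)
    print(f"COP {cop}: ~{kw:.0f} kW of compressor power, "
          f"~{kw * 8760 / 1000:.0f} MWh/year if run continuously")
```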

Cooling Towers (Free Cooling)
Hot water enters at the top of the tower, falls through it, and cools down mainly through evaporation. Operational costs are much lower than with traditional chillers, but redundancy can be costly, and it is tricky to prevent freezing in cold climates.
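
The trade-off is water consumption: evaporating water absorbs roughly 2,260 kJ/kg, so every kWh (3,600 kJ) of heat rejected evaporates on the order of 1.6 kg of water. A back-of-envelope sketch with an assumed heat load:

```python
# Back-of-envelope evaporative water use of a cooling tower.
# The heat load is an assumption; the latent heat is standard physics.
LATENT_HEAT_KJ_PER_KG = 2260.0     # latent heat of vaporization of water

def water_evaporated_kg(heat_rejected_kwh: float) -> float:
    return heat_rejected_kwh * 3600.0 / LATENT_HEAT_KJ_PER_KG

load_kw = 500.0                    # hypothetical heat load
print(f"~{water_evaporated_kg(load_kw * 24) / 1000:.1f} m^3 of water per day")  # ~19 m^3
```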

Warm Water Cooling
Water is circulated as close to the computer hardware as possible so that heat can be transferred from the hardware into the water even at relatively high water temperatures. Keeping the water temperature above the wet-bulb temperature of the ambient air means that free cooling is possible all year round, completely removing the need for chillers. It also allows the waste heat to be reused, for example for building heating (SuperMUC) or to drive an adsorption chiller (iDataCool).
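
A minimal sketch of the wet-bulb argument: a cooling tower can only bring the water down to roughly the ambient wet-bulb temperature plus an "approach" margin, so a warm loop can be tower-cooled year-round while a conventional chilled-water loop cannot. The temperatures and the approach value below are assumptions:

```python
# Can the cooling tower alone deliver water at the loop's supply temperature?
# Assumes the tower reaches wet-bulb temperature plus an approach margin.

def free_cooling_possible(loop_supply_c: float, wet_bulb_c: float,
                          approach_c: float = 4.0) -> bool:
    return loop_supply_c >= wet_bulb_c + approach_c

# Warm-water loop (40 C supply) vs. conventional chilled water (14 C)
# on a hypothetical summer day with a 24 C wet-bulb temperature:
print(free_cooling_possible(40.0, 24.0))   # True  -> no chiller needed
print(free_cooling_possible(14.0, 24.0))   # False -> chiller still required
```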

Liquid Submersion Cooling
The servers are immersed in a non-conductive coolant. The direct contact between the coolant and the hardware allows for more efficient heat exchange. The servers are placed vertically in the racks, and the racks have a rather exotic form factor that occupies a wider area than conventional racks. Like warm water cooling, it allows for higher densities than air cooling.

Liquid Submersion Cooling (figure)

Modular Data Centers
A portable modular data center fits data center equipment (servers, storage and networking equipment) into a standard shipping container, which is then transported to the desired location. Containerized data centers typically come outfitted with their own cooling systems. Modular data centers typically consist of standardized components, making them easier and cheaper to build, and their main advantage is rapid deployment. They still need considerable infrastructure around them.

Efficiency Metrics: PUE
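
The slide's content is a figure; for reference, PUE (Power Usage Effectiveness), as defined by The Green Grid, is the ratio of total facility energy to the energy delivered to the IT equipment, so an ideal facility approaches a value of 1:

```latex
% Power Usage Effectiveness; 1.0 is the (unreachable) ideal.
\[
  \mathrm{PUE} \;=\; \frac{\text{Total Facility Energy}}{\text{IT Equipment Energy}} \;\ge\; 1
\]
```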

Efficiency Metrics: ERE
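
Likewise a figure slide; ERE (Energy Reuse Effectiveness) follows the same pattern but subtracts energy exported for reuse, such as the building heating mentioned on the warm water cooling slide, from the numerator:

```latex
% Energy Reuse Effectiveness; unlike PUE it can fall below 1
% when a large share of the waste heat is reused.
\[
  \mathrm{ERE} \;=\; \frac{\text{Total Facility Energy} \;-\; \text{Reused Energy}}{\text{IT Equipment Energy}}
\]
```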

Efficiency Metrics: Space Utilization
Watts/rack measures the power consumption of a single rack. Watts/sq ft is a common but ambiguous metric because it doesn't specify the nature of the area in the denominator. Watts/sq ft of work cell is the preferred metric for (air-cooled) data-center-to-data-center benchmarking and for infrastructure discussions. Layout efficiency measures how well the data center's square footage is used and how efficient its layout is; it is defined as racks per thousand square feet of electrically active floor, so it is a measure of space efficiency.
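
As a quick illustration of how these space metrics relate, here is a minimal sketch; the rack count, per-rack power and work-cell area are invented for the example and do not come from the presentation:

```python
# Illustrative space-utilization arithmetic; all figures are assumptions.
racks = 100
watts_per_rack = 20_000            # W/rack for a hypothetical high-density room
work_cell_sqft = 28                # assumed sq ft per rack "work cell" (rack + aisle share)
active_floor_sqft = racks * work_cell_sqft

watts_per_sqft_work_cell = watts_per_rack / work_cell_sqft
layout_efficiency = racks / (active_floor_sqft / 1000)   # racks per 1,000 sq ft

print(f"{watts_per_sqft_work_cell:.0f} W per sq ft of work cell")
print(f"{layout_efficiency:.1f} racks per 1,000 sq ft of electrically active floor")
```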

Efficiency Metrics: Computation
MFlops/Watt and GFlops/node are important metrics that should be taken into consideration, especially when choosing processors. Together with the space utilization metrics, they give us an idea of how much space we will need for a given computational requirement.
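
A sketch of that sizing exercise, assuming hypothetical per-node and per-rack figures (none of them from the presentation), to turn a performance target into node count, IT power and floor area:

```python
# Turn a peak-performance target into rough space and power estimates.
# Every constant below is an assumption used only for illustration.
target_pflops = 1.0                # desired peak performance
gflops_per_node = 500.0            # assumed node peak
watts_per_node = 350.0             # assumed node power draw
nodes_per_rack = 72                # assumed packing density
work_cell_sqft = 28.0              # assumed sq ft per rack work cell

nodes = target_pflops * 1e6 / gflops_per_node
racks = -(-nodes // nodes_per_rack)              # ceiling division
it_power_mw = nodes * watts_per_node / 1e6
floor_sqft = racks * work_cell_sqft
gflops_per_watt = gflops_per_node / watts_per_node

print(f"{nodes:.0f} nodes in {racks:.0f} racks, {it_power_mw:.2f} MW of IT power, "
      f"~{floor_sqft:.0f} sq ft of work cells, {gflops_per_watt:.2f} GFlops/W")
```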

Low vs. High Density DC (Comparison by Intel in 2007) [figure slides]

Low vs. High Density DC (Comparison by Intel in 2007)
High density data centers require higher raised floors and a more careful design, but the reduced area and improved cooling efficiency make up for it by significantly lowering capital and operational costs. As a result, high density data centers have a lower TCO.
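
To put simple numbers on the area argument (the densities and work-cell area below are assumptions, not the figures from the Intel comparison):

```python
# Floor area needed for the same total IT load at two rack densities.
# All figures are illustrative assumptions.

def floor_area_sqft(total_it_kw: float, kw_per_rack: float,
                    work_cell_sqft: float = 28.0) -> float:
    racks = -(-total_it_kw // kw_per_rack)       # ceiling division
    return racks * work_cell_sqft

total_it_kw = 1000.0
for kw_per_rack in (5.0, 20.0):                  # low vs. high density
    area = floor_area_sqft(total_it_kw, kw_per_rack)
    print(f"{kw_per_rack:.0f} kW/rack -> ~{area:,.0f} sq ft of work cells")
```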

Liquid Cooling vs. Air Cooling (A Comparison by Eurotech) [figure slides]

Liquid Cooling vs. Air Cooling (A Comparison by Eurotech)
Liquid cooling allows for higher density data centers, making possible a higher Watts/rack ratio and better layout efficiency. It removes the need for CRACs and other electrically active equipment not used for computation, but it incurs additional capital costs for plumbing and pipes. It also achieves higher reliability by removing vibration and moving parts.

Final Notes
It is important to consider the space needed for the data room when planning a DC or HPC system. The cooling system is a decisive factor both for calculating the needed space and for anticipating the kind of infrastructure required to support the systems. Traditional air cooling approaches underperform in most benchmarks when compared with technologies like warm water cooling or liquid submersion cooling. There are other cooling possibilities not mentioned in this presentation, such as using naturally occurring cold water (CSCS, the Swiss National Supercomputing Centre). In recent years there have also been efforts to build ARM-based HPC systems, which could theoretically achieve much better MFlops/Watt ratios than current systems.