Trends in Data Center Design: ASHRAE Leads the Way to Large Energy Savings




Trends in Data Center Design: ASHRAE Leads the Way to Large Energy Savings
ASHRAE Conference, Denver. Otto Van Geet, PE. June 24, 2014. NREL/PR-6A40-58902
NREL is a national laboratory of the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, operated by the Alliance for Sustainable Energy, LLC.

2 Data Center Energy
Data centers are energy-intensive facilities:
- 10-100x more energy intensive than an office
- Server racks well in excess of 30 kW
- Surging demand for data storage
- EPA estimate: 3% of U.S. electricity
- Power and cooling constraints in existing facilities
Data center inefficiency steals power that would otherwise support compute capability.
Photo by Steve Hammond, NREL

3 Best Practices Guide (BPG) Table of Contents
- Summary
- Background
- Information Technology Systems
- Environmental Conditions
- Air Management
- Cooling Systems
- Electrical Systems
- Other Opportunities for Energy-Efficient Design
- Data Center Metrics & Benchmarking

4 Safe Temperature Limits
- CPUs: ~65°C (149°F)
- Memory: ~85°C (185°F)
- GPUs: ~75°C (167°F)
CPU, GPU, and memory represent ~75-90% of the heat load.
Photo by Steve Hammond, NREL

Environmental Conditions
Data center equipment's environmental conditions should fall within the ranges established by ASHRAE as published in the Thermal Guidelines.

Environmental specifications (°F, at equipment intake):
                            Recommended                      Allowable
Temperature (ASHRAE)        65-80°F                          59-90°F (A1) to 41-113°F (A4)
Humidity (ASHRAE)           42°F DP to 60% RH or 59°F DP     20%-80% RH & 63°F DP max

ASHRAE Reference: ASHRAE (2008), (2011)
5
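A minimal sketch (not part of the presentation) of how a monitoring script might classify a measured server-inlet condition against the ranges summarized above. The limits are the rounded numbers from this slide, and the dew-point limits are omitted for brevity; consult the ASHRAE Thermal Guidelines for the exact published values.

```python
# Classify an inlet reading against the (rounded) ASHRAE ranges on this slide.
def classify_inlet(dry_bulb_f: float, rh_percent: float) -> str:
    """Return 'recommended', 'allowable (A1)', or 'out of range'."""
    in_recommended = 65.0 <= dry_bulb_f <= 80.0
    in_allowable_a1 = 59.0 <= dry_bulb_f <= 90.0
    rh_ok = 20.0 <= rh_percent <= 80.0   # dew-point limits not modeled here

    if in_recommended and rh_ok:
        return "recommended"
    if in_allowable_a1 and rh_ok:
        return "allowable (A1)"
    return "out of range"

print(classify_inlet(72.0, 45.0))   # -> recommended
print(classify_inlet(86.0, 35.0))   # -> allowable (A1)
```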

6 Equipment Environmental Specification
- The air inlet to IT equipment is the important specification to meet.
- Outlet temperature is not important to the IT equipment.

7 Key Nomenclature
- Recommended Range (statement of reliability): preferred facility operation; most values should be within this range.
- Allowable Range (statement of functionality): robustness of equipment; no values should be outside this range.
[Figure: rack intake temperature bands, from min allowable (under-temp) through the recommended range to max allowable (over-temp)]

8 2011 ASHRAE Thermal Guidelines
"2011 Thermal Guidelines for Data Processing Environments - Expanded Data Center Classes and Usage Guidance," white paper prepared by ASHRAE Technical Committee TC 9.9.

9 2011 ASHRAE Allowable Ranges
[Chart: allowable dry-bulb temperature ranges for the 2011 ASHRAE data center classes]

10 Psychrometric Bin Analysis
[Figure: psychrometric chart of Boulder, Colorado TMY3 weather data with the Class 1 recommended and allowable ranges overlaid; x-axis: dry-bulb temperature (°F), y-axis: humidity ratio (lb water/lb dry air), with relative humidity lines. Design conditions (0.4%): 91.2°F dry bulb, 60.6°F wet bulb.]

Estimated Savings
Baseline system: DX cooling with no economizer. Load: 1 ton of cooling, constant year-round. Efficiency (COP): 3. Total energy: 10,270 kWh/yr.

                                                 Recommended range        Allowable range
Results                                          Hours     Energy (kWh)   Hours     Energy (kWh)
Zone 1: DX cooling only                          25        8              2         1
Zone 2: Multistage indirect evap. + DX (H80)     26        16             4         3
Zone 3: Multistage indirect evap. only           3         1              0         0
Zone 4: Evap. cooler only                        867       97             510       57
Zone 5: Evap. cooler + economizer                6,055     417            1,656     99
Zone 6: Economizer only                          994       0              4,079     0
Zone 7: 100% outside air                         790       0              2,509     0
Total                                            8,760     538            8,760     160
Estimated % savings                                        95%                      98%
11
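A minimal sketch (not from the slide) showing how the bottom rows of the table follow from the per-zone bin results; the numbers are the slide's allowable-range column, and the 10,270 kWh/yr baseline is the DX-only system.

```python
# Per-zone bin results for the allowable range: zone -> (hours, kWh)
allowable_zones = {
    "DX only":                   (2,    1),
    "Indirect evap. + DX":       (4,    3),
    "Indirect evap. only":       (0,    0),
    "Evap. cooler only":         (510,  57),
    "Evap. cooler + economizer": (1656, 99),
    "Economizer only":           (4079, 0),
    "100% outside air":          (2509, 0),
}

baseline_kwh = 10_270
total_hours = sum(h for h, _ in allowable_zones.values())
total_kwh = sum(e for _, e in allowable_zones.values())
savings = 1 - total_kwh / baseline_kwh

print(total_hours)               # 8760 hours in a year
print(total_kwh)                 # 160 kWh/yr
print(f"{savings:.0%} savings")  # ~98% vs. the DX-only baseline
```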

Energy Savings Potential: Economizer Cooling
[Map: energy savings potential for the recommended envelope, Stage 1: economizer cooling. Source: Billy Roberts, NREL]
12

Energy Savings Potential: Economizer + Direct Evaporative Cooling
[Map: energy savings potential for the recommended envelope, Stage 2: economizer + direct evaporative cooling. Source: Billy Roberts, NREL]
13

Energy Savings Potential: Economizer + Direct Evap. + Multistage Indirect Evap. Cooling
[Map: energy savings potential for the recommended envelope, Stage 3: economizer + direct evap. + multistage indirect evap. cooling. Source: Billy Roberts, NREL]
14

15 Improve Air Management
- Typically, more air is circulated than required.
- Air mixing and short circuiting lead to low supply temperature and low delta-T.
- Use hot and cold aisles.
- Improve isolation of hot and cold aisles to reduce fan energy, improve air-conditioning efficiency, and increase cooling capacity.
A hot-aisle/cold-aisle configuration decreases mixing of intake and exhaust air, promoting efficiency.
Source: http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf

16 Isolate Cold and Hot Aisles
[Figure: with isolation, 95-105°F vs. 60-70°F on the hot side and 70-80°F vs. 45-55°F on the cold side]
Source: http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf

17 Adding Air Curtains for Hot/Cold Isolation Photo used with permission from the National Snow and Ice Data Center. http://www.nrel.gov/docs/fy12osti/53939.pdf

Move to Liquid Cooling
- Server fans are inefficient and noisy.
- Liquid doors are an improvement, but we can do better!
- Power densities are rising, making component-level liquid cooling solutions more appropriate.
Liquid benefits:
- Thermal stability, reduced component failures.
- Better waste-heat reuse options.
- Warm-water cooling: reduce/eliminate condensation.
- Provide cooling with higher-temperature coolant.
- Eliminate expensive and inefficient chillers.
- Save wasted fan energy and use it for computing.
- Unlock your cores and overclock to increase throughput!
18

Liquid Cooling Overview
- Water and other liquids (dielectrics, glycols, and refrigerants) may be used for heat removal.
- Liquids typically use LESS transport energy (14.36 air-to-water horsepower ratio in the example below).
- Liquid-to-liquid heat exchangers have closer approach temperatures than liquid-to-air (coils), yielding increased economizer hours.
19
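A minimal sketch (assumptions mine, not from the slide) of why liquid transport energy is so much lower: the same heat load requires far less volumetric flow in water than in air. The pressure drops, delta-Ts, and mover efficiencies below are illustrative placeholders, so the ratio comes out near, but not exactly at, the 14.36 figure quoted on the slide.

```python
# Transport power = volumetric flow * pressure rise / mover efficiency,
# with the flow set by Q = m_dot * cp * dT.
def transport_power_kw(q_kw, cp_kj_per_kg_k, density_kg_m3, delta_t_k,
                       pressure_pa, efficiency):
    """Fan/pump power (kW) to move fluid carrying q_kw of heat at a delta_t_k rise."""
    mass_flow = q_kw / (cp_kj_per_kg_k * delta_t_k)        # kg/s
    volume_flow = mass_flow / density_kg_m3                 # m^3/s
    return volume_flow * pressure_pa / efficiency / 1000.0  # kW

q = 100.0  # kW of IT heat to remove (illustrative)

air_kw = transport_power_kw(q, cp_kj_per_kg_k=1.0, density_kg_m3=1.2,
                            delta_t_k=11.0, pressure_pa=750.0, efficiency=0.6)
water_kw = transport_power_kw(q, cp_kj_per_kg_k=4.18, density_kg_m3=998.0,
                              delta_t_k=7.0, pressure_pa=150_000.0, efficiency=0.7)

print(f"air fan power   ~{air_kw:.2f} kW")
print(f"water pump power ~{water_kw:.2f} kW")
print(f"air/water ratio  ~{air_kw / water_kw:.1f}")
```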

20 Liquid Cooling: New Considerations
Air cooling concerns: humidity, fan failures, air-side economizers and particulates.
Liquid cooling concerns: pH and bacteria, dissolved solids, corrosion inhibitors, etc.
When considering liquid-cooled systems, insist that providers adhere to the latest ASHRAE water quality spec, or it could be costly.

2011 ASHRAE Liquid Cooling Guidelines 21

22 2011 ASHRAE Liquid Cooling Guidelines
NREL ESIF HPC (HP hardware) uses 75°F supply and 113°F return water (W4/W5).

Courtesy of Henry Coles, Lawrence Berkeley National Laboratory

Three Cooling Device Categories
1 - Rack Cooler (rack containment): APC (water), Knürr CoolTherm (water), Knürr CoolLoop (water), Rittal (water)
2 - Row Cooler (row containment): APC (2*) (water), Liebert (refrigerant)
3 - Passive Door Cooler: IBM (water), Vette/Coolcentric (water), Liebert (refrigerant), SUN (refrigerant)
[Diagrams of each configuration showing server fronts, IT equipment racks, and cooling water connections]
Courtesy of Henry Coles, Lawrence Berkeley National Laboratory
24

25 Chill-off 2: Evaluation of Close-coupled Cooling Solutions
[Chart comparing close-coupled cooling solutions; lower values indicate less energy use]
Courtesy of Geoffrey Bell and Henry Coles, Lawrence Berkeley National Laboratory

Cooling Takeaways
- Use a central plant (e.g., chiller/CRAHs vs. CRAC units).
- Use centralized controls on CRAC/CRAH units to prevent simultaneous humidifying and dehumidifying.
- Move to liquid cooling (room, row, rack, chip).
- Consider VSDs on fans, pumps, chillers, and towers.
- Use air- or water-side economizers.
- Expand the humidity range and improve humidity control (or disconnect it).
26

Data Center Efficiency Metric
Power Usage Effectiveness (PUE) is an industry-standard data center efficiency metric: the ratio of total facility power (IT power plus the power used or lost by facility infrastructure: pumps, lights, fans, conversions, UPS, etc.) to the power used by the IT equipment.

PUE = (IT power + Facility power) / IT power

It is not perfect, and some folks play games with it. A 2011 survey estimates the industry average is 1.8: in a typical data center, nearly half of the power goes to things other than compute capability.
27
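A minimal sketch (example loads are mine) computing PUE from metered power: 1,000 kW of IT load with 800 kW of infrastructure load gives the 1.8 industry-average figure quoted above.

```python
# PUE = total facility power / IT power
def pue(it_kw: float, facility_infrastructure_kw: float) -> float:
    return (it_kw + facility_infrastructure_kw) / it_kw

print(pue(1000.0, 800.0))   # 1.8  (industry average per the 2011 survey)
print(pue(1000.0, 60.0))    # 1.06 (the NREL HPC data center value cited later)
```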

PUE Simple and Effective 28

29 Data Center PUE
[Chart: data center PUE plotted against outdoor temperature (°F); PUE axis from 0.75 to 1.45, temperature axis from -20 to 100°F. NREL PIX 17828]

"I am re-using waste heat from my data center on another part of my site and my PUE is 0.8!" http://www.thegreengrid.org/en/global/content/white-papers/ere

ASHRAE and friends (DOE, EPA, TGG, 7x24, etc.) do not allow reused energy in PUE, and PUE is always >1.0. Another metric has been developed by The Green Grid and partners: ERE, Energy Reuse Effectiveness. http://www.thegreengrid.org/en/global/content/white-papers/ere

33 ERE Adds Energy Reuse
[Diagram: energy flow from the utility through UPS and PDU to IT, with cooling, rejected energy, and reused energy paths labeled (a)-(g)]
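A minimal sketch (example numbers are mine) of the ERE metric defined in The Green Grid white paper linked above: ERE = (total facility energy - reused energy) / IT energy. Unlike PUE, ERE credits waste heat that is reused elsewhere, so it can fall below 1.0; the second example is chosen to reproduce the RSF values on the next slide.

```python
# ERE = (total facility energy - reused energy) / IT energy
def ere(it_kwh: float, infrastructure_kwh: float, reused_kwh: float) -> float:
    total = it_kwh + infrastructure_kwh
    return (total - reused_kwh) / it_kwh

print(ere(1000.0, 180.0, 0.0))    # 1.18 with no reuse (equal to PUE)
print(ere(1000.0, 180.0, 280.0))  # 0.90 when enough waste heat is reused
```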

34 DOE/NREL Research Support Facility
- More than 1,300 people in DOE office space on NREL's campus; 360,000 ft²
- Design/build process with required energy goals: 25 kBtu/ft², 50% energy savings, LEED Platinum
- Replicable: process, technologies, cost
- Site, source, carbon, and cost ZEB:B; includes plug loads and data center
- Firm fixed price: $246/ft² construction cost (not including $27/ft² for PV from PPA/ARRA)
- Opened June 10, 2010 (first phase)
Credit: Haselden Construction

35 RSF Datacenter
- Fully contained hot aisle: custom aisle floor and door seals
- Ensure equipment is designed for cold-aisle containment and installed to pull cold air, not hot air
- 1.18 annual PUE; ERE = 0.9
- Hot aisle controlled on a return temperature of ~90°F; waste heat used to heat the building
- Economizer and evaporative cooling; low fan energy design
- 1,900 ft²
NREL PIX 25417. Credit: Marjorie Schott/NREL

36 NREL HPC Data Center
"Chips to bricks" approach: leveraged expertise in energy-efficient buildings to focus on a showcase data center.
Showcase facility:
- 10 MW, 10,000 ft²
- Leverage favorable climate; use evaporative rather than mechanical cooling
- Waste heat captured and used to heat labs and offices
- World's most energy-efficient data center, PUE 1.06!
- Lower CapEx and OpEx
High-performance computing:
- Petascale+ HPC capability in 2013
- 20-year planning horizon, 5 to 6 HPC generations
Insight Center:
- Scientific data visualization
- Collaboration and interaction

37 Critical Data Center Specs
- Warm-water cooling, 75°F (24°C): water is a much better working fluid than air; pumps trump fans.
- Utilize high-quality waste heat, 95°F (35°C) or warmer; 90%+ of the IT heat load to liquid.
- High-voltage power distribution (480 VAC); eliminate conversions.
- Think outside the box: don't be satisfied with an energy-efficient data center nestled on a campus surrounded by inefficient laboratory and office buildings. Innovate, integrate, optimize.
Dashboards report instantaneous, seasonal, and cumulative PUE values.
Photo by Steve Hammond, NREL

38 Water Considerations
"We shouldn't use evaporative cooling, water is scarce."
- Thermoelectric power generation (coal, oil, natural gas, and nuclear) consumes about 1.1 gallons per kWh on average; this amounts to about 9.5M gallons per MW-year.
- We estimate about 2.0M gallons of water consumed per MW-year for on-site evaporative cooling at NREL.
- If chillers need 0.2 MW per MW of HPC power, then chillers have an impact of 2.375M gallons per year per MW.
- Actuals will depend on your site, but evaporative cooling doesn't necessarily result in a net increase in water use.
NREL PIX 17828
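A back-of-the-envelope sketch (my arithmetic, not from the slide) of the comparison above: the water embedded in grid electricity for a continuous 1 MW load, versus the slide's estimates for on-site evaporative cooling and for the extra chiller electricity.

```python
# Water embedded in grid electricity: ~1.1 gal/kWh for thermoelectric generation.
HOURS_PER_YEAR = 8760
GAL_PER_KWH = 1.1

indirect_gal_per_mw_year = GAL_PER_KWH * 1000 * HOURS_PER_YEAR
print(f"{indirect_gal_per_mw_year/1e6:.1f}M gal/MW-yr embedded in the electricity")
# -> ~9.6M gal/MW-yr (the slide rounds to 9.5M)

# On-site direct evaporative cooling at NREL (slide estimate): ~2.0M gal/MW-yr.
evap_onsite = 2.0e6

# If a chiller plant instead adds ~0.2 MW of electrical load per MW of IT, the
# water embedded in that extra electricity alone is:
chiller_indirect = 0.2 * indirect_gal_per_mw_year
print(f"{chiller_indirect/1e6:.1f}M gal/MW-yr embedded in chiller electricity")
# -> ~1.9M gal/MW-yr; the slide's 2.375M figure is of the same order.

print(f"evap on-site vs chiller path: {evap_onsite/1e6:.1f}M vs {chiller_indirect/1e6:.1f}M+ gal/MW-yr")
```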

Data Center Energy Efficiency
ASHRAE 90.1-2011 requires an economizer in most data centers.
ASHRAE Standard 90.4P, Energy Standard for Data Centers and Telecommunications Buildings:
- PURPOSE: to establish the minimum energy efficiency requirements of data centers and telecommunications buildings for design, construction, and a plan for operation and maintenance.
- SCOPE: applies to new data centers and telecommunications buildings, new additions, and modifications to such buildings or portions thereof and their systems.
- Will set a minimum PUE based on climate.
More details at https://www.ashrae.org/news/2013/ashrae-seeksinput-on-revisions-to-data-centers-in-90-1-energy-standard-scope
39

40 Energy Conservation Measures
1. Reduce the IT load: virtualization and consolidation (up to 80% reduction).
2. Implement a contained hot-aisle/cold-aisle layout: curtains, equipment configuration, blank panels, cable entrance/exit ports.
3. Install an economizer (air or water) and evaporative cooling (direct or indirect).
4. Raise the discharge air temperature. Install VFDs on all computer room air conditioning (CRAC) fans (if used) and network the controls.
5. Reuse data center waste heat if possible.
6. Raise the chilled-water set-point (if used). Increasing chilled-water temperature by 1°F reduces chiller energy use by about 1.4% (see the sketch after this list).
7. Install high-efficiency equipment including UPS, power supplies, etc.
8. Move chilled water as close to the server as possible (direct liquid cooling).
9. Consider a centralized high-efficiency water-cooled chiller plant: air-cooled = 2.9 COP, water-cooled = 7.8 COP.
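A minimal sketch (my illustration of items 6 and 9, not from the slide) applying the 1.4%-per-°F rule of thumb for chilled-water reset and comparing air-cooled vs. water-cooled chiller energy at the COPs quoted above; the baseline and cooling-load figures are placeholders.

```python
# Item 6: ~1.4% chiller energy savings per 1 degree F of chilled-water set-point increase.
def chiller_energy_after_reset(baseline_kwh: float, setpoint_increase_f: float,
                               savings_per_deg: float = 0.014) -> float:
    """Apply the rule of thumb as a compounding reduction per degree."""
    return baseline_kwh * (1.0 - savings_per_deg) ** setpoint_increase_f

# Item 9: electrical energy needed to deliver a given cooling load at a given COP.
def chiller_energy_kwh(cooling_load_kwh_thermal: float, cop: float) -> float:
    return cooling_load_kwh_thermal / cop

baseline = 500_000.0  # kWh/yr of chiller electricity (illustrative)
print(f"after +5F reset: {chiller_energy_after_reset(baseline, 5):,.0f} kWh/yr")  # ~7% lower

cooling_load = 2_000_000.0  # kWh (thermal) per year (illustrative)
print(f"air-cooled   (COP 2.9): {chiller_energy_kwh(cooling_load, 2.9):,.0f} kWh/yr")
print(f"water-cooled (COP 7.8): {chiller_energy_kwh(cooling_load, 7.8):,.0f} kWh/yr")
```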

QUESTIONS?
RSF II: 21 kBtu/ft², $246/ft² construction cost
Otto VanGeet, 303.384.7369, Otto.VanGeet@nrel.gov
41