The Effect of Data Centre Environment on IT Reliability & Energy Consumption




The Effect of Data Centre Environment on IT Reliability & Energy Consumption
Steve Strutt, EMEA Technical Work Group Member, IBM
The Green Grid EMEA Technical Forum 2011

Agenda
- History of IT environmental operating ranges
- Increasing data centre efficiency
- Concerns with operating at higher temperatures
- IT reliability and temperature
- Using the ASHRAE classes
- Exploiting the allowable range
- IT reliability in the allowable range
- Higher operating ranges: ASHRAE 2011 classes
- Effect on chiller hours
- Implications: IT hardware design, data centre design
- Further information

History of IT Operating Ranges
- 20°C was long considered the optimal operating temperature (originally intended for punched cards?)
- ASHRAE TC 9.9, Mission Critical Facilities, Technology Spaces, and Electronic Equipment: guidance issued in 2004 and 2008

                       Recommended                    Allowable
  Year                 2004           2008            2004           2008
  Temperature range    20°C - 25°C    18°C - 27°C     15°C - 32°C    10°C - 35°C
  Moisture range       40% - 55% RH   5.5°C DP to     20% - 80% RH   20% - 80% RH
                                      60% RH

Source: ASHRAE Whitepaper, 2011 Thermal Guidelines for Data Processing Environments

Increasing Data Centre Efficiency
- Increasing use of economizers
  - Reduce costs by increasing the number of hours that economizers can be used
  - Direct air cooled, indirect air cooled, indirect water cooled
  - Reduce the size of the chiller plant, or remove it altogether
- Higher return temperatures to increase cooling efficiency
  - Contained-aisle solutions
  - Stop air mixing and recirculation
  - Elimination of hot spots
- Yet little change in operating temperatures at most data centres
  - Poor communication between facilities and IT
  - Concern about business impact
  - The effect of temperature and humidity on reliability
  - Industry support and warranties

Industry Change
- Rise of cloud computing as a new IT delivery model
  - Early adopters demonstrated the viability of higher temperatures
  - A different approach to systems availability: reliability is handled at the software and application layer, whereas the enterprise IT model assumes high infrastructure reliability
- Intel study
  - A traditional DC with supply temperatures of approximately 20°C saw a failure rate of 2.5% to 3.8% over twelve months
  - In a similar data centre using an air-side economizer, with temperatures occasionally ranging up to 35°C, the failure rate was 4.5%
  - Note: no classification of the hardware failures was performed
  http://www.intel.com/content/www/us/en/data-center-efficiency/data-center-efficiency-xeon-reducing-data-center-cost-with-air-economizer-brief.html

Concerns With Higher Operating Temps
- Lack of data on how reliability changes with temperature
- Increasing server energy consumption with temperature
  - Server fan power
  - Silicon leakage current
- Older equipment
- Rapid change in environmental support in the last 4 years; current models support higher ranges:
  - Blades: Cisco UCS 10-35°C, 10%-90% RH; IBM BladeCenter HS22 10-35°C, 8%-80% RH; HP BL490c G6 10-35°C, 10%-90% RH
  - Switches: Cisco Nexus 5000 0-40°C, 5%-90% RH
  - Storage: EMC Symmetrix V-MAX 10-32°C, 20%-80% RH; NetApp 10-40°C, 20%-80% RH
Source: information from vendors' websites

IT Reliability and Temperature
[Chart: relative server failure rate with temperature, normalised to 1.0 at 20°C, for continuous operational temperatures from 15°C to 45°C. Source: ASHRAE Whitepaper, 2011 Thermal Guidelines for Data Processing Environments]
- Manufacturers' data in the ASHRAE 2011 guidance shows a moderately increasing failure rate with temperature
- Limited-duration operation above 20°C has a marginal impact on the overall failure rate
- Operation below 20°C reduces the failure rate

Power Consumption With Temperature
[Charts: fan power consumption (W) and component temperature against inlet temperature (°C), source: ASHRAE 2008 Thermal Guidelines for Data Processing Environments; relative server power increase, compared to a 15°C inlet temperature, against operational temperature (°C), source: ASHRAE Whitepaper, 2011 Thermal Guidelines for Data Processing Environments]
- Fan power consumption increases as the cube of the airflow rate
- Large fans consume less energy and create less noise than small fans for the same airflow

Higher Inlet Temperatures
- Power: increases by up to 20%
- Airflow: can double by 35°C
  - Requires improved airflow design
  - Increased failure rate if equipment is starved of air
- Noise: increases as the 5th power of fan rotational speed, and can exceed health and safety guidelines
- Exhaust temperature: can be 20°C more than inlet, and can exceed health and safety guidelines
[Chart: air flow increase with ambient temperature. Source: ASHRAE Whitepaper, 2011 Thermal Guidelines for Data Processing Environments]
The fan scaling laws behind these figures are sketched below.
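To make the cube law from the previous slide and the figures above concrete, here is a minimal Python sketch of the standard fan affinity laws. It is not from the presentation, and the baseline wattage and noise figures are invented; for a given fan, airflow scales linearly with rotational speed, so the airflow ratio and speed ratio are interchangeable.

```python
import math

def fan_power(base_watts: float, airflow_ratio: float) -> float:
    """Fan affinity law: power scales with the cube of airflow."""
    return base_watts * airflow_ratio ** 3

def fan_noise(base_dba: float, speed_ratio: float) -> float:
    """Noise rises roughly as the 5th power of rotational speed,
    i.e. +50*log10(ratio) decibels."""
    return base_dba + 50 * math.log10(speed_ratio)

# Hypothetical 1U server fan pack: 10 W and 55 dB(A) at a 20°C inlet.
for ratio in (1.0, 1.25, 1.5, 2.0):  # airflow relative to the 20°C baseline
    print(f"airflow x{ratio:.2f}: {fan_power(10.0, ratio):5.1f} W, "
          f"{fan_noise(55.0, ratio):4.1f} dB(A)")
```

Doubling the airflow multiplies fan power by eight, which is how fans that draw only a few percent of server power at 20°C can push total server power up by as much as 20% at 35°C.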

Using the ASHRAE Classes
- The Green Grid has identified that much of Europe can benefit from air-side economizers while remaining within ASHRAE Class 2
- Recommended upper guideline of 27°C
[Map: hours per year when the dry-bulb temperature is less than or equal to 27°C]
A sketch of the hour-counting calculation follows below.
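The map is derived from ordinary hourly weather observations. As a rough illustration of the counting involved (an assumption on my part, not The Green Grid's published methodology; the CSV format and file name are hypothetical):

```python
import csv

def economizer_hours(weather_csv: str, threshold_c: float = 27.0) -> int:
    """Count hours per year at or below a dry-bulb threshold.

    Assumes a CSV with one row per hour and a 'dry_bulb_c' column;
    real TMY/EPW weather files would need a proper parser.
    """
    with open(weather_csv, newline="") as f:
        return sum(1 for row in csv.DictReader(f)
                   if float(row["dry_bulb_c"]) <= threshold_c)

# e.g. economizer_hours("london_hourly.csv"), out of 8760 hours in a year
```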

Exploiting the Allowable Range
- ASHRAE Class 2: recommended range 18°C to 27°C, allowable range 10°C to 35°C
[Charts: economised air temperature by percentage of time for major cities (Oslo, London, Frankfurt, Milan, Sydney, Rome, Tokyo), binned into 15-20°C, 20-25°C, 25-30°C and 30-35°C bands, with a subset detail of the warmer bands]

Reliability within Allowable Range
[Charts: relative failure rate by temperature (as above, normalised to 1.0 at 20°C); economised air temperature by percentage of time for major cities; and the resulting relative failure rate for major cities operating within the ASHRAE allowable range: Oslo, London, Frankfurt, Mexico City, Milan, Sydney, Rome, Tokyo, Sao Paolo, San Jose (Costa Rica), Hong Kong, Bangalore and Singapore, spanning roughly 0.6 to 1.4 on the failure-rate axis]
The per-city figure is the failure rate for each temperature band weighted by the fraction of the year spent in that band, as sketched below.
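A minimal sketch of that weighting, assuming illustrative time fractions for one cool-climate city; the x-factors approximate the ASHRAE 2011 failure-rate curve shown earlier (1.0 at 20°C):

```python
# Approximate ASHRAE 2011 relative failure rates ("x-factors") per band.
X_FACTOR = {"15-20C": 0.87, "20-25C": 1.13, "25-30C": 1.34, "30-35C": 1.54}

def weighted_failure_rate(time_fractions: dict) -> float:
    """Net relative failure rate: sum over temperature bands of
    (fraction of the year in the band) * (band's relative failure rate)."""
    assert abs(sum(time_fractions.values()) - 1.0) < 1e-6
    return sum(frac * X_FACTOR[band] for band, frac in time_fractions.items())

# Invented profile for a northern-European site: mostly cool supply air.
print(weighted_failure_rate(
    {"15-20C": 0.70, "20-25C": 0.25, "25-30C": 0.04, "30-35C": 0.01}))
# ~0.96: the many cool hours offset the brief warm spells, consistent
# with the slide's point that cool-climate cities can land below 1.0.
```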

Higher Operating Ranges
- ETSI EN 300 019 Class 3.1 (NEBS): operation up to 45°C
  - Small number of server models available
  - Usually limited performance compared to volume models
  - Higher cost
- ASHRAE 2011 guidelines
[Chart: comparison of the ASHRAE 2008 and 2011 classes. Source: ASHRAE Whitepaper, 2011 Thermal Guidelines for Data Processing Environments]

ASHRAE Expanded Data Centre Classes
Equipment environmental specifications (product operation):

  Class       Dry-Bulb      Humidity Range        Max Dew     Max Elev.  Max Rate of
              Temp (°C)     (non-condensing)      Point (°C)  (m)        Change (°C/hr)
  Recommended
  A1 to A4    18 to 27      5.5°C DP to 60% RH
                            and 15°C DP
  Allowable
  A1          15 to 32      20 to 80% RH          17          3040       5/36
  A2          10 to 35      20 to 80% RH          21          3040       5/36
  A3          5 to 40       8 to 85% RH           24          3040       5/36
  A4          5 to 45       8 to 90% RH           24          3040       5/36
  B           5 to 35       8 to 85% RH           28          3040       NA
  C           5 to 40       8 to 85% RH           28          3040       NA

Source: ASHRAE Whitepaper, 2011 Thermal Guidelines for Data Processing Environments (reformatted)
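As a practical illustration of how these envelopes get used in monitoring, here is a minimal sketch (assumed, not from the presentation) that checks a sensor reading against the allowable dry-bulb and humidity limits in the table above; the dew-point and elevation limits are omitted for brevity.

```python
# Allowable dry-bulb (°C) and relative-humidity (%) envelopes from the
# ASHRAE 2011 table above; dew-point/elevation checks omitted for brevity.
ALLOWABLE = {
    "A1": ((15, 32), (20, 80)),
    "A2": ((10, 35), (20, 80)),
    "A3": ((5, 40), (8, 85)),
    "A4": ((5, 45), (8, 90)),
}

def within_class(cls: str, dry_bulb_c: float, rh_pct: float) -> bool:
    """True if the reading sits inside the class's allowable envelope."""
    (t_lo, t_hi), (rh_lo, rh_hi) = ALLOWABLE[cls]
    return t_lo <= dry_bulb_c <= t_hi and rh_lo <= rh_pct <= rh_hi

print(within_class("A2", 33.5, 45))  # True: inside the A2 envelope
print(within_class("A1", 33.5, 45))  # False: above A1's 32°C limit
```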

Chiller Hours When Using Economizers
[Chart: number of hours per year of chiller operation required with air-side and water-side economizers, for each ASHRAE class. Source: ASHRAE Whitepaper, 2011 Thermal Guidelines for Data Processing Environments]

Chiller Hours Using Closed-Loop Cooling
[Chart: number of hours per year of chiller operation required for a water-side economizer with a dry-cooler type tower, for each ASHRAE class. Source: ASHRAE Whitepaper, 2011 Thermal Guidelines for Data Processing Environments]
The arithmetic behind such charts is sketched below.
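The arithmetic is again an hourly count over weather data: the chiller must run in any hour where outdoor air plus the economizer's approach temperature cannot meet the required supply temperature. A simplified sketch, with the setpoint and approach values as stated assumptions:

```python
def chiller_hours(hourly_dry_bulb_c, supply_setpoint_c=18.0, approach_c=6.0):
    """Hours per year the chiller must run: free cooling fails whenever
    outdoor temperature + heat-exchanger approach exceeds the supply
    setpoint. Simplified: ignores humidity limits and partial economization."""
    return sum(1 for t in hourly_dry_bulb_c
               if t + approach_c > supply_setpoint_c)

# Raising the supply setpoint (e.g. towards a class A2/A3 limit) directly
# removes chiller hours, which is the effect the ASHRAE charts quantify.
```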

Implications for Design of IT Equipment
- Lower temperature differential (delta T) across hot components
  - Intel processors typically run at 60°C
  - Greater airflow required
  - Larger heat sinks
- New server form factors
  - Heat dissipation of 1U servers is limited by fan and heatsink size
  - Reduced fan power consumption and noise
- Expectations of warranty and reliability
- Costs of classes A3/A4 compared to A1/A2
- Future: fan-less servers and water cooling

Implications for Design of Data Centres
- Higher data centre airflow
  - Temperature and noise in the hot aisle
  - Humidity and corrosion effects
- Are chillers required?
  - Ride-through time on cooling plant failure
- PUE anomalies due to server fan power at high temperatures (illustrated below)
- Redundant IT service design and workload migration
  - Requires a holistic approach to IT and the DC
- Water cooling
- Server replacement costs are lower than investment costs in plant and electricity
- IT service level requirements
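The PUE anomaly above is worth a worked example: server fans count as IT load, so higher inlet temperatures can improve PUE even while total energy consumption barely falls. All numbers below are invented for illustration:

```python
def pue(facility_overhead_kw: float, it_kw: float) -> float:
    """PUE = total facility power / IT equipment power."""
    return (facility_overhead_kw + it_kw) / it_kw

# Raising inlet temperatures: cooling overhead drops 400 -> 250 kW, but
# server fans (counted inside the IT load) push IT power 1000 -> 1100 kW.
print(pue(400, 1000))  # 1.40 before
print(pue(250, 1100))  # ~1.23 after: a "better" PUE, yet total power
                       # only fell from 1400 kW to 1350 kW.
```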

Further Information
- 2011 Thermal Guidelines for Data Processing Environments: Expanded Data Centre Classes and Usage Guidance
  http://tc99.ashraetcs.org/documents/ashrae Whitepaper - 2011 Thermal Guidelines for Data Processing Environments.pdf
- Deutsche Bank "Banking on Fresh Air" eco data centre, New York
  http://www.banking-on-green.com/en/content/news_3604.html
- The Green Grid library and tools
  http://www.thegreengrid.org/library-and-tools.aspx
- British Computer Society: IT environmental range and data centre cooling analysis
  http://dcsg.bcs.org/data-centre-cooling-analysis

Thank you for attending! Questions? The Green Grid EMEA Technical Forum 2011