Energy Efficient High-tech Buildings




Energy Efficient High-tech Buildings: Can anything be done to improve data center and cleanroom energy efficiency? Bill Tschudi, Lawrence Berkeley National Laboratory, wftschudi@lbl.gov

Acknowledgements: California Energy Commission; Pacific Gas and Electric Company; Federal Energy Management Program (FEMP); 7 x 24 Exchange; Uptime Institute; Rocky Mountain Institute; NYSERDA; E Source; Rumsey Engineers; EYP Mission Critical Facilities; industry partners (too many to name all).

Topics: data center efficiency benchmarking; a case study example; power conversions; selected opportunities for improvement.

We also operate data centers: LBNL runs the National Energy Research Scientific Computing (NERSC) supercomputer center in Oakland.

California energy research related to data centers: an energy research roadmap; case studies and energy benchmarking; best-practice identification; a self-benchmarking protocol; investigating the efficiency of power supplies in IT equipment; investigating the efficiency of UPS systems; metrics for computing performance vs. energy; technology transfer; demonstration projects.

Data center efficiency opportunity: industry professionals, LBNL, and others brainstormed efficiency improvements at an RMI charrette (2003). Practical (near-term) solutions as well as longer-term concepts were identified. Available through the RMI website: http://www.rmi.org/store/

Data center energy roadmap: input came through workshops, conferences, and contacts with industry professionals, plus participation in the design charrette facilitated by the Rocky Mountain Institute (RMI). Selected research areas are being pursued. Available through the LBNL website: http://datacenters.lbl.gov/docs/roadmapfinal.pdf

Why benchmark data centers? Utility load-growth planning; baselining energy use; system and component efficiency comparisons; identifying best practices using current technology; identifying areas where further work is required.

April 10, 2003, San Jose Mercury News: A new power plant is up and running in San Jose's Alviso neighborhood, but the massive Internet server farm that it was supposed to fuel is nowhere in sight. The Los Esteros Critical Energy Facility, a 180-megawatt plant built by Calpine in North San Jose, was designed to power an adjacent Internet server farm by U.S. Dataport. The server farm never broke ground -- and company officials didn't return calls Wednesday to say if or when it might -- but Calpine proceeded with the plant anyway, after securing a three-year deal with the state Department of Water Resources to buy power. Company and state officials say the plant is still needed, even though the state's infamous energy crunch of 2000-01 is long over. (For scale: 180 MW corresponds to 900,000 sq.ft. of server farm at 200 W/sq.ft.)
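
The closing figure is just the load arithmetic for a build-out at that density; as a quick check:

    \[ 900{,}000\ \mathrm{ft^2} \times 200\ \mathrm{W/ft^2} = 180{,}000{,}000\ \mathrm{W} = 180\ \mathrm{MW} \]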

Case studies/benchmarks. California: storage device and router manufacturers; banks; web hosting facilities; an Internet service provider; a state tax center; federal facilities. New York: a recovery center (hosting); a financial institution.

Electricity flow in data centers [one-line diagram]: local distribution lines bring 480 V power to the building; the UPS (uninterruptible power supply) feeds PDUs (power distribution units), which feed the computer racks and computer equipment; the HVAC system, lights, office space, etc. are served separately; backup diesel generators provide standby power.

Metrics: IT equipment load intensity, in W/sq.ft. of electrically active space (Uptime definition); UPS losses, as a percent; chilled water plant efficiency, in kW/ton and W/sq.ft.; end-use breakdowns, in W per end use and W/sq.ft.; occupancy, as % full (subjective) and as % of design load (read from the UPS or PDU).
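
As a minimal sketch of how these benchmark metrics fall out of a handful of field measurements (all values below are hypothetical, chosen only to be in the range of the benchmark results):

    # Illustrative data center benchmark metrics (hypothetical measurements).
    it_load_kw = 500.0             # IT load measured at the UPS/PDU output
    active_area_sqft = 20000.0     # electrically active floor area (Uptime definition)
    ups_in_kw, ups_out_kw = 540.0, 500.0
    chiller_plant_kw = 250.0       # chillers + pumps + cooling towers
    cooling_tons = 180.0           # cooling actually delivered

    load_intensity = it_load_kw * 1000 / active_area_sqft    # W/sq.ft.
    ups_loss_pct = (ups_in_kw - ups_out_kw) / ups_in_kw * 100
    chw_kw_per_ton = chiller_plant_kw / cooling_tons         # lower is better

    print(f"IT load intensity:   {load_intensity:.1f} W/sq.ft.")
    print(f"UPS losses:          {ups_loss_pct:.1f} %")
    print(f"Chilled water plant: {chw_kw_per_ton:.2f} kW/ton")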

2003 IT equipment loads [bar chart]: computer load density, in W/sq.ft. of electrically active floor space, for 15 benchmarked facilities on a 0-70 W/sq.ft. scale; the average is approximately 27 W/sq.ft.

Projecting fully loaded conditions [bar chart]: current and projected computer load intensity, in W/sq.ft. of electrically active floor space, for the same 15 facilities; the projected average is 44 W/sq.ft.

Distribution of computer room power reported to the Uptime Institute [chart: fraction of total floor area in sample vs. computer room UPS power, W/sq.ft., for 1999-2001]. Source: Uptime Institute, 2002.

Year   Number of facilities   Total floor area (million sq.ft.)   Power density (W/sq.ft.)
1999   35                     1.55                                22.9
2000   38                     1.72                                22.4
2001   48                     1.86                                25.3

2005 IT equipment benchmarks [bar chart]: IT equipment load, in W/sq.ft., for seven data centers; the average is 52 W/sq.ft., with the LBNL NERSC supercomputer center at the high end of the group.

End-use pie charts vary [two example facilities]. One facility: computer loads 67%; HVAC chiller and pumps 24%; HVAC air movement 7%; lighting 2%. Another facility: data center server load 51%; data center CRAC units 25%; other 13%; electrical room cooling 4%; cooling tower plant 4%; lighting 2%; office space conditioning 1%.

A better ratio is when infrastructure loads account for a smaller percentage of the total [pie chart, repeating the first example above: computer loads 67%; HVAC chiller and pumps 24%; HVAC air movement 7%; lighting 2%].

Effectiveness of HVAC systems [bar chart]: HVAC as a percentage of total load for 12 data centers, plotted on a 0-60% scale.

Index of performance: the Uptime Institute proposed a metric to evaluate the total efficiency of infrastructure systems: index of performance = building systems kW / UPS output kW (i.e., the ratio of building-systems load to IT equipment load).
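
Written out, with a hypothetical worked example (the 600 kW and 1,000 kW figures are illustrative, not drawn from the benchmark set):

    \[ \text{Index of performance} = \frac{P_{\text{building systems}}\,[\mathrm{kW}]}{P_{\text{UPS output}}\,[\mathrm{kW}]}, \qquad \text{e.g. } \frac{600\ \mathrm{kW}}{1000\ \mathrm{kW}} = 0.6 \]

A lower index means less infrastructure overhead per unit of IT load.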

Total chilled water system efficiency [bar chart]: total average efficiency in kW/ton by facility, plotted on a 0-3 kW/ton scale (lower is better); the average is 1.69 kW/ton.

Chiller comparison [bar chart]: chiller efficiency in kW/ton by facility, plotted on a 0-1.2 kW/ton scale (lower is better); the average is 0.75 kW/ton.

Chilled water system efficiencies [stacked bar chart]: kW/ton (lower is better), broken down into chiller, chilled water pumps, condenser water pumps, and cooling tower contributions, for plants at facilities A, B.1, B.2 (three configurations), C, D, E.1, E.2, and F; the systems span water-cooled and air-cooled designs with chilled water supply temperatures from 36F to 50F.
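
For readers who think in COP rather than kW/ton: one ton of refrigeration is 3.517 kW of heat removal, so the two measures are reciprocals up to that constant. At the 1.69 kW/ton benchmark average:

    \[ \mathrm{COP} = \frac{3.517}{\text{kW/ton}} = \frac{3.517}{1.69} \approx 2.1 \]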

Power loss in UPS systems [bar chart]: loss in the UPS as a percent of total, by facility, on a 0-14% scale; the average is 7.7%. Cooling loads compound these losses.
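
To see how cooling compounds the loss, take a hypothetical 1,000 kW IT load at the 7.7% average UPS loss, cooled by a plant at the 1.69 kW/ton benchmark average (illustrative numbers, not a measured site):

    \[ 0.077 \times 1000\ \mathrm{kW} = 77\ \mathrm{kW} \text{ of heat}, \qquad 77\ \mathrm{kW} \times \frac{1.69\ \mathrm{kW/ton}}{3.517\ \mathrm{kW/ton}} \approx 37\ \mathrm{kW} \text{ of added cooling power} \]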

How many times do data centers convert between AC and DC? [block diagram]: the UPS converts AC to DC (rectifier, battery/charger) and back to AC (inverter, with bypass); each server power supply (AC/DC, PWM/PFC switcher) converts to unregulated DC and then to multiple regulated DC outputs (12 V, 5 V, 3.3 V) for internal and external drives, I/O, and memory controllers; voltage regulator modules (DC/DC) step down again to 1.1-1.85 V and other rails for the processor, SDRAM, chipset, and graphics controller.

Measured UPS efficiency [scatter chart]: efficiency (%) versus load factor (%), both on 0-100% scales, for field-measured UPS units.

UPS factory measurements [chart]: factory measurements of UPS efficiency, tested using linear loads, plotted as efficiency (70-100%) versus percent of rated active power load (0-100%) for flywheel, double-conversion, and line-interactive/delta-conversion UPS types.

Measured UPS losses [chart]: factory measurements of UPS efficiency, tested using non-linear loads, plotted as efficiency (70-100%) versus percent of rated active power load (0-100%) for double-conversion and line-interactive/delta-conversion UPS types.
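
The shape of these curves can be captured with a simple loss model: a fixed no-load loss plus a loss proportional to load. This is a sketch with illustrative coefficients, not a fit to the factory data, but it shows why efficiency collapses at low load factors, and hence why heavy UPS redundancy (which keeps each unit lightly loaded) costs efficiency:

    # Toy UPS efficiency model: fixed no-load loss + load-proportional loss.
    # Coefficients are illustrative, not fitted to the factory measurements.
    FIXED_LOSS = 0.04   # no-load loss, as a fraction of rated power
    PROP_LOSS = 0.05    # additional loss proportional to the load carried

    def ups_efficiency(load_factor: float) -> float:
        """Output power / input power at a given fraction of rated load."""
        out_power = load_factor
        in_power = load_factor * (1 + PROP_LOSS) + FIXED_LOSS
        return out_power / in_power

    for lf in (0.1, 0.25, 0.5, 0.75, 1.0):
        print(f"{lf:>4.0%} load: {ups_efficiency(lf):.1%} efficient")

At full load this model gives about 92% efficiency; at 10% load it drops below 70%, matching the qualitative shape of the measured curves.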

Double-conversion UPS systems can be more efficient today.

Analyzing UPS performance in the high-efficiency option [oscilloscope traces of input and output voltage, roughly -400 to +400 V over 200 ms; source: EPRI Solutions]: high-efficiency mode is shown riding through a 30% sag lasting 10 cycles, and double-conversion mode a 30% sag lasting 30 cycles. In high-efficiency mode there is typically one cycle (16.6 ms at 60 Hz) of UPS output voltage deviation; power supplies downstream of the UPS can ride through this.

Electricity use in a server [bar chart; based on a typical dual-processor 450 W 2U server; source: Brian Griffith, Intel]: AC/DC losses 131 W; DC/DC losses 32 W; fans 32 W; drives 72 W; PCI cards 41 W; processors 86 W; memory 27 W; chipset 32 W. Approximately 160 W of the 450 W (35%) is lost in the power conversion process.

Power supply opportunity. In one case study, approximately 4,335 kW of a total 8,500 kW was IT load (the remainder went to HVAC chilled water and fan loads, lighting and plug loads, the standby generator, and UPS losses). Assuming a 65% existing baseline efficiency for the power conversion process, a 90% efficient conversion process would save approximately 1,300 kW, not including any savings from reduced HVAC load:

Conversion efficiency   IT load (kW)   UPS losses (kW)   Total savings (kW)
65%                     4335           340               0
70%                     4025           316               334
75%                     3757           295               623
80%                     3522           276               877
85%                     3315           260               1100
90%                     3131           246               1299
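
The table follows from two relationships implicit in the slide's numbers: the useful (post-conversion) load stays constant, and UPS losses scale with the power drawn. A minimal sketch reproducing it (the ~7.84% UPS loss fraction is inferred from the 340 kW / 4,335 kW baseline row, so treat it as an assumption):

    # Reproduce the power-supply savings table (sketch; loss fraction inferred).
    BASE_IT_KW = 4335.0               # IT load at the 65% baseline efficiency
    BASE_EFF = 0.65
    UPS_LOSS_FRAC = 340.0 / 4335.0    # ~7.84%, inferred from the baseline row

    useful_kw = BASE_IT_KW * BASE_EFF   # post-conversion load is fixed

    for eff in (0.65, 0.70, 0.75, 0.80, 0.85, 0.90):
        it_kw = useful_kw / eff                  # input power needed at this efficiency
        ups_loss_kw = it_kw * UPS_LOSS_FRAC
        savings_kw = (BASE_IT_KW - it_kw) + (BASE_IT_KW * UPS_LOSS_FRAC - ups_loss_kw)
        print(f"{eff:.0%}: IT {it_kw:6.0f} kW, UPS losses {ups_loss_kw:4.0f} kW, "
              f"savings {savings_kw:5.0f} kW")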

Power supply efficiency today [diagram: redundant system of power supplies for servers]: full-load efficiency > 68%.

Measured power supply efficiency [chart]: efficiency (%) versus percent of nameplate power output, plotted on a 45-85% efficiency scale, with a curve showing the average of all servers measured.

Standby generation loss: several load sources draw power continuously: heaters, battery chargers, transfer switches, and fuel management systems. The heaters (many operating hours) use more electricity than the generator will ever produce (few operating hours). The opportunity may be to reduce or eliminate heating, batteries, and chargers.

Standby generator heater

Additional benchmarks: computations per watt; ability to enter sleep mode; nameplate vs. actual comparisons for IT equipment, UPSs, chillers, and transformers; others?

California case studies: LBNL subcontractors Rumsey Engineers and EYP Mission Critical Facilities performed site data collection and preliminary analysis, provided efficiency recommendations, and produced case study reports.

Case study example: Facility 8 [the HVAC-effectiveness bar chart again: HVAC as a % of total load for the 12 data centers]. The worst and the best ratios were both in facility 8.

Facility 8 site characteristics, Data Center 8.1: 26,200 sq.ft.; six UPSs (three per side); redundancy N+1 at the PDU level and N+2 at the UPS level; overhead ducted air distribution; air-cooled, constant-volume CRAC units.

Facility 8 site characteristics, Data Center 8.2: 73,000 sq.ft.; five UPSs; redundancy N+1 at the PDU level; overhead ducted air distribution; central chilled water plant; central air handling system; variable-speed chiller, secondary pumps, and air handlers.

Facility 8 electricity end use [two pie charts]. Data Center 8.1 (total power = 580 kW): computer loads 38%; HVAC 54%; UPS losses 6%; lighting 2%. Data Center 8.2 (total power = 1,700 kW): computer loads 63%; HVAC chilled water plant 14%; HVAC air movement 9%; UPS losses 13%; lighting 1%.

How did they do it? Data Center 8.1: air-cooled CRAC units; no economizing; constant-speed fans; humidification control on; all CRAC units running. Data Center 8.2: optimized central chilled water plant; optimized central air handling units; little humidity control; good controls; data monitoring (gateways, EMCS).

How could they have done even better? Data Center 8.1 observations: disable humidification control; turn off surplus CRAC units; shut off (rotate) surplus UPSs, i.e., go from N+2 to the desired N+1; revisit space temperature setpoints.

How could they have done even better? Data Center 8.2 observations: add monitoring of chiller and total chiller plant kW/ton; run cooling towers in parallel and replace nozzles; revisit the chilled water setpoint; implement condenser water temperature reset.

Energy efficiency opportunities: air management (hot aisle/cold aisle, eliminating bypass and short-circuiting, relieving underfloor congestion, high ceilings and adequate underfloor areas, use of modeling); air-side and water-side economizers (many hours of low-cost cooling in California); UPS and power supply efficiencies, loading, and redundancy strategies; temperature and humidity control (eliminate CRAC unit fighting; follow ASHRAE thermal guidelines); chilled water plant optimization (efficient chillers, primary-only pumping, etc.); variable-speed drives on pumps, chillers, and fans; control strategies (setpoints, cooling tower staging); lighting controls.

LBNL high-tech buildings website: http://hightech.lbl.gov

Thank you Questions?