
Green Data Center Program
Alan Crosswell
Fall Internet2 Member Meeting, San Antonio

Agenda
- The opportunities
- Status of the main University Data Center and others around campus
- Green data center best practices
- Future goals
- Our advanced concepts data center project

The opportunities
- Data centers consume 3% of all electricity in New York State (1.5% nationally as of 2007). That's 4.5 billion kWh annually.
- Use of IT systems, especially for research high performance computing (HPC), is growing.
- We need space for academic purposes such as wet labs, especially in our constrained urban location.
- Columbia's commitment to Mayor Bloomberg's PlaNYC goal of a 30% carbon footprint reduction by 2017.
- NYS Gov. Paterson's "15 by 15" goal of a 15% electrical demand reduction by 2015.
- The national Save Energy Now goal of a 25% energy intensity reduction in 10 years.

Main university data center: Architectural
- Built in 1963, updated somewhat in the 1980's.
- 4,400 sq ft of raised-floor machine room space.
- 1,750 sq ft of additional raised-floor space, now offices.
- 12" raised floor.
- Adequate support spaces nearby: staff, staging, storage, mechanical & fire suppression, (future) UPS room.
[Photos: the machine room in 1968 and 2009]

Main university data center: Electrical
- Supply: 3-phase 208V from an automatic transfer switch.
- Distribution: 208V to wall-mounted panels; 120V to most servers.
- No central UPS; lots of rack-mounted units.
- Generator: 1,750 kW, shared with other users & over capacity.
- No metering. (Spot readings every decade or so :-)
- IT demand load tripled from 2001-2008.

Main university data center
[Chart: Historical and Projected IT Demand Load in kW, 2001-2013. Historical load grew from 96 kW in 2001 to roughly 335 kW in 2008; projections reach roughly 445-537 kW by 2013. Source: Bruns-Pak, Inc.]

Main university data center: Mechanical
- On-floor CRAC units served by central campus chilled water; also served by backup glycol dry coolers.
- Supplements a central overhead air system. Heat load is shared between the overhead system and the CRACs.
- No hot/cold aisles. Rows are in various orientations.
- Due to the tripling of demand load, the backup (generator-powered) CRAC units lack sufficient capacity.

Main university data center: IT systems
- A mix of mostly administrative (non-research) systems.
- Most servers are dual-corded with 120V power input.
- Many old (3+, 5+ year) servers.
- Due to the lack of a room UPS, each rack has UPSes taking up 30-40% of its space.
- Lots of spaghetti in the racks and under the floor.

Other data centers around Columbia
- Many school, departmental & research server rooms all over the place.
- They range from about 5,000 sq ft down to tiny (2-3 servers in a closet), with several mid-sized.
- Most lack electrical or HVAC backup.
- Many could be better used as labs, offices, or classrooms.
- Growth in research HPC is putting increasing pressure on these server rooms.
- Lots of money is spent building new server rooms for HPC clusters that are part of faculty startup packages, etc.

Green data center best practices
1. Measure and validate: you can't manage what you don't measure.
2. Power and cooling infrastructure efficiency: Best Practices for Datacom Facility Energy Efficiency, ASHRAE (ISBN 978-1-933742-27-4).
3. IT equipment efficiency: Moore's Law performance improvements, Energy Star power supplies, BIOS and OS tuning, application tuning.

Measuring infrastructure efficiency
The most common measure is Power Usage Effectiveness (PUE) or its reciprocal, Data Center Infrastructure Efficiency (DCiE):

PUE = (Total Data Center Electrical Load) / (Data Center IT Equipment Electrical Load)

PUE measures the efficiency of the electrical and cooling infrastructure only, and chasing a good PUE can lead to bizarre results, since heavily loaded facilities usually use their cooling systems more efficiently.
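
To make the ratio concrete, here is a minimal sketch of the arithmetic in Python. The meter readings are hypothetical, not measurements from this data center:

```python
def pue(total_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility load over IT load."""
    return total_kw / it_kw

def dcie(total_kw: float, it_kw: float) -> float:
    """Data Center Infrastructure Efficiency: the reciprocal of PUE."""
    return it_kw / total_kw

# Hypothetical spot readings (kW): everything on the main feeder vs.
# the IT equipment (servers, storage, network gear) alone.
total_load_kw = 700.0
it_load_kw = 335.0

print(f"PUE  = {pue(total_load_kw, it_load_kw):.2f}")   # 2.09
print(f"DCiE = {dcie(total_load_kw, it_load_kw):.0%}")  # 48%
```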

[Chart: LBNL average PUE for 12 data centers. Average Power Usage Effectiveness (PUE) = 2.17]

Making the server slice bigger, the pie smaller, and green
- Reduce the PUE ratio by improving electrical & mechanical efficiency. Google claims a PUE of 1.2.
- Consolidate data centers (server rooms): claimed more efficient when larger (prove it!); free up valuable space for wet labs, offices, classrooms.
- Reduce the overall IT load through server efficiency (newer, more efficient hardware), server consolidation & sharing, virtualization, and shared research clusters.
- Move servers to a zero-carbon data center.

Data center electrical best practices
- 95% efficient 480V room UPS: a basement UPS room vs. wasting 40% of rack space. Flywheels or batteries?
- 480V distribution to PDUs at the ends of rack rows, transformed to 208/120V at the PDU: reduces the copper needed and transmission losses.
- 208V power to servers vs. 120V: more efficient (how much?).
- Variable Frequency Drives for cooling fans and pumps: motor power consumption increases as the cube of the speed (see the sketch after this list).
- Generator backup.
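
A worked example of that cube law (illustrative, not from the slides): because fan power scales roughly as the cube of shaft speed, a modest VFD speed reduction yields a large power saving. The 10 kW rating below is a hypothetical motor:

```python
# Fan affinity law: P2 = P1 * (N2 / N1)**3 (same fan, varying speed).
def fan_power_kw(rated_kw: float, speed_fraction: float) -> float:
    """Approximate fan power at a reduced speed, per the cube law."""
    return rated_kw * speed_fraction ** 3

rated = 10.0  # hypothetical 10 kW fan motor at full speed
for pct in (100, 90, 80, 70):
    kw = fan_power_kw(rated, pct / 100)
    print(f"{pct:3d}% speed -> {kw:5.2f} kW ({kw / rated:.0%} of full power)")
# Running at 80% speed draws only ~51% of full power: the VFD payoff.
```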

Data center mechanical best practices
- Air flow: reduce mixing, increase delta-T (see the sketch after this list).
  - Hot/cold aisle or double hot aisle separation.
  - 24-36" under-floor plenum.
  - Plug up leaks in the floor and in racks (blanking panels).
  - Duct CRAC returns to an overhead plenum if possible.
  - Perform CFD modeling.
- Alternative cooling technique: in-row or in-rack cooling.
  - Reduces or eliminates hot/cold air mixing.
  - More efficient transfer of heat (how much?).
  - Supports much higher power density.
- Water-cooled servers are making a comeback.
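
To see why delta-T matters, a short illustration using the standard sensible-heat formula for air (a worked example, not from the slides; the CFM figure is hypothetical):

```python
# Sensible heat removed by air: BTU/h ≈ 1.08 * CFM * delta_T(°F),
# where 1.08 folds in air density and specific heat at standard conditions.
def cooling_btuh(cfm: float, delta_t_f: float) -> float:
    return 1.08 * cfm * delta_t_f

BTUH_PER_KW = 3412.14  # 1 kW = 3412.14 BTU/h

cfm = 10_000.0  # hypothetical total CRAC airflow
for dt in (10, 15, 20):  # return-minus-supply temperature, °F
    kw = cooling_btuh(cfm, dt) / BTUH_PER_KW
    print(f"delta-T {dt:2d}°F -> {kw:5.1f} kW removed at {cfm:,.0f} CFM")
# Less hot/cold mixing means a higher delta-T, so the same airflow
# carries proportionally more heat.
```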

Data center green power best practices
- Locate the data center near a renewable source:
  - Hydroelectric power somewhere cold like Western Mass.
  - Wind power, but most wind farms lack transmission capacity. 40% of power is lost in transmission, so bring the servers to the power. Leverages our international high-speed networks.
- Use free cooling (outside air): a Stanford facility will free cool almost always.
- Implement "follow the Sun" data centers: move the compute load to wherever the greenest power is currently available.

General energy saving best practices
- Efficient lighting, HVAC, windows, appliances, etc.
- LBNL's and other nations' 1W standby power proposals.
- Behavior modification: turn off the lights! Enable power-saving options on computers. Social experiment in Watt Residence Hall.
- Co-generation: waste heat is recycled to generate energy. Planned for the Manhattanville campus; possibly for the Morningside campus.
- Columbia participation in PlaNYC.

Measuring IT systems efficiency
- A complementary measure to PUE is the amount of useful work being performed by the IT equipment. What should the metric be? MIPS per kWh? Kilobits per MWh? (an early NSFNet node benchmark :-)
- The Green Computing Performance Index (from SiCortex) for HPCC: GCPI = n(HPCC benchmarks) / kW, where benchmark scores are normalized so that n = 1 for the Cray XT3.
- Uses a representative suite of HPCC benchmarks. YMMV, but better than just PUE. (A sketch of the idea follows.)
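
A sketch of a GCPI-style performance-per-watt index, under stated assumptions: all benchmark scores and the power figure below are hypothetical, and the real SiCortex index defines its own normalization against the Cray XT3 reference:

```python
# Normalize hypothetical HPCC-style scores against a reference machine
# (the Cray XT3 in the real GCPI), then divide by measured power.
reference = {"hpl_tflops": 2.0, "stream_gbs": 150.0, "gups": 0.5}  # hypothetical XT3 scores
candidate = {"hpl_tflops": 1.2, "stream_gbs": 180.0, "gups": 0.6}  # hypothetical cluster
candidate_kw = 35.0  # wall power measured during the benchmark runs

normalized = [candidate[k] / reference[k] for k in reference]
gcpi = sum(normalized) / len(normalized) / candidate_kw
print(f"GCPI-style index: {gcpi:.3f} normalized score per kW")
```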

GCPI results: http://sicortex.com/green_index/results

Barriers to implementing best practices
- Capital costs.
- Perceived or actual grant funding restrictions.
- Short-term and parochial thinking; lack of incentives to save electricity.
- Distance: synchronous writes for data replication are limited to about 30 miles; bandwidth-delay product impact on transmission of large amounts of data.
- Reliability concerns.
- "Server hugging."
- Staffing needs.

Key recommendations from a 2008 study performed for our data center
- Allocate currently unused spaces for storage, UPS, etc.
- Consolidate racks to recapture floor space.
- Generally improve redundancy of electrical & HVAC.
- Upgrade electrical systems: 750 kVA UPS module; new 480V 1,500 kVA service; generator improvements.
- Upgrade HVAC systems: 200-ton cooling plant; VFD pumps & fans; advanced control system.

Future goals: next 5 years
- Begin phased upgrades of the Data Center to improve power and space efficiency. Overall cost ~$25M.
- Consolidate and replace pizza-box servers with blades (& virtualization).
- Consolidate and simplify storage systems.
- Accommodate growing demand for HPC research clusters; increase sharing of clusters among researchers to be more efficient.
- Accommodate server needs of the new science building.
- Develop internal cloud services. Explore external cloud services.

Future goals: next 5-10 years
- Build a new data center of 10,000-15,000 sq ft: perhaps cooperatively with others; possibly in Manhattanville (West Harlem) or at the Lamont or Nevis campuses in the country; not necessarily in NYC.
- Consolidate many small server rooms.
- Significant use of green-energy cloud computing resources.
[Photo from www.jiminypeak.com]

Our NYSERDA project
- The New York State Energy Research & Development Authority is a public benefit corporation funded by NYS electric utility customers. http://www.nyserda.org
- Columbia competed for and was awarded an Advanced Concepts Datacenter demonstration project: 18 months starting April 2009; ~$1.2M ($447K direct costs from NYSERDA).
- Goals: learn about and test some industry best practices in a real-world datacenter; measure and verify claimed energy efficiency improvements; share our learnings with our peers.

Our NYSERDA project: specific tasks
- Identify 30 old servers to consolidate and replace.
- Instrument server power consumption and data center heat load in real time with SNMP.
- Establish a PUE profile (use the DoE DC Pro survey tool).
- Implement 9 racks of high-density cooling (in-row/rack).
- Implement a proper UPS and higher-voltage distribution.
- Compare old & new research clusters' power consumption for the same workload.
- Implement advanced server power management and measure improvements.
- Review with internal, external, and research faculty advisory groups.
- Communicate results.

Measuring power consumption
Measure power use with SNMP at:
- The main electrical feeder, panels, subpanels, and circuits.
- UPSes.
- Power strips.
- Some servers.
- Chassis and blade power supplies.
[Photos: an SNMP-instrumented power strip; SNMP/Modbus instrumentation; an inductive current tap]

Measuring power consumption (continued)
Use SNMP, which enables comparison with other metrics like CPU utilization. (A polling sketch follows.)
[Photos: a Liebert GXT UPS (1 of 5 supporting an 800-core cluster); a Raritan power strip]
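
As an illustration of this kind of polling, a minimal sketch using the pysnmp library. The OID and hostname below are hypothetical: real PDUs (e.g., Raritan) publish per-outlet watt readings under their own enterprise MIBs, so substitute the OID from your device's MIB.

```python
from pysnmp.hlapi import (
    getCmd, SnmpEngine, CommunityData, UdpTransportTarget,
    ContextData, ObjectType, ObjectIdentity,
)

# Hypothetical placeholder OID; look up the real one in your PDU's MIB.
POWER_OID = "1.3.6.1.4.1.99999.1.1.0"

def read_watts(host: str, community: str = "public") -> int:
    """Poll one SNMP integer value (watts) from a power strip."""
    error_indication, error_status, _, var_binds = next(getCmd(
        SnmpEngine(),
        CommunityData(community, mpModel=1),  # SNMP v2c
        UdpTransportTarget((host, 161), timeout=2, retries=1),
        ContextData(),
        ObjectType(ObjectIdentity(POWER_OID)),
    ))
    if error_indication or error_status:
        raise RuntimeError(f"SNMP error: {error_indication or error_status}")
    return int(var_binds[0][1])

if __name__ == "__main__":
    print(f"rack strip: {read_watts('pdu-rack12.example.edu')} W")
```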

Measuring heat rejection
- Data Center chilled water goes through a plate heat exchanger to the campus chilled water loop.
- Measure the amount of heat rejected to the campus loop with temperature & flow meters to determine BTUH (a worked example follows). These also use Modbus.
[Photo: hydrosonic flow meter]
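
The BTUH arithmetic uses the standard water-side formula. A short sketch with hypothetical meter readings:

```python
# Water-side heat rejection: BTU/h ≈ 500 * GPM * delta_T(°F),
# where 500 ≈ 8.34 lb/gal * 60 min/h * 1 BTU/(lb·°F) for water.
def heat_rejected_btuh(gpm: float, delta_t_f: float) -> float:
    return 500.0 * gpm * delta_t_f

BTUH_PER_KW = 3412.14

# Hypothetical readings from the plate heat exchanger loop:
flow_gpm = 240.0                  # from the flow meter
supply_f, return_f = 44.0, 54.0   # loop temperatures, °F

btuh = heat_rejected_btuh(flow_gpm, return_f - supply_f)
print(f"{btuh:,.0f} BTU/h = {btuh / BTUH_PER_KW:,.0f} kW rejected")
# ~1,200,000 BTU/h ≈ 352 kW, which should roughly track the IT load.
```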

Measuring IT efficiency
- Run some HPC benchmarks.
- Correlate IT and electrical data with SNMP.
- Make a change and measure again to assess (a sketch of the bookkeeping follows).
[Chart: power draw during a benchmark run, sum of primes from 2 to 15,000,000 on 256 cores]
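
One way to do that bookkeeping, sketched with made-up numbers: integrate fixed-interval power samples (which could come from an SNMP poller like the one above) over a benchmark run to get energy per job, then compare before and after a change.

```python
# Approximate energy as the sum of power samples times the sample interval.
def energy_kwh(samples_w: list, interval_s: float) -> float:
    """Convert watt samples at a fixed interval into kWh."""
    return sum(samples_w) * interval_s / 3600.0 / 1000.0

before = [21_000.0] * 90  # hypothetical old cluster: 15 min of 10 s samples at ~21 kW
after = [12_500.0] * 60   # hypothetical new cluster: 10 min of 10 s samples at ~12.5 kW

for label, run in (("old", before), ("new", after)):
    print(f"{label} cluster: {energy_kwh(run, 10.0):.2f} kWh per run")
# 5.25 vs. 2.08 kWh for the same workload: the before/after comparison
# the NYSERDA tasks call for.
```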

Thanks to our NYSERDA project participants

FIN
This work is supported in part by the New York State Energy Research and Development Authority.