DataCenter 2020. Data Center Management and Efficiency at Its Best. OpenFlow/SDN in Data Centers for Energy Conservation.


Dr. Rainer Weidmann, DC Architecture & DC Innovation. 24.10.2011.

Deutsche Telekom. Partner for connected life and work.
Deutsche Telekom delivers one-stop services and solutions for all customer communication needs at home, on the move, and at work.
- T-Mobile offers cell-phone solutions in the Netherlands, Austria, and the Czech Republic.
- The Telekom subsidiaries provide products and services for the fixed network, mobile communications, the Internet, and IPTV in Europe.
- T-Systems delivers ICT solutions for major corporations and public-sector organizations worldwide.

T-Systems delivers ICT: end-to-end IT and telecommunications services. Design, implementation, operation; integrated ITIL processes; end-to-end monitoring; end-to-end service levels.
Example: a logistics company (receipt of parcel, distribution, delivery). Customer business processes run across the whole stack, with real-time end-to-end monitoring:
- Logistics: desktop, applications
- Data center: applications, database, archiving
- Network: LAN, firewall, WAN, SMS
- Parcel station: scanner, screen, connectivity
- Customer: Internet, mobile device, etc.
Added value through ICT from a single source:
- ICT expertise and a single point of responsibility
- Customer-specific price models, e.g. price per desktop
- Dynamic services with flexible up- and down-scaling
- Replacement of up-front investments with monthly costs
- Cost savings of up to 30 percent
- Easy integration of new technologies
- Greater transparency and security
- More growth thanks to new business models

Scalable platforms: unmatched IT infrastructure skills. IT infrastructure facts and figures (as of May 2011, T-Systems and partners):
- 119,900 m² data center space
- 91 data centers managed by T-Systems
- 118,600 MIPS
- 58,000 open-systems servers
- 23.6 m installed SAPs
- 3.2 m named SAP users
- 1.5 m managed seats
- 2,343 service desk agents
- 556,551 Exchange mailboxes
- 276,697 Notes mailboxes
By region (data center m² / DCs managed by TS / MIPS / open-systems servers / installed SAPs / named SAP users):
- North and South America, South Africa: 19,800 / 17 / 11,100 / 11,400 / 3.2 m / 186,700
- Europe: 96,200 / 67 / 106,900 / 44,700 / 24.4 m / 3.0 m
- Asia Pacific: 3,900 / 7 / 600 / 1,900 / 45,700 / 14,000

DC2020 - Motivation and General Duties.
Observations and responsibility:
- Operational: steadily increasing power consumption of data centers; steadily increasing density in data centers
- Environment: climate change; Group (DTAG) and T-Systems core beliefs: sustainability and corporate responsibility; CO2
Infrastructure test lab: closed shop, no limits; a testbed for benchmarking; goal: optimize energy efficiency.
Definition of the energy efficiency of a DC (PUE, Power Usage Effectiveness, The Green Grid):
PUE = Total Facility Power / IT Equipment Power (average value over 1 year)
Fields for improvement: legacy DCs, blueprint for DC2020, CFD models.
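As a minimal illustration of the PUE definition above, the following Python sketch averages paired facility and IT power samples over a measurement period. The meter names and sample values are hypothetical, not measurements from DC2020:

```python
# Minimal sketch: computing PUE from paired power samples.
# Meter names and sample values are hypothetical illustrations.

def pue(total_facility_kw: list[float], it_equipment_kw: list[float]) -> float:
    """PUE = average total facility power / average IT equipment power."""
    if len(total_facility_kw) != len(it_equipment_kw) or not total_facility_kw:
        raise ValueError("need equally sized, non-empty sample series")
    avg_facility = sum(total_facility_kw) / len(total_facility_kw)
    avg_it = sum(it_equipment_kw) / len(it_equipment_kw)
    return avg_facility / avg_it

# Hourly samples over a year would give 8,760 readings;
# three illustrative samples shown here.
facility = [440.0, 455.0, 430.0]   # kW at the utility feed
it_load  = [200.0, 210.0, 195.0]   # kW at the rack PDUs
print(f"PUE = {pue(facility, it_load):.2f}")   # ~2.2, the 'status quo' figure
```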

DC2020. Technical Features.
- Floor space 67 + 67 m²; raised floor 80 cm
- 190 real servers; 8 racks plus 1 liquid-cooled rack
- AC UPS (250 kVA); DC supply (48 V, 600 A)
- Adjustable ceiling; coated walls; smoke generator
- Cold-water supply (290 kW); free-cooling capability; plate heat exchanger
- CRAC units (EC, FT, humidity)
- Cold-aisle containment and hot-aisle containment
- Building management system (1,800 data points)
- Permanent inertization (fire prevention)
Basic idea: 2008.

DC2020. Pictures: top view from the meeting room, raised floor, cooling equipment, smoke generator, enclosure and "Cooltrans".

Hot/Cold Aisle Room Layout. General.
Diagram: a CRAC unit with fan feeds cold air through the raised floor (supply temperature T1 [°C], underfloor pressure p [Pa]) into the cold aisle; servers draw air at inlet temperature TA [°C] and exhaust into the hot aisle, returning at T2 [°C]. Leakage through the raised floor and air short circuits over the racks let hot and cold air mix. The heat removed follows ΔQ = c_v · m · dT, with ΔT = T2 − T1.

Hot/Cold Aisle Room Layout -- Status Quo.
Measured values: supply temperature T1 = 18 °C (64 °F); server inlet TA = 22 °C (72 °F); return T2 = 24 °C (75 °F); ΔT = 6 K; CRAC fan at 100%; raised-floor pressure p = 16 Pa; substantial leakage and air short circuits. Typical PUE = 2.20 to 1.80.

Hot/Cold Aisle Room Layout -- Improvement I. Leakage reduction, cold-aisle containment, fan speed adjustment.
Resulting values: supply temperature T1 = 21 °C (70 °F); server inlet TA = 22 °C (72 °F); return T2 = 38 °C (100 °F); ΔT = 17 K; CRAC fan at 30%; raised-floor pressure p = 4 Pa.
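Much of the saving comes from the fan speed reduction. By the standard fan affinity laws (a textbook approximation, not a figure from the slides), airflow scales linearly with fan speed while fan power scales with roughly the cube of speed, so running at 30% speed implies only a few percent of nominal fan power:

```python
# Sketch: fan affinity laws applied to the CRAC fan speed reduction above.
# Cube-law scaling is a textbook approximation, not a DC2020 measurement.

def relative_fan_power(speed_fraction: float) -> float:
    """Fan power scales roughly with the cube of rotational speed."""
    return speed_fraction ** 3

for speed in (1.0, 0.5, 0.3):
    print(f"fan speed {speed:>4.0%} -> fan power ~{relative_fan_power(speed):.1%} of nominal")
# fan speed 100% -> fan power ~100.0% of nominal
# fan speed  50% -> fan power ~12.5% of nominal
# fan speed  30% -> fan power ~2.7% of nominal
```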

Hot/Cold Aisle Room Layout -- Improvement II. Leakage reduction, cold-aisle containment, fan speed adjustment, raised temperatures.
Resulting values: supply temperature T1 = 26 °C (79 °F); server inlet TA = 27 °C (81 °F); return T2 = 45 °C (113 °F); ΔT = 19 K; CRAC fan at 30%; raised-floor pressure p = 4 Pa.
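Why the larger ΔT matters: from the slides' own relation ΔQ = c_v · m · dT, a fixed heat load needs an air mass flow inversely proportional to ΔT. A small sketch, with an illustrative heat load (matching the 290 kW cold-water supply capacity) and the standard specific heat of air assumed:

```python
# Sketch: required air mass flow for a fixed heat load at different delta-T,
# from Q = c * m_dot * dT. The heat load is an illustrative assumption; the
# specific heat is the standard value for air at constant pressure.

C_AIR = 1.005         # kJ/(kg*K), specific heat of air
HEAT_LOAD_KW = 290.0  # kW, matching the DC2020 cold-water supply capacity

def mass_flow_kg_per_s(heat_kw: float, delta_t_k: float) -> float:
    """Air mass flow needed to remove heat_kw at a given delta-T."""
    return heat_kw / (C_AIR * delta_t_k)

for label, dt in (("status quo", 6), ("improvement I", 17), ("improvement II", 19)):
    flow = mass_flow_kg_per_s(HEAT_LOAD_KW, dt)
    print(f"{label:14s} dT = {dt:2d} K -> {flow:5.1f} kg/s of air")

# Roughly tripling delta-T cuts the required airflow (and hence fan speed)
# to about a third, consistent with the fan running at 30%.
```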

Overview of the Free Cooling Period. Hours of Operation (location: Munich).
Chart: annual hours of operation split between chiller, hybrid heat exchanger "wet", and hybrid heat exchanger "dry", for water supply temperatures of 8 °C (46 °F), 14 °C (57 °F), and 20 °C (68 °F), under Status Quo (TA = 22 °C), Improvement I (TA = 22 °C), and Improvement II (TA = 27 °C). The higher the permissible water supply and inlet temperatures, the more hours are covered by free cooling and the fewer by the chiller.

PUE: Cold- vs. Hot-Aisle Containment. PUE for Improvement I.
Chart: PUE (y-axis, 1.0 to 1.5) versus rack density (5.5, 7.5, 10, 12, 15, 17, 22 kW/rack) for cold-aisle and hot-aisle containment, at 22 °C (72 °F) server inlet temperature and 14 °C (57 °F) water supply temperature. Energy savings: 20 to 30%.

Cold-/Hot-Aisle Containment: Total Outage of Air Conditioning. Critical Reaction Time vs. Density.
Chart: critical reaction time [mm:ss] (00:00 to 30:00) until the server inlet temperature TA rises from 22 °C (72 °F) to 35 °C (95 °F), plotted against rack density (5.5, 9.8, 17.5 kW/rack) for hot-aisle containment, cold-aisle containment, and no enclosure; the typical UPS autonomy time and utility latency are marked for reference.

Lessons Learned, Phase 1.
- A PUE of 1.3 or better is possible with standard techniques already available
- Leakage reduction and enclosures (cold-/hot-aisle containment) are necessary
- Homogeneous airflow is very important (flow direction, air velocity, pressure)
- Standardized hardware is useful ("Dynamic Services")
- Room height is of minor interest (with enclosures it has no influence)
- Enclosures increase DC availability
- Savings grow as power density in the data center grows
Recommendations:
- Reduce fan speed in CRAC units
- Increase temperatures (ASHRAE 2008)
- Provide infrastructure on demand (IT leading)
- Increase density to > 10 kW/rack (average)
- Increase average CPU utilization; dynamic load sharing

Next Steps for DC2020 - Phase 2. Focused on end-to-end enterprise load management.
Upgrade the data center to use:
- Private cloud application stacks
- Equipment of the latest CPU generation using Intel Node Manager / Data Center Manager (blade-level density)
Since the infrastructure to measure power is already in place, investigate the effectiveness of strategies for server and network power conservation (a control-loop sketch follows below):
- Power capping vs. other forms of load limiting and distribution
- Enable power conservation in computing resources
- Reduce idle power consumption
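As a rough illustration of power capping as opposed to load limiting, here is a minimal control-loop sketch. The `read_node_power` and `set_cpu_power_limit` helpers are hypothetical stand-ins for whatever management interface is available (a Node Manager-style controller, for instance); they are not a real API, and all values are illustrative:

```python
# Minimal sketch of a power-capping control loop. read_node_power() and
# set_cpu_power_limit() are hypothetical stand-ins for a real management
# interface; values are illustrative.
import random
import time

CAP_WATTS = 350.0          # per-node power budget
STEP_WATTS = 10.0          # how aggressively to adjust the limit
MIN_LIMIT, MAX_LIMIT = 150.0, 450.0

def read_node_power() -> float:
    """Hypothetical sensor read; simulated here with random values."""
    return random.uniform(250.0, 420.0)

def set_cpu_power_limit(watts: float) -> None:
    """Hypothetical actuator; a real system would program the BMC here."""
    print(f"  limit set to {watts:.0f} W")

limit = MAX_LIMIT
for _ in range(5):                      # a few iterations instead of 'while True'
    power = read_node_power()
    if power > CAP_WATTS:               # over budget: tighten the limit
        limit = max(MIN_LIMIT, limit - STEP_WATTS)
    elif power < CAP_WATTS - 2 * STEP_WATTS:  # comfortably under: relax it
        limit = min(MAX_LIMIT, limit + STEP_WATTS)
    print(f"measured {power:.0f} W (cap {CAP_WATTS:.0f} W)")
    set_cpu_power_limit(limit)
    time.sleep(0.1)                     # control period (shortened for the demo)
```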

Potential of OpenFlow/SDN for energy conservation.
Server level:
- Caching/proxying requests until wake-up
- Mapping services to servers in an energy-aware manner (see the consolidation sketch below)
- Migration and consolidation across one or more data centers
- Allocating workload within a limited physical space so as to conserve cooling energy
Network device level:
- Using centralized control to coordinate across one or more data centers so that energy proportionality, or multiple energy states at the device level (e.g., IEEE 802.3az), is used effectively; ElasticTree is an instance with only two modes, on/off
- Packing workload efficiently using OpenFlow/SDN
- Turning links/switches off and on using configuration protocols
- For each workload, moving traffic away from elements in a high energy mode
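To make "mapping services to servers in an energy-aware manner" concrete, here is a small consolidation sketch: services are packed onto as few servers as possible so the remainder can sleep. The slides do not prescribe an algorithm; first-fit decreasing is simply a common packing heuristic, and the service loads and server capacity are hypothetical:

```python
# Sketch: energy-aware mapping of services onto servers via first-fit
# decreasing bin packing, so unused servers can be powered down.
# Service loads and server capacity are hypothetical illustrations.

SERVER_CAPACITY = 1.0   # normalized CPU capacity per server

def consolidate(service_loads: list[float]) -> list[list[float]]:
    """Pack services onto the fewest servers (first-fit decreasing)."""
    servers: list[list[float]] = []
    for load in sorted(service_loads, reverse=True):
        for placed in servers:
            if sum(placed) + load <= SERVER_CAPACITY:
                placed.append(load)
                break
        else:                       # no existing server fits: wake one more
            servers.append([load])
    return servers

loads = [0.6, 0.3, 0.5, 0.2, 0.2, 0.1]   # hypothetical utilizations
placement = consolidate(loads)
print(f"{len(placement)} servers active, rest can sleep: {placement}")
# 2 servers active, rest can sleep: [[0.6, 0.3, 0.1], [0.5, 0.2, 0.2]]
```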

DC server power management. Energy-aware server provisioning and consolidation.
Many existing ideas on energy-aware provisioning, load consolidation, and migration for web services hosted across a collection of data centers:
- Provisioning: NSDI 2008
- Server consolidation: HotPower 2008
- Load migration: SIGCOMM 2009
SDN can enable these actions seamlessly, without causing disruption (see the sketch after this list):
- VM migration without dropping existing sessions
- Allocating load in a power-aware manner
- Triggering power on/off of computing nodes
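One way to picture "migration without dropping sessions": a centralized controller rewrites forwarding state make-before-break, so in-flight packets always match a rule while the VM moves. The `Controller` class below is a hypothetical abstraction of an SDN controller, not a real OpenFlow API:

```python
# Sketch: seamless VM migration via centralized flow-rule updates.
# The Controller class is a hypothetical abstraction, not a real
# OpenFlow controller API.

class Controller:
    def __init__(self) -> None:
        self.flow_table: dict[str, str] = {}   # vm_ip -> host switch port

    def install_rule(self, vm_ip: str, host_port: str) -> None:
        self.flow_table[vm_ip] = host_port
        print(f"flow for {vm_ip} -> {host_port}")

    def migrate_vm(self, vm_ip: str, old_port: str, new_port: str) -> None:
        # Make-before-break: the new rule replaces the old one atomically
        # from the network's point of view, so existing sessions survive.
        self.install_rule(vm_ip, new_port)      # redirect traffic first
        print(f"old rule via {old_port} retired; {vm_ip} now served at {new_port}")

ctrl = Controller()
ctrl.install_rule("10.0.0.5", "switch1:port3")                 # VM on host A
ctrl.migrate_vm("10.0.0.5", "switch1:port3", "switch2:port7")  # move to host B
```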

DC network power management. A simple strategy with interesting power savings.
ElasticTree: work in collaboration with HP and Stanford University. Simple topology control can produce noticeable energy savings, and the savings increase as load decreases. Strategies (a toy sketch follows below):
- Vary link speed
- Power off links
- Power off links/switches
- Power off switches
- Move workloads
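A toy version of the ElasticTree on/off idea: keep just enough parallel links active to carry the current aggregate load, and power off the rest. The topology, link capacity, and loads below are hypothetical; real ElasticTree formulates this as an optimization over the whole fat-tree with redundancy constraints:

```python
# Toy sketch of the ElasticTree on/off idea: activate only as many parallel
# core links as the current load needs; power off the rest. Topology, link
# capacity, and loads are hypothetical illustrations.
import math

LINK_CAPACITY_GBPS = 10.0
TOTAL_LINKS = 8            # parallel links between aggregation and core

def links_needed(aggregate_load_gbps: float, min_links: int = 1) -> int:
    """Smallest number of links that carries the load (keep >=1 for liveness)."""
    return max(min_links, math.ceil(aggregate_load_gbps / LINK_CAPACITY_GBPS))

for load in (72.0, 35.0, 8.0, 1.5):    # hypothetical daily load swing
    active = links_needed(load)
    print(f"load {load:5.1f} Gb/s -> {active}/{TOTAL_LINKS} links on, "
          f"{TOTAL_LINKS - active} powered off")
# Savings grow as load shrinks, matching the slide's observation.
```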

Summary. OpenFlow/SDN has the potential to:
- Intelligently trade off power, redundancy, workload, and performance
- Enable dynamicity, automatic management, and evolvability of the DC
DC2020 will help evaluate this potential in realistic environments and explore it at the desired scale.
Open question: what should the data center network look like, if energy awareness is enabled by SDN, for maximum energy conservation?

Thank you for your attention!
http://www.datacenter2020.de
http://www.datacenter2020.com
Contact: Dr. Rainer Weidmann, T-Systems International GmbH, Dachauer Str. 665, 80995 Munich, Germany. rainer.weidmann@t-systems.com