Energy-efficient & scalable data center infrastructure design




Seminar on 30 September 2010 in Wetzlar: Sustainability through UPI (Unified Physical Infrastructure). Talk by Stefan Fammler: Energy-efficient strategies in the data center.

Energy-efficient & scalable data center infrastructure design
Lars-Hendrik Thom, Regional Technical Support Manager D-A-CH & Benelux
Stefan Fammler, Strategic Account Manager

Yesterday's Physical Infrastructure
Most businesses maintained the various physical infrastructure systems in silos: Power, Communication, Computing, Security, Control.

The Current Trend in Infrastructure Systems
Demand for IP communications is leading to convergence of these systems. Convergence requires physical and logical system integration, which affects system performance.

The Physical Infrastructure Vision
Align, Merge, Optimize (www.panduit.com/upi)

Infrastructure Risk Management
The complexities of convergence create risk in the physical layer. An IT risk incident has the potential to produce substantial business consequences that touch a wide range of stakeholders. In short, IT risk matters now more than ever!
(IT Risk: Turning Business Threats into Competitive Advantage, Harvard Business Press)

Integration & Interdependence
Effective infrastructure management reduces risk throughout the architecture. "As investment in integration technology increases, IT organisations will continue to evolve their enterprise-wide integration infrastructure to handle user interaction, business processes, applications and data." (Colin White, BI Research)

IT risk matters now more than ever! Most businesses today already depend on solid IT, and redundancy concepts help manage the risk of losing data. Tomorrow's IT infrastructure will carry even more: all systems in a building will use it, including systems that may have no way to fall back on a backup. It therefore makes sense to take a closer look at the passive components and to invest now in clever solutions.

Physical IT infrastructure design should be:
- Energy-efficient: intelligent cooling systems support the most effective use of the cooling power
- Scalable and reliable: protect the investment; react fast and easily to business and technology changes

...the journey from the coal to the server: of 16 MW of coal energy, about 60% is lost as generation inefficiency and a further 5-10% in distribution and transformation, leaving ~5 MW of usable energy at the data center. Of that, about 65% goes into cooling and power transformation, and another 15-30% is lost in power supplies and devices, so only ~0.3-0.5 MW is actually used for computing.
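The loss chain above can be sketched as a running product of stage efficiencies. A minimal sketch (not from the slide), taking the midpoint of the quoted 5-10% distribution loss band as an assumed value:

```python
def apply_losses(power_mw, losses):
    """Apply a chain of fractional losses to an input power (MW)."""
    for loss in losses:
        power_mw *= (1.0 - loss)
    return power_mw

# 16 MW of coal energy, 60% generation loss, ~7.5% distribution and
# transformation loss (midpoint of the slide's 5-10% band):
usable = apply_losses(16.0, [0.60, 0.075])
print(round(usable, 1))  # roughly the slide's "~5 MW usable energy"
```

Applying the remaining stages (65% for cooling and power transformation, 15-30% lost in power supplies and devices) in the same way shows why only a small fraction of the original 16 MW ever reaches the processors.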

Increasing Heat Load
- 2002: Intel Pentium III, 1 GHz, max. power consumption 26 W
- 2004: Intel Pentium 4, 3 GHz, max. power consumption 83 W
- 2007: Intel Xeon, 3.6 GHz, max. power consumption 130 W
More speed and higher performance traditionally result in higher energy consumption.

How to fight temperature rises?
- Decrease the air temperature at the cold aisle to better cool the components -> higher energy consumption of the CRAC
- Increase the pressure of the cold air inside the raised floor, or speed up the cold airflow -> higher energy consumption of the CRAC
- Invest in additional CRAC units -> expensive investment AND additional energy consumption
(CRAC = Computer Room Air Conditioner)
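The second option is especially costly: by the standard fan affinity laws (not stated on the slide, but generally applicable to CRAC fans), fan power grows with roughly the cube of fan speed. A small illustration with assumed numbers:

```python
def fan_power_kw(rated_kw, speed_fraction):
    """Fan affinity law: shaft power scales with the cube of fan speed."""
    return rated_kw * speed_fraction ** 3

# Moving 20% more air through the raised floor needs ~73% more fan power:
ratio = fan_power_kw(10.0, 1.2) / fan_power_kw(10.0, 1.0)  # 1.2**3 ~= 1.73
```

This cubic relationship is why simply turning up the CRAC fans is one of the most expensive ways to compensate for poor airflow management.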

The better way to pay the bill...
In the total cost of ownership of a DC, energy is about 40%, i.e. ~3 million in energy costs per year. Cooling accounts for about 40% of the energy costs, i.e. ~1.2 million in cooling energy. Of the cooling energy, only about 40% is actually used for cooling; the other 60% is lost to bypass airflow - short circuits between the cold and hot regions of a DC that cool no active components. That is a loss of ~720,000 per year! Increasing the cooling efficiency could easily save several hundred thousand per year. (Source: Uptime Institute)

Energy-efficient design 1) A clear and consistent hot/cold aisle setup
- Orient cabinets front to front, with no exceptions
- Use single-height cabinets and closed rows of cabinets to clearly separate hot and cold aisles
- Keep space for additional cooling options (if later necessary)
(Source: Emerson Liebert)
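The Uptime Institute figures quoted above chain together as a simple calculation (the 3 million total is the slide's illustrative value):

```python
# Worked example of the cost figures quoted above (illustrative values).
annual_energy_cost = 3_000_000   # ~40% of total DC cost of ownership
cooling_share = 0.40             # share of the energy bill spent on cooling
bypass_share = 0.60              # share of cooling air lost to bypass airflow

cooling_cost = annual_energy_cost * cooling_share
bypass_loss = cooling_cost * bypass_share
print(cooling_cost, bypass_loss)  # 1.2 million for cooling, 720,000 wasted
```

Halving the bypass share would recover several hundred thousand per year, which is the saving the slide alludes to.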

Closed rows of cabinets
- Vertical blanking panels at the outside of the 19" frames
- Easy-to-install horizontal blanking panels
- Transformable cabinets that can stay in place if their usage changes (e.g. network, SAN, server)

Keep space for additional cooling
If the heat load grows too high, react - even while the DC stays in full operation - with the installation of, for example:
- a cold aisle containment system
- a hot aisle containment system
- dedicated water-based heat exchangers

Cold Aisle Containment
Separates the hot air from the cold aisle and makes sure there is no mixing above the cabinets and at the ends of the rows. Some recommendations:
- Easy to install on cabinets in operation
- Doors that open wide and close by themselves
- An easy-to-open transparent top panel for access to the cable paths above the cabinets
- No loose parts, and stable enough to handle the air pressure inside the cold aisle

Hot Aisle Containment
Separates the hot air from the cold aisle by channelling the exhaust air from the components directly back to the CRAC unit. No mixing of cold and hot air results in:
- higher possible cold aisle temperatures, up to about 25 °C (depending on the component requirements)
- better operational efficiency of the CRAC unit due to a higher temperature difference between cold and hot air

Partial Hot Aisle Containment
If the room is high enough, simply adding a vertical exhaust channel already results in better cooling performance - an ideal solution for fast help when the heat load grows too high at one single cabinet, e.g. at the end of a row. How does it work? The hot air is blown out higher into the room, so the resulting sphere of hot air stays further away from the cold air sphere. This reduces the mixing of cold and hot air.

Additional cooling with a passive heat exchanger
If water in the DC is not an issue for you:
- Prepare water pipes inside the raised floor in areas you plan to use for high-performance components (blade servers etc.)
- This allows easy on-demand installation of a passive water-based heat exchanger on the rear side of a cabinet, connected to the low-pressure chilled water circuit of the CRAC unit
- Up to 20 kW of additional cooling power at the single cabinet (about 25-30 kW in total)
- No moving parts, which reduces maintenance costs and avoids additional energy consumption
- Should be easy to install while the cabinet is in full operation

Energy-efficient design 2) Optimize the airflow within the DC (for new and existing ones)
- Use solutions to properly route the airflow inside the cabinets and racks
- Eliminate any blockages of the airflow
- (Re-)calculate the necessary number and opening ratio of the perforated tiles
- Close all other openings to keep the air pressure inside the raised floor
- Limit the length of cabinet rows so that enough cold air reaches even the last cabinet (or, if air comes from two sides, the middle)
(Source: Emerson Liebert)

From right -> left to front -> back
Exhaust ducts route the hot air of right-to-left blowing switches to the rear of a cabinet. [Figure: cabinet without exhaust duct vs. NET-ACCESS cabinet with exhaust duct]

Rear-side mounted ToR switches
Use cold air inlet ducts to provide ToR (top of rack) switches with enough cold air at their inlets. Without an inlet duct, hot exhaust air from the servers flows around to the cold air inlets of the switch; with a cold air inlet duct (and the front closed with blanking panels), the switch draws cold air from the front of the cabinet. [Chart: top-of-rack switch inlet air temperature (°C) versus fan speed (60%, 75%, 100%), with and without an inlet duct]

Different inlet air duct samples: air inlets at the side of a Cisco 4948; front-to-back airflow at a Cisco Nexus 2k with an air duct extension to the front of the cabinet; inlet and exhaust air ducts for Cisco's Nexus 7018.

Eliminate airflow blockages
Airflow blockages occur:
- inside the raised floor, due to pipes and cable mess
- behind servers, due to flexible cable arms or simply cable mess
- at the front of the components, due to many connected patch cords
-> The main reason for airflow blockage is often weak cable management.

Design clear paths inside the raised floor
The raised floor should be high enough to provide the necessary cold air pressure everywhere at the same level (recommended 60-90 cm). If cables are routed inside the raised floor, they should run on dedicated pathways that allow airflow above and below them. Under-floor cable pathway systems should be easy to install during operation (with minimal opening of tiles) to react to upcoming business demands for additional connections.

Overhead Cable Routing Systems
To minimize the risk of airflow blockage inside the raised floor, route the majority of cables in the space above the cabinets if possible.
- Physically separate the more sensitive fiber patch cords from the copper cabling
- Look for a system that provides continuous bend radius control - for copper too!
- Easy access supports later usage and minimizes workarounds

Cable Management vs. Air Exhaust
Even clean-looking cable management can cause heat problems at server cabinets. Flexible cable arms, used to slide a server out of the rack during operation, are becoming more and more useless due to growing server virtualisation: virtual machines can be moved from one physical machine to another, so it is no longer necessary to slide out a physical server during operation to change or repair parts inside the chassis.

Cable Management at RU level
Vertical cable management, supported by a finger system at the side of the 19" frame, allows cables and their overlengths to be routed properly away from the back side of the server -> minimal blocking of the air exhaust. Side-mounted patch panels also allow the use of equal-length patch cords -> less stock. Look for dedicated cable pathways inside the server cabinets to separate cables by function (LAN A / LAN B / OBM / SAN). This minimizes cable mess and makes moves and changes easier.

Cable Management at the front
Replace horizontal cable managers with vertical cable management. Fingers for each RU at the left and right side allow overlength to be handled in the open space beside the 19" frame. Look for angled panels, which also guide people to route patch cords to both sides instead of crossing the middle of the panel.
-> minimal cable mess, overlength kept at the side
-> easy operation, easy MACs (moves, adds, and changes)
-> no airflow blockage at the cold air inlets

Energy-efficient design 2) Optimize the airflow within the DC
- Route airflow front to back inside the cabinets
- Eliminate any blockages of the airflow
- Calculate the necessary number and opening ratio of the perforated tiles
- Close all other openings to keep the air pressure inside the raised floor
- Limit the length of cabinet rows so that enough cold air reaches even the last cabinet (or, if air comes from two sides, the middle)
(Source: Emerson Liebert)

Perforated tiles
A typical DC provides the cold air through the raised floor. To bring the cold air to the front of the cabinets and to the inlet openings of the components, perforated tiles are positioned in front of the cabinets. The correct number and position should be calculated to compensate only the expected heat load per cabinet/row. Too many openings result in a loss of cold air pressure inside the raised floor; to compensate this loss you need to increase the pressure at the CRAC units -> increased energy costs.
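Sizing the tiles starts from the airflow each cabinet needs. A minimal sketch using the standard sensible-heat formula Q = P / (rho * cp * dT) - the 10 K temperature rise and the 5 kW cabinet load are assumed example values, not from the slide:

```python
def required_airflow_m3h(heat_load_kw, delta_t_k=10.0):
    """Cold airflow needed to carry away a cabinet's heat load.

    Uses Q = P / (rho * cp * dT) with rho ~= 1.2 kg/m^3 and
    cp ~= 1.005 kJ/(kg K) for air; delta_t_k is the allowed air
    temperature rise across the equipment.
    """
    rho, cp = 1.2, 1.005  # kg/m^3, kJ/(kg K)
    m3_per_s = heat_load_kw / (rho * cp * delta_t_k)
    return m3_per_s * 3600.0

# A 5 kW cabinet with a 10 K air temperature rise needs ~1500 m^3/h,
# which the perforated tiles in front of it must be able to deliver:
airflow = required_airflow_m3h(5.0)
```

The tile count and opening ratio then follow from each tile's rated airflow at the available underfloor pressure - which is why extra, uncalculated openings elsewhere bleed off exactly the pressure this calculation relies on.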

Close all other openings
As mentioned, each open hole in the raised floor reduces the air pressure, which must be compensated with higher energy consumption of the CRAC unit. Close all unwanted holes, especially at cable feed-throughs - the more airtight the better!

Closed but still easy to operate...?
Brush systems for closing cable openings are very popular because they are easy to install and allow cables to be added or removed quickly and easily. But brush locks still let air flow through the opening and cannot maintain the air pressure. An alternative solution is closed air bags, which can also be installed around existing cable bundles - they close the hole AND keep the air pressure! MACs (moves, adds, and changes) are almost as easy as with brush locks.

The length of a row...
The length of a row of cabinets depends on the capacity and position of the CRAC units. The longer a row, the better the airflow control has to be to provide enough cold air at every RU.

Summary
An optimized airflow and energy-efficient design creates more options and headroom for the specific data center infrastructure. Energy efficiency and airflow control help to save energy and to reduce costs.

Thank you very much for your attention!
More detailed information at www.panduit.com
Do you have any questions?