Selecting Rack-Mount Power Distribution Units For High-Efficiency Data Centers




CPI POWER MANAGEMENT WHITE PAPER

Selecting Rack-Mount Power Distribution Units For High-Efficiency Data Centers

By Anderson Hungria, Sr. Product Manager of Power, Electronics & Software
Published June 2014
800-834-4969 | techsupport@chatsworth.com | www.chatsworth.com

While every effort has been made to ensure the accuracy of all information, CPI does not accept liability for any errors or omissions and reserves the right to change information and descriptions of listed services and products. ©2014 Chatsworth Products, Inc. All rights reserved. Chatsworth Products, CPI, CPI Passive Cooling, econnect, MegaFrame, Saf-T-Grip, Seismic Frame, SlimFrame, TeraFrame, GlobalFrame, Cube-iT Plus, Evolution, OnTrac, QuadraRack and Velocity are federally registered trademarks of Chatsworth Products. Simply Efficient is a trademark of Chatsworth Products. All other trademarks belong to their respective companies. 6/14 MKT-020-613

Power consumption in the data center continues to rise. The need to provide redundant power systems with high reliability and availability of compute resources is a major driving force behind the increase in power utilization. Some data centers use just as much power for non-compute, or overhead, energy (cooling, lighting and power conversion) as they do to power servers. The ultimate goal is to reduce this overhead energy loss so that more power is dedicated to revenue-generating equipment, without jeopardizing reliability and availability of resources. Many methods are currently being implemented to reduce unnecessary power consumption in the data center: high-efficiency servers, thermal containment, higher server inlet temperatures and reduced power conversion loss. When used in combination, these approaches can deliver low Power Usage Effectiveness (PUE) values and reduce energy expenses.

As an added challenge, new trends in data center traffic highlight the importance of implementing energy efficiency techniques in facilities. According to the third annual Cisco Global Cloud Index (2012-2017), global data center traffic is expected to triple over the next five years, a 25 percent compound annual growth rate (CAGR), reaching 7.7 zettabytes by the end of 2017 1.

Figure 1: The Cisco Global Cloud Index (2012-2017) estimates a 25% compound annual growth rate of data center networking traffic during the next five years, from 2.6 ZB in 2012 to 7.7 ZB in 2017. (Source: Cisco Global Cloud Index: Forecast and Methodology, 2012-2017)

Increased virtualization and the use of high-density devices, such as blade servers and switches, require even more power. For these reasons, it is crucial to deploy a reliable and effective power distribution unit (PDU) at the cabinet level, which is the hottest place in the data center.
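As a quick sanity check (ours, not from the white paper), the 25 percent CAGR figure can be verified from the chart's endpoints, 2.6 ZB in 2012 and 7.7 ZB in 2017:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two values over a span of years."""
    return (end / start) ** (1 / years) - 1

# Cisco Global Cloud Index endpoints: 2.6 ZB (2012) -> 7.7 ZB (2017)
rate = cagr(2.6, 7.7, 5)
print(f"CAGR: {rate:.1%}")            # about 24%, reported as ~25%
print(f"Growth: {7.7 / 2.6:.2f}x")    # traffic roughly triples
```

The computed rate is just under 25 percent, consistent with Cisco's rounded figure and with the "triple over the next five years" claim.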

Managing Power Properly in the Data Center of the Future

Power is the biggest expense in the data center, most of it used for cooling to keep these facilities at a temperature that prevents servers and other devices from overheating. One way to be more energy efficient is to implement a containment strategy and then raise the temperature in the cold aisle from the traditional setting of between 59 F (15 C) and 70 F (21 C) to a higher setting between 80 F (27 C) and 85 F (29 C). Thermal Guidelines for Data Processing Environments, part of the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) Datacom Series, defines recommended and allowable environmental ranges (classes), as shown in Figure 2 2. Under certain conditions, data centers can save 4 to 5 percent in energy costs for every 1 degree Fahrenheit increase in server inlet temperature, according to the U.S. Environmental Protection Agency and Department of Energy's ENERGY STAR program 3.

Figure 2: 2011 ASHRAE environmental classes (A1-A4) for data center applications, plotted on a psychrometric chart of dry-bulb temperature versus humidity, support higher equipment inlet temperatures; the ASHRAE recommended range is also marked. (Source: Image based on psychrometric charts in: Data Center Environments: ASHRAE's Evolving Thermal Guidelines, ASHRAE Journal, vol. 53, no. 12, December 2011, and ASHRAE, Thermal Guidelines for Data Processing Environments, Third Edition, Chapter 2 and Appendix F, 2012; ASHRAE recommended range is added)

Of course, the underlying concern with keeping temperatures high in data centers is that devices could fail sooner, though most IT equipment manufacturers say it is safe to raise intake temperatures to reduce overall facility energy use.
Companies such as Facebook 4 and Google 5 have been proponents of this practice for a number of years.
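The 4 to 5 percent per-degree figure can be turned into a rough estimate of total cooling savings. This is a back-of-the-envelope sketch of ours: the assumption that each degree compounds the same fractional saving, and the 70 F to 80 F example setpoints, are illustrative, not claims from the paper.

```python
def cooling_energy_savings(delta_f, savings_per_degree=0.04):
    """Fraction of cooling energy saved by raising server inlet temperature
    by delta_f degrees Fahrenheit, assuming each degree compounds the same
    fractional saving (a simplifying assumption)."""
    return 1 - (1 - savings_per_degree) ** delta_f

# Illustrative: raising the cold aisle from 70 F to 80 F (a 10 F increase)
for rate in (0.04, 0.05):
    saved = cooling_energy_savings(10, rate)
    print(f"{rate:.0%} per degree -> about {saved:.0%} of cooling energy saved")
```

Under these assumptions, a 10 F setpoint increase would cut cooling energy by roughly a third; actual savings depend heavily on the facility's cooling plant and climate.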

Achieve Power Optimization with Free Cooling

The path to achieving optimization with free cooling begins with good airflow management practices, as described in the U.S. Department of Energy's publication Best Practices Guide for Energy-Efficient Data Center Design 6. As a pioneer in Passive Cooling Solutions to promote free cooling in data centers, Chatsworth Products (CPI) brings an unmatched level of quality, expertise and efficiency to airflow management. CPI Passive Cooling was one of the first solutions to use comprehensive sealing strategies and a Vertical Exhaust Duct to maximize cooling efficiencies at the cabinet level. Now expanded to the aisle level, CPI Passive Cooling and Aisle Containment Solutions (Figure 3) allow data centers to increase heat and power densities by as much as four times their original level and to increase cooling efficiency by nearly threefold.

Figure 3: Good airflow management separates cold and hot airflow pathways within the data center (cold aisle, hot aisles, raised floor and return air plenum), leading to higher temperatures in the hot aisle where rack-mount PDUs are typically placed.

Containment keeps hot and cold air separate within the data center computer room, allowing you to confidently raise room temperature. However, when airflow containment is utilized at either the cabinet or the aisle level, the temperature in the rear of the cabinet or in the hot aisle also becomes significantly higher, which must be taken into consideration when selecting in-cabinet PDUs.

The Importance of a PDU with a High-Temperature Rating

PDUs are usually installed in the back of cabinets, behind the hot air exhaust from equipment, which is potentially the hottest part of every data center (Figure 4). With an expected ΔT across servers ranging from 25 F to 30 F (13.9 C to 16.7 C), the heat at the rear of cabinets or within hot aisle containment can reach 110 F to 115 F (43 C to 46 C). In this type of environment, there are very few devices that can continuously operate reliably and efficiently.

Figure 4: Computational Fluid Dynamics (CFD) model showing hot aisle containment and the resulting increase in hot aisle temperatures.

The Solution: CPI econnect PDUs

CPI econnect PDUs currently have the highest ambient operating temperature rating of any PDU on the market (Figure 5). econnect PDUs have been specially designed and tested for continuous operation in ambient air temperatures up to 149 F (65 C), exceeding the anticipated temperatures in a typical contained aisle.

Figure 5: This graph compares the high-end operating temperatures (in °C) of several manufacturers' rack-mount PDUs, from units by Geist, APC/ServerTech and Eaton/Raritan up to 65 C for Chatsworth Products. The shaded area represents the estimated range of hot aisle temperatures within contained solutions when inlet temperatures are raised to 80.6 F (27 C) and above, with a 25 F to 30 F (13.9 C to 16.7 C) ΔT, at maximum conditions for the recommended and allowable environmental ranges established in the Thermal Guidelines for Data Processing Environments.
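The hot aisle estimate above is simple arithmetic: exhaust temperature is roughly inlet temperature plus the server ΔT. A minimal sketch, using the paper's example setpoints of an 80 F to 85 F cold aisle and a 25 F to 30 F ΔT:

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

def exhaust_temp_f(inlet_f, delta_t_f):
    """Approximate server exhaust (hot aisle) temperature: inlet plus delta-T."""
    return inlet_f + delta_t_f

# Cold aisle raised to 80-85 F with a 25-30 F delta-T across the servers
for inlet in (80, 85):
    for dt in (25, 30):
        hot = exhaust_temp_f(inlet, dt)
        print(f"inlet {inlet} F + dT {dt} F -> hot aisle {hot} F ({f_to_c(hot):.0f} C)")
```

The worst case, 85 F inlet plus a 30 F ΔT, gives 115 F (46 C) at the rear of the cabinet, which is why the PDU's ambient rating matters.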

To keep the product safely operational at such high temperatures, strategically placed air vents, a larger power supply, high-temperature components and other elements were included in the design. econnect PDUs comply with safety standards from the International Electrotechnical Commission and are UL Listed.*

Figure 6: In addition to operating in high ambient temperatures, econnect PDUs feature a compact PDU chassis that fits the small space behind the equipment mounting rails to minimize interference with exhaust airflow, alternating outlet groups to help distribute load more evenly on models with multiple breakers, and a large, centrally located display for easy viewing.

Additionally, the PDU has a small, compact chassis to minimize the space it occupies within the cabinet. On units with multiple breakers, the outlets are arranged in an alternating pattern so you can distribute load more evenly as you plug in equipment. For intelligent PDUs, the LCD display is centrally located so that it is easy to read when the PDU is installed. The unit can be accessed remotely using a web browser for setup, monitoring and control. IP consolidation allows access to up to 20 PDUs through a single Ethernet connection and IP address. Alternately, the PDU supports SNMP so that it can be monitored with third-party monitoring software.

Conclusion

econnect PDUs are the Simply Efficient solution to the ever-increasing demand for reliable power in the data center. econnect PDUs allow for remote access with optional monitoring and switching capabilities on outlets. With more than 180 standard configurations, including high-density models in 50A and 60A 208V that meet power loads of up to 17kW per unit, econnect PDUs withstand the heat loads of any hot aisle containment and are the market's best answer to the growing industrywide demand for High Performance Computing (HPC), virtualization and cloud computing.
*IEC 60950-1:2005, second edition, Information Technology Equipment – Safety – Part 1; UL Listed under category NWGQ: Information Technology Equipment Including Electrical Business Equipment, UL file number E212076.
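As a hedged aside (our arithmetic, not a statement from the paper), the roughly 17kW per-unit figure is consistent with a 60A, 208V three-phase feed under the common 80 percent continuous-load derating practice; the three-phase topology and derating factor are assumptions here.

```python
import math

def three_phase_kw(volts, amps, derate=0.8):
    """Usable power (kW) on a three-phase feed, applying a continuous-load
    derating factor (80% is common North American practice; an assumption)."""
    return volts * amps * math.sqrt(3) * derate / 1000

# 60A breaker at 208V three-phase, derated to 80% continuous load
print(f"{three_phase_kw(208, 60):.1f} kW")  # close to the 17 kW figure cited
```

The result, about 17.3 kW, matches the "up to 17kW per unit" capacity quoted for the high-density models.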

References and Acknowledgements

1 Cisco Systems, Inc. 2013. Cisco Global Cloud Index: Forecast and Methodology, 2012-2017. pp. 1-3. http://www.cisco.com/c/en/us/solutions/collateral/service-provider/global-cloud-index-gci/cloud_index_white_paper.html (or http://www.cisco.com/c/en/us/solutions/service-provider/global-cloud-index-gci/index.html).

2 American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), Technical Committee 9.9, Mission Critical Facilities, Technology Spaces and Electronic Equipment. 2012. Thermal Guidelines for Data Processing Environments, 3rd Edition. pp. 12-15 and Appendix F.

3 U.S. Environmental Protection Agency and Department of Energy. ENERGY STAR program. Server Inlet Temperature and Humidity Adjustments. http://www.energystar.gov/index.cfm?c=power_mgt.datacenter_efficiency_inlet_temp (or http://www.energystar.gov/datacenterenergyefficiency).

4 Facebook. 2011. Reflections on the Open Compute Summit webpage. Building the Next Open Data Center in North Carolina. https://www.facebook.com/note.php?note_id=10150210054588920.

5 Data Center Knowledge. Miller. 2012. Too Hot for Humans, But Google Servers Keep Humming. http://www.datacenterknowledge.com/archives/2012/03/23/too-hot-for-humans-but-google-servers-keep-humming/.

6 U.S. Department of Energy. Office of Energy Efficiency & Renewable Energy. Federal Energy Management Program. Best Practices Guide for Energy-Efficient Data Center Design. pp. 5-8. http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf (or http://www.energy.gov/eere/femp/federal-energy-management-program).

Anderson Hungria
Sr. Product Manager, Power, Electronics & Software, Chatsworth Products

Anderson Hungria graduated from North Carolina State University with a Master's in Electrical and Computer Engineering. He has worked in the data center industry for seven years. Hungria has managed and introduced a variety of power distribution and monitoring products.
He previously worked at Eaton Powerware in the Data Center Solutions and Three-Phase Power groups. Hungria is currently involved with managing rack PDUs, UPS, environmental monitoring and software at Chatsworth Products (CPI).