WHITE PAPER Creating the Green Data Center Simple Measures to Reduce Energy Consumption

Introduction

Energy Awareness Driving Decisions in the Data Center

The continued thirst for energy is a recurring story in the news every day. Global warming forecasts rising temperatures, melting ice, and population dislocations due to the accumulation of greenhouse gases in our atmosphere from the use of carbon-based energy. There are strong arguments for and against the direst predictions, yet one fact is undeniable: over the past 10 to 20 years, the inhabitants of Earth have collectively been consuming more energy, at a faster rate, than ever before.

Nowhere is this more apparent than in the data center, where power consumption has doubled in the past five years and is expected to rise even more steeply, by 76% from 2005 to 2010. One culprit is the steadily increasing power requirement of servers. According to IDC (2006), the average small-to-medium server required 150 W of power in 1996; by 2010, these servers will require over 450 W. Increased power requirements mean increased heat to dissipate, which drives the other major culprit for energy use in the data center: cooling. One survey of IT executives shows that 45% of data center energy consumption goes to chillers, cooling towers, and computer room air conditioners (CRACs). According to IDC (2006), in the year 2000, for every $1 spent on new servers, 21 cents was spent on power and cooling; by 2010, IDC predicts that every $1 spent on new servers will require 71 cents for power and cooling. This massive increase has led to the formation of industry consortiums, such as The Green Grid℠, that focus specifically on lowering power consumption in the data center.

For some businesses, increased energy costs are simply a cost of doing business. Yet there is a point at which these costs dampen profits and limit the investment needed to grow and modernize a business. Worse are the electricity shortages occurring in pockets across India that prevent businesses from expanding data center operations to keep pace with company growth.

There are many ways to conserve electricity in the data center. For example, server virtualization allows multiple applications to run on individual servers, which means fewer servers to power and cool; in practice, a data center may be able to reduce its server count from 70 to 45. Virtualization exploits the fact that a server gives off much the same heat whether it is running at 20% or 90% utilization, so consolidating workloads onto fewer, busier servers dramatically reduces power and cooling costs across the data center. Yet there are many other ways to reduce power and cooling costs in the data center, ways that are far simpler and less expensive to implement.

Airflow Management in Cabinets

New server platforms can support 800 to 1,000+ optical fiber terminations or 600 to 1,000+ copper cable terminations per chassis. Crowding that many cables into vertical managers poses a problem for thermal management in cabinets. When air cannot circulate properly in the cabinet, fans are called upon to move more air and cooling units to lower the air temperature further, both of which consume additional, unnecessary electricity.

For years the IT industry has promoted the benefits of increased rack and cabinet density: servers are smaller than ever and more of them fit into the same space. The rationale has always been to make the best use of data center floor space. Today the balance is shifting. New servers consume more energy than ever before, forcing data center and facilities managers to weigh the rising operating cost of that energy against the capital cost of the floor space wasted by lower-density configurations in raised-floor environments.

Instead of focusing on density alone, energy efficiency demands that data center and facilities managers look at managed density. Managed density recognizes that there is a practical limit to the number of cable terminations and servers that can safely and economically be housed in a cabinet. A prime issue is the potential blocking of airflow caused by too many cables within the cabinet. One solution is to limit the number of servers and cable terminations in a cabinet, especially in copper racks, where cable diameter is larger. Another is to employ basic cable management within the cabinet, such as securing cables along the entire length of vertical cable managers to open up airflow. Similarly, integrated slack management systems locate and organize patch cords so that maximum space is available for cool air to flow into and out of the cabinet.

Using smaller-diameter copper cable is another way to improve airflow within the cabinet. In many data centers, copper equipment terminations are still prevalent, especially with the ability to push 10 Gb/s over Augmented Category 6 cabling. The choice of copper cabling can affect airflow because some cables have a much smaller outside diameter. For example, ADC KRONE's AirES technology provides conductor insulation that allows the cable to exceed electrical performance standards while using smaller-gauge copper and less insulating material. The result is a cable with an average outside diameter 28 to 32 percent smaller than standard Category 6 or Category 6a cables. Less cable bulk means less blockage in the cabinet, allowing air to flow more freely and cool equipment, which in turn uses less electricity. With proper cable management and smaller-diameter cables, a vertical cable guide fill ratio of 60 percent supports higher-density configurations without compromising airflow; higher server density becomes possible without added electricity use for fans and cooling equipment.
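As a rough illustration of managed density, the sketch below estimates how many cables fit in a vertical cable manager at a 60 percent fill ratio. The manager dimensions and cable outside diameters are assumed example values for illustration only, not ADC KRONE specifications.

```python
import math

def cables_per_manager(manager_width_in, manager_depth_in, cable_od_in, fill_ratio=0.60):
    """Estimate how many round cables fit in a vertical cable manager.

    Assumes the usable cross-section is width x depth and that only
    `fill_ratio` of that area may be occupied by cable; the remainder is
    left open for airflow and for moves, adds, and changes.
    """
    usable_area = manager_width_in * manager_depth_in * fill_ratio
    cable_area = math.pi * (cable_od_in / 2) ** 2
    return int(usable_area // cable_area)

# Assumed values: a 4 in x 6 in vertical manager, standard Category 6 at
# ~0.23 in outside diameter versus a reduced-diameter cable ~30% smaller.
standard = cables_per_manager(4.0, 6.0, 0.23)
reduced = cables_per_manager(4.0, 6.0, 0.16)
print(f"Standard Cat 6: ~{standard} cables at 60% fill")
print(f"Reduced-OD cable: ~{reduced} cables at 60% fill")
```

Under these assumptions the smaller-diameter cable roughly doubles the number of terminations a manager can hold at the same fill ratio, which is the trade managed density is meant to capture.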

Airflow Management in the Data Center

There are many simple solutions to improve overall airflow efficiency in the data center that can be implemented immediately, without major changes to the design and layout of the data center. In general, unrestricted airflow requires less power for cooling. Each incremental improvement means less energy is needed to cool equipment, reducing costs and limiting the greenhouse gases emitted by the power company. These simple solutions include the following:

- Plug unnecessary vents in raised-floor perforated tiles.
- Plug other leakages in the raised floor by sealing cable cutouts, sealing the spaces between floors and walls, and replacing missing tiles.
- Reduce air leakage by using gaskets to fit floor tiles more securely onto floor frames.
- Ensure that vented floor tiles are properly situated to reduce hot spots and wash cool air into equipment air intakes.
- Manage heat sources directly by situating small fans near the heat source of equipment.
- Use time-of-day lighting controls or motion sensors to dim the lights when no one is in the data center; lights use electricity and generate added heat, which requires added cooling.
- Reduce overall data center lighting requirements by using small, portable lights within each cabinet, putting light only where technicians need it.
- Turn off servers not in use.

There are also many avenues for improving airflow that require more planning and execution. The most documented and discussed is the hot aisle/cold aisle configuration for cabinets. This raised-floor design manages airflow and temperature by keeping the hot aisles hot and the cold aisles cold. Servers and other equipment are positioned in cabinets so that air inlet ports face only into cold aisles, while hot-air exhaust outlets face only into hot aisles. Cool air is pushed through perforated floor tiles into the cold aisles only, and equipment exhausts its hot air into the hot aisles.

[Figure: hot aisle/cold aisle layout showing cabinet front/rear orientation, telecom cable trays, power cables, and perforated tiles.]

Designing hot aisle/cold aisle presents its own set of challenges, including the following:

- Ensuring that the cool air supply is adequate for the space
- Sizing aisle widths for proper airflow
- Positioning equipment so hot air does not recirculate back into equipment cool-air inlets
- Adding or removing perforated floor tiles to match the air inlet requirements of servers and other active equipment (a worked airflow estimate follows at the end of this section)
- Accounting for aisle ends, ceiling height, and above-cabinet blockages in airflow calculations

Another ready means to improve cooling is removing blockages under the raised floor. The basic cable management technique of establishing clearly defined cable routing paths with raceway or cable tray under the floor keeps cables organized, using less space and avoiding the tangled mess of cables that can restrict airflow. Moving optical fiber cables into overhead raceway, and removing abandoned cable and other unnecessary objects from below the floor, also improves airflow.

Dust and dirt are enemies of the data center. Dust has a way of clogging equipment air inlets and clinging to the inside of active equipment, and all of it demands more airflow and more cooling dollars. There is probably already an active program for cleaning above the raised floor; it is just as important to periodically clean below the raised floor to reduce dust and dirt in the air.
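To make the perforated-tile guidance concrete, a common rule of thumb for standard air relates a heat load to the cooling airflow needed to carry it away: Q [BTU/hr] ≈ 1.085 × CFM × ΔT [°F], with 1 kW ≈ 3,412 BTU/hr. The sketch below applies this to a single rack; the 6 kW load and 20 °F inlet-to-exhaust rise are assumed example values, not figures from this paper.

```python
def required_cfm(heat_load_kw, delta_t_f=20.0):
    """Approximate airflow (CFM) needed to carry away a given heat load.

    Uses the standard-air relationship Q[BTU/hr] ~= 1.085 * CFM * dT[degF],
    with 1 kW = 3412 BTU/hr. Real designs should use measured conditions.
    """
    btu_per_hr = heat_load_kw * 3412.0
    return btu_per_hr / (1.085 * delta_t_f)

# Assumed example: a 6 kW rack with a 20 degF rise from inlet to exhaust.
print(f"~{required_cfm(6.0):.0f} CFM of cool air needed")  # roughly 940 CFM
```

A typical perforated tile delivers only a few hundred CFM, which is why matching the number and placement of tiles to the racks they feed matters so much.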

There are many other initiatives that can be implemented to improve airflow throughout the data center and reduce energy costs. These include the following:

- Move air conditioning units closer to heat sources.
- During cooler months and in the cool of the evening, use fresh air instead of re-circulated air.
- Reduce hot spots by installing blanking panels, which raises CRAC return air temperature.
- Consider using ducted returns.

According to one study, implementing the hot aisle/cold aisle configuration can reduce electrical power consumption by 5 to 12 percent. The same study showed that even simple measures, such as proper placement of perforated floor tiles, can reduce power consumption by as much as 6 percent. By implementing even the smallest measures, power consumption can be meaningfully reduced.

[Figure: clearly defined cable routing paths keep cables organized, using less space and avoiding the tangled mess of cables that can restrict airflow; moving optical fiber cables into overhead raceways opens up airflow underneath floor panels.]
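As an illustration only, the sketch below estimates what a 5 to 12 percent cut in cooling power could be worth per year. The 200 kW cooling load and $0.10/kWh tariff are assumed values, not figures from the study cited above.

```python
def annual_cooling_savings(cooling_load_kw, pct_saved, price_per_kwh=0.10):
    """Estimate yearly savings from reducing cooling power draw.

    cooling_load_kw: average electrical draw of chillers/CRACs, in kW.
    pct_saved: fractional reduction, e.g. 0.05 for 5 percent.
    price_per_kwh: assumed electricity tariff, in dollars per kWh.
    """
    hours_per_year = 8760
    return cooling_load_kw * pct_saved * hours_per_year * price_per_kwh

# Assumed example: 200 kW of cooling load, 5% and 12% reductions, $0.10/kWh.
for pct in (0.05, 0.12):
    print(f"{pct:.0%} reduction: ${annual_cooling_savings(200, pct):,.0f}/year")
```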

Selecting Green Cable

It was shown earlier that cable made with AirES conductor insulation is smaller in diameter and therefore contributes to improved airflow and reduced energy costs. For the truly environmentally conscious, copper cable made with AirES technology offers another important benefit: less material is used in its construction. By combining channels of air with traditional FEP insulation, AirES delivers superior electrical performance with a smaller outside cable diameter. The cable is 28 to 32 percent smaller simply because less copper and less FEP go into the manufacturing process. For every 1,000 feet of cable made with AirES conductor insulation, 1.45 fewer pounds of FEP insulation and 1.25 fewer pounds of copper are required. The difference may be hard to measure on any single project today, but cable made with less insulation and copper sends less installation scrap, and eventually less abandoned cable, to the landfill. For recycled cable, less material also means less energy for the recycling process.

[Figure: AirES technology. Raw material savings of up to 1.25 lbs of copper and 1.45 lbs of FEP per 1,000 ft of Category 6 cable; air channels allow a smaller-diameter cable while also using less FEP insulation; smaller twisted pairs equal smaller cables.]
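A quick way to see the raw-material impact at project scale is to multiply the per-1,000 ft savings quoted above by the total cable length. In the sketch below, the run count and average run length are assumed example values.

```python
def material_saved(num_runs, avg_run_ft, fep_lbs_per_kft=1.45, copper_lbs_per_kft=1.25):
    """Total FEP and copper avoided (in pounds) versus standard Category 6,
    using the per-1,000 ft savings figures cited in this paper."""
    kilo_feet = num_runs * avg_run_ft / 1000.0
    return kilo_feet * fep_lbs_per_kft, kilo_feet * copper_lbs_per_kft

# Assumed example: 2,000 cable runs averaging 150 ft each (300,000 ft total).
fep, copper = material_saved(2000, 150)
print(f"FEP saved: {fep:.0f} lbs, copper saved: {copper:.0f} lbs")
```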

LEED Certification and the Data Center

Leadership in Energy and Environmental Design (LEED) was developed by the U.S. Green Building Council (USGBC), a non-profit organization of building industry leaders promoting facilities that are environmentally responsible while also being profitable and healthy places to live and work. The LEED Green Building Rating System is the internationally accepted benchmark for the design, construction, and operation of LEED-certified buildings. Its charter: to build and operate an environmentally and socially responsible, healthy, and prosperous environment that improves the quality of life for all stakeholders. Energy efficiency is a major component of LEED certification.

LEED certification is much more than a stamp of approval. In general, LEED buildings offer owners significant rewards; they:

- have lower operating costs
- are healthier for occupants
- conserve water and energy
- offer increased asset value
- demonstrate the owner's commitment to community
- satisfy a requirement applied to most new federal, state, and municipal buildings
- qualify for tax and zoning allowances and other incentives

Building owners earn LEED certification by accumulating points for design and usage in five key areas defined by the LEED rating system. It is important to note that projects (buildings) are certified, not products; today, products and services do not earn LEED project points. Projects earn points during the certification process and are then awarded one of four certification levels. While there is no direct correlation between product selection and LEED certification, certain product choices can prove critical to overall project certification. Products such as AirES cable, FiberGuide, and RiserGuide, to name just a few, can greatly improve passive airflow, thereby improving overall energy efficiency, one of the key elements of LEED certification.

ADC KRONE's subject matter expertise with infrastructure upgrades can deliver major benefits to managers whose attention is usually on bigger-ticket items than infrastructure, and the savings are hard to ignore. According to IBM, infrastructure upgrades can result in cooling cost savings of 15 to 40%. Data center engineers and designers can leverage ADC KRONE's product and design expertise to realize these immediate benefits, freeing them to focus on other areas critical to cost and energy savings, as well as pending site certification.

Conclusion

There are many ways to reduce cooling requirements in data centers. Improving cable management, stopping air leakage, removing cable dams under the floor, and choosing smaller-diameter cable to improve airflow are just a handful of the measures available to data center planners and managers. LEED certification, or any initiative that conserves energy and saves money, makes sense from many angles, especially for the power-hungry data center: an improved work environment for people and equipment, a corporate commitment to community and environment, and lower operating costs from reduced energy consumption. ADC KRONE supports green data centers by designing and manufacturing solutions for passive airflow improvement that reduce cooling and power requirements. A well-planned infrastructure makes a tremendous difference, and ADC can get you there today. For more information, contact ADC KRONE at 1800 425 8232, write to truenetindia@adckrone.com, or visit www.adckrone.com/in

WHITE PAPER www.adckrone.com/in

Corporate Office & Factory: P B No. 5812, 10 C II Phase, Peenya, BANGALORE - 560 058, India. Ph: +91 80 2839 6101 / 6291, Fax: +91 80 2372 2753. Toll free: 1800 425 8232
NEW DELHI Ph: +91 99580 81515 | CHENNAI Ph: +91 98400 22427
KOLKATA Ph: +91 98310 08473 | SECUNDERABAD Ph: +91 98494 53556
MUMBAI Ph: +91 99673 59411 | PUNE Ph: +91 98602 82629

ADC Telecommunications, Inc., P.O. Box 1101, Minneapolis, Minnesota USA 55440-1101

Specifications published here are current as of the date of publication of this document. Because we are continuously improving our products, ADC reserves the right to change specifications without prior notice. At any time, you may verify product specifications by contacting our headquarters office in Minneapolis. ADC Telecommunications, Inc. views its patent portfolio as an important corporate asset and vigorously enforces its patents. Products or features contained herein may be covered by one or more U.S. or foreign patents. An Equal Opportunity Employer

400659_IN 03/08 © 2008 ADC Telecommunications, Inc. All Rights Reserved