Data Center Energy Consumption




The digital revolution is here, and data is taking over. Human existence is being condensed, chronicled, and calculated, one bit at a time, in our servers and tapes. From several perspectives this is a good thing: it allows us to take information that would have previously filled warehouses with paper, and calculations that would have taken warehouses of mathematicians, and consolidate these functions into a neat little box.

But there's a hidden cost to all of this. For every tweet, photo, or blog post, there is a physical manifestation somewhere in the world. It's so tiny you can't see it, but when you add these little nothings up by the trillions, they start to take shape in the form of miles of server farms and storage racks. Here's the real cost: these things use a lot of energy! Servers need power to process our data, and this power creates heat, which has to be dealt with to keep the servers working. As servers increase in density, and we can do even more things inside our neat little boxes, their heat output becomes immense, and they require specialized systems to efficiently reject this heat.

In 2013, data centers in the U.S. consumed an estimated 91 billion kilowatt-hours [1] of electricity. Per the U.S. Energy Information Administration [2], that's about 7% of total commercial electric energy consumption. This number is only going to go up: by 2020, consumption is estimated to increase to 140 billion kilowatt-hours [1], costing about $13 billion in power bills.

So where does this energy go? As I said before, the servers themselves need power to do their thing, but the cooling system consumes a lot of energy as well. In cutting-edge data centers, the cooling system uses a fraction of the energy of the servers. But in less efficient data centers, cooling systems can use just as much or even more power than the IT equipment.

Figure 1: Energy Usage in Data Centers [3]

Lucky for us, we have a handy metric to capture just how efficiently a data center is operating. It's called Power Usage Effectiveness (PUE), and it measures how much additional energy a data center consumes beyond what is directly used by the IT equipment. PUE is calculated as the total energy of the facility divided by the IT equipment energy; the higher the PUE, the less efficiently that data center is performing.

Figure 2: PUE Calculation [4]

Operators are (or should be) constantly striving to reduce this number, because every dollar spent on energy is one less dollar that can be spent on operations. The best data centers in the world are pushing this number very close to 1.0, which is ideal efficiency. The reality is that most existing data centers are nowhere close to this holy grail. In fact, a 2013 survey by Digital Realty Trust [5] revealed the average PUE of large data centers to be a shocking 2.9! In plain terms, this means that for every dollar spent on IT power, almost two dollars are spent to keep the IT housed and comfortable. With modern technology, we can do so much better!

PUE is somewhat climate-dependent; it's not apples-to-apples to compare a data center's performance in Dubai to one in Helsinki. But by and large, a data center's inefficiencies are the result of poor design, operation, or maintenance of the cooling system. It's very common to walk through a data center and see too many fans spinning and airflow wasted. Further, many existing data centers still operate under the old notion that server rooms must be so cold you need a parka to walk through them.
Recent guidance from ASHRAE [6] has taught us that servers are much tougher than we used to think, but designers and operators alike are still catching on to this.

References:
1. http://www.nrdc.org/energy/data-center-efficiency-assessment.asp
2. http://www.eia.gov/forecasts/aeo/data.cfm#enconsec
3. http://ecmweb.com/energy-efficiency/data-center-efficiency-trends
4. http://www.thegreengrid.org/~/media/whitepapers/wp49-PUE%20A%20Comprehensive%20Examination%20of%20the%20Metric_v6.pdf?lang=en
5. http://www.computerworld.com/article/2496597/data-center/new-data-center-survey-shows-mediocre-results-for-energy-efficiency.html
6. http://tc99.ashraetcs.org/
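
To make the PUE arithmetic above concrete before moving on, here's a minimal sketch in Python. The function and the energy figures are my own illustration (chosen to reproduce the 2.9 survey average), not measurements from any real facility:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical annual figures, chosen to match the 2.9 survey average above.
it_kwh = 1_000_000        # energy delivered to servers, storage, and network
facility_kwh = 2_900_000  # IT energy plus cooling, fan, UPS, and lighting loads

ratio = pue(facility_kwh, it_kwh)
print(f"PUE = {ratio:.1f}")                              # 2.9
print(f"Overhead per $1 of IT power: ${ratio - 1:.2f}")  # $1.90, "almost two dollars"
```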

Optimizing Data Center Airflow

There are a lot of components in your average data center's cooling system. There need to be mechanisms not only to pick up the heat from the servers, but to carry it out of the building. These include computer-room air conditioners (CRACs), chillers, condensers, cooling towers, and so on. The point is that there is not simply one line of attack for reducing a data center's PUE.

The primary line of attack, in my view, should always be the energy expended to remove the heat from the servers. This is always done through fluid exchange. As server farms become more and more dense, oil- and water-cooled servers are becoming increasingly prevalent. For the present and near future, however, air remains the standard for most data centers.

Figure 3: Data Center Air Flow [7]

The best air-cooled data centers circulate just enough air to accomplish the primary goal of keeping the servers happy. Contrary to what it may seem, there is a serious energy cost to moving air around a data center. Data center air conditioners are far from box fans! They have high-power motors and use some serious juice. Not only that, but circulating too much air can greatly reduce the effectiveness of the heat-rejection portion of the cooling plant. As such, optimizing the airflow of a data center becomes mission #1 in keeping PUE under control. With ideal air circulation, fans run at their most efficient point, and it becomes much easier to tackle efficiency on the heat-rejection side (chillers, condensers, cooling towers, etc.).

With a small server room, designing for efficient airflow is a matter of simple best practices. But in a 10,000-square-foot data center with hundreds of server racks, it gets a whole lot more complicated. Airflow networks in large, underfloor air systems are very complex and can be very tricky to balance with simple best practices. This is where advanced analysis is absolutely critical.

Computational Fluid Dynamics (CFD) in Data Centers

Since we're in the digital mood today, here's a less-than-140-character introduction to computational fluid dynamics (CFD): CFD takes a very complex fluid dynamics problem and breaks it up into tiny chunks that computers can solve.

Figure 4: A server room CFD model is broken into thousands of finite elements for computer analysis

We don't quite have the computing power to predict where every molecule will go, but by breaking up a complex problem into discrete boxes and solving for the conditions of each box, computers can quickly and accurately simulate fluid behavior. This helps us predict how air and heat will move around in a complex airflow network. For reasons that I hope are now obvious, this is extremely useful in data centers!
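
To give a feel for that discretize-and-solve idea, here's a deliberately tiny sketch in Python. This is not real CFD; a production data center model solves coupled airflow and heat transport in three dimensions, while this toy solves only steady heat diffusion on a 2D grid, with made-up room dimensions, supply temperature, and rack heat load:

```python
import numpy as np

# Steady-state heat diffusion on a 2D grid standing in for a room cross-
# section. Each cell is one of the "discrete boxes" described above: it
# relaxes toward the average of its four neighbors until nothing changes.

NX, NY = 60, 30                      # grid resolution (hypothetical room)
T = np.full((NY, NX), 25.0)          # initial guess: 25 C everywhere
SUPPLY = 18.0                        # assumed cold-air supply temperature
T[:, 0] = SUPPLY                     # left wall: cooling supply boundary

rack = np.zeros_like(T, dtype=bool)  # a block of cells acting as a server rack
rack[10:20, 30:35] = True
Q = 0.8                              # heat added per rack cell per sweep (arbitrary units)

for sweep in range(20_000):
    T_new = T.copy()
    # Finite-difference form of the steady heat equation: each interior
    # cell takes the mean of its north/south/east/west neighbors.
    T_new[1:-1, 1:-1] = 0.25 * (T[:-2, 1:-1] + T[2:, 1:-1] +
                                T[1:-1, :-2] + T[1:-1, 2:])
    T_new[rack] += Q                 # the servers keep injecting heat
    T_new[:, 0] = SUPPLY             # hold the supply boundary fixed
    if np.max(np.abs(T_new - T)) < 1e-4:
        T = T_new
        break
    T = T_new

print(f"Converged after {sweep} sweeps; hottest cell: {T.max():.1f} C")
```

The relaxation loop is the whole trick: a continuous room becomes a few thousand coupled cells, and iterating a simple local rule recovers the global temperature field.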

Data center CFD helps answer detailed questions like:

- How much airflow are my servers really consuming?
- Do I need to pay for full aisle containment, or can I do just as well with end panels?
- How high can I set my cooling temperature to save energy?
- Do I really need this many CRACs?
- Do I really need this many perforated tiles?
- Is that column going to mess anything up?

I often encounter skepticism when it comes to the true impact of CFD in a data center. I try to counter this with two points. First, CFD is based on tested results from real-world components. For example, components in our data center CFD models are calibrated to actual vendor performance data. This provides a high level of assurance that the results in the model will match the conditions in the real world. On top of that, there are countless case studies of CFD saving data center owners thousands of dollars in wasted energy.

Second, if you're going on a trip, would you rather have a general direction, or a map? If I wanted to go to Boston, I could just head north and maybe end up there, or I might end up in Vermont. Close, but not really where I wanted to go. If I have a map (or, in the theme of the day, a GPS-enabled smart dashboard), I'll make sure I get to Boston. Hopefully in time for a nice crab cake dinner.

Figure 5: Air Streamlines for Server CFD Analysis

In data center design, best practices are the general direction, and CFD is the map. Best practices will get me on my way to a good design. I know there are certain ways that servers need to be oriented, and I know general rules for the amount of air and cooling I need. But without a map, it's very likely I'm going to over-design and put in too much equipment. This leads to wasted energy, and often to unintended consequences and imbalanced conditioning of my IT equipment. If you're designing or operating a data center, do yourself a favor and go get yourself a map!

A CFD Case Study

So what's the best time to use CFD? Before you start building! CFD is a very useful tool for retrofitting existing data centers, but its best use is as a predictive tool, before the first tile is installed.

In 2014, we had the good fortune of working with one of our clients to help them solve a data center problem. The Thomas Jefferson National Accelerator Facility, or Jefferson Lab, houses a large particle accelerator nearly a mile in length. It is also one of 17 national laboratories funded by the U.S. Department of Energy (DOE).

Figure 6: Jefferson Lab Accelerator Site in Newport News, VA

These fundamental traits create an interesting convergence for Jefferson Lab. As you can imagine, particle accelerator experiments crank out a lot of data. As a DOE-funded laboratory, energy efficiency is also very important to their operations. It should come as no surprise, then, that Jefferson Lab faced some critical decisions when it came time to expand their existing data center.

Shifting space constraints and expanding IT needs had necessitated that Jefferson Lab consolidate several data rooms into a single, central data center. As with many data centers, this had to be done with minimal disruption to operations and without a loss of computing capacity. To top it off, the PUE goal for the data center was established at 1.4, an aggressive number for the often hot and humid climate of coastal Virginia. This required some careful thought and planning.

By working directly with data center personnel, we were able to develop a detailed phasing plan and a conceptual design to safely and comfortably consolidate Jefferson Lab's IT operations. But to provide a true map to a 1.4 PUE, we had to move beyond traditional design.

Figure 7: Jefferson Lab CEBAF Data Center Rack Temperature Profiles

By meeting and planning directly with IT personnel, we developed a predictive CFD model of Jefferson Lab's expanded data center. This allowed optimization of the proposed data center layout before relocating a single server. The benefits of this optimization were tangible and achieved several key outcomes for our client. In completing predictive CFD, we were able to:

- Confirm the proposed cooling solution would maintain inlet conditions for all servers within ASHRAE recommendations
- Provide assurance of continuous cooling in the event of failure of a single CRAC unit
- Consolidate cold aisles to free space for future expansion
- Validate the proposed aisle containment solution for energy efficiency
- Establish server airflow limits to be used in IT equipment procurement
- Elevate the recommended cooling supply temperature to expand the availability of free cooling throughout the year

To tie it all together, we were able to feed the results of the CFD study back into our energy analysis engine. The data center airflow optimization, coupled with improvements to the chiller plant, demonstrated a predicted annual PUE below the 1.4 threshold.

Figure 8: PUE Progression with Recommended Data Center Improvements

The end result of this analysis was an optimized design, with the design team and client given much greater assurance that the retrofit would achieve the ultimate energy-efficiency goals of the project.
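
As a closing illustration of how an annual PUE rolls up from seasonal operation, here's a minimal sketch in Python with hypothetical monthly figures (not Jefferson Lab's actual data). Annual PUE is the year's total facility energy divided by the year's total IT energy, so months with more free cooling pull the average down:

```python
# Annualized PUE from monthly energy totals (all numbers hypothetical).
# Free cooling in cooler months lowers facility overhead, so the annual
# figure is an energy-weighted result across the whole year.

it_kwh_per_month = [200_000] * 12   # assume a flat IT load for simplicity

# Assumed cooling-overhead fraction by month: low in winter (free cooling),
# high in a hot, humid summer.
overhead_fraction = [0.25, 0.25, 0.30, 0.35, 0.45, 0.55,
                     0.60, 0.60, 0.50, 0.40, 0.30, 0.25]

facility_kwh = [it * (1 + oh)
                for it, oh in zip(it_kwh_per_month, overhead_fraction)]

annual_pue = sum(facility_kwh) / sum(it_kwh_per_month)
print(f"Predicted annual PUE: {annual_pue:.2f}")   # 1.40 with these inputs
```

Note that annual PUE is an energy-weighted figure, not a simple mean of twelve monthly PUEs; a flat IT load makes the two coincide here, but a seasonal IT load would not.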