THE EFFICIENT DATA CENTER: OPTIMIZATION

Special Report

Contents:
RETHINKING DATA CENTER EFFICIENCY
VIRTUALIZATION: THE KEY TO DRIVING EFFICIENCY
STORAGE PLAYS STARRING ROLE IN EFFICIENCY EFFORTS
IMPROVING EFFICIENCY: IT'S ALL ABOUT METRICS
IS THE CONTAINERIZED DATA CENTER AN ANSWER?

RETHINKING DATA CENTER EFFICIENCY

The data center is evolving, and with it the concept of data center efficiency. It's not just that agencies are under pressure to consolidate and reduce the cost of their data center operations. They also are dealing with increasing demands for the applications and services they provide. The old notion of a data center, a building full of servers that could be expanded at will as the demand to host applications increased, is gone for good. The new environment, defined by perpetual budget constraints but still-increasing user needs, requires data centers that deliver more computing power but also cost less to run and take up less space.

The term data center efficiency has stayed the same, but it's the technologies to drive those efficiencies that have changed, said Dennis Tolliver, enterprise server channel sales specialist at Hewlett-Packard. As the technology has developed, other things that we didn't have the means to interpret 10 years ago, such as power usage effectiveness (PUE), have cropped up, he said.

The Federal Data Center Consolidation Initiative (FDCCI), launched by the Office of Management and Budget in 2010, sets the standard for what is expected from government agencies. With a goal of closing at least 40 percent of the government's 3,133 data centers by the end of 2015, the focus now has to be on computing power and density instead of capacity, according to federal CIO Steven VanRoekel. Along with that comes a need for agencies to wrap their arms around a number of different metrics for cost savings. Cutting energy use has been a focus for some years, and power usage effectiveness (PUE), the ratio of the total amount of power used in the data center to the power used just by the IT equipment, is the most common metric used to measure data center efficiency. Now, the FDCCI mandate requires agencies to also include such things as the reduction in floor area and server rack counts in their overall cost metrics.

The availability of new technologies such as virtualization, blade servers and high-density storage is helping to change the measure of data center efficiencies. Rich Campbell, federal chief technologist at EMC Corp., thinks the equation has shifted from how much compute power per square foot can be thrown at an application to how much compute power, storage and network capacity is needed for each application. Compared to ten years ago, I can put the same amount of processing power into maybe a third of the space, he said. That changes the efficiency model, which is no longer based on compute power per square foot but on compute resources per rack unit.

And when it comes to data center space efficiency, a result of the focus on density that VanRoekel talked about, organizations generally have not done a good job, despite the fact that most of them have made it a priority to get more out of their available floor space. It turns out there's a lot more room available than we thought, said David Cappuccio, a managing vice president responsible for data center research at Gartner. In both the private and public sectors, for example, they don't come close to filling up the server racks. They're only around 60 percent full, on average.

Beyond any mandates to make their data centers as efficient as possible, agencies could also have a financial incentive to do so.
Faisal Iqbal, manager for systems engineering in Citrix Systems' public sector group, said there are already very forward-thinking folks at agencies who are looking to create data centers that can house several agencies and share capacity on an as-needed basis. Their thinking is that, if they can build a data center and make it so efficient that they'll have extra capacity, then they can share that and charge for it, he said. That makes the data center a much cheaper proposition than if the agency is just operating it for itself.
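
To make the rack-density point concrete, here is a minimal Python sketch, using hypothetical figures rather than anything reported here, of how an agency might estimate the rack capacity still available when racks average about 60 percent full.

# Hypothetical sketch: estimating unused rack capacity when racks average
# about 60 percent full. The rack count and fill rate are illustrative only.

RACK_UNITS_PER_RACK = 42   # standard full-height rack

def reclaimable_units(num_racks: int, avg_fill: float) -> int:
    """Rack units that could still be populated before adding floor space."""
    total = num_racks * RACK_UNITS_PER_RACK
    used = total * avg_fill
    return int(total - used)

# A 100-rack facility at 60 percent average fill still has 1,680 rack units free.
print(reclaimable_units(num_racks=100, avg_fill=0.60))   # -> 1680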

VIRTUALIZATION: THE KEY TO DRIVING EFFICIENCY

Virtualization is recognized as a key to better utilization of servers, which by extension will lead to more efficient data centers. Increasingly, however, it's also being seen as the foundation on which all other efficiency endeavors should be built. Compared to just a few years ago, virtualization is much more reliable and effective and is considered a proven technology. Much of the current generation of application and management software is written with virtualization in mind, which is essential to providing the redundancy and scalability needed to spread workloads over larger numbers of servers to achieve higher efficiencies.

Three years ago virtualization was a new technology that was neither as reliable nor as effective as it is now, with all of the redundant features it has today, said Dennis Tolliver, enterprise server channel sales specialist at Hewlett-Packard. And if agencies do go with virtualization, they can get as much as a 50 percent greater reduction by moving to newer server platforms as well, rather than simply consolidating in place on older servers. Agencies need to think long term, he said, developing a three-to-five-year outlook on their investments in the data center. So it's not just what they'll save today on their data center renovation, but what they will save over the next few years due to improved performance and efficiencies. Doing it that way, I think they'll see that the payback will be very quick, Tolliver said.

Go beyond the traditional idea of efficiencies, and virtualization offers even more. It's driving more innovative and cost-effective methods of data recovery, image management, lifecycle management and even backup, among other things. When we work with agencies on virtualization, they certainly start out by looking to traditional elements of efficiency, such as the power and cooling savings they will make, said Keenan Baker, inside solutions architect for servers and storage at CDW Government (CDW-G). But as they go through the implementation, they also realize it eases a lot of other issues, such as bringing machines online faster and easier, which also translates into cost savings.

It could take some time for this to become part of the mainstream, however. By and large, agencies are turning to virtualization with traditional expectations of power and cooling savings, said Baker. Other benefits, such as enhanced management capabilities and consolidation of management interfaces, don't yet figure into a lot of buying decisions. Long term, they certainly see a lot of cost savings through virtualization, he said, but there are a lot of these other, more intangible things to do with overall data center efficiency that they have trouble putting numbers to.

It might also require a change in the way even settled ideas associated with virtualization are considered. The FDCCI, for example, requires agencies to look at server virtualization as an increase in the number of virtual servers on each physical host, but that might not be an adequate indicator of efficiencies. What most organizations mean when they say they are near full virtualization is that they are running virtualization on something like 80 percent of their servers, said David Cappuccio, managing vice president at Gartner responsible for data center research. But if you ask them what the server performance levels are in peak hours, it's only in the mid-20s.
What they should be looking at instead is what those peak performance levels are, then set a target for what they want them to be, and then put enough workload on the servers to hit that target, he said. Sometimes it could be one or two workloads, other times 40, but the idea is to get the most performance out of the servers during peak hours; that way they'd get the most data center utilization.
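
As a rough illustration of Cappuccio's peak-hours argument, the short Python sketch below uses assumed numbers, not figures from the report, to estimate how many workloads a host could carry if its peak utilization sits in the mid-20s and the target is 60 percent.

# Illustrative sketch of peak-utilization-driven consolidation planning.
# All figures here are assumptions for the example, not measurements.

def workloads_at_target(current_workloads: int,
                        current_peak_util: float,
                        target_peak_util: float) -> int:
    """Estimate how many workloads a host could carry at the target peak
    utilization, assuming each workload adds roughly equal, additive peak load."""
    scale = target_peak_util / current_peak_util
    return int(round(current_workloads * scale))

# A host running 10 VMs that peaks at 25 percent utilization could, under these
# simplifying assumptions, carry about 24 VMs before peaking near 60 percent.
print(workloads_at_target(current_workloads=10,
                          current_peak_util=0.25,
                          target_peak_util=0.60))   # -> 24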

STORAGE PLAYS STARRING ROLE IN EFFICIENCY EFFORTS

Although storage has always played a part in improving data center efficiency, it's usually been treated as something of an afterthought. That's beginning to change, though, particularly as data centers are increasingly virtualized. In most scenarios, virtualized storage becomes a point of mediation for enabling dynamic resource scheduling, failovers and other vital functions.

The importance of storage in the data center has been recognized by the Green Grid, an international industry consortium that developed the power usage effectiveness (PUE) metric that has driven so much of the data center efficiency improvement over the past few years. Last year, the Green Grid proposed the Data Center Storage Efficiency (DCsE) metric as a way for data center operators to identify inefficiencies in their storage resources, in much the same way that PUE can be used to improve infrastructure energy efficiency.

Most organizations are at least sensitive to the notion that efficiency is more than just infrastructure costs, according to Mark Weis, vice president of federal sales at Spectra Logic. They increasingly look to older storage technologies such as tape libraries as a way of working within power and cooling limitations. The more power and cooling they can save with tape libraries, the more room they have to make decisions with other parts of the IT infrastructure that will inevitably consume more power, he said.

Just from an overall cost perspective, agencies are looking to a variety of different storage technologies to give them the maximum performance for the investments they make in the data center. One example is the solid-state flash drive, which has come down in price to the point where it can be considered a regular part of the storage setup, particularly in virtualized environments. You can mix solid state with [cheaper] SAS [Serial Attached SCSI] rotating drives, which enables you to hit both capacity and performance marks at a very good price point, said Keenan Baker, inside solutions architect for servers and storage at CDW Government (CDW-G). It ends up saving you a bundle of money. And technologies like that are emerging all around the data center as you start to virtualize the environment.

Getting the right mix of technologies also can improve energy efficiency. Many organizations are now considering such things as tiered storage arrangements as a way of reducing costs, moving often-used data to faster and more power-hungry drives while putting the majority of the rest of their data on slower and higher-density storage. That approach worked well for one storage-hungry but space-deficient organization that David Cappuccio, a vice president at Gartner, worked with. He ran a model assuming that 40 percent of its data would be on high-performance drives, with the rest slowly migrating to high-density Serial ATA drives. The organization's storage density went from 700 terabytes to 2.2 petabytes in the same rack space, with an overall drop in power consumption of around 19 percent. Then we went even more aggressive and the total storage went to around 7 petabytes for an overall power increase of just 3 percent, he said. So, just using different technology produced a massive increase in performance and density per square foot of space.

What is clear is that agency demand for data storage will only increase in the coming years as agencies invest more in Big Data, full-motion video and other data-intensive applications.
The federal storage market grew to over $1 billion in fiscal year 2011, according to GovWin Consulting, a 22 percent increase in just three years. With agencies under orders from OMB to cut IT spending, the drive to force data center storage efficiencies will only ratchet higher.
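
The tiering model Cappuccio describes above can be sketched in a few lines of Python; the drive capacities, wattages and counts below are placeholder assumptions chosen only to show the arithmetic, not the numbers from his engagement.

# Minimal tiered-storage sketch: capacity and power for a two-tier drive mix.
# Drive capacities and wattages are illustrative assumptions only.

def tier_totals(drive_count: int, fast_share: float,
                fast_tb: float = 0.6, dense_tb: float = 4.0,
                fast_watts: float = 10.0, dense_watts: float = 6.0):
    """Return (total capacity in TB, total drive power in watts) when
    fast_share of the drives are high-performance and the rest high-density."""
    fast = int(drive_count * fast_share)
    dense = drive_count - fast
    capacity = fast * fast_tb + dense * dense_tb
    power = fast * fast_watts + dense * dense_watts
    return capacity, power

# Shifting the same drive population from all-fast to a 40/60 split raises
# capacity sharply while lowering drive power in this toy model.
print(tier_totals(1000, fast_share=1.0))   # (600.0, 10000.0)  -> 600 TB at 10 kW
print(tier_totals(1000, fast_share=0.4))   # (2640.0, 7600.0)  -> 2.64 PB at 7.6 kW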

IMPROVING EFFICIENCY: IT'S ALL ABOUT METRICS

Measuring data center efficiency is not just a way of seeing how well you are doing, but also of seeing what you could do to increase efficiency. Getting the right metrics by which to measure efficiency is therefore critical to data center planning. It depends very much on what agency officials see as their particular efficiency challenge and what they want to get from the data center. Many agencies are focused on reducing their energy costs. Their goal is to use the smallest amount of power necessary to run their IT and cool the servers. Others are more focused on system uptime and application availability and so might be willing to lose a little on the energy-use side of the equation.

Power usage effectiveness (PUE), a metric developed by the Green Grid, an international industry consortium, has been a major driver of greater energy efficiency in data centers since it was published in 2007. It represents the ratio of the total power used to run the data center facility, including that for lighting and cooling, to the power used just to run the IT equipment. The ideal PUE is 1.0. The FDCCI has made a PUE between 1.3 and 1.6 a target for agencies, while also acknowledging that the typical results achieved in data centers so far are usually in the range of 2.0 to 3.0. Another metric, data center infrastructure efficiency (DCiE), is the inverse of PUE.

The past few years have seen rising interest in factors other than straight efficiency, such as environmental impact. With that in mind, the Green Grid announced in November 2012 that its members had agreed on measurement guidelines and development steps for three new efficiency metrics: Green Energy Coefficient (GEC), Energy Reuse Factor (ERF) and Carbon Usage Effectiveness (CUE). The new metrics go a long way toward addressing the limitations of the original PUE metric, which only really measures how efficiently the energy consumed is being used, consortium officials said in a statement. It doesn't look at the inputs to and outputs from the PUE, nor at the implications in terms of carbon emissions.

However, others question the value of PUE and the other metrics, saying they don't address the larger but related issue of how well data centers deliver on their IT mission. David Cappuccio, who is responsible for data center research at Gartner, said PUE might help an agency reduce energy costs, but it does nothing to help that agency get the most out of its IT resources. PUE makes no sense because it has nothing to do with IT efficiency; it just measures the facility, he said. I tell people that, if you run the IT shop and want to bring in newer technology because it will improve performance and productivity but also happens to be more energy efficient, then PUE can actually get worse even as the IT gets more efficient.

Instead, he proposes using a Rack Unit Effectiveness (RUE) metric. Based on the same optimal-versus-actual-usage concept behind PUE, his RUE would measure how much of the available rack space is being used. This could also be used as a planning tool from a technology refresh angle, he said. If organizations have, say, an 80 percent density in a data center today and want to upgrade half of the racks to current-generation servers, they could use RUE to see the impact this would have on the space itself. That turns out to be an eye-opener for most people, he said.
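
For readers who want the arithmetic behind these metrics, the Python sketch below implements PUE and DCiE as the Green Grid defines them, along with a simple rack-utilization ratio in the spirit of Cappuccio's proposed RUE. The RUE-style formula is an assumption, since the report does not spell one out, and the sample power figures are invented.

# PUE and DCiE follow the Green Grid definitions; the rack-utilization ratio
# is an assumed stand-in for Cappuccio's proposed RUE, which the report
# does not define precisely.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Data center infrastructure efficiency: the inverse of PUE, as a percentage."""
    return 100.0 * it_equipment_kw / total_facility_kw

def rack_utilization(used_rack_units: int, total_rack_units: int) -> float:
    """Hypothetical rack-space metric: share of available rack units in use."""
    return used_rack_units / total_rack_units

# Example: a facility drawing 900 kW in total whose IT gear draws 500 kW.
print(pue(900, 500))                  # 1.8 -- above the FDCCI target of 1.3 to 1.6
print(dcie(900, 500))                 # about 55.6 percent
print(rack_utilization(2520, 4200))   # 0.6, i.e. racks are 60 percent full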

IS THE CONTAINERIZED DATA CENTER AN ANSWER?

Improving data center efficiency usually means upgrading a lot of the server, storage, power and cooling systems with newer technology that is both more energy-efficient than the old and smaller, which helps in meeting density requirements. But that costs money, time and labor, so why bother? Why not just buy a new, efficient data center in one go?

That's the idea behind containerized data centers, a growing part of the overall data center business. They are literally data centers housed in shipping containers, with servers, power supplies and cooling systems delivered as a single unit. Just plug them in and go.

NASA was one of the first government agencies to use containerized systems, using them to help build its Nebula cloud computing platform, which provides compute and storage capacity on demand for NASA scientists and researchers. The containers enabled the agency to set up the data center infrastructure needed for Nebula in a fraction of the time and at much lower cost than it would have needed to actually build new data centers. Each container holds up to 15,000 CPU cores and some 15 petabytes of storage. Built on green energy principles, they were rated as some 50 percent more efficient than traditional, brick-and-mortar data centers when they were rolled into the NASA Ames Research Center site in 2009. They also helped cut the average data center planning cycle from around two years to just 120 days. NASA Goddard also is planning to install a containerized data center as one of the three that will remain after it has consolidated down from its previous 13 data centers.

The Army is another government organization that has recently opted for containerized data centers, as part of its five-year, $250 million Area Processing Centers Army Private Cloud (APC2) initiative, for which it made awards in January 2012. The Army plans to close around 185 of its data centers by the end of 2015 as part of the Federal Data Center Consolidation Initiative (FDCCI), but it also has a moratorium in place against building any new data centers, so the containers provide a ready answer for new data center capacity for its private cloud initiative.

However, the verdict is still out on the ultimate effectiveness of containerized data centers. You still have to find enough space to put the containers on site, provide security for them, and hook up power and chilled-water supplies. Meanwhile, modern server technology provides modular and relatively cheap ways to upgrade existing data centers. Blade servers allow for high-performance compute power in less space than traditional rack-mounted servers, use fewer power supplies and need fewer fans and less cooling. Intelligent power distribution units allow data center managers to know exactly where the power is going, how much is being used and even how hot servers are running.

When you look at the technology behind containerized data centers, it's based on virtualization with a management layer on top of that, said Faisal Iqbal, manager for systems engineering in Citrix Systems' public sector group. That's pretty much the same approach that agencies are using to modernize their existing data centers. Containerized data centers definitely have some direct application to tactical scenarios in the military, he said, and even to some civilian agencies that want to drive down physical maintenance costs in their data centers.
But you can derive the same kind of efficiencies with the data centers you own today, if you do things correctly, he said. There's no reason why you can't make them as efficient as these containerized data centers.
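
As a side note on the intelligent power distribution units mentioned above, the Python sketch below shows the kind of roll-up such metering makes possible; the outlet readings are made-up sample data, and a real deployment would poll the PDUs over SNMP or a vendor API rather than use a hard-coded dictionary.

# Illustrative sketch of the visibility metered PDUs provide: rolling
# per-outlet readings up to rack-level power. The readings are invented.

from collections import defaultdict

# (rack, outlet) -> watts, as might be collected from metered PDUs
readings = {
    ("rack-01", 1): 310.0,
    ("rack-01", 2): 275.5,
    ("rack-02", 1): 420.0,
    ("rack-02", 2): 390.0,
}

def power_per_rack(samples: dict) -> dict:
    """Sum outlet-level watt readings into a per-rack total."""
    totals = defaultdict(float)
    for (rack, _outlet), watts in samples.items():
        totals[rack] += watts
    return dict(totals)

print(power_per_rack(readings))   # {'rack-01': 585.5, 'rack-02': 810.0}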

SPENDING VIRTUALIZATION DOLLARS. NOT GETTING ANY CHANGE. SOLVED. A surplus of servers. A shortage of progress. It's a virtualization downturn. And we've got the expertise to help bail you out. Dedicated account managers. Solution architects. And industry-leading partners like HP. From assessment to implementation, we'll help revitalize your virtualization plan, jump-starting efficiency for some much-needed change. Virtualization becomes a reality at CDWG.com/datacenter

© 2013 CDW Government LLC. CDW, CDW G and PEOPLE WHO GET IT are trademarks of CDW LLC.