Cloud Building Blocks: Paving the Road

By understanding what provides a strong foundation for the cloud, CIOs can help make their deployments smooth and cost-effective.

Companies are turning to cloud computing for one predominant reason: It can help reduce the cost of doing business. That's a concise and accurate answer, but it also oversimplifies the multitude of ways companies can save time, money and other resources with cloud computing.

When IDG Research Services asked corporate executives to cite the technical benefits they had achieved from cloud computing deployments over the previous 12 months (IDG Research Services study "Cloud Computing: Viewpoints of IT and Finance Executives," January 2012, sponsored by AMD), no single response significantly outnumbered any other. The benefits cited ranged from improved data access and infrastructure simplification to more-efficient application delivery and improved data management (see Chart 1).

Yet when the survey respondents turned to the financial benefits, one answer stood out above the rest: return on investment. A small group (7 percent) stated that they had achieved return on their investment in less than three months, 41 percent had measured ROI in less than 12 months and another 28 percent had measured ROI in one to two years. That represents a high level of ROI in a short time for a relatively new capability, and it indicates that companies are highly satisfied with their results.

How can companies that are just beginning to consider the cloud achieve similar results? Perhaps one of the first things to understand is that although companies can achieve a multitude of benefits, those benefits stem from a common technology foundation. Picture a pyramid: At the pinnacle are the various benefits that survey respondents cited, and its base represents the foundational aspects of cloud computing.
No matter which benefits you want to achieve at the top, your success can depend on understanding the components of a strong foundation.
CHART 1: TECHNICAL BENEFITS ACHIEVED FROM CLOUD DEPLOYMENTS
Improved data access: 39%
Infrastructure simplification: 39%
More-efficient application delivery: 36%
Improved data management: 35%
Improved application management: 33%
Source: IDG Research Services study "Cloud Computing: Viewpoints of IT and Finance Executives," January 2012.

WHY THE BUILDING BLOCKS ARE IMPORTANT

Notwithstanding their early successes, companies have a long way to go with cloud computing. Only about half of the executives surveyed saw themselves as effective at creating a cloud computing strategy, streamlining and automating processes, and enabling IT to allocate operating costs back to business units. Yet a higher percentage of the respondents (55 percent) anticipated that five years from now, they will have deployed a majority of their business applications in the cloud.

This will undoubtedly involve a mixture of cloud architectures, depending on a company's needs. Private clouds are more commonly found in enterprises seeking to maximize their investment in data centers. Cloud service providers are focusing on providing public cloud access (although internally, it looks more like access to a private cloud). Public cloud users tend to be smaller companies that don't have the IT staff to manage an infrastructure but want the flexibility the cloud can provide.

The ability to take advantage of a hybrid infrastructure can help offer flexibility. It can help companies absorb spikes in demand without having to deploy the internal computing capacity necessary to accommodate them. That can help save money as well as provide agility. It's a method commonly used by e-commerce providers during seasonal spikes, but according to AMD Product Marketing Manager Matt Kimball, companies such as Domino's Pizza also take advantage of Microsoft Azure cloud computing on days such as Super Bowl Sunday, when online ordering spikes measurably.
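The hybrid "bursting" pattern described above can be sketched as a simple capacity split: serve demand from fixed in-house capacity and send only the overflow to a public provider. This is a minimal illustration with hypothetical capacity and pricing figures, not a real provider API:

```python
# Hedged sketch: hybrid-cloud "bursting," splitting demand between a fixed
# private cloud and public-cloud overflow. All figures are hypothetical.

PRIVATE_CAPACITY_VMS = 100      # assumed in-house capacity
PUBLIC_RATE_PER_VM_HOUR = 0.12  # assumed public-cloud price, USD

def burst_plan(demand_vms: int) -> dict:
    """Serve demand privately up to capacity; overflow goes public."""
    private = min(demand_vms, PRIVATE_CAPACITY_VMS)
    public = max(0, demand_vms - PRIVATE_CAPACITY_VMS)
    return {
        "private_vms": private,
        "public_vms": public,
        "public_cost_per_hour": public * PUBLIC_RATE_PER_VM_HOUR,
    }

# A Super Bowl-style spike: demand briefly exceeds in-house capacity.
plan = burst_plan(140)
```

The point of the pattern is visible in the arithmetic: the company pays the public-cloud rate only for the spike, not for permanently owned capacity sized to the peak.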
IT, it seems, has already begun to reap the benefits of some of cloud computing's foundational strengths. The most obvious is virtualization. By virtualizing servers, companies can boost hardware utilization, speed the deployment of new applications and enable internal IT teams to redeploy human resources. IT is finally able to focus resources on strategic issues rather than tactical maintenance.

This can enable a certain amount of efficiency, but IT departments want more. IT is looking for automated capabilities such as self-provisioning and metering. With these capabilities, not only can IT demonstrate the value the cloud brings to the organization, but departments can also easily compare the cost of public cloud options with private cloud options. They can determine where hybrid clouds, combining the strengths of public and private clouds, can provide maximum payoff.
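The public-versus-private comparison that metering enables can be reduced to one number: the private cloud's fully loaded cost per metered VM-hour, set against the public provider's list price. The sketch below uses hypothetical rates purely to show the shape of the calculation:

```python
# Hedged sketch (all rates hypothetical): using metered VM-hours to compare
# a private cloud's fully loaded cost against a public-cloud price.

def private_cost_per_vm_hour(annual_capex: float, annual_opex: float,
                             vm_hours_delivered: float) -> float:
    """Amortize yearly capex and opex over the VM-hours actually metered."""
    return (annual_capex + annual_opex) / vm_hours_delivered

private_rate = private_cost_per_vm_hour(
    annual_capex=120_000,        # assumed amortized hardware spend per year
    annual_opex=80_000,          # assumed power, cooling and admin costs
    vm_hours_delivered=2_000_000,
)
public_rate = 0.12               # assumed public-cloud list price per VM-hour

cheaper = "private" if private_rate < public_rate else "public"
```

Without metering, the denominator (VM-hours actually delivered) is unknown, which is why self-provisioning and metering are prerequisites for this kind of chargeback comparison.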
Identifying where applications can run most cost-efficiently can help reduce maintenance costs and increase strategic opportunities. The goal, then, becomes to create an optimized cloud infrastructure that enables IT to proactively respond to and serve strategic business needs.

How can companies migrate from where they are today to where they want to be in five years? The answer: by understanding how to lay down the building blocks of the cloud to create a strong foundation for future flexibility and agility. This applies not only to companies providing data to employees (and potentially consumers and partners) but also to service providers that offer hosting capabilities to customers.

KEY BUILDING BLOCKS OF AN OPTIMIZED CLOUD INFRASTRUCTURE

An optimized cloud infrastructure combines attributes of three key areas for a company: the business, its applications and the hardware the applications run on. The IT department should offer facets of each of these three attributes to the company's employees, and service providers should offer them to their clients. The fundamentals, however, remain the same:

The Business. A business really needs only four things from the cloud. Performance encompasses responsiveness: the ability to access data quickly, no matter what device employees use or how it's connected. Scalability relates to performance, ensuring that no matter how many employees access the system at any given time, responsiveness remains as close to consistent as possible. Efficiency relates to the productivity a reliable computing system provides, eventually extending to agility: the ability to adjust focus for competitive advantage. Finally, the cloud should offer cost reduction as well as increased efficiency.

One of the IDG Research survey participants was Michael Pate, vice president of IT at Complete Production, an energy services company based in Houston.
He confirms his company's ability to save money simply through virtualization. "When we last did a server refresh, two years ago, we picked servers whose processors were purpose-built for virtualization. We handled all our growth last year through virtualized machines," he says. "It made management very happy that I didn't spend a dime of capex for servers last year."

Applications. Applications are naturally a key building block for cloud computing, especially when it comes to access, collaboration and insight: the top three factors cited in response to a query in the IDG Research survey about the technical impact of cloud computing. According to the respondents, e-mail, CRM applications and desktop productivity applications (delivered through desktop virtualization) are the top three deployments either already in the cloud or planned for the cloud. Other deployments cited included performance reporting dashboards, business intelligence applications and planning or forecasting applications. Conducting these activities in the cloud helps companies compile information from multiple sources for an integrated view of multiple business segments.

The need to match applications with servers frequently comes up, but with virtualization that concern largely disappears: Although you still need an appropriate number of processors to run applications, the relationship between application and server has been abstracted. The database sits on an operating system, and the operating system sits on a hypervisor, which runs on the server. Ultimately, it's probably more useful to understand how the hypervisor runs on the server, and on the server's processor, than how the application itself runs on the server.
Another survey respondent, Hari Sury, CIO of Indus Corp., a U.S. federal government IT consulting firm based in Tysons Corner, Va., moved an internal SQL Server database onto virtualized servers. He cautions that you need to deploy sufficient memory for a database, as you would for any other memory-intensive application, but the results were extremely positive. There were some concerns about moving the database to a virtualized system, but performance improved dramatically, as did system availability.

Hardware. The hardware elements of an optimized cloud infrastructure are more varied than the applications. At the foundation is the server, incorporating disk, I/O and processing power. Then come the associated pieces, such as virtualization, which enables IT to run multiple virtual instances of an operating system on a single physical device, helping increase flexibility and reduce costs. Other aspects include security, the need to protect the data; replication, the ability to archive the data; and failover protection, the ability either to move workloads from one server to another in real time or to access archived data quickly. Key pieces of a cloud infrastructure also include the ability to easily configure new virtual servers and to let users provision them on a self-service basis. This requires factoring in policy-driven automation.

Carlos Reyes, CIO of the American School Foundation in Mexico City, who also participated in the survey, has since moved much of his organization's computing to a cloud-based architecture. He insists that if you're purchasing in-house servers, the most important thing is the hardware virtualization and the way the server works with the hypervisor. If hypervisor extensions aren't part of the hardware, the applications just won't run as well.
Here's how the three building blocks fit together: Once IT has determined its business strategy, it must decide on the applications that support that strategy and then on what hardware to deploy for optimal results.

With an understanding of an optimized infrastructure's building blocks, the next step is to drill down one more level. The agility that virtualization brings starts with servers or, more precisely, the processors within those servers. It's crucial for service providers as well as IT departments to think about how those processors accommodate virtualization.

Why is the server platform important when it comes to cloud computing? Whether for internal IT or a service provider, one of the key goals is to deploy compute services as inexpensively as possible. Many companies compare operational expenses with capital expenses when they're looking at cloud computing, but an important metric is total cost of ownership. Getting the most out of each server means getting the most out of the virtual machines running on that server; in other words, getting as close to near-native, on-premises performance as possible, even though you're taking advantage of multiple virtual machines in a private cloud or a service provider's hosted location.

HOW AMD'S PROCESSORS SERVE VIRTUALIZATION

A server's processors are essentially the building blocks beneath the building blocks. For virtual machines (VMs), it's important for them to provide three capabilities:

Compute density. The higher the number of cores available on a processor, the more VMs you can run and the more efficient the server becomes. The AMD Opteron 6200 Series processor-based platform enables customers to deploy 60 percent to 100 percent more VMs than a comparable Intel-based platform.[1]
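The total-cost-of-ownership argument above can be made concrete with a back-of-the-envelope calculation: dividing a server's purchase price plus lifetime operating cost across the VMs it hosts shows how compute density, not just the capex-versus-opex split, drives per-VM cost. All figures below are hypothetical, chosen only to illustrate the shape of the comparison:

```python
# Hedged sketch (hypothetical prices): total cost of ownership per VM, showing
# how VM density per server drives the real cost of delivering compute.

def tco_per_vm(server_price: float, annual_opex: float,
               years: int, vms_per_server: int) -> float:
    """Purchase price plus lifetime operating cost, spread across VMs."""
    return (server_price + annual_opex * years) / vms_per_server

denser = tco_per_vm(server_price=10_000, annual_opex=2_000,
                    years=3, vms_per_server=32)   # higher compute density
sparser = tco_per_vm(server_price=9_000, annual_opex=2_000,
                     years=3, vms_per_server=20)  # cheaper box, fewer VMs
```

In this sketch the pricier, denser server still wins on per-VM TCO, which is why counting VMs per server matters more than comparing sticker prices.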
Memory architecture. You need a capable connection between the cores and the memory to ensure that I/O doesn't create a bottleneck that impedes performance. AMD Opteron processors provide four memory channels per CPU, which can support both the compute density and the I/O performance.

Processor technology. Fast processors provide fast performance, a fundamental capability, but to help boost virtualization and cloud computing, AMD Opteron processors also include AMD-V technology, which can integrate smoothly with any vendor's hypervisor to create a highly optimized virtualization environment that can help minimize virtualization overhead.

Beyond architecture, the other key facet to look for in a processor is flexibility. This can manifest itself in two ways: configuration choices and processor options.

Consider the challenge of configuring an e-mail application such as Microsoft Exchange on a virtualized server. Tests conducted by Microsoft have shown that, because of the way e-mail applications interact with memory, there's an upper limit beyond which more cores may actually become less efficient at supporting mailbox servers. The Microsoft tests show that with an eight-core processor server, committing one core to each VM works fine, but that when you configure an application for use with more than 12 cores, performance starts to diminish. One of the benefits of virtualization is that you can take a 64-core server and partition it into eight eight-core allocations, each supporting its own VMs.

In terms of options, AMD has designed its processors specifically to offer choices to enterprises and service providers with cloud computing deployments for organizations of various sizes.
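The 64-core configuration idea above amounts to grouping core IDs into fixed-size allocations so that no single workload spans more cores than it can use efficiently. A minimal sketch of that partitioning, with the helper name and core counts chosen purely for illustration:

```python
# Illustrative sketch: partitioning a many-core server into fixed-size
# allocations, e.g. a 64-core server split into eight eight-core groups.

def partition_cores(total_cores: int, cores_per_vm: int) -> list:
    """Group core IDs into fixed-size per-VM allocations."""
    vm_count = total_cores // cores_per_vm
    return [list(range(i * cores_per_vm, (i + 1) * cores_per_vm))
            for i in range(vm_count)]

allocations = partition_cores(64, 8)  # eight allocations, eight cores each
```

A real hypervisor handles this through its own CPU-assignment settings; the sketch only shows the arithmetic behind the configuration choice.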
Choices range from single-processor, price-optimized servers on the AMD Opteron 3000 Series platform, targeting dedicated hosting scenarios and small businesses; to one- or two-processor servers on the AMD Opteron 4000 Series platform, providing energy efficiency and cost optimization for smaller cloud deployments and small businesses; to the AMD Opteron 6000 Series platform, with two- and four-processor options for large private cloud data centers and public cloud service providers.

The connection between the virtualization capabilities of a processor that fits in your hand and the vastness of the cloud may not seem obvious at first. The fact remains, however, that by building in those key capabilities for hypervisors, the right processor becomes a linchpin of any cloud computing strategy.

[1] Based on comparison of a 16-core AMD Opteron 6200 Series processor with an eight-core Intel Xeon 7500 Series processor and a 10-core Intel Xeon E7 Series processor, using a one-VM-per-core loading rule. SVR-61

Paper sponsored by AMD. © 2012 Advanced Micro Devices, Inc. All rights reserved. AMD, the AMD Arrow Logo, AMD Opteron, AMD Virtualization, AMD-V, and combinations thereof are trademarks of Advanced Micro Devices, Inc. Microsoft is a registered trademark of Microsoft Corporation in the United States and/or other jurisdictions. All other copyrights or trademarks are the property of their respective owners and are being used under license. Other names used in this presentation are for identification purposes only and may be trademarks of their respective owners. PID #51581A