In-Memory: The Path to Making Better Decisions More Quickly
New in-memory systems, roughly analogous to the flash memory in small laptops, make it much easier to rapidly process greater volumes of data in real time. Here's how those systems work, who's behind them, and what they promise for faster and more informed decision making.
An electrical power utility wants better information about the long-term performance of its large circuit breakers and their historic repair costs. A taxi company is keen to use traffic-records data to improve its ability to direct and dispatch its cabs. A retail chain needs better and more immediate feedback on foot traffic and consumption patterns in its stores so it can fine-tune its staffing schedules.

What do these three companies have in common? All three have hit a wall when it comes to acting on data that can, when gathered and appropriately analyzed, convey a competitive edge. And all three, along with many other businesses across a range of industry sectors, are actively exploring new in-memory systems that promise to significantly reshape the ways in which their management teams make decisions.

These leading companies' explorations promise myriad economic outcomes: from better matching of staff with each day's demand, in the case of the retailer, to the electrical utility's long-term cost savings over the lifetimes of its circuit breakers. The quick results from some of these early investigations are also helping these organizations clarify where they can get the greatest value from making better decisions more quickly. By acquiring the confidence to know where fast decisions about complex scenarios can make a difference to costs or competitive success, their management teams can place their analytics bets where they matter most.

This viewpoint paper introduces the new in-memory systems, highlights their benefits for business users, describes the activities of some of the leading providers, and touches on the actions that readers should take now.
Why new approaches are needed now

Organizations struggle at the intersection where business challenges collide with the limits of technology. In times of such enormous business volatility, the need for rapid, confident decision making is all the more acute.

It is hardly a question of not having enough data; indeed, most organizations are unable to maximize the potential of all the data they already have in their own transaction-based databases. In addition, few have mastered what it takes to extract value from the data outside their own four walls, in their customers', suppliers' and partners' databases. And even fewer know what it takes to gather and capture meaningful insights from the abundance of e-mails, video webcasts, blogs, and other forms of unstructured data.

Specifically, business leaders today are looking for faster queries against bigger databases. Their organizations crave real-time data; immediate and easy access; and self-service, user-centered systems for delivering insights. That's why there is so much emphasis on investments in analytics capabilities, competencies and tools. But there is widespread frustration with the limitations of current analytics systems.

Several business-intelligence barriers get in the way of effective, informed decision making. To begin with, most company data is still distributed throughout a wide range of applications and stored in several disjointed silos. Traditional databases rely on half-century-old disk-drive technologies with built-in delays. Creating a unified view of the available data is cumbersome and time-consuming; with traditional divided OLTP/OLAP systems, it can take a week to write a query and receive the answer. Additionally, analytical reports typically do not run directly on operational data, but on aggregated data from a data warehouse. Operational data is transferred into this warehouse in batch jobs, which makes it all the more challenging to run flexible, ad hoc reports on up-to-date data.
Presentations are made with high-level summary data created on spreadsheets, which do not allow users to drill down into the accurate, detailed information beneath. And traditional databases are still geared to structured data, which is only part of the sum of all the data that is useful today.
The arrival of in-memory systems

New technology developments are materializing just in time. Rapid increases in silicon memory capacity and in the number of processors per chip are producing a step change in the economics of data storage. Laptops that lack on-board disk drives are increasingly common and increasingly attractive; Apple's MacBook Air is one of the better-known examples. Now so-called in-memory technology is moving into the corporate data center. Google searches owe at least part of their speed to the diskless memory used in the company's giant storage farms. It has become possible to store whole companies' data sets entirely in main memory, which offers performance orders of magnitude faster than traditional disk-based systems. By 2012, according to research firm Gartner, 70 percent of all Global 1000 organizations will load detailed data into memory as the primary method of optimizing the performance of their business-intelligence (BI) applications.

The use of in-memory technology marks an inflection point for enterprise applications. With in-memory computing and insert-only databases using row- and column-oriented storage, transactional and analytical
processing can be unified. In-memory data warehousing finally offers the promise of real-time computing; business leaders can now ask ad hoc questions of the production transaction database and get the answers back in seconds.

Over the past 18 months, most of the leading storage-technology vendors have declared their involvement with in-memory systems. Three of the largest players have moved aggressively. Hewlett-Packard recently purchased Vertica Systems, an analytic database management software company; last year, IBM bought data warehousing company Netezza, while Oracle continued to push its Exadata appliance. And SAP has developed its own in-memory solutions in-house, launching its High Performance Analytic Appliance (HANA) earlier this year.

Putting in-memory to work

In-memory data warehousing has application in every industry sector. But it is being explored with particular enthusiasm in the utilities, telecommunications, retail and financial services industries, all of which have very high transaction volumes and a need for very fast time to insight.

In the electrical power business, for example, smart-meter technology enables remote monitoring of usage. If the utility could receive and analyze data from an entire neighborhood's smart meters every 15 or 20 minutes, it could develop a much more valuable picture of power consumption. In-memory systems can rapidly process those volumes of data; as a result, the electricity provider could make better and faster decisions about buying or selling power. And it could offer consumers applications that trigger home appliances based on the current price of electricity.
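To make the column-oriented, insert-only idea concrete, here is a minimal sketch of a toy in-memory column store aggregating hypothetical smart-meter readings. It is an illustration only, not SAP HANA's actual implementation; the field names, class, and data are invented for this example.

```python
# Toy column store: each field lives in its own contiguous in-memory list,
# so an ad hoc aggregate scans only the columns it needs (no disk pages).
from collections import defaultdict

class ColumnStore:
    """Stores each field as its own in-memory column (a Python list)."""
    def __init__(self, fields):
        self.columns = {f: [] for f in fields}

    def insert(self, row):
        # Insert-only: new readings are appended, never updated in place.
        for field, value in row.items():
            self.columns[field].append(value)

    def sum_by(self, key_field, value_field):
        # On-the-fly aggregation at query time; touches just two columns.
        totals = defaultdict(float)
        for key, value in zip(self.columns[key_field],
                              self.columns[value_field]):
            totals[key] += value
        return dict(totals)

# Hypothetical readings arriving from a neighborhood every 15 minutes.
meters = ColumnStore(["meter_id", "neighborhood", "kwh"])
meters.insert({"meter_id": 1, "neighborhood": "north", "kwh": 0.5})
meters.insert({"meter_id": 2, "neighborhood": "north", "kwh": 0.25})
meters.insert({"meter_id": 3, "neighborhood": "south", "kwh": 0.75})

print(meters.sum_by("neighborhood", "kwh"))
# {'north': 0.75, 'south': 0.75}
```

Because each column is stored contiguously, a query that sums one measure grouped by one attribute never reads the unrelated columns, which is part of why columnar in-memory systems answer ad hoc aggregates so quickly.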
The electrical utility mentioned earlier wants to gather and interpret more information about its assets in order to make repair-or-replace decisions more quickly. The objective is to build and run complex event-processing systems that generate asset alerts, for example, when the oil in a transformer is too hot or a circuit breaker fails early. Among other insights, the utility is keen to understand what alerts it is receiving on other, similar assets; to get a sense of whether the outages are early indicators of more serious performance issues; and to obtain a clearer picture of how much has been spent to date on maintenance, asset by asset. The beauty of in-memory is that it does much more than help analyze one-time events. It enables business users to review whole series of assets, and to do so over time. It can then provide clear recommendations for action and schedule the needed work. (See sidebar: Where in-memory pays off.)

For their part, consumer packaged goods companies can use in-memory systems to analyze their retailers' point-of-sale data to predict demand and trigger their processes for replenishing store shelves within a 48-hour turnaround. This can help to eliminate out-of-stock scenarios during promotions. And the taxi company noted earlier relies on a technology provider that uses SAP HANA to search through 360 million traffic records in a little over one second. The rapid interpretation of such vast volumes of data allows the taxi company to direct and dispatch cabs more efficiently and in real time. In cases where in-memory systems' on-the-fly querying capabilities are augmented by real-time processing, the benefits are even more pronounced, and it becomes easier for users to understand the value of being able to make decisions more quickly.
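The asset-alert idea can be sketched as a simple rule over a stream of events. This is a hypothetical illustration of the kind of rule such a system might evaluate, not a real complex-event-processing engine; the thresholds, event fields, and asset IDs are invented.

```python
# Hypothetical event-processing rules for utility asset alerts.
# A real CEP system would correlate events over time windows; this
# sketch only shows the per-event rule check described in the text.
OIL_TEMP_LIMIT_C = 95.0      # invented threshold
EARLY_FAILURE_YEARS = 5      # invented threshold

def check_event(event):
    """Return an alert string if the event breaches a rule, else None."""
    if event["type"] == "oil_temp" and event["value"] > OIL_TEMP_LIMIT_C:
        return f"ALERT {event['asset_id']}: transformer oil at {event['value']} C"
    if event["type"] == "breaker_trip" and event["age_years"] < EARLY_FAILURE_YEARS:
        return f"ALERT {event['asset_id']}: circuit breaker failed early"
    return None

# A small invented event stream.
stream = [
    {"type": "oil_temp", "asset_id": "T-104", "value": 98.2},
    {"type": "oil_temp", "asset_id": "T-105", "value": 71.0},
    {"type": "breaker_trip", "asset_id": "B-17", "age_years": 3},
]
alerts = [a for e in stream if (a := check_event(e))]
for alert in alerts:
    print(alert)
```

With the events held in memory, the same rules can be re-run across whole series of similar assets and across time, which is the "more than one-time events" capability the text describes.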
Where in-memory pays off

In-memory data warehousing provides a number of benefits to customers, including:

Faster insight: Previously, the sheer volume of information and the limits of computational power allowed only for pre-determined analysis. Data structures had to be developed to analyze the data and then recalculated whenever the data was updated, which took hours and diminished the freshness of the information. With in-memory systems, detailed data is loaded into memory and calculations are performed on the fly at query time.

Real-time visibility: In traditional BI systems, data is pushed from the sources to the data warehouse in batches. In-memory systems provide real-time replication from ERP applications, delivering real-time business insight by analyzing operations as they happen.

Improved development time: Loading detailed data into memory for reporting and analysis reduces the need to build aggregate data structures, a key part of most BI deployments in which IT organizations must design and build a data layer optimized for query performance. In-memory systems load columns of data into memory and use a virtual layer (views) to access the data. This approach is often as fast as or faster than aggregate-based architectures: it not only retrieves data faster but also performs calculations on the query results much faster than disk-based architectures.

Empowerment: Building aggregated and pre-calculated data structures diminishes the promise of self-service and limits what a user can explore. In-memory provides greater analytic flexibility because it reduces business users' reliance on IT.

Cost benefits: In-memory databases can dramatically reduce hardware and maintenance costs through a flexible, cost-effective, real-time approach to managing large data volumes; at high volumes, keeping detailed data in memory can cost less than building and maintaining disk-based aggregate structures.
SAP's move

SAP has been especially assertive with its in-memory move. The technology company recently made its HANA appliance software available to all customers globally, following its pre-launch to selected customers the previous November. HANA is already making waves, giving the German software goliath its fastest-growing sales pipeline for new products.

In brief, HANA is a flexible, multipurpose, data-source-agnostic in-memory appliance that combines SAP software components optimized on hardware provided and delivered by SAP's leading hardware partners. Data can be replicated from SAP systems in real time and captured in memory as business happens, where flexible views expose analytic information rapidly. External data can be added to analytic models to expand analysis across the entire organization.

The challenge for most users is that, for all of its stated benefits, they are not certain how they can put it to work on their unique tasks. The typical query from business users: "I want to see how it works with my project." And while the concept of in-memory is easily grasped by technology professionals, they struggle to answer business users' questions about how best to use this new technology to meet business needs.

Innovating on users' terms

SAP has partnered with Accenture to help users identify their most appropriate applications for HANA, in effect enabling them to innovate on their own terms on real-world business issues. The two companies have set up a network of innovation centers designed and equipped to address a wide range of challenges that organizations face as they seek to glean deeper insights from data, improve decision-making processes, and understand the power of in-memory technology and mobility for delivering information anytime, anywhere. The innovation centers are effective test beds for users' ideas: visitors use their own data to rapidly develop proof-of-concept studies.
The centers house Accenture and SAP specialists who work side by side and bring together assets from both organizations, including a fully integrated SAP technology platform that drives capabilities in business intelligence, in-memory analytics, enterprise mobility, enterprise content management, and enterprise information management.

Recently, the innovation centers helped a leading energy-services provider quickly put its spend data on mobile platforms. The center teams used HANA Spend Analytics to extract actual spend data, load it into the HANA system, develop a supporting data model, and build a framework of explorer views to quickly unlock the data and make it available on iPads. The entire project was completed in four weeks. In another instance, a mining conglomerate is using one of the innovation centers to study the practicality of incorporating unstructured data into its decision-making processes.

Time spent at one of the centers is an immersive experience. Visitors are exposed to high-performance analytics during strategic brainstorming sessions, technology demonstrations, and day-in-the-life scenarios showing analytics solutions at work in their organizations. Presentations on key economic, marketplace and technology trends, tailored to the visitors' situations, help them define their technology roadmaps for analytics in their industry sector and in their organizations. The innovation centers provide paths for very quickly determining time to value and for identifying the areas that will most resonate with users.

There is no argument that in-memory data warehousing represents the next wave of innovation in business intelligence. The question is how promptly companies will act to take advantage of what it offers. The surge of interest in SAP's HANA is evidence enough that there is real hunger for solutions to increasingly complex business-intelligence challenges.
The technology groups at leading companies already have a good grasp of what in-memory can do and of what its weaknesses are. But if they are to persuade their business colleagues of its merits, they have to find low-cost, low-risk ways to test their own company's ideas using in-memory tools and techniques. Those efforts may already be overdue.
Sources

1. In-Memory Data Management: An Inflection Point for Enterprise Applications, Hasso Plattner and Alexander Zeier, Springer-Verlag Berlin Heidelberg, 2011.
2. Benefits of in-memory computing, Financial Times, June 1, 2011.
3. Accenture and SAP Announce Strategic Relationship to Develop and Deploy New Mobility Solutions, Accenture press release, May 17, 2011.
4. Accenture Recognized as a Leader in IDC MarketScape, Cited for SAP Implementation Skills, Accenture press release, June 23, 2010.
5. Invent new possibilities with the SAP HANA Appliance, SAP website.
6. Understand the Power of SAP In-Memory Computing: Virtual Event Webcast, SAP website.
7. Accenture Technology Vision 2011: The Technology Waves That Are Reshaping the Business Landscape, Accenture, 2011.
8. Exploring New Opportunities to Unlock the Value of Data, Accenture, 2008.
9. SAP HANA Now Generally Available to Customers Worldwide, SAP press release, June 21, 2011.
10. BI Applications Benefit From In-Memory Technology Improvements, Gartner Research Note, Kurt Schlegel, Mark A. Beyer, Andreas Bitterer, Bill Hostmann, October 2, 2006.
About the authors

Hettie Tabor is a seasoned Accenture senior executive based in Accenture's Dallas office. She has more than 23 years of IT experience, including 17 years of practical SAP implementation experience. Ms. Tabor currently leads Accenture's SAP Business Analytics Global Group and has a wealth of technical and project management knowledge about SAP Business Intelligence, HANA, BusinessObjects, Business Planning and Consolidation, and Data Management.

Nicola Morini Bianzino leads the Global Accenture Analytics Innovation Center Network. Joining Accenture in 1998, Mr. Bianzino has worked in the analytics and ERP space since the beginning of his career at Accenture, focusing on major global implementations. He is based in San Jose, California.

About Accenture

Accenture is a global management consulting, technology services and outsourcing company, with more than 223,000 people serving clients in more than 120 countries. Combining unparalleled experience, comprehensive capabilities across all industries and business functions, and extensive research on the world's most successful companies, Accenture collaborates with clients to help them become high-performance businesses and governments. The company generated net revenues of US$21.6 billion for its most recent fiscal year, ended August 31. Its home page is www.accenture.com.

For further information about in-memory systems, please contact:

Hettie Tabor, SAP Business Analytics Global Lead
Nicola Morini Bianzino, Accenture Analytics Innovation Center Global Lead

Copyright 2011 Accenture. All rights reserved. Accenture, its logo, and High Performance Delivered are trademarks of Accenture.