Fall 2013 Issue 31 THE INTERNATIONAL JOURNAL OF APPLIED FORECASTING 6 Supply Chain Forecasting & Planning: Move On from Microsoft Excel? 14 Forecasting with In-Memory Technology 21 The Future of Financial Market Forecasting 35 Two Important New Books: Demand and Supply Integration and Keeping Up with the Quants 41 Using Process Behaviour Charts to Improve Forecasting 49 New Directions in Managing the Forecasting Process
ISF 2014: Economic Forecasting - Past, Present and Future
Rotterdam, the Netherlands, June 29 - July 2, 2014
Keynote speakers: Jan Peter Balkenende (former Prime Minister of the Netherlands), John Geweke (Distinguished Research Professor, University of Technology Sydney), Angelien Kemna (CIO, APG Group), Jos Nijhuis (CEO, Schiphol Group)
Local organizers: Dick van Dijk, Philip Hans Franses
Deadlines 2014: Jan 31, invited session proposals; Mar 16, abstract submission; Mar 31, abstract acceptance; May 18, early registration
Erasmus School of Economics celebrates its first centennial. For more information: www.forecasters.org
contents

Knowledge of truth is always more than theoretical and intellectual. It is the product of activity as well as its cause. Scholarly reflection therefore must grow out of real problems, and not be the mere invention of professional scholars.
John Dewey, University of Vermont

3 Notes from the Editor

Special Feature: Forecasting Support Systems
6 Supply Chain Forecasting & Planning: Move On from Microsoft Excel? Sujit Singh
14 Forecasting with In-Memory Technology Tim Januschowski, Stephan Kolassa, Martin Lorenz, Christian Schwarz

Financial Forecasting
21 The Future of Financial Market Forecasting: Five Trends Worth Watching Jeffrey Mishlove

Forecaster in the Field
28 Interview with Jeffrey Mishlove

Book Reviews
35 Demand and Supply Integration: The Key to World-Class Demand Forecasting by Mark A. Moon, reviewed by John Mello
38 Keeping Up with the Quants: Your Guide to Understanding + Using Analytics by Thomas H. Davenport and Jinho Kim, reviewed by John Pope

Forecasting Principles and Practices
41 Using Process Behaviour Charts to Improve Forecasting and Decision Making Martin Joseph & Alec Finney
48 Upcoming in Foresight
49 New Directions in Managing the Forecasting Process Chris Gray

Article Coding: General (Gen), Managers (Mgr), Modelers (Mod), Planners (Pln)
Practitioner Advisory Board
Chairman: Joe Smith, Carrier Enterprise
Carolyn Allmon, Business Forecasting Services; Nariman Behravesh, IHS Global Insight; Ellen Bonnell, Trend Savants; Jason Boorman, Daiichi Sankyo, Inc.; Charlie Chase, SAS Institute; Simon Clarke, Coca-Cola Enterprises; McKay Curtis, Walt Disney Parks and Resorts; Robert Dhuyvetter, J. R. Simplot Company; Alec Finney, Rivershill Consultancy; Jeffrey Hunt, SHEA Business Solutions; Jamilya Kasymova, Marriott International; Joseph McConnell, McConnell Chase Software Works; John Pope, Investment Economics; Christian Hans Schäfer, Boehringer Ingelheim GmbH; Udo Sglavo, SAS International; Jerry Shan, Hewlett-Packard; Sujit Singh, Arkieva; Eric Stellwagen, Business Forecast Systems; Bill Tonetti, Demand Works; John Unger, Boise Paper; Lauge Valentin, LEGO; Patrick Wader, Bosch

Editorial Board
J. Scott Armstrong, Anirvan Banerji, Peter Catt, Elaine Deschamps, Robert Fildes, Ram Ganeshan, Adam Gordon, Kesten Green, Rick Hesse, Jim Hoover, Randy Jones, Ulrich Küsters, Mark Little, Patrick McSharry, Jeffrey Mishlove, Mark Moon, Steve Morlidge, Marcus O'Connor, David Orrell, Peter Sephton, Tom Willemain

Foresight, an official publication of the International Institute of Forecasters, seeks to advance the practice of forecasting. To this end, it will publish high-quality, peer-reviewed articles, and ensure that these are written in a concise, accessible style for forecasting analysts, managers, and students. Topics include: Design and Management of Forecasting Processes; Forecast Model Building: The Practical Issues; Forecasting Methods Tutorials; Forecasting Principles and Practices; S&OP and Collaborative Forecasting; Forecasting Books, Software and Other Technology; The World of Forecasting: Applications in Political, Climate and Media Forecasting; Case Studies.

Contributors of articles will include: analysts and managers, examining the processes of forecasting within their organizations; scholars, writing on the practical implications of their research; and consultants and vendors, reporting on forecasting challenges and potential solutions.

Editor: Len Tashman, lentashman@forecasters.org
Associate Editors: Stephan Kolassa, stephan.kolassa@sap.com; Roy Pearson, Roy.Pearson@mason.wm.edu

Column Editors:
Financial Forecasting: Roy Batchelor, r.a.batchelor@city.ac.uk
Forecasting Intelligence: Roy Pearson, roy.pearson@mason.wm.edu
Forecasting Practice: Mike Gilliland, Mike.Gilliland@sas.com
Forecasting Support Systems: Stavros Asimakopoulos, stavros@experiencedynamics.com
Hot New Research: Paul Goodwin, mnspg@management.bath.ac.uk
Long-Range Forecasting: Ira Sohn, ims20b@cs.com
Prediction Markets: Andreas Graefe, graefe.andreas@gmail.com
S&OP: John Mello, jmello@astate.edu
Supply Chain Forecasting: John Boylan, John.Boylan@bucks.ac.uk

Foresight Staff:
Design and Production: Liza Woodruff, liza@lizawoodruff.com
Marketing and Ad Sales: Kim Leonard, kimleonard@forecasters.org; Stacey Hilliard, staceyhilliard@forecasters.org
Manuscript and Copy Editing: Ralph Culver, letter_perfect_rc@yahoo.com; Mary Ellen Bridge, me.foresight@comcast.net
Subscriptions: Pam Stroud, IIF Business Director, forecasters@forecasters.org

All invited and submitted papers will be subject to a blind editorial review. Accepted papers will be edited for clarity and style. Foresight welcomes advertising. Journal content, however, is the responsibility of, and solely at the discretion of, the editors. The journal will adhere to the highest standards of objectivity.
Where an article describes the use of commercially available software or a licensed procedure, we will require the author to disclose any interest in the product, financial or otherwise. Moreover, we will discourage articles whose principal purpose is to promote a commercial product or service. Foresight is published by the International Institute of Forecasters, Business Office: 53 Tesla Avenue, Medford, MA 02155 USA. © 2013 International Institute of Forecasters (ISSN 1555-9068)
Notes from the Editor: The Fall 2013 Issue

Forecasting Support Systems (FSS), essentially decision support systems for forecasters, are being given increasing scrutiny in forecasting circles, including our recent half-dozen articles in Foresight. Additionally this year, there has been a special issue of the International Journal of Forecasting focused on the topic. Keith Ord and Robert Fildes offer this definition of FSS in their new textbook, Principles of Business Forecasting: "A set of (typically computer-based) procedures that facilitate interactive forecasting in an organizational context. An FSS enables users to combine relevant information, analytical models, and judgments, as well as visualizations, to produce forecasts and monitor their accuracy" (p. 398).

This 31st issue of Foresight contributes two new articles to the FSS literature. Sujit Singh, COO of software-services company Arkieva, leads with a comprehensive evaluation of the adequacy of Excel in forecasting support, Supply Chain Forecasting and Planning: Move On from Microsoft Excel? Sujit explains that Excel's shortcomings become more and more glaring as organizations grow in size and complexity. Then Tim Januschowski, Stephan Kolassa, Martin Lorenz, and Christian Schwarz present a systems development of intriguing potential in Forecasting with In-Memory Technology. The authors believe that this synthesis of analytical and transactional processing can enhance the use and acceptance of forecasting analytics within the organization.

In our Financial Forecasting piece, Foresight welcomes Jeffrey Mishlove, author of The Alpha Interface series of books and blogs on financial markets, who looks ahead to The Future of Financial Market Forecasting: Five Trends Worth Watching. Don't miss the interview with Jeff in this issue as part of our occasional Forecaster in the Field feature.

Our always popular Book Reviews look at a pair of interesting titles. The first is Professor Mark Moon's Demand and Supply Integration: The Key to World-Class Demand Forecasting. Reviewer John Mello examines Moon's contention that DSI represents the way Sales and Operations Planning should be practiced within an organization. The second review is on Keeping Up with the Quants: Your Guide to Understanding and Using Analytics by Tom Davenport and Jinho Kim, and is a follow-up to Davenport's Competing on Analytics: The New Science of Winning (coauthored with Jeanne Harris). Jack Pope's review judges the book to be most appropriate for those in non-quantitative roles that depend on others for the heavy analysis work.
Our section on Forecasting Principles and Practices makes a convincing argument for greater application of statistical process control (SPC) in forecasting evaluation and planning. Using Process Behaviour Charts to Improve Forecasting and Decision Making is written by Martin Joseph and Alec Finney, longtime practitioners of SPC. They feel that process behaviour charts (PBCs) can bring significant new perspectives to S&OP meetings by distinguishing situations requiring action from those where changes are not really called for.

Then, Chris Gray, noted author of books on S&OP and Operations Management, provides his perspective on the evolution of the forecasting function in organizations. In New Directions in Managing the Forecasting Process, Chris observes that businesses have shifted their focus away from the purely mathematical and statistical "hammers and nails" of forecasting and toward better management of the forecasting process. His article offers a set of seven requirements for effective forecast-process management.

We Welcome a New Editor and Two New Board Members

A hearty welcome first to John Mello as Foresight's new S&OP Editor. John has been one of Foresight's most prolific contributors, with 10 articles and reviews, including his book review in this issue. Perhaps his most influential article was "The Impact of Sales Forecast Game Playing on Supply Chains" in our Spring 2009 issue. Special thanks go to Bob Stahl, stepping down as S&OP Editor after four years and numerous Foresight columns and articles on this important subject.

Joining Foresight's Editorial Board is Jeffrey Mishlove, whose article on the future of financial forecasting appears in this issue, as does his Forecaster in the Field interview. And Sujit Singh, author of this issue's special feature article evaluating Excel as a forecasting support system, joins our Practitioner Advisory Board. Sujit is Chief Operating Officer of Arkieva, where he is involved in software consulting, product management and, more recently, business development.
Special Feature: Forecasting Support Systems

Forecasting support systems (FSS), a specialized type of decision support system, are designed to support the forecaster's tasks by offering (a) a database of historical activity, events, and managerial actions; (b) a menu of statistical forecasting models; (c) a facility to judgmentally adjust statistical forecasts; and (d) a storage space for all forecasts for subsequent monitoring and evaluation. An informative definition of an effective FSS comes from the introduction to a recent special feature in Foresight's sister publication, The International Journal of Forecasting:

"By removing the burden of dealing with the programmable aspects of forecasting, well-designed FSSs should free users to deal with the remaining aspects. They can supply relevant information in an amenable form, facilitating the analysis and modeling of data, providing advice, warnings, and feedback, and allowing users to assess the possible consequences of different forecasting strategies." (Fildes and Goodwin, 2013)

Many dedicated commercial systems are available to support the data-management and statistical forecasting aspects of the forecasting process. Yet we repeatedly hear, and some surveys have reported, that Microsoft Excel has been and remains the most prevalent tool for forecasting support. How effective Excel can be in this role is examined by Sujit Singh in the first of two articles in this special feature section: Supply-Chain Forecasting & Planning: Move On from Microsoft Excel? Sujit explains that, for small companies with simple processes, Excel can be a winning choice to support forecasting and planning, but that its shortcomings become more and more serious as company size and complexity increase.

The second article, Forecasting with In-Memory Technology by Tim Januschowski, Stephan Kolassa, Martin Lorenz, and Christian Schwarz, proposes that many aspects of the forecasting function can be improved by integrating the traditionally separate systems of analytical processing (OLAP) and transactional processing (OLTP) through in-memory technology. Written by a team of SAP researchers, the article suggests that the integration of a forecasting support system within in-memory technology can enhance cross-functionality, as integrated forecasting applications are implemented by departments throughout a company. This integration, in turn, may lead to a greater penetration of forecasting analytics in the organization.

An issue that is not examined in these articles, however, concerns how well the FSS enables the user to integrate judgment into forecast generation. Fildes and Goodwin note that while most of today's systems enable judgmental overrides, little support is offered for decisions on when to intervene and how large such interventions should be. In addition, there is a need to effectively support users' workflow as well as specific forecasting tasks via intuitive user-interface (UI) design. In this way, you ensure you are not building a forecasting support system based merely on your own preferences or commonly used conventions, but on what is actually (surprise!) a good experience for your users. We hope to give these matters more attention in future issues of Foresight.
Supply Chain Forecasting & Planning: Move On from Microsoft Excel?
Sujit Singh

PREVIEW Surveys of business use of forecasting support tools reveal that Microsoft's Excel spreadsheet software continues to reign supreme in smaller organizations. As Sujit Singh explains, there are many virtues to this solution; however, the balance of pros and cons begins to tip as organizational size and complexity increase. Sujit provides a comprehensive examination of the efficacy of relying on Excel to support the forecasting and planning functions and then describes the gains and costs of moving up to a best-of-breed planning application.

MICROSOFT EXCEL
Microsoft Office's Excel spreadsheet software is the predominant application of choice for businesses starting supply chain planning and forecasting applications. Whether by design or necessity, it retains a large part of its market share well beyond the start-up stage; in fact, some estimates attribute 50-70% of the total supply chain planning market to Excel. Its strengths include low cost (it is perceived as free since it is part of the Microsoft Office suite), ease of use (it is a known commodity), versatility, universal availability, and good basic functionality. Excel's most important advantage is that it enables experimentation and tinkering, valuable tools for those managing the planning process. In planning, the ability to do rapid-fire experimentation on various possibilities is very important. It should be noted, however, that Excel poses serious limitations when compared to more robust and full-featured software applications designed specifically for advanced supply chain planning. Here, we explore the pros and cons of Excel usage and find some guidelines for deciding when Excel is useful and when a company needs to graduate to an advanced planning solution.

EXCEL: IN THE BEGINNING, THE OBVIOUS CHOICE
Imagine a small manufacturing company, selling a few products to a few customers, that has acquired order-taking/tracking software to keep tabs on transactions and due dates. As complexity increases and the need for planning accelerates, someone creates a simple spreadsheet with products in one column and time periods shown across the columns. In such a spreadsheet, a planner copies and pastes or types in the actual history in the historical months and then begins to type projections in the future months. Some high-level assumptions about capacities (the company can make and sell roughly 7,000 units per month) allow planners to quickly see whether they are above or below the acceptable amount. Here is where Excel's rich feature set, for lack of a better word, excels: the software allows neophyte users to customize it to their liking. A colored cell here, a subtotal there, and voila, they have a planning spreadsheet in operation. Everyone (which at this stage might only be a couple of people) opens the same spreadsheet on a shared drive, reviews the same data, and makes decisions based on that same data. Very soon, they realize that it is good to get input from multiple people, and an email/phone/chat-based collaborative process begins to take shape. Every month, all that needs to be done is to copy/paste the new data and keep planning the future months. Toward the end of the year, it can get a bit more exciting when
deciding to add the next year's months, but the process remains manageable overall. Excel can also assist in developing the company's planning/forecasting in the early stages, as well as create internal planning disciplines.

Key Points
- Microsoft's Excel is an excellent option when it comes to selecting a data-based personal productivity tool. For small companies with simple processes, it can be a winning choice to support forecasting and planning.
- Excel's shortcomings, though, become apparent as company size and complexity increase. Here, we discuss Excel's limitations in coping with numerous areas of complexity, including database size, cost of failure, multiple users, data security, business conditions, planning, scenarios, and connecting all bases.
- Best-of-breed forecasting and planning systems, which use sturdier database management, can overcome the shortcomings of Excel as well as provide key functionality to support essential planning needs. Their return on investment can be quick and very significant. Quite often, the first step in implementation pays for the subsequent steps.

Because of the easy availability of Excel and its relative simplicity, such a spreadsheet can appear almost overnight. It could even take on a name of its own, such as "Jane's spreadsheet" (Jane being the creator, of course). In such circumstances, every aspect of planning is within Jane's grasp; for example, it's easy to make a copy of the spreadsheet, change some of the numbers, and compare the results to do a what-if scenario. Time passes, and the company adds to its business (a new plant, new products, etc.); the planning worksheet gets expanded either by adding new rows of data or by adding new worksheets. Perhaps two different worksheets might be used to plan the East Coast and the West Coast operations. Excel is excellent when it comes to a data-based personal productivity tool; as long as the planning process remains personal, no other software comes close to what Excel can provide. In the early days of the company, this is usually done on personal initiative, very often by one of the founding members. As a result, a forecasting/planning spreadsheet can often be created by Jane in just a few evenings' work.

WELL-KNOWN SHORTCOMINGS
Despite these advantages, there are certain well-known shortcomings of Excel-based applications, which, while not the main thrust of this discussion, I have listed for the sake of completeness.
- The server crashes and the planning spreadsheet is lost. This concern can be mitigated by keeping backups. However, there can be a significant time loss if the saved copy is not the most current one, because of the need to paste the new information into the spreadsheet once again.
- Jane wins the lottery or finds another job. Good for Jane, but unless the company has thought ahead, it is unlikely that anyone else is trained and ready to take over managing the spreadsheet. While this could apply to any system, the notion that Excel skills are easily and readily available can lead to overconfidence in the ability to pick it up on short notice. To be fair, however, if this was a discipline issue at the company, then the Excel program would potentially provide for a faster recovery thanks to a flatter learning curve compared to that of a full-blown specialized solution.
- It is very hard in Excel to keep track of what one is doing. Why was this cell excluded as an outlier but not that one? Why was triple exponential smoothing used on this time series but double on that one?
What exactly was that analysis that was done three months ago, the one that management liked so much they are asking for it again?
- In a point related closely to the above, unless a very good versioning system has been set up, it is impossible to find out
when, how, why, and by whom a given cell was last changed.
- Problems can occur when integrating with other systems (upstream or downstream). Though Excel now has connectivity tools that allow connections to a variety of data sources, this is one of its least-known and least-used features, resulting in stale data. Sometimes linked data sources change and the planner has to adapt the Excel spreadsheet, as the data source may not even be aware of the link.
- Keeping the spreadsheet data up to date takes considerable time. This is not limited to copy/paste; a majority of the time it involves retyping the data from one system to another. This is probably one of the program's biggest flaws. Should your planners spend their time planning or updating data onto spreadsheets? Absent the above-mentioned integration, Excel tools force users to do more data management and less decision making and anticipating of business conditions.
- Since spreadsheets are easy to create, each department might decide to start its own. This can, and when it happens often will, result in different plans across different departments within the same company. This is especially problematic if the sales and operations departments operate from different plans.
- The spreadsheet may have errors, and unfortunately, it probably does. A study at the University of Hawaii (Panko, 2008) found that errors in spreadsheets are pandemic. Panko added, "In general, errors seem to occur in a few percent of all cells, meaning that for large spreadsheets, the issue is how many errors there are, not whether an error exists." Since the average spreadsheet contains thousands of information-bearing cells, a few percent may translate into dozens of errors. In many noncritical applications, these errors may be considered a reasonable tradeoff for the affordability and ease of use offered by Excel spreadsheet software. However, when the errors result in serious supply chain miscalculations, the costs can be devastating. Examples of this abound (Krugman, 2013; Wailgum, 2007; Wolf, 2012). For instance, a miscalculation in the quantity of a key part can produce a domino effect, causing a delay in assembling the final product. This then leads to missed production deadlines, lost orders, rush shipping charges, and damage to the company's reputation.
- It is difficult to grow Excel into other types of analysis. Let's say that Jane wants to view trends across many types of products and then find which ones are significant outliers for further review. This is not always possible in Excel, or is at least very cumbersome, because the business has to settle on a few key attribute-based views.
- It takes too much discipline to maintain a rolling planning horizon in an Excel spreadsheet. As a result, most planning spreadsheets have an accordion time horizon. Periods decrease to three or four towards the end of the year and then increase to 15 or 16 to include the next year. This is not good for planning and forecasting.

SIGNS THAT EXCEL IS INADEQUATE
When a company is in its early stages, Excel's limitations are likely to be outweighed by its convenience and affordability. Unfortunately, as the company grows, these limitations become more serious. The above-mentioned shortcomings of Excel applications alone are enough for some companies to look to the market for a best-of-breed software solution. Another company may be more tolerant toward Excel for a time, and then at last reach a stage where it becomes impossible to plan using Excel due to a variety of other complexities.
Here are a few:
Amount-of-Data Complexity
As business grows, the data grows; the business that grows in volume and revenue with little or no corresponding increase in data intensity is a possibility, but remains rare. Increased data intensity adds complexity to the planning and forecasting operation. For example, the numbers of products or customers have increased, or new attributes need to be summarized in future data (perhaps by a key ingredient or market characteristic), or users may want to switch from monthly to weekly time buckets to improve planning.

Cost of Failure Complexity
If the business is simple enough that a mistake made today can be corrected tomorrow, the solution is easy. However, if the mistake made today leads to being stuck with the wrong inventory for a long time, then a more complex planning paradigm is needed, one requiring detailed checks on all projected inventories. In some ways, this is an offshoot of the amount-of-data complexity.

Multiple Users Complexity
Over time, the company hires more sales reps to sell to different markets. These reps know the most about the forecast because they are closest to the customers. Depending on the company, one of two things might happen. First, the reps email Jane with the latest information, which she types into the spreadsheet; second, the sales team agrees to update the numbers themselves. Both these approaches are risky. In the first, errors creep in as typing mistakes. In the second, Jane has to worry about controlling access to different rows of data in the Excel spreadsheet; if she cannot program this logic, then she has to rely on the reps to type only into data rows assigned to them. Again, typing mistakes happen. It's even possible for a business that mails spreadsheets around for input to lose track of version control and submit the wrong spreadsheet to corporate headquarters for the annual budget. And then there is this question: who controls the need for new product/customer combinations required for new forecasts?

Data Security Complexity
This is also caused by multiple users. For any number of reasons, data access might need to be limited by, and to, certain individuals. For example, a sales representative should see only her own data, possibly forcing the demand planner to create multiple spreadsheets with subsets of data. This again increases the risk of errors.

Business Conditions Complexity
Two customers buy the same product under different names (and prices) for confidentiality and/or profitability reasons. The supply planner needs to see the consolidated data, but the demand planning team still wants to have the detailed customer-level view for forecasting. To enable this, Jane must now use advanced Excel features, which she may or may not know how to do.

Planning Process Complexity
Management might insist on updating the spreadsheet with all open orders so that planners have a view of what is already committed. On the one hand, Jane can copy/paste or type in the information, again a tedious and error-prone process. On the other hand, Jane can figure out how to do the programming to read the data from the source, but this is prone to programming errors as well as higher cost in terms of Jane's time.

Mathematical Complexity
Capacity is no longer a straightforward number because the product mix has a significant impact on the total throughput. Now Jane has to do matrix calculations to accurately project the capacity usage in the future.
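Outside the spreadsheet, this kind of mix-dependent capacity projection is a small matrix-vector calculation. The sketch below is purely illustrative; the products, resources, and figures are invented and are not taken from the article.

```python
# Illustrative sketch of mix-dependent capacity projection (invented numbers).
import numpy as np

products = ["A", "B", "C"]
resources = ["mixing", "packing"]

# Hours of each resource consumed by one unit of each product (resources x products).
hours_per_unit = np.array([
    [0.4, 0.6, 0.2],   # mixing
    [0.1, 0.3, 0.5],   # packing
])

plan = np.array([3000, 2500, 2500])      # planned units next month, per product
available = np.array([3500, 2000])       # hours available per resource

required = hours_per_unit @ plan         # matrix-vector product: hours needed per resource
for name, need, cap in zip(resources, required, available):
    status = "OVER capacity" if need > cap else "ok"
    print(f"{name}: {need:.0f} of {cap} hours ({status})")
```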
In Excel, this requires, at the very least, programming via formulas, thereby increasing the odds of errors. Or Jane may want to experiment with more complex forecasting models such as ARIMA and Box-Jenkins. Unless she can use (and has the budget to buy) a few high-end add-ons, she will have to program some rather sophisticated formulas that don't really fit well with Excel's cell-based philosophy. As the spreadsheet becomes more complex, errors in data and calculations are more difficult
to uncover. Further, researchers have found that the statistical formulas in Excel can be quite error prone (McCullough, 2006).

Scenario (or Uncertainty) Complexity
The company needs to evaluate various what-if planning scenarios based on certain assumptions. The complexity arises not from the need to run these scenarios (which presumably can be done via multiple copies of the spreadsheet) but rather from the need to compare and contrast these scenarios and then be able to make decisions based on the comparison. Any changes in data content in one scenario also need to be copied into all the spreadsheets.

Everything Is Connected Complexity
As businesses grow more and more complex, most decisions have consequences in other areas. For example, a start-up company has enough capacity to meet all demand; a mature company, however, might have to make decisions on whom to short, based on profitability and other less tangible measures (such as a strategic account or a loss-leading sale). This process requires an optimization-based tool, or at the very least an engine that does these multitudes of calculations in a loop.

In addition, there are business requirements that cannot be attempted in an Excel-based application because of the amount of work required:
- Need to implement a collaborative process for demand planning where various groups are updating the forecast (sales reps, marketing, sales managers, demand planners, etc.)
- Need for aggregation and disaggregation of user inputs on the fly. For example, a sales rep updating the forecast for individual customers might want to do a quick reality check of the aggregate at the product level; a sales manager with multiple sales reps reporting to him might look at the aggregate number and decide to turn it down at the high level itself. Built-in disaggregation routines in advanced forecasting and planning systems can proportionately decrease the forecast at lower levels (a small sketch of this proportional logic appears at the end of this section).
- Need for flexibility to view and edit data at multiple levels
- Need for engines to do recursive calculations
- Need for realignment of product and customer names as they evolve or are acquired
- Need for netting open orders from the forecast before sending it to the schedulers
- Need to incorporate other streams of data such as inventories, bills of materials, and manufacturing costs
- Need to assimilate an acquired company's data into planning
- Need for speed in replanning. Today, most businesses need to have trigger-based dynamic planning, which requires a system that quickly goes through the steps of planning if the appropriate condition is breached.

While all of the above can be programmed in Excel, typically it is not done because of the investment of time and effort required. As these requirements come to the fore, the original design of the spreadsheet may seriously hinder implementing some of these features. Off-the-shelf forecasting and planning systems typically (though not always) have built-in functionality to address these needs. In companies with these complexities and needs, planning with Excel forces the planners to focus primarily on the next big issue. There is only a skeleton planning process in place. The people themselves are the only thing between the company and the next disaster. As personnel are increasingly rewarded for putting out fires, it creates a culture that is even more focused on firefighting. Very often, this means there is no real planning in the business.
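To make the proportional disaggregation mentioned in the list above concrete, here is a minimal sketch. The customer names and quantities are invented for illustration; real planning systems wrap this logic in far more functionality (security, audit trails, multiple hierarchy levels).

```python
# Illustrative sketch: proportional (top-down) disaggregation of a forecast edit.
# A manager lowers an aggregate number, and the change is spread over the
# detailed forecasts in proportion to their current shares.

def disaggregate(detail, new_total):
    """Scale each lower-level forecast so that the details sum to new_total."""
    factor = new_total / sum(detail.values())
    return {key: qty * factor for key, qty in detail.items()}

# Sales-rep forecasts by customer for one product (invented numbers, sum = 1,000).
by_customer = {"Acme": 400, "Blue Co": 250, "Carter": 350}

# The sales manager turns the aggregate down from 1,000 to 800 units.
print(disaggregate(by_customer, new_total=800))
# {'Acme': 320.0, 'Blue Co': 200.0, 'Carter': 280.0}
```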
THE NEXT STEP: MOVING UP
If you decide your company has outgrown the Excel-based planning application, what is the next step? Most companies go for a best-of-breed advanced planning and scheduling (APS) application. These can be based on a proprietary database or on a database management system (DBMS). Examples of
DBMS include proprietary databases as well as Microsoft SQL Server, Oracle, and DB2. These systems provide a way to deal with the complexities described above on the DBMS end of things. At the same time, these solutions should still be easy to use and integrate with the user's desktop. These applications gather data automatically from other systems throughout the company, not only at a preset frequency but also on demand, and they hold it on a centralized server where it may be accessed by many users. More importantly, they offer true business functionality, including specialized support for a wide range of supply chain processes, including inventory management, manufacturing, and value-chain collaboration. They also provide multiple security levels, allowing access based on roles. DBMS-based applications are not included with the purchase of a laptop; at the same time, most large companies have a few SQL servers running as part of their enterprise systems. DBMS applications can provide business value far exceeding their cost. Compared with Excel, solutions based on database management systems offer significant advantages:
- Visibility: Compared to Excel, a DBMS does a better job of sharing data across users. Data visibility is greatly improved along the supply chain and in the various groups such as sales, operations, finance, commercial, etc.
- Safety: In Excel, any unsaved data may be lost if a system crashes. Databases write data to the hard drive immediately and are usually backed up regularly at a corporate level.
- Volume and speed: High volumes of data bog down Excel; DBMS applications routinely manage high volumes of data.
- Related data: Storing related data together in a single table or spreadsheet is unwieldy and invites errors. Databases easily link tables of related data, such as customers and their orders.
- Future growth: A DBMS is a foundation to further extend the supply chain processes because it enables other advanced tools like planning and scheduling.
- Alerts: A DBMS enables rule-based alerts that can be emailed to users, thereby making them aware of the problem sooner and enabling quicker corrective action.

By leveraging the functionality of a database-grounded APS application, a company's decision makers can immediately detect data and mapping errors. In addition, they can see how data relates across attributes. And there are several planning benefits that are not available in the spreadsheet world.
- Integration to other corporate systems (such as ERP, MES) is more reliable because it can be automated. This is significant because forecasting and planning require reliable input of past data to allow projection into the future.
- Mathematical calculations (e.g., for statistical forecasting and requirements planning) are usually more efficient in these systems because of the use of a programming language more suitable for intensive calculations.
- Collaboration is better because these systems are designed for multiple users. This means that there is user-based security to allow partitioning of data for read/write access. This eliminates the error-prone copy/paste methodology that exists in a spreadsheet world.
- Accountability is enhanced because these systems can keep track of who did what and therefore provide an audit trail.
- Participation from collaborators is higher because the system is easier to use.
Furthermore, something of value can be provided to the person providing the input (such as reports, alerts, and exceptions), and nonperformers can be tracked through the system. This is a good example of carrot and stick.
- What-if scenario planning is better because the system can keep track of multiple versions of the plan. This enables a business to better understand the inherent uncertainties in its plans and to better prepare to deal with them.
- Accuracy metrics are easier to calculate and maintain because it is easy to keep past plans and compare plans to actuals.
- Management participation can be higher, especially if management-level views are created in the system. Management usually requires a graphical (MS PowerPoint-like) view that can sit on top of the tabular (MS Excel-like) view. This again has a positive impact on the overall level of participation from the collaborators.

TO SWITCH OR NOT TO SWITCH?
Managers deciding to move to a database-founded APS system should take several factors into account. First, they should carefully evaluate the time and risk involved in sticking with the spreadsheets. Second, they should weigh the costs of making their most experienced planners/forecasters crunch numbers when such personnel should really be thinking of multiple possible scenarios and ways to deal with them. Third, the tendency of human beings to resist change and stick with a known entity (the Excel spreadsheet in this case) should also be factored in. Lastly, management should evaluate the complexities as they currently exist, as well as the complexities in the near future.

Deciding to switch from a homegrown Excel application to a vendor-provided APS tool is not easy. For starters, many people and departments (including planners and other users, IT personnel, and management) need to be convinced that switching is the way to go. Next, the timing needs to be right. A manager who thinks that the business has outgrown Excel-based tools may still need to wait for the right political and economic climate within the company before proposing the idea. For example, the disruptions caused by a natural or market event might make the case for a change, or the arrival of a senior executive with experience in these types of applications could also tip the scales. On the other hand, a major and ongoing global initiative to upgrade the company's ERP system may call for patience in asking for a new forecasting and planning system.

Even when the timing is right, there are other factors to consider. A best-of-breed APS solution that runs on a DBMS is usually a significant expenditure, including software and implementation costs. Jane and her planning team will require training, incurring further expense. Future upgrades will mean additional financial investments as well. In addition, there is the danger that the selection committee might settle on a solution that does not adequately fit the company's needs. Moreover, it has often been observed that a centrally picked software solution ends up being supplemented by user-developed Excel spreadsheets for doing the actual forecasting and planning. If this happens, the entire investment in the implementation could be a net loss. A slightly different variation of this might see the committee pick a solution that is a good fit now but is not flexible and adaptable towards future needs. As a result, it fails to adapt to changing business conditions and the associated forecasting and planning processes.

AND KEEP IN MIND...
The supply chain planning and forecasting field is poorly understood and undervalued in many companies because it deals with future and uncertain data, and this makes many people uncomfortable. Thus, the role of software is important to support people
in this process (Smith, 2009). It is also worth remembering that supply chain departments sit between two very powerful line organizations, sales and operations, and consequently expend time and energy trying to manage the strong gravitational pull of both. For these and many more reasons, top management is often wary of investing in supply chain planning systems. Once you have decided to move to a database-built forecasting and planning system, recognizing these realities increases the odds of getting your project(s) approved. Perhaps the best advice is to pick a system that can be implemented incrementally; the benefit of the first deliverable may well pay for the whole project.

REFERENCES
Krugman, P. (2013). The Excel Depression, http://www.nytimes.com/2013/04/19/opinion/krugman-the-excel-depression.html?_r=0
McCullough, B.D. (2006). The Unreliability of Excel's Statistical Procedures, Foresight, Issue 3 (February 2006), 44-45.
Panko, R.R. (2008). What We Know About Spreadsheet Errors, Journal of End User Computing's special issue on Scaling Up End User Development, V10:2 (Spring 1998), 15-21. Revised May 2008.
Smith, J. (2009). The Alignment of People, Process, and Tools, Part 1, Foresight, Issue 15 (Fall 2009), 12-18.
Wailgum, T. (2007). Eight of the Worst Spreadsheet Blunders, http://www.cio.com/article/131500/Eight_of_the_Worst_Spreadsheet_Blunders
Wolf, T. (2012). Lack of Spreadsheet Error Detection Solution Can Lead to Nightmares for CFOs, http://technews.tmcnet.com/channels/forensic-accounting/articles/267207-lack-spreadsheet-error-detection-solution-lead-nightmares-cfos.htm

Sujit Singh, CFPIM, CSCP, Chief Operating Officer of Arkieva (www.arkieva.com), is responsible for managing the delivery of software and implementation services, customer relationships, and the day-to-day operations of the corporation. ssingh@arkieva.com
Forecasting with In-Memory Technology
Tim Januschowski, Stephan Kolassa, Martin Lorenz, Christian Schwarz

PREVIEW In-memory technology will radically transform the world of enterprise information systems, according to our enterprising quartet of Tim, Stephan, Martin, and Christian. Their article provides a comprehensive description of the elements of in-memory technology and the ways it might improve the timeliness, flexibility, and speed of forecast generation. In the authors' view, it may enable a new, interactive, enterprise-wide way of working with forecasting tools and applications.

DATABASES AND ONLINE ANALYTICAL PROCESSING
Forecasting Support Systems (FSS) traditionally leverage data from the enterprise information system (EIS), most of which are stored in centralized (relational) databases. Such databases form the technological foundation of most companies' information infrastructure. Very recently, in-memory databases have entered the enterprise application mainstream (Plattner, 2009; Plattner & Zeier, 2011). The in-memory database, or more specifically in-memory technology, has the potential to change the foundation of EIS, in turn increasing the speed, flexibility, and timeliness of data analysis and forecast generation.

Databases
Databases are the foundation of any EIS. They are used to record most operational data (e.g., sales orders) that emerge from user interfaces or computational clients, such as cash registers. A database stores the information in a relational structure, called a table, each column of which stores similar data, such as dates. Each row represents a certain logical unit, such as the characteristics of a product sale: product code, number of items, location of sale, and date. Over time, databases have been optimized to handle as much operational data input as possible.

OLAP and OLTP
With the ongoing trend to utilize operational data for forecasting and more generally for business intelligence (BI), databases have been experiencing a shift in workload. BI is characterized by online analytical processing (OLAP), such as calculation of maxima, sums, or arithmetical means of data stored in databases. In general, such operations touch only a few columns in a table, but they need to look into all entries of those columns. Depending on the size of the table, OLAP operations can impose a heavy load on the database, leading to delayed response times for the overall system. The increased importance of analytics in enterprise systems has led to the separation of systems into transactional systems (online transaction processing, or OLTP) and OLAP systems. Because databases were not able to handle all requests while guaranteeing satisfactory response times, data or business warehouses (BW) were introduced. BWs maintain subsets of the operational data and use data structures that are optimized for analytical processing. Unfortunately, having two systems leads to a variety of problems, including synchronization, higher maintenance costs, and loss of transactional information. In-memory technology offers a means to reunify the database foundation of enterprise information systems.

IN-MEMORY TECHNOLOGY
In-memory technology fuses well-known hardware and software concepts into a single, unified data foundation. Its key components include in-memory data storage, column orientation, stored procedures, and parallelism.
While there are other important concepts (compression, for example), we will focus on the aspects noted above and give an overview of each, since they are most important for the forecasting function. Current implementations of in-memory technology
available on the market are limited to a small number of niche solutions. We expect that in-memory technology will become more prevalent, as all major database vendors are extending their portfolios towards this technology.

In-Memory Data Storage
In-memory databases keep all data in main memory, as opposed to conventional databases that use the hard disk as the primary storage device. While in-memory databases have been around since the 1980s, the available memory on large server systems was inadequate to hold the complete data set of large-enterprise applications. Today's modern servers, however, affordably provide multiple terabytes of main memory, allowing even the largest companies to keep in memory not only all transactional data but historic data as well. The advantage of keeping data in main memory is that data access to main memory is quick: it can be up to 10,000 times faster to access main memory than to access data on disk. Further, keeping data in main memory eliminates multiple I/O layers and simplifies database design, allowing for high throughput for any type of query (Plattner, 2009; Sikka et al., 2012). Even for main-memory databases, however, disks are necessary for data recovery after system crashes. That said, a conventional database with a lot of RAM will not offer the high performance of a main-memory database: whereas a disk-based database is designed for optimal usage of RAM, its bottleneck resource, the scarce resource for main-memory databases is the CPU cache. Optimizing for CPU cache is an art, requiring deep knowledge of the CPU's architecture and an understanding of data-access patterns: data that are accessed together need to reside next to each other in cache. Today, the data requirements of even the largest companies can be stored with in-memory technology as the primary source, because current servers can hold terabytes of highly compressed data with enough RAM to spare for complex forecasting tasks. Throughout our article, we assume such large servers as the hardware foundation. Although in-memory technology is not limited to such
Column Orientation Relational databases map the logical twodimensional table structure into one-dimensional physical computer memory using either row-wise or column-wise storage (the values belonging to the same column are physically stored next to each other in main memory). Storing tables column-wise offers significantly better performance in situations where aggregates need to be computed over a large number of similar data types (e.g., calculation of an average), because all values of a column are placed next to each www.forecasters.org/foresight FORESIGHT 15
other in memory. Thereby, column orientation leverages the hardware-provided acceleration features of modern CPUs. This is especially important for tuning database performance towards CPU caches. Stored Procedures Any program that executes an algorithm, such as a forecasting library, needs to load the underlying data into main memory. With an in-memory database as a core component of in-memory technology, it makes sense to execute algorithms inside the database, as the data is already in main memory. Today, the data requirements of even the largest companies can be stored with in-memory technology as the primary source, because current servers can hold terabytes of highly compressed data with enough RAM to spare for complex forecasting tasks. A program that is executed inside a database is usually termed a stored procedure. The general aim of a stored procedure is to push application logic as close to the data as possible. The database can host complete libraries of algorithms for different types of applications, including forecasting. The common alternative to stored procedures is the retrieval of data from the database and execution of the algorithm outside the database, a slower read/write process. As more functionality becomes available as components of in-memory technology, one could envision in-memory technology as an in-memory computation engine. Forecasting tasks are typically I/O bound: a lot of input data leading to relatively little output data, with comparably simple computations. The transfer of large amounts of data from the database to the program can become a bottleneck. Therefore, forecasting algorithms are a strong candidate for implementation as stored procedures. Parallelism Today s enterprises tend to collect huge volumes of transactional data; as modern processing capabilities have increased immensely, this allows programmers to exploit data and task parallelism. While data-parallel operations like forecasting product-location time series can be executed on partitioned data sets, task-parallel algorithms used for optimization can be executed on the parallel resources. Since there is hardly any I/O bottleneck for in-memory databases, the number of processing units that can be fed with data at the same time is virtually unlimited. IN-MEMORY TECHNOLOGY AND FORECASTING Our belief is that in-memory technology can increase the flexibility, timeliness, and speed of forecast generation and analysis, possibly stimulating deeper penetration of forecasting tools into organizational operations. To understand this promise, we need to consider the creation and use of data cubes, the need for data synchronization, and the nature of the consuming applications for the data. Data Cubes As we have noted, the inability of traditional databases to cope with an increasing OLAP load has led to the separation of analytical and transactional processing, with business warehouses serving as the main source of data for use in analytics. The BW s central data structures are socalled data cubes. Cubes are used to precalculate aggregates for different dimensions (time, location, distribution channel, or product type). A data cube could be used to aggregate all point-of-sale (POS) data, for selection by date of sale, product type, amount, and location. Assume the chosen dimension for a particular cube is time, the key figure amount, and 16 FORESIGHT Fall 2013
that we want to aggregate the POS data on a weekly basis, disregarding location and product type. We typically would create an ETL process (an initialism for extract, transform, and load ) to produce a cube containing the amounts sold, aggregated by week. For example, a retailer with 2,000 stores selling 400 items a day would end up with 5.6 million daily records of POS data per week, assuming the stores are open 7 days a week. Since there are 52 weeks, a report showing the overall development of weekly sales over the past year would need to analyze 291.2 million records. A data cube that is preconfigured to store only the aggregated number of sales per week would need to store only 52 records. BW administrators set up cubes able to answer analytical inquiries. If there is no cube defined for a certain request, the BW system is unable to respond, confronting BW administrators with the need to define new cubes while maintaining the old ones. This process is time and money intensive, and sacrifices flexibility. Suppose we want to see which product was our best seller over the past year. Our example data cube does not contain that information, which is embedded in the transactional systems with their original, unaggregated data. But today s transactional systems with traditional databases may be slow to answer queries on such volumes of data in reasonable time. With in-memory technology, we d be able to aggregate data almost instantly, producing data cubes on the fly. Once a transactional system uses in-memory technology as its primary storage, analytical queries have access to all data captured by the EIS and not merely the data that was pre-calculated into cubes by ETL processes. There are no restrictions regarding type or depth of dimension, enabling the user to perform ad hoc queries to the data. The forecast user will have complete freedom to analyze any forecast direction, be it the overall sales volume for the next year, the expected number of sales for a particular product group in a specific region, or the predicted development of sales for a certain distribution channel. If transactional systems and analytical systems use the same data, flexibility of forecasting applications increases dramatically. And by allowing for navigating and drilling down into the entire set of transactional data as opposed to preaggregated subsets we can build in datavisualization tools to support the FSS user s data exploration and model development. BW administrators set up cubes able to answer analytical inquiries. If there is no cube defined for a certain request, the BW system is unable to respond, confronting BW administrators with the need to define new cubes while maintaining the old ones. With inmemory technology, we d be able to aggregate data almost instantly, producing data cubes on the fly. Data Synchronization The separation of transactional and analytical systems not only affects flexibility, it also raises the problem of data synchronization. Obviously, operational data represents the basis for any analytics. Transferring data from an OLTP into an OLAP system is often nontrivial and consumes significant computational resources. That is why data transfers are often batch processes, scheduled for times when the transactional system is running on a lower load. With regard to analytical systems, it follows that these systems do not contain the most current data, which is captured by the OLTP systems. 
Depending on the intervals between synchronizations, the data in OLAP systems may not contain information from the last day or even the last week. Any analytics carried out on OLAP systems do not take into account the most current data, which still resides in the transactional systems. As forecasters know, the most current data can make a great difference for the accuracy of the forecast. Consuming Applications and Processing Speed In-memory technology increases processing speed by getting data to the CPUs much faster. At first thought, it may not seem to make sense to pursue further improvement in the speed of forecasting because many current forecasting algorithms are incredibly fast. Nevertheless, we think that efforts www.forecasters.org/foresight FORESIGHT 17
to improve speed do make sense. Clients are becoming more and more sophisticated in terms of their statistical and analytical knowledge. Where they used to consume forecasts as output by FSS, they now hire data analysts and statisticians with the explicit mission of tuning and improving models. Thus, FSS need to accommodate much more interactive use, including changing models and parameters, rerunning estimation and prediction across a large number of time series (so that we can actually trust the results!), and calculating diagnostic and out-of-sample forecasting accuracy metrics, all of this on the fly, not overnight.

When FSS are used in a larger context with consuming applications that create replenishment orders or truck routes based on the forecast, the only way to really assess the performance of the entire system is to run simulations over time. Using historical data, we calculate forecasts, simulate the decisions these forecasts imply, evolve the system based on these decisions and the real data, step forward a day, and iterate. We thus assess how a particular forecast model and/or post-forecast decision rule would have worked over a time period of half a year or longer. Such simulations currently run overnight in certain replenishment use cases, and we hope that in-memory technology will enable us to run simulations in a matter of minutes. This would allow us to explore a much wider range of possible forecasting models or subsequent decision rules.

In certain use cases, forecasting speed is even a concern in day-to-day work. Let's consider supermarket replenishment. A supermarket chain's stores may close for the day at 10:00 pm. We can thus expect the day's sales and stock data to be uploaded to the data center around 10:15 pm. Then we start forecasting the next day's sales and calculate orders based on the forecasts and the current stocks. These orders, for thousands of stores and tens of thousands of SKUs, must be done by 2:00 am, because workers in the distribution centers start picking product to load on the delivery trucks at this time, so the trucks can leave at 5:00 am and finish delivering to stores at 7:00 am. If only 10% of a chain's 1,000 stores take just one minute longer to forecast and order their 30,000 SKUs, this translates into 100 minutes of additional running time, a major catastrophe in this process. Thus, although parallelization is already well established here, every millisecond shaved off forecasting a time series counts. And now add that retailers increasingly expect their forecasts to be based on daily or even transaction-log data instead of weekly aggregates.

CROSS-FUNCTIONALITY

Classically, the forecasting function within an organization pulls its data from the central enterprise information system or BW, passes it to specialized forecasting software that resides outside the database, and uploads the forecasts to the system again. Having forecasting functions available in the database itself (as stored procedures) will lead to a greater availability and visibility of forecasts and forecasting within the organization. The tight integration of an FSS into in-memory technology thus delivers on the baseline FSS requirement that Stavros Asimakopoulos (2012) has pointed out: cross-functionality, i.e., integrated forecasting applications that can be used by various departments. Furthermore, we believe that trust in forecasts will increase because the forecasting users can access the most relevant and
timely data whenever they need it, instantly and playfully interacting with the data. In addition to the data-exploration facilitation mentioned earlier, there are also modeling tools that allow for an easy setup of complex forecasting pipelines from the existing forecasting functions. Additional forecasting algorithms can also be implemented directly in the database. If you link in-memory technology with specific FSS functionality, vendors might appreciate its usefulness and consider introducing it for particular tasks in their software. Furthermore, vendors of FSS can concentrate on the algorithmic parts of their software, because they work directly on the transactional data and not on a transformed copy. Overall, this echoes the fifth of the nine guiding principles for forecasting support systems proposed by Robert Fildes and Paul Goodwin (2012): the organization's FSS should be easy to use, easy to understand, and easy to improve and extend. And this ease of use and deep integration will, in turn, help lead to a greater penetration of forecasting analytics in the organization.

DISTRIBUTED COMPUTING

In-memory technology is not the only technological proposal to improve the forecasting function. The most prominent alternative is distributed computing, in which a large cluster of computer nodes serves as the hardware foundation. The architecture promises greatly increased processing speeds at reasonable costs, primarily through parallelization of both (disk) data access and data processing. Distributed computing can be combined (and in fact already has been successfully combined) with in-memory computing, thus leveraging the advantages of both architectures.

Ultimately, deciding which technology is more appropriate for forecasting boils down to two questions, perhaps of a nature more philosophical than methodological. First, is scaling out (distributed computing on a number of off-the-shelf nodes) or scaling up (in-memory technology on a single high-end server) the more appropriate architecture for forecasting? Since the data of even the largest enterprises can fit into a single in-memory database, we believe that in-memory technology will accommodate most enterprise-scale forecasting needs rather well. It is worth mentioning that distributed usage of main memory is an active area of research (for example, RAMCloud, a research initiative at Stanford), and massive-scale in-memory-technology instances are on the way (in the high-terabyte range) that allow for scale-outs using in-memory technology.

Second, there is the question of the nature of the forecasting process. We believe that forecasting is often an explorative task, requiring continuous refinement and user interaction to come up with a satisfying result. Using disk-based storage and distributed data processing often delivers some benefits, but it also implies a batch-style interaction, a major flaw for interactive systems. One example is how to understand misguided, far-off forecasts from automated processes. These need to be understood well; and here, an interaction with an OLAP cube is essential to figure out whether bottom-up, top-down, middle-out, and similar techniques help in forecasting the time series. In our view, in-memory technology strikes a good balance between interactive and batch-processing capabilities.
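As a minimal sketch of the rolling simulation described in the previous section (everything here is hypothetical: the demand data, the simple exponential smoothing forecast, the order-up-to rule, and the one-day lead time), the loop below steps day by day through history, forecasts, simulates the implied replenishment decision, and then evolves the system with the real demand:

```python
import random

random.seed(1)
history = [max(0.0, random.gauss(100, 15)) for _ in range(365)]  # hypothetical daily demand

alpha, level = 0.2, history[0]        # simple exponential smoothing state
stock, pipeline = 250.0, []           # on-hand stock and orders in transit
stockout_days = 0

for demand in history[1:]:
    # 1. Forecast tomorrow's demand from the data seen so far.
    forecast = level

    # 2. Simulate the decision the forecast implies: order up to two days of cover.
    pipeline.append(max(0.0, 2 * forecast - stock))

    # 3. Evolve the system with the real data: yesterday's order arrives,
    #    today's actual demand is served, then step forward a day.
    if len(pipeline) > 1:
        stock += pipeline.pop(0)
    if demand > stock:
        stockout_days += 1
    stock = max(0.0, stock - demand)

    # 4. Update the forecast model with the observed demand.
    level = alpha * demand + (1 - alpha) * level

print(f"stockout days: {stockout_days}, ending stock: {stock:.1f}")
```

Repeated for tens of thousands of store/SKU series, a year or more of history, and every candidate model or decision rule, this is exactly the workload that today runs overnight; bringing it down to minutes is what would make such simulation part of interactive model development.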
CONCLUSIONS

The Hasso Plattner Institute's joint projects with SAP's Innovation Center have explored the power of in-memory technology for forecasting (Januschowski and colleagues, 2012), and the results point to two major impacts:

- an increasing role for forecasting within the enterprise information system, thanks to improved flexibility, speed, and cross-functionality;
- a change in the way forecasting is conducted, from traditional reliance on a static batch process that produces an automatically generated document or report, to a more dynamic, interactive process that permits and encourages greater experimentation with forecasting methods and procedures.

Ultimately, we may even see a new perception of forecasting that will drive the evolution of business processes.
Dr. Tim Januschowski is a researcher and developer at SAP's recently founded Innovation Center in Potsdam, Germany. His main expertise is in operations research, in particular constraint/integer programming. At SAP, he led a project in demand forecasting on in-memory technology. tim@januschowski.de

Dr. Stephan Kolassa, an associate editor at Foresight, is a senior research expert at the Center of Excellence in Forecasting & Replenishment at SAP Switzerland. He currently leads the research effort involved in developing a unified demand forecast for retail on SAP's in-memory technology implementation, SAP HANA. stephan.kolassa@sap.com

Martin Lorenz is a research assistant at the Hasso Plattner Institute, Potsdam, Germany. He has been part of the research group Enterprise Systems and Integration Concepts since August 2010. His main research interest is in-memory technology and its implications for existing programming models. He led a research effort on demand forecasting on in-memory technology conducted jointly with MIT and SAP. martin.lorenz@hpi.uni-potsdam.de

Christian Schwarz, a research assistant at the Hasso Plattner Institute, is part of the Enterprise Systems and Integration Concepts research group. He was part of the first HANA project, evaluating the applicability of in-memory database technology for the reunification of transactional and analytical enterprise systems, and now works on solutions for high-performance, enterprise-scale data challenges with in-memory technology. christian.schwarz@hpi.uni-potsdam.de

References

Asimakopoulos, S. (2012). Forecasting Software: Improving the User Experience, Foresight, Issue 26 (Summer 2012), 34-39.

Fildes, R. & Goodwin, P. (2012). Guiding Principles for Forecasting Support Systems, Foresight, Issue 25 (Spring 2012), 10-15.

Januschowski, T., Lorenz, M., Folkerts, E., Heimburger, R., Akkas, A., Simchi-Levi, D. & Youssef, N. (2012). Demand Forecasting with Partial POS Data Using In-Memory Technology, Electronic Proceedings of the 32nd International Symposium on Forecasting, Boston.

Plattner, H. (2009). A Common Database Approach for OLTP and OLAP Using an In-Memory Column Database, Proceedings of the 35th SIGMOD International Conference on Management of Data.

Plattner, H. & Zeier, A. (2011). In-Memory Data Management: An Inflection Point for Enterprise Applications, Springer.

Sikka, V., Faerber, F., Lehner, W., Cha, S. K., Peh, T. & Bornhoevd, C. (2012). Efficient Transaction Processing in SAP HANA Database: The End of a Column Store Myth, Proceedings of the 2012 ACM SIGMOD International Conference on Management of Data.
Financial Forecasting

The Future of Financial Market Forecasting: Five Trends Worth Watching
Jeffrey Mishlove

Preview: Jeffrey Mishlove, author of The Alpha Interface series of empirical research on the financial markets, highlights five trends that have the power to alter the landscape of financial forecasting. His message is essential food for thought for all of us in the forecasting profession: only rarely is it possible or even advisable to stick to our tried-and-true methods of the past. It is crucial for forecasters to stay on top of advances in such areas as constantly increasing computer capability, natural-language processing, and the expanding power and complexity of algorithms.

Key Points
Five trends likely to have the greatest impact on financial forecasting over the coming decade are:
- the lowered cost and greater power of parallel-processing supercomputers
- breakthroughs in contextual, natural-language processing via machine learning
- more powerful algorithms for pattern recognition
- the ability to identify the unique characteristics of individual, expert forecasters
- advance warnings of bubbles and crashes

INTRODUCTION: ADAPTIVE MARKETS

There is a view that financial markets are ecological systems in which different groups ("species") compete for scarce resources. Called the adaptive markets hypothesis (AMH), it posits that markets will exhibit cycles where competition depletes existing trading opportunities, and then new opportunities appear. The AMH predicts that profit opportunities will generally exist in financial markets. While competition will be a major factor in the gradual erosion of these opportunities, the process of learning is an equally important component. Higher complexity has the effect of inhibiting learning, so that the more complex strategies will persist longer than the simple ones. Some strategies will decline as they become less profitable, while other strategies may appear in response to the changing market environment. Profitable trading opportunities fluctuate over time, so strategies that were previously successful will display deteriorating performance, even as new opportunities appear.

In this article, I highlight five relatively new, complex approaches that I believe will come to characterize the landscape of financial forecasting over the next several years.

I. THE RISE OF THE SUPERCOMPUTER

In this era of cloud computing, big data, server farms, and the smartphone in your pocket that's vastly more powerful than a roomful of computers of previous generations, it can be easy to lose sight of the very definition of a supercomputer. The key is capability, or processing speed, rather than capacity, or memory. For financial forecasters, the particular computing capability of interest is the probabilistic analysis of multiple, interrelated, high-speed, complex data streams. The extreme speed of global financial systems, their hyperconnectivity, large complexity, and the massive data volumes produced are often seen as problems. Moreover, the system components themselves increasingly make autonomous decisions. For example, supercomputers are now performing the majority of financial transactions. High-frequency (HF) trading firms represent approximately 2% of the nearly 20,000
trading firms operating in the U.S. markets, but since 2009 they have accounted for over 70% of the volume in U.S. equity markets and are approaching a similar level of volume in futures markets. This enhanced velocity has shortened the timeline of finance from days to hours to nanoseconds. The accelerated velocity means not only faster trade executions but also faster investment turnovers. At the end of World War II, the average holding period for a stock was four years. By 2000, it was eight months; by 2008, two months; and by 2011, twenty-two seconds.

The flash crash of May 6, 2010, made it eminently clear to the financial community (i.e., regulators, traders, exchanges, funds, and researchers) that the capacity to understand what had actually occurred, and why, was not then in place. In the aftermath of that event, a push began to apply supercomputers to the problem of modeling the financial system, in order to provide advance notification of potentially disastrous anomalous events. Places such as the Center for Innovative Financial Technology (CIFT) at the Lawrence Berkeley National Laboratory (LBNL) and the National Energy Research Scientific Computing (NERSC) center assumed leading roles in this exploration.

Fortunately for many forecasters, you no longer need to affiliate with a government-funded megalaboratory in order to access high-performance computing power. Although the only way to get high performance for an application is to program it for multiple processing cores, the cost of a processor with many cores has gone down drastically. With the advent of multicore architecture, inexpensive computers are now routinely capable of parallel processing. In the past, this was mostly available only to advanced scientific applications. Today, it can be applied to other disciplines such as econometrics and financial computing.

It is worth taking a moment here to look at the size of the market-data problem. Mary Schapiro, chair of the SEC from 2009 through 2012, estimated the flow rate of the data stream to be about twenty terabytes per month. This is certainly an underestimation, especially when one considers securities that are outside the jurisdiction of the SEC, or bids and offers that are posted and removed from the markets (sometimes in milliseconds). Nevertheless, supercomputers involved in scientific modeling such as weather forecasting, nuclear explosions, or astronomy process this much data every second! And, after all, only certain, highly specialized forecasting applications are going to require real-time input of the entire global financial market. Many forecasting applications do well enough with only a small fraction of this data.

Let's look at two unique and creative examples of financial-forecasting research that could not have been accomplished without the assistance of supercomputers.

Reverse Engineering a Financial Market

Wiesinger and colleagues (2013) of the Swiss Federal Institute of Technology in Zurich developed a method to reverse engineer real-world financial time series.
They modeled financial markets as being made up of a large number of interacting, rational Agent-Based Models (ABMs). ABMs are, in effect, virtual investors. Like real investors and traders, they have limited knowledge of the detailed properties of the markets they participate in. They have access to a finite set of strategies, can take only a small number of actions at each time-step, and face restrictions on their adaptation abilities.
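As a minimal sketch of one of the agent types used in the study, the minority game described in the list below, the simulation here rewards an agent whenever it ends up on the less crowded side of the market. It is purely illustrative: the agents' strategies are reduced to biased coin flips, whereas the published models use richer, history-dependent strategies tuned by a genetic algorithm.

```python
import random

random.seed(0)

# Minority game sketch: each round, every agent buys (+1) or sells (-1);
# agents on the minority side score a point. Strategies here are just a
# fixed probability of buying -- deliberately simpler than the real ABMs.
N_AGENTS, N_ROUNDS = 101, 1000
buy_prob = [random.random() for _ in range(N_AGENTS)]
score = [0] * N_AGENTS

for _ in range(N_ROUNDS):
    actions = [1 if random.random() < p else -1 for p in buy_prob]
    minority_side = -1 if sum(actions) > 0 else 1
    for i, action in enumerate(actions):
        if action == minority_side:
            score[i] += 1

best = max(range(N_AGENTS), key=score.__getitem__)
print(f"best agent: buy_prob={buy_prob[best]:.2f}, wins={score[best]}/{N_ROUNDS}")
```

In the study itself, a genetic algorithm searches over such agent populations so that the ensemble reproduces the target market's price history as closely as possible.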
Given the time series training data, genetic algorithms were used to determine what set of agents, with which parameters and strategies, optimized the similarity between the actual data and the data generated by an ensemble of virtual stock markets peopled by software investors. By optimizing the similarities between the actual data and that generated by the reconstructed virtual stock market, the researchers obtained parameters and strategies that revealed some of the inner workings of the target stock market. They validated their approach by out-of-sample predictions of directional moves of the Nasdaq Composite Index. The following five types of ABMs were employed:

Minority Game. Here, an agent is rewarded for being in the minority. An agent has the possibility not to trade, thus allowing for a fluctuating number of agents in the market.

Majority Game. An agent is rewarded for being in the majority instead of in the minority.

Delayed Majority Game. Like the majority game, but the return following the decision is delayed by one time-step.

Delayed Minority Game. This game is like the minority game, except for the delayed payoff.

Mixed Game. Here 50% of the agents obey the rules of the majority game, with the other 50% obeying the rules of the minority game.

The models were trained on simulated market data using a genetic algorithm. They were then tested on out-of-sample, actual data from the Nasdaq Composite Index. The results are shown in Figure 1.

Figure 1. Out-of-Sample Success Rate of Agent-Based Models Predicting the Nasdaq Composite

All agent-based models performed to a level of statistical significance. This was largely due to the success of the models in trending markets. Interestingly, both the trend-following and contrarian strategies worked well during the trending markets. Similar results are reported by active traders.

CEO Network Centrality and Corporate Acquisitions

BoardEx is a business intelligence service used as a source for academic research concerning corporate governance and boardroom processes. It holds in-depth profiles of over 400,000 of the world's business leaders, and its proprietary software shows the relationships between and among these individuals. This information is updated on a daily basis. El-Khatib and colleagues (2012) from the University of Arkansas used a supercomputer to analyze this data. They calculated four measures of network centrality (degree centrality, closeness centrality, betweenness centrality, and eigenvector centrality) for each executive connected into such business networks. Degree centrality was the sum of direct ties an individual had in each year. Closeness centrality was the inverse of the sum of the shortest distances between an individual and all other individuals in a network. Betweenness centrality measured how often an individual rested on the shortest path between any other members of the network. Eigenvector centrality was a measure of the importance of an individual in the network, taking into account the importance of all the individuals connected in the network. The amount of computation was daunting and required storing information for each and every possible pair of business leaders in computer memory. Processing the closeness factor, for example, took about seven days on the Star of Arkansas supercomputer at the Arkansas High-Performance Computing Center.
The final result, interestingly, showed that CEOs who were more centrally positioned were more likely to bid for other publicly traded firms, and these deals carried greater value losses to the acquirer as well as greater losses to the combined entity. The researchers followed the CEOs and their firms for five years after their first value-destroying deals, and found that firms run by centrally positioned CEOs better withstood the external threat from market discipline. Moreover, the managerial labor market was less effective in disciplining centrally positioned CEOs because they were more likely to find alternative, high-paying jobs. Ultimately, they showed that CEO personal networks could have their darker side: well-connected CEOs became powerful enough to pursue acquisitions regardless of the impact on shareholder wealth or value.

Figure 2. Acquisitions by Executives with Low and High Network Centrality

As shown in Figure 2, across all four dimensions of CEO network centrality, the research study clearly demonstrated that CEOs with the most social connectivity were those most willing to make risky, and generally unprofitable, acquisitions.

II. FORECASTING WITH NATURAL-LANGUAGE PROCESSING

IBM's Watson computer, which beat champions of the quiz show Jeopardy! in a well-publicized face-off two years ago, is now being employed to advise Wall Street on risks, portfolios, and clients. Citigroup Inc., the third-largest U.S. lender, was Watson's first financial-services client. The unique Watson algorithms can read and understand 200 million pages in three seconds. Such skills are well suited for the finance industry. Watson can make money for IBM by helping financial firms identify risks and rewards. The computer can go through newspaper articles, documents, SEC filings, and even social-networking sites to try to make some sense out of them.

This approach is not entirely new. Many high-frequency traders have trained algorithms to capture buzzing trends in social-media feeds. The hitch here, however, is that the algorithms haven't been fully taught the dynamics of the accurate context of the information being diffused. The results at times can be fascinating, odd, even comical and, if you will, potentially ominous. A good example: on February 28, 2011, during the annual hubbub and media excitement of the U.S. movie industry's Academy Awards, when actress Anne Hathaway hosted the event televised worldwide, stock prices of Berkshire Hathaway rose by 2.94%. Figure 3, of BRK.B stock, shows only one of many instances where the curves of Ms. Hathaway's career have shaped the stock price of this particular conglomerate. It is interesting to note that, in this instance, traders realized the error and the huge stock-price jump reversed itself the very next day, on March 1.

Figure 3. Berkshire Hathaway (BRK.B) daily stock chart, February 2011, showing an unusual price jump on the day that actress Anne Hathaway hosted the Academy Awards ceremony.
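The Hathaway episode illustrates the context problem in miniature: a naive mention-counting algorithm cannot tell the actress from the conglomerate. The sketch below is purely illustrative, with invented headlines and a deliberately crude context rule; production systems use far more sophisticated entity disambiguation.

```python
# Illustrative only: invented headlines and a deliberately crude context filter.
headlines = [
    "Anne Hathaway dazzles as Academy Awards host",
    "Berkshire Hathaway reports rise in quarterly book value",
    "Hathaway earns rave reviews for Oscars performance",
]

def naive_mentions(keyword, docs):
    # Count every headline containing the keyword -- this is the approach
    # that conflates the actress with the holding company.
    return sum(keyword.lower() in d.lower() for d in docs)

def context_aware_mentions(keyword, docs,
                           finance_terms=("berkshire", "book value", "shares", "earnings")):
    # Count a mention only if a finance-related context word also appears.
    return sum(keyword.lower() in d.lower()
               and any(t in d.lower() for t in finance_terms)
               for d in docs)

print(naive_mentions("Hathaway", headlines))          # 3 -- the actress is counted too
print(context_aware_mentions("Hathaway", headlines))  # 1 -- only the company
```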
Recent events such as these demonstrate the shortcomings in the contextual discrimination of natural-language processing. Fortunately, things are changing rapidly. A Google search on the phrase "natural language processing" yields over 3.1 million results. This is a very hot area for forecasting, as natural-language processing of news stories, tweets, and message-board posts has now been the focus of dozens of research studies. Although the original Watson computer contained $3 million worth of hardware alone, IBM is now releasing a new server that can be purchased for about $67,000 complete. It includes a scaled-down version of the brain IBM engineered to build Watson.

Reuters publishes 9,000 pages of financial news every day. Wall Street analysts produce five research documents every minute. Financial-services professionals receive hundreds of emails a day. And these firms have access to data about millions of transactions. The ability to consume vast amounts of information to identify patterns and formulate subsequent hypotheses naturally makes Watson-style computing an excellent solution for making informed decisions about investment choices, trading patterns, and risk management.

III. SMARTER PATTERN RECOGNITION AND PATTERN RECALL

The following example of a breakthrough in pattern-recognition technology is reprinted from my book The Alpha Interface: Empirical Research on the Financial Markets, Book Two. It exemplifies the level of creativity and power available to the new generation of personal computers with parallel-processing capacity.

Fong and colleagues (2012) from the University of Macau, China, and the University of Riyadh, Saudi Arabia, presented a new type of trend-following algorithm (more precisely, a trend-recalling algorithm) that operated in a totally automated manner. It worked by partially matching the current trend with a proven successful pattern from the past. The algorithm drew upon a database of 2.5 years of historical market data. The system spent the first hour of the trading day evaluating the market and comparing the initial market pattern with hundreds of patterns from the database. The rest of the day was spent trading based on the match that was eventually made, and using sophisticated trading algorithms to avoid conditions where volatility was either too high or too low. The schematic of the trading system is diagrammed below.

Their experiments, based on real-time Hang Seng index futures data for 2010, showed that this algorithm had an edge in profitability over the other trend-following methods. The new algorithm was also compared to time-series forecasting types of stock trading. In simulated trading during 2010, after transaction costs, the system attained an annual return on investment of over 400%, making over 1,100 trades. Figure 4 compares the trend-recalling protocol to four other trend-following algorithms (as listed at the top of the chart).

Figure 4. Comparison of Trend-Following Algorithms (reprinted from Fong, Tai, and Pichappan, 2012, with permission)

This mind-boggling result of a return greater than 400% is the most robust I have encountered thus far in my survey of the scientific literature on the financial markets. It requires the creation of a unique database for each market being traded. In all likelihood, not every market will provide results as strong as those found in the Hang Seng index. However, there are many potential markets that could be exploited in this manner.
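As a minimal sketch of the pattern-matching step at the heart of such a trend-recalling approach (synthetic random-walk data, and matching by simple Pearson correlation; the published algorithm is considerably more elaborate and adds volatility filters), the code below finds the stored trading day whose opening hour most resembles today's opening hour:

```python
import random

random.seed(7)

def random_walk(n, start=100.0, step=0.5):
    # Synthetic one-minute price path -- stands in for historical market data.
    prices, p = [], start
    for _ in range(n):
        p += random.gauss(0, step)
        prices.append(p)
    return prices

def pearson(a, b):
    # Plain Pearson correlation between two equal-length price sequences.
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# Hypothetical database of 500 past opening hours (60 one-minute prices each)
# and the opening hour observed so far today.
past_days = [random_walk(60) for _ in range(500)]
today_open = random_walk(60)

# "Recall": pick the stored day that best matches today's opening pattern.
best = max(range(len(past_days)), key=lambda i: pearson(past_days[i], today_open))
print("best-matching day:", best,
      "correlation:", round(pearson(past_days[best], today_open), 3))
```

In the study, the remainder of the day is then traded on the assumption that what followed the recalled pattern will tend to follow again, with additional rules suppressing trades when volatility is too high or too low.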
Considering the costs of developing trend-recalling algorithms and also creating unique databases for each market, the
potential for success seems considerable for those who are equipped and ready to pursue this path.

IV. GREATER SKILL IN IDENTIFYING EXPERT FORECASTERS

Bar-Haim and colleagues (2011) from the Hebrew University of Jerusalem downloaded tweets from the StockTwits.com website during two periods: from April 25, 2010, to November 1, 2011, and from December 14, 2010, to February 3, 2011. A total of 340,000 tweets were downloaded and used for their study. A machine-learning system was used to classify the tweets according to different categories of fact (i.e., news, chart pattern, report of a trade entered, report of a trade completed) and opinion (i.e., speculation, chart prediction, recommendation, and sentiment). A variety of algorithms were then employed to determine whether some microbloggers were consistently more expert than others in predicting future stock movement.

Figure 5 shows cumulative results for the first twenty users in the per-user model. This model learned from the development set a separate Support Vector Machine regression model for each individual user, based solely on that user's tweets. The approach was completely unsupervised machine learning, and required no manually tagged training data or sentiment lexicons.

Figure 5. Out-of-Sample, Next-Day Test Results for Per-User Ranking

The results showed that this model achieved good precision for a relatively large number of tweets, and for most of the data points reported in the table the results significantly outperformed the baseline. Overall, these results showed the effectiveness of two machine-learning methods for finding experts through unsupervised learning. While the accuracy level declined as additional users were included, the results were statistically significant for the first eleven users, and again for users seventeen through twenty. These results illustrate the importance of distinguishing microblogging experts from nonexperts. The key to discovering the effectiveness of individual microblog posters was to develop unique regression models for each poster, rather than relying on a one-size-fits-all heuristic. It was also important to understand the relevant time frames involved. Another study, for example, found that retail traders responded most favorably to recommendations of message-board posters who had been most accurate during the previous five days.

V. BETTER RECOGNITION OF BUBBLES AND CRASHES

The theoretical underpinnings of bifurcations and phase transitions in finance have been around for many years. In the 1970s, the mathematical framework of catastrophe theory became a popular field of research, as it provided one of the first formalizations that included notions both of equilibrium and of nonlinear state transitions. This formalism resulted in parsimonious descriptions of bull and bear markets and of market-crash dynamics. It employed a small number of parameters, such as the relative proportion of technical traders (those who base their strategies on historical prices) and fundamentalists (those who base their strategies on the underlying business dynamics).

Since 1999, many researchers have argued that financial bubbles and crashes exhibit unique mathematical signatures known as log-periodic oscillations. This refers to a sequence of oscillations with progressively shorter cycles, whose period decays according to a geometric series.
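To illustrate what such a signature looks like, the following minimal sketch generates a synthetic path from the log-periodic power-law (LPPL) form used in this literature (see, e.g., Filimonov and Sornette, 2013). All parameter values here are invented purely for display; fitting them stably to real price data is the hard part that recent refinements address.

```python
import math

# Hypothetical LPPL-style parameters, chosen only to make the shape visible.
A, B, C = 100.0, -8.0, 0.4      # level, power-law amplitude, oscillation amplitude
m, omega, phi = 0.5, 8.0, 0.0   # power-law exponent, log-frequency, phase
t_c = 500.0                     # critical time (the hypothetical crash date)

def lppl(t):
    # LPPL form: A + B*(t_c - t)^m * [1 + C*cos(omega*ln(t_c - t) + phi)]
    dt = t_c - t
    return A + B * dt ** m * (1 + C * math.cos(omega * math.log(dt) + phi))

# An accelerating rise whose oscillation peaks bunch ever closer together
# as t approaches the critical time t_c.
path = [lppl(t) for t in range(499)]
print(round(path[0], 2), round(path[250], 2), round(path[495], 2))
```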
The pattern has been documented in unrelated crashes from 1929 to 1998, on stock markets and currencies as diverse as those in the U.S., Hong Kong, and Russia, as well as for oil and even real
estate. Recent refinements indicate that this approach is becoming increasingly sophisticated (Filimonov and Sornette, 2013), and with the onset of massive databases and high-performance computing, it has become possible to empirically study the more granular dynamics of the relationships between securities. Researchers are coming to understand these processes with ever greater mathematical sophistication. As one example of several recently published, Quax and colleagues (2013) from the University of Amsterdam in the Netherlands have measured this self-organized correlation in terms of the transmission of information among units. They recently introduced the information dissipation length (IDL) as a measure of the characteristic distance of the decay of mutual information in the system. As such, it can be used to detect the onset of long-range correlations in the system that precede critical transitions. The higher the IDL of a system, the larger the distance over which a unit can influence other units, and the better the units are capable of a collective transition to a different state. Because of this, they can measure the IDL of systems of coupled units and detect their propensity to a catastrophic change. As a demonstration of the IDL as a leading indicator of global instability of dynamical systems, they measured the IDL of risk trading among banks by calculating the IDL of the returns of interest-rate swaps (IRS) across maturities.

CONCLUSION

The markets never rest. Almost all new developments eventually become incorporated into the collective intelligence of the market itself. Only in the rarest of circumstances is it possible for financial forecasters to stick to the tried-and-true methods of the past. For the most part, it is essential to stay ahead of the curve. In today's global economy, as this article has shown, new and creative forecasting approaches and solutions are coming from places such as Macau, Saudi Arabia, the Netherlands, and Israel. Most trends highlighted in this article will come to lose their potency as the market as a whole digests them. For the foreseeable future, though, computers will continue to operate at faster speeds, allowing for greater granularity of analysis. And as for human creativity, it seems to be an almost infinite resource.

REFERENCES

Bar-Haim, R., Dinur, E., Feldman, R., Fresko, M. & Goldstein, G. (2011). Identifying and Following Expert Investors in Stock Microblogs, Proceedings of the Conference on Empirical Methods in Natural Language Processing, 1310-1319, Association for Computational Linguistics.

El-Khatib, R., Fogel, K. & Jandik, T. (2012). CEO Network Centrality and Merger Performance, available at SSRN 2024484.

Filimonov, V. & Sornette, D. (2013). A Stable and Robust Calibration Scheme of the Log-Periodic Power Law Model, Physica A: Statistical Mechanics and its Applications, 392(17), 3698-3707.

Fong, S., Tai, J. & Pichappan, P. (2012). Trend Recalling Algorithm for Automated Online Trading in Stock Market, Journal of Emerging Technologies in Web Intelligence, 4(3), 240-251.

Quax, R., Kandhai, D. & Sloot, P. M. (2013). Information Dissipation as an Early-Warning Signal for the Lehman Brothers Collapse in Financial Time Series, Scientific Reports, 3, 1898; DOI: 10.1038/srep01898.

Wiesinger, J., Sornette, D. & Satinover, J. (2013). Reverse Engineering Financial Markets with Majority and Minority Games Using Genetic Algorithms, Computational Economics, 1-18.
Jeffrey Mishlove is the author of The Alpha Interface series of books about empirical research on the financial markets. For 15 years he hosted the national public television series Thinking Allowed. He and his wife, Janelle Barlow, own the U.S. license for the international business consulting and training groups TACK and TMI. Jeff received an interdisciplinary doctoral degree from the University of California, Berkeley, in 1980. jeff@alphainterface.com
Forecaster in the Field

Interview with Jeffrey Mishlove, author of The Alpha Interface series

How did you get started in financial forecasting?

In the mid-1990s, I was serving as president of a nonprofit organization called the Intuition Network, which was dedicated to helping people cultivate and apply their intuitive abilities. A number of our members were active in the financial markets, and several were eager to instruct me in their approaches. Before long, I complemented the intuitive approach with a study of technical analysis and neural networks. The results proved to be surprisingly successful in a short period of time.

How did you come to create The Alpha Interface book series?

One day, my wife, Janelle Barlow, came to me and said, "I have a billion-dollar idea for you." In the course of our conversation, I realized that there was a huge body of recent, empirical research about the operation of the financial markets. Old theories concerning the random walk and the efficient market were being challenged on multiple fronts. Ironically, most traders, investors, and even economists were unaware of the full scope of this research. The financial forecasting landscape is changing rapidly. Because I have the ability to digest a large volume of information and make it understandable to the general public, I took this project on. Three books have now been released.

What other forms of forecasting have been important in your development?

As a doctoral student at UC Berkeley with an interdisciplinary program in parapsychology, I became familiar with the research literature in such areas as precognition and remote viewing. Although I began as a skeptic in my inquiry into these topics, after carefully studying the evidence that's accumulated for more than a century, I became convinced that this is an important field. Some outstanding scientists have contributed to this area of study. My first book, The Roots of Consciousness (1975), dealt extensively with these phenomena. I also had the privilege of hosting the national public-TV series Thinking Allowed over a 15-year period. During that time, I conducted intimate interviews with thought leaders in philosophy, psychology, health, science, and spirituality. Each of my guests in their own way was attempting to forecast and to influence the future of humanity. A number of them were professional forecasters. Their influence rubbed off on me. So it's fair to say my approach is interdisciplinary.

Do you work with businesses to improve their forecasting accuracy?

As scientific as business forecasting can be, there's always a human element. My wife Janelle and I own the licenses in the United States for TMI and TACK, two international training and consulting consortiums. These businesses work with organizations to optimize the full potential of their systems and staff. We like to say that TACK is about getting customers, and TMI is about keeping them. Our clients are typically, but not always, multinational companies. Missed forecasts often occur when different branches of a business are not aligned with each other, nor with the brands they represent. We help them to identify and then correct these gaps in their organizational alignment and in the collective skill sets of their staffs.

Tell us about your other interests.

In 2010, two friends and I discovered the first confirmed dinosaur footprints found in the state of Nevada. They were just a few miles from my home in Las Vegas, in the nearby Red Rock Canyon National Conservation Area.
Book Reviews

Demand and Supply Integration: The Key to World-Class Demand Forecasting by Mark A. Moon
Reviewed by John Mello

When I was asked to review Dr. Mark Moon's new book, I was naturally curious about the approach taken to the classic forecasting dilemma of demand/supply integration. The critical importance of matching supply to demand is nothing new to those who struggle with trying to keep customers happy without overburdening one's own company with excessive amounts of inventory. During my almost 30 years in the consumer packaged-goods industry, I saw firsthand what happens when supply and demand are not well matched (from having seven warehouses full of components to having no inventory for a product relaunch), and I became acutely aware of what can happen when forecasts are inaccurate. The mental scars of my experiences in industry have carried over into my academic career, where I have been involved in researching ways we can better match supply and demand through improved sales forecasting and S&OP.

DSI vs. S&OP

And it has to be said that, initially, where Moon's take on the subject is concerned, I was disappointed. The author presents the main idea behind demand and supply integration (DSI) as a single process to engage all functions in creating aligned, forward-looking plans and make decisions that will optimize resources and achieve a balanced set of organizational goals. This, after all, does not seem very different from S&OP. The author then goes on to explain how DSI is different: DSI is presented as more strategic in nature due to a longer planning horizon; it is characterized by more involvement with functions outside a firm's supply-chain operation, particularly sales, in implementing and executing the process of matching demand with supply. Furthermore, Moon maintains that S&OP has a tactical aura that prevents engagement from various company functions such as marketing, finance, and senior leadership, while DSI does not. The author then goes on to admit that the goals of S&OP and DSI are similar, but given the number of failed S&OP implementations in the field, perhaps it is time for a branding campaign and an alternate label in the form of DSI.

COLLABORATION IS THE KEY

So while at first I questioned Dr. Moon's proposal that DSI is significantly different from S&OP, and that a new brand name would lift the image of the process of matching supply to demand, the more thought I have given it, the more open I have become to embracing
the idea. If there is indeed a tactical aura around the term S&OP, if DSI can change the way people in organizations view the process towards a more positive attitude, and if DSI does encourage more collaboration within the process, then perhaps it is time for a new brand image. One point in DSI's favor is that it could serve as a framework or model for moving toward more integration, not only within a company but also between companies in a supply chain. The concept of DSI could be useful in moving processes such as sales forecasting beyond a single-firm focus toward a supply-chain focus. The three principles that Moon advances for DSI (it should be demand driven, collaborative, and disciplined) are exactly what companies need to apply to achieve proper balance between supply and demand. The key is collaboration. I have come to believe that Dr. Moon is right: it may indeed be time to move beyond S&OP toward a more collaborative process that is strategic in nature, looks at a long-range planning horizon, engages other functions beyond a firm's supply-chain management organization, and seeks to include other members up and down the company's supply chain.

ACROSS THE SUPPLY CHAIN

Demand and Supply Integration illustrates how DSI can be implemented across a supply chain through linking demand and supply plans across companies. Customers communicate their demand plans to a manufacturer, and these become the inputs to the manufacturer's demand forecast. The manufacturer in turn uses the demand forecast to develop operational plans, which are communicated back to the customer to be used to develop a capacity forecast. The manufacturer also communicates its demand plan to its tier-one suppliers for developing their operational plans, which are communicated back to the manufacturer. As the book points out, companies already use collaborative processes such as CPFR to share forecasts and plans, but this is typically done between two firms. More comprehensive collaboration could be gained if DSI processes were implemented across multiple tiers in a supply chain.

FRAMEWORK OF A WELL-RUN DSI PROCESS

The author explains that the book is not a primer on the detailed implementation of DSI. What it does nicely is to provide a framework for understanding the important elements of a well-run DSI process, including foundational principles, necessary components, and elements that need to be in place to make the process work. Along the way, the book also provides excellent advice pertaining to specific actions that demand forecasters should take to achieve accuracy, such as:

- techniques companies can use to smoke out and minimize some of the game-playing tactics and other forms of bias in forecasting;
- questions they should ask customers concerning the processes they use to generate forecasts;
- ways they can measure forecasting performance; and
- methods they can use to obtain and incorporate market intelligence into sales forecasts.

There are also excellent chapters on quantitative and qualitative forecasting techniques that would be very useful to readers who are unfamiliar with these topics, or who simply want to renew their knowledge of them.

WHERE DOES OUR COMPANY STAND?

Another excellent section of the book explains how companies can identify where they stand in terms of world-class forecasting, and it provides a framework with which companies can diagnose their forecasting problems to determine the areas that need to be focused on for improvement.
They include functional integration, approach to forecasting, forecasting systems, and performance measurement. Companies can rank these aspects of their forecasting process within four stages, one being the lowest in quality and four being world-class, and use the methods described in the book to improve those areas that need development. I have seen this framework put into
practice as a member of several University of Tennessee sales forecasting audit teams, and can attest to its use as an excellent way to approach forecasting improvement.

TARGET AUDIENCE

So who should read this book? Moon sees his target audience as business professionals who manage demand-forecasting processes. It is written for practicing managers with the intent of giving practical advice on how to do demand forecasting better. While I would agree, I would expand upon that target audience to include VPs and C-level executives. Many of us who have tried to implement major process and systems changes have seen what happens when upper management gives lip service rather than real support. Without a good understanding of what DSI means and entails, executive-suite personnel will not likely buy into the necessary commitments in time, money, and people to implement an effective DSI process. Without a firm buy-in, DSI will probably not be set up and funded; even if it is, the results will likely be disappointing. This book provides higher-level personnel with the knowledge they need to get behind a DSI implementation: an excellent review of the philosophy behind DSI, what is required to put the process in place, and what firms can expect from implementing it.

Other candidates for reading this book would include salespersons involved in forecasting, production planners, operations-planning managers, finance managers, marketing personnel, and manufacturing managers: anyone, really, who is involved in or affected by DSI. The more people understand its importance and how it can help companies achieve long-term goals, the better the chance that it will be adopted and implemented.

Ed. Note: Is DSI simply a rebranding of S&OP, or is it a substantial change to the way things are done? The author would like to hear your thoughts on this matter. Please feel free to email John Mello at jmello@astate.edu.

FT Press: New Jersey, 2013. ISBN-10: For bulk purchases: corpsales@pearsontechgroup.com and International@pearsoned.com

John E. Mello is Associate Professor of Marketing in the Department of Marketing and Management at Arkansas State University and Foresight's Editor for S&OP. Prior to entering academia, he spent almost three decades in supply chain management positions within the CPG industries. jmello@astate.edu
Keeping Up with the Quants: Your Guide to Understanding + Using Analytics by Thomas H. Davenport and Jinho Kim
Reviewed by John Pope

FOR WHOM

Quantitative analysis is a timely topic, with Big Data concepts reaching a crescendo of popularity these days. But not everyone who has an interest in the outcome of analytics has quantitative know-how. Some may be in non-quantitative roles that depend on others for the heavy analysis work. These are the intended readers of Keeping Up with the Quants.

Now, if you consider yourself quantitative, say because you have been chewing on meaty algorithms for years, then reading this book will be like biting into a marshmallow. But don't let that disappoint you. You may find value here nonetheless. A stroll through the book may help you empathize and communicate with the non-quants. Of course, you might also share the book with non-quant colleagues. I recommend this book for organizations where communication needs to be improved between teams of mixed analytical ability. It may also be handy for business courses in project management where analytics is the focus.

COMMUNICATION OF QUANTITATIVE ANALYSIS

Communication is the book's theme. Indeed, it is a tenet of problem solving that a conceptual and communication infrastructure should exist for analysis to be meaningful. To facilitate this, the authors recommend six steps of analytical thinking within three stages:

Framing the Problem
- Problem recognition and framing
- Review of previous findings

Solving the Problem
- Modeling and variable selection
- Data collection
- Data analysis

Communicating and Acting on Results
- Results presentation and action

Problem solving is not the core of analytical thinking. A solution in a vacuum solves nothing. Problems must be framed correctly to reflect a relevant context. Then the results need to be communicated in a manner which facilitates correct interpretation. In any organization, analytics is not plainly mechanical, but a communications process. You can get started on this process with help from the Worksheet for Solving the Problem in chapter three. From there, you will want to maintain a communications framework to keep analytical work relevant to project goals. As such, there must be effective interaction between analysts and non-quantitative stakeholders, who might include managers, customers, suppliers, and support personnel. Chapter four provides advice for communicating with stakeholders. Most notably, this includes these steps:
- Present results in interesting, comprehensible formats. In place of tables, use color graphics and interactive visual analytics when possible.
- Consider a Q&A meeting rather than a lecture; even a quantitative heavyweight will doze off during a tedious after-lunch barrage of minutiae.
- Engage the stakeholders in discussion using nontechnical language, analogies, and storytelling.
Indeed, the authors' case studies exemplify the storytelling aspect of conveying analytics. To develop communication skills, the authors describe exercises performed at Intel that switch the responsibilities of quants and non-quantitative stakeholders. These facilitate inter-working relationships by having personnel put themselves in the shoes of others. After all, ongoing communication has a lot to do with mutual respect, particularly in firms where specialized engineers spend a great deal of time interfacing with machines rather than people. Chapter seven presents communications tips for managers, which boil down to honing the quantitative analyst's assumptions with challenges to clarification and relevance to the project context. Because mathematics tends to be learned in a context-free format, poor communication with stakeholders may lead to phantom solutions, a situation that can linger as long as technical complexity remains beyond the grasp of the stakeholders. Conversely, if analysis is on target, its misinterpretation by management can mean a lack of conviction to make use of it. THE CASE STUDIES The book's many case studies drive home the importance of communication. While they should enhance one's appreciation of analytics and problem solving, they are not extensive enough for the development of methodological understanding. If only the authors did not suggest otherwise. For example, the book's jacket makes this ambitious claim: This book promises to become your quantitative literacy guide, helping you develop the analytical skills you need right now in order to summarize data, find meaning in it, and extract value. Then, midway into chapter one there is:... [this book] shows how to implement analytics with many real-world cases, and should make you substantially better at understanding analytics yourself and should transform you into someone who can communicate effectively with others about analytical solutions to problems in organizations. While clever analytical investigations are addressed, more method details would be enlightening for quants and non-quants alike. We read that the subject of a case did impressive analysis, but we aren't allowed to assess the results ourselves. To judge the validity of a study, it would be nice to see the system from which a conclusion was derived. Let us assume that the authors did not really intend for us to assess the quality of the cases themselves, but just wanted to demonstrate a kind of best practice with their six-step process. As such, I don't think the cases will boost one's quantitative power. Consider the description of the Medallion hedge fund in chapter three. The authors infer that rigorous methods are used, but do not tell us what they are or to what extent they matter. Consider also the description of the famous Black-Scholes options-pricing model.
While the model was derived with rigorous statistical tests, it rests on flawed assumptions such as risk-free rates, volatility based on arbitrary periods, and a normal movement of prices that is small and random, with rare events considered irrelevant. As we know with investment systems, 99% accuracy is meaningless if annihilation occurs 1% of the time. The practitioners of Black-Scholes at Long Term Capital Management found this out the hard way, at best applying the formula with excessive leverage, and at worst applying even more dubious methods behind the marketing veil of the famous formula. Maybe the authors should have used Black-Scholes as an example of what not to do, say www.forecasters.org/foresight FORESIGHT 39
next to their discussion of AIG's mistaken analysis of credit default swaps. Such discussion would exemplify the danger of ignoring context and of applying erroneous assumptions. It could even be connected with the authors' statistical mirages and fallacies discussion in chapter six. Human nature is subject to a fatal conceit from possessing things that are rare, elegant or impressively complex, like some quantitative methods. Such conceit can lull us to disaster. From Enron to LTCM, from Lehman to AIG, forecasting precision took precedence over forecast relevance, as some successes with impressive quantitative methods fueled hubris and recklessness. USING VS. COMMUNICATING ANALYTICS The book would benefit from its own advice (that results cannot speak for themselves but must be presented in a compelling manner) if it put more emphasis on graphics. The book contains short descriptions for several types of visual analytics, like a bar chart, but without examples. Further, none of the case studies contain example visuals in their Results Presentation and Actions segment. Please, show me, don't tell me. The same goes for the section on model selection, which begs for a flow chart. There is a list of software packages in chapter three, but you have to go to chapter five to get a very general view of their areas of application. It is in no way a buyer's guide. Perhaps the book's subtitle should instead be: Your Guide to Communicating Analytics. The authors describe Hadoop and MapReduce (chapter three) as tools for classifying and filtering data. This is, perhaps, too narrow a description. Hadoop is a technology for setting up and running distributed computing systems, on top of which you can accomplish intensive statistical routines via MapReduce or simple batch processing, like converting reports from doc to pdf. CONCLUSION My recommendation is that this book could benefit organizations where communication needs to be improved between teams of mixed analytical ability. The authors' six steps of analytical thinking are practical rules of thumb and should enhance analytics where teamwork includes non-quantitative individuals. While the book is intended for non-quants, I don't think the authors should downplay the importance of their book for quants. Communication is a two-way street. Perhaps putting a greater emphasis on communication and foregoing any attempt to teach analytics would broaden the book's appeal without increasing its potential for disappointment. Harvard Business Review Press (2013) ISBN-13: 978-1422187258 240 pages List Price: $27 Jack Pope, an economist and system developer at Investment Economics, is engaged in programming and system administration to facilitate action-oriented investment analytics. Pope@InvestmentEconomics.com 40 FORESIGHT Fall 2013
Using Process Behaviour Charts to Improve Forecasting and Decision Making Martin Joseph & Alec Finney Forecasting Principles and Practices PREVIEW Martin and Alec have long studied the application of statistical process control (SPC) concepts to forecasting and planning. The objective of SPC is to distinguish normal variation in the output of a process from a signal that the process is changing and possibly out of control. The authors show 1) how Process Behaviour Charts (PBCs) can be created and used to good effect within S&OP and budgeting by providing context for observed changes in sales, and 2) whether they are signalling that a forecasting problem should be addressed. INTRODUCTION We are presented with data every day. We look at the data for relevance, information, and if we are lucky insight. Our subsequent behaviours and the decisions we make are closely linked to the way we see that information. By adapting a proven technique from manufacturing process control, we can present forecasting and planning data in a more understandable way, show meaningful context, and differentiate between noise and important signals. Our discussion has four parts. The first, Data to Information to Insight, shows the way reports have evolved from simple, tabulated data through time-series presentations which provide some historical context and finally to Process Behaviour Charts (PBCs), which set the boundary conditions for detecting real change (signals) amongst the ever-present noise. From the original applications of PBCs in linear process control, we extend the technique to a trended process making it suitable for much of our work in forecasting and planning. The second part, Control Limits and Signals of Change, shows how to create a PBC and use it to identify significant changes. Part three, Application of PBCs to Forecasting, shows how using PBCs can significantly improve forecast quality and target forecasting resources. The final segment, Application of PBCs to Planning, consists of vignettes describing how PBCs provide focus and aid decision making in an S&OP environment. DATA to INFORMATION to INSIGHT Data to Information Most companies manage their businesses by means of tabular reports, often comparing one month, quarter, or year with the previous period as well as with their internal targets or budget. Figure 1 shows a simplified example: these data are used subsequently for most of our tables and charts. Figure 1. A Tabular Management Report SPLY = same period last year www.forecasters.org/foresight FORESIGHT 41
Key Points Most companies manage their businesses by means of tabular reports, leaving important information upon which decisions need to be taken indistinguishable from the unremarkable. Process Behaviour Charts (PBCs) provide a means to make this distinction. PBCs can be extended from their original applications in quality control to time series of sales histories and forecasts. In this context, PBCs provide boundary conditions for detecting real change (signals) from the ever-present noise or normal variation in product sales. Building upon author Donald Wheeler s procedures for distinguishing signal from noise, we present three vignettes of situations that regularly occur in S&OP meetings and discuss how the PBCs provide a crucial context for deciding whether new actions are needed. Figure 2. Time Series with Trend Figure 3: Example of a PBC These comparisons can mislead an organisation in a variety of ways: Important information upon which decisions need to be taken cannot be distinguished from the unremarkable. The tabular format encourages binary comparisons: when only two data points are considered, historical or future context is ignored. The use of percentage differences can mislead depending on the base the reader s eye is naturally drawn to the largest number. There frequently is no accompanying narrative. While tabular formats are commonplace, most organisations are also familiar with time series, a sequence of data points over time, usually plotted as simple line charts with or without trend as shown in Figure 2. The time plot has clear advantages over the tabular style: it provides context while eliminating the temptation to make binary comparisons. However, it lacks boundary conditions that distinguish real change from background noise. Information to Insight Control Charts, otherwise known as Statistical Process Control Charts, Shewhart Charts, or Process Behaviour Charts, have been in use since the 1920s, particularly in the manufacturing arena. They have been a mainstay of the Six Sigma system of practices, originally developed by Motorola to eliminate process defects, and are latterly closely associated with lean manufacturing approaches. An example with upper and lower control limits is shown in Figure 3. We like the term Process Behaviour Chart as being most descriptive of the application of statistical process control techniques to sales, sales forecasting, and business planning processes. It is frequently human behaviour that introduces bias and the confusion between forecasts and plans and between those plans and targets (Finney & Joseph, 2009). The published works of W. Edwards Deming, Walter Shewhart, and Donald J. Wheeler are familiar in the production setting but not 42 FORESIGHT Fall 2013
in the commercial arena. Wheeler's book Understanding Variation: The Key to Managing Chaos (2000) stimulated our thinking on the applications of statistical process control to forecasting and planning. Here are his key ideas, each of which we apply in this article:
- Data have no meaning apart from their context. PBCs provide this context both visually and in a mathematically honest way, avoiding comparisons between pairs of numbers.
- Before you can improve any system, you must listen to the voice of the process. There is a crucial distinction between noise, which is routine and is to be expected even in a stable process, and signals, which are exceptional and therefore to be interpreted as a sign of a change to the process. The skill is in distinguishing signal from noise, determining with confidence the absence or presence of a true signal of change.
- PBCs work. They work when nothing else will work. They have been developed empirically and are thoroughly proven. They are not on trial.
In manufacturing, PBCs are employed mainly to display the outcomes of a process, such as the yield of a manufacturing process, the number of errors made, or the dimensions of what is produced. In this context, a signal identifies a deviation from a control number and indicates a potential concern. We have found that signals in sales data can indicate real changes in the commercial environment. CONTROL LIMITS AND SIGNALS OF CHANGE Our focus here is on the application of PBCs to the forecasting process and the monitoring of sales. There are some unexpected benefits, too; we will describe these later. The major innovation involves the application of PBCs to trended data. Although doubtless done in practice, we are not aware of any publication covering the systematic application of PBCs to sales, forecasts, and planning in a business setting. Control Limits Where calculating process control limits is concerned, there are several methods described in the literature. We have found that applying the experiential methods described in Wheeler's 2000 book will give organisations a very adequate platform to implement PBCs. We have slightly modified Wheeler's method in order to allow for the trend in sales data. First, average sales are calculated as points on a trend line fit to the sales data, so if the data are trending upward, average sales will reflect that growth. Wheeler calculates moving ranges from the absolute differences between successive sales data points; for example, for monthly data we'd calculate the differences, February minus January, March minus February, and so on. Figure 4 shows this applied to our sales data presented in Figure 1.
Figure 4. Calculation of the Upper and Lower Process Limits
The sequence of these absolute values is the moving range. We then calculate the moving range average by fitting a trend line to the moving range data. Then we calculate upper and lower process limits to represent the range of normal variation to be expected in the process. We use Wheeler's experiential factor of 2.66 (as opposed to others who use 3σ) to calculate
www.forecasters.org/foresight FORESIGHT 43
the upper and lower limits as follows:
Upper Process Limit = Average Sales + (2.66 x Average Moving Range*)
Lower Process Limit = Average Sales - (2.66 x Average Moving Range*)
* We use the average moving range at the start of the trend to avoid the complication of diverging process limits, which in our view only adds unnecessary complexity.
Figure 5. Type 1 Signal: a single data point outside the process-control limits.
Figure 6. Type 2 Signal: three or four out of four consecutive points closer to one of the limits than to the trend.
Figure 7. Type 3 Signal: eight or more successive points falling on the same side of the trend.
Figure 8. Illustrative Historical Sales Analysis
We now have the data in the correct format in the PBC, and have introduced a set of controls that will help us distinguish signal from noise. What we need now is to be able to recognise signals as they appear. Types of Signals The literature also contains many examples of different criteria for identifying signals, but in our experience the ones recommended by Wheeler (Types 1, 2 & 3) work well in practice. Examples of these are shown in Figures 5, 6, and 7. The framework of the PBC is now established, as are the types of signal we need to recognise. Before we can use the PBC as a forecasting tool, however, we need to understand the nature of the back data for all the items we wish to forecast. The Historical Sales Data First, it is necessary to specify what sales behaviour is being evaluated: factory shipments or retail store sales, for example. We then examine the historical sales data in order to establish the current trend and the point at which it began. This analysis may well reveal historical changes to either or both the trend and the limits. As illustrated in Figure 8, identification of these signals enables analysis of the historic sales patterns for any item, product, or brand. A key component of the analysis is an understanding of the stability (the inherent volatility) of the item. Stability Classification The familiar bullwhip effect can introduce drastically different volatility at different stages in the supply chain (Gilliland, 2010). 44 FORESIGHT Fall 2013
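For readers who want to try the calculation described above on their own data, here is a minimal sketch in Python. It is not the authors' software: the straight-line trend fit, the moving range, the use of the average moving range at the start of its own trend, and Wheeler's 2.66 factor follow the description above, but the function and variable names (trended_process_limits, avg_moving_range) and the assumption of a simple monthly sales list are ours for illustration.

import numpy as np

def trended_process_limits(sales):
    """Return (trend, upper, lower) arrays for a sales series, per the method above."""
    sales = np.asarray(sales, dtype=float)
    t = np.arange(len(sales))

    # "Average sales" are points on a straight-line trend fit to the sales data
    slope, intercept = np.polyfit(t, sales, 1)
    trend = intercept + slope * t

    # Moving range: absolute differences between successive sales values
    moving_range = np.abs(np.diff(sales))

    # Average moving range taken at the start of its own trend line, so the
    # process limits stay parallel to the sales trend (see the footnote above)
    _, mr_start = np.polyfit(np.arange(len(moving_range)), moving_range, 1)
    avg_moving_range = mr_start

    # Wheeler's experiential factor of 2.66
    upper = trend + 2.66 * avg_moving_range
    lower = trend - 2.66 * avg_moving_range
    return trend, upper, lower

Calling trended_process_limits on a year or more of monthly sales gives the three lines needed to draw a chart like Figure 3 and to check new data points against the limits.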
How do we define stable? First, we suggest accumulating 12 data points to provide a reliable identification of signals (although Wheeler suggests that useful average and upper and lower limits may be calculated with as few as 5 to 6 data points). We classify items based on the stability as determined by the signals detected:
1. Group 1: Items determined to be stable (all values within the process limits) based on at least 12 data points.
2. Group 2: Items which may be stable (no signals) but not yet proven to be so because we have fewer than 12 data points within the control limits.
3. Group 3: Unstable items (those showing signals within the previous 12 data points).
In many industries, stable items represent the majority of the items (typically 80%) and include commercially important items. Unstable situations result from lack of data (including new products), sporadic data, or genuine rapid changes to product noise or trend caused by changes to the commercial environment. When two or more years of stable data are available, PBCs can also detect seasonal patterns. Forecasts go awry if seasonality is present and not accounted for. We could also create an archive or library of historical trends, rates of change to those trends and, similarly, noise levels, ideally coupled with associated information on cause. APPLICATION OF PBCs TO FORECASTING The three groups of items have to be treated differently for forecast generation. Group 1: Stable Trend Items These items are ideal for automated forecasting, which extrapolates the trend in the best-fitting way. Using PBCs, the trend and control limits can be locked after 12 points and then extrapolated. Only after a signal should they be unlocked and recalculated. Since most organisations have statistical forecasting systems, generating these forecasts is essentially free. If there is no commercial intelligence about the items (for example, no known changes to competitive profile, pricing, or resource levels), then there is no basis for tampering with the forecast. Indeed, such tampering may be wasted effort in that forecast value added is zero or even negative (Gilliland, 2013). Organisations find it irresistible to adjust forecasts, especially of business-critical products, in the light of progress against budgets or targets. Many organisations waste time and scarce resources making minor adjustments to forecasts (Fildes and Goodwin, 2007). With the exception of adjustments for seasonality, there is no forecast value added if the amended forecasts still follow the trend and the individual point forecasts sit within the upper and lower process limits. Group 2: Stable Until Proved Otherwise The approach to these items is essentially the same as for Group 1, except that we recommend a rolling recalculation of the trend and limits until 12 points have been accumulated. This results in periodic adjustments to the limits, but signals are still evident. With the exception of some one-off Type 1 signals, any signal occurring will indicate that items move to Group 3. Group 3: Unstable These are the problem children, from a forecasting point of view. While there are statistical methods that attempt to deal with the problem child (e.g. Croston's method for intermittent data), it is our experience that software suppliers make exaggerated claims about the application of statistical methods to unstable data sets.
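Before turning to other techniques for the unstable group, here is one way the signal rules and the three-group classification above could be automated across an item portfolio. This is a hedged sketch, not the authors' system: it assumes the trended_process_limits function from the earlier sketch, treats the last 12 points as the classification window, and uses illustrative names throughout.

import numpy as np

def detect_signals(sales, trend, upper, lower):
    """Return a list of (signal_type, index) using Wheeler's Types 1, 2 and 3."""
    sales = np.asarray(sales, dtype=float)
    signals = []
    # Type 1: a single data point outside the process limits
    for i, x in enumerate(sales):
        if x > upper[i] or x < lower[i]:
            signals.append(("Type 1", i))
    # Type 2: three (or four) out of four consecutive points closer to a limit than to the trend
    closer_up = np.abs(sales - upper) < np.abs(sales - trend)
    closer_dn = np.abs(sales - lower) < np.abs(sales - trend)
    for i in range(len(sales) - 3):
        if closer_up[i:i + 4].sum() >= 3 or closer_dn[i:i + 4].sum() >= 3:
            signals.append(("Type 2", i + 3))
    # Type 3: eight or more successive points on the same side of the trend
    side = np.sign(sales - trend)
    run = 1
    for i in range(1, len(sales)):
        run = run + 1 if side[i] == side[i - 1] and side[i] != 0 else 1
        if run >= 8:
            signals.append(("Type 3", i))
    return signals

def classify_item(sales):
    """Group 1 = stable, Group 2 = possibly stable, Group 3 = unstable."""
    sales = np.asarray(sales, dtype=float)
    trend, upper, lower = trended_process_limits(sales)  # from the earlier sketch
    window_start = max(0, len(sales) - 12)
    recent = [s for s in detect_signals(sales, trend, upper, lower) if s[1] >= window_start]
    if recent:
        return "Group 3"
    return "Group 1" if len(sales) >= 12 else "Group 2"

Run over a whole catalogue, a routine like this would flag the minority of items that need forecaster attention while leaving the stable majority to automated extrapolation.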
Other techniques such as econometric methods (applied at a brand rather than SKU level) are often needed and are not within the scope of this paper. In the absence of alternative valid forecasting methods, we usually recommend handling the inherent uncertainty of these situations on a tactical basis, for example by holding increased stock. APPLYING PBCs FOR DECISION MAKING Now we are in a position to evaluate forecasts based on the context of sales history, www.forecasters.org/foresight FORESIGHT 45
Figure 9. New Marketing Plan Forecast Figure 10. New Marketing Plan Forecast with Budget and Trend through Historical Sales Figure 11: Tactical Classification for Decision Making with the aid of a library of what changes are reasonable. Evaluating a New Marketing Plan Figure 9 shows a situation in which a product manager has a new marketing plan, the implementation of which he is convinced will increase market share. Using the very best methods available, let s say he produces a forecast that we ll label Most Likely Forecast or MLF. If his projections are correct, we should expect a Type 2 signal (three out of four consecutive points closer to one of the limits than they are to the average) by month 23. Without the control limits to provide context, any departure of actual from trend will tend to elicit a response from the business. There should be no euphoria (or bonuses) if the sales track the existing trend to month 23, as this is within expected noise level! However, if there is material market intelligence projecting that a signal will appear, use a new forecast and monitor closely looking for the expected signal. Adding a Budget or Target Building on Figure 9, things get more interesting if we add a budget or target. We now have a discussion that is informed by historical trend and noise levels, the automatic extrapolative forecast, the forecast assumptions associated with the new marketing plan, and some context of historical trend changes from our reference library. The PBC (Figure 10) can provide the transparency necessary to assess uncertainty in meeting the budget/ target and appropriate acceptance of the level of risk. Businesses often use their budgets to set stretch targets and don t consider the inherent downside risk. Then along comes an ambitious marketing person who sees that sales might be below budget and who also is enthusiastic about the positive 46 FORESIGHT Fall 2013
effect of his new plan (MLF). (We label this as MLF because it s the most likely forecast based on his rather optimistic assumptions!) BRINGING PBCs INTO S&OP AND BUDGETING PBCs have a valuable contribution to make in the Sales and Operations Planning environment as well as in the budget/business review setting. The classification of products into the three groups can help organisations decide tactics. Group 2 (stable until proved otherwise) can be subsumed within Group 1 (stable) until such time as instability is detected. We use the terms Business Critical and Non-critical to represent the importance of the item/brand to the business and consequently when reliable commercial intelligence is likely to be available. Figure 11 offers a simplified tactical classification for S&OP deliberations. Here are three vignettes describing how PBCs provide focus and aid decision making in this simplified S&OP environment. Item One: We had a bad month last month we need to do better. As depicted in Figure 12, by month 17 there were two consecutive months of belowaverage sales. In previous S&OP meetings, this may have led to a search for the culprits, a summary butt-whipping, and a message that implied the need to continue such thrashings until morale improves. Now there is context to understand what is really happening. First, the slightly lower month is within the control boundary conditions; it is not a signal. Second, there is not (at this time) any evidence of a potential trend change. Figure 13 shows what happened to sales in the ensuing months: there was no change to the sales trend, and the item remained stable with no signals! Outcome using PBC: Maintain a watching brief and bring to next meeting. If the numbers are above the mean but within the limits, avoid the conclusion that there is a causal link between the thrashing and the improvement! Item Two: This is a Type 1 signal what shall we do? In trying to understand why the signal occurred, we should first ask if the team knew of any reason for its appearance. It could have resulted from an unexpected (and maybe unforecasted) event like a one-off order or an out-of-stock. Competitor activity could provide the answer. If it were considered to be a singular event, then actions are identified as appropriate. Alternatively, if the signal was considered to be the start of a new trend, then forecasts should be amended to manage the risk associated with this change. Outcome using PBC: The signal provides the basis for a discussion not an unreasoned reaction to a potential change. Figure 12. A Bad Month Figure 13. A Bad Month 2 www.forecasters.org/foresight FORESIGHT 47
Item Three: It looks like the start of a Type 3 signal; do I have to wait for eight data points? If one point appears above the average trend line, then there is no change to the trend: one point cannot constitute a change. If the next point is also above the average trend, then there is a 1-in-2 probability of this happening by chance. If we take this logic all the way to 8 successive points, the chance of such a run occurring when there is no real change is less than 1 in 250. But intervention can take place at any time. The probability that 5 successive points will lie on the same side of the trend when there is no real signal is 1 in 32. Outcome using PBC: PBC has given context, this time about the cost of missing an opportunity to act. But these signals should not always be seen as warnings; they highlight opportunities as well. The outcome in these examples is better, more informed decision making. As Donald Wheeler says, "Process Behaviour Charts work. They work when nothing else will work. They have been thoroughly proven. They are not on trial." And we have shown here that they work equally well when applied to the sales forecasting and business-planning processes.
Martin Joseph and Alec Finney are founders of Rivershill Consulting and former forecasting managers at AstraZeneca. Their previous joint articles in Foresight include "Getting Your Forecasting and Planning Fundamentals Right" (Winter and Spring, 2011) and "The Forecasting Mantra: A Holistic Approach to Forecasting and Planning" (Winter 2009). Alec@Rivershill.com Martin@Rivershill.com
References
Fildes, R. & Goodwin, P. (2007). Good and Bad Judgment in Forecasting: Lessons from Four Companies, Foresight, Issue 8 (Fall 2007).
Finney, A. & Joseph, M. (2009). The Forecasting Mantra: A Holistic Approach to Forecasting and Planning, Foresight, Issue 12 (Winter 2009).
Gilliland, M. (2010). The Business Forecasting Deal, John Wiley and Sons.
Gilliland, M. (2013). Forecast Value Added: A Reality Check on Forecasting Practices, Foresight, Issue 29 (Spring 2013).
Wheeler, D.J. (2000). Understanding Variation: The Key to Managing Chaos, SPC Press (http://www.spcpress.com/).
UPCOMING in Early 2014
A New Foresight Guide: Forecasting Methods Tutorials
Our nontechnical overviews of statistical forecasting methods, enabling business forecasters to make more informed use of their forecasting software:
- Exponential Smoothing
- Intermittent Demand Models
- Box-Jenkins (ARIMA) Models
- Regression Models
- VAR and Econometric Models
- Bayesian Forecasting Methods
- Neural Networks
- A Guide to Delphi
- Systems Approach to Forecasting
- Forecasting Exceptional Demands and Rare Events
- The Boundaries of Statistical Forecasting
48 FORESIGHT Fall 2013
New Directions in Managing the Forecasting Process Chris Gray Preview: Chris Gray, noted S&OP author, summarizes the most important elements of the business world's new focus on proper management of the forecasting process. The primary requirements as he sees them:
- Maintain accountability and transparency
- Compare different forecasting methods
- Recognize that one size does not fit all
- Apply statistical process control
- Broaden forecasting into sales planning
- Document assumptions
- Improve forecastability through improved product design
INTRODUCTION If you grew up in manufacturing in the 1970s, as I did, you may think that for the first few decades of the computer age there had been two common misconceptions about forecasting: First, that it was possible to develop a "right number": that by devising more and more sophisticated and complicated forecasting algorithms, it would be possible to compute the one right number. Second, that it was possible to develop a single technique that would work to forecast all items. We might call this the "tools era" of forecasting. The focus was on building the best models and using the computer to apply them universally across items and product families, in effect eliminating people from the modeling activity. Today, businesses have shifted their focus away from the purely mathematical and statistical hammers and nails of forecasting and toward better management of the forecasting process. Fewer companies think they'll find a mathematical magic bullet that will cure their forecasting ills or eliminate people from the forecasting process. THE BROADER VIEW Most companies now take a more encompassing view: Ensure that the demand-related processes of the company, including forecasting, don't violate the basic system principles of accountability and transparency. In the end, computers cannot be accountable for the results of forecasting and demand-management business processes; that responsibility resides with people. Computer logic should support these important activities, but it should not attempt to replace people through automation, or confuse them by undermining the basic accountabilities of those individuals involved in the business process. www.forecasters.org/foresight FORESIGHT 49
Preserve the transparency of the system. Without the ability to understand where the numbers are coming from or how they were developed, people will resist being held accountable. Given the choice between two forecasting methods producing the same results where one is simple and the other complex, it is generally better to use the simpler method. As Önkal and Gönül (2005) said in their article in the inaugural issue of Foresight, an important factor in creating user confidence is the forecaster's ability to thoroughly explain the forecasting method and the justification for choosing it. Use the computer for evaluating and recommending a forecasting method, rather than simply producing an optimized result based on a single arbitrary method. For example, in many kinds of manufacturing, instead of seeking a single result from an optimization algorithm, it makes more sense to use the massive data manipulation capability of today's computers to evaluate the behavior of various forecasting strategies to see which, if any, would have worked best in the recent past. In the best-designed systems, people can review the recommendations of the forecasting system as to which would work best, choosing the specific technique that makes the most sense for a specific item or family (a simple sketch of such a comparison appears below). Recognize that one size does not fit all when it comes to forecasting methods. As Stellwagen and Tashman (2013) noted in a recent article in Foresight, forecasters should switch between different methods as appropriate, rather than taking a one-size-fits-all approach. Forecasting methods that may make sense for manufacturing enterprises supplying parts to other manufacturers often don't work well at all for companies managing large-scale distribution networks with store-level sales data. In high-volume retail supply-chain situations with millions of SKU/store combinations, for example, specialized forecasting methods (including ones for handling low-volume intermittent sales) along with flowcasting (DRP-like) methodologies to the distribution centers and manufacturing supply points may be better for getting a more accurate picture of demand. Take advantage of analytical tools originally developed as part of statistical process control (SPC). The tools of SPC promise to illuminate the normal behavior of demand, to identify where statistical forecasting is appropriate (or inappropriate), to suggest when changes to the forecast are warranted (and not), to help identify and eliminate bias in the forecast, and to monitor and manage the accuracy of forecast demand. In their article in this issue of Foresight (see page 41), Martin Joseph and Alec Finney explain how control charts, which were popularized in industry because of their application to monitoring manufacturing processes, can be used to monitor demand processes. They describe how control charts can distinguish normal variation from a signal that the process is changing and is possibly out of control, how different types of signals can mark shifts in demand behavior versus normal noise, and how control charts can help limit the amount of unwarranted forecast tampering.
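To make the earlier point about evaluating and recommending a forecasting method concrete, here is a minimal sketch of the kind of comparison a system might run. It is illustrative only and not a description of any particular package: the three candidate methods, the six-period holdout, the mean-absolute-error score, and names such as rank_methods are all assumptions chosen for brevity.

import numpy as np

def naive(history, horizon):
    # Repeat the most recent observation
    return np.repeat(history[-1], horizon)

def moving_average(history, horizon, window=3):
    # Repeat the average of the last few observations
    return np.repeat(np.mean(history[-window:]), horizon)

def trend_projection(history, horizon):
    # Straight-line extrapolation of the history
    t = np.arange(len(history))
    slope, intercept = np.polyfit(t, history, 1)
    future = np.arange(len(history), len(history) + horizon)
    return intercept + slope * future

def rank_methods(series, holdout=6):
    """Score each candidate on the last `holdout` points it was not fitted to."""
    series = np.asarray(series, dtype=float)
    train, test = series[:-holdout], series[-holdout:]
    candidates = {"naive": naive, "moving average": moving_average,
                  "trend projection": trend_projection}
    scores = {name: np.mean(np.abs(method(train, holdout) - test))
              for name, method in candidates.items()}
    return sorted(scores.items(), key=lambda item: item[1])  # best first

A planner would review such a ranking rather than accept it blindly, which keeps accountability with people while the computer does the bookkeeping.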
Broaden forecasting into sales planning. The experience of operating effective systems suggests that the keys to better demand numbers include coupling forecasting techniques to good human judgment, formulating plans for the specific actions needed to achieve the numbers, and clearly defining who will be held accountable. Forecasts are estimates of future demand, in terms of quantities and timing. These raw estimates, which are purely quantitative, must be complemented by judgment and accountability in order to be effective. As John Dougherty and I wrote in our book 50 FORESIGHT Fall 2013
Sales and Operations Planning Best Practices (2006), the statistical forecast is based heavily on past history. As long as the future is going to be much like the past, then everything works well. But usually changes in the product line, the customer base, the competition, promotion plans, the economy, and so on make the future quite different from the past. It s the job of people, using their innate intelligence and their knowledge of current conditions and the expected future outlook, to adjust the statistics and establish the best forecast possible. People formulate sales plans defining the specific activities that must occur in the future to achieve company targets and goals, and the market activities required to realize the forecasted demands. This also includes processes to document the assumptions that went into the original numerical data as well as those associated with activities and assignments. Sales planning involves supplementing original forecast numbers the raw estimates from above with human judgment about external factors, such as: Market-development activities Promotions Product portfolio adjustments and newproduct introductions Pricing changes Collaboration with key accounts to gain market intelligence about likely future demand not reflected in history Account plans for specific customers Product placement considerations Trade-show plans These, along with the existing customer order backlog, specific assignments and accountabilities for work that must be done, and the documentation of the major assumptions that went into the final demand numbers become the sales plan for the product or products. Document assumptions. Every forecast makes some assumption about the numbers that were developed. These could be gross assumptions such as the future will be like the past, the best method for forecasting future demand is the method that would have worked best in the recent past, we ll have normal weather this summer, or housing starts will increase by x%. Or they could be much more granular, like pricing changes will give x% lift over the baseline demand, new product A at price point P will cannibalize sales of product B by Y% per period once it is introduced nationwide, or the top 6 customers will increase their purchases by Z units because of our targeted pricing. When the real orders appear and are different from forecast, as they inevitably are, the questions management will ask are: Why is actual demand different from forecast? Was our forecast flawed or is this just normal variability? Did we make bad assumptions that might explain the variability? Did the key elements of our sales plan affect demand differently from what we anticipated? Without having written down the basic assumptions that went into the original forecast, it s difficult to make much sense of the difference between forecast and actual. In retrospect, everything that happened makes sense. But write down the assumptions and then evaluate the actual demands in light of those assumptions, and you have some hope of learning something about the nature of your demand and what levers you have that are actually effective. In The Forecaster as Leader of the Forecasting Process (Foresight, 2007), Borneman discusses the importance of these postmortems in the pharmaceutical industry: The final step, and one not consistently applied in many companies, is performance analysis, a monthly review of the variances between actual and forecasted outcomes. 
Here we attempt to allocate a variance across the major assumptions in the forecast: How much of the variance is due to departures from our assumptions about market size, market share, dosing, inventory, and pricing? Use improved product design and supply-chain management methods to reduce the need for forecasting or eliminate it entirely. www.forecasters.org/foresight FORESIGHT 51
Here are some common methods, each of which can be employed alone and in combination with the others: In cases where your customer has an effective planning and scheduling system, get close enough to use its output to reduce or replace your internal item-level forecasts. And irrespective of the quality of the customer s planning system or the geography of your supply chain, involve all the supply-chain partners in your product design, supply-chain design, and inventory and capacity planning processes so you can respond rapidly to demand changes. In cases where shortening supply-chain lead times, both internally and externally, is possible, make a transition to a make-to-order or finish-to-order fulfillment model. This is especially important when there is a high degree of commonality of parts and assemblies at lower levels of your product s bill of material. Moving away from the level of entangled features (the finished-product level) to the level of disentangled options/modules/ components will allow you to forecast a smaller number of items with more accuracy. Ensure that sensible strategies for inventory and capacity exist across the entire supply chain. Determine where you and your supply-chain partners can best hold buffer inventory or buffer capacity as a way to absorb demand variability. Increase manufacturing flexibility to enable short-term schedule changes to respond to shifts in demand from the customers. Use strategic safety stocks at appropriate stages in the manufacturing process to Chris Gray is coauthor, with John Dougherty, of Sales and Operations Planning Best Practices, as well as influential books on MRP II/ERP, Lean, and software functionality for planning, scheduling, and management. cgray@grayresearch.com reduce cumulative lead times (and hence the forecasting horizon), and to increase flexibility to respond to forecast error. Plan for safety capacity and/or use selective overplanning to accommodate variability as well as shifts in demand. SUMMARY Ultimately, the answer to the problem of managing forecasts and demand has much less to do with the statistical techniques than with appropriate use of the computer to assist people in decision making, accountability, and transparency; use of appropriate analytical tools to understand the nature of demand and how it may be changing; analysis of sensible forecasting and sales-planning processes; documenting assumptions; and sensible product and supply-chain design. In the bibliography, you ll find some useful recent writings on these subjects, some of which were referred to above. Perhaps most exciting: we have not yet scratched the surface on the kinds of improvements we can make to forecasting and demand management. The last word on these subjects is far from being written. REFERENCES Borneman, J. (2007). The Forecaster as Leader of the Forecasting Process, Foresight, Issue 7 (Summer 2007), 41-44. Deschamps, E. (2005). Six Steps to Overcome Bias in the Forecast Process, Foresight, Issue 2 (October 2005), 6-11. Dougherty, J. (2012). Dealing With Inaccurate Forecasts, Retrieved from Partners For Excellence Web Site: http://www.partnersforexcellence.com/ newsv4.htm Dougherty, J. & Gray, C. (2006). Sales and Operations Planning: Best Practices, Trafford Publishing. Landvater, D. & Gray, C. (1989). The MRP II Standard System, New York: John Wiley and Sons. Mello, J. (2009). The Impact of Sales Forecast Game Playing on Supply Chains, Foresight, Issue 13 (Spring 2009), 13-22. Önkal, D. & Gönül, M S. (2005). 
Judgmental Adjustment: A Challenge for Providers and Users of Forecasts, Foresight, Issue 1 (June 2005), 13-17. Stellwagen, E. & Tashman, L. (2013). ARIMA, the Models of Box and Jenkins, Foresight, Issue 30 (Summer 2013), 28-33. 52 FORESIGHT Fall 2013