Ten Mistakes to Avoid When Creating Performance Dashboards

Wayne W. Eckerson

Wayne W. Eckerson is the director of research and services for TDWI, a worldwide association of business intelligence and data warehousing professionals. Eckerson is the author of many in-depth reports, a columnist for several business and technology magazines, and a noted speaker and consultant. He is the author of Performance Dashboards: Measuring, Monitoring, and Managing Your Business, published by John Wiley & Sons in October 2005. weckerson@tdwi.org

Foreword

In 2004, I began research for a book titled Performance Dashboards: Measuring, Monitoring, and Managing Your Business (John Wiley & Sons, 2005). It took many hours of thought, dozens of interviews, and thousands of words to piece together the puzzle of dashboards and scorecards in a way that provides a clear and complete picture without distorting the perceptions people currently have about these systems. In highly abridged form, here is what I came up with: dashboards and scorecards are part of a larger performance management system, which I call a performance dashboard, that enables organizations to measure, monitor, and manage business performance more effectively. A performance dashboard is more than just a screen with fancy performance graphics on it: it is a full-fledged business information system built on a business intelligence and data integration infrastructure. A performance dashboard is very different from plain dashboards or scorecards. The latter are simply visual display mechanisms that deliver performance information in a user-friendly way, whereas performance dashboards knit together the data, applications, and rules that drive what users see on their screens.

Business Intelligence Journal, Vol. 11, No. 1
1. Failing to Apply the Three Threes

Every performance dashboard appears and functions differently. People use many different terms to describe performance dashboards, including portal, BI tool, dashboard, scorecard, and analytical application. Each of these contributes to a performance dashboard, but none is a performance dashboard by itself. Here is my definition:

A performance dashboard is a multilayered application built on a business intelligence and data integration infrastructure that enables organizations to measure, monitor, and manage business performance more effectively.

This definition conveys the idea that a performance dashboard is more than just a screen populated with fancy performance graphics: it is a full-fledged business information system designed to help organizations optimize performance and achieve strategic objectives. Despite the wide variation among performance dashboards, each shares three basic characteristics, the "three threes," as I call them. (If they don't, they are imposters that will not deliver long-lasting business value.)

Three Applications. Each performance dashboard contains three applications woven together in a seamless fashion: a monitoring application, an analysis application, and a management application. Each application provides a specific set of functionality delivered through a variety of means. Technically speaking, the applications are not necessarily distinct programs (although sometimes they are), but sets of related functionality built on an information infrastructure, designed to fulfill user requirements to monitor, analyze, and manage performance. (See Mistake 3 for more details on each application.)

Three Layers. Perhaps the most distinctive feature of a performance dashboard is that it consists of three views or layers of information: a monitoring layer, an analysis layer, and a detailed information layer.
Just as a cook might peel the layers of an onion, a performance dashboard lets users peel back layers of information to get to the root cause of a problem. Each successive layer provides additional details, views, and perspectives that enable users to understand a problem better and identify the steps they need to take to address it. (See Mistake 4 for more details on each layer.)

Three Types. The last thing you need to know about performance dashboards is that there are three major types: operational, tactical, and strategic. Each applies the three applications and layers described above in slightly different ways. The three threes is a shorthand way to remember the key features of a performance dashboard when you are evaluating commercial products or building your own.

2. Overrating the Importance of Dashboards and Scorecards

Most people (even me!) use the terms dashboard and scorecard to refer to different types of applications for delivering performance information. In reality, dashboards and scorecards are simply visual display mechanisms within a performance management system that convey critical performance information at a glance. They are not the application or system in and of themselves! The real system, the performance dashboard or performance management system, is more than just a monitoring layer, as discussed in Mistake 1.

|         | Dashboard                | Scorecard                    |
|---------|--------------------------|------------------------------|
| Purpose | Measures performance     | Charts progress              |
| Users   | Supervisors, specialists | Executives, managers, staff  |
| Updates | Right-time feeds         | Periodic snapshots           |
| Data    | Events                   | Summaries                    |
| Display | Visual graphs, raw data  | Visual graphs, text comments |
A good performance dashboard should be able to deliver either a dashboard or scorecard interface, since both do the same thing: display the status and trending of key performance indicators (KPIs). The primary difference between the two is that dashboards monitor the performance of operational processes, whereas scorecards chart the progress of tactical and strategic goals.

3. Failing to Deliver Three Applications

As mentioned, a performance dashboard contains three applications that make it easier for users to access, analyze, and act on information. The monitoring application conveys critical information at a glance using timely and relevant data, usually with graphical elements; the analysis application lets users analyze and explore performance data across multiple dimensions and at different levels of detail to get at the root cause of problems and issues; and the management application fosters communication among executives, managers, and staff, and gives executives continuous feedback across a range of critical activities, enabling them to steer their organizations in the right direction. Performance dashboards that don't support these three applications force users to rely on other tools or people to obtain insights or take action.

|            | Monitoring                                                   | Analysis                                                                                 | Management                                                              |
|------------|--------------------------------------------------------------|------------------------------------------------------------------------------------------|-------------------------------------------------------------------------|
| Purpose    | Conveys information at a glance                              | Lets users analyze exception conditions                                                  | Improves alignment, coordination, and collaboration                     |
| Components | Dashboard, scorecard, BI portal, right-time data, alerts, agents | Multidimensional analysis, time-series analysis, reporting, scenario modeling, statistical modeling | Meetings, strategy maps, annotation, workflow, usage monitoring, auditing |

4. Failing to Deliver Three Layers

The problem with most dashboards and scorecards today is that they are flat; that is, they don't contain the multiple layers of information that users often need to get to the root cause of a problem or issue. A layered approach gives users self-service access to information and conforms to the way most prefer to interact with information: (1) monitor, (2) analyze, and (3) examine. That is, most business users first want to monitor key metrics for exceptions; then explore and analyze information that sheds light on those exceptions; and finally, examine detailed data and reports before taking action. By starting at high-level views of information and working down, this layered approach helps users get to the root cause of issues quickly and intuitively.

| Layer  | View                      | Purpose                                     | Display                             | Technology                                |
|--------|---------------------------|---------------------------------------------|-------------------------------------|-------------------------------------------|
| Top    | Graphical summarized views | Monitor key performance metrics             | Graphical indicators, numbers, text | Dashboards, scorecards, portals           |
| Middle | Multidimensional view     | Explore information from multiple dimensions | Interactive charts and tables       | OLAP, interactive reports                 |
| Bottom | Transactional view        | Examine details before taking action        | Table or report in separate window  | Operational reports, data warehouse queries |
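The monitor-analyze-examine flow can be sketched in code. The following is a minimal illustration, not a reference implementation: the data, metric names, target, and dimension are all invented, and an in-memory list stands in for the BI and data integration infrastructure.

```python
# Sketch of the three-layer drill-down: monitor, then analyze, then examine.
# All data, metric names, and thresholds here are invented for illustration.

# Bottom layer: transactional detail (in practice, a data warehouse).
orders = [
    {"region": "East", "product": "A", "revenue": 120, "returned": False},
    {"region": "East", "product": "B", "revenue": 40,  "returned": True},
    {"region": "West", "product": "A", "revenue": 200, "returned": False},
    {"region": "West", "product": "B", "revenue": 30,  "returned": True},
]

def monitor(target=400):
    """Top layer: one summarized KPI checked against a target."""
    total = sum(o["revenue"] for o in orders)
    status = "green" if total >= target else "red"
    return {"kpi": "total_revenue", "value": total, "status": status}

def analyze(dimension):
    """Middle layer: break the KPI down along one dimension (OLAP-style)."""
    breakdown = {}
    for o in orders:
        breakdown[o[dimension]] = breakdown.get(o[dimension], 0) + o["revenue"]
    return breakdown

def examine(**filters):
    """Bottom layer: the detail rows behind an exception."""
    return [o for o in orders
            if all(o[k] == v for k, v in filters.items())]

kpi = monitor()
if kpi["status"] == "red":            # exception spotted at the top layer
    by_region = analyze("region")     # drill into the middle layer
    worst = min(by_region, key=by_region.get)
    detail = examine(region=worst)    # examine transactions before acting
```

Each function plays the role of one layer: the user starts at `monitor`, drills into `analyze` only when an exception appears, and pulls `examine` detail only for the offending slice.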
5. Failing to Create the Right Type of Performance Dashboard

Many people think that performance dashboards are operational in nature and always deliver real-time information. Not true. There is a wide spectrum of performance dashboards, which I've grouped into three categories based on the way they leverage the three applications described in Mistake 1. Operational dashboards track core operational processes using real-time or right-time data and emphasize monitoring more than analysis or management; tactical dashboards track departmental processes and projects and emphasize analysis more than monitoring or management; and strategic dashboards monitor the execution of strategic objectives and emphasize management more than monitoring or analysis. Strategic dashboards are often implemented using the balanced scorecard methodology and are referred to loosely as scorecards.

An organization can and should have multiple versions of each type of performance dashboard, but it shouldn't try to make one performance dashboard support the functionality contained in all three types, since each type requires a slightly different technical architecture and application functionality.

|             | Operational              | Tactical           | Strategic                   |
|-------------|--------------------------|--------------------|-----------------------------|
| Purpose     | Monitor operations       | Measure progress   | Execute strategy            |
| Users       | Supervisors, specialists | Managers, analysts | Executives, managers, staff |
| Scope       | Operational              | Departmental       | Enterprise                  |
| Information | Detailed                 | Detailed/summary   | Detailed/summary            |
| Updates     | Real-time or intraday    | Daily/weekly       | Monthly/quarterly           |
| Emphasis    | Monitoring               | Analysis           | Management                  |
| GUI         | Dashboard                | Portal             | Scorecard                   |

6. Falling Prey to Glitz

It's easy to sell performance dashboards because they demo well to information-hungry executives. Executives love the graphical gauges and speedometers that flicker with real-time data; the stoplights and color-coding that instantly show performance against plans and forecasts; and the alerts delivered to mobile devices. But glitz is deceptive. Without a real information infrastructure underneath, the dashboard or scorecard will be short-lived because it won't deliver real insights in right time.

Moreover, a glitzy interface can actually become a hindrance. The best performance dashboards tone down the interface so that it conveys only the most relevant and important information. All extraneous graphical designs or features that don't contribute to the user's understanding of the data must be eliminated. Stoplights, gauges, and thermometers that look like the actual objects should be turned into colored dots or lines that convey the same information more efficiently. Basically, the interface should be clean, not cluttered, and should display a maximum of seven KPIs. If more KPIs are needed, they should be grouped under separate tabs or drill-down pages.

7. Building a Dashboard with a Lightweight Architecture

The problem with many performance dashboards is that they have minimal information infrastructure. In essence, they are simply spreadsheets with a pretty face that some manager has automated using the latest technology gadget. I call these quickie dashboards because the vendors hawking these tools usually tout how quickly customers can deploy them. While time to deploy is important, quickie dashboards usually don't address the long-term requirements that provide lasting value to an organization. A performance dashboard needs to align with an organization's business architecture and run on a robust BI analytic environment and data integration platform that
supports multiple methods of accessing and delivering information residing in multiple, heterogeneous systems. Of course, there are exceptions to this rule. Some strategic dashboards (such as balanced scorecards) don't require much data, and the data they do require, at least initially, doesn't exist in any automated system and must be manually tabulated and loaded.

[Figure: The performance dashboard framework spans three domains. Stakeholders: investors, board, workforce, customers, suppliers, regulators. Business strategy: mission, vision, SWOT, objectives, plans, strategy map, tactics; knowledge assets (financial assets, people, process, technology, projects); and metrics (leading, lagging, diagnostic). Information technology: displays (dashboard, BI portal, scorecard); applications (monitoring, analysis, management); data stores (low-latency data store, data warehouse, MDB, data mart, documents); integration (custom API, EAI, ETL, EII, manual); and data sources (legacy systems, packaged apps, Web pages, files, surveys, text).]

8. Delivering Real-time Data without Context

Operational dashboards and some tactical dashboards provide users with a current view of operational processes. Gauges and dials flicker as events occur and new data is displayed on the screen. Or, each time users access the dashboard, all the elements are refreshed with the most up-to-date information. Unfortunately, these dashboards rarely provide enough context to allow users to make useful decisions. Users need to see the historical context for an event to understand whether the event is good, bad, or indifferent, and what action to take. Even operational dashboards require historical information to put operational events in context. Ideally, the historical data is incorporated into the same table or chart as the real-time or right-time data or, secondarily, displayed as an adjacent table or chart. But putting right-time data in context is not easy; it requires a robust data integration infrastructure that pulls data from multiple sources in right time and blends it with historical data from a data warehouse.
Many commercial dashboard products maintain a repository of data for each metric so they can display time-series charts without having to query a data warehouse, which would bog down performance.
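The per-metric repository pattern described above can be sketched as follows. This is a hedged illustration, not any vendor's actual design: the metric name, window size, and two-sigma rule are invented, and historical values would in practice be seeded from the data warehouse.

```python
# Sketch: give a right-time reading historical context using a small
# per-metric repository, as many dashboard products do. The metric name,
# window size, and threshold are illustrative assumptions.
from collections import deque
from statistics import mean, stdev

class MetricRepository:
    """Keeps a rolling window of readings for each metric."""
    def __init__(self, window=100):
        self.window = window
        self.history = {}

    def record(self, metric, value):
        self.history.setdefault(metric, deque(maxlen=self.window)).append(value)

    def in_context(self, metric, value, sigmas=2.0):
        """Classify a new reading against the metric's own history."""
        past = self.history.get(metric, ())
        if len(past) < 2:
            return "no-context"   # too little history to judge good vs. bad
        m, s = mean(past), stdev(past)
        if s == 0:
            return "normal" if value == m else "unusual"
        return "unusual" if abs(value - m) > sigmas * s else "normal"

repo = MetricRepository()
for v in [100, 102, 98, 101, 99]:   # historical loads from the warehouse
    repo.record("orders_per_minute", v)
print(repo.in_context("orders_per_minute", 180))  # -> unusual
```

The point of the sketch is the blending step: without the recorded history, the reading 180 is just a flickering number; with it, the dashboard can say whether the event is unusual and worth drilling into.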
9. Failing to Design Effective KPIs

A metric measures business activity, while a key performance indicator (KPI) measures business performance in the context of predefined targets and goals. To define effective KPIs, organizations need to go beyond just interviewing users to collect requirements, which is what they do when defining metrics. Effective KPIs:

- Are aligned with strategic objectives. Even if the KPIs populate an operational dashboard, they should reflect the strategy of the organization, or at least try to influence the KPIs in the dashboard at the level above them.

- Are easy to understand. The purpose of KPIs is to translate strategic objectives into measures that workers can influence through their actions. To do this, KPIs need to be simple to understand and few in number so workers can focus on meeting the implied goals and targets.

- Foretell future performance. The best KPIs foretell future performance while there is still time for workers to take action to affect the outcome. These types of KPIs are known as leading indicators, as opposed to the lagging indicators that populate the majority of reports and performance dashboards today. True leading indicators are difficult to create.

- Reinforce each other. KPIs can't be designed in isolation; otherwise, there's a good chance they may undermine each other. For example, minimizing stockouts in a retail store and minimizing inventory in its distribution warehouse are conflicting goals.

10. Failing to Apply KPIs Correctly

Once KPIs are designed, organizations often make the mistake of not applying them in an optimal manner. KPIs by themselves won't change people's behavior; people need to do this. Thus, to get optimal impact from KPIs, organizations need to:

- Assign ownership. Every KPI needs an individual, not a group, to own it and be held accountable for its outcomes.
This person may represent a group of people working to influence the metric, but in the end, it's best if an individual's name is associated with each KPI.

- Empower workers. The organization needs to assign individuals (not groups) to be accountable for each KPI on a performance dashboard.

- Vet KPIs before attaching incentives. Most organizations rush to attach bonuses or other compensation to KPIs, and the results can be catastrophic. Before linking KPIs to pay, the organization must make sure each KPI is fair and balanced and has the support of the majority of workers whose pay will be affected by its outcomes.

- Revise KPIs periodically. KPIs are not perfect when deployed; they need to be tweaked as workers and executives discover nuances about the data, the process, or the calculation that they hadn't anticipated. In addition, the half-life of KPIs is short. Over time, KPIs lose their ability to change behavior, so they need to be refreshed periodically.
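The distinction drawn in Mistakes 9 and 10 (a KPI is a metric plus a target, thresholds, and an accountable owner) can be sketched as a small data structure. All names, numbers, and the amber threshold below are invented for illustration.

```python
# Sketch of a KPI as a metric plus target, thresholds, and a named owner,
# per Mistakes 9 and 10. All names and numbers are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    owner: str              # a named individual, not a group (Mistake 10)
    value: float            # current reading of the underlying metric
    target: float           # the predefined goal that makes it a KPI
    warn_pct: float = 0.90  # assumed rule: amber below 90% of target

    def status(self):
        """Stoplight status of the current value against the target."""
        if self.value >= self.target:
            return "green"
        if self.value >= self.warn_pct * self.target:
            return "amber"
        return "red"

on_time = KPI("on_time_delivery", owner="J. Smith", value=0.87, target=0.95)
print(on_time.status())
```

Without the `target` field, `on_time_delivery` is merely a metric; it is the target (and the owner accountable for it) that turns the number into a performance indicator someone can act on.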