A Capability Model for Business Analytics: Part 3 Using the Capability Assessment



The first article of this series presents a capability model for business analytics, and the second article describes a process and provides a tool for analytic capability assessment. Capability assessment is interesting, as indicated by the sustained popularity of Carnegie Mellon's SEI Capability Maturity Model. Interesting, however, is not enough. As anyone who is involved in analytics knows, measurement without action is pointless. This third and final article of the series takes the next step: from an interesting assessment to informative and actionable measures of analytic capability.

A Quick Review of Analytic Capability Assessment

The model that provides the foundation for assessment is based upon the SEI model from Carnegie Mellon. That model describes six levels of process capability:

Level 0: Incomplete
Level 1: Performed
Level 2: Managed
Level 3: Defined
Level 4: Quantitatively Managed
Level 5: Optimized

To apply the model to analytics we examine two kinds of analytic processes: the processes that create analytics and the processes that use analytics to understand, gain insights, and make decisions about the business. The processes for creating and using analytics are evaluated for each of seven common kinds of business analysis: performance, behavioral, predictive, causal, discovery, textual, and geospatial. For a detailed explanation of the analytic capability model, see the first article in this series.

Capability assessment uses the model as a structure for data collection. Fourteen data points (two process types for each of seven kinds of analysis) make for a self-assessment tool that is fast and light. The response for each data point is made by selecting the best fit among a predefined set of descriptive phrases. Quantified assessment is achieved by assigning each phrase a numerical value that corresponds with the SEI capability levels. This method of analytic capability assessment is illustrated in Figure 1. Quality and accuracy of the assessment are, of course, influenced by the participants in the process and the care and consideration given to each response. For a detailed description of the assessment process, see the second article of the series.
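To make the scoring mechanics concrete, here is a minimal sketch in Python of how fourteen best-fit responses could be turned into numbers. The descriptive phrases, the phrase-to-level mapping, and the simple averaging are illustrative assumptions for this sketch; the actual spreadsheet defines its own wording and formulas.

    # A minimal sketch of the scoring mechanics described above.
    # The phrases and the simple average are illustrative assumptions,
    # not the exact wording or formulas of the assessment spreadsheet.

    ANALYSIS_TYPES = ["performance", "behavioral", "predictive", "causal",
                      "discovery", "textual", "geospatial"]
    PROCESS_TYPES = ["create", "use"]

    # Each best-fit phrase maps to an SEI-style capability level (0 through 5).
    PHRASE_TO_LEVEL = {
        "not done at all": 0,          # Level 0: Incomplete
        "done informally": 1,          # Level 1: Performed
        "planned and tracked": 2,      # Level 2: Managed
        "standardized process": 3,     # Level 3: Defined
        "measured and controlled": 4,  # Level 4: Quantitatively Managed
        "continuously improved": 5,    # Level 5: Optimized
    }

    def score_assessment(responses):
        """Convert (analysis, process) -> phrase responses into scores."""
        scores = {key: PHRASE_TO_LEVEL[phrase] for key, phrase in responses.items()}
        aggregate = sum(scores.values()) / len(scores)  # aggregate score, 0-5
        return scores, aggregate

    # Example: start every data point at "done informally", then record two.
    responses = {(a, p): "done informally"
                 for a in ANALYSIS_TYPES for p in PROCESS_TYPES}
    responses[("performance", "create")] = "standardized process"
    responses[("performance", "use")] = "measured and controlled"

    scores, aggregate = score_assessment(responses)
    print(f"aggregate capability score: {aggregate:.1f}")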

Assessment with Meaning

Every analyst knows that measures alone can't tell a story. To derive meaning from measurement requires comparative context. Consider, for example, that you know a vehicle is traveling at a speed of 60 miles per hour. It is impossible to distinguish good news from bad news given that single measure. Is the vehicle an automobile on the highway? Is it an automobile in a school zone? A runaway bicycle on a steep slope? Or a jet airliner at an altitude of 35,000 feet? The point is that you must know what speed is desirable or appropriate (a target value) before you can determine if 60 mph is a number that indicates a need for action.

At first glance, the assessment shown in Figure 1 appears to be a case of really bad news: a score of 2.1 on a scale of zero to five. That's barely above forty percent, and most certainly a failing grade! The news, however, may not be as bad as it seems. There may, in fact, be good news in this assessment. Two fundamental errors are made in the leap to the conclusion that 2.1 is a failing score.

The first interpretive error occurs by using the top of the five-point scale as the basis for comparison, and by assuming that 5 is the target value. Not every organization needs to have level 5 analytic capabilities. Thus the comparative basis should not be what is possible (the top of the scale) but what is needed.

The second error occurs by looking only at the aggregate score. Not every organization needs top-of-scale analytic capabilities, and very few need to be at level 5 for every category of analysis. When examined on a row-by-row basis, the assessment shows a substantial score of 3.5 for performance analysis: a clear statement that this organization has defined performance analysis processes. A closer look shows that use of performance analytics is at level 4; use of performance analytics is quantitatively managed in this organization. Quantitatively managed performance certainly seems like good news. But is it really? Once again, it is difficult to know, because the basis for comparison is an external scale with no direct connection to the needs of the organization. Perhaps the capabilities far exceed the needs, creating analytics cost from which no value is derived.

Another row-by-row look highlights some apparent bad news. The very low 0.5 score for textual analysis looks particularly weak, and it bears strong influence on the relatively low aggregate score. Yet if there is no need for text analytics, the capability level is appropriate to the needs, and a cursory look at the low aggregate score may be misleading. Figure 2 shows the corresponding needs assessment for the organization. The ability to compare current capabilities with needs makes the assessment much more informative.

To find real meaning in this assessment we must know what is needed. A complete analytic capability assessment must collect data about both current capabilities and needed capabilities, and must provide the basis to perform capability gap analysis.

Assessment with Purpose

With all three assessment functions (current capabilities, needed capabilities, and gap analysis) we now have the essential elements to measure and evaluate with purpose. The purposes for which this tool is designed are:

- To confidently know your current level of analytic capabilities.
- To develop a well-reasoned consensus view of your need for analytic capabilities.
- To understand the gap between analytic capabilities and needs.
- To gain insights that help you make plans to close the gap.

The assessment data shown in Figures 1 and 2 (the responses and scoring of both capabilities and needs) is the input to gap analysis. But the data isn't easily analyzed because it is not organized for comparison. The gap analysis tab of the spreadsheet reorganizes the data to support visual comparison of capabilities and needs. Figure 3 shows a tabular view that helps to compare the numbers. This table takes attention away from the aggregate scores by not displaying them. (Remember that too much attention to the aggregate score causes errors in interpreting the assessment.) The table is organized to suggest row-by-row and column-by-column analysis. Nothing new is found by examining the first two sets of columns; they simply restate the capabilities and needs numbers shown in the earlier tables.
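The reorganization itself is mechanical. As a sketch, the side-by-side comparison (including the gap columns discussed next) can be computed directly from the two assessments. The scores below are illustrative stand-ins for a few rows of Figures 1 and 2, not the actual assessment data.

    # Sketch: side-by-side gap analysis, mirroring the gap analysis tab.
    # All numbers are illustrative stand-ins, not the actual assessment data.

    capability = {
        "performance": {"create": 3.0, "use": 4.0},
        "causal":      {"create": 2.5, "use": 1.5},
        "discovery":   {"create": 2.4, "use": 2.0},
        "textual":     {"create": 1.0, "use": 0.0},
    }
    need = {
        "performance": {"create": 5.0, "use": 5.0},
        "causal":      {"create": 3.5, "use": 3.5},
        "discovery":   {"create": 3.4, "use": 2.0},
        "textual":     {"create": 3.0, "use": 1.0},
    }

    # Gap = needed capability minus current capability, per process type.
    print(f"{'analysis':<12} {'create gap':>10} {'use gap':>8}")
    for analysis in capability:
        create_gap = need[analysis]["create"] - capability[analysis]["create"]
        use_gap = need[analysis]["use"] - capability[analysis]["use"]
        print(f"{analysis:<12} {create_gap:>10.1f} {use_gap:>8.1f}")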

The interesting data appears in the third set of columns, where gaps are quantified for each row. Examining the table, I can quickly observe that:

- For all seven types of analysis, the needs exceed the capabilities.
- For most of the analysis types, a moderately wide gap of 1.5 exists.
- Discovery analysis and geospatial analysis have narrower gaps of 0.5 and 1.0 respectively.
- There is no significant difference between the width of the gap for creating analytics and the gap for using analytics.

The logical conclusion from these observations is that some action should be taken to close the gap between needs and capabilities. Obviously the answer is to increase capabilities. The hard questions for planning are "Which capabilities to increase?" and "Where to begin closing the gap?" To answer these questions it helps to visualize the gap. The gap analysis charts shown in Figure 4 help with visualization.
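Charts like those in Figure 4 are straightforward to reproduce with any plotting library. Here is a brief matplotlib sketch of a paired bar chart; the row-level scores are made-up values chosen only to echo the scores discussed in the text.

    # Sketch: paired bars comparing current and needed capability per row.
    # Row-level scores are made-up values echoing those discussed in the text.
    import matplotlib.pyplot as plt
    import numpy as np

    types = ["performance", "behavioral", "predictive", "causal",
             "discovery", "textual", "geospatial"]
    capability = [3.5, 2.5, 2.0, 2.0, 2.2, 0.5, 2.0]
    need = [5.0, 4.0, 3.5, 3.5, 2.7, 2.0, 3.0]

    x = np.arange(len(types))
    width = 0.38  # bar width; two bars per analysis type

    fig, ax = plt.subplots(figsize=(9, 4))
    ax.bar(x - width / 2, capability, width, label="Current capability")
    ax.bar(x + width / 2, need, width, label="Needed capability")
    ax.set_xticks(x)
    ax.set_xticklabels(types, rotation=30, ha="right")
    ax.set_ylim(0, 5)  # SEI capability levels run 0 through 5
    ax.set_ylabel("Capability level")
    ax.set_title("Analytic capability vs. need (illustrative data)")
    ax.legend()
    plt.tight_layout()
    plt.show()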

The visual perspective brings new observations, adding to the understanding gained by looking at the gap analysis table. Now we see that:

- Performance analysis is the area of greatest need, and it does have a significant gap between needs and capabilities. The performance analysis gap is wider for creating analytics than for using analytics: the capability to use performance analytics exceeds the capacity to produce them.
- Predictive analysis is another area of significant need, particularly for processes to create predictive analytics.
- Behavioral analysis and causal analysis both have relatively high levels of need and somewhat lower levels of capability.
- In general, the gap for creating analytics is wider than the gap for using analytics.
- Causal analysis has the widest gap of all analytic types for using analytics, and is the one instance where the using gap is wider than the creating gap.
- Discovery analysis, textual analysis, and geospatial analysis have relatively low levels of need and relatively narrow gaps.
- There is no gap between needs and capabilities for using discovery analytics.
- The organization has no capability for using text analytics.

The twelve observations gained through gap analysis provide good insight into the state of analytic capabilities. Having examined the gap at the level of columns, rows, and visual comparisons, we can now return briefly to aggregate scores. The aggregate score for current capability is 2.1, as shown in Figure 1, and the aggregate score for needed capability is 3.4, as shown in Figure 2. The aggregate gap, then, is 1.3: more than half the value of current capabilities. We seek to increase analytic capabilities by approximately 60% over their current level. Now we have a sense of the real magnitude of the overall gap.

From Assessment to Action

Once the assessment has helped to gain insight, the work to be done is that of setting priorities, making decisions, and making plans. Many variables beyond the assessment data, of course, go into that process. But we can follow the example a bit further using some assumptions. Let's assume that:

- Analysis areas with the greatest need are given highest priority.
- Areas with the widest gaps are to be addressed first and most aggressively.

Working with these assumptions, a plan to increase analytic capability should have as its highest priority improved capability to create performance analytics and to create predictive analytics. Second-priority improvements include the ability to use predictive and causal analytics, and to create behavioral analytics. The near-term plan, then, has five distinct goals:

1. Improved capability to create performance analytics.
2. Improved capability to create predictive analytics.
3. Improved capability to use predictive analytics.
4. Improved capability to use causal analytics.
5. Improved capability to create behavioral analytics.

The goals describe what needs to be accomplished. And they are measurable goals: the assessment process has already defined the measures and set the targets.
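One way to make those two assumptions repeatable is to encode them as a sort: rank each create/use data point by need, break ties by gap width, and ignore gaps too small to matter. The threshold, the tie-breaking rule, and the scores below are all illustrative assumptions for this sketch, not the spreadsheet's logic.

    # Sketch: ranking improvement candidates under the planning assumptions
    # (greatest need first, widest gap as tie-breaker). Illustrative only.

    capability = {
        "performance": {"create": 3.0, "use": 4.0},
        "predictive":  {"create": 1.5, "use": 2.5},
        "causal":      {"create": 2.5, "use": 1.5},
        "behavioral":  {"create": 2.5, "use": 3.0},
    }
    need = {
        "performance": {"create": 5.0, "use": 4.5},
        "predictive":  {"create": 3.5, "use": 3.5},
        "causal":      {"create": 3.5, "use": 3.5},
        "behavioral":  {"create": 4.0, "use": 4.0},
    }

    MIN_GAP = 1.0  # assumed threshold: ignore gaps narrower than one level

    candidates = []
    for analysis, levels in capability.items():
        for process in ("create", "use"):
            gap = need[analysis][process] - levels[process]
            if gap >= MIN_GAP:
                candidates.append((analysis, process, need[analysis][process], gap))

    # Highest need first; ties broken by widest gap.
    candidates.sort(key=lambda c: (c[2], c[3]), reverse=True)

    for rank, (analysis, process, need_level, gap) in enumerate(candidates, start=1):
        print(f"{rank}. {process} {analysis} analytics (need {need_level}, gap {gap})")

With the actual assessment data in place of these stand-ins, the same sort yields a first draft of a priority list that judgment can then adjust.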

The next stages of planning address how to achieve the goals. Here you'll explore available resources, which likely include training, technology, collaboration, and consulting. Making the right choices is very specific to your organization. Using analytic capability assessment, you can be sure that you are making informed choices.

You can download the assessment spreadsheet from the Business Analytics Collaborative Learning Center.