Measuring Program Effectiveness: an introduction to the Performance Dashboard
for the Ontario Literacy Coalition
January 19th, 2012
Dr. Alan C. Middleton

Objectives
1. To introduce the Performance Dashboard as a tool for checking the effectiveness of programs
2. To describe how the Dashboard works and its measures
3. To use Marketing and Marketing Communications Dashboards as examples
4. To indicate how to develop a Program Dashboard
Ancestry of the Scorecard and Dashboard: the Balanced Scorecard
A strategic management system for executing strategy. Its implementation provides measurement of critical management processes that are linked systematically to achieving objectives.
Four equal and interlinked measures: Financial, Customer, Business Processes, and Learning and Growth.

Ancestry of the Dashboard: Performance Dashboards
Performance Scorecards and Dashboards are used in many aspects of business: operations, HR, sales, marketing and marketing communications, brand management, etc., and in tracking programs.
Program Dashboard
The dashboard on a car, boat or plane contains the instruments that report the condition of the external environment and of the vehicle, and the vehicle's direction, condition and speed. In the same way, a Program Dashboard contains the key information needed to guide and maintain the direction and condition of the organization's marketing programs.
It holds the key measures that need to be tracked regularly to determine the contribution of marketing activity, and translates the organization's strategy into objectives, metrics, initiatives and tasks customized to each group in the organization.

Program Dashboard Roles:
- Monitor: tracks business processes and activities and triggers alerts when performance falls below targets (see the sketch below)
- Analyze: identifies the causes of problems by mapping multiple perspectives
- Detail management: steers people through shared information and seeks fact-based causal relationships
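As an illustration of the Monitor role above, here is a minimal sketch (not from the deck) of how a dashboard could flag metrics that fall below target; the metric names, targets and values are purely hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Metric:
    name: str       # what is being tracked
    target: float   # the level the program aims for
    actual: float   # the most recent observed value

def monitor(metrics):
    """Return an alert for every metric whose actual value falls below its target."""
    return [
        f"ALERT: {m.name} is at {m.actual}, below target {m.target}"
        for m in metrics
        if m.actual < m.target
    ]

if __name__ == "__main__":
    # Illustrative data only.
    sample = [
        Metric("awareness (%)", target=60, actual=52),
        Metric("media value ($000)", target=150, actual=180),
    ]
    for alert in monitor(sample):
        print(alert)
```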
The Marketing Dashboard
It is intended as a graphic map that shows an observable or modeled relationship between effort and objective, without:
- simplistic attempts at ROI
- assumptions of linear relationships
- stalling measurement at interim metrics such as intent to buy, customer satisfaction, etc., even if they are drivers
It is not intended to answer specific questions about the effectiveness of individual marketing activities.

The Marketing Dashboard
The key is in its causal mapping focus: Input -> Intermediary (Driver) metrics -> Output/End Objectives
- Input: Marketing activity and spending, e.g. pricing, distribution, MarCom SOV, etc.
- Intermediary (Driver) metrics: Brand Metrics - awareness, associations, intent to buy, frequency/volume/share of purchase, net promoter score, etc.
- Output/End Objectives: Business & Brand - revenue, margin, brand share, brand value, etc.
The MarCom Dashboard
The key is also in the specific set of metrics chosen for each of the driver and output measures: Input -> Intermediary (Driver) metrics -> Output/End Objectives
- Input: MarCom activity and spending
- Intermediary (Driver) metrics: Brand Metrics - awareness, associations, advertising associations; MarCom Metrics - e.g. clicks, fans, followers, views, engagement, etc.
- Output/End Objectives: Business & Brand - revenue, margin, reputation, brand value, share, etc.
(Data shown is illustrative)
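To show how this Input -> Driver -> Output chain can be captured as data, here is a minimal sketch; the CausalMap class and the metric lists simply echo the illustrative slide above and are not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class CausalMap:
    """One causal chain on the dashboard: Input -> Driver metrics -> Output objectives."""
    input_activity: str
    driver_metrics: list = field(default_factory=list)
    output_objectives: list = field(default_factory=list)

# Illustrative only: metric names mirror the MarCom slide above.
marcom = CausalMap(
    input_activity="MarCom activity and spending",
    driver_metrics=["awareness", "associations", "advertising associations",
                    "clicks", "fans", "followers", "views", "engagement"],
    output_objectives=["revenue", "margin", "reputation", "brand value", "share"],
)

print(marcom.input_activity, "->", ", ".join(marcom.driver_metrics),
      "->", ", ".join(marcom.output_objectives))
```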
Creating the Cross Marketing View
(Chart; data shown is illustrative)
Discovering Marketing Impacts: Applying your Hypothesis
The Program Dashboard, e.g. ABC Life Literacy Family Literacy Day
(January 27th, since 1999: read 15 minutes/day to children)
- Input: Program activity and spending (e.g. Journey to Learning passport)
- Intermediary (Driver) metrics: Program Metrics - awareness, associations, # of sponsors, media value
- Output/End Objectives: Business & Brand - # of people reading 15 minutes/day; # of people doing it regularly; any researched relationship with ongoing reading ability

The Program Dashboard, e.g. ABC Life Literacy LEARN
Since 1994, Yellow Pages LEARN or LookUnderLearn.ca has allowed Canadians to quickly and easily access contact information for their local literacy organization.
- Input: Program activity and spending: listing of local literacy organizations across Canada
- Intermediary (Driver) metrics: Program Metrics - # of potential learners calling; # of potential learners viewing the ad and feeling encouraged to call; public awareness of and respect for the initiative; support derived from knowledge of the initiative
- Output/End Objectives: Business & Brand - # of potential learners who took literacy programs
Dashboard Stages
Level 1: limited collection of mostly financial reports based on spending activity (Inputs)
Level 2: the addition of occasional Interim (driver) data, but not linked to Input activity or Output objectives
Level 3: regular collection of Input and Interim data, but still project-focused
Level 4: consistent tracking of Input and Interim data linked to Output metrics: internal and external data used to record and graph past results in a clear and consistent manner
Level 5: use of this full data set to do predictive modeling that is graphed on the Dashboard (see the sketch below)

Dashboard Development
Levels 1-3 Dashboard: learn which measures are important for you and which align most closely to your organizational objectives; observe the interconnection between them
Levels 4-5 Dashboard: model and review the interconnections
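As a minimal sketch of what Level 4-5 modeling could look like, assuming a hypothetical quarterly history of one driver metric and one output metric (the numbers are invented for illustration), a simple least-squares trend can be fitted and used to project the output:

```python
import numpy as np

# Hypothetical quarterly history: a driver metric (awareness, %) and an
# output metric (people reading 15 minutes/day, in thousands). Invented data.
awareness = np.array([40, 45, 48, 52, 55, 60], dtype=float)
readers = np.array([12, 14, 15, 17, 18, 21], dtype=float)

# Level 4: record and graph past results; Level 5: fit a simple trend line
# linking driver to output and use it to project forward.
slope, intercept = np.polyfit(awareness, readers, deg=1)
projected = slope * 65 + intercept  # projected readers if awareness reaches 65%

print(f"fitted trend: readers = {slope:.2f} * awareness + {intercept:.2f}")
print(f"projected readers at 65% awareness: {projected:.1f} thousand")
```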
Getting Started: a Phased Approach
Phase I: Form a Program Measurement Task Force (PMTF): multi-disciplinary
Phase II: Assemble all current and 3-year measures and activity (listed as Dashboard measures); if data is missing, plan to collect the key data
Phase III: Review the data and develop hypotheses about cause/effect relationships (see the sketch below)
Phase IV: Determine the organization's overall objectives and align program objectives to them
Phase V: Determine which measures align best as drivers of the overall objectives; begin to track these regularly

Work in Progress
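As one way to start Phase III, here is a minimal sketch, assuming a hypothetical five-year history of program spend and participation (invented numbers); it simply checks how strongly the two series move together, which suggests, but does not prove, a cause/effect hypothesis.

```python
import numpy as np

# Hypothetical history assembled in Phase II: program spend (Input) and the
# number of people taking literacy programs (Output), by year.
spend_000s = np.array([110, 120, 135, 150, 160], dtype=float)
participants = np.array([950, 1010, 1100, 1180, 1230], dtype=float)

# A first, crude Phase III check: how strongly do the two series move together?
# Correlation is not causation -- it only points to a hypothesis worth testing.
corr = np.corrcoef(spend_000s, participants)[0, 1]
print(f"spend vs. participants correlation: {corr:.2f}")
```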
Hints and Horrors
Like the Balanced Scorecard, the Dashboard will overwhelm you with numbers in the initial stages until you discover:
- which are the key drivers
- what is required at different organizational levels
Input measures will be more available than driver or output measures, but to get the Dashboard operating effectively they must be matched, and must have sufficient data points to create trend lines.
Consistent and aligned output measures are the key to success in moving beyond Level 3.
The KISS principle applies, but only after the complexity has been understood.

Conclusions
The advantage of a dashboard is:
- It measures the key activities and results
- It shows these graphically over time in a consistent manner
- It forces a systematic approach: input -> interim (driver) metrics -> output (results)
- You learn the interconnection between the activities measured: it may not be linear, and you may not know its exact nature, but you think the model through and construct it
- You then measure those things consistently over time, discuss the feedback loops, and adjust the model accordingly
Net, it will help make program evaluation better and your decisions more fact-based.