Supporting Effective Evaluations: A Guide to Developing Performance Measurement Strategies




6.0 Performance Measurement Strategy Framework

6.1 Overview of the Performance Measurement Strategy Framework

The PM Strategy Framework identifies the indicators required to monitor and gauge the performance of a program. Its purpose is to support program managers in:

- continuously monitoring and assessing the results of programs, as well as the efficiency of their management;
- making informed decisions and taking appropriate, timely action with respect to programs;
- providing effective and relevant departmental reporting on programs; and
- ensuring that the information gathered will effectively support an evaluation.

Program managers should consult with heads of evaluation on the selection of indicators to help ensure that the indicators selected will effectively support an evaluation of the program. See Section 2.3 of this guide for more information on the roles and responsibilities of program managers and heads of evaluation.

6.2 Content of the Program Performance Measurement Strategy Framework

Table 3 summarizes the major components of the PM Strategy Framework. The framework should include the program's title as shown in the departmental PAA, as well as the PAA elements that are directly linked to the program, i.e. the program activities, subactivities and/or sub-subactivities. It should also include the program's outputs and its immediate and intermediate outcomes (as defined in the logic model), along with one or more indicators for each output and outcome.
For each indicator, provide:

- the data source(s);
- the frequency of data collection;
- baseline data;
- targets and timelines for when targets will be achieved;
- the organization, unit and position responsible for data collection; and
- the data management system used.

Table 3: Sample Table for the Performance Measurement Strategy Framework

Column headings: PAA Elements Linked to the Program; Program outputs and outcomes; Indicator; Data source; Frequency; Baseline; Target; Date to achieve target; Organization and position responsible for data collection; Data management system.

Sample rows (rows may be added below for additional outputs):

- Output 1: Indicator 1
- Output 2: Indicator 2; Indicator 3
- Outcome 1: Indicator 4; Indicator 5

- Outcome 2: Indicator 6; Indicator 7; Indicator 8 (rows may be added below for additional outcomes)

6.3 Performance Measurement Frameworks and the Performance Measurement Strategy Framework

The Policy on Management, Resources and Results Structures (MRRS) requires the development of a departmental Performance Measurement Framework (PMF), which sets out the expected results and the performance measures to be reported for programs identified in the PAA. The PMF is intended to communicate the overarching framework through which a department will collect and track performance information about the intended results of the department and its programs. The indicators in the departmental PMF are limited in number and focus on supporting departmental monitoring and reporting.

The PM Strategy Framework is used to identify and plan how performance information will be collected to support ongoing monitoring of the program and its evaluation. It is intended to more effectively support both day-to-day program monitoring and delivery and the eventual evaluation of the program. Accordingly, the PM Strategy Framework may include expected results, outputs and supporting performance indicators beyond the limits established for the expected results and performance indicators in the MRRS PMF (see Figure 2 below). Unlike the MRRS PMF, the PM Strategy Framework has no imposed limit on the number of indicators that can be included; however, successful implementation of the PM Strategy is more likely if indicators are kept to a reasonable number.

Figure 2: Comparison of Requirements: Performance Measurement Frameworks and Performance Measurement Strategies

As illustrated in Figure 2, the indicators in the PM Strategy Framework focus on supporting ongoing program monitoring and evaluation activities and therefore align with and complement the indicators included in the departmental PMF.
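As a concrete aside on Table 3: for teams that maintain the PM Strategy Framework in software rather than a spreadsheet, the per-indicator fields map naturally onto a simple record. The following Python sketch is illustrative only; the class name, field names and sample values are assumptions, not anything prescribed by this guide.

```python
from dataclasses import dataclass

@dataclass
class IndicatorEntry:
    """One row of a PM Strategy Framework table (illustrative field names)."""
    paa_element: str    # PAA element linked to the program
    result: str         # output or outcome being measured
    indicator: str      # the performance indicator itself
    data_source: str    # where the data come from
    frequency: str      # e.g. "monthly", "quarterly", "annual"
    baseline: str       # starting point for comparison
    target: str         # level of performance to be achieved
    target_date: str    # date by which the target should be achieved
    responsible: str    # organization, unit and position responsible
    data_system: str    # data management system used

# A hypothetical row, echoing the guide's written-complaints example.
framework = [
    IndicatorEntry(
        paa_element="Program Activity 1.1",
        result="Output 1",
        indicator="Number of written complaints received",
        data_source="Program administrative database",
        frequency="quarterly",
        baseline="120 complaints in the baseline year",
        target="Fewer than 100 complaints",
        target_date="2012-03-31",
        responsible="Program directorate, senior analyst",
        data_system="Departmental case management system",
    ),
]

# Every output and outcome in the logic model should appear at least once.
results_covered = {entry.result for entry in framework}
```

Keeping one record per indicator makes it straightforward to check, for example, that every output and outcome in the logic model carries at least one indicator.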
In instances where the program is shown as a distinct program in the PAA and indicators have been identified in the departmental PMF, the PM Strategy Framework should include, at a minimum, the indicators reported in the departmental PMF. When a PM Strategy is developed for a new program that is not represented in the departmental

PAA, the outcomes, outputs and related indicators developed for the PM Strategy Framework should be considered for inclusion in the departmental MRRS. [10]

6.4 Accountabilities and Reporting

The PM Strategy Framework should be accompanied by a short text that describes:

- the reporting commitments and how they will be met, including who will analyze the data, who will prepare the reports, to whom they will be submitted, by when, what information will be included, the purpose of the reports and how they will be used to improve performance; and
- if relevant, the potential challenges associated with data collection and reporting, as well as mitigating strategies for addressing these challenges (e.g. there may not be a system that can be used for data management).

6.5 Considerations when Developing the Performance Measurement Strategy Framework

The steps below provide guidance on how to develop the PM Strategy Framework. [11]

1. Start with the MRRS PMF: Review the MRRS PMF that the department developed in accordance with the Policy on MRRS. Include, as appropriate, the performance indicators from the MRRS PMF in the PM Strategy Framework. The PM Strategy Framework should not be developed in isolation: in accordance with the Policy on MRRS, all programs represented in a department's PAA must contribute to its strategic outcome(s). As such, the performance indicators in the PM Strategy Framework should complement those already established in the departmental PMF.

2. Identify performance indicators: Develop or select at least one performance indicator for each output and each outcome (immediate, intermediate and ultimate) identified in the program logic model. Keep in mind that, in addition to day-to-day program monitoring, the performance indicators will also be used for evaluation purposes. As such, it is recommended that program managers also consider the core issues [12] for evaluation (i.e. relevance and performance) and consult with heads of evaluation when developing the performance indicators.

There are two types of indicators: quantitative and qualitative. Quantitative performance indicators are composed of a number and a unit: the number indicates the magnitude (how much) and the unit gives the number its meaning (what), e.g. the number of written complaints received. Qualitative indicators are expressed in expository form, e.g. an assessment of research quality. As much as possible, qualitative indicators should be condensed into a rating scale, e.g. research quality rated as "excellent," "average" or "below average"; this allows for comparability over time.

Choosing appropriate performance indicators is essential for effectively evaluating and monitoring a program's progress. Appropriate indicators are characterized as follows:

- Valid: the indicators measure what they intend to measure.
- Reliable: the data collected would be the same if collected repeatedly under the same conditions at the same point in time.
- Affordable: cost-effective data collection (and analysis) methods can be developed.
- Available: data for the indicators are readily and consistently available to track changes in the indicator.
- Relevant: the indicator clearly links back to the program outcomes.

Keep the number of performance indicators to a manageable size: a small set of good performance indicators is more likely to be implemented than a long list of indicators.

3. Identify data sources: Identify the data sources and the system that will be used for data management. There are a number of possible data sources:

- Administrative data: information that is already being collected through program files or databases, or that could be collected with adjustments to regular programming processes or funding agreements;
- Primary performance data: specialized data collection exercises, such as focus groups, expert panels or surveys, in which information needs to be collected; and
- Secondary data: information that has been collected for other purposes, such as national statistics on health or economic status (e.g. Statistics Canada data).

Use readily available information, and take advantage of any sources of information at your disposal.

4. Define frequency and responsibility for data collection: Identify the frequency of data collection and the person(s) or office responsible for the activity. Describe how often performance information will be gathered; depending on the performance indicator, it may make sense to collect data on an ongoing, monthly, quarterly, annual or other basis.

When planning the frequency and scheduling of data collection, an important factor to consider is management's need for timely information for decision making. In assigning responsibility, it is important to take into account not only which parties have the easiest access to the data or sources of data, but also the capacities and systems that will need to be put in place to facilitate data collection.

5. Establish baselines, targets and timelines: Performance data can only be used for monitoring and evaluation if there is something to which they can be compared. A baseline serves as the starting point for comparison. Performance targets are then required to assess the relative contribution of the program and to ensure that appropriate information is being collected.

Baseline data for each indicator should be provided by the program when the PM Strategy Framework is developed. Often, and particularly for indicators related to higher-level outcomes, this information will have already been collected by program managers as part of their initial needs assessment and to support the development of their business case.

Targets are either qualitative or quantitative, depending on the indicator. Some sources and reference points for identifying targets include:

- baseline data based on past performance or previous iterations of the program;
- the achievements of other, similar programs that are considered leaders in the field (benchmarking);
- generally accepted industry or business standards; and
- publicly stated targets (e.g. set by the government or the federal budget).

If a target cannot be established, or if the program is unable to establish baseline data at the outset of the program, explicit timelines for when these will be established, as well as who is responsible for establishing them, should be stated.
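Step 5's point that performance data are only meaningful against a baseline and a target can be made concrete for quantitative indicators: progress is often read as the share of the baseline-to-target gap that has been closed. A minimal sketch, in which the function name and the sample figures are illustrative assumptions:

```python
def progress_toward_target(baseline: float, target: float, current: float) -> float:
    """Fraction of the baseline-to-target gap closed so far.

    0.0 means no movement from the baseline; 1.0 means the target is met.
    """
    gap = target - baseline
    if gap == 0:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / gap

# Example: a baseline of 120 complaints, a target of 100, and 110 observed
# this year: half of the planned reduction has been achieved.
print(progress_toward_target(baseline=120, target=100, current=110))  # 0.5
```

The same formula works whether the target lies above or below the baseline, since the sign of the gap cancels in the division.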

6. Consult and verify: As a final step, program managers should consult [13] with heads of evaluation and other specialists in performance measurement to validate the performance indicators and confirm that the resources required to collect data are budgeted for. Consultation with other relevant program stakeholders (e.g. information management personnel) is also encouraged.

This step complies with the Directive on the Evaluation Function, which holds heads of evaluation responsible for reviewing and providing advice on the PM Strategies for all new and ongoing direct program spending, including all ongoing programs of grants and contributions, to ensure that they effectively support an evaluation of relevance and performance (Section 6.1.4, subsection a). It is highly recommended that program managers work with the head of evaluation to ensure that the performance indicators selected will provide adequate data to support an evaluation of the program. The choice of consultation mechanism remains at the discretion of the program manager and head of evaluation.

Key questions to be considered during the consultation and verification stage include:

- Does the selection of indicators align with and complement the indicators in the departmental PMF?
- Are the indicators valid measures of the outputs and outcomes (i.e. do they measure what they intend to measure)?
- Are the indicators reliable (i.e. would the data recorded be the same if collected repeatedly under the same conditions at the same point in time)?
- Is it likely that the required data will be provided in a timely manner?
- Is the cost reasonable and clearly budgeted for?

If the answer to any of the above questions is "no," adjustments need to be made to the indicators or to the resources allocated for implementation of the PM Strategy.

Implementation

The full benefits of the PM Strategy Framework can only be realized when it is implemented.
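One small aid to implementation is automating the mechanical portion of the step 6 verification: flagging indicator records that are missing a data source, baseline, target or budgeted collection cost. The sketch below uses illustrative field names and rules of my own choosing; judgments about validity, reliability and alignment still require the expert consultation described above.

```python
def verify_indicator(entry: dict) -> list[str]:
    """Return the verification concerns raised by an indicator record.

    Covers only the mechanically checkable parts of the step 6 questions;
    a non-empty result corresponds to at least one "no" answer.
    """
    concerns = []
    for required in ("data_source", "baseline", "target", "target_date"):
        if not entry.get(required):
            concerns.append(f"missing {required}")
    if not entry.get("collection_budgeted", False):
        concerns.append("data collection cost not budgeted")
    return concerns

complete = {"data_source": "annual survey", "baseline": "35%", "target": "50%",
            "target_date": "2013-03-31", "collection_budgeted": True}
incomplete = {"data_source": "annual survey", "target": "50%"}

print(verify_indicator(complete))    # []
print(verify_indicator(incomplete))  # baseline, target date and budget concerns
```

Any flagged concern signals that the indicator, or the resources allocated to it, needs adjustment before the PM Strategy is implemented.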
[14] As such, program managers should ensure that the necessary resources (financial and human) and infrastructure (e.g. data management systems) are in place for implementation. Program managers should begin working at the design stage of the program to create the databases, reporting templates and other supporting tools required for effective implementation. Following the first year of program implementation, they should review the PM Strategy to ensure that the appropriate information is being collected and captured to meet both program management and evaluation needs.

Keep in mind that, according to the Directive on the Evaluation Function, the head of evaluation is responsible for "submitting to the Departmental Evaluation Committee an annual report on the state of performance measurement of programs in support of evaluation" (Section 6.1.4, subsection d). Given that the PM Strategy Framework represents a key source of evidence for this annual report, its implementation is especially important.

Date Modified: 2010-09-29