Performance Measurement: A Reference Guide
Performance Measurement: A Reference Guide

What gets measured gets done.
If you don't measure results, you can't tell success from failure.
If you can't see success, you can't reward it.
If you aren't rewarding success, you're probably rewarding failure.
If you can't see success, you can't learn from it.
If you can't recognize failure, you can't correct it.
If you can demonstrate results, you can win public support.

A Reference for Ministries
March 2005
Queen's Printer for Ontario. Please do not reproduce this document without the permission of the Queen's Printer for Ontario.
Acknowledgements

Kevin Perry, Sheila DeCuyper, Barbara Adams and Godfrey Edwards at Management Board Secretariat would like to thank all of those involved in the development of the Performance Measurement Guide.

Nilam Bedi, Nancy Elliot, Sam Erry, Rob Glaister, Rian Pienaar, Sara Sandhu, Simon So, Michel Theoret, Scott Tyrer, Henric Wiegenbroeker, and Bev Wolfus formed the inter-ministerial committee who kindly read the early drafts and helped to define the structure and contents of the final product.

A number of programs offered examples of their work in performance measurement for the appendices. Thanks to Geoff Steers of the OPP for the article on the PRISM system, Bohdan Wynnycky for the description of the Municipal Performance Measurement Program and Bev Wolfus for sample program logic models. Our thanks to Russ Obonsawin and Simon Trevarthan of the Ministry of Finance for ensuring consistency between the guide and Modern Controllership's performance measurement training.

A number of people took the time to provide detailed feedback on the guide as part of our consultation with CAOs across government. Thank you to Richard Clarke, Angela Faienza, David Field, Chris Giannekos, John Kerr, Robert Montgomery, Sheila McGrory, Mercedes Nolan, Anne Premi, Elizabeth Rogacki, Len Roozen, Deborah Ross, Carol Townsend, Lisa Watson and Kim White.

Thanks to central agency staff who generously contributed their expertise and creativity to make the guide interesting and to align the framework offered in the guide with the direction of government. Deputy Ministers Kathy Bouey and Carol Layton and Assistant Deputy Minister Peggy Mooney provided direction and encouragement at each stage of development.

Special thanks to the members of the Performance Measurement and Evaluation Network, without whose experience developing performance measures and systems across government the guide could not have been written.

We hope that you will find the guide useful in your ongoing work to develop results-based management practices across the Ontario Public Service.
Table of Contents

Executive Summary
- Performance measurement is the method for demonstrating results
- Performance Measures: Six Steps to Creating Performance Measures
- Logic Model
- Meaningful Measures
- Performance Targets
- Reporting Performance
  o Internal reporting
  o Public Reporting
- Measures Checklist
- Conclusion

SECTION 1: The Big Picture
- Key Terms
- Context: Results-Based Management
- Performance Measurement as part of Results-Based Management
- Performance Measurement Systems
- Reporting Performance Measures
- Performance Measurement Terminology
- Summary of Key Points
- Checklist

SECTION 2: The Who, What and Why of Performance Measurement
- Key Terms
- What is a Performance Measure?
- Why Measure Performance?
- Who Uses Performance Measurement Information and Why?
- Levels and Types of Measurement
- Summary of Key Points
- Checklist

SECTION 3: How to Develop Performance Measures
- Key Terms
- Overview
- Developing Performance Measures
  o STEP 1: Define the strategy in relation to government priorities
  o STEP 2: Identify and consult on cross-ministry and horizontal initiatives
  o STEP 3: Identify the performance measures
  o STEP 4: Check the measures for pitfalls
  o STEP 5: Establish baselines
  o STEP 6: Set performance targets
- Summary of Key Points
- Checklist

SECTION 4: Reporting Performance Measures
- Key Terms
- Introduction
- Canadian Comprehensive Auditing Foundation / La Fondation canadienne pour la vérification intégrée (CCAF/FCVI) Principles for Reporting

Appendix 1: Performance Measurement Resources in the Ontario Public Service; Other Resources
Appendix 2: Glossary of Performance Measurement Terms (Introduction, Definitions, References)
Appendix 3a: Examples of How Ontario is Using Performance Measurement
Appendix 3b: Municipal Performance Measurement Program
Appendix 4: Outputs, Outcomes and High-Level Indicators
Appendix 5: Sample Logic Models
Appendix 6: Step-by-Step Instructions for Creating a Logic Model
Appendix 7: Measures Checklist
Appendix 8: Example of How Benchmarking Can Be Used
Executive Summary

This guide describes the Ontario Public Service's approach to performance measurement and explains how performance information is used in decision-making and business planning.

Performance measurement is the method for demonstrating results.

The diagram below illustrates the relationship of performance measurement to other parts of the results-based management process. The focus is on ministry strategies for meeting government priorities or serving another important public interest. Performance measures for these strategies demonstrate the contribution that their results make to government priorities.

The results achieved at the activity level, in the form of outputs or short-term outcomes, will be used to support project plans, quarterly reporting and Management Board of Cabinet and Cabinet policy submissions. However, the latter submissions should also be supported by evidence that demonstrates intermediate-term outcomes.

The performance measures that ministries report centrally will be a combination of:
- measures related to public reporting; and
- intermediate-level outcome measures that demonstrate the contributions of ministry strategies to meeting government priorities or serving other important public interests.

Figure 1: Performance measurement is a key to results-based management. [Diagram: government priorities and results at the top, supported by public results measures and high-level indicators; ministry strategies, supported by intermediate-term outcome measures; agency, ministry and broader public sector activities, supported by outputs, short-term outcomes, project plans and milestones.]
Performance Measures: Six Steps to Creating Performance Measures

Developing performance measures is not easy. Poorly integrated performance measurement systems can be worse than no system at all and may actually support poor decision-making. There are six steps to establishing good performance measures.

Step 1 - Use a logic model to define the ministry's strategies in relation to government priorities
Ministries should use logic models to illustrate relationships among activities, strategies and the results to be achieved.

Step 2 - Identify and consult on cross-ministry or horizontal initiatives
Ministries should consult with third-party service providers, broader public sector organizations and other ministries to align performance measurement systems, promote greater coordination and avoid duplication.

Step 3 - Identify individual performance measures
Developing performance measures is an iterative process and it is rare to get a satisfactory product the first time. It may be helpful to research measures that have been developed for similar activities elsewhere.

Step 4 - Check the measures for pitfalls
A number of pitfalls can compromise the value or usefulness of a performance measure. The most common pitfalls are attribution and measurement corruption.

Step 5 - Establish baselines
A baseline is the level of results at a given time that provides a starting point for assessing changes in performance and for establishing objectives or targets for future performance.

Step 6 - Set performance targets
A target is a clear and concrete statement of planned results (including outputs and outcomes) to be achieved within a stated time frame, against which actual results can be compared. Every ministry is asked to use comparative data to set targets based on its own performance, established industry standards, articulated customer preference and/or the performance of a comparable organization. Reasonable targets are challenging but achievable.
Once established, measures are used to track progress, take corrective action and adjust targets where applicable.

Let's look at Step 1 - using a logic model - in more detail, because the logic model provides the building blocks for good performance measures.

Logic Model

The logic model provides a foundation for developing performance measures that will support decision-making. A logic model is a tool that can help define strategies and activities in relation to government priorities. It clearly shows the relationships among government priorities, ministries' strategic objectives, and how ministry activities contribute to achieving those objectives and priorities through their expected outcomes.

The process of creating a logic model and making the linkages among inputs, outputs and outcomes can help build common understanding of what is expected, prioritize activities and identify appropriate performance measures. Many people will be familiar with program logic models, but they can also be used at a strategic level or created for the work of an entire organization. Everyone who uses logic models adjusts them to meet their own purposes, but a standard logic model is shown below.

The important points to remember when creating a logic model are:
- developing a logic model is an iterative process that involves many people working together, rather than a product which one person can produce;
- the information entered into the logic model at the early stages may need to be revised as new information is entered;
- it's not just the information in the boxes that counts, but the relationships between the boxes.

Figure 2: Standard Logic Model Diagram

Larger Public Interest: (government priorities are usually here)
Customers: who benefits
Objectives of the Strategy: what does the strategy hope to accomplish

Inputs: list all inputs
Activities: list all activities
Outputs: list the tangible products of the activities
Desired Short-Term Outcomes: list the changes in participation, awareness, behaviour, compliance or capacity that are expected to result from the activities in the short term
Desired Intermediate Outcomes: list the benefits or changes in conditions, attitudes and behaviour that are expected to result from the activities in the intermediate term
Figure 3: Logic Model Process - How Each Component Leads to the Next Stage. [Diagram: Inputs lead to Activities and Outputs (your planned work), which lead to Outcomes and High-Level Changes (achieving objectives).]

A logic model will help identify all the potential measures for a strategy or activity, but we don't want to measure everything, just those things that are meaningful for decision-making. See Appendix 6 for step-by-step instructions for creating a logic model.

Meaningful Measures

Meaningful performance measures should meet the following criteria. They should:
- show how ministry activities contribute to achieving results;
- use reliable, verifiable and consistent data collection methods;
- provide key information for decision-making;
- capture all areas of significant spending;
- identify and track impact as well as progress towards meeting desired outcomes;
- incorporate consideration of risks and act as thermometers for risk management.

The Ontario government uses three levels of performance measurement:
- Output measures should be developed to demonstrate the short-term progress that ministry activities make towards achieving the objectives of ministry strategies.
- Outcome measures (short-term and intermediate-term) should be developed to demonstrate the achievement of ministry strategies and/or the contribution of ministry strategies to meeting government priorities.
- High-level indicators measure social, environmental or economic conditions for which government alone is not accountable, but which reflect the extent to which the government's priorities are being achieved.
Identifying output measures and high-level indicators is relatively easy, but identifying good outcome measures can be difficult. The Ontario government uses three types of outcome performance measures:
- Efficiency: the extent to which a strategy is producing its planned outputs in relation to its use of inputs.
- Effectiveness: the extent to which a strategy is producing its planned outcomes and meeting intended objectives. At least one outcome effectiveness measure is required for each ministry strategy.
- Customer satisfaction: the degree to which the intended recipients or beneficiaries of a product or service indicate that the product or service meets their needs and expectations for quality and efficiency.

Ministries should use outcome measures of effectiveness, efficiency and customer satisfaction wherever possible. In addition to using performance measurement information to support internal decision-making, ministries will be asked to report their progress to support central decision-making and public reporting.

Performance Targets

Every ministry is asked to use comparative data to set targets based on its own performance, established industry standards, articulated customer preference and/or the performance of a comparable organization. Reasonable targets are challenging but achievable.

Reporting Performance

Government and ministries need to be able to report on performance so that the public and stakeholders can make informed judgements about achievements with public resources. Reporting performance internally to Ontario Public Service management and staff is also important in order to guide decision-making and support continuous improvement efforts. There are two types of reporting: internal and public.

Internal reporting
- Annual reporting on the results achieved to date.
- Quarterly reporting includes performance measurement information to show how short-term objectives are being achieved and to identify risks to achievement.
Public Reporting
- Evidence to support reports to the public.

Ministries will also use performance measurement in a variety of other reporting relationships. For example, Deputy Ministers and Ministers will require ongoing performance reporting for internal ministry management, and many ministries also contribute data to sector-specific federal, provincial and territorial bodies. The 2004 Ontario Budget referred to the nine principles for public reporting developed by the Canadian Comprehensive Auditing Foundation (CCAF/FCVI) as a model for improving public reporting.

Measures Checklist

Relevance: Is the measure actually a good measure of the objectives the strategy intends to achieve?
Validity: Does the measure actually measure what it is supposed to?
Reliability: Do different users of the same measure report the same result?
Verifiable: Will the measure produce the same data if measured repeatedly?
Attribution: Does the measure relate to factors that the ministry can affect?
Clarity: Is the measure clearly defined and easily understood?
Accuracy: Does the measure provide correct information in accordance with an accepted standard?
Cost effectiveness: Is the value of the measure greater than the data collection costs?
Sensitivity: Is the measure able to measure change?
Timeliness: Can data be collected and processed within a useful timeframe?
Comparability: Can the data be compared with either past periods or with similar activities?
Consistency: Does the data feeding the measure relate to the same factors in all cases at all times?
Integrity: Will the measure be interpreted to encourage appropriate behaviours?
Conclusion

In order to demonstrate results we need to measure our performance and use that performance information for planning and ongoing management of government activities. Reporting performance is important to guide decision-making and support continuous improvement efforts. For resources to support performance measurement, see Appendix 1.
SECTION 1: The Big Picture

Key Terms

Accountability: The obligation to answer for results and the manner in which responsibilities are discharged. Accountability cannot be delegated. Responsibility is the obligation to act, whereas accountability is the obligation to answer for an action.

Activity: The work performed by ministries to implement public policy and provide services to the public. All activities consume resources and produce products and/or services. One or more activities will be critical to the achievement of overall public policy objectives. Ministries must be able to demonstrate a direct causal link between the activity and the outcome(s).

Monitoring: The process of collecting and analyzing information to track program outputs and progress towards desired outcomes.

Outcome: The actual effects/impacts or results of the outputs. Outcomes provide relative information, for example: "Percentage of total cases resolved to client satisfaction within x weeks."

Output: The products or services that result from activities. Outputs are often expressed as "Number of x", for example: "Total number of cases."

Priority: Higher-order goals of government that reflect its commitment to citizens and contribute to enduring human, economic, civic and environmental benefits.

Result: A condition (outcome) or product (output) that exists as a consequence of an activity.

Strategy: A plan outlining how specified ministry activities and programs contribute to a government priority and results.
This guide:
- describes the Ontario Public Service's approach to performance measurement;
- explains how performance information is used in decision-making and business planning;
- provides guidance on how to use a logic model to develop performance measures;
- establishes criteria for good performance measures;
- identifies pitfalls to performance measurement;
- details central performance measurement and reporting requirements.

The appendices will be of particular interest to staff with ministry leadership roles for performance measurement, and the guide may be used as a support to internal ministry training.

1.0 Context: Results-Based Management

Transparency is an important part of government accountability, and governments all over the world are increasingly expected to demonstrate what results they achieve with taxpayers' money. It is not enough to know what government is spending; decision makers want to know the outcome or impact of government decisions. The process of identifying objectives and desired outcomes, measuring progress and making decisions on the basis of results is called results-based management.

Results-based management is a comprehensive, government-wide approach that informs results-based decision-making, ensuring that all government-funded activities are aligned with strategies that contribute to meeting government priorities or serve an important public interest. Results-based management incorporates the following principles:
- government priorities drive planning processes across all government functions;
- effective accountability within ministries, and between ministries and third-party service providers and broader public sector organizations, requires that performance measures be built into programs and activities so that expectations are clearly articulated, performance can be monitored and evaluated on a regular basis, and corrective action can be taken where necessary;
- horizontal integration of strategies and activities across ministries and the broader public sector helps to demonstrate how wide-ranging activities complement each other in achieving government priorities;
- demonstrable results drive the development of strategies and activities, policy and legislative agendas, investment decisions and public accountability.

Results-based management requires reliable, objective information at all levels of government. This information is gathered through performance measurement. Strong performance measures that demonstrate whether the objectives of government-funded services are being met are critical to implementing results-based management successfully.

The vehicles for implementing results-based management are:
- ministries' results-based plans, which are the key government documents guiding ministry decisions; and
- performance measurement, which is how results are demonstrated.
1.1 Performance Measurement as part of Results-Based Management

Performance measurement has been used in the Ontario Public Service for over 20 years but has not always actively informed ongoing decision-making. The critical changes to performance measurement that are introduced by a results-based management approach are:
- In addition to being used by ministries internally, performance measurement is integrated into all key government decision-making processes, including:
  o budget setting and printed estimates;
  o Management Board of Cabinet and Cabinet Policy Committee decisions;
  o broader public sector contract management;
  o ongoing internal government management (results-based planning, quarterly reporting, expenditure and risk management, report-backs).
- Performance measurement becomes part of a larger management process that aligns ministry strategies and activities with broad government priorities and specific results.
- Broader public sector organizations are accountable to the government for the results that they achieve. Ministries' performance measures for contract and service management must be aligned with ministry strategies to meet government priorities and specific results.

Figure 1: Performance measurement is a key to results-based management. [Diagram: government priorities and results, supported by measures associated with results reported to the public, which may be drawn from either strategies or activities; ministry strategies, supported by intermediate-term outcome measures reported annually in results-based plans (RbP); agency, ministry and broader public sector (BPS) activities, supported by outputs and short-term outcomes (used for quarterly reporting, release from hold-back and activity-based costing) and by project plans and milestones. Some activity-level measures may be linked to results.]
Some performance measurement information will be used both for internal management and for publicly reporting on progress towards achieving government priorities.

The diagram on the previous page illustrates the relationship of performance measurement to other parts of the results-based management process. The focus is on ministry strategies for meeting government priorities or serving another important public interest. The various government-funded activities - whether delivered directly, through contracts, agencies or the broader public service - should measure results that demonstrate the contribution they make to ministry strategies.

The results achieved at the activity level, in the form of outputs or short-term outcomes, will be used to support project plans, quarterly reporting and Management Board of Cabinet and Cabinet policy submissions. However, the latter submissions should also be supported by evidence that demonstrates intermediate-term outcomes.

The performance measures that ministries report centrally will be a combination of:
a) measures related to public reporting; and
b) intermediate-level outcome measures that demonstrate the contributions of ministry activities and strategies to meeting government priorities or serving other important public interests.

Figure 4: Identifying the right performance measures means knowing how activities align with strategies. [Diagram: a number of activities may be needed to meet the objectives of a strategy, and a single activity may be split across strategies (for example, 100% to Strategy 1; 60% to Strategy 1 and 40% to Strategy 2; 30% to Strategy 1 and 70% to Strategy 3). Intermediate-term outcome measures demonstrate that the objectives of the strategy are being met; short-term outcome and output measures demonstrate the progress that activities are making towards meeting those objectives. It is important to know what portion of an activity is contributing to a strategy, as illustrated in the sketch below.]
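The following sketch is an illustration only; the guide does not prescribe any particular calculation. It shows, in Python and with invented activity names, spending figures and contribution shares, how activity-level spending might be apportioned to strategies using the kind of percentage splits pictured in Figure 4.

# Hypothetical sketch: apportioning activity-level spending to strategies
# using the share of each activity that contributes to each strategy.
# All names, shares and dollar figures below are invented for illustration.

activities = {
    "Activity 1": {"spend": 1_000_000, "shares": {"Strategy 1": 1.00}},
    "Activity 2": {"spend": 2_000_000, "shares": {"Strategy 1": 0.60, "Strategy 2": 0.40}},
    "Activity 3": {"spend": 1_500_000, "shares": {"Strategy 1": 0.30, "Strategy 3": 0.70}},
}

def spend_by_strategy(activities):
    """Attribute each activity's spending to strategies in proportion to its shares."""
    totals = {}
    for activity in activities.values():
        for strategy, share in activity["shares"].items():
            totals[strategy] = totals.get(strategy, 0.0) + activity["spend"] * share
    return totals

print(spend_by_strategy(activities))
# {'Strategy 1': 2650000.0, 'Strategy 2': 800000.0, 'Strategy 3': 1050000.0}

The same weighting could be applied to outputs or short-term outcome results when rolling them up against a strategy, provided the contribution shares have been agreed with the ministries and delivery partners involved.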
As the diagram on the previous page illustrates, it will be advantageous for ministries to develop integrated sets of performance measures that can be combined or disaggregated to serve a variety of purposes. A ministry strategy will usually include numerous activities. A single activity may contribute to meeting the objectives of more than one ministry strategy. Knowing how and to what degree ministry activities contribute to meeting the objectives of ministry strategies is a precondition to identifying the key performance measures that demonstrate success and provide information to support decision-making.

For more information on the relationship between performance measurement and results-based management, see PIL - Results-Based Planning: Further Reading.

1.2 Performance Measurement Systems

In consultations about making the shift to a results-based management approach within government, an Ontario Public Service staff member observed that "as public servants, we have always been conscious of meeting a need and providing a service, but it's also important to think about the impact those activities have and the results they achieve. The results-based management approach helps us measure our achievements."

Results-based management focuses on outcomes, not activities, and thus a systematic approach to performance measurement is needed to demonstrate the results that government-funded activities produce. A performance measurement system is a comprehensive, integrated set of measures (of various types and levels) that provides a multi-faceted picture of the ministry's progress toward its targeted outputs (products and/or services) and outcomes. A good system contains strong performance measures and is easy to access, update and adapt so that individual pieces of information can be grouped for different purposes.

Performance measurement systems enable ministries to manage their strategies and demonstrate that they are achieving their own, and government, objectives. Rather than simply stating facts, performance measurement will be used to provide a full picture of results, including a:
- description of the overall context in which performance is being assessed;
- statement of what was expected to be accomplished and at what cost;
- description of what was accomplished in light of the expectations;
- discussion of what was learned and what will be done next;
- description of what was done to assure the quality of the data. [1]

Ministries will need to assess existing performance measurement systems to:
- ensure existing performance measurement systems are working as intended;
- coordinate with transfer payment recipients and other ministries to ensure horizontal integration and agree on roles and responsibilities for data collection and reporting;
- identify gaps in performance measurement and create appropriate measures and systems for consistent data collection and reporting.

[1] Adapted from Mayne, J., Reporting on Outcomes: Setting Performance Expectations and Telling Performance Stories, Ottawa: Office of the Auditor General of Canada.
1.3 Reporting Performance Measures

There are two types of reporting: internal and public.

Internal reporting
- Annual reporting on the results achieved to date (primarily outcome measures, linked to ministry strategies).
- Quarterly reporting includes performance measurement information to show how short-term objectives are being achieved (primarily output measures, linked to ministry activities).

Public Reporting
- Evidence to support reports to the public.

1.4 Performance Measurement Terminology

There is an extensive vocabulary associated with performance measurement that is often a subject for debate. In order to ensure consistency in the way we talk about performance measures in the Ontario Public Service, Management Board Secretariat has developed a glossary of terms, some of which are found at the beginning of each section in this guide. Appendix 2 contains the full Glossary of Performance Measurement Terms.

1.5 Summary of Key Points
- Results-based planning is an integrated, government-wide approach that aims to ensure all government activities are aligned with strategies that contribute to achieving government priorities.
- Results are demonstrated through performance measurement and may be reported publicly.
- A systematic approach to performance measurement needs to be developed in order to demonstrate the various ways that government-funded activities contribute to achieving results.
Checklist

Ministries are expected to use performance measurement to:
- manage activities internally;
- monitor the performance of broader public sector organizations via contract and service management;
- report progress on a quarterly basis to Management Board of Cabinet;
- support requests for resources or policy change;
- report to the public on results.

Ministries' performance measures for contract and service management must be aligned with ministry strategies to achieve government priorities and specific results.

Ministries will need to assess existing performance measurement systems to:
- ensure they are working as intended;
- coordinate with transfer payment recipients and other ministries to ensure horizontal integration and agree on roles and responsibilities for data collection and reporting;
- identify gaps in performance measurement and create appropriate measures and systems for consistent data collection and reporting.
SECTION 2: The Who, What and Why of Performance Measurement

Key Terms

Customer Satisfaction: The degree to which the intended recipients or beneficiaries of a product or service indicate that the product or service meets their needs and expectations for quality and efficiency.

Effectiveness: The extent to which an organization, policy, program or initiative is producing its planned outcomes and meeting intended objectives.

Efficiency: The extent to which an organization, policy, program or initiative is producing its planned outputs in relation to expenditure of resources.

High-Level Indicator: A measure of changes in social, environmental or economic conditions.

Inputs: The resources (human, material, financial, etc.) allocated to carry out activities, produce outputs and/or accomplish results.
2.0 What is a Performance Measure?

A performance measure is quantifiable information that provides a reliable basis for assessing achievement, change or performance over time.

2.1 Why Measure Performance?

How well are we doing? Can we do better?

We measure performance because:
- The public wants to know, and has a right to know, what government does with its money.
- Managers need evidence to help improve activities to meet ministry strategies.
- Government needs to report accurately to the public on what it has done.
- Those delivering government-funded activities need evidence to support their recommendations to their Minister, Management Board of Cabinet or Cabinet for changes to ministry activities.
- Ministers (and Deputy Ministers) need to tell Cabinet what their ministry has accomplished on an annual basis.
- Management Board of Cabinet and Cabinet need evidence to support their decisions.

Systematic measurement and assessment of performance supports results-based planning by generating evidence of:
- how well existing government activities perform;
- the extent to which these activities are in the public interest and meet client needs;
- whether these activities are consistent with government priorities and expectations.
2.2 Who Uses Performance Measurement Information and Why?

Performance measurement information has four uses:
- demonstrating accountability for expenditure;
- supporting informed decision-making;
- providing accurate information about government-funded activities to the public;
- promoting continuous improvement of government-funded activities and the administration of government itself.

Different users have different information needs:
- external users, like clients of government programs and citizens, might use performance information to better understand government's accomplishments and as a way to be more involved in the democratic process;
- internal users, such as Ministers, senior management and central agency staff, use performance information to support continuous improvement of ministry activities and strategies and in making strategic and resource allocation decisions.

The charts below summarize the different uses of performance measurement information by external and internal government users. For examples of effective use of performance measurement information in Ontario, see Appendix 3.

Table 1: How government performance information is used externally (by citizens and stakeholders)
- Accountability: to understand what the government thinks is important; to see what results government produces with the money spent.
- Decision-Making: to understand what the government thinks is important; to see what results government produces with the money spent; enables analysis, interpretation and evaluation of government performance.
- Public Reporting: public reports demonstrate that government is making good use of performance information (e.g., improvements to programs/services and operational efficiencies); citizen education about performance information can help citizens and stakeholders understand performance data and the various ways to use it, from how to ask questions and evaluate performance to how to influence public decisions.
Table 2: Levels of organization at which performance measures are used internally

Accountability for expenditure
- Individual level: annual Deputy Minister performance contracts and individual performance plans.
- Activity level: performance measures support accountability for expenditure via the Integrated Financial Information System, Management Board of Cabinet and periodic audits by the Internal Audit Division and the Ontario Provincial Auditor.
- Ministry level: performance measures support accountability for expenditure at the ministry level via Results-Based Plans and quarterly reporting to Management Board of Cabinet.
- Across government: performance measures are used across government to support the development of the Fiscal Plan and the Printed Estimates.

Decision-making and public reporting
- Individual level: performance measurement information is used to guide individual decisions at all levels of the organization, from program managers to Cabinet Committee members.
- Activity level: performance measures are used to inform decision-making at the activity level by providing evidence to managers within the ministry about how well objectives are being met, and by helping to identify gaps in service or manage risks to activity delivery by ministries or other broader public sector organizations.
- Ministry level: performance measures are used at the ministry level to help identify or adapt ministry strategies to serve vital public interests and deliver on government priorities.
- Across government: performance measures are used at the level of Management Board of Cabinet and other Cabinet Committees to support expenditure and policy decisions in the context of broader government priorities.

Continuous improvement
- Individual level: all Ontario Public Service employees can use performance measures to identify opportunities for continuous improvement.
- Activity level: performance measures are used to support continuous improvement at the activity level by helping staff and managers to adapt activities to meet the changing needs of clients.
- Ministry level: performance measures are used at a ministry level to support continuous improvement by ensuring that activities align with ministry strategies to meet objectives and that opportunities for integration among activities are maximized.
- Across government: performance measures are used to support continuous improvement across government by promoting horizontal coordination, communication and integration among government-funded services to improve public access and maintain high-quality service.
2.3 Levels and Types of Measurement

There are different types and levels of performance measures for different uses. Later sections of this guide will explain how to develop the right type of performance measure for each purpose. There are three levels of performance measurement used in the Ontario Public Service:

Output measures measure the tangible products or services that result from activities and are often raw data expressed in terms of frequency. "Number of (x)" is an output measure. For example, "Number of acres of green space protected through legislation" is an output.

Outcome measures (short-term and intermediate-term) measure the effects/impacts or results of the outputs. Outcomes provide information in relation to other information, or the broader context; thus, outcomes contain a level of analysis. For example, "Percentage of new growth on existing developed land, e.g., infill, brown- or grey-fields" is an outcome measure because it measures the outputs (new growth) in relation to the existing developed land base.

High-level indicators measure social, environmental or economic conditions. High-level indicators provide important information about the broader context to which government must respond and are therefore useful to support decision-making. However, they are rarely influenced solely by government initiatives. For example, "Percentage of households living below the national poverty line" provides important information about the economic security of families, which is important to know when government is developing social policy. However, government programs in and of themselves are unlikely to effect a change in this measure.

Short-term and intermediate-term outcome performance measurement information is required to support decisions concerning legislation, policy, resource requests and budget allocations. In some cases, output information may also be valuable. Output or short-term outcome performance measurement information is required for quarterly reports of progress towards meeting the objectives of ministry strategies and may be included in public reports.

Appendix 4 contains detailed descriptions of output measures, outcome measures and high-level indicators for ministries' internal training purposes.

Identifying output measures and high-level indicators is relatively easy; identifying good outcome measures can be difficult. The Ontario Public Service uses three types of outcome measures:

Efficiency: The extent to which an organization, policy, program or initiative is producing its planned outputs in relation to expenditure of resources.

Effectiveness: The extent to which an organization, policy, program or initiative is producing its planned outcomes and meeting intended objectives.

Customer Satisfaction: The degree to which the intended recipients or beneficiaries of a product or service indicate that the product or service meets their needs and expectations for quality and efficiency.
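As a purely hypothetical illustration of these levels and types, the short Python sketch below computes an output count, an effectiveness outcome (a percentage relative to a broader base) and an efficiency outcome (outputs relative to spending). The figures are invented and are not Ontario data.

# Hypothetical figures used only to show the arithmetic behind the definitions above.
acres_protected = 12_000        # output: "Number of acres of green space protected"
infill_growth_ha = 4_300        # new growth on existing developed land (hectares)
total_new_growth_ha = 10_750    # all new growth in the period (hectares)
program_spending = 8_600_000    # dollars spent on the activity

# Outcome (effectiveness): the output expressed in relation to a broader base.
pct_infill = 100 * infill_growth_ha / total_new_growth_ha      # 40.0 per cent

# Outcome (efficiency): planned outputs in relation to the resources used.
cost_per_acre = program_spending / acres_protected             # about $717 per acre

print(f"Output: {acres_protected:,} acres protected")
print(f"Effectiveness outcome: {pct_infill:.1f}% of new growth on existing developed land")
print(f"Efficiency outcome: ${cost_per_acre:,.0f} per acre protected")

A high-level indicator, by contrast, would sit outside a calculation like this: it describes a broad condition (for example, the share of households below the poverty line) that many factors besides the program influence.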
Ministries are already required to conduct customer surveys and, if adapted, these tools may also be used to support measurement of customer satisfaction. (See the Ontario Public Service quality standards for more information.)

2.4 Summary of Key Points

The Ontario Public Service uses performance measurement information at all levels of the organization to demonstrate accountability for expenditure, to support decision-making and continuous improvement, and to report to the public.
Checklist
- Ministries are expected to use performance measurement information to support internal decision-making, policy submissions and requests for resources.
- External and internal users of performance measurement information have different needs, and these uses need to be considered in the development of performance measurement systems.
- The three levels of performance measurement used in the Ontario Public Service are output measures, outcome measures and high-level indicators. Output measures illustrate short-term progress; outcome measures demonstrate the impact of that progress.
- Each outcome measure should measure one of the following: efficiency, effectiveness or customer satisfaction. Ministries should have a range of outcome measures.
SECTION 3: How to Develop Performance Measures

Key Terms

Attribution: The demonstrable assertion that a reasonable connection can be made between a specific outcome and the actions and outputs of a government policy, program or initiative.

Baseline: The level of results at a given time that provides a starting point for assessing changes in performance and for establishing objectives or targets for future performance.

Benchmarking: The process of measuring and comparing one's own processes, products or services against a higher-performing process, product or service, and adapting business practices to improve.

Target: A clearly stated objective or planned result [which may include output(s) and/or outcome(s)] to be achieved within a stated time, against which actual results can be compared.
3.0 Overview

"There is nothing so useless as doing efficiently that which should not be done at all." - Peter F. Drucker

Developing good performance measures is not easy. Poorly integrated performance measurement systems can be worse than no system at all and may actually support poor decision-making. There are six steps to establishing good performance measures:
1. Define the strategy in relation to government priorities.
2. Identify and consult on cross-ministry initiatives.
3. Identify the performance measures.
4. Check the measures for pitfalls.
5. Establish baselines.
6. Set performance targets.

The following sections take the reader through each step in some detail.

3.1 Developing Performance Measures

STEP 1: Define the strategy in relation to government priorities

The first step involves articulating how ministry strategies contribute to meeting government priorities or serving other important public interests. This takes time, involves many people and requires tools to help organize information logically. A logic model is a tool that describes ministry activities as part of ministry strategies to meet government priorities. The framework for organizing the logic model presented in this guide is the same as that used by ministries to develop their results-based plans.

A logic model clearly shows the relationships among government priorities, ministries' strategic objectives, and how ministry activities contribute to achieving those objectives and priorities through their expected outcomes. The process of creating a logic model and making the linkages among inputs, outputs and outcomes can help build common understanding of what is expected, prioritize activities and identify appropriate performance measures.

A standard logic model diagram is shown below. For examples of real logic models that ministries have created or found particularly useful, see Appendix 5. Step-by-step instructions for how to develop a logic model are provided in Appendix 6.
Figure 2: Standard Logic Model Diagram

Larger Public Interest: (government priorities are usually here)
Customers: who benefits
Objectives of the Strategy: what does the strategy hope to accomplish

Inputs: list all inputs
Activities: list all activities
Outputs: list the tangible products of the activities
Desired Short-Term Outcomes: list the changes in participation, awareness, behaviour, compliance or capacity that are expected to result from the activities in the short term
Desired Intermediate Outcomes: list the benefits or changes in conditions, attitudes and behaviour that are expected to result from the activities in the intermediate term

The logic model provides a foundation for developing performance measures that will support decision-making and is also the basis for meeting Management Board Secretariat reporting requirements.

The important points to remember when creating a logic model are:
- developing a logic model is a process that involves many people working together, rather than a product which one person can produce;
- the information entered into the logic model at the early stages may need to be revised as new information is entered;
- it's not just the information in the boxes that counts, but the relationships between the boxes.

Figure 3: Logic Model Process - how each component leads to the next stage. [Diagram: Inputs lead to Activities and Outputs (your planned work), which lead to Outcomes and High-Level Changes (achieving objectives).]
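For teams that keep their planning material in electronic form, a logic model can also be captured as a simple data structure so that the links between the boxes stay explicit. The Python sketch below is a minimal, hypothetical example; the program content is invented and the field names simply mirror the columns of Figure 2.

# Minimal sketch of a logic model as a data structure. The content is hypothetical;
# only the shape mirrors the standard diagram above.
logic_model = {
    "larger_public_interest": "Healthier communities",
    "customers": ["Families with young children"],
    "objectives": ["Increase early access to preventive health services"],
    "inputs": ["Program funding", "Public health nurses"],
    "activities": ["Home visits", "Parent education sessions"],
    "outputs": ["Number of home visits completed", "Number of education sessions delivered"],
    "short_term_outcomes": ["Percentage of parents reporting improved knowledge"],
    "intermediate_outcomes": ["Percentage of children with up-to-date immunizations"],
}

# Every output and outcome statement is a candidate performance measure; the value of
# the model is that each one can be traced back to activities, objectives and the
# larger public interest when attribution is questioned later.
candidate_measures = (logic_model["outputs"]
                      + logic_model["short_term_outcomes"]
                      + logic_model["intermediate_outcomes"])
for measure in candidate_measures:
    print(measure)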
STEP 2: Identify and consult on cross-ministry and horizontal initiatives

The next step is to identify and consult with other ministries and broader public sector organizations that are contributing to the same result, or that are working to serve a common public interest, in order to:
1. Share performance measures for like activities.
2. Reduce the number of performance measures used.
3. Increase the value of each measure.

For example, both the Ministry of Natural Resources and the Ministry of Municipal Affairs and Housing are involved in activities that protect natural areas or prime agricultural land from certain kinds of development. However, Natural Resources developed its activities with respect to green space, while Municipal Affairs and Housing developed its activities in relation to clearly defined green belts. Rather than have two measures measuring slightly different things related to the same public interest, the ministries worked together to develop a mutually acceptable definition of green space and performance measures to which both will contribute data.

Is there agreement about what will be measured and how?

"If every general contractor defined a foot as the length of his own foot, our buildings and the construction industry in general would be in a state of disarray. Yet that is precisely what we have done in health care. Every hospital, practice and specialty society has traditionally measured key processes of healthcare in unique and differing ways." - Gene Beed, M.D., 1996 Conference of the (US) National Committee for Quality Assurance on Health Data

STEP 3: Identify the performance measures

Performance measures should:
- show how ministry activities contribute to achieving government objectives;
- provide key information for decision-making;
- capture all areas of significant spending;
- identify and track impact as well as progress towards meeting desired outcomes;
- incorporate consideration of risks and act as thermometers for risk management.

Data collection methods should ensure that measures are:
- valid (they actually measure what they are intended to measure);
- reliable (different users of the same measure will report the same results); and
- sensitive (able to measure change).
The SMART rule provides a way to test the strength of a performance measure:

S - Specific: Performance measures state clearly and concisely what will be measured. They focus on significant activities and capture all major areas of spending. They are outcome focused and not a list of activities.
M - Measurable: Performance measures should be quantified, even if based on qualitative data.
A - Achievable and Attributable: Performance measures relate to things that the ministry can influence and achieve.
R - Realistic: Performance measures are based on reliable, verifiable data that reflect the ministry/activity contribution to achieving government priorities and results.
T - Timely: Performance measurement data can be collected, processed and distributed within a useful timeframe and at reasonable cost.

Appendix 7 contains a complete checklist for testing performance measures.
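One hedged sketch of how a team might record its answers to the SMART test (or the fuller checklist in Appendix 7) when vetting a proposed measure is shown below. The measure, the answers and the notes are hypothetical; the point is simply that each criterion gets an explicit yes/no answer and a reason before the measure is adopted.

from dataclasses import dataclass, field

@dataclass
class MeasureAssessment:
    """Records a yes/no answer and a note for each test applied to a proposed measure."""
    measure: str
    results: dict = field(default_factory=dict)

    def record(self, criterion: str, passed: bool, note: str = "") -> None:
        self.results[criterion] = (passed, note)

    def passes(self) -> bool:
        return bool(self.results) and all(passed for passed, _ in self.results.values())

# Hypothetical example of vetting one measure against the SMART rule.
assessment = MeasureAssessment("Percentage of new growth on existing developed land")
assessment.record("Specific", True, "States clearly what is measured; outcome focused")
assessment.record("Measurable", True, "Quantified from municipal land-use data")
assessment.record("Achievable/Attributable", True, "Ministry activities influence infill development")
assessment.record("Realistic", True, "Based on reliable, verifiable data")
assessment.record("Timely", False, "Data are only available every two years")

print("Adopt the measure" if assessment.passes() else "Rework the measure before adoption")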
STEP 4: Check the measures for pitfalls

The most common pitfalls in performance measurement are attribution and measurement corruption, and the only way to detect and avoid them is to know how they may arise. Ask enough questions to be reasonably confident that the pitfalls don't undermine the usefulness of the measure.

Attribution

Attribution is the assertion that a connection can be made between an outcome and the actions of a government policy, program or initiative. Determining attribution for outputs is relatively straightforward, as outputs are the tangible products produced through activities. Demonstrating attribution for outcomes is more complicated because a number of intervening factors, in addition to the activities, may contribute to the outcome. Creating a logic model is the strongest method for identifying the contribution of an activity to the achievement of intended results.

What does the measure really measure? In Ontario, in 2000, it was illegal to drag a dead horse down a main street. Was the fact that no dead horses were dragged down main streets in Ontario in 2000 proof that the law was working?

To test for attribution, ask the question: "Is the result we are measuring produced by our actions?" Sometimes the answer is clear, as in the example above. The fact that no dead horses were dragged down main streets of Ontario in 2000 is actually a better indicator that the law is no longer relevant than that the law is working. Other times, the answer is less clear.

Government activities are often not solely responsible for the results achieved in relation to their objectives. There may be factors or events over which the government has no control that affect the outcome. For example, in 2003, the tourism industry in Ontario was negatively affected by the outbreak of SARS, despite ongoing government-funded programs to increase tourism in the province. Thus, when designing performance measures, it is important to measure the contribution that government-funded activities make, or the influence they exert, rather than measuring only those things over which government has direct control. At the same time, it is important to maintain an awareness of changes that could affect the results. These changes are external risk factors and include broader economic, environmental and sector-specific trends. Clearly identifying the risk factors also qualifies the performance measure so that the results reported can be interpreted correctly.

When we think of problems of attribution, we usually think about how performance measurement information is used to claim undue success. But problems of attribution can also make success appear as failure, as illustrated in the example below.
If you can't see success, you can't reward it

In the US, a Social Services department was responsible for assuring the welfare of children who were wards of the state. Though the state did not directly deliver services to children, it was accountable, and it established a whole range of output measures to ensure children were well cared for. For example, third-party providers of care to children had to report to the Department of Social Services on how many children received their annual medical and dental examinations, were in full attendance at school, etc. The results were rolled up into a Social Services Department performance measure of compliance. Compliance rarely reached 60 per cent and the Social Services department was concerned. On further examination, they learned that children might have missed a dental examination because they had the flu (and therefore could not be treated safely), or had missed school because they were in transition to adoption. In other words, low compliance by the third-party delivery agent was not necessarily an indicator of poor care. In fact, sometimes the lack of compliance was an indicator of high-quality service to children that went unrecognized. The Social Services department re-examined its compliance indicators in consultation with third-party service providers. Not only did it obtain higher quality information, rates of compliance increased, as did public confidence in Social Services.

The example above also points to an important area in which attribution issues arise: situations where a broader public sector organization or agency delivers service on behalf of the government, but where the government remains accountable for the result. Where such organizations also have a high degree of independence, ministries sometimes have reservations about their ability to demonstrate attribution, and therefore accountability, for the results achieved by broader public sector organizations. This is why it is essential to involve broader public sector organizations and agencies in the creation of logic models for ministry strategies to which their activities contribute. Obtaining agreement about the objectives and strategies, and involving third-party organizations in the identification of appropriate performance measures, is critical to resolving problems of attribution.
Figure 5: Good performance measures measure the contribution of activities to government priorities. [Diagram: ministry strategies and activities produce outputs and outcomes that contribute to desired results and government priorities; an attribution barrier separates what the ministry controls from factors outside its control that also affect the results.]
Measurement illustrates the contribution that activities make, without falling into the trap of incorrectly attributing all success or failure to the activity. Even when all appropriate steps have been taken, attribution can remain an issue. This is why providing a qualitative description and analysis of performance information is valuable. It explains the larger context and data collection methods, the limitations on the interpretation of the data, and the key risk factors so that attribution issues are clearly identified. In this way, users of information are less likely to use the performance measurement information inappropriately. For a more detailed discussion of attribution and how to avoid attribution problems, see John Mayne's article, Addressing Attribution Through Contribution Analysis: Using Performance Measures Sensibly (Office of the Auditor General of Canada, 1999).

Measurement Corruption

The second pitfall of performance measurement is measurement corruption: people adapt their behaviour in relation to the thing being measured, as in the furniture example below. It is important to try to anticipate any undesirable behaviours the measure could inspire and revise the measure accordingly. Another example of measurement corruption in the context of target setting is given on page 36.

What gets measured gets done: The Russian furniture industry, in an attempt to increase efficiency, adopted a pay-for-performance policy in which workers were paid a commission on every pound of furniture produced. Production rates remained stable, but Russian furniture became the heaviest furniture in the world.

The risk of measurement corruption in performance measurement is relatively high, due to our natural desire to be successful and our boundless creativity and adaptability. It is rarely intentional and is usually produced by the measurement itself. Performance measures are less likely to become corrupted if all those involved understand the measure and the outcome it is intended to measure, and agree that the choice is relevant and appropriate. This is another reason why everyone affected should be involved in the development of the logic model and the identification of performance measures.

Another way to deal with the effects of measurement corruption is to ensure performance measures are part of a system of measurement. If one measure becomes corrupted, other measures may signal that corruption by their lack of change. The training example below illustrates this point. The lack of change in employment rates of the target population provided a clue that the measurement of training completion may have been corrupted, by raising the question, "Why isn't there change for the target group as a result of the government-funded program to improve its access to employment?"

If you aren't rewarding success, you're probably rewarding failure: In the UK, the national government provided funding to support training programs targeted to people who faced significant employment barriers and had been unemployed for more than two years. The training was provided on a four-year contractual basis by private sector companies whose contracts were renewed on the basis of performance, measured as the number of successful graduates. After four years, the training companies had met their targets and their funding was renewed. However, research showed that there had been no significant change in the target group. This was because the training companies had not admitted members of the target group to the training: they were more difficult to train, less likely to complete within the contract time, and would therefore jeopardize each company's ability to meet its targets.
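The cross-checking idea above lends itself to a simple automated check. The sketch below is a minimal illustration, not part of the guide's framework: it assumes hypothetical yearly values for two related measures (training completions reported by providers and the employment rate of the target group) and flags a possible corruption signal when the output measure improves sharply while the related outcome measure barely moves. The data, thresholds and function names are invented for illustration only.

```python
# Minimal sketch: flag possible measurement corruption by cross-checking
# an output measure against a related outcome measure.
# All figures and thresholds below are hypothetical, for illustration only.

def percent_change(first: float, last: float) -> float:
    """Percentage change from the first to the last value in a series."""
    return (last - first) / first * 100.0

def corruption_signal(output_series, outcome_series,
                      output_threshold=20.0, outcome_threshold=2.0):
    """Return True if the output measure rose strongly while the
    related outcome measure stayed essentially flat."""
    output_change = percent_change(output_series[0], output_series[-1])
    outcome_change = percent_change(outcome_series[0], outcome_series[-1])
    return output_change >= output_threshold and abs(outcome_change) <= outcome_threshold

# Hypothetical four-year series, loosely modelled on the UK training example:
completions = [900, 1050, 1200, 1400]       # output: reported successful graduates
employment_rate = [14.0, 14.2, 13.9, 14.1]  # outcome: % of target group employed

if corruption_signal(completions, employment_rate):
    print("Warning: the output is improving but the related outcome is flat;")
    print("review whether the output measure has been corrupted.")
```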
Unintended Consequences

Performance measures can also help to identify unintended consequences of implementing a policy or program. In the example below, staff assumed they were treating all clients equally, but performance measurement demonstrated that the rapid response times of the program led to jobs being given to those clients who were already the most prepared for employment.

If you can't recognize failure you can't correct it: A transitional employment (TE) program was designed to place clients with mental illness, particularly those with little work history, in short-term, entry-level jobs on a rotational basis. At any given time there were 20 jobs and 300 clients. Program staff were certain that all clients were being given the opportunity to participate. When a new performance measurement system was established, staff were surprised to learn that most of the time the same 20 clients were being offered the TE jobs as they became available. The fast pace of the program, the heavy demands on staff time and the lack of objective information had resulted in available jobs being offered to clients with the strongest employment records, a total contradiction of the program's fundamental purpose. In response to the performance information, the program changed its job allocation practices to ensure that everyone on the wait list, particularly the most disabled clients, was given access to TE jobs.

Measuring performance will not save the world. Performance measures are a tool. They help us to:
- discover quickly whether progress is being made toward objectives as expected, and to tell the difference between success and failure;
- make ongoing decisions about how to best use resources and when to make changes to services, and identify existing gaps;
- identify when corrective action may be needed and guide what that action should be.
Most government activities are large in scale and delivered over vast regions; performance measurement helps us to know when we are successful and to identify potential problems quickly.

STEP 5: Establish Baselines

A baseline is the level of results at a given time that provides a starting point for assessing changes in performance and establishing objectives or targets for future performance. Ministries are expected to have identified baselines for existing performance measures. When a ministry introduces a new performance measure, it should establish the baseline during the first year the performance measure is in use. Every time performance measures or data collection methods are changed, new baselines need to be set.

STEP 6: Set Performance Targets

A target is a clear and concrete statement of planned results (including outputs and outcomes) to be achieved within the time frame of parliamentary and departmental planning, against which results can be compared. Every ministry is asked to use comparative data to set targets based on its own performance, established industry standards, customer preference and/or the performance of a comparable organization. Reasonable targets are challenging but achievable.
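To make Steps 5 and 6 concrete before looking at target setting in more detail, here is a minimal, hedged sketch of one way a program area might record a baseline and then compare subsequent results against an annual target. It is an illustration only, not a prescribed format; the measure name, values and target are hypothetical.

```python
# Minimal sketch of Steps 5 and 6: record a baseline for a measure,
# set a target, and report progress against both.
# The measure, values and target below are hypothetical.

measure = {
    "name": "Percentage of applications processed within 30 days",
    "baseline_year": 2004,
    "baseline_value": 62.0,   # Step 5: level of results when measurement began
    "target_year": 2006,
    "target_value": 75.0,     # Step 6: planned result to be achieved by the target year
}

actuals = {2004: 62.0, 2005: 68.5, 2006: 71.0}  # hypothetical reported results

def progress_report(measure, actuals):
    """Compare the latest actual result to the baseline and the target."""
    latest_year = max(actuals)
    latest = actuals[latest_year]
    change_from_baseline = latest - measure["baseline_value"]
    gap_to_target = measure["target_value"] - latest
    if gap_to_target <= 0:
        status = "target met"
    else:
        status = f"{gap_to_target:.1f} points below the {measure['target_value']:.1f}% target"
    return (f"{measure['name']} ({latest_year}): {latest:.1f}% "
            f"({change_from_baseline:+.1f} points vs. {measure['baseline_year']} baseline; {status})")

print(progress_report(measure, actuals))
```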
Setting annual and long-term targets

Performance targets set expectations for results and are the basis from which measurement takes place and improvement begins. Without targets, it isn't possible to know whether the organization is improving or falling behind in achieving results in priority areas. Targets show whether the ministry proposes to meet or exceed the standards for performance. They should be clear and quantifiable, and expressed in absolute number, percentage or ratio terms. They should also define the timeframe within which they will be achieved. Targets are used as a key tool to drive, measure, improve and control performance.

Benchmarking

Benchmarking is the process of measuring and comparing one's own processes, products or services against a higher performing process, product or service and adapting business practices to improve performance. Where possible, ministries should benchmark their performance information against other Ontario Public Service ministries and public or private sector standards. Benchmarking performance information against another internal or external high performing organization may help ministries to identify more effective and efficient processes for achieving intended results. For an example of how one jurisdiction used benchmarking to demonstrate its performance, see Appendix 8. (An illustrative sketch of results benchmarking follows the summary of key points below.)

There are three commonly accepted forms of benchmarking:
- Standards Benchmarking: setting a standard of performance, in relation to the performance of other organizations, which an effective organization could be expected to achieve. The publication of a challenging standard can motivate staff and demonstrate a commitment to improve services.
- Results Benchmarking: comparing the performance of a number of organizations providing a similar service. In the public sector, this technique can allow the public to judge whether their local provider makes effective use of resources, compared to similar providers.
- Process Benchmarking: undertaking a detailed examination within a group or organization of the process that produces a particular output. This is done to understand the reasons for variances in performance and to incorporate best practices.

While there is currently no formal requirement for ministries to benchmark their performance, it is expected that such comparisons will be incorporated into all public reporting in future years.

Summary of Key Points

There are six steps to establishing good performance measures:
1. Define the strategy in relation to government priorities.
2. Identify and consult on cross-ministry initiatives.
3. Identify the performance measures.
4. Check the measures for pitfalls.
5. Establish baselines.
6. Set performance targets.
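As noted under Benchmarking above, here is a small, hedged sketch of what results benchmarking can look like in practice: comparing one organization's result on a common measure against a set of comparable organizations. The organizations, measure and figures are hypothetical, and the median is used only as one possible point of comparison.

```python
# Minimal sketch of results benchmarking: compare one organization's result
# on a shared measure against comparable organizations.
# The organizations, the measure and all values are hypothetical.
from statistics import median

# Hypothetical results for a common measure, e.g. cost per client served ($)
peers = {
    "Our ministry program": 412.0,
    "Comparable program A": 388.0,
    "Comparable program B": 455.0,
    "Comparable program C": 371.0,
    "Comparable program D": 430.0,
}

benchmark = median(peers.values())
ranked = sorted(peers.items(), key=lambda item: item[1])  # lower cost ranks first

print(f"Benchmark (median cost per client served): ${benchmark:.2f}")
for rank, (name, value) in enumerate(ranked, start=1):
    gap = value - benchmark
    print(f"{rank}. {name}: ${value:.2f} ({gap:+.2f} vs. benchmark)")
```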
Checklist

- Ministries should use logic models to illustrate relationships among activities, strategies and the results to be achieved.
- Ministries should consult with third party service providers, broader public sector organizations and other ministries to align performance measurement systems, to promote greater coordination and to avoid duplication.
- Performance measures should follow the SMART rule and meet the following criteria:
  - show how ministry activities contribute to achieving government priorities and results;
  - use reliable, verifiable and consistent data collection methods;
  - provide key information to support decision-making;
  - capture all areas of significant spending;
  - identify and track impact as well as progress towards meeting desired outcomes.
- Outcome measures should be developed to demonstrate the achievement of ministry strategies.
- Output measures should be developed to demonstrate the progress that ministry activities make towards achieving the objectives of the strategies.
- Targets are required for all performance measures.
- Baselines are required for all performance measures.
SECTION 4: Reporting Performance Measures

Key Terms:

Public Performance Reporting: the formal mechanisms that a government uses to communicate with the public and legislatures in accordance with agreed guidelines.

CCAF: the Canadian Comprehensive Auditing Foundation, a national non-profit research and education foundation that has researched public sector governance, accountability, management and audit.

Quarterly Reporting: ministries report quarterly to Management Board of Cabinet on their expenditures, their progress towards meeting objectives and the risks to achievement.

Non-Financial Information: information about the quality and effectiveness of government services; for example, levels of access to health care in remote areas.

Financial Information: information that assists government in accounting for the costs of operations for citizens and the legislature.
4.0 Introduction

Ministries are expected to report performance measures as part of:
- Results-based planning: ministries will be required to report annually through the results-based planning process on their progress in meeting the performance targets associated with the strategies identified in their previous year's results-based plan.
- Requests for resources or policy change, and Management Board of Cabinet report backs: in these cases, performance measures are used to demonstrate the need for resources or policy, and/or to provide evidence of progress to date towards meeting objectives. Ministries will be expected to demonstrate performance in all requests for resources or policy approval.
- Quarterly reporting: ministries report quarterly to Management Board of Cabinet on their progress towards meeting objectives and on the risks to achievement.
- Public reporting: the government may use performance measurement results to report to the public on achievements.
- Deputy Ministers' performance contracts: performance measurement continues to be a key part of Deputy Ministers' performance contracts, and ministries will be required to report annually on progress towards the achievement of annual targets.

4.1 Canadian Comprehensive Auditing Foundation / La Fondation canadienne pour la vérification intégrée (CCAF/FCVI) Principles for Reporting

The format of public reporting on performance measures is changing. The emphasis is increasingly on providing the larger context to explain how and why particular results were achieved. This change in emphasis can be seen through efforts to:
- ensure ministry performance measures are aligned with strategies and government priorities;
- create cross-ministry measures, reducing the number of measures and increasing the coordination of ministries contributing data to the outcome measure;
- incorporate the CCAF/FCVI Principles for Public Reporting into all government reporting vehicles.

The Canadian Comprehensive Auditing Foundation (CCAF/FCVI) has developed a set of nine principles that represent the next generation of reporting for modernizing governments. These principles have been officially adopted by the federal government, the governments of British Columbia (BC), Saskatchewan and others, and were referred to in the Ontario Budget as a model for improving our reporting on performance. The performance measure template and the quarterly reporting requirements are being adjusted to incorporate a more comprehensive analysis of the results achieved. These changes will facilitate the adoption of the CCAF/FCVI principles. The following chart provides a summary of the nine principles. A full discussion of the CCAF/FCVI principles is located at: documents/executive_summary.pdf.
CCAF/FCVI Reporting Principles

1. Focus on the few critical aspects of performance: This principle is intended to add focus to reporting. By focusing on the priorities of an organization and the needs of the users, the usefulness and the quality of performance reports is greatly improved. The focus of reporting should be driven by the likely use of the information. As part of the results-based planning process, ministries are identifying the critical performance measures that align strategies and activities with the government's priorities.

2. Look forward as well as back: This principle focuses on the use of performance measures as predictive tools (identifying what the desired outcomes and accompanying targets are) and reporting on achievements. By identifying government priorities and results, the government is looking forward. Reports to the public will report on achievement towards addressing the priorities. Reporting results in the context of a previously released plan is a major transition that supports the "look back" portion of this principle.

3. Explain key risk considerations: Public performance reporting should identify key strategic risks, explain their influence on policy choices and performance expectations, and relate results achieved to the risks and the level of risk accepted. The identification and evaluation of risks has been incorporated into both the quarterly reporting requirements and the new performance measurement template. Ministries are asked to use Modern Controllership risk assessment tools and provide a summary of both internal and external risk factors in all future reporting.

4. Explain key capacity considerations: The term capacity refers to an organization's ability to achieve results at a specified level. A number of internal factors can impact on an organization's results, including the quantity and quality of human, financial, intellectual, technological or physical resources available. Disclosure and discussion of capacity are important to meaningful public reporting. For example, starting up an activity may mean that the development of capacity is the single most significant aspect of performance. Introducing new delivery approaches may introduce new risks, require new techniques or processes, or call for different skill sets. Consideration of capacity has been built into the new performance measurement template.

5. Explain other factors critical to performance: Other factors that could impact on results include changing economic, social or demographic information; standards of conduct, ethics or values; public perception of performance or acceptance of objectives; and unintended impacts. This information is essential to assessing the degree to which the results achieved can be attributed to government actions. Where attribution is less than complete, an explanation of other critical factors serves to guide interpretation of the results and qualifies the assessment of performance.

6. Integrate financial and non-financial information: Meaningful public reporting must put results information in context with their associated costs. Only when results are presented in relation to the resources consumed can the user assess and evaluate results. Relating resources to strategies and to results is essential if the public is to understand the value that government gets for its money. Ministries are asked to report on the costs of delivering strategies in relation to the achievement of government priorities.

7. Provide comparative information: This principle encompasses two distinct types of comparative information: historical trend information for the organization over time; and comparison of the organization's performance to other similar entities.

8. Present credible information, fairly interpreted: Public performance reporting should be based on credible quantitative and qualitative information, fairly interpreted and presented. Publicly reported information should be consistent, relevant, reliable, understandable and fair.

9. Disclose the basis for reporting: Since performance reports should focus on a few critical aspects of performance rather than present volumes of information, it is especially important to tell the reader why the reported measures were selected and how they relate to the overall strategic direction of the organization. This principle deals with providing disclosure on the choices made and the items to be reported, as well as changes in measurement or presentation and steps taken to ensure the reliability of data. This is necessary to increase user confidence in the information being reported.
Appendix 1 Performance Measurement Resources in the Ontario Public Service

Performance Measurement and Program Evaluation Team, Program Management and Estimates Division, Management Board Secretariat:
- sets Ontario Public Service policy on performance measurement;
- offers portfolio-based advice to ministries on performance management;
- maintains a database of Ontario Public Service performance measures and coordinates performance measurement reporting;
- coordinates the Performance Measurement and Program Evaluation Network for ministry staff involved in the above activities.

The PMED website includes links to:
- the Performance Measurement Guide;
- international performance measurement literature, resources and cross-jurisdictional comparisons;
- Ontario Public Service ministry materials about their own performance measurement and management systems.

The Modern Controllership Training Unit, Fiscal and Financial Policy Division, Ministry of Finance offers free courses on performance measurement to support ministries in their efforts to develop performance measurement systems and to monitor and report on performance. Information can be found on their website.

Other Resources

Policy Innovation and Leadership (PIL) supports the policy community by building knowledge networks, promoting continuous learning, sharing innovative tools and practices, and profiling excellence. Enhancing our capacity to deliver cross-cutting and well-informed policy analysis in support of the government's priorities will put the Ontario Public Service at the leading edge of public policy innovation. (Source: PIL, About Us)
43 Appendix 2 Glossary of Performance Measurement Terms Introduction The following definitions are based on a number of sources and have been tailored for a variety of Ontario Public Service purposes such as program evaluation, performance measurement, risk management, results-based planning and controllership. These definitions are intended to clarify and encourage a common, consistent language within the Ontario Public Service governance, accountability and planning frameworks. Definitions Accountability The obligation to answer for results and the manner in which responsibilities are discharged. Accountability cannot be delegated. Responsibility is the obligation to act whereas accountability is the obligation to answer for an action. Activity An activity is the work performed by ministries to implement public policy and provide services to the public. All activities consume resources and produce products and/or services. One or more activities will be critical to the achievement of overall public policy objectives. Ministries must be able to demonstrate a direct causal link between the activity and the outcome(s). Attribution The demonstrable assertion that a reasonable connection can be made between a specific outcome and the actions and outputs of a government policy, program or initiative. Baseline The level of results at a given time that provides a starting point for assessing changes in performance and for establishing objectives or targets for future performance. Benchmarking The process of measuring and comparing one s own processes, products or service against a higher performing process, product or service and adapting business practices to improve. Cost Benefit Analysis A process that assesses the relation between the cost of an undertaking and the value of the resulting benefits Cost Effectiveness The extent to which an organization, program, etc. is producing its planned outcomes in relation to use of resources (inputs). Performance Measurement: A Reference Guide Appendix 2 41
44 Cross-ministry Measure Measure with a desired result to which more than one ministry, but not all ministries, contribute. (See also horizontal measure). Customer The person, whether inside or outside the organization, to whom services or products are delivered. Customer Satisfaction The degree to which the intended recipients or beneficiaries of a product or service indicate that the product or service meets their needs and expectations for quality and efficiency. Effectiveness The extent to which an organization, policy, program or initiative is producing its planned outcomes and meeting intended objectives Efficiency The extent to which an organization, policy, program or initiative is producing its planned outputs in relation to expenditure of resources. Evaluation The systematic collection and analysis of information on the performance of a policy, program or initiative to make judgements about relevance, progress or success and cost-effectiveness and/or to inform future programming decisions about design and implementation. High Level Indicator A measure of changes in social, environmental or economic conditions. Horizontal Measure Measure with a desired result to which most or all ministries contribute. (See also Cross-ministry measure). Indicator A quantitative or qualitative ratio or index used to signal and indirectly measure the performance of a program over time. Inputs The resources (human, material, financial, etc.) allocated to carry out activities, produce outputs and/or accomplish results. Intermediate Outcomes Benefits and changes in behaviour, decisions, policies and social action attributable to outputs to demonstrate that program objectives are being met, e.g., increased employability as a result of a training program. Long-term outcomes The ultimate or long-term consequences for human, economic, civic or environmental benefit, to which government policy or legislation contributes, e.g., life expectancy rates, overall economic performance. Performance Measurement: A Reference Guide Appendix 2 42
45 Monitoring The process of collecting and analyzing information to track program outputs and progress towards desired outcomes. Objective Achievable and realistic expression of a desired result. Outcome The actual effects/impacts or results of the outputs. See Short-term Outcomes, Intermediate Outcomes, Long-term Outcomes. Output The products or services that result from activities. Performance-Based Budgeting A method of allocating resources that connects inputs provided to the achievement of pre-defined objectives. Performance-based budgeting is focused on what can be achieved for a given amount and allows for comparisons between expected and actual progress. Performance measure Quantifiable information that provides a reliable basis for directly assessing achievement, change or performance over time. Performance measurement The process of assessing results, e.g., the degree to which objectives are being achieved. Priority Higher-order goals of government that reflect its commitment to citizens and contribute to enduring human, economic, civic and environmental benefits. Qualitative data Non-numeric information collected through interviews, focus groups, observation and the analysis of written documents. Qualitative data can be quantified to establish patterns or trends, e.g., improvement in a child s reading level as observed by parents and teacher. Quantitative data Information that is counted, or compared on a scale, e.g., improvement in a child s reading level as measured by a reading test. Reliability The extent to which measurements are repeatable and consistent under the same conditions each time. Result A condition (outcome) or product (output) that exists as a consequence of an activity. Results-based Management A comprehensive, government-wide approach that informs results-based decision-making, ensuring that all government-funded activities are aligned with strategies that contribute to meeting government priorities or serve an important public interest. Performance Measurement: A Reference Guide Appendix 2 43
46 Risk The chance of something happening that will impact on the achievement of objectives. Risk Management The active process of identifying, assessing, communicating and managing the risks facing an organization to ensure that an organization meets its objectives. Short-term Outcomes First-level effects of, or immediate response to the outputs, e.g., changes in compliance rates or degree of customer satisfaction. Standards Pre-defined quantifiable levels of performance that are commonly understood and agreed upon and are the basis for judging or comparing actual performance. Strategy Plan outlining how specified ministry activities and programs contribute to a government priority and results or other important public interest. Target A clearly stated objective or planned result [which may include output(s) and/or outcome(s)] to be achieved within a stated time, against which actual results can be compared. Validity The extent to which a measurement instrument accurately measures what it is supposed to measure. For example, a reading test may be a valid measure of reading skills, but it is not a valid measure of total language competency. Vote The numerical designation of a program or a group of activities within the Government of Ontario s Printed Estimates. Performance Measurement: A Reference Guide Appendix 2 44
References

Canadian Comprehensive Auditing Foundation (CCAF/FCVI), 2002. Reporting Principles: Taking Public Performance Reporting to a New Level.
Canadian Evaluation Society, 1999. Evaluation for Results.
Gaster, C. Business Plan 101. Behavioural Healthcare Tomorrow, February issue.
Governmental Accounting Standards Board of the Financial Accounting Foundation (GASB), 2003. Reporting Performance Information: Suggested Criteria for Effective Communication.
Management Board Secretariat, 2000. Glossary, in Performance Measurement Guide.
Management Board Secretariat, 2000. Business Planning and Allocations Directive.
Management Board Secretariat, 2000. Performance Management Operating Policy.
Management Board Secretariat, 2003. Guide to Program Evaluation.
Management Board Secretariat, 2003. Guidelines and Instructions: Deputy Ministers and Senior Management Group Performance Management Plan.
Management Board Secretariat, 1999. Making Measures Count: Integrating Performance Measurement into Decision Making.
Management Board Secretariat, 2000. Performance Measurement in the Business Planning Process.
Mayne, John, 2003. Reporting on Outcomes: Setting Performance Expectations and Telling Performance Stories. Ottawa: Office of the Auditor General of Canada.
Ministry of Education, 2004/05 Business Planning Training slides.
Ministry of Finance, 2003. Modern Controllership Glossary.
Ministry of Finance, Modern Controllership Training Unit, 2003. Governance and Accountability in the Public Sector.
Schacter, Mark, 2002. What Will Be, Will Be. Ottawa: Institute on Governance.
Treasury Board Secretariat, 2001. Guide for the Development of Results-based Evaluation and Accountability Frameworks: Lexicon. Ottawa: Treasury Board Secretariat.
UK Evaluation Society, 2003. Glossary of Evaluation Terms.
UNDP (United Nations Development Programme), 1997. Results-Oriented Monitoring and Evaluation Handbook.
UNESCO (United Nations Educational, Scientific and Cultural Organization) Internal Oversight Service, 2003. Glossary of Main Evaluation Terms.
Appendix 3a Examples of How Ontario is Using Performance Measurement

PRISM: The Ontario Provincial Police's Performance Reporting Information Systems Manager
Forthcoming in Chief Information Officer magazine, 2004

Senior level decision makers in the Ontario Provincial Police were interested in finding a way to gain direct access to the knowledge they required to make decisions. As an organization, the OPP knew that they were providing outstanding service (according to a 2001 survey completed by Leger Marketing for the Canadian Press Association, 88% of Ontarians are satisfied with the work of the OPP), but decision makers felt that to be able to continue to improve and provide better services, they needed better access to the information they collected. If this was really going to work, the OPP understood that they required a solution that:
- was a business enabler, not just a piece of technology;
- addressed the priorities of the organization (which are very rarely technology driven);
- everyday, non-technical people could use;
- provided an accountability framework;
- was designed to allow sophisticated analysis and score-carding without requiring users to have expertise in statistical analysis;
- could be developed and implemented quickly and within limited budgets that reflect today's realities;
- was built on a technology which addresses the concerns and priorities of the IT community.

To meet their requirements, the OPP developed PRISM, the Performance Reporting Information Systems Manager. PRISM is a performance measurement application designed to provide senior level decision makers direct access to critical data required for decision-making purposes. PRISM provides the OPP, specifically members of Commissioner's Committee, with the ability to intuitively plan, monitor, evaluate and control expenditures and activities.

Under the leadership of Deputy Commissioner Bill Currie, members of the OPP's Strategic Services Command worked with an Ontario-based company, ABS System Consultants, to develop PRISM. The OPP used ABS's performance measurement and decision support platform, Metrics3D, to act as the foundation for PRISM. Unlike virtually all other business intelligence technologies, Metrics3D is a comprehensive all-in-one package, combining numerous components normally requiring several different and often incompatible software products, including:
- Performance Measurement
- Decision Support
- Risk Management
- OLAP
- Benchmarking
- Scorecard
- Advanced Analysis and Reporting
- Dashboards
- Meta Data Dictionary Software

PRISM works by accessing data the OPP maintains in a number of disparate internal operational systems and cross-referencing it with other sources of data, including data from other police jurisdictions and socio-demographic census data. Users have the ability to explore their organization's data in a variety of ways, including comparative analysis, peer analysis, cross tab (OLAP), time series detailed graphs, benchmarking, and dashboards.

"PRISM allows our organization to analyze a number of critical factors as they relate to performance, quickly," stated Currie. "I now have the ability, on my desktop, to instantly look at our organization from a variety of different perspectives and evaluate the performance and effectiveness of our services," he continued.

According to Staff Sgt. Jeff Steers, PRISM's Project Manager, PRISM can be used by everyone in the organization, including those who are not very IT literate. "PRISM provides our people with more than the ability to just drill up and down in an existing report. The point-and-click interface allows them to easily create new reports and what-if scenarios from hundreds of indicators," he says.

Why did the PRISM implementation work so well? According to Staff Sgt. Steers, as with any successful implementation, there are several reasons. First, the OPP chose to use software that was designed specifically for the public sector and could be tailored to address the organization's unique priorities. Second, the OPP teamed up with experienced consultants who understood how to balance the complex relationship between the business priorities of the organization and what was technically feasible. Third, an executive leader was willing to think outside the box and champion a vision of what users truly required.

Deputy Commissioner Currie mentioned that the OPP are currently working to roll out the application to the rest of their organization via the web, including regional and local detachments. "PRISM is changing the way the OPP provides services and demonstrates how the organization is accountable to the Province of Ontario," Currie said.
Appendix 3b Municipal Performance Measurement Program
Performance Matters!

Welcome to the first edition of Performance Matters! This newsletter, produced by the Ministry of Municipal Affairs and Housing, is the first in a series on performance measurement in the municipal sector. We trust it will help readers better understand the concept and benefits of performance measurement. In the short term, we hope it provides opportunities for learning about how other municipalities use the practice and for sharing lessons learned. In the long term, we'd like it to contribute to the growing movement of performance measurement in the public sector.

Performance measurement is not a new concept. Numerous Ontario municipalities measure their performances in key service areas. Many have participated in performance measurement studies, sharing and comparing their results to learn new ways of improving service delivery. This newsletter highlights the initiatives of two municipalities: Thunder Bay and Guelph. Future editions will showcase the equally successful efforts of other Ontario municipalities. For more information, call (416) , e-mail [email protected] or visit the ministry's website.

Guelph

In January 1998, the City of Guelph established a performance measurement system. The city views performance measurement as a way to permit valid comparisons that lead to new and improved ways of doing business. David Kennedy, Guelph's director of finance and head of its performance measurement project, believes that performance cannot be judged as absolutely good or poor. For municipalities, the comparison of performances over time or between organizations is what really provides useful information on trends and alternative process systems.

With its new system, the city wanted to find out whether it was providing the right services at the right time and in the right quantity. The first steps were to establish a project team and set its objectives, which were to:
- run pilot projects in select service areas in 1998 and, from these, develop lessons learned to use in a subsequent city-wide rollout of the new performance measurement system;
- roll out the new system city-wide in the 1999 fiscal year;
- tie the performance measurement system to the city's new financial management and budgeting system; and
- tie the performance measurement system to the city's financial and non-financial strategic goals.

The first pilot projects were done in the areas of transit, planning, public works and museum services. Additional projects were developed for waste-water treatment, waterworks, solid-waste management, building inspection and parks and recreation services. Staff working in these areas decided what activities or indicators to measure and how to collect the data. They also chose time frames for measuring, methods for presenting and displaying the results and the goals they hoped to achieve. During the balance of 1998, staff collected and plotted monthly data. The city's project team met every second week to swap experiences, share information and solve any problems. Early in 1999, the team identified lessons learned and recommended a process for rolling out the system to other service areas. Among the lessons was the need to evaluate performance measurement initiatives in other municipalities as well as in the private sector. (continued on page 2)
Guelph (continued from page 1)

The pilot projects went well and taught participants a great deal. As Kennedy says, "The reason we had pilot projects was to learn from them. We documented the good, the bad and the ugly and shared that information to allow other departments a smoother transition into their performance measurement systems." Good results included improved productivity, communication and workflow scheduling. The team also identified new ways to make service delivery more efficient and effective. Overall, service delivery improved.

For example, the city's fleet-services division is responsible for maintaining more than 500 vehicles. It measured its performance in shop effectiveness by tracking the time spent on redoing repairs to get them right. It found that almost six per cent of work done in 1999 was spent redoing repairs. As a result, the division made improvements that reduced the number of additional repairs by more than 50 per cent.

Guelph has since made its performance measurement system integral to all its operations. It views the system as providing an excellent focus for identifying problems and effecting improvements. The city is also involved in forming a new association, the Municipal Performance Improvement Association, to encourage the development of new performance measures and the sharing of knowledge and experiences. For more information, contact David Kennedy by phone at (519) or by e-mail at [email protected].

Thunder Bay

In 1994, Thunder Bay created a division called management studies to help other city divisions and departments maximize their operating potential, improve their efficiency and effectiveness in service delivery and seek out best practices. In 1996, this division helped the finance department examine payment options for municipal water, property tax and telephone bills. This review was prompted by three factors:
- the growing number of choices available in the private and public sectors;
- the technological advances that reduced the cost of previously cost-prohibitive alternatives;
- the demands from customers for a broader range of options.

The division of management studies looked at what was involved in paying bills through banks, credit cards, debit cards, pre-authorized payments, telephone and mail. By benchmarking, or comparing processes used by other North American cities, it discovered that 93 per cent of municipalities accepted payments for tax and water bills through banks and that payment by credit card was the method used least. It also learned that the most efficient method was to pay bills through a bank; mail payments processed through banks cost less than mail payments processed through the city; and pre-authorized payment was the most cost-effective method if processing was done through the city.

It was evident that allowing customers to make payments at banks would improve service and save the city an estimated $92,723 a year. Local banks, however, were not keen on taking on this business because they were trying to reduce the volume of transactions made over the counter. (continued on page 3)
Thunder Bay (continued from page 2)

Faced with this obstacle, the city expanded its acceptance of pre-authorized payments. In measuring its performance with this new process, the city showed positive results. The cost per transaction fell to $0.19, compared with $0.56 by mail and $0.60 at city wickets, a reduction of roughly two-thirds. The city also added debit-card terminals at three locations to make the service more accessible. While the cost of processing payments through terminals was higher than most other payment options, the total annual cost was relatively low, and the added customer convenience outweighed the dollar costs. The city continues to measure costs per transaction and review customer payment options to continually improve its payment services. It has since introduced telephone banking and is considering credit-card and online-payment options. The results of this project are cost savings and improved customer service, thus greater efficiencies and more effective services.

The division of management studies measures how well it does at cutting costs for its clients. Its target is to find potential savings in excess of 200 per cent of its annual operating budget. It reports yearly on the number of projects completed, the savings identified and the overall level of client satisfaction. Since 1994, this division has met or exceeded its performance target. For example, in 1999, it received a client satisfaction rating of 87 per cent and identified more than $300,000 in cost savings, revenue and productivity opportunities. Its operating budget was $89,500. For more information, contact Kathleen McFadden by phone at (807) or by e-mail at [email protected].

Did You Know? The Municipal Performance Measurement Program

The Ministry of Municipal Affairs and Housing recently introduced the Municipal Performance Measurement Program. This program requires all Ontario municipalities to collect data on 35 broad performance measures in nine core service areas and to report to taxpayers on their performances. The service areas are garbage, water, sewage, transportation, fire, police, local government, land-use planning and social services, which, according to 1998 data, make up about 80 per cent of municipal operating costs. The measures indicate how effective and efficient municipalities are in delivering these services. The program's objectives include improving services and strengthening municipal accountability to taxpayers. Opportunities for improvement exist because this program will provide a common set of performance measures and a comprehensive, Ontario-based data inventory. Municipalities can access this information to share and compare performance results and to learn better practices. This year, municipalities must submit 2000 data through the ministry's financial information return by April 30, 2001, and report to their taxpayers by June 30, 2001. For more information and materials on the program, contact your local Municipal Services Office (Central, Southwest, Northwest, East or Northeast).
Other Performance Measurement Programs

Thunder Bay and Calgary are members of the International City/County Managers Association's (ICMA) Centre for Performance Measurement. The ICMA is a non-profit professional and educational organization based in Washington, DC. Some 120 cities and counties participate annually in the centre's comparative performance measurement program. They collect, analyse and compare their results on selected municipal performance measures. For more information, visit the ICMA Web site.

The Cordillera Institute is an independent research and public policy organization. One of its projects is called the Sharing and Comparing Initiative, a joint effort with the Ontario Good Roads Association and ten municipalities: North Bay, Burlington, Markham, Belleville, Kingston, Cornwall, Peterborough, Kitchener, Waterloo and Chatham-Kent. This project aims to give municipalities practical methods for achieving measurable savings in their operations while enhancing their effectiveness. Participating municipalities compare their performances and share best practices in the areas of general government and road maintenance. For more information, contact the institute at [email protected].

Performance Measurement Web Sites

Grande Prairie, Alberta: Grande Prairie's site includes issues and principles of performance measurement as well as a bibliography and links to resources.

Governmental Accounting Standards Board (GASB): From the GASB home page, click on performance measures to reach the GASB Performance Measurement Project. This site contains information about the use and reporting of performance measures for government services.

Government Performance Project (GPP): The GPP rates the performance management of local and state governments and selected federal agencies in the United States.

Performance Measurement Books

Performance Measurement: Getting Results, Harry Hatry, Urban Institute Press (Washington, DC, 1999). Information on this book can be found on the Urban Institute's Web site.

Also from the Urban Institute: How Effective Are Your Community Services? Procedures for Monitoring the Effectiveness of Municipal Services, 2nd edition, Hatry, Blair, Fisk, Greiner, Hall, and Schaenman, ICMA and The Urban Institute (Washington, DC, 1992).

Measuring Up: Governing's Guide to Performance Measurement for Geniuses (and other Public Managers), Jonathan Walters, Governing Books (Washington, DC, 1998). This book, written in plain language, takes a lighthearted approach to performance measurement. For more information, see the Governing Magazine Web site.

References to any organization, publication or Web site are for information purposes only. The Ministry of Municipal Affairs and Housing neither endorses nor condones the information that these sources provide. Issued by the Ministry of Municipal Affairs and Housing, March 2001.
55 Appendix 4 Outputs Outputs are the tangible items produced through activity. Because they are tangible, they can be recorded and counted. For example, number of customers served; number of cases closed and number of facilities built or serviced are all outputs. Outputs are essential to operational management; they may tell managers: If a project is progressing on time and on budget Whether the right customers are being served Output measures are very useful for projecting expenditures and assessing resource needs within specific time frames. However, output measures have limited value in assessing the degree to which the overall objectives of a project are being met. For example, Number of customers served does not tell a manager whether the needs of customers were met. That is why each output should be linked to a short-term outcome. Outputs in a Nutshell Advantages Easy to identify because they are tangible Easy to monitor because they involve counting and that means data can be readily available (if good data collection systems are in place) Useful for creating ratios (time: client served) that demonstrate efficiency or act like a thermometer and give operational managers timely signs that the activity is on track or going off the rails Useful for demonstrating progress towards an objective. For example, if healthcare in a region is poor and one reason is low access to health services, then number of health care providers may be a useful output to monitor to demonstrate progress toward increased access before the desired changes in the provider-patient ratio can be demonstrated. Limitations Output measures tell us something was done, but they aren t enough to assess whether the thing done was the right thing, or if it was done well. Purpose To demonstrate accountability for expenditures because they tell what tangibles were produced for what cost. To help inform short-term decision-making, for example related to the timelines and costs for project implementation or to monitor policy processes. They may be used for public reporting to demonstrate progress towards an objective is being made even though the desired results are not yet apparent. Who uses them? Output measures are most useful for operations. They may also be used by central agencies, for example, as part of quarterly reporting, or to demonstrate that progress towards an objective is being made Cautions Measuring a few critical outputs is more valuable than measuring many that are not useful. For example, a school-based activity aiming to reduce the number of school drop-outs may measure: number of students in the activity, ongoing attendance rates in the activity, student-teacher ratios, cost per student of extra support, etc., but activity managers may find that ongoing attendance is the real gauge of progress toward achieving the objective. Monitoring the other outputs may be unnecessary. Performance Measurement: A Reference Guide Appendix 4 53
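The ratios mentioned above that demonstrate efficiency can be computed very simply. The sketch below is an illustration only: it pairs a hypothetical output count (clients served) with hypothetical inputs (cost and staffing) to produce cost per client served and clients served per full-time staff member. The program names and all figures are invented.

```python
# Minimal sketch: turn output counts and inputs into simple efficiency ratios.
# All program names and figures are hypothetical, for illustration only.

programs = [
    {"name": "Intake office A", "clients_served": 1240, "cost": 316000.0, "fte": 6.0},
    {"name": "Intake office B", "clients_served": 980,  "cost": 287000.0, "fte": 5.5},
]

for p in programs:
    cost_per_client = p["cost"] / p["clients_served"]   # efficiency: dollars per output
    clients_per_fte = p["clients_served"] / p["fte"]     # efficiency: outputs per input
    print(f"{p['name']}: ${cost_per_client:.2f} per client served, "
          f"{clients_per_fte:.0f} clients served per FTE")
```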
Outcomes

Outcome performance measures help to demonstrate the extent to which each ministry's activities contribute to the achievement of ministry objectives and government priorities. Outcome measures may be short-term or intermediate-term. Short-term outcome measures track the immediate impact of activities; e.g., changes in compliance rates or degree of customer satisfaction. Intermediate-term measures track changes in behaviour, decisions, policies and social action that result from government-funded activities; e.g., increased employability as a result of a training activity.

Outcome measures cannot be separated from objectives because they are intended to measure the degree to which objectives are being met. Therefore, in order to ensure that the performance measures identified are the right ones, it is important to have clearly articulated objectives. Outcome measures provide the information needed for planning, setting policy direction and allocating resources among competing priorities. Outcome measures should also be used by ministries to highlight activity successes and to help manage risks that may affect the ability to meet objectives.

Outcomes in a Nutshell

Advantages: Outcomes are focused on the impact of an activity, so they provide the strongest evidence of whether or not objectives are being met. Outcome measures provide the right level of information to support strategic planning and decision-making.

Limitations: Developing useful outcome measures is a challenge. Outcome measures are more useful for strategic decision-making than for time-sensitive project management.

Purpose: Outcome measures serve all four purposes of measurement: accountability for expenditure, informing decision-making, public reporting and continuous improvement. Outcome measures demonstrate accountability for the results achieved with expenditure, not just the products produced.

Who uses them? Outcome measures are used at all levels of the organization. Activity managers use outcome measures to support continuous improvement, to ensure that activities continue to meet the needs of customers and to align activities with government priorities. Ministries use outcome measures to justify to central agencies what resources should be allocated and where. Cabinet committees and the Cabinet use outcome measures to inform policy decisions and government priorities.

Cautions: Make sure that the outcome measure relates to objectives and is integrated into the design of the strategy.
57 High-Level Indicators High-level indicators measure societal-level change; for example, life expectancy or economic growth. Many factors affect high-level indicators and it is rarely possible to identify a single cause for change. In other words, it is difficult to attribute change to a single cause or solely to the activities of government initiatives. There are too many stakeholders and variables involved to hold ministries accountable for high-level indicators. However, because high-level indicators provide an overview of socio-economic conditions and government has a role to positively affect those conditions, it is important that ministries track high-level indicators to which their strategies contribute. Furthermore, government priorities, such as Better Student Achievement may require that high-level indicators be tracked in addition to other ministrybased performance measures. High-Level Indicators in a Nutshell Advantages Have often been identified already, have been broadly accepted as useful and can be adopted by a jurisdiction or government with little or no adjustment. Most data and data collection issues have been identified and articulated, which facilitates the adoption of such measures. Data is often collected by another organization, e.g., Statistics Canada, which reduces the resources required to monitor the high-level indicator. Provides ability to compare with other jurisdictions/governments. Helps managers ensure that their activities are aligned with higher-level objectives and strategies (e.g., government priorities). Limitations In many (but not all) cases it is difficult to attribute the result to any particular action. Rather, the result is frequently produced by interactions among a variety of activities. Can only rarely be used to demonstrate the success of a particular activity. Purpose Primarily used as part of public reporting, particularly to make general crossjurisdictional comparisons about the impact of government activities. May also be used to highlight the need for government intervention in a particular area. Who uses them? Ministers and others use them in identifying government priorities, for making comparisons among jurisdictions and for highlighting positive change during their administrations. Senior management in ministries may use them to support re-focusing the strategies or activities of a ministry. Cautions It is sometimes tempting to use high-level indicators to attribute greater success than is warranted to an activity. For example, one year there was six per cent growth in a particular economic sector (a high level indicator). The same year, the government had targeted spending to improve growth in that sector. However, the six per cent sector growth may have been spurred by a depreciation of the domestic currency, rather than government investment. Therefore, the ministry would have difficulty claiming that, as a result of government investment, economic growth in x sector was six per cent. Performance Measurement: A Reference Guide Appendix 4 55
58 Appendix 5 Sample Logic Models Performance Measurement: A Reference Guide Appendix 5 56
59 Clients & Program Objectives: Domestic Violence Services of Greater New Haven, Inc. 24-Hour Hotline Program Logic Model The hotline provides victims of domestic violence with 24-hour, seven day a week access to a trained Domestic Violence staff person who provides free, confidential telephone counselling focused on domestic violence issues. This approach works because it protects a woman s confidentiality while providing her with an opportunity to be heard and validated which is essential in order for her to take any next steps toward a life without violence. Long-term Outcome Caller exercises more control in her significant relationship. Intermediate Outcomes Caller gains more emotional stability Caller develops a safety plan Caller develops a support network Caller able to think more clearly Caller identifies next steps Caller identifies other social supports Caller manages feelings about immediate crisis Caller identifies that she has rights and choices Caller reports decreased sense of isolation Initial Outcome Caller s anxiety is lowered [EMOTIONAL] Caller s perception of herself as victim validated [COGNITIVE] Caller provided immediate support [SOCIAL] Outputs 2500 annual calls received/responded to Activities Staff are available to speak to victims of domestic violence 24 hours/day, 7 days/week. Immediate telephone counselling is provided to assess the caller s need for emotional support, safety planning and referrals Inputs (1) 5 Certified Battered Women s Counselors with annual certification training (2) 1 Master s level supervisor (3) Telephone lines, answering service, cell phones, pagers (4) Office supplies paper, pens, clips, files, rolodex (5) Listings of community resources (6) Standards review by CT Coalition Against Domestic Violence
60 Chart 2: Logic Model for Service System Management Goals Outcomes Outputs Activities Inputs Strategy Ministry-funded programs are delivered though effective systems of services Clients are able to access services: Clients are able to easily find out about services. Clients are able to easily get to services or services are brought to the client. Clients receive services in a language they understand. Services are client-focused. Clients receive services in a way that respects their beliefs and practices. Clients are offered the range of services and supports they need. Services are coordinated: Within the system of services, service providers know about each others' services, involve other providers when appropriate, and have good working relationships Accessible Services Products that are culturally acceptable Service plans that address the range of client needs, referrals, case conference notes Coordination protocols, joint agency activities Market services, arrange transportation, provide services at home, locate services in convenient locations, schedule work, provide translation Determine cultural needs, modify practices, provide products that are culturally acceptable Assess clients, provide direct services, refer clients to other needed supports and services Providers communicate on a regular basis. Providers develop common tools. Providers develop shared goals. Policies, funding strategies, protocols, guidelines, directives, FTEs, $s Service system managment Coordinated mangement: Funders have a shared understanding of overarching goals. Policies, plans and funding strategies complement each other. Communications across ministries are consistent with each other. Joint policies, decisions, plans, communications. Jointly develop policy, make decisions, and plan services. Coordinate communications. Horizontal Management Appendix 5a: United Way of Greater New Haven, Hhttp:// Appendix 5b: Ministry of Community and Social Services, Corporate Policy and Intergovernmental Affairs Branch, Performance Measurement: A Reference Guide Appendix 5 58
Chart 3: Logic Model for Governance and Management

Goal
As a result of good governance and management, clients and communities receive affordable and effective services that reflect provincial priorities.

Sub-goals (tracked through performance measures)
1. The ministry produces high-quality policy and operations products.
2. Government-funded organizations demonstrate accountability to funders and to the public.
3. Service system managers achieve value for money.

Outcomes (tracked through outcome measures)
- Policy and operations products and strategic plans are informed by results, environmental information, and stakeholders.
- The public, stakeholders, and partners receive information about programs and results.
- Ministry-funded services are delivered effectively; service targets are achieved or exceeded.

Outputs (tracked through volume measures and quality measures of business processes and products, including compliance)
- Steering: transformation vision; policy documents; strategic plans; guidelines and resources; planning tools for agencies.
- Information: performance/evaluation reports; consultation information.
- Service delivery: service contracts; budgets; directly-delivered services; information resources; performance/evaluation information; local plans.

Activities
- Provincial/regional level: consult stakeholders; develop policy products; define expectations and risks; plan services at the regional/provincial level; identify risks; negotiate contracts with the municipal level and agencies; allocate resources; deliver direct services; monitor/evaluate performance; implement corrective action; provide support to service providers; information management; public information.
- Community/agency level: plan agency services; negotiate contracts with the province, municipal level and providers; allocate resources within the agency; deliver services; monitor/evaluate performance; implement corrective action; information management; provide information to the provincial/municipal levels.
- Municipal level: define expectations; plan services at the municipal level; identify risks; negotiate contracts with the province and agencies; allocate resources to agencies/providers; deliver services; monitor/evaluate performance; implement corrective action; provide support to service providers; information management; provide information to the provincial level; public information.

Inputs
- Ministry (MCSS/MCYS).
- Agencies.
- Municipal-level CMSMs and DSSABs.
- Government policy, directives, guidelines and resourcing decisions.
- Other ministries; the federal government.

Sources: Appendix 5a - United Way of Greater New Haven; Appendix 5b - Ministry of Community and Social Services, Corporate Policy and Intergovernmental Affairs Branch.
Appendix 6: Step-by-Step Instructions for Creating a Logic Model

Figure 2: A Standard Logic Model Diagram

- Larger Public Interest: government priorities are usually here.
- Customers: who benefits.
- Objectives of the Strategy: what does the strategy hope to accomplish.
- Inputs: list all inputs.
- Activities: list all activities.
- Outputs: list the tangible products of the activities.
- Desired Short-Term Outcomes: list the changes in participation, awareness, behaviour, compliance or capacity that are expected to result from the activities in the short term.
- Desired Intermediate Outcomes: list the benefits or changes in conditions, attitudes and behaviour that are expected to result from the activities in the intermediate term.

There are seven steps to creating a logic model:
1. Defining the public interest served by the ministry's strategy.
2. Defining who are the strategy's primary customers.
3. Defining the objectives of the strategy.
4. Identifying the inputs.
5. Defining activities.
6. Defining outputs.
7. Defining the desired short- and intermediate-term outcomes.
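The structure summarized in Figure 2 can also be captured in a simple planning record. Below is a minimal sketch, assuming Python, of one way a program area might hold the elements of a logic model while working through the seven steps; the class, its field names and the worked example (which reuses the ESL reading-support illustration from this appendix) are illustrative only, not a prescribed tool or an actual ministry program.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    """Illustrative container for the elements of a standard logic model."""
    public_interest: str                          # larger public interest / government priority
    customers: List[str] = field(default_factory=list)
    objectives: List[str] = field(default_factory=list)
    inputs: List[str] = field(default_factory=list)
    activities: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    short_term_outcomes: List[str] = field(default_factory=list)
    intermediate_outcomes: List[str] = field(default_factory=list)

# Hypothetical example for explanatory purposes only.
reading_support = LogicModel(
    public_interest="Children succeed in school",
    customers=["ESL children aged 6-12"],
    objectives=["Children who complete the activity can read in English at a grade nine level"],
    inputs=["Tutors", "Out-of-school space", "Reading materials"],
    activities=["Out-of-school reading support sessions over six years"],
    outputs=["Number of sessions delivered", "Number of children enrolled"],
    short_term_outcomes=["Improved conditions to support learning"],
    intermediate_outcomes=["All ESL children entering grade 9 can read at a grade 9 level"],
)
print(reading_support.objectives[0])
```

Writing the model down this way makes gaps visible as the steps are worked through: an activity with no objective beside it, or an objective with no outputs or outcomes beneath it, signals that the logic needs another look.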
Define Public Interest (Government Priorities and Other Important Public Interest)

Answering the question "what is the public interest?" involves thinking about the higher purpose of the strategy. Some people will think of this as the vision, others as the mission. Vision and mission are different things, and that is why they need to be discussed - to ensure that everyone involved really does agree about the overall purpose of the strategy and the public interest being served.

Where the strategy clearly relates to a government priority, this work is made easier because the priority provides the wording for the public interest being served. Where the strategy serves another important public interest, the ministry will need to identify that purpose and may need to involve other ministries in the discussion of how that purpose will be articulated - for example, in relation to government's regulatory role.

Define Primary Customers

The next step is to clearly define who is going to benefit from, or be directly impacted by, the strategy. All government strategies will have a long list of internal and external customers, but it is important to clearly identify which one(s) are intended to benefit from the strategy. Keep in mind that there should be a clear link between who benefits and what public interest is being served.

Define Objectives

An objective is a statement of what the strategy wants to accomplish. Clear objectives:
- provide purpose and direction;
- motivate activity delivery and action;
- help to provide guidance and plan training;
- increase productivity and efficiency;
- make the link between the initiative and the public interest.

Good statements of objectives are specific, include an action and leave little room for interpretation. To use a very simple example: an activity provides out-of-school support over six years to ESL children aged 6-12, to ensure they are reading at a high school level by the time they enter high school. "Reading at a high school level" is only the beginning of a good statement of the objective. "Children who complete the activity will be able to read in English at a grade nine level by the time they complete the activity" is specific, contains an action and leaves little room for interpretation. Staff or customers new to the activity, other organizations and Management Board of Cabinet will know what the activity is trying to achieve. Notice that clear objectives make identifying performance measures much easier!

CAUTION: Make sure that the objectives of the strategy align with the public interest and customer groups that have been identified.
Identify Inputs

Inputs are all the resources available to achieve the objectives of the strategy. In developing a logic model, it is always useful to list the inputs available because:
- There may be resources available that haven't been considered, such as external research, because they aren't used on a continual basis.
- Resources limit what is possible to achieve, so they force the question, "Can the objectives be met with the resources available?" If the answer is no, the objectives may need to be reconsidered or inputs may need to be used more creatively.

Define Activities

Activities are the actions undertaken by the ministry directly, or by broader public sector organizations on behalf of the ministry, to meet the objectives of the strategy. Identify as activities any programs that help to meet the objectives of the strategy. There may be a number of activities related to each objective, and there may be activities that contribute to meeting more than one objective.

This is the stage when the value of the logic model becomes clear. If there are activities that don't align with an objective, then perhaps those activities aren't really necessary. Similarly, if there are several activities contributing to an objective, perhaps those activities can be better aligned to avoid duplication and free up resources for other purposes. The logic model helps to ensure that:
- all activities that contribute to meeting objectives are listed;
- all activities listed do meet objectives.

Define Outputs

Outputs are the tangible things produced by the activities. Outputs might include, for example, fines, books, beds, waiting lists, inspections, consultations, customers or response times. In other words, outputs are things that can be counted easily. List all the outputs that each activity is expected to produce. Then go through the list of outputs and decide which outputs really help to meet objectives. If an output isn't necessary to meet the objectives, then perhaps it is not needed and the inputs used to create it could be better used in another way. Or, perhaps the output is necessary, but contributes to meeting the objectives of a different strategy; then perhaps the activities that produce the output need to be reconsidered to better align them with the strategy.

HINT: Because outputs can be counted, they are easily converted into performance measures. When the logic model is completed, identify which outputs really demonstrate that progress is being made towards meeting the objectives. Those outputs are the ones that should be tracked as output performance measures.

Define the Desired Short-term and Intermediate-term Outcomes

In some ways, stating the desired outcomes is similar to stating the objectives, and in this way, a logic model is less a linear process than a cyclical one. However, desired outcomes may be staged elements of the objectives. Let us return to the earlier example of an activity targeted to ensuring that the English reading skills of ESL children aged
six to twelve reach a grade nine level by the time they complete the activity. Educators creating a logic model for the activity may know that the children in the activity face barriers to reading other than language skills - learning disabilities, poor physical health or low self-esteem. Thus, in order to achieve the objective, a number of precursors to learning may also need to be achieved. Therefore, the desired short-term outcomes may relate to improved conditions to support learning. The desired intermediate-term outcome of the activity, implemented over a number of six-year cycles, may be to ensure that all ESL children entering grade nine are able to read at a grade nine level.

Notice that, in this case, the desired short-term outcome needs to be reached before the objective can be met. The desired intermediate-term outcome relates to meeting the objective on a continuous basis. In other words, desired short-term outcomes relate to the impact of the activity and desired intermediate-term outcomes relate to the achievement of the objectives. Achievement of the objectives, in turn, contributes to achievement of government priorities, which may be measured through high-level indicators.

SHORT-TERM OUTCOME = IMPACT OF THE ACTIVITIES
INTERMEDIATE-TERM OUTCOME = MEETING THE OBJECTIVES

Some people find it helpful to apply timeframes to short-term (1-3 years) and intermediate-term (3-5 years) outcomes, but timeframes are only a guide and will vary across activities.

HINT: Identifying desired outcomes helps to identify outcome-based performance measures. The difference between a desired outcome and an outcome measure is that:
- A desired outcome is a statement of what the activity wants to achieve, related to the objectives (e.g., all ESL children entering grade 9 will be able to read in English at a grade 9 level).
- An outcome measure is a statement of what will be measured to demonstrate progress towards the desired outcome (e.g., percentage of ESL children entering grade 9 who pass a standardized English reading test).

Note: These are examples used for explanatory purposes only and are not suggestive of practices or measures to be adopted.
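To make the distinction concrete, here is a minimal sketch, assuming Python, of how the outcome measure in the hint above - the percentage of ESL children entering grade 9 who pass a standardized English reading test - could be calculated from assessment records; the records shown are invented for illustration and are not real data.

```python
# Hypothetical assessment records: one entry per ESL child entering grade 9.
# True means the child passed the standardized English reading test.
assessment_results = [True, True, False, True, True, False, True, True]

def outcome_measure(results):
    """Percentage of ESL children entering grade 9 who pass the reading test."""
    if not results:
        return None  # no data collected yet
    return 100.0 * sum(results) / len(results)

value = outcome_measure(assessment_results)
print(f"Outcome measure: {value:.1f}% of ESL children entering grade 9 passed the test")
# The desired outcome (all children reading at a grade 9 level) corresponds to
# this measure reaching 100%; the measure itself only tracks progress towards it.
```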
Appendix 7: Measures Checklist

- Relevance: Is the measure actually a good measure of the objectives the strategy intends to achieve?
- Validity: Does the measure actually measure what it is supposed to?
- Reliability: Do different users of the same measure report the same result?
- Verifiable: Will the measure produce the same data if measured repeatedly?
- Attribution: Does the measure relate to factors that the ministry can affect?
- Clarity: Is the measure clearly defined and easily understood?
- Accuracy: Does the measure provide correct information in accordance with an accepted standard?
- Cost Effectiveness: Is the value of the measure greater than the data collection costs?
- Sensitivity: Is the measure able to measure change?
- Timeliness: Can data be collected and processed within a useful timeframe?
- Comparability: Can the data be compared with either past periods or with similar activities?
- Consistency: Does the data feeding the measures relate to the same factors in all cases at all times?
- Integrity: Will the measure be interpreted to encourage appropriate behaviours?
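For teams screening a long list of candidate measures, the checklist can be applied systematically. Below is a minimal sketch, assuming Python, of one way to record yes/no answers against the criteria above and flag where a candidate measure falls short; the candidate measure and the answers are hypothetical, and the criteria names simply repeat the checklist.

```python
CHECKLIST = [
    "Relevance", "Validity", "Reliability", "Verifiable", "Attribution",
    "Clarity", "Accuracy", "Cost Effectiveness", "Sensitivity",
    "Timeliness", "Comparability", "Consistency", "Integrity",
]

def review_measure(name, answers):
    """List the checklist criteria a candidate measure does not yet satisfy."""
    gaps = [criterion for criterion in CHECKLIST if not answers.get(criterion, False)]
    if not gaps:
        return f"{name}: meets all checklist criteria"
    return f"{name}: needs work on " + ", ".join(gaps)

# Hypothetical screening of one candidate measure.
answers = {criterion: True for criterion in CHECKLIST}
answers["Timeliness"] = False    # data only available 18 months after year end
answers["Attribution"] = False   # result driven largely by factors outside the ministry

print(review_measure("Percentage of clients reporting improved access to services", answers))
```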
Appendix 8: Example of How Benchmarking Can Be Used

Performance Measures Count in Tight Budget Times

Bob Layton, city manager, and Don Gloo, assistant to the city manager of Urbandale, Iowa, recently used performance measurement to support a request for additional financial resources during a tight budget season. According to Layton, a request for additional police officers had been declined when the performance measurement information submitted to support the request - on crime rates, case clearance rates, and citizen survey data - did not justify additional staffing.

Urbandale used benchmarking to compare its performance on crime rates and clearance rates to more than 125 other cities in North America. The benchmarking results showed that Urbandale's performance on case clearance was outstanding (see graph below).

"Throughout the entire budget process, financial constraints forced us to make difficult decisions and considerable compromises on priorities. Having good performance measures makes it easier to defend our resource allocation decisions," Gloo said.
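The comparison behind an exercise like this is simple to reproduce. Below is a minimal sketch, assuming Python, of ranking one jurisdiction's case clearance rate against a set of peer values to see where it stands; all of the figures are invented for illustration and are not Urbandale's or any other city's actual results.

```python
# Hypothetical case clearance rates (%) for a set of comparison cities.
peer_clearance_rates = [22.0, 25.5, 31.0, 28.4, 19.7, 35.2, 27.1, 30.8, 24.3, 29.9]
our_clearance_rate = 41.5

def percentile_rank(value, peers):
    """Share of peer jurisdictions performing at or below the given value."""
    return 100.0 * sum(1 for p in peers if p <= value) / len(peers)

rank = percentile_rank(our_clearance_rate, peer_clearance_rates)
print(f"A clearance rate of {our_clearance_rate}% sits at the {rank:.0f}th percentile of the peer group")
```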
