Organisational Performance Measurement and Reporting Guide



Australian Capital Territory, Canberra, April 2013

This work is copyright. Apart from any use permitted under the Copyright Act 1968, no part may be reproduced by any process without written permission from the Territory Records Office, Shared Services, Commerce and Works Directorate, ACT Government, GPO Box 158, Canberra City, ACT, 2601.

Produced by: Policy and Cabinet Division, ACT Chief Minister and Treasury Directorate
Contact: Director, Economic, Regional and Planning, Policy and Cabinet Division, Chief Minister and Treasury Directorate, ACT Government, GPO Box 158, Canberra City ACT 2601
Phone: 13 22 81
Email: policydivision@act.gov.au

This publication has been prepared by officers of the Chief Minister and Treasury Directorate, ACT Government. It is believed that the information is correct and reliable, but neither the authors nor the Directorate give any warranty in relation thereto, and no liability is accepted by the authors, the Directorate, or any other person who assisted in the preparation of the publication, for errors and omissions, or for loss or damage suffered as a result of any person acting in reliance thereon.

Last updated: 14 March 2013

Contents

List of tables and figures
Introduction
1. Plan
2. Select performance indicators
3. Conduct the program
4. Collect data and monitor performance
5. Analyse and report
6. Evaluate and modify
Resources
Attachment A: Strategic planning template
Attachment B: Characteristics of good performance indicators
Attachment C: Template for selecting performance indicators
Attachment D: The ACT's current performance measurement reporting framework

List of tables and figures

Table 1: Steps in developing program logic
Table 2: What is being measured?
Table 3: Examples of common measures at each stage of the results logic model
Figure 1: Step-by-step guide to performance measurement
Figure 2: Relationship between resource allocation and outcomes in program logic
Figure 3: Example program logic diagram and corresponding performance indicators for Transport for Canberra
Figure 4: Characteristics of effective performance reporting
Figure 5: Template for future directions table
Figure 6: Template for selecting performance indicators
Figure 7: Mapping of government performance planning and reporting

Introduction

The ACT Government has a robust and integrated performance management framework, Strengthening Performance and Accountability: A Framework for the ACT Government.[1] Performance management frameworks integrate organisational strategic management, performance measurement, monitoring, evaluation and reporting.[2] Performance measurement improves Directorates' ability to track their progress towards achieving outcomes for Canberra.

Purpose

This step-by-step performance measurement guide (the Guide) will assist Directorates in selecting and designing appropriate performance indicators to strengthen current program and strategy management. This will introduce further consistency across Government in planning, as well as in measuring and evaluating performance. The Guide outlines a method for identifying outcomes and relevant outputs, developing appropriate performance indicators and strengthening performance reporting, leading to more effective and efficient programs and strategies.

This Guide complements ACT Treasury's Guide to the Performance Measurement Framework.[3] Treasury's advice assists Directorates in developing performance measures specific to the Budget process and statutory reporting requirements. This Guide forms part of a broader performance management process and is integrated with planning, reporting, monitoring and evaluation. Using both guides will ensure Directorates measure and report a complete picture of performance, including effective reporting against the accountability requirements of the Government and the community.

Structure

Performance measurement is structured around the six steps shown in Figure 1:
1. Plan.
2. Select performance indicators.
3. Deliver program.
4. Monitor performance.
5. Report on performance.
6. Evaluate.

[1] ACT Chief Minister's Department 2011, Strengthening Performance and Accountability: A Framework for the ACT Government, ACT Government, Canberra.
[2] Department of Premier and Cabinet 2012, A Guide to the Queensland Government Performance Management Framework, Queensland Government.
[3] ACT Treasury Directorate 2012, Guide to the Performance Measurement Framework, ACT Government, Canberra.

Figure 1: Step-by-step guide to performance measurement

Planning is central to good program design and effective measurement. Program logic can be used to assist planning. This includes:
1. defining or selecting outcomes
2. defining impacts
3. identifying outputs
4. identifying resources.

Indicators must:
- measure performance
- refer to a result rather than being descriptive
- be well-defined, relevant and informative
- be within the control or influence of Directorates
- be available, timely and cost-effective
- be comparable with a benchmark or with previous results over time.

The program is conducted in line with the program logic.

Data is then gathered to monitor performance. Data considerations include:
- gathering relevant data by set timeframes
- ensuring the data is accurate, comprehensive and comparable.

Reporting must:
- explain the difference between planned performance and actual performance
- provide an unbiased and complete picture of overall performance
- identify assumptions, gaps and variances, and provide explanations of them
- present information clearly and concisely.

If performance measurement indicates that the program or strategy is not effective, evaluate the program or strategy and consider modifying it.

1. Plan

Good planning does not guarantee good performance, but it can assist in developing more robust performance measurement systems and finding a clearer course of action. A useful tool in planning is program logic, which involves aligning top-level Government outcomes, Government priorities, Directorate services and costs. Figure 2 outlines this relationship.

Figure 2: Relationship between resource allocation and outcomes in program logic

Identifying program logic

Program logic is the first step in robust performance measurement because indicators can be developed for each level so that a clear line of sight emerges. Program logic assists in clarifying the context, logic and purpose of the program by taking a structured and deliberative approach to planning and program design. This can be done by following the steps in Table 1.[4]

Program logic, as outlined below, refers to the process of planning rather than its output. Program logic diagrams summarise the information developed through the program logic process. Program logic diagrams for strategies will summarise the significant work required to map impacts and inputs to outcomes. An example of a program logic diagram template is at Attachment A.

[4] State Services Commission and The Treasury 2008, Performance Measurement: Advice and Examples of How to Develop Effective Frameworks, New Zealand Government, http://www.ssc.govt.nz/performance-measurement.

Table 1: Steps in developing program logic

1. Select and clearly define outcomes. Outcomes are what the Government is working to achieve, and should be clearly defined as early as possible in the planning process. Outcomes, both intermediate and long-term, do not specify what is being provided, but rather the impacts expected after outputs are delivered. Outcomes may be identified by looking at the Canberra Plan and supporting high-level plans.

2. Define impacts. Impacts are the effects of the Directorate's outputs, or what must happen to achieve the outcome. One key to performance measurement is to define the impacts in a way that represents the results expected from your outputs. Define your impacts as specifically as possible so that they can be measured against; this can be done by describing what you actually want to achieve.

3. Identify outputs. Outputs are how you achieve the impacts and outcomes. Outputs must be tangible, such as services delivered or goods produced by, or on behalf of, the Directorate for the community, allowing agencies to measure the cost or time taken to provide them.

4. Identify resources. Resources are the inputs used to produce outputs, such as funding and staff. Programs must be linked back to resources so decision-makers can assess value for money.
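The four levels in Table 1 can be sketched as a simple data structure. This is an illustrative sketch only; the class and field names are this Guide's plain-language terms, not part of any ACT system:

```python
from dataclasses import dataclass, field

@dataclass
class ProgramLogic:
    """Illustrative container for the four levels in Table 1."""
    outcome: str                                        # what the Government is working to achieve
    impacts: list = field(default_factory=list)         # effects expected from the outputs
    outputs: list = field(default_factory=list)         # goods/services delivered
    resources: list = field(default_factory=list)       # inputs, e.g. funding and staff

    def line_of_sight(self) -> list:
        """Trace the 'clear line of sight' from outcome down to resources."""
        return [self.outcome, *self.impacts, *self.outputs, *self.resources]

# Hypothetical example loosely based on the transport example later in this Guide
logic = ProgramLogic(
    outcome="An integrated, reliable and accessible transport system",
    impacts=["Decrease wait times for public transport"],
    outputs=["Increase the frequency of bus services"],
    resources=["Funding", "Staffing"],
)
print(logic.line_of_sight())
```

Capturing the levels in one structure makes the later steps easier: each entry in the structure is a candidate attachment point for a performance indicator.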

2. Select performance indicators

This section provides more information on selecting appropriate and measurable performance indicators. The performance of programs or strategies should be measured at each level of the logic model. Characteristics of good performance indicators are provided at Attachment B. Table 2 defines what is being measured at each level, and the equivalent indicator in the ACT Government context.

Table 2: What is being measured?

Outcome measures: track our achievement of strategic goals.

Impact measures: track progress towards outcomes, assess the differences occurring in the short and medium term, confirm the right mix of outputs and assess cost-effectiveness. Impact measures indicate the success of the program and provide feedback on performance by linking outputs to outcome levels.

Output measures: typically assess quality, quantity, targeting, timeliness, location, cost and coverage. Comparable information also allows your Directorate to assess how production and efficiency have changed, and to test whether impacts or outcomes changed as predicted.

In the ACT Government context, strategic indicators measure the Government's performance against outcomes and impacts, while accountability indicators measure a Directorate's performance in delivering the outputs identified in the Budget papers.

Now that we know what is being measured, we need to know how to measure it. Table 3 outlines common indicators at each level of program logic, including impact measures across both the short (immediate) and medium term.

Table 3: Examples of common measures at each stage of the results logic model [5]

Output
- What is measured: efficiency of process; economy; quantity; coverage; access; quality; appropriateness.
- How it is measured: cost of delivering output (input per output unit); number of people served; number of people receiving training/service/output; number of targeted group not using service/output; % of output meeting specification; % of people who would recommend/re-use output/service.

Impact: immediate
- What is measured: effectiveness; completion rate; knowledge retained; reduction in queue; receipt of benefits; unintended effects.
- How it is measured: % of people with post-school qualifications compared with national average; % of people with computers at home; % of people changing their behaviour after use of service/output; % of people better off after use of service/output.

Impact: medium term
- What is measured: effectiveness; behaviour change; risk reduction; lifestyle change; survival; unintended effects.
- How it is measured: fewer drunken drivers/bad incidents; % in jobs/new career/crime free.

Outcome
- What is measured: total improvement; program effectiveness; more equity.
- How it is measured: improved health/wealth/happiness; fewer deaths/accidents/people in prison; decrease in welfare dependency/fewer kids in care.

Putting theory into practice, Figure 3 outlines a program logic diagram with corresponding performance indicators. Please note this is an example only. A template for planning individual outcomes associated with strategies and selecting appropriate performance indicators is provided at Attachment C.

[5] Adapted from State Services Commission and The Treasury 2008, Performance Measurement: Advice and Examples of How to Develop Effective Frameworks, New Zealand Government, http://www.ssc.govt.nz/performance-measurement.
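The first output measure in Table 3, cost of delivering output (input per output unit), is straightforward arithmetic. A minimal sketch, with entirely illustrative figures:

```python
def cost_per_output_unit(total_input_cost: float, output_units: int) -> float:
    """Table 3 efficiency measure: input cost divided by units of output delivered."""
    if output_units <= 0:
        raise ValueError("output units must be positive")
    return total_input_cost / output_units

# Illustrative only: $1,200,000 of inputs delivering 40,000 service units
# gives a unit cost of $30 per unit.
print(cost_per_output_unit(1_200_000, 40_000))
```

Tracking this figure across reporting periods is what lets a Directorate "assess how production and efficiency have changed", as Table 2 puts it.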

Figure 3: Example program logic diagram and corresponding performance indicators for Transport for Canberra [6]

Outcomes
- Program logic: to create a transport system that is integrated, active, sustainable, reliable, safe, accessible, efficient and cost-effective.
- Performance indicator: percentage of people within a 15 minute walk of a centre or rapid transit corridor.

Impacts
- Program logic: decrease travel time on public transport; decrease wait times for public transport; increase the reliability of public transport; meet the coverage needs of the public transport service network; increase the patronage of public transport; increase participation in active transport.
- Performance indicators: average wait time; average travel time; on-time running; percentage of ACT households within a 500m walk of a bus stop; percentage of trips by walking, cycling and public transport.

Outputs
- Program logic: increase public transport priorities and vehicles, and adopt a rapid services target speed of 40 km/h; increase the frequency of bus services; increase the accessibility of public transport.
- Performance indicators: number of additional bus lanes and rapid services; frequency of bus services; percentage compliance with the Disability Standards for Accessible Public Transport 2002.

Inputs
- Program logic: funding; staffing; infrastructure; assets; planning; procurement.
- Performance indicators: audited financials; full-time equivalents; strategic asset management plan.

[6] This example is an adaptation based on outcomes identified in Transport for Canberra. The indicators, impacts, outputs and inputs have been assigned for the purposes of this Guide only.

3. Conduct the program

The program is delivered to effect change and should be implemented in line with the program implementation plan. Service delivery is the core function of the ACT Government; for more information on service delivery, see the ACT Government Strategic Service Planning Framework.[7]

4. Collect data and monitor performance

Program planning, design and the selection of indicators are key components of a performance measurement system. However, these steps are essentially meaningless if data is not collected against each of the indicators. Measuring performance requires the timely and relevant collation and analysis of data. Data must be gathered by set timeframes and must be accurate, comprehensive and comparable.

When analysing data to get a meaningful picture of performance, comparisons must be considered. This can be done either year by year or, where relevant, by benchmarking or by simple comparisons of groups that did and did not receive outputs. Note, however, that there are risks associated with benchmarking and comparisons: external factors beyond the Directorate's control may change and also influence performance.

Where does data come from?

There are many sources of program-relevant data. Data may already be collected and released by a centralised body, such as the Australian Bureau of Statistics or the Productivity Commission. When using such data, care must be taken to ensure that it is frequently updated and relevant to the ACT's scale. Some data may be gathered internally within Government, and some indicators and programs may require new data to be gathered. This data must be checked for integrity and appropriateness before being analysed and reported.

Monitoring performance

Data is collected over time so that performance can be monitored to answer the following questions: Has performance changed over time? If so, by how much, and in which direction? This is the basis for performance monitoring. Regular and complete monitoring of performance can effectively manage risk and strengthen accountability.

[7] ACT Chief Minister and Cabinet Directorate 2012, Achieving Results for the Community: An ACT Government Strategic Service Planning Framework, ACT Government, Canberra.
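The three monitoring questions above can be answered mechanically once indicator values are collected in time order. A minimal sketch, with illustrative figures:

```python
def monitor(series: list) -> dict:
    """Compare the latest observation with the previous one.

    Answers the three monitoring questions: has performance changed,
    by how much, and in which direction? `series` holds indicator
    values in time order. Assumes a higher value is better; reverse
    the labels for indicators where lower is better (e.g. wait times).
    """
    if len(series) < 2:
        return {"changed": False, "delta": 0.0, "direction": "insufficient data"}
    delta = series[-1] - series[-2]
    direction = "improved" if delta > 0 else "declined" if delta < 0 else "no change"
    return {"changed": delta != 0, "delta": delta, "direction": direction}

# Illustrative quarterly on-time running figures (%): the latest
# quarter fell from 81.5 to 80.0, so the indicator declined.
print(monitor([78.0, 81.5, 80.0]))
```

In practice the comparison baseline could equally be the same quarter last year, or a benchmark figure, as discussed above; the calculation is the same.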

5. Analyse and report

The utility of performance information is limited if it is not communicated effectively and integrated back into the planning cycle. It is important to link performance measurement into the wider management processes of your Directorate. The ACT's current reporting framework is outlined at Attachment D.

Figure 4: Characteristics of effective performance reporting

Figure 4 outlines the characteristics of effective performance reporting. Reporting must:
- clarify the relationship between planned performance and actual performance;
- provide an unbiased, balanced and complete picture of the overall performance of Government, including how the Directorate's services link to Government objectives;
- identify assumptions, gaps and variances; and
- present performance information clearly and concisely, in a user-friendly and timely manner, and across a broad range of mediums.
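The first and third characteristics, relating planned to actual performance and identifying gaps and variances, can be sketched as a small report generator. The indicator names and figures below are illustrative only:

```python
def variance_report(planned: dict, actual: dict) -> list:
    """One line per indicator: planned value, actual value, and the variance.

    Indicators with a target but no reported result are flagged as gaps,
    so the report stays complete rather than silently dropping them.
    """
    lines = []
    for name, target in planned.items():
        if name not in actual:
            lines.append(f"{name}: GAP - no actual result reported")
            continue
        var = actual[name] - target
        lines.append(f"{name}: planned {target}, actual {actual[name]}, variance {var:+.1f}")
    return lines

# Hypothetical targets and results for two transport indicators
planned = {"On-time running (%)": 85.0, "Average wait time (min)": 10.0}
actual = {"On-time running (%)": 82.5}
for line in variance_report(planned, actual):
    print(line)
```

A real report would also carry the explanations of assumptions and variances that Figure 4 calls for; the point here is only that variance itself is a simple planned-versus-actual calculation.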

6. Evaluate and modify

Once the data for the performance indicators is collected, analysed and communicated, consider evaluating both the performance indicators selected and the program or strategy being measured. Evaluating the program or strategy can inform leaders' decisions about planning, capability and resource allocation.

Evaluating and modifying performance measures

Performance indicators at all levels of the program logic model must be re-evaluated over time against the criteria in Attachment B, to ensure they are still relevant to measuring the program's progress towards achieving the overall outcome.

Evaluating and modifying programs or strategies

Performance measurement can assist in determining whether modification of programs, program implementation, strategies or Directorate objectives is necessary. Performance measurement showing positive changes to outcomes or impacts may indicate that the program or strategy is performing well and no modifications are needed. Conversely, performance measurement may show no positive changes to impacts or outcomes, indicating that the program, strategy or its implementation is not performing as expected.

Measuring impacts tracks the direct effect and performance of your program or strategy.[8] If performance measurement indicates there is no change to the impact, consider changing your strategy or program. Directorates must identify the causes of shortfalls, internal obstructions or externalities that are adversely influencing performance.

Consider whether:
- the program or strategy is still necessary, current and suitable given any changes to the internal or external environment;
- changes are required to improve the effectiveness or clarity of the program or strategy;
- there is poor targeting or there are insufficient outputs;
- there are implementation needs, such as additional communication or staff training;
- the implementation of the program or strategy complies with plans; or
- an action plan can address shortfalls, internal obstructions or externalities.

[8] If the Directorate does not have a high degree of control, or there are many external factors affecting the outcome, these limits on control and external influences should be acknowledged.

Resources

ACT Chief Minister and Cabinet Directorate 2012, Achieving Results for the Community: An ACT Government Strategic Service Planning Framework, ACT Government, Canberra.

ACT Chief Minister's Department 2011, Strengthening Performance and Accountability: A Framework for the ACT Government, ACT Government, Canberra.

ACT Treasury Directorate 2012, Guide to the Performance Measurement Framework, ACT Government, Canberra.

Centre for International Economics (CIE) 2011, Strategic Service Planning for the ACT Government: Components, Principles and Processes, prepared for ACT Chief Minister and Cabinet Directorate, Canberra.

Department of Premier and Cabinet 2012, A Guide to the Queensland Government Performance Management Framework, Queensland Government, http://www.premiers.qld.gov.au/publications/categories/guides/perf-manageframework.aspx

National Audit Office 2001, Choosing the Right FABRIC: A Framework for Performance Information, United Kingdom, http://www.nao.org.uk/publications.aspx

State Services Commission and The Treasury 2008, Performance Measurement: Advice and Examples of How to Develop Effective Frameworks, New Zealand Government, http://www.ssc.govt.nz/performance-measurement.

Attachment A: Strategic planning template

Figure 5: Template for future directions table

Future direction: where are we headed?
Goals: where we would like to be.
Strategies: how are we going to get there?
Reporting: are we on track? (annual reporting on progress; outcome indicators)

Attachment B: Characteristics of good performance indicators [9]

Relevant. Performance measures must be relevant and aligned to what the organisation is trying to achieve. It must also be confirmed that the measure is the most appropriate way of assessing the performance of the output, impact or outcome, and that it gives an indication of how the Directorate is performing.

Significant and informative. Performance measures must capture significant information on the performance of a program and its contribution to change.

Well-defined. The definition of the data should be easy to understand and unambiguous. This ensures that the data is collected consistently and the measure can be understood and used.

Available and cost-effective. Data availability often constrains measurement. When choosing indicators, it is important to ensure that there are cost-effective methods of collecting the relevant data.

Attributable. Performance measures must measure something that the Directorate is able to influence. The measure should also provide an indication of how much of any change can be attributed to the Directorate.

Reliable. Performance measures must be relatively accurate and responsive to change. For example, a measure based on a very small sample may show large fluctuations, while a measure that is not responsive to change will not demonstrate the impact of programs.

Timely. Data used to measure performance must be produced frequently enough to track progress and be up to date. If the data is not released frequently, or the program is long-term and will take time to create an impact, allow sufficient time for the performance to be measured.

Avoids perverse incentives. The measure should not encourage perverse behaviour, for example, measuring the speed of answering letters instead of the quality of responses. This can be avoided by measuring outputs, impacts and outcomes, and by ensuring that measures are relevant and well-defined.

Comparable. The measure should be capable of being compared with either past periods or similar programs elsewhere. This tracks the progress and performance of programs.

Verifiable. Performance measures should have clear documentation behind them so that they can be validated.

[9] Adapted from National Audit Office 2001, Choosing the Right FABRIC: A Framework for Performance Information, United Kingdom.
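These characteristics can double as a review checklist when indicators are re-evaluated (see section 6). A minimal sketch, assuming each criterion is judged manually by the reviewer as met or not:

```python
# The ten Attachment B criteria, as checklist labels. This is an
# illustrative tool sketch, not an official ACT assessment instrument.
CRITERIA = [
    "relevant", "significant and informative", "well-defined",
    "available and cost-effective", "attributable", "reliable",
    "timely", "avoids perverse incentives", "comparable", "verifiable",
]

def unmet_criteria(assessment: dict) -> list:
    """Return the Attachment B criteria not yet satisfied.

    `assessment` maps each criterion to True/False as judged by the
    reviewer; criteria left out of the assessment are treated as unmet,
    so an incomplete review cannot pass silently.
    """
    return [c for c in CRITERIA if not assessment.get(c, False)]

review = {c: True for c in CRITERIA}
review["comparable"] = False   # e.g. no benchmark or prior period exists yet
print(unmet_criteria(review))  # ['comparable']
```

An indicator with an empty `unmet_criteria` result satisfies the full checklist; anything returned is a prompt to redesign the measure or its data collection.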

Attachment C: Template for selecting performance indicators

Figure 6: Template for selecting performance indicators

For each level of the program logic (outcomes, impacts, outputs, inputs), list the program logic statement alongside its corresponding performance indicators.

Attachment D: The ACT's current performance measurement reporting framework

Figure 7: Mapping of government performance planning and reporting

- Canberra Plan: planned performance is expressed as progress goals; actual performance (changes in wellbeing) is reported in Measuring Our Progress.
- Statement of intent: planned performance is expressed as priority outcomes and actions; results are reported in the Statement of Achievement.
- Corporate plans: planned performance is expressed as strategic indicators; results are reported in the Annual Report.
- Budget papers: planned performance is expressed as accountability indicators; the efficiency and effectiveness of services is reported in the Annual Report.

Under the current performance measurement system, agency performance is reported in agency annual reports and corresponds with corporate plans. Accountability indicators are identified in the Budget papers and reported in annual reports. Progress towards the achievement of the ACT Government's priorities is identified and reported in the annual Statement of Achievement.