
School of Public Administration
Performance Management Systems in Canada: Striving for Excellence
By: Andrew Bucci
February 24th, 2014

Canada's Public Service continually strives to find better ways to deliver services and programs to Canadians everywhere. In recognition of this vision, I believe the future direction of the Canadian Public Service is one which needs to focus on increasing accountability and on achieving higher-quality governance through greater efficiency and effectiveness in delivering results. It is argued here that one means of realizing these goals is to implement new, or strengthen existing, performance management systems in the public sector. This assertion will be established by first defining what performance management systems are, then examining how they can theoretically lead to these desired outcomes. From there, the main barriers to successful implementation will be identified, and finally some considerations will be offered on how these problems might be addressed moving forward.

To begin, however, it is necessary to establish the scope of our discussion. Performance management systems (PMS) can be broadly or narrowly designed depending on what function they are intended to serve. A system might be implemented to monitor the performance of individual employees, single or multiple departments, a specific or interrelated set of policies, a particular program, and more.[1] As a result of this complexity, the scope of this discussion will be limited to performance management systems in the context of evaluating a specific program within a single government department. It is important, however, to recognize that many of the potential benefits and challenges surveyed here are not unique to the context of program evaluation. As such, the observations and recommendations put forth in this analysis may provide insights related to the use of these systems more broadly, or in other contexts. Now that some brief considerations of the scope of this discussion have been provided, we can examine what exactly comprises a PMS.

What is a performance management system?

There is typically a progression through five stages when implementing an effective PMS. It should be noted, however, that there is a certain degree of mobility between these steps; they are not necessarily confined to one sequential order, and certain aspects may become more pronounced at different phases of the overall process. Further, it is important to remember that the system described in this section is how it is understood to function in theory; a discussion of practical application will follow later in the analysis.

The first stage is to set clear objectives, goals, benchmarks, or desired outcomes for what you want to accomplish.[2] A hypothetical example of a clear program goal would be: to increase the number of Aboriginal students graduating from high school by 15% within an annual budget of $100 million over five years. Having a clear objective in mind for what exactly you are trying to achieve should ideally allow that clarity to carry through to the actual program design stage; simply put, form follows function. Progressing to the second phase, analysts can then work to identify suitable performance measures, which are deemed to be necessary conditions for successfully achieving the program goal.[3] In the context of the hypothetical program goal above, an analyst might identify the following as important performance measures: number of schools in a given area, number of standardized tests taken and the accompanying average scores, class size, and number of after-school programs available.

[1] Simmons, Julie M., "Desperate Measures: Why Performance Management Doesn't Measure Up," in Approaching Public Administration: Core Debates and Emerging Issues, eds. Roberto P. Leone and Frank L.K. Ohemeng (Toronto: Emond Montgomery Publications, 2011), 153.
[2] Peter Aucoin, The Aucoin Reader (Dalhousie University, 2006), 123.
[3] Ibid.

The third component of an effective system is the need for a stable flow of information between all relevant parties.[4] The goals and measures must be communicated by managers to their respective employees, and citizens must have access to information on how public resources are being utilized. These first three steps are typically established early in the program design process, and should be monitored for the duration of a given project. The fourth and fifth components, however, require at least some time to have elapsed before they can come into play. The fourth step is where performance management visibly takes place: typically, a manager will take stock of the actual measured performance and compare it to the desired objectives.[5] From here the final phase of the system is implemented, using all the information from the previous stages and taking action based upon the results.[6] This may result in a reallocation of resources to a perceived weak area, a re-evaluation of existing goals or measures, or it may simply result in continuing with the status quo. This leads to the next portion of the discussion: can a performance management system lead to better accountability and governance?

Theoretical Implications of an Effective System:

To better understand how performance management systems can affect accountability and governance, it will be beneficial to first explore the basic aspects of these two concepts. Accountability can be understood as the ability to hold individuals responsible for the quality, efficiency, effectiveness, and resulting outcomes of any given aspect of the work they are producing for someone else.[7] Subsequently, this idea of better governance can be tied to the notion of efficiently and effectively utilizing public resources to maximize results in service delivery to the public.[8]

Given these two broad concepts, a connection can be drawn between the previously outlined PMS and its relationship to better accountability and governance. In the context of accountability, having clear goals and performance measures that are communicated and accessible to the public allows for the transparency needed to hold government accountable for the implementation and outcome of a given program.[9] If the public sees their tax dollars going to a failing or poorly designed program, they will hold government accountable for that performance. This increased transparency should, in theory, progressively lead to better-quality governance over time, as both internal and external pressures increase the demand for stronger program performance. This idea of quality governance is predicated on how well and how efficiently the government addresses the needs and interests of the general citizenry.[10] In the context of this discussion, the central focus is how a specific program might meet these needs. This desire for better governance is thus exemplified in the actual performance management component of the five-stage system: at this point, a program is evaluated to see what is not working well, and action is then taken to try to improve the process and produce better results. If a performance management system is sustained for the duration of the program's lifespan, it should theoretically result in progressively more economical and effective outcomes each time changes are made.[11] Now that some brief considerations have been made on how performance management systems should theoretically lead to better accountability and governance, the examination can turn to the issues that arise in practical implementation.

[4] Ehsan, "The Evolution of Performance Management Systems," 139.
[5] Ibid.
[6] Ibid.
[7] Ibid., 148.
[8] Simmons, "Desperate Measures: Why Performance Management Doesn't Measure Up," 153.
[9] Ehsan, "The Evolution of Performance Management Systems," 140.
[10] Simmons, "Desperate Measures: Why Performance Management Doesn't Measure Up," 152-153.
[11] Ehsan, "The Evolution of Performance Management Systems," 148.
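Taken together, stages four and five amount to comparing measured results against the stated target and acting on the gap. Using the hypothetical graduation goal from earlier, that comparison can be sketched in a few lines of Python; all figures and names here are illustrative assumptions, not drawn from any real program:

```python
# Illustrative sketch of stages four and five of a PMS: compare measured
# results against the stated program goal and flag where action may be
# needed. All figures and measure names are hypothetical.

def assess_progress(baseline, current, target_increase_pct):
    """Return the observed percentage change and whether it meets the target."""
    observed_pct = (current - baseline) / baseline * 100
    return observed_pct, observed_pct >= target_increase_pct

# Hypothetical goal: raise graduations by 15% over five years.
baseline_graduates = 10_000
current_graduates = 10_800   # measured at an interim review

observed, on_track = assess_progress(baseline_graduates, current_graduates, 15)
print(f"Observed change: {observed:.1f}% (target 15%); on track: {on_track}")
# Stage five is the management response: if not on track, managers might
# reallocate resources, revisit the performance measures, or revise the goal.
```

The sketch deliberately leaves the stage-five response as a judgment call rather than code, since the essay's point is that the system informs, but does not replace, management decisions.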

Barriers to Effective Implementation:

Issues Regarding Clarity of Goals

As is often the case, there can be a significant disconnect between the theory and the application of a performance management system. The difficulties here typically arise during the first three stages previously described. The first problem is that the initially defined goals of a program tend to lack clarity or conciseness.[12] This issue can arise from a few challenges. One difficulty is that complex problems do not generally have simple or clear solutions, and consequently, program goals may be excessively broad.[13] The matter can be further complicated by the concept of ministerial responsibility, which holds that a minister is accountable to Parliament for the successes or failures of what is being done in their department.[14] This may motivate a minister to intentionally use vague legislative language to set extremely broad or ambiguous goals, as a means of making it more difficult to establish whether a program is in fact a failure.[15] However, even if a clear target has been established, an even more difficult issue arises when trying to identify sufficient performance measures.

Issues Regarding Performance Measures:

It can be exceedingly complicated to identify adequate performance measures which can be causally linked to a desired outcome.[16] Similarly, it is difficult to compartmentalize the individual effect of any given measure; for example, if you miss a target goal by 10%, it can be hard to identify which measured category that failure can be attributed to. Furthermore, not everything essential to a program's success or failure can be easily quantified; for example, social cohesion or school pride. Even if a means of qualitative analysis is used to try to measure these factors, their impact on the overall program's outcomes is still difficult to pin down. These issues are exacerbated by the lack of professionally trained evaluators employed in government departments.[17] This lack of dedicated evaluators is further echoed by an audit of the activities of the ten full-time evaluators employed within Environment Canada, which found that as much as 40% of their time was being devoted to tasks outside their intended job description.[18] One final concern arises from attempting to create an exhaustive list of the perceived measures of performance: the temptation enters for managers and employees to focus only on what they know they are being evaluated on, which may result in other important aspects being overlooked.[19] As a result, these factors can pose significant barriers to what these systems are trying to achieve. In 2009, for example, a report from the Office of the Auditor General of Canada found that in a random sample taken from multiple departments, 17 of 23 program evaluations stated that the performance information was either too incomplete or simply too unreliable to use for the purposes of making management decisions.[20]

Issues Regarding Credible Reporting:

The last barrier to effective implementation to be surveyed in this analysis relates to the communication aspect of the process. This occurs in the context of what the literature calls a principal-agent relationship.[21] In this scenario, the credibility of the performance reporting is at risk of being compromised, because an employee (agent) might not want to report bad news to their supervisor (principal); this may be done intentionally or subconsciously, as an agent tries to minimize or mask their own perceived mistakes. The principal-agent dilemma occurs as a result of the asymmetrical transfer of knowledge between an employee and their manager. By virtue of their position, a manager will likely not have the same level of technical knowledge that the specialized employee has.[22] As a result, the principal is at a disadvantage in assessing the accuracy or quality of the information they are receiving from an agent. This issue is particularly relevant to systems established internally to evaluate departmental programs; however, it still exists in the instance of a central agency (principal) evaluating the activities of a specific department (agent). Furthermore, the issues highlighted in the first three stages of a performance management system undermine the overall reliability and usefulness of the information being passed up to management for the purposes of decision making. This in turn impairs the system's ability to lead to better-quality governance, because major decisions are now potentially being based on flawed or skewed information. Accountability still remains, as these reports are still available to the public, but what individuals are being held to account for may not be an accurate measure of their actual performance.

Recommendations Moving Forward

In response to the issues surveyed above, three recommendations are put forth here to help mitigate these barriers to effective implementation. The first is aimed at addressing the lack of clarity in the main program goals. As was discussed, ambiguity in what it is you are trying to achieve can make it difficult to focus a project team behind a clear target. Therefore, it is recommended here that the central agencies work to establish some type of overarching benchmark for what constitutes a clear program goal. This could be done by instructing that program goals should provide certain key considerations, such as target timeframes, desired percentage changes, and dollar amounts. Alternatively, it might be sufficient simply to identify ideal examples of clear objectives stated in past successful programs in varying contexts, as a framework to strive for. The main issue with this recommendation, however, is that it can only realistically be pitched as a suggestion to a minister; they retain the ability to accept or refuse the advice given, until such a time as a standard is legislated from above. Despite this limitation, it would still be beneficial to try to establish such a benchmark, at the very least to be available for general reference.

The second recommendation proposed here is to target the hiring of specialized evaluators, as a means of developing and strengthening the in-house evaluation infrastructure; this would also have the added bonus of reducing the dependency on hiring outside contractors. These specialists could help advise managers and employees on how best to design and implement a performance management system, and could assist in the ongoing collection of information. A departmental staff dedicated to the PMS can thus help to ensure that the information gathered is complete enough for the purposes of decision making. In addition, over time these evaluators might be in a better position to develop progressively more accurate performance measures, as they gain a deeper technical and operational understanding of how the department itself functions.

[12] Simmons, "Desperate Measures: Why Performance Management Doesn't Measure Up," 153.
[13] Ibid.
[14] B. Timothy Heinmiller, "Ministerial Responsibility: The Cornerstone of Administrative Accountability in Canadian Government," in Approaching Public Administration: Core Debates and Emerging Issues, eds. Roberto P. Leone and Frank L.K. Ohemeng (Toronto: Emond Montgomery Publications, 2011), 124.
[15] Kathryn Newcomer and Sharon Caudle, "Public Performance Management Systems: Embedding Practices for Improved Success," Public Performance & Management Review, vol. 35, no. 1 (New York: M.E. Sharpe, Inc., 2011), 111.
[16] Ibid., 124.
[17] Office of the Auditor General of Canada, Evaluating the Effectiveness of Programs, 22.
[18] Office of the Auditor General of Canada, Report of the Auditor General of Canada to the House of Commons: Chapter 1, Status Report on Evaluating the Effectiveness of Programs (Ottawa: Minister of Public Works and Government Services Canada, Spring 2013), 24.
[19] Simmons, "Desperate Measures: Why Performance Management Doesn't Measure Up," 154.
[20] Office of the Auditor General of Canada, Evaluating the Effectiveness of Programs, 12.
[21] Aucoin, The Aucoin Reader, 39.
[22] Ibid.
The third and final recommendation is to establish and promote a dedicated means of anonymous reporting to supplement existing communication channels. This is specifically aimed at counteracting the principal-agent dilemma discussed previously. By establishing an anonymous channel of communication, employees may be more willing to identify problems or concerns surrounding a program which might otherwise be downplayed or masked in the formal reporting channel. The information from both channels may jointly provide insights into potential areas of weakness, or identify past decisions which may warrant a second look. Such an anonymous channel could be maintained and monitored by one or more of the previously recommended specialized evaluators. The use of dual reporting channels could, over time, help to mitigate the flaws or biases which can be present if only the traditional reporting channel were used.

Concluding Considerations:

Having taken into consideration the potential complications and barriers, it is argued here that performance management systems still hold the potential to deliver their purported benefits. As such, this is an area the Canadian Public Service can look to develop moving forward. This is a trial-and-error process, and by continually experimenting with these systems there may be enormous benefits to be gained in the long run. As was mentioned in the initial scope of analysis, this discussion focused on the implications of performance management systems within the context of program evaluation; however, many of the observed challenges hold true in other government contexts as well. By recognizing these limitations and learning from them, I believe that these systems can still yield promising results for the future development of new and existing programs, policies, and more.

Works Cited:

Aucoin, Peter. The Aucoin Reader. Dalhousie University, 2006. Pages 39, 74-130.

Ehsan, Mohammed. "The Evolution of Performance Management Systems," in Approaching Public Administration: Core Debates and Emerging Issues, edited by Roberto P. Leone and Frank L.K. Ohemeng, 137-151. Toronto: Emond Montgomery Publications, 2011.

Heinmiller, B. Timothy. "Ministerial Responsibility: The Cornerstone of Administrative Accountability in Canadian Government," in Approaching Public Administration: Core Debates and Emerging Issues, edited by Roberto P. Leone and Frank L.K. Ohemeng, 123-133. Toronto: Emond Montgomery Publications, 2011.

Leone, Roberto P. "Charting the Course of Canadian Public Administration: The Influence of Contemporary Theories and Approaches," in Approaching Public Administration: Core Debates and Emerging Issues, edited by Roberto P. Leone and Frank L.K. Ohemeng, 13-25. Toronto: Emond Montgomery Publications, 2011.

Newcomer, Kathryn, and Sharon Caudle. "Public Performance Management Systems: Embedding Practices for Improved Success." Public Performance & Management Review, vol. 35, no. 1. New York: M.E. Sharpe, Inc., 2011.

Office of the Auditor General of Canada. Report of the Auditor General of Canada to the House of Commons: Chapter 1, Status Report on Evaluating the Effectiveness of Programs. Ottawa: Minister of Public Works and Government Services Canada, Spring 2013.

Office of the Auditor General of Canada. Report of the Auditor General of Canada to the House of Commons: Evaluating the Effectiveness of Programs. Ottawa: Minister of Public Works and Government Services Canada, 2009.

Simmons, Julie M. "Desperate Measures: Why Performance Management Doesn't Measure Up," in Approaching Public Administration: Core Debates and Emerging Issues, edited by Roberto P. Leone and Frank L.K. Ohemeng, 152-160. Toronto: Emond Montgomery Publications, 2011.