LEICESTERSHIRE COUNTY COUNCIL
PERFORMANCE DATA QUALITY STRATEGY 2010-11

Status: Final
Approved by: Corporate Performance & Improvement Board, 23 March 2010
Date: 9 April 2010
Prepared by: Richard Wilding / Nicola Truslove

Contents                                                              Page

EXECUTIVE SUMMARY                                                        3

SECTION 1 - POLICY AND PRINCIPLES
1. Introduction                                                          4
2. Why Data Quality Matters                                              4
3. Data Quality Policy 2010-11                                           5
4. Data Quality Objectives                                               5
5. Data Quality Principles                                               6

SECTION 2 - POLICY INTO PRACTICE
6. Governance Arrangements for Data Quality - Strategic Responsibility  11
7. Specific Roles and Responsibilities                                  12
8. Arrangements for Monitoring and Review                               13
9. Systems and Procedures                                               14
   - Master List of Corporate Performance Indicators
   - Performance Indicator Data Quality Template
   - Year End Collection Data Quality Pro forma
   - Collection, Validation and Presentation Process
   - The TEN Performance System
10. Risks and Risk Management                                           15
   - Managing Strategic Risks
   - Managing Operational Risks
11. Partnership Arrangements                                            17
12. Data Supplied by Third Parties                                      17

APPENDICES
Appendix A - Data Collection and Reporting Process                      18
Appendix B - Specific Roles and Responsibilities to Achieve Data Quality 19
Appendix C - Sources of Further Information                             21
Appendix D - Contacts for Further Information                           21
Appendix E - Performance Indicator Data Quality Template                22
EXECUTIVE SUMMARY

The purpose of this strategy is to outline the Council's approach to ensuring data quality for Performance Indicators. High quality performance information is essential to support effective performance management. It is also vital to ensure that customers and stakeholders have confidence in our published performance information.

Data quality is the responsibility of every member of staff entering, extracting or analysing data from any of the Council's information systems. The following specific roles and responsibilities are identified in this document (Appendix B):

- The Corporate Performance & Improvement Team is responsible for developing corporate processes and procedures for data quality and agreeing performance reporting timetables.
- The Responsible Manager/Director is responsible for reviewing and signing off the overall data and commentary for the department.
- Departmental Performance Leads are responsible for coordinating departmental performance data quality activity.
- The Responsible Officers are responsible for ensuring that data is collected at agreed times and is accurate and consistent.
- The Performance Indicator Validator provides independent assurance that data is accurate and fit for purpose, that the calculation has been performed correctly and that the definition has been appropriately applied. This role cannot be undertaken by the same person who collects data or calculates the indicator.
- The PI Calculator is responsible for interpreting definitions, defining data, calculating the result and ensuring data is accurate.
- The Data Collectors are responsible for collecting the raw data which underpins the indicator.

A corporate Performance Indicator Data Quality Template (Appendix E) should be completed for each corporate performance indicator to ensure clear accountability for data collection, calculation and validation. This will show how the indicators were calculated and reference the working papers and data upon which calculations are based.
The TEN Performance System is the primary repository and source for corporate performance information. This system records responsibilities in relation to each performance indicator.

There are a number of conditions that might lead to an indicator being considered at a higher risk of inaccuracy than others. Every indicator should therefore be assessed against these factors to identify those at greater risk of inaccuracy which require action.
SECTION 1 - POLICY AND PRINCIPLES

1. INTRODUCTION

1.1 The purpose of this strategy is to outline the Council's approach to ensuring data quality for Performance Indicators (PIs) across the Council and, where agreed, its partnerships.

1.2 The document is split into two sections: the first sets out the strategic policy and principles that Leicestershire County Council has adopted for Data Quality in relation to performance information; the second, Policy into Practice, sets out how those aspirations are to be achieved.

1.3 The Council has developed an Information Management Strategy (a parent strategy) and a Data Strategy (intranet link). A Data Quality Strategy has also been agreed and adopted by Leicestershire Together. This document sets out how the principles within those strategies will be applied in practice to the production of corporate performance information.

1.4 The current National Performance Framework comprises the National Indicator Set and a Local Area Agreement (LAA).

1.5 Full Data Quality details can be found on the County Council's Corporate Information System (CIS - Intranet): http://intranet.leics.gov.uk/dataquality

2. WHY DATA QUALITY MATTERS

2.1 Poor data can:
- undermine accountability and damage trust;
- weaken frontline service delivery;
- lead to financial loss and poor value for money;
- leave the vulnerable at risk;
- undermine partnership working; and
- confuse rather than clarify the relationship between local public bodies and central government.

2.2 Consistent, accurate, timely and comprehensive performance information is vital to support effective decision-making and management of resources, to ensure that the improvements, outcomes and impacts we and our partners aim to achieve within the locality are delivered.

2.3 Performance information is being used increasingly by external bodies to assess Council performance, often as an alternative to inspection. This places a greater emphasis on data quality.
Indicators which are proven under inspection to be incorrectly expressed or calculated can be discounted completely, which can seriously affect the score arising from the assessment.

2.4 The LAA reward grant relies upon the Council's Internal Audit team certifying that the relevant National Indicators have been collected and calculated correctly. This certification requires robust data quality procedures and documentation.
2.5 Good quality performance information is vital to ensure that customers and stakeholders can have confidence that our published performance provides a reliable basis from which to judge us.

2.6 The National Performance Framework gives greater flexibility in the use of performance data locally and fewer nationally prescribed indicators. With that comes a greater responsibility on Local Authorities and their partners to ensure that the information used to manage and report performance can be relied upon and is based on a robust data quality framework. External regulators expect such a framework to be in place.

3. LEICESTERSHIRE COUNTY COUNCIL DATA QUALITY POLICY 2010-11

3.1 Leicestershire County Council is committed to the production of high quality and timely data across the whole organisation to support and inform the effective management of the Council's performance and partnership working.

3.2 We will establish and maintain a clear management framework for delivering this commitment, through:
- championship of data quality at a senior level, and effective oversight by Council Members;
- clear accountability amongst staff equipped with the relevant skills and knowledge to perform their roles;
- clearly documented procedures which are regularly reviewed and subject to challenge;
- mechanisms which ensure that the organisation validates and has confidence in its data before it is published;
- effective interpretation and reporting of performance data, and acting on the results;
- effective engagement with our partners to improve data quality; and
- managing risks and ensuring continuity of provision of quality data.

4. OBJECTIVES

4.1 Progress against Data Quality Objectives for 2009-10

A. To fully implement revised procedures to support the publication of data for the National Indicator Set and LAA during 2008-2010. (Status: Achieved)

B.
To introduce the TEN Performance System for reporting at the end of 2008-09 and enable the automatic transfer of data from the performance systems of at least three district councils, with a phased extension to other district councils during 2009-2010. (Status: Achieved. Automatic transfer of data from the performance systems of all district councils enabled.)

C. To review and refine systems and procedures in the light of feedback. (Status: Achieved)

D. To implement joint data quality procedures with partners under the adopted partnership Data Quality Strategy. (Status: Achieved for LAA2 indicators)
E. To explore the potential for automated transfer of data from the performance systems of the partners in the light of the learning from district pilots. (Status: Currently at planning stage)

4.2 Data Quality Objectives 2010-11

A. To implement joint data quality procedures with partners under the adopted partnership Data Quality Strategy.

B. To explore the potential for automated transfer of data from the performance systems of partners other than District Councils.

C. To collect a Performance Indicator Data Quality Template for each national indicator.
   Indicator 1: percentage of national indicator data quality templates received. Target: 100%
   Indicator 2: percentage of LAA reward indicator data quality templates received. Target: 100%

D. To ensure that all corporate performance data is produced on time, without the need for revision after publication.
   Indicator 3: number of national indicators requiring amendment following submission of outturn data. Target: 0

E. To ensure that no national indicators are reserved as a result of external audit.
   Indicator 4: number of national indicators reserved as a result of external audit. Target: 0

5. DATA QUALITY PRINCIPLES

There are a number of principles that underpin good data quality, to which the Council is committed in the implementation of the data quality policy:

- Awareness - relevant staff recognise the need for good data quality and how they can contribute;
- Definitions - relevant staff know which indicators are produced from the information they input and how they are defined;
- Systems - systems are fit for purpose and staff have the expertise to get the best out of them;
- Validation - there are validation procedures in place as close to the point of input as possible;
- Data supplied by third parties - aiming to ensure information produced amongst our partners is as reliable as our own, and that risks where reliability is in doubt are effectively managed;
- Output - performance indicator data is extracted regularly and efficiently and communicated quickly;
- Presentation - performance indicators are presented so as to give an easily understood, accurate and transparent picture of the Council's performance to the public, external inspectorates, partners, Council Members and staff;
- Risk, continuity and knowledge management - strategic and operational risks in relation to performance data are managed. Arrangements for business continuity should be implemented to ensure that specialist knowledge in relation to performance indicators and data quality is not lost;
- Monitoring and Review - it is important to continuously review the arrangements for securing data quality.

The paragraphs that follow expand upon the principles set out above.

5.1 Awareness

5.1.1 Data quality is the responsibility of every member of staff entering, extracting or analysing data from any of the Council's information systems. Every relevant officer should be aware of his or her responsibilities with regard to data quality.

5.1.2 The commitment to data quality will be communicated clearly and appropriate training given so that responsibilities are clearly understood.

5.1.3 Officers involved in target setting for service and departmental plans, and in managing delivery, rely on accurate data to understand, plan and monitor performance. It is essential that those responsible for producing this data are aware of the uses to which it will be applied.

5.1.4 Responsibilities for data quality should be reviewed within a relevant individual's appraisal and suitable targets set where necessary.

5.2 Definitions

5.2.1 The methods of data collection and the calculation of performance indicators must be clearly defined and applied correctly every time the indicator is calculated, to ensure our results can be accurately compared with other councils and that changes from year to year truly reflect our performance and not statistical error.

5.2.2 Indicators which are used for external assessments, such as the National Indicator Set, have prescribed definitions. It is particularly important that these are correctly applied because these indicators are used to assess our performance as an organisation.

5.3 Systems for production of performance information

5.3.1 Figures produced by software and other systems are only as good as the data input into them. There must be adequate controls over the input of data on which performance information is based.
It is important that officers have clear guidelines and procedures for using systems and are adequately trained, to ensure that information is entered consistently and correctly.

5.3.2 Responsibility for maintaining robust, clear procedures and controls for the information systems which provide performance indicator data lies with the departments or individual organisations responsible for producing the data.

5.3.3 It is important to have a central record of the roles, responsibilities and processes for each indicator, in order to provide a clear reference source for how the principles set out in this strategy will be applied in practice to each indicator and the systems which support it.
5.4 Validation

5.4.1 Data requirements should wherever possible be designed along the principle of getting it right first time, in order to avoid waste in the form of time and money spent on cleansing data, interfacing between different information systems, and matching and consolidating data from multiple databases.

5.4.2 Nevertheless, in complex systems, even where there are strong controls over input, errors can occur. Where necessary, a validation procedure should exist close to the point of data input. The frequency of validation checks will need to be aligned with the frequency of data reporting, and commensurate with the importance of the indicator and the degree of risk (see the later section on risk management).

5.5 Data supplied by third parties

5.5.1 Particular attention needs to be paid to data provided by external sources. A number of PIs are calculated using information provided by other bodies. The Council's intention must be to work alongside, and support, its partners to ensure they are able to supply data which meets the data quality principles outlined in this document.

5.5.2 Arrangements for data quality with partners should be agreed formally. The Council works within an established system, with an Information Sharing Protocol at its centre, supported by Information Exchange Agreements, individually agreed and documented with its partners. Further advice and assistance should be sought from the Corporate Performance or Information Management teams as necessary.

5.5.3 When entering into contracts with service providers it is essential that, wherever relevant, there is a requirement to provide timely and accurate performance information which is fit for purpose, and that an independent data quality assessment may be carried out. The Council must be clear with the contractor about their responsibilities for data quality and how we will be checking the information they provide.
5.5.4 Responsibility for initial validation of data from partners will lie within Departments, but the Corporate Performance & Improvement Team can offer advice and guidance about validation procedures. The Corporate Performance & Improvement Team will undertake selected spot and reality checks and query any data supplied.

5.6 Output

5.6.1 Performance data needs to be produced and communicated to a timetable that allows for timely approval of both data and management commentary for publication.

5.6.4 It is important that performance information is subject to scrutiny and challenge before being passed up the line for management action. In practice, this should consist of a validation check on output reports, followed by a formal departmental review of performance data to agree, approve and sign off accuracy, explanatory commentary and, where relevant, future targets.
5.6.5 This formal review and sign off would usually be through a DMT prior to performance being reported to CMT, Council Members or Partnership Bodies. In certain instances, such as data provided outside the normal reporting timetable or which needs to be changed after submission, a named senior officer may formally approve and sign off performance information.

5.7 Presentation

5.7.1 Performance management depends on information which allows significant issues to be seen and discussed readily. The format in which performance information is presented for review has a significant impact on this.

5.7.2 The TEN Performance System has been set up to allow partners to share performance data and develop a common understanding of current and future performance in relation to the Sustainable Community Strategy and the Local Area Agreement.

5.7.3 For a large number of National PIs, performance will only be recognised publicly if it can be substantiated by external bodies through audit. This requires an audit trail of working papers to be presented, showing clearly how the indicators were calculated and how the data on which these calculations are based were collected.

5.8 Risk, Continuity and Knowledge Management

5.8.1 There are specific risks to the production of accurate and reliable data, without which performance monitoring is ineffective. This section considers the management of risks to the quality of the data itself.

5.8.2 Strategic risks in relation to the reliable provision of performance data, and significant changes to performance data requirements, should be considered as part of the Council's corporate risk management arrangements. Strategic risks are those which would involve a significant number of performance measures, e.g. new PIs, and also significant reorganisation involving substantial change in responsibilities for PIs.

5.8.3 In addition, specific operational risks in relation to individual indicators need to be identified, considered and addressed at departmental level.
5.8.4 It is also essential that data and records in relation to performance information are considered as part of the Council's approach to business continuity, particularly the protection and backup of electronic data. This is addressed through the Council's arrangements for business continuity and disaster recovery planning.

5.9 Monitoring and Review

5.9.1 Regular monitoring and reporting in relation to data quality is needed, so that those charged with management and governance can be assured of the integrity of data. Regular formal reporting on the accuracy of data supporting key performance measures should be included in performance reports alongside the performance information itself.
5.9.2 The appropriateness of the processes in place to ensure data quality also needs to be reviewed, to ensure they remain fit for purpose and reflect best practice within a continuously changing environment.

5.9.3 Regular independent review and challenge of arrangements for data quality is important in order to give the organisation and its stakeholders assurance about the data and the published results based on it. This takes place through regular internal and external audit of data quality policy and procedure.
SECTION 2 - POLICY INTO PRACTICE

6. GOVERNANCE ARRANGEMENTS FOR DATA QUALITY - STRATEGIC RESPONSIBILITY

6.1 Members

6.1.1 Members of the Council use performance indicators to monitor and scrutinise performance and reach decisions about priorities and the allocation of resources. It is important that Members have confidence in the information they are given and have an adequate understanding of the importance of data quality. It is not necessary for them to have a detailed knowledge of data quality principles to perform that role.

6.1.2 The Lead Member for Performance Management (which incorporates a lead role on data quality) should have a greater level of understanding of data quality, commensurate with that role.

6.2 Officers

6.2.1 Overall strategic responsibility for performance data quality within the Corporate Management Team lies with the Director of Resources, Brian Roberts.

6.2.2 The Corporate Performance Management Framework and its operating arrangements, including data quality, are overseen and developed by the Corporate Performance and Improvement Board, chaired by the Assistant Chief Executive, which in turn reports and makes recommendations to the Corporate Management Team (CMT).

6.2.3 Data quality arrangements in relation to corporate performance management are developed and coordinated by the Corporate Performance & Improvement Team, in consultation with Departmental Performance Lead Officers acting as a technical group. The key contacts within the Corporate Performance & Improvement Team are set out in Appendix D.

6.2.4 Overall responsibility for data quality within departments and partner organisations rests ultimately with their directors or equivalent, with operational responsibility devolved to named officers.
6.2.5 Each department and partner organisation has a Performance Lead who co-ordinates the collection of performance information and data quality issues, and liaises with the Corporate Performance & Improvement Team on performance and data quality matters. Current Performance Leads are set out in the table below.

Department: Chief Executive's
Performance Lead: James A Fox / Nicola Truslove
james.fox@leics.gov.uk / nicola.truslove@leics.gov.uk
0116 305 8302
Department: Children & Young People's Service
Performance Lead: Ahmed Esat, ahmed.esat@leics.gov.uk, 0116 305 8188

Department: Environment and Transport
Performance Lead: Alex Scott, alex.scott@leics.gov.uk, 0116 305 5704

Department: Adults and Communities
Performance Lead: Matt Williams, matt.williams@leics.gov.uk, 0116 305 7427

Department: Corporate Resources
Performance Lead: Julian Haywood, julian.haywood@leics.gov.uk, 0116 305 6747

7. DATA QUALITY ROLES AND RESPONSIBILITIES

7.1 The different roles and responsibilities for ensuring data quality as they relate to performance information are listed below:

- Corporate Performance & Improvement Team
- Responsible Manager/Director
- Departmental Performance Lead (LAA equivalent - Block Leads)
- Responsible Officers (LAA = Outcome Lead/Indicator Owner)
- PI Calculator
- Validator
- Data Collector

The Corporate Data Strategy has defined strategic roles for data management. The specific roles in relation to performance data sit beneath these, as shown below:

Corporate Data Strategy Role: Data Owner
Performance Data Quality Role: Responsible Director/Manager/Officer
Responsibilities:
- Strategic role in defining and approving standards for the data they own
- Set targets for data quality and responsible for achieving them
- Authorising access
- Responsible for quality and standards of service data for their area

Corporate Data Strategy Role: Data Custodian
Performance Data Quality Role: Corporate Performance & Improvement Team
Responsibilities:
- Liaise with data owners on setting standards and targets for data quality
- Responsible for operating data quality monitoring procedures for core and service data

Corporate Data Strategy Role: Data Steward
Performance Data Quality Role: Departmental Performance Leads
Responsibilities:
- Lead and maintain definition of data standards for core data sets
- Delivering corporate policy on data standards
Corporate Data Strategy Role: Data User
Performance Data Quality Roles and Responsibilities:
- PI Calculator - calculating indicator figures
- Validator - responsible for quality on input to the performance system
- Data Collector - collecting data according to relevant guidelines

7.2 The detailed responsibilities for each role in relation to performance management are set out in Appendix B.

7.3 Each role plays a crucial part in ensuring the Authority can be confident about its performance indicator reporting. Depending on the management arrangements and the nature of the measure, some roles may be undertaken by the same person. However, the roles of validation and calculation must not be undertaken by the same individual.

7.4 It is the responsibility of departments to decide who will perform each of these roles for each indicator. Those decisions will be recorded within the Performance Indicator Data Quality Template (see the next section, Systems and Procedures), which needs to be completed for each National Indicator.

7.5 Responsibility for data quality will be centrally recorded on the TEN Performance System. The duties of the named individuals, and training needs in relation to these roles, will be reviewed through the Council's performance development review process.

8. ARRANGEMENTS FOR MONITORING AND REVIEW OF DATA QUALITY STRATEGY AND PROCEDURES

8.1 Operational Review

8.1.1 Commentary on the key steps taken to assure data quality for the relevant reporting period will be included within corporate performance monitoring reports for the benefit of those reviewing performance.

8.1.2 Review of data quality procedures and practices after each reporting period will be coordinated by the Corporate Performance & Improvement Team, through the Performance Leads Group and direct feedback from performance leads in partner organisations.
8.2 Strategic Review

8.2.1 The data quality strategy and main processes will be reviewed at least annually to ensure that they remain fit for purpose and reflect the changing needs of the Council.

8.2.2 Each year the Council's approach to data quality is subject to independent audit by PWC as part of the Use of Resources assessment. Feedback from that audit will be used to review and revise the Council's data quality strategy and arrangements. These findings and proposals for improvement will be formally reported to the Corporate Performance & Improvement Board and CMT.
9. SYSTEMS AND PROCEDURES

9.1 Master List of Corporate Performance Indicators

9.1.1 It is important to have an up to date record of all performance information which is to be collected and reported.

9.1.2 The TEN Performance System is the primary repository and source for corporate performance information, and records responsibilities in relation to each performance indicator.

9.2 Performance Indicator Data Quality Template

9.2.1 It is vital that the Council and its partners apply consistent data standards to performance information. It is essential to clearly record the precise definition of each measure and the systems by which information is collected and recorded, to ensure that it can be repeated the same way each time.

9.2.2 Clarifying and recording this basic information, and the procedures and data sources for each indicator, within a consistent template is the foundation of good quality data; without it, other procedures relating to performance management quickly become unreliable.

9.2.3 The Council has therefore adopted a Performance Indicator Data Quality Template, which should be completed for the National Indicators for which the County Council is the designated reporting organisation and for the Local Area Agreement reward indicators, alongside submission of data for year end reporting.

9.2.4 Once completed, the template will be stored in, and accessible through, the TEN Performance System.

9.2.5 Templates require timely updating if information or requirements have changed, in addition to the annual update.

9.3 Year End Collection Data Quality Pro forma

9.3.2 During external audits, there should be at least one other officer who is able to provide advice and information on the PI in the absence of the lead officer. This is an important control to ensure that audit work proceeds smoothly.
9.3.3 When information is presented for external audit, the named Validator and Responsible Officer (see the roles and responsibilities section) must both review working papers to confirm that the definition has been followed, the calculations are correct and the indicator is supported by a full audit trail.

9.3.4 A full audit trail should be supported by:
- system notes;
- explanation of any variance from the previous year or from target;
- documentation supporting any estimates, sampling, or any apportionments made;
- supporting information (e.g. spreadsheets, databases, screen dumps), or at least a full description of where the supporting information is kept.

9.4 Collection, Validation and Presentation Process for Performance Information

9.4.1 Collection of performance information in a consistent and timely way requires a clear process, timetable and up to date guidance, so that everyone involved understands what is required and the necessary stages of validation, review and sign off to ensure high data quality.

9.4.2 The collection process is set out in diagrammatic form in Appendix A. Currently, performance data is collected quarterly each financial year.

9.4.3 A detailed timetable for each reporting period is negotiated and agreed with Departments and the Committee Services Team by the Corporate Performance & Improvement Team. This timetable forms part of the briefing and process papers issued at the start of each collection period.

9.5 The TEN Performance System

9.5.1 The TEN Performance System allows National Indicator data and data associated with the Sustainable Community Strategy to be collected and shared with relevant partners. The system collects predicted performance for the year end and forecast risk ratings relating to the likelihood of current year and three year targets being achieved.

9.5.2 The system comprises two elements: a data entry system for collecting and holding data and commentary, accessible by a restricted and controlled number of users; and a separate reporting system which allows that information to be viewed on the Leicestershire Together website.

9.5.3 The latest available data and commentary will be collected by departmental/organisational performance leads and entered onto the data entry system. Once data checks have been carried out and commentary and risk ratings have been reviewed, this information will be published online.

9.5.4 Automatic transfer of data has been enabled from the District Council performance systems to the TEN Performance System.
9.5.5 The system will also hold National Indicator Set definitions.

10. RISKS AND RISK MANAGEMENT

10.1 Managing Strategic Risks

10.1.1 The Council operates a formal risk management process, and strategic risks in relation to data quality are considered through that system both corporately and departmentally.
10.1.2 Such risks may include major operational changes, system changes, or major external changes.

10.1.3 Guidance notes for assessing strategic risks in relation to data quality have been incorporated into the Corporate Risk Management framework. Strategic risks in relation to data quality will be considered alongside other risks by departments as part of their normal approach to risk assessment.

10.2 Managing Operational Risks

10.2.1 There are two main operational risks in relation to data quality in performance management:

- Poor performance arising from a lack of effective monitoring and early reaction to performance which is off track. The Council's systematic approach to quarterly review of key corporate performance indicators, set out above, is intended to manage this risk.
- Inaccurate data, which undermines correct decision making and planning within the monitoring process, and can lead to adverse external judgements, either through an incorrect comparison with other organisations or because the indicator, and the performance it represents, is discounted because of poor data quality.

10.2.2 A further risk relates to knowledge management in the context of staffing changes. Data quality is very often dependent on the detailed specialist knowledge and experience of members of staff in relation to particular performance indicators and the systems which underpin them. Changes of staff, and the loss of this knowledge, represent a particular risk. This is managed by ensuring that systems, procedures and practices are clearly recorded on Performance Indicator Data Quality Templates in sufficient detail to allow knowledge and understanding to be transferred, and that those records are kept up to date.

10.2.3 There are a number of conditions that might lead to an indicator being considered at a higher risk of inaccuracy than others:

- complexity of performance information/calculation;
- complexity of guidance, or issues with interpretation;
- knowledge, skills or experience of staff;
- resource issues - time available, financial cost;
- weaknesses or gaps in the data collection process;
- weaknesses in the data reporting process;
- problems identified in previous years;
- external/third-party source of data.

10.2.4 Every indicator should therefore be considered against these factors, to identify those indicators at greater risk of inaccuracy which require action.

10.2.5 This risk assessment should be undertaken as a high-level review by Departmental Performance Leads, in consultation with Responsible Officers, through a process, guidance and timetable provided by the Corporate Performance & Improvement Team. An assessment methodology and guidance have been developed to support the identification and assessment of operational risks.
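The factor-based review described above amounts to checking each indicator against a list of risk conditions and flagging those that meet several of them. The sketch below is purely illustrative of that idea: the factor names, data structure and threshold are hypothetical assumptions, not part of the Council's actual process or systems.

```python
# Illustrative sketch only: recording the risk-factor review of indicators.
# Factor names and the threshold are hypothetical, not prescribed anywhere
# in the strategy.

RISK_FACTORS = [
    "complex_calculation",
    "complex_guidance",
    "staff_knowledge_gaps",
    "resource_pressures",
    "collection_weaknesses",
    "reporting_weaknesses",
    "problems_in_previous_years",
    "third_party_data",
]


def risk_score(indicator: dict) -> int:
    """Count how many of the risk factors apply to an indicator."""
    return sum(1 for factor in RISK_FACTORS if indicator.get(factor, False))


def indicators_needing_action(indicators: list, threshold: int = 3) -> list:
    """Return references of indicators whose factor count meets the threshold."""
    return [i["ref"] for i in indicators if risk_score(i) >= threshold]


if __name__ == "__main__":
    pis = [
        {"ref": "NI 1", "complex_calculation": True, "third_party_data": True},
        {"ref": "NI 2", "complex_guidance": True,
         "problems_in_previous_years": True, "third_party_data": True},
    ]
    # Only indicators meeting the example threshold of 3 factors are flagged
    print(indicators_needing_action(pis))
```

In practice the review is a qualitative judgement by Departmental Performance Leads rather than a mechanical count; the point of the sketch is simply that every indicator is assessed against the same factor list and the higher-risk ones are singled out for action.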
10.2.6 Where experience of past reporting has shown inaccuracies or system weaknesses to exist, it is for Responsible Officers to take steps to improve processes to address those weaknesses, with support from Departmental Performance Leads as necessary.

10.2.7 This should include control mapping and testing of PI systems to prevent and detect data manipulation and error.

10.2.8 Support and advice will be available from the Corporate Performance & Improvement Team, Departmental Performance Leads and Internal Audit as necessary.

11. PARTNERSHIP ARRANGEMENTS

11.1 Performance in relation to the LAA indicators is currently managed through a separate but similar framework using the TEN Performance System.

11.2 The operational guidance for these arrangements can be found on the Leicestershire Together website: http://www.leicestershiretogether.org/laa_guidance_manual_sept07.pdf

11.3 The County Council works together with partners to share data quality best practice, to enable us and the partnership to give quality assurance for all corporate and LAA indicators.

12. DATA SUPPLIED BY THIRD PARTIES

12.1 The Information Management and Advisory Group (IMAG) has developed a partnership-wide Data Quality Strategy, which was adopted prior to LAA2 in May 2008. This strategy, and the systems and procedures within it, are intended to support the practical implementation of that partnership-wide strategy.

12.2 As accountable body, the County Council will prescribe the data quality arrangements in relation to LAA reporting.

12.3 The Leicestershire Information Management Partnership was commissioned to develop an Information Sharing Protocol (ISP) to facilitate sharing of information between the public, private and voluntary sectors. The ISP will set out the rules that all people working for or with the partnership organisations must follow when using and sharing information. The Leicestershire Information Management Partnership reports to the Information Management Advisory Group.
12.4 In addition to the Information Sharing Protocol, partner organisations enter into Information Exchange Agreements where organisation-specific requirements need to be set out to facilitate information sharing.
APPENDIX A - DATA COLLECTION AND REPORTING PROCESS
APPENDIX B - SPECIFIC ROLES AND RESPONSIBILITIES TO ACHIEVE DATA QUALITY IN PERFORMANCE REPORTING

The Corporate Performance & Improvement Team has the following responsibilities with regard to data quality:

- circulation of advice and guidance regarding the National Indicator Set, assisting in the interpretation of the guidance as necessary;
- development of corporate processes and procedures for quarterly, bi-annual and year-end data collection and presentation;
- development and collation of a corporate Performance Indicator Data Quality Template for the National Indicators for which the County Council is the designated reporting organisation and for the Local Area Agreement reward indicators, to ensure clear accountability is allocated for data collection, calculation, validation, and management of the performance to which the measure relates, including target setting and explanation of performance;
- ensuring all Performance Indicator Data Quality Templates are signed off by the designated Responsible Officer;
- maintenance and development of the TEN Performance System and the Council's performance management systems;
- liaison with internal and external audit concerning reviews and audits of PIs and data quality processes;
- notification of auditor concerns to departmental performance officers, with a request that they ensure accountable officers take action;
- conducting regular consultation meetings with departmental performance officers to jointly review the effectiveness of current arrangements, and agreeing changes;
- ensuring that partner agencies are aware of their responsibilities regarding submission of LAA performance indicator data;
- undertaking periodic checks to ensure that the information entered onto the Data Interchange Hub is the same as the information entered onto the TEN Performance System.
The Responsible Manager/Director is responsible for reviewing and signing off the overall data and commentary for the department for corporate reporting once all stages have been completed. This person will have overall high-level accountability for departmental performance. They are also responsible for signing off any subsequent amendments.

Departmental Performance Leads (LAA equivalent - Block Leads) have a responsibility for coordinating data quality activity within their departments as it relates to performance management. This comprises the following duties, which should be incorporated into officers' job descriptions:

- ensuring accountability for each corporate performance measure is allocated and recorded using the corporate Performance Indicator Data Quality Template;
- passing on guidance received from the Corporate Performance & Improvement Team to officers accountable for individual indicators;
- informing relevant officers of data quality procedures, including amendments;
- managing information on the relevant guidance for each PI, ensuring that any guidance provided is accessible for reference and that relevant officers are aware of any updates/changes;
- assisting Responsible Officers and Validators in the interpretation of the relevant guidance;
- collation of performance information for responsible managers reporting to their seniors;
- liaison with internal and external audit concerning reviews and audits of PIs;
- ensuring that Responsible Officers make appropriate changes in response to auditor recommendations;
- training officers in the corporate approach to performance monitoring and reporting;
- working with the Central Performance Team to review and improve the arrangements for data quality;
- entry of year-end information into the Data Interchange Hub;
- ensuring that information entered onto the Data Interchange Hub is the same as the information entered onto the TEN Performance System.

The Responsible Officers (LAA equivalent - Outcome Lead/Indicator Owner) are responsible for making sure that data is collected within agreed time frames, is accurate, and is supported by appropriate processes to ensure consistency and facilitate independent audit. Responsible Officers have the following duties with regard to data quality, which should be incorporated into officers' job descriptions:

A. Ensuring accuracy and timeliness

- understanding definitions and guidance for the PI(s) for which they are responsible;
- ensuring that there are adequate information systems in place to collect the required information for the PIs in the required timescale;
- intervening with corrective action where data has not been collected adequately;
- appraising the information collected by PI Calculators for realism, raising questions and resolving queries as necessary;
- ensuring that data for a performance indicator is entered into a local performance data collection system, overseen by Performance Leads, within agreed reporting timeframes;
- signing off the data, confirming its accuracy, when it is passed through for departmental and corporate reporting;
- ensuring that the Information Sharing Protocol is used when dealing with external bodies, and that providers are aware of how the information will be used.

The Performance Indicator Validator role might be undertaken by the Responsible Officer or the Performance Lead.
However, the Validator role cannot be undertaken by the same person who either collects data or calculates the indicator. The role of the Validator, which should be incorporated into officers' job descriptions, involves providing independent assurance that all the data collected is accurate and fit for purpose, that the calculation has been performed correctly, and that the definition has been appropriately applied. This involves:

- understanding definitions and guidance for the PI(s) which they are responsible for checking;
- checking that the actual and cumulative figures entered against the indicator are correct, by appraising the return for realism against historic data, anticipated outturn, target and comparator data;
- raising queries with the Responsible Officer/Calculator/Departmental Performance Officer where concerns over data quality arise, and securing assurance as to accuracy, including recalculation if appropriate;
- carrying out annual spot checks on the data and methods of collection used for all LAA reward indicators.

A generic checklist for this process should include the following:

- ensure definitions align with the agreed LAA2 target detail or NI set;
- review data collection systems to ensure that these align with the agreed definition and description;
- review accessibility, to ensure that in the event of key staff being unavailable the data can be reproduced;
- ensure that the methodology and extraction of information are documented in the form of a procedure where complex;
- hold supporting information or records for all targets, particularly where annual figures are produced (these should be dated, with the extraction method stated, periods covered, etc.).

For final outturn information, assess the information produced to ensure that:

- it is accurately extracted and arithmetically correct;
- the supporting information is reasonable and supports the final outturn figure;
- the right time periods have been used for extraction, with no duplication of data items; and
- any significant variations can be fully explained or supported.

The PI Calculator(s) have the following duties with regard to data quality, which should be incorporated into officers' job descriptions:

- interpreting the indicator definition and identifying the assumptions required, ensuring the Responsible Officer approves the approach;
- defining the data requirements to allow collection of data from the Data Collectors;
- understanding the approach taken by Data Collectors and the checks that they perform;
- calculating the information, in the timescales required, in compliance with the definitions and guidance;
- ensuring data is accurate;
- setting down the calculations for each mid-year and end-of-year reporting period in a working paper, which is retained in a formal file structure to allow retrospective validation/audit of the calculation at a future date;
- explaining movements in the results that relate to the calculation rather than to performance changes;
- raising concerns with the Data Collectors and Responsible Officers should they arise;
- reporting accurate information to the Responsible Officer and/or the Departmental Performance Lead, as required.

The Data Collector's main role is to collect and accurately record the raw
data which underpins the performance indicator. This should be incorporated into officers' job descriptions and involves:

- understanding definitions and guidance for the PI(s) for which they are responsible for providing data;
- collecting and preparing data for input in accordance with the agreed method as set out in the performance indicator template;
- ensuring data is collected and reported accurately in the timescales required, in compliance with the definitions and guidance set out in the performance indicator template.

APPENDIX C - SOURCES OF FURTHER INFORMATION

1. Details of the operational processes, templates and guidance described in this strategy: http://intranet.leics.gov.uk/dataquality
2. Leicestershire County Council Information Management Strategy 2009-11, June 2009
3. Leicestershire County Council Data Strategy, August 2009 (intranet link)
4. The Leicestershire Together Data Quality Strategy
5. A framework to support improvement in data quality in the public sector, Audit Commission, November 2007
6. Nothing but the Truth? A Discussion Paper, Audit Commission, November 2009

APPENDIX D - CONTACTS FOR FURTHER INFORMATION

Richard Wilding, Performance Improvement Manager, 0116 3055 7308, richard.wilding@leics.gov.uk
James Fox, Performance Improvement Manager, 0116 305 8077, james.fox@leics.gov.uk
Nicola Truslove, Performance Improvement Manager, 0116 305 8302, nicola.truslove@leics.gov.uk
APPENDIX E - PERFORMANCE INDICATOR DATA QUALITY TEMPLATE

Leicestershire County Council/Leicestershire Together Performance Indicator Data Quality Template 2009-10

Please enter information for one Performance Indicator only. Please save as: <<PI DQTemplate / indicator reference number / 2009-10.doc>>

N.B. Completed information will be kept for reference purposes and will be publicly available.

Part 1 - Indicator Reference Information

- PI Reference number(s) (if there is more than one reference number, please include them all)
- Is this an LAA2 reward indicator? Yes / No (delete one)
- Performance Indicator Title (please replicate the exact wording used in any official guidance)
- Unit of Measure (e.g. % or number)
- Number of Decimal Places
- Date of Introduction
- Polarity: is high or low better?
- Please list the title and issuing body of any official guidance used to calculate this indicator, including internet source.
- PI description and definition (for locally developed indicators, please define precisely what the indicator measures)
- What is the calculation formula? (NB: it is particularly important for the calculation method of locally developed indicators to be clearly set out.)
- Methodology: how is the data actually collected locally? (It is important that this is in sufficient detail to allow another person to repeat the process and achieve the same result.)
- What is the measurement period? (e.g. academic or financial year)

Part 2 - Year End Data Audit Trail, Validation & Signoff

Section 1: Calculation - PI Calculator Responsibility

It is the responsibility of the Calculator to provide the initial figure for the end-of-year outturn, together with the supporting data and documentation to allow that calculation to be independently checked/audited.

Check the PI definition: refer to the Data Quality Template/CLG guidance for the correct definitions and calculation methods for this Performance Indicator.

Set down in the relevant spaces below:
- the outturn for year end against the year-end target;
- the calculations by which the final year-end figure was derived.

Identify and provide locations for the actual data sources drawn on for the calculations.
Show the calculations used to produce the indicator: work through the definition using the data obtained for the calculation, setting out clearly the process that has been followed and the data that has been used. Clearly identify significant points (such as the numerator and denominator) in the calculation, and the final figure.

Out-turn Data (please also include the out-turn for last year and the target for this year)

What is the location of source records/working files for audit purposes and future reference? List any computer or manual files that you have used to access source data for your calculations, and identify from where copies can be obtained, with file path where relevant. You must retain any reports or printouts as source evidence for audit purposes.

- 2008-09 out-turn (if available)
- 2009-10 out-turn
- 2010-11 target

What population figure have you used? (If applicable.) Please name the source and reference date for the data.

Calculator's comments: please note anything about the calculation that the Validator, Responsible Officer or Auditor should be aware of, e.g. statistical or other factors which may have resulted in variations from what was expected.

Name of PI Calculator:
Organisation/Department:
Date:
Contact Telephone No & email:

Once you have completed all sections, email this document to the Validator for review.
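The audit-trail principle in Section 1 is that the numerator, denominator and result are set down together so the calculation can be independently rechecked later. As a minimal sketch of that idea, assuming a simple percentage-type indicator (the function name, figures and structure below are hypothetical, not taken from any PI definition):

```python
# Illustrative sketch only: computing a percentage-type indicator outturn
# while keeping the working values for later validation/audit.
# All names and figures are hypothetical.

def calculate_outturn(numerator: float, denominator: float,
                      decimal_places: int = 1) -> dict:
    """Return the outturn together with the working values used to derive it."""
    if denominator == 0:
        raise ValueError("Denominator is zero - check the source data")
    outturn = round(numerator / denominator * 100, decimal_places)
    # Keep numerator, denominator and result together so the Validator
    # can recheck the arithmetic against the definition
    return {"numerator": numerator,
            "denominator": denominator,
            "outturn_pct": outturn}


working_paper = calculate_outturn(1234, 5678, decimal_places=1)
print(working_paper)
```

The same principle applies however the real indicator is defined: the working paper records the inputs, not just the final figure, and the number of decimal places is taken from the template's reference information.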
Section 2: Validation - Validator Responsibility

Validator instructions: it is the responsibility of the Validator to be satisfied that the year-end figure is accurate and calculated correctly before it is reviewed by the Responsible Officer for management purposes.

Annual spot check: a spot check should be undertaken by the Validator on all LAA reward indicators to check the actual data and methods of collection being used.

Accuracy check: is the end-of-year calculation arithmetically correct, and correctly expressed according to the definitions set out in the reference section above?

Reality check:
- Compare the calculated figure with last year's outturn, if available.
- Compare the calculated figure with the target that has been set for this year.
- If there are significant unexpected variations, have known reasons for them been identified? Confer with the Responsible Officer/Calculator if appropriate.
- Does the calculation need to be referred back to the Calculator to be checked for accuracy?

Only once you are satisfied that the outturn is reliable should you sign off the calculations for management review.

Validator's comments:

Name of PI Validator:
Organisation/Department:
Date:
Contact Telephone No & email:

Once your section is complete, email this document to the Responsible Officer for management review and signoff.
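The reality check above compares the calculated figure with last year's outturn and this year's target, and raises queries where variation is significant and unexplained. The sketch below illustrates this in outline; the tolerance threshold, names and message text are hypothetical assumptions, not part of the Validator guidance, and in practice "significant" is a matter of judgement for the Validator:

```python
# Illustrative sketch only: the Validator's reality check of a calculated
# outturn against last year's outturn and this year's target. The 10%
# tolerance is a hypothetical example, not a prescribed threshold.

def reality_check(outturn: float, last_year, target: float,
                  tolerance_pct: float = 10.0) -> list:
    """Return a list of queries to raise with the Responsible Officer/Calculator."""
    queries = []
    if last_year is not None and last_year != 0:
        change = abs(outturn - last_year) / abs(last_year) * 100
        if change > tolerance_pct:
            queries.append(
                f"Outturn differs from last year's figure by {change:.1f}%")
    if target != 0:
        variance = abs(outturn - target) / abs(target) * 100
        if variance > tolerance_pct:
            queries.append(
                f"Outturn differs from this year's target by {variance:.1f}%")
    return queries


# An outturn well above last year's figure but close to target raises
# one query for the Validator to pursue before sign-off.
print(reality_check(85.0, last_year=72.0, target=80.0))
```

An empty list does not by itself justify sign-off: the accuracy check (arithmetic and definitions) still has to be completed, and any queries that are raised must be resolved with the Responsible Officer or Calculator first.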
Section 3: Management Review - Responsible Officer

The Responsible Officer is charged with overall responsibility for producing an accurate year-end figure, and for commenting on performance, management action and target setting where necessary.

- Verify that all sections have been completed.
- Review performance and include commentary where performance has missed or exceeded the target, addressing:
  - reasons for under-performance or over-performance;
  - proposed action to bring performance back on track.

Responsible Officer's comments:

Name of Responsible Officer:
Organisation/Department:
Date:
Contact Telephone No & email:

Once Section 3 is complete, email this form to the Organisational/Departmental Performance Lead for collation of year-end returns.
Part 3

Optional - Feedback comments: please use this section to give any feedback about the value/usefulness of this indicator and/or any suggested improvements to it, or to any of the collection and reporting processes, which may help inform a future review.

Optional - Other notes and information: use this section to include any further information or feedback which may be helpful/relevant.