
HERTSMERE BOROUGH COUNCIL
DATA QUALITY STRATEGY
December 2009

INTRODUCTION

Public services need reliable, accurate and timely information with which to manage services, inform users and account for performance. Service providers make many, often complex, decisions about their priorities and the use of resources. Service users, and members of the public more widely, need accessible information to make informed decisions. Regulators and government departments need information to satisfy their responsibilities for making judgements about performance and governance.

Much time and money is spent on the activities and systems involved in collecting and analysing the data which underlies performance information, yet there sometimes remains a prevailing lack of confidence in much of this data. As increasing reliance is placed on this information in performance management and assessment regimes, the need for reliable data remains critical.

Good quality data is the essential ingredient for reliable performance and financial information to support decision-making. The data used to report on performance must be fit for purpose and represent an organisation's activity in an accurate and timely manner. At the same time there must be a balance between the use and importance of the information, and the cost of collecting the required data to the necessary level of accuracy.

The new Comprehensive Area Assessment (CAA) from 2009 makes reliable performance information even more important. Councils will be expected to use information to reshape services and to account to the public for performance.

Producing data which is fit for purpose should not be an end in itself, but an integral part of the organisation's operational, performance management and governance arrangements. It is recognised that organisations that put data quality at the heart of their performance management systems are most likely to be actively managing data in their day-to-day business, and turning that data into reliable information. [1]

Statement on Data Quality

Hertsmere Borough Council is committed to the very highest levels of data accuracy. This includes data used internally for management decision-making, and data reported externally on the performance of the Council and its services.

In order to meet its commitments to data quality, the Council will ensure that:

o Data quality is owned and understood across the organisation
o The appropriate level of resources is invested in data quality in order to meet the Council's data quality commitments
o Robust quality control procedures are in place
o Independent external audits of data are reported internally and externally, and that improvement actions are acted upon in order to continuously improve the Council's approach.

[1] Grant Thornton Data Quality Audit Report, November 2008

In order to do this Hertsmere Borough Council will:

o Ensure that performance information is of high quality, consistent, timely and comprehensive, and held securely and confidentially
o Put in place arrangements at senior level to secure the quality of data that we use to manage our services and demonstrate our performance
o Make clear what is expected from officers and contractors in terms of the standards of data quality
o Put in place systems, policies and procedures to enable the highest possible data quality, particularly where information is shared with partners
o Ensure that we put in place the right resources, and in particular have the right people with the right skills, to ensure we have timely and accurate performance information
o Ensure that we have the right controls to ensure that the Council meets what is expected of us.

DIMENSIONS OF DATA QUALITY

There are six dimensions of good quality data. [2]

1. Accuracy
Data should be accurate, presenting a fair picture of performance and enabling informed decision-making at all appropriate levels.

2. Validity
Data should represent clearly and appropriately the intended result. Where proxy data is used, bodies must consider how well this data measures the intended result.

3. Reliability
Data should reflect stable and consistent data collection processes and analysis methods across collection points and over time, whether using manual or computer-based systems or a combination. Service Heads should be confident that progress towards performance targets reflects real changes rather than variations in data collection methods.

4. Timeliness
Data must be available for the intended use within a reasonable time period. Data must be available frequently enough to influence the appropriate level of management decisions.

5. Relevance
The data reported should comprise the specific items of interest only. Sometimes definitions for data need to be modified to reflect changing circumstances in services and practices, to ensure that only relevant data of value to users is collected, analysed and used.

6. Completeness
All the relevant data should be recorded. Monitoring missing or invalid fields in a database can provide an indication of data quality and can also point to problems in the recording of certain data items.

[2] Robson Rhodes Data Quality Audit 2005-06, November 2006

GUIDING PRINCIPLES

In addition to this, there are a number of principles that underpin good data quality. It is important that these are considered sequentially, because if any of these principles is not adhered to, inaccuracies are likely to occur. These principles are as follows:

Awareness
Everyone recognises the need for good data quality and how they can contribute.

Data quality is the responsibility of every member of staff entering, extracting or analysing data from any of the council's information systems. Every relevant officer should be aware of his or her responsibilities with regard to data quality. The commitment to data quality will be communicated clearly throughout the council and stated in all corporate documents.

Some information officers will have overall responsibility for data quality on a system, but this does not exempt others from the responsibility to ensure that data is accurate and up-to-date. Responsibility for data quality should be reflected in job descriptions and appraisal processes. Departments are encouraged to ensure that suitable appraisal targets and paragraphs in job descriptions are included, bearing in mind the level of involvement staff have in the Performance Indicator (PI) reporting processes.

There is collective responsibility for data quality, but it is necessary to be clear about what actions and responsibilities are allocated to specific individuals and teams in order to implement this strategy.

Definitions
Everyone knows which PIs are produced from the information they input and how they are defined.

All officers should know how their day-to-day job contributes to the calculation of PIs and how lapses can either lead to errors or delay reporting, both of which limit our ability to manage performance effectively. This means that everyone should have an understanding of any PIs affected by their area of work.

The National Indicator set under the Comprehensive Area Assessment regime has nationally set definitions. It is important that every detail of the definition is applied. This ensures that data is recorded consistently, allowing for comparison over time and national benchmarking.

Where the council is setting local PIs, it needs to ensure that a clear definition has been established and that there are systems available to collect and report the data in an agreed format. In some cases there are a number of similar indicators (some national and some local) measuring the same thing in slightly different ways. It is important to ensure that separate figures are calculated and reported systematically for each definition.

Every PI should have a named officer who is responsible for collecting and reporting the information. This ensures that there is consistency in the application of definitions and use of systems for providing the data.

Input
There are controls in place with regard to the input of data.

System-produced figures are only as good as the data input into that system in the first place. The aim should be 100% accuracy, 100% of the time. It is important that officers have clear guidelines and procedures for using systems and are trained to ensure that information is being entered consistently and correctly.

A key requirement is that data should be entered on an ongoing basis, not saved up and entered as a block at the end of a period. This reduces the error rate and the need for complicated verification processes.

Controls should be in place to avoid double counting. These controls should be designed according to the nature of the system, in particular where more than one person inputs data. One control should be an absolutely clear division of responsibility, setting out who is responsible for what data entry.

Verification
There are verification procedures in place as close to the point of input as possible.

Data requirements should be designed along the principle of getting it right first time, so as to avoid wasting time and money. This is where verification is important.

The simplest verification system might be a review of recent data against expectations, or a reconciliation of system-produced data with manual input records. Depending on the complexity of the system, it might be necessary to undertake more thorough verification tasks, such as:

o data cleansing, e.g. removing duplicate records or filling in missing information
o sample checks to eliminate recurrence of a specific error
o a test run of report output to check the integrity of the query being used to extract data
o spot checks, e.g. on external contractor or partner information

Particular attention needs to be paid to data provided by external sources. A number of PIs are calculated using information provided by contractors or partners, and the council must work with contractors and partners to ensure that data is accurate. When entering into contracts with service providers or agreements with partners it is essential that, wherever relevant, there is a requirement to provide timely and accurate performance information and that we are clear with the contractor or partners about their responsibilities for data quality.

It might not be possible to alter existing contracts or agreements so that contractors or partners are fully committed to providing an agreed quantity and quality of performance data. In this case, the data must be treated as high-risk, and system checks and measures must be established to ensure confidence about the accuracy of this data. Where concerns exist about the integrity of data from external parties, we should work with these agencies to provide assurance and rectify any problems.

Ultimate responsibility for data verification lies with Service Heads, but Internal Audit and the Corporate Performance team can offer advice and guidance about verification procedures.
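This strategy does not prescribe any particular tooling for verification. Purely as an illustration, the sketch below shows the kind of simple automated checks described above (flagging duplicate records and missing or invalid fields) applied to a hypothetical extract of PI source data; the file name, column names and validity rule are assumptions made for the example, not part of any council system.

```python
# Illustrative sketch only: simple automated verification checks of the kind
# described above (duplicate records, missing or invalid fields). The file
# name, column names and validity rule are hypothetical examples.
import csv
from collections import Counter

REQUIRED_FIELDS = ["record_id", "service_unit", "date_recorded", "value"]

def verify_extract(path):
    """Report duplicate record IDs and missing or invalid fields in a CSV extract."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Duplicate check: the same record_id appearing more than once suggests double counting.
    id_counts = Counter(row.get("record_id", "") for row in rows)
    duplicates = [rid for rid, n in id_counts.items() if n > 1]

    # Completeness check: count rows with any required field left blank.
    incomplete = [row for row in rows
                  if any(not (row.get(field) or "").strip() for field in REQUIRED_FIELDS)]

    # Example validity rule: the reported value must be a non-negative number.
    invalid = []
    for row in rows:
        try:
            if float(row.get("value", "")) < 0:
                invalid.append(row)
        except ValueError:
            invalid.append(row)

    print(f"{len(rows)} rows checked")
    print(f"Duplicate record IDs: {duplicates or 'none'}")
    print(f"Rows with missing required fields: {len(incomplete)}")
    print(f"Rows failing the validity rule: {len(invalid)}")

if __name__ == "__main__":
    verify_extract("pi_source_extract.csv")  # hypothetical file name
```

Checks of this kind are no substitute for the manual spot checks and reconciliations described above, but they can be run routinely and close to the point of input.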

Responsibility
Data quality is the responsibility of every employee who enters, extracts or analyses data from any of the Council's information systems and records.

Every employee should be aware of his or her responsibilities for the quality of data. Directors and Service Heads have responsibility for ensuring that accurate and complete records are maintained and that performance, appraisal and disciplinary processes are in place to maintain and enhance data and information quality for their service.

Each performance indicator has an assigned officer who takes responsibility for the systems that support the PI, as well as for reporting the information to the required standards.

o Officers with responsibility should document the procedures that need to be undertaken to produce the information to the required standard on the PI Audit Trail Form.
o Officers with responsibility should work closely with the Information Services Unit when procuring new systems, to ensure that performance data can be provided by and extracted from the systems and that a robust control environment is in place.
o These procedures should be reviewed and updated on an annual basis.

Officers should ensure that they have a deputy able to produce PI information in their absence.

Systems
Systems should be fit for purpose and staff have the expertise to get the best out of them.

There are a number of conditions that might lead to an information system being considered high risk. Risk assessments will be updated annually by the Corporate Performance team in consultation with each unit. Each PI will be scored out of 3 against the following criteria (an illustrative scoring sketch is given at the end of this section):

o ease of collection
o succession planning
o accuracy of information

The responsibility for system improvements lies with units, but support will be available from the Corporate Performance and Internal Audit teams. Where high-risk systems are identified, the following steps will need to be taken:

o analysis of the control environment
o identification of gaps
o design of additional controls and procedures to address gaps
o preparation of an action plan
o monitoring the implementation of the action plan

There are a number of tools that might be used to analyse the control environment. These include:

o A map of key controls, showing the progress of information from the input to output stages; this can be used to document the people, processes and tools that exist to ensure that expectations are met at every stage.
o A verification checklist, which would take the form of a series of yes/no answers and could be used by anyone to determine whether information flows accurately through the system.

These tools exist to identify weaknesses in the controls or confirm that the control environment is robust. Once this has been undertaken systematically, gaps in the control environment will be evident and new systems and procedures can be designed, addressing any part of the PI production process where controls are weak. This might involve a new verification process, new input controls or improved training and communication.
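The strategy does not define how the three criterion scores are combined or what counts as high risk. The sketch below is a minimal illustration only, assuming each criterion is scored from 1 to 3 (with 3 indicating higher risk) and that a combined score at or above an assumed threshold flags the PI's supporting system as high risk; the indicator names and threshold are hypothetical.

```python
# Illustrative sketch only. The strategy states that each PI is scored out of 3
# against three criteria (ease of collection, succession planning, accuracy of
# information) but does not define the scoring direction or a high-risk
# threshold; both are assumptions made here for the example.
from dataclasses import dataclass

CRITERIA = ("ease_of_collection", "succession_planning", "accuracy_of_information")
HIGH_RISK_THRESHOLD = 7  # assumed cut-off on the combined score (maximum 9)

@dataclass
class PIRiskAssessment:
    indicator: str
    ease_of_collection: int        # 1 (low risk) to 3 (high risk) - assumed scale
    succession_planning: int
    accuracy_of_information: int

    def total_score(self) -> int:
        return sum(getattr(self, c) for c in CRITERIA)

    def is_high_risk(self) -> bool:
        return self.total_score() >= HIGH_RISK_THRESHOLD

# Hypothetical example assessments, not real council indicators.
assessments = [
    PIRiskAssessment("Local PI: average void relet time", 3, 3, 2),
    PIRiskAssessment("Local PI: planning applications determined on time", 1, 2, 1),
]

for a in assessments:
    status = "HIGH RISK - review control environment" if a.is_high_risk() else "standard monitoring"
    print(f"{a.indicator}: score {a.total_score()}/9 - {status}")
```

However the scores are combined in practice, recording them against each PI in this structured way makes the annual risk assessment update straightforward to repeat and compare year on year.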

Output
PIs are extracted regularly and efficiently and communicated quickly.

Best use of performance data can be made if it is produced and communicated on a timely basis that allows for management ratification. PIs must be submitted onto Covalent (the council's performance system) within 2 weeks of the end of the quarter to which they relate, so that they can be subjected to Data Review and then go to the Executive Performance Management Panel within 4 weeks of the end of the quarter. PI reports are then presented to the Overview and Performance Committee and the Executive. However, while the PI reports are going through the democratic process, work should be underway to clarify and, if necessary, rectify any identified issues.

Presentation
PIs are reported and presented with full supporting evidence, in a clear and understandable manner.

Reporting accurate information regularly leads to good decision-making and improved performance. Data quality is therefore important if our stakeholders are to have confidence in the information that we present and the decisions that we make.

In the past, the Audit Commission has made qualifications, reservations and amendments to a number of our PIs. Although the change in the inspection regime means that PIs are no longer subject to qualification and reservation in the same manner, this remains an important issue, as such findings create an impression that performance information cannot be relied on. Confidence in the council's data quality is also important because it impacts on the external audit assessment under the Use of Resources framework.

During audits there should be at least one other officer per unit who is able to provide advice and information on PIs in the absence of the unit's lead PI officer. This will provide business continuity both in terms of performance management and audit.

When information is presented for audit, Heads of Service must review working papers to confirm that the definition has been followed, the calculations are correct and the indicator is supported by a full audit trail. A PI Audit Trail form, accompanied by full supporting evidence, must be compiled for all PIs. This needs to include:

o a calculation, cross-referenced both backwards (to the self-assessment form) and forwards (to supporting information)
o system notes
o an explanation of any variance from the previous year
o documentation supporting any estimates, sampling or apportionments made
o a full description of where the supporting information (e.g. spreadsheet, database, screen dumps) is kept.

Guidance has been issued to departments on the procedures necessary to ensure that high quality audit submissions are made. This guidance was further consolidated in a series of workshops on the subject, which will be repeated as necessary to ensure that the council continually improves data quality standards.

Some of these procedures are limited to indicators subject to external audit: National PIs and indicators established by the Audit Commission for use in the service assessment frameworks for Housing, Environment and Culture. However, there are locally set definitions for some Service Plan targets, and it is necessary to ensure consistency of reporting for these indicators as well. Additionally, there are other performance indicators (such as local management indicators) that are not subject to audit but are used in performance assessments, and it is essential that these are also reported fairly.

As part of the annual governance assurance processes, Heads of Service should complete the Checklist for Local and National Performance Indicators (Appendix A) on an annual basis. It is vital that information is of the highest quality and integrity to enable effective decision-making.

DATA QUALITY ACTION PLAN

The council has a data quality action plan in place to ensure that it continually improves its data quality standards. Progress against the action plan will be reviewed on a regular basis as part of the quarterly Data Review performance meetings, and the plan will be fully refreshed on an annual basis. The data quality action plan (Appendix B) takes into account feedback from external assessment (Data Quality Audit, November 2008, and Use of Resources Assessment, 2009).

Appendix A

CHECKLIST FOR LOCAL / NATIONAL PERFORMANCE INDICATORS (tick each item if yes)

[1] INDICATOR DEFINITION
[a] Is there a clear, approved definition for the performance indicator entered onto Covalent, does it cross-reference to the original source, and is the definition dated?
[b] Have all aspects of the definition been complied with when calculating the indicator?

[2] TARGET-SETTING
[a] Is there an explanation as to how and why the target has been set? (This should be added as a note in Covalent.)
[b] Is the target realistic? If not, are you aware of what result is expected?
[c] Are the quartile figures / threshold figures set on Covalent?

[3] STATISTICS
[a] Are copies of relevant statistics attached to the indicator in Covalent, and are they up-to-date (e.g. population statistics, Office for National Statistics estimates etc.)?
[b] Have the source of the statistics and the date been provided? (These should be added as a note in Covalent.)

[4] AUDIT TRAIL
[a] Is there a signed-off Audit Trail for each Performance Indicator?
[b] Is the whole process shown from start to finish, and is it clear where the process starts and finishes?
[c] Have detailed workings been provided to clearly and transparently show how relevant calculations were performed?
[d] Has the calculation been performed as per the indicator definition?

[e] Are all documents / reports etc. clearly referenced to their source?

[5] FOLLOW-UP ACTIONS
[a] Have any previous issues (from Internal Audit / External Audit / central QA) been actioned, and has this been evidenced / recorded?

[6] APPROVAL / SIGN-OFF
[a] Has the Audit Trail been signed and dated by the Manager and the deputy owner?

Signed (Head of Service): _______________________   Date: _______________

Appendix B
Action Plan

This action plan includes recommendations made by the External Auditors (Data Quality Audit, November 2008) intended to assist the Council in achieving sufficient improvements to achieve a good performance within the new Use of Resources framework, KLOE 2.2. It also reflects feedback from the External Auditors following the first assessment under the Use of Resources framework.

Data Quality Audit November 2008 Recommendations

KLOE 1.2: The body has clear data quality objectives

Ref 1
Recommendation: The council is reviewing its local indicators and should consider how new indicators can be developed and designed to ensure high levels of data quality.
Priority: High
Management Response / Progress to Date: Agreed. Series of workshops conducted in November / December 2008 for each department with external Performance Advisors (Y.Change) to develop robust local indicators. All indicators have a detailed definition and audit trail in place; new set of indicators presented to Overview and Performance Committee and the Executive in April 2009. A checklist for local and national performance indicators (Appendix A of the revised Data Quality Strategy, December 2009) is to be introduced as part of annual assurance processes.
Lead Officer / Responsibility: Head of Corporate Support / Partnership Performance Officer
Timescale: Completed (checklist to be introduced through the 2009/10 Management Assurance Statements)

KLOE 2.2: Compliance with data quality policy

Ref 2
Recommendation: The Council should ensure that the new performance management and data quality strategies are effectively communicated, and are complied with by officers in practice.
Priority: Medium
Management Response / Progress to Date: Agreed. Performance Management Strategy agreed by Executive in April 2009, and a series of workshops conducted internally to communicate the requirements of the performance strategy and the data quality strategy. This has been reiterated with training on the use of Covalent (the council's performance management system).
Lead Officer / Responsibility: Corporate Support Manager / Performance Officer
Timescale: Completed

KLOE 3.1: Performance systems

Ref 3
Recommendation: The council should consider the scope for developing links and interfaces between Covalent and other key corporate systems.
Priority: Low
Management Response / Progress to Date: Agreed. To be explored through the IS Strategy, and in particular the procurement process for new finance and revenues and benefits systems currently underway.
Lead Officer / Responsibility: Head of Information Services
Timescale: To be agreed

KLOE 3.3: Security and continuity

Ref 4
Recommendation: The council should fully implement its plans to develop business continuity plans.
Priority: Medium
Management Response / Progress to Date: Agreed. Performance Management / Covalent not identified as a critical area under the current BIA processes. Business Continuity Plans ongoing.
Lead Officer / Responsibility: Head of Corporate Support / Heads of Service
Timescale: Ongoing

KLOE 3.4: Data sharing

Ref 5
Recommendation: A formal data sharing protocol, for use with all third parties, should be prepared, agreed with key stakeholders and implemented.
Priority: Medium
Management Response / Progress to Date: Agreed. Data sharing protocol to be drafted and agreed via Corporate Governance Group, February 2010. Formalisation of the current Protocol for Information Sharing with Herts. Constabulary to be completed in January 2010.
Lead Officer / Responsibility: Head of Corporate Support
Timescale: February 2010 (data sharing protocol); January 2010 (Herts. Constabulary protocol)

KLOE 4.2: Data quality training

Ref 6
Recommendation: The council should ensure that a programme of training and development is delivered to staff around data quality; this could be achieved in conjunction with training on performance management more generally.
Priority: High
Management Response / Progress to Date: Agreed. Performance Management Strategy agreed by Executive in April 2009, and a series of workshops conducted internally to communicate the requirements of the performance strategy and the data quality strategy. This has been reiterated with training on the use of Covalent (the council's performance management system).
Lead Officer / Responsibility: Corporate Support Manager / Performance Officer
Timescale: Completed

KLOE 5.1: Reported performance information is actively used in the decision-making process

Ref 7
Recommendation: Complete the exercise to review the local performance indicator set to ensure that these are relevant and appropriate, and further develop a culture of performance management within the council.
Priority: High
Management Response / Progress to Date: Agreed. Series of workshops conducted in November / December 2008 for each department with external Performance Advisors (Y.Change) to develop robust local indicators. All indicators have a detailed definition and audit trail in place; new set of indicators presented to Overview and Performance Committee and the Executive in April 2009. Performance is a standard item at all Senior and Middle Managers Meetings and Chief Officer Board. The Chief Executive includes performance as part of his and the Leader's quarterly staff briefing sessions. A new Portfolio Holder with responsibility for Performance was appointed in January 2009. A Review of Performance Arrangements and Improvements was presented to Overview and Performance Committee in October 2009.
Lead Officer / Responsibility: Head of Corporate Support / Partnership Performance Officer
Timescale: Completed

KLOE 5.2: Controls over reported data

Ref 8
Recommendation: Ensure that all externally reported information (i.e. not just NIs) is subject to corporate scrutiny and approval, especially where this information impacts on external assessments or income.
Priority: Medium
Management Response / Progress to Date: Agreed. Information is subject to corporate scrutiny as part of current procedures, reporting through Chief Officer Board, the relevant scrutiny committee and the Executive.
Lead Officer / Responsibility: Chief Officers
Timescale: Completed

Use of Resources 2009 findings and conclusion

No. 5: Agreed Protocols for Data Sharing
Recommendation: The Council should ensure that there are appropriate protocols in place for sharing and exchanging data with partners. This requirement should form part of the IS Strategy.
Priority: High
Management response: Agreed
Implementation details: Data sharing protocol to be drafted and agreed via Corporate Governance Group, February 2010, in conjunction with the Partnership Governance Framework.

No. 6: Spot-checking Performance Indicators
Recommendation: The Council should consider undertaking spot-checks on performance indicators throughout the year to ensure that they are being calculated correctly and in accordance with the agreed methodology for the indicator.
Priority: High
Management response: Agreed
Implementation details: Internal Audit to undertake a series of spot checks, in addition to the amendment to the Data Quality Strategy to include a performance indicator checklist (in addition to audit trails) to be signed off by Heads of Service as part of annual governance assurance processes.