
Royal Borough of Kensington and Chelsea Data Quality Framework ACE: A Framework for better quality data and performance information March 2010

CONTENTS

FOREWORD 2
A CORPORATE FRAMEWORK FOR DATA QUALITY 3
SCOPE OF THE DATA QUALITY FRAMEWORK 4
TOOLS FOR MANAGING DATA QUALITY 4
OTHER RELEVANT POLICY AND GUIDANCE 4
DEVELOPMENT 5
THE ELEVEN PRINCIPLES FOR DATA QUALITY 5
AWARENESS 5
COLLECTING AND RECORDING 8
EVALUATING 11
ANNEX: ROLES AND RESPONSIBILITIES 15

FOREWORD

The Royal Borough makes effective and wide-ranging use of performance information to help deliver Really Good Services and realise our vision of a Better City Life. Yet we are ambitious to do more. We want to take greater ownership of our improvement agenda and achieve even higher standards of performance. In doing so we will need to place a greater premium on high quality performance information to:

- understand how we are progressing against priorities and in comparison to others;
- make the right decisions at the right time;
- inform our strategies and ensure we focus resources where they are most needed;
- empower local people and account for our performance.

As Government seeks to reduce the expense and intrusiveness of assessment activity, it too is giving greater weight to performance information. Under the new local performance framework we are judged to a large extent on how we, with our partners, perform against the National Indicator Set and Delivering for Our Community, the delivery plan for the Borough's Community Strategy. Already stringent requirements for demonstrating the accuracy of our data will get tougher still, and unless we meet these requirements we cannot expect to continue to achieve first class assessment ratings.

Yet performance information is only as good as the data it is founded upon: if the numerous elements that make up the information are not accurate, comprehensive, reliable and timely, then we risk producing information that is not only of little value but also potentially misleading.

I am therefore pleased to introduce the Council's Data Quality Framework 2010, which reinforces the Council's commitment to securing high quality data and builds on arrangements already judged by our external auditors to be performing well. It details the expectations the Council has for those of us who work with data and performance information, which is the vast majority. And it sets out a small number of principles and practical guidance that, if followed, will allow us to be fully confident in the quality of our performance information.

I hope you find the Framework helpful.

Councillor Tim Ahern
Cabinet Member for Service Improvement

A CORPORATE FRAMEWORK FOR DATA QUALITY

The Council is committed to using good quality performance information as an integral element of sound governance and effective performance management. This Framework helps us realise that commitment by ensuring we adopt a consistent and best practice approach to the way we collect, collate, record and manage data. In particular, it:

- formalises what the Council expects of those who work with data and performance information;
- establishes eleven principles, clustered under three themes, to guide our data quality work; and
- contributes to a culture in which performance information and data are seen as valuable assets.

WHAT MAKES GOOD QUALITY DATA?

The Audit Commission defines the six characteristics of good quality data as follows.

- Accuracy: Data should be sufficiently accurate for its intended purpose, representing clearly and in enough detail the interaction provided at the point of activity.
- Validity: Data should be recorded and used in compliance with relevant requirements, including the correct application of any rules or definitions.
- Reliability: Data should reflect stable and consistent data collection processes across collection points and over time, whether using manual or computer based systems, or a combination.
- Timeliness: Data should be captured as quickly as possible after the event or activity and must be available for the intended use within a reasonable time period.
- Relevance: Data captured should be relevant to the purposes for which they are used.
- Completeness: Data requirements should be clearly specified based on the information needs of the body and data collection processes matched to these requirements.

Source: Audit Commission, Improving Information to Support Decision Making

ASSURANCES FOR DATA QUALITY INCLUDE:

- The correct numerator and denominator have been used and the calculation is correct and consistent with agreed methodologies;
- Figures agree to systems reports or compilation documents;
- The correct data has been included or excluded and it is accurate, complete and up-to-date;
- Data is collected, recorded and reported consistently for the correct time period, using the same methods across the period and across collection points.

Source: Audit Commission, Data Quality Spot Checks
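The assurance checks above lend themselves to simple automation. The sketch below is illustrative only (the function, field names and tolerance are assumptions, not part of the Framework): it re-performs a percentage indicator calculation and flags any variance from the reported figure.

```python
def check_indicator(numerator, denominator, reported_outturn, tolerance=0.05):
    """Recalculate a percentage indicator and compare it with the reported
    figure. Returns the recalculated value and whether it agrees within the
    given tolerance. All names and values are illustrative."""
    if denominator == 0:
        raise ValueError("denominator must be non-zero")
    recalculated = 100.0 * numerator / denominator
    return {
        "recalculated": recalculated,
        "agrees": abs(recalculated - reported_outturn) <= tolerance,
    }

# e.g. 450 of 500 cases resolved on time, reported as 90.0 per cent
result = check_indicator(450, 500, 90.0)
```

A check of this kind gives a quick, repeatable test that the numerator, denominator and calculation are consistent with the agreed methodology before a return is signed off.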

SCOPE OF THE FRAMEWORK

The Framework applies to data and performance information relating to the following performance indicators:

- the measures within the National Indicator Set (NIS) which the Council is responsible for producing, reporting or contributing to;
- local indicators reported corporately through our annual summary of performance or included in Delivering for our Community; and
- any other indicators we use as Vital Signs or formally report to Government, regulators, residents or other stakeholders.

The broader array of information we use to manage services is not explicitly covered by this Framework. However, many of the principles and practices set out within the Framework should equally be applied.

TOOLS FOR MANAGING DATA QUALITY

The Framework is accompanied by a set of tools and further guidance for strengthening data quality arrangements. These are:

- National indicator audit form;
- Local indicator audit form;
- Data quality risk assessment tool;
- Data quality self-audit form;
- Guide to defining local performance indicators; and
- Data quality action plan (kept under review by the Service Improvement Team).

OTHER RELEVANT POLICY AND GUIDANCE

Separate policies and procedures exist to address the following wider risks important to a rounded approach to data quality:

- Information Security: Information Security Policy.docx
- Data Protection: Data Protection Policy.docx
- Information Sharing: Information Sharing Policy.docx
- Information Systems and Technology: Information Systems and Technology Strategy.docx
- Freedom of Information: http://kcintranet/rbkcareatplt/default.aspx?catid=72d43ddb-0152-4ba2-ace7-810765215ab7
- Business Continuity: http://kcintranet/rbkcareatplt/default.aspx?catid=df49a9a7-0448-4d92-a47b-4c0d117dec21

DEVELOPMENT

The Framework draws on best practice from other authorities and the Audit Commission's Standards for better data quality. It incorporates the views of staff producing, reporting on and making use of performance indicators. The Framework was first approved in April 2008 by the Cabinet Member for Service Improvement and endorsed by the Council's Management Board. Refreshed annually by the Service Improvement Team, the current version of the Framework was signed off by the Executive Royal Borough Improvement Programme Group (ERBIPG) in March 2010. If you wish to comment on the Framework or any part of it, the Service Improvement Team would be glad to hear from you.

THE ELEVEN PRINCIPLES

Producing performance indicators, and managing the underlying data, is a complicated business. Increasingly we are being challenged to produce information more quickly and to greater standards of accuracy, both for our own purposes and for Government. This Framework establishes eleven principles to guide our approach to data quality, structured around three themes:

- AWARENESS: Everyone is aware of why data quality is important and understands what is expected of them.
- COLLECTING AND RECORDING: Our processes and systems for collecting and recording data are fit for purpose and operate according to the principle of right first time.
- EVALUATING: We evaluate our data and the systems and processes behind it. We also check the Data Quality Framework is being adhered to.

AWARENESS

Principle A1: Responsibility for data quality is clearly assigned and everyone understands their role

The vast majority of us review, work with or contribute to performance information at one level or another. This means we all share responsibility for accurate and reliable data. But it is important to be clear about what we expect of the people directly involved in producing performance information. These roles and responsibilities are explained throughout this Framework and summarised at the end of this document.

A number of roles exist specifically to focus on data quality:

- the Director of Strategy and Service Improvement is the officer data quality champion and leads on the Council's approach to data quality;
- each Business Group has at least one nominated PI Contact who champions data quality, provides support and ensures the Framework is being implemented;
- where systems are used to produce multiple indicators, each system should have a system manager who is responsible for its effective working;
- each performance indicator has a performance manager who is directly responsible for managing performance in the area in question and also for taking an overview of data quality arrangements and risks; and
- a data manager is also assigned to each indicator, who is directly responsible for collecting and compiling the data; producing, reporting and analysing the performance information; and maintaining high standards of data quality.

The Service Improvement Team maintains a schedule of national and local indicators and their respective performance and data managers. Where an individual collects or manages data, their job description should be used to clarify and reinforce their responsibilities.

THE ROLE OF A PI CONTACT

PI Contacts play a vital role in promoting and securing data quality. They also act as a point of contact for advice and guidance. For example, the PI Contact for Transport, Environment and Leisure Services set up a database of all departmental indicators. For each indicator the database records who is responsible, the detailed definition and any relevant targets. It is also used to track data quality risks, and the Contact meets regularly with responsible officers to help ensure they are resolved.

Principle A2: Staff at all levels recognise why data quality is important and it is seen as part of the day job

The Service Improvement Team is responsible for communicating the Council's commitment to data quality. Managers should ensure this commitment is reinforced by:

- using the Council's Performance Development Framework (REAL) and personal development plans to assess and develop data quality skills;
- setting performance targets for data quality work; and
- making active and timely use of performance information to manage services.

Principle A3: We make partners aware of the value we place on data quality and set high standards

Often the data which makes up our performance information is provided by one or more of the Council's partners. We expect organisations delivering services on our behalf to provide us with information so we can monitor their performance. And we increasingly work in partnership to deliver cross-cutting outcomes. This is reflected in the National Indicator Set, which requires us to report to Government on indicators that span traditional service provider boundaries.

To promote a common approach to data quality, partners on the Kensington and Chelsea Partnership (KCP) have agreed to adopt a common set of high level principles describing the characteristics of a robust approach to data quality. These are the same principles which form the basis of this Framework. If the information provider is not a member of the KCP, the possibility of getting them to sign up to the protocol should be explored.

Where personal information is being shared, either with partners or internally, the Information Governance Team must be informed to ensure the information is sufficiently protected. The Council's Information Sharing Policy sets out a risk-based decision making process for assessing proposals to share personal information. The Team will judge whether a formal information sharing agreement is needed and will assist managers to put in place the necessary arrangements for meeting relevant legislative requirements.

Where a partner or contractor is responsible for providing data which feeds into one or more of the indicators within the scope of this Framework, the relevant manager should ensure additional and more specific requirements are put in place. This is likely to involve specifying within the partnership agreement, service level agreement, contract or other formal document:

- the performance data and/or performance information the partner must provide;
- the definitions that must be complied with and any relevant guidance that must be followed;
- the format and timescales within which the data/information must be provided, and to whom;
- the quality standards required;
- the need for the data/information provider to highlight any risks associated with the quality of the data, and whether a statement of quality assurance will need to accompany data returns;
- the controls and verification processes the data/information provider will put in place to ensure the quality of data;
- the process the Council will use to confirm the quality of the data and the robustness of associated systems; and
- the implications for the data provider if the data is not provided in a timely manner and to agreed standards.

Managers are expected to review existing agreements to check they comply with the Data Quality Framework. Where they do not, they will need to take a view on whether it is necessary to immediately strengthen data quality arrangements or whether this can wait until an opportune moment, for example the next scheduled contract/agreement review. The urgency will depend on the extent to which existing arrangements are not in line with the Framework and on the importance of the data.

DATA QUALITY IN A PARTNERSHIP CONTEXT

Achieving data quality in situations where the data is provided by a number of partners can be particularly challenging. The Supporting People Team has put in place a robust process to promote high levels of data quality. Data reporting requirements are set out in the Performance Management Policy and the West London Performance Management Framework. Quarterly data is checked by the Business and Performance Manager, and variances are challenged prior to reporting to the Department for Communities and Local Government. Providers are required to carry out a data quality self-assessment and highlight any risks as part of annual contract monitoring procedures. The process for checking annual data is set out in a simple flow chart. The West London Supporting People Information Sharing Protocol includes a provision to benchmark to improve the completeness, accuracy and reliability of data.

COLLECTING AND RECORDING

Principle C1: Indicator definitions and associated guidance are readily available and well understood

This Framework is necessarily high level, since the data for each indicator needs to be collected, collated and calculated according to a specific definition and set of guidance. If the definition is not followed precisely, we risk producing performance information that is inconsistent and fails to measure what it is intended to. In the case of National Indicators, the indicator may be qualified by auditors, putting assessment ratings at risk. Performance and data managers can mitigate these risks by:

- familiarising themselves with the definition and guidance and keeping abreast of any amendments;
- making the definition and guidance easily accessible to anyone else involved in collecting and entering data, and taking the time to explain how their data feeds into the overall performance indicator; and
- translating national guidance in a way that is relevant to local circumstances.

LOCAL INDICATORS

National indicators alone cannot present us with a rounded picture of our performance. That is why we have a complementary set of local indicators which is refreshed annually. But these indicators will be of little value unless they flow from our priorities and are defined according to recognised standards. The Service Improvement Team has therefore developed guidance which can be referred to whenever local indicators are established or reviewed.

Principle C2: Systems and processes are designed according to our data needs and have proportionate quality controls

The systems and processes we have in place for collecting and recording data, and then turning that data into performance information, need to be fit for purpose. When setting up or reviewing systems, service and system managers should therefore ensure the system:

- allows data to be captured and calculated in accordance with the definition, so that data is relevant, complete and timely;
- is user friendly and takes into account the views of current users;
- has proportionate controls to reduce the risk of data being misstated, so we can be confident it is accurate and valid;
- allows for a stable and consistent data collection process, so that data is reliable;
- is able to produce an adequate audit trail; and
- is complemented by appropriate technical and user support.

Where new systems are being procured, the Business Protection Manager should be consulted to ensure synergy with the Corporate Information Systems and Technology Strategy.

A PROPORTIONATE APPROACH

A theme which cuts across all of the eleven principles outlined in this Framework is the need to adopt a proportionate approach to data quality. In order to do so we must identify, for each of our key performance indicators:

- the likelihood of the underlying data being misstated; and
- the potential impact of any errors.

We should also consider the accuracy levels required for the indicator in question. For example, there is little value in establishing elaborate and resource intensive controls if the indicator is straightforward with little scope for error. Conversely, if an indicator is complicated, draws on large volumes of data and is linked to performance reward grant, there would most likely be a need for a strong control environment and regular and detailed checks, both of the data and the system itself.

DATA QUALITY RISK ASSESSMENT

By reviewing the level of risk, we can also determine whether it needs to be recorded on the appropriate risk register. For each indicator, performance and data managers are therefore required to prepare, and keep under review, a data quality risk assessment to inform their approach.

We should aim to produce data which is right first time. This will reduce the need for costly processes such as data cleansing, reconciliations across different systems and manually recalculating performance returns. We can do so by ensuring there is a robust control environment for each system. System managers should build in controls such as:

- arrangements for logging information at the point of receipt in a timely manner;
- automated checks and safeguards, for example consistency and completeness checks, to prevent erroneous data entry; and
- keeping manual intervention to a minimum, including by, wherever possible, integrating systems, using common data sets and generating returns automatically.

STRENGTHENING OUR SYSTEMS

An essential element in our approach to data quality is that we review and seek to improve our performance data systems. In 2007 a new system, IBS, went live across Housing Needs, Housing Benefits and Council Tax services. The developers met with the teams involved before the system was implemented to understand what they required. Subsequently, staff were trained in how to use the system and at the same time reminded of why accurate and comprehensive recording was important. Some of the ways the system has helped to improve data quality are:

- allowing a common set of people and property data to be used for a range of indicators, helping to reduce manual input and eliminate double entries;
- enabling the number of systems holding performance data to be reduced and rationalised, guarding against duplicate and erroneous data;
- using built-in field validation and password access; and
- making it easier to produce reports which can be sent to managers to check data accuracy.

Principle C3: Procedure notes, guidelines and training are used to ensure staff have the skills and knowledge to correctly collect and record data

Guidelines and procedure notes are another form of control important to securing data which is right first time, particularly where the process for producing a performance indicator is complicated or there is a high risk of the data being misstated. Procedure notes should document the whole process, from collecting, recording and entering the data through to calculating the indicator, alongside any national and local requirements. It is also advisable to clarify who is involved at each stage and what their role is, since this can reduce the risk of data being entered more than once or not at all.

Sometimes, even though there may be procedure notes, quick reference guides or system manuals, it may be necessary to hold training sessions. Data quality training works best when it addresses a specific indicator or system, and it is therefore the responsibility of managers to identify and implement any training. However, time should also be set aside to explain why data quality is important and to draw attention to the Council's Data Quality Framework. The Service Improvement Team is looking at introducing corporate training to further embed the principles of the Data Quality Framework across the organisation. Corporate Learning and Development is being consulted, and the possibility of linking Data Quality with existing workshops on Data Protection, Data Sharing and Data Security is being explored.
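The automated consistency and completeness checks described under Principle C2 can be sketched in a few lines. This is an illustration only; the record structure, field names and rules below are hypothetical and not drawn from any Council system.

```python
from datetime import date

# Hypothetical required fields for a service request record
REQUIRED_FIELDS = {"case_id", "received", "resolved"}

def validate_record(record):
    """Return a list of data quality problems found in one record; an empty
    list means the record passes both checks.

    Completeness: every required field is present.
    Consistency: the resolved date does not precede the received date.
    """
    problems = sorted(f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys())
    if not problems and record["resolved"] < record["received"]:
        problems.append("resolved date precedes received date")
    return problems

ok = validate_record({"case_id": 1, "received": date(2010, 1, 4), "resolved": date(2010, 1, 8)})
bad = validate_record({"case_id": 2, "received": date(2010, 1, 8), "resolved": date(2010, 1, 4)})
```

Running checks of this kind at the point of data entry, rather than at year end, supports the "right first time" principle by rejecting erroneous records before they reach the performance system.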

IMPROVING RECORDING IN ADULT SOCIAL CARE

It is vital that frontline staff who collect and record data:

- are aware of how important data quality is;
- understand the specific rules and requirements flowing from the indicators they contribute data towards; and
- have the skills and guidance needed to meet these requirements and collect data in a way that ensures it is of a high quality.

Staff in the Business Information Team were aware that reported performance was being affected by under-recording. To address the issue, guidance notes incorporating national requirements were reviewed and updated, and frontline staff were invited to a number of performance events. At the events, staff were reminded of how important accurate and comprehensive recording was, what could be counted towards indicators and the process for recording the data. Subsequently, regular reports were circulated to highlight the impact better recording was having on the relevant indicators.

Principle C4: Data is held securely and used and shared in compliance with all legal requirements

The legal matters associated with data storage, sharing and use are not directly within the scope of this Framework. Nevertheless, those who work with data should take the time to understand the Council's obligations under the Data Protection Act and the Freedom of Information Act. The Council's Information Sharing Policy will guide you.

System security is important not only in meeting our legal obligations but also in ensuring data quality by restricting unauthorised amendments. Access to Council systems is controlled through unique user accounts and passwords. However, where systems fall outside of these access controls (for example, where spreadsheets are saved on a server), data managers should ensure appropriate access restrictions are set up.

Data and system managers are responsible for putting in place business continuity arrangements. As a minimum this will involve nominating a person who can deputise in their absence. IT systems should be reflected within ISD's applications database so they are within the remit of the IT Disaster Recovery Plan.

EVALUATING

Principle E1: Performance data are subject to proportionate verification to check accuracy, validity, relevance and completeness

Even where there are strong controls surrounding the recording and collecting of data, errors can creep in. This is particularly the case for high risk indicators, for example where data is provided by a third party and is known to be of poor quality. System and data managers must therefore put in place processes to verify that data complies with the six characteristics of good quality data (as set out on page 3).

Examples of processes for verifying data are: comparing the actual indicator outturn to the expected outturn; consistency checks across time periods and between different systems; carrying out data cleansing and checking for missing, incomplete or invalid records; sampling, which involves checking a representative sample of records in detail against manual records and for relevance and re-performing the calculation for the sample to check it is line with the overall outturn; and, checking performance indicators have been calculated correctly and using the relevant data, as specified by the guidance. Wherever possible, checks should happen close to the point of data entry. They should be aligned with reporting frequency and documented in procedure notes. If there are limitations or doubts regarding the quality of performance information, these should be highlighted in the relevant report. Councillors and managers have an important role to play in promoting data quality by querying unexplained variances in performance. Principle E2: Arrangements for producing performance data are evaluated proactively and any deficiencies reported and remedied Periodic review of the arrangements for producing a given performance indicator is necessary to maintain a robust control environment. Where the data quality risk is judged to be high, system and data managers are responsible for self-auditing both the process in place for producing the indicator from data capture through to calculating and reporting the outturn and the accuracy of the performance information itself at least annually. In some circumstances, responsible officers are likely to consider a review of arrangements associated with medium risk indicators worthwhile also. The relevant management team should be made aware of any indicators which are high risk and will advise whether the indicator needs to be added to the team s risk register. 
This is most likely to be the case where the self-audit shows there are weaknesses in the arrangements for collecting the data and producing the indicator. PI Contacts should maintain a register of risk assessment and self-audit outcomes, as well as any extant data quality concerns. Concerns that develop into risks should be escalated to the Business Group management team and the Service Improvement Team. Actions to improve data quality should be identified and reflected in service plans. Guidance for conducting a data quality risk assessment and a self-audit accompanies this Framework.

Principle E3: Performance indicator outturns are supported by clear evidence to demonstrate their accuracy and are signed off at a senior level

It is not enough for performance indicator outturns to be accurate: they need to be demonstrably accurate and reported within the required timescales. For each performance indicator there must be a clear set of supporting evidence, i.e. an audit trail. The audit trail should be such that someone with no knowledge of the indicator could understand the process for arriving at the figure and see that it is robust. It should therefore detail:

- the full definition for the indicator and the guidance used;
- the system in place to collect and record the data and calculate the figure;
- the checks in place to ensure the data is of high quality;
- where the data has been sourced from;
- any baseline information used in the calculation or to set targets, along with supporting evidence for this information;
- the source data, or a sample of source data if this is not possible, and details of any queries run to extract the data;
- how the final figure has been calculated; and
- relevant targets and historic performance outturns, with an explanation of any variances.

The audit trail should be maintained throughout the year and reviewed prior to formal reporting of the indicator, which in most cases is likely to be at year end. By completing an audit form, a self-audit where the risk to data quality is medium or high, and collating all the relevant evidence, performance and data managers can ensure they have a good audit trail. Once all the paperwork has been brought together it needs to be signed off by the performance and data manager and the Business Group PI Contact. The Executive Director should sign off a schedule of all performance indicator returns prior to external reporting. The Service Improvement Team is responsible for checking a sample set of audit trails collected by PI Contacts.
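The audit trail elements listed above could, for example, be captured in one structured record per indicator. The Python sketch below is illustrative only: the Framework does not prescribe any particular format, and all field names here are assumptions made for the example.

```python
# Illustrative sketch only: the Framework does not prescribe a data
# structure. This simply collects the Principle E3 audit trail elements
# in one place, with sign-off recorded as required before reporting.
from dataclasses import dataclass, field
from typing import List


@dataclass
class AuditTrail:
    indicator: str                  # indicator reference and name
    definition_and_guidance: str    # full definition and the guidance used
    collection_system: str          # how data is collected, recorded, calculated
    quality_checks: List[str]       # checks in place to ensure high quality data
    data_sources: List[str]         # where the data has been sourced from
    calculation: str                # how the final figure has been calculated
    outturn: float                  # reported outturn
    target: float                   # relevant target
    variance_explanation: str = ""  # explanation of any variances
    signed_off_by: List[str] = field(default_factory=list)

    def sign_off(self, officer: str) -> None:
        """Record sign-off, e.g. by the performance and data manager
        and the Business Group PI Contact."""
        self.signed_off_by.append(officer)

    def is_signed_off(self) -> bool:
        """A trail needs at least two sign-offs before external reporting."""
        return len(self.signed_off_by) >= 2
```

A trail built this way can be reviewed prior to formal reporting and checked for the required sign-offs before the indicator is reported externally.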
Principle E4: There are arrangements for evaluating whether the Data Quality Framework is being adhered to

We need to be confident our approach to data quality is shaped by the principles set out in this Framework. Each Business Group is responsible for ensuring the Data Quality Framework is adhered to within its service areas. PI Contacts can fulfil this role by:

- reviewing and signing off audit forms and self-audits;
- supporting colleagues to implement the eleven data quality principles;
- maintaining an overview of data quality risks within their area; and
- monitoring compliance with the Framework.

Internal Audit plays a role in ensuring the Framework is adhered to by:

- assessing arrangements for securing data quality, where relevant, in audits carried out under the Annual Audit Plan (the methodology used by Internal Audit and the principles set out in this Framework are closely aligned, and audits pay particular attention to the indicators that fall within the scope of this Framework);
- periodically reviewing the Council's fundamental systems and data quality arrangements in high risk areas such as the Local Area Agreement;
- carrying out a mid-year review of high risk indicators, including checking that the arrangements for preparing the indicators are in accordance with the Data Quality Framework; and
- reporting to the Audit Committee any serious risks to the Council arising from ineffective data quality arrangements.

Managers should inform the Business Group PI Contact of any potential issues identified during Internal Audit reviews so these can be added to the Business Group's register of data quality risks. The Service Improvement Team has overall responsibility for monitoring whether the Data Quality Framework is being adhered to. In doing so, the Team liaises closely with PI Contacts. It is also responsible for escalating any serious data quality concerns to the Executive Royal Borough Improvement Programme Group and reporting on progress against the Data Quality Action Plan at least annually.

ANNEX: ROLES AND RESPONSIBILITIES

The matrix below summarises the roles and responsibilities of the different people and groups involved in ensuring the Council's performance data and information are of high quality.

The Executive Royal Borough Improvement Programme Group
- Overseeing the Council's arrangements for data quality.
- Reviewing progress against the data quality action plan.
- Reviewing significant data quality related risks.

Cabinet Member for Service Improvement
- Acting as the Lead Member for data quality.

Director of Strategy and Service Improvement
- Acting as the corporate data quality champion and lead officer.
- Responsible for the Council's policy on data quality.

Service Improvement Team
- Putting in place the Council's Data Quality Framework, monitoring compliance and providing appropriate support.
- Maintaining an oversight of national requirements and guidance relating to data quality and liaising with PI Contacts.
- Co-ordinating the annual data quality self-assessment as part of the wider Use of Resources self-assessment.

Internal Audit
- Carrying out a mid-year audit of a selection of high risk performance indicators.
- Mainstreaming data quality into audit work.
- Reviewing the Council's fundamental systems and conducting audits of areas where data quality risks are high.

Information Governance Team
- Leading on corporate data protection arrangements and providing support to ensure these are adhered to.
- Working with the Service Improvement Team to ensure data protection and information sharing work is integrated with wider data quality work.

Service managers
- Ensuring data quality arrangements within their service area are robust and in accordance with the Data Quality Framework.
- Making sure serious data quality risks are highlighted at the relevant management team, cascaded into the Risk Management Framework where appropriate and ultimately rectified.
- Signing off performance returns prior to external reporting and challenging any variances.
- Ensuring data quality is reflected in performance review and development processes.

PI Contacts
- Championing data quality and disseminating and encouraging best practice.
- Keeping abreast of changes to guidance and informing performance and data managers.
- Working with the Service Improvement Team to implement the Data Quality Framework and action plan and promote a corporate approach to data quality.
- Supporting colleagues to improve data quality.
- Monitoring adherence to the Data Quality Framework.
- Signing off audit forms and self-audits.
- Maintaining a register of data quality risks within the Business Group and escalating these to management teams and the Service Improvement Team as appropriate.

Performance managers
- Using performance information to manage performance.
- Taking an overview of data quality risks and supporting the data manager to adopt an approach to data quality which is in line with the Data Quality Framework.
- Being clear about who is responsible for different elements of the data collecting, collating and validating process and ensuring these individuals have the necessary skills.
- Reflecting data quality in performance review and personal development processes.
- Signing off audit forms and self-audits.

System managers
- Ensuring the system is fit for purpose and appropriate controls, validation procedures and security measures are in place so that data is right first time.
- Ensuring the system can produce an adequate audit trail.
- Putting in place arrangements to secure business continuity.
- Assisting in the self-audit process.
- Identifying and addressing any training/support needs.

Data managers
- Putting in place robust arrangements to collect data and produce performance indicators, supported by controls to ensure data is right first time.
- Accurately calculating indicators and ensuring they are reported on time.
- Analysing and interpreting performance information.
- Ensuring guidance and procedure notes are produced, are readily available and are kept under review.
- Identifying and helping to address any training needs.
- Arranging for the periodic review of the systems used to produce performance data, including carrying out a self-audit where necessary, and taking action to remedy deficiencies.
- Completing audit forms and maintaining audit trails so information is audit compliant.

Other staff who collect and collate performance data
- Following policies and guidance and ensuring activity is captured, collated and recorded in a way that provides for good quality data.
- Assisting their managers to identify training needs.