Implementing Regular Quality Reviews at the Office for National Statistics

Ria Sanderson, Catherine Bremner
Quality Centre [1], Office for National Statistics, UK

Abstract

There is a requirement under the UK Code of Practice for Official Statistics to ensure that official statistics are produced to a level of quality that meets users' needs, and that producers of official statistics seek to achieve continuous improvement in statistical processes by undertaking regular reviews. The ONS Quality Centre has developed and rolled out a new process for assessing the quality of a statistical output. This new process is called a Regular Quality Review and replaces the use of a self-assessment tool to carry out quality reviews. The decision to develop a new process was based on feedback from managers of statistical outputs at ONS during a user engagement exercise. The Regular Quality Review has been designed to reduce the burden on statistical output managers whilst offering tailored recommendations to improve the quality of the statistical output. In this paper, the requirements for quality reviews of statistical outputs are described, the Regular Quality Review process and its implementation at ONS are presented and explained, and guidance is provided on how this approach could be tailored to other National Statistics Institutes and government departments producing official statistics.

1. Introduction

The requirement to regularly review the quality of official statistics is set out both in the UK Code of Practice for Official Statistics (UK Statistics Authority, 2009) and the European Statistics Code of Practice (Eurostat, 2011). In 2013, ONS rolled out the Quality, Methods and Harmonisation Tool (QMHT) to carry out quality reviews of all statistical outputs on an annual basis. QMHT is a self-assessment tool that asks questions about the output across all stages of the Generic Statistical Business Process Model (UNECE, 2013), which describes the processes involved in producing official statistics.

[1] Contact: ons.quality.centre@ons.gov.uk

Quality Centre originally built QMHT with the aim of meeting the requirement under the UK Code of Practice for regular review of statistical outputs. It was also designed to collect information on respondent burden. Methodological expertise was employed to develop a set of recommendations for each stage of the Generic Statistical Business Process Model (GSBPM) that were built into QMHT, so that QMHT generated a series of standard recommendations once the fields were completed. QMHT was tested with other government departments before being made available to the wider Government Statistical Service (GSS). QMHT remains available for government departments to use as part of a set of quality resources [2] on the GSS website, which includes examples of how QMHT has been adapted to better suit the needs of individual government departments.

[2] https://gss.civilservice.gov.uk/statistics/methodology-quality/quality-2/quality-resources/

At ONS, the responsibility for completing QMHT and responding to the recommendations sat with the managers of the individual statistical outputs. Once QMHT had been in use for a year, Quality Centre held focus groups with statistical output managers to establish whether it was meeting their needs. During these focus groups, statistical output managers highlighted that QMHT took a very long time to complete and generated a large number of recommendations that were not always relevant to the statistical output. The recommendations generated generally fell short of providing sound methodological advice that could be implemented, so the opportunity to improve a statistical output was limited. Outputs that are not based on surveys faced additional difficulties: a number of questions were not applicable, which prevented the right recommendations from being made. The overwhelming feeling from the focus groups was that managers of statistical outputs would like the opportunity to discuss their output with a methodologist.

Responding to this feedback, Quality Centre developed a new approach to reviewing statistical outputs at ONS, termed the Regular Quality Review. The approach was piloted in the summer of 2014 and launched across ONS in November 2014. The Regular Quality Review (RQR) has been built around the principle of giving an opportunity for discussion with a methodological expert.

2. The Regular Quality Review (RQR)

An RQR consists of a facilitated meeting, lasting between 1.5 and 2 hours, between the manager of a statistical output, a senior methodologist and a representative from Quality Centre. The meeting makes use of existing documentation, which forms the basis for a discussion of methods and quality, led by the senior methodologist. This review of methods covers all relevant stages of the Generic Statistical Business Process Model and all five of the European Statistical System (ESS) dimensions of output quality [3] (Eurostat, 2014). The collation of the documentation for the meeting, the correspondence and the scheduling of the meetings are all handled by Quality Centre.

[3] Relevance; accuracy and reliability; timeliness and punctuality; comparability and coherence; accessibility and clarity.

Each RQR is different: it is tailored by the senior methodologist to the individual statistical output. The methodologist asks questions of the output manager during the meeting and develops a set of recommendations for that statistical output. Because these recommendations are bespoke, they are relevant to the statistical output, which addresses one of the main concerns raised regarding the use of QMHT. The recommendations then feed into business plans or Survey Action Plans, which are used by some surveys to monitor risks and issues. Each recommendation is assigned a priority status of low, medium or high. The outcome of the review meeting is a report containing the recommendations made by the senior methodologist and a Red-Amber-Green (RAG) status that reflects the findings of the meeting. The recommendations are owned by the statistical output area; Quality Centre monitors them and provides support to ensure that they are addressed.
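To make the shape of a review outcome concrete, the sketch below models the report described above (bespoke recommendations carrying a low, medium or high priority, plus an overall Red-Amber-Green status for the output) as plain Python records. This is a minimal sketch only: the class and field names, including the free-text theme tag used in a later example, are illustrative assumptions rather than part of the ONS process.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class Priority(Enum):
    """Priority assigned to each recommendation by the senior methodologist."""
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


class RagStatus(Enum):
    """Overall status reflecting the findings of the review meeting."""
    RED = "red"
    AMBER = "amber"
    GREEN = "green"


@dataclass
class Recommendation:
    description: str
    priority: Priority
    theme: str = ""        # hypothetical free-text tag, e.g. "seasonal adjustment"
    addressed: bool = False


@dataclass
class RegularQualityReview:
    output_name: str       # the statistical output area owns the recommendations
    review_date: date
    rag_status: RagStatus
    recommendations: list[Recommendation] = field(default_factory=list)

    def open_recommendations(self) -> list[Recommendation]:
        """Outstanding recommendations, highest priority first: the view
        Quality Centre needs when it returns to check progress."""
        order = {Priority.HIGH: 0, Priority.MEDIUM: 1, Priority.LOW: 2}
        return sorted(
            (r for r in self.recommendations if not r.addressed),
            key=lambda r: order[r.priority],
        )
```

A register of records like these would let Quality Centre monitor progress between reviews without placing any further burden on output managers.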

One of the most important pieces of documentation used to carry out the RQR is the Quality and Methodology Information (QMI) report [4] written for each statistical output produced by ONS. The QMIs are designed to tell users of statistics what they need to know about using the data, including strengths, limitations, methodology and an assessment against each of the five ESS quality dimensions (Eurostat, 2014). However, their usefulness for carrying out reviews of the quality of statistics was recognised at an early stage, and these documents form the basis of the RQR meeting.

[4] http://www.ons.gov.uk/ons/guide-method/method-quality/quality/quality-information/index.html

Another useful source of documentation for the meeting is the Value Engineering [5] (Sharp, 2014) exercise that takes place annually at ONS, which highlights where there are concerns with existing methods, processes or systems. Value Engineering is a self-assessment, carried out by the statistical output manager, of the risk associated with different dimensions of a statistical output, including quality considerations but also taking into account staff capability and user feedback. It allows a view of the relative strengths and weaknesses across all statistical outputs.

[5] Value Engineering assesses the level of risk associated with ten different dimensions of statistical outputs, including systems, processes and methods.

The senior methodologist reviews the QMI and Value Engineering alongside any other relevant supporting documentation prior to the meeting. Because the meeting is based on existing documentation, the burden placed on the statistical output manager is minimised: they are only required to attend the meeting and share their knowledge.

Once the review process was designed, it was piloted on three statistical outputs during the summer of 2014. This was a valuable test of whether the review process would address the concerns raised by statistical output managers, and it also gave a useful overview of what could realistically be covered during the meeting. The lessons from these pilots were used to further improve the process, and Regular Quality Reviews were formally launched at ONS in November 2014.

By May 2015, thirteen reviews had been completed. There was a recognised need for a slow start, to ensure that the process was working as anticipated and to allow some extra time and flexibility to smooth out any problems. Establishing buy-in at an early stage from the senior methodologists who carry out the reviews was crucial in ensuring that they could commit to carrying them out. Their feedback has highlighted that there is value in this method of reviewing statistical outputs.

3. Building in process quality

ONS has an existing quality initiative designed to ensure that effective quality assurance procedures are in place, termed the quality assurance (QA) or Divisional Director (DD) walkthrough. The initiative requires the senior manager (the DD) responsible for a statistical output to walk through the quality assurance procedures that are in place for that output. The DD then assigns a Red-Amber-Green status to the quality assurance activities, with improvement actions put in place wherever quality assurance is found not to be sufficient. The DD walkthrough was designed to take place once every three years, or whenever key staff for a particular statistical output moved.

Feedback from statistical output areas indicated that it was difficult to understand the quality initiatives in place and the requirements that output areas needed to meet. With this in mind, Quality Centre incorporated the DD walkthrough into the RQR. This means that there are two components to the RQR: the methods review carried out by the senior methodologist, and the DD walkthrough, which is a review of process quality. The documentation used to carry out the DD walkthrough consists of desk instructions and information explaining the quality assurance checks that are in place for the statistical output. Although including the senior manager in the meeting can make scheduling difficult, carrying out the DD walkthrough in this way adds value to the process. Combining the activities into a single meeting minimises the burden placed on statistical output areas whilst providing the senior manager with the assurance that both methods and process quality have been reviewed. The inclusion of the DD walkthrough has formed an important part of how RQRs are described and communicated at ONS. Figure 1 shows a diagram developed by Quality Centre to show how the Regular Quality Review considers both methods quality and process quality, and how it builds on existing information including the QMI, Value Engineering (Sharp, 2014) and quality assurance checks.

Figure 1. An overview of the Regular Quality Review

4. Progress to date

The programme of RQRs was launched at ONS in November 2014. ONS produces in excess of 160 statistical outputs; Quality Centre has established a three year rolling programme of reviews, such that approximately 50 reviews are carried out each year (some statistical outputs are very similar and have therefore been grouped together). Thirteen RQRs have been conducted to date [6]. Statistical output areas are extremely positive about the new process and have welcomed the bespoke recommendations made by the senior methodologist; the word used most often to describe the process is "sensible". The process works for both survey and non-survey based outputs. Quality Centre encourages all statistical outputs, including experimental statistics, to attend an RQR meeting.

[6] As at May 2015.
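As a rough check of the cadence these figures imply, the short calculation below spreads the reviews over the three year cycle. It is an illustrative back-of-envelope sketch, not an ONS tool, and the number of working weeks per year is an assumption.

```python
OUTPUTS = 160         # statistical outputs produced by ONS (grouping reduces this)
CYCLE_YEARS = 3       # each output is reviewed once every three years
WORKING_WEEKS = 46    # assumed weeks per year available for review meetings

reviews_per_year = OUTPUTS / CYCLE_YEARS               # ~53, close to the ~50 quoted
reviews_per_week = reviews_per_year / WORKING_WEEKS    # ~1.2, i.e. roughly one a week

print(f"{reviews_per_year:.0f} reviews per year, {reviews_per_week:.1f} per week")
```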

5. Challenges

In rolling out Regular Quality Reviews, Quality Centre has faced three main challenges:

- scheduling the reviews
- establishing buy-in and communication
- working out a mechanism for addressing recommendations

5.1. Scheduling

The scheduling of RQRs has presented one of the biggest challenges to the roll-out of the initiative. Quality Centre has established a three year rolling programme of RQRs to ensure that each of the approximately 160 statistical outputs produced by ONS is reviewed once in a three year period. This means that roughly one review a week is carried out; in practice, due to staff availability and the need to set time aside to review the process itself, this can vary. Quality Centre has established the timetable for the first year of review activity. This scheduling took a number of (often competing) factors into consideration: the level of risk of the output as shown through Value Engineering, the availability of staff, the spread of reviews across sites and across areas within ONS, publication schedules for statistical outputs, and whether an output feeds into another statistical output.

5.2. Establishing buy-in and communication

Establishing buy-in at an early stage from the senior methodologists who carry out the reviews was crucial. Quality Centre engaged with them early and continues to seek feedback on how the process can be simplified and improved. Given the negative feeling around the use of QMHT, the communication of the initiative to statistical output areas in ONS was very important in ensuring that the new RQR process was accepted. Communications were targeted at both senior managers and statistical output managers, and the benefits of the new approach were clearly explained. Statistical output managers were asked for their feedback on the proposed timetable, and seminars were given directly to statistical output managers to explain the changes. In addition, a number of presentations were made at divisional meetings, which provided the opportunity to present to smaller groups and to answer questions raised by statistical output teams.

5.3. Mechanism for addressing recommendations

The one unknown throughout the development of the RQR process was what type of recommendations would be made for each statistical output. Given this unknown, it was difficult to plan in advance how the recommendations would be addressed. One of the key features of the RQR is that it provides bespoke recommendations. Another important benefit of a three yearly review cycle is that there is time between reviews for recommendations to be addressed; this was missing from the annual cycle of QMHT. Quality Centre has a role in monitoring recommendations following RQRs. The recommendations themselves are owned by the relevant statistical output area; however, Quality Centre will return periodically to check progress. Where recommendations require methodological support and advice, Quality Centre will help to facilitate this and to get the required work onto the relevant business plans. In addition, Quality Centre's role is to ensure that recommendations are prioritised appropriately and that the description of recommendations as low, medium or high priority is applied consistently. There is also a role in identifying recommendations that cut across different statistical outputs; these cross-cutting recommendations may need to be addressed from a corporate perspective, on which Quality Centre is well placed to advise.
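Spotting cross-cutting recommendations is essentially a grouping exercise. Below is a minimal sketch, reusing the record types from the earlier example and assuming each recommendation carries a hypothetical theme tag; the paper does not say how recommendations are categorised in practice.

```python
from collections import defaultdict


def cross_cutting(reviews: list[RegularQualityReview],
                  min_outputs: int = 2) -> dict[str, set[str]]:
    """Map each recommendation theme to the outputs it was raised for; a theme
    affecting several outputs may need a corporate-level response."""
    by_theme: dict[str, set[str]] = defaultdict(set)
    for review in reviews:
        for rec in review.recommendations:
            if rec.theme:  # ignore untagged recommendations
                by_theme[rec.theme].add(review.output_name)
    return {theme: outputs for theme, outputs in by_theme.items()
            if len(outputs) >= min_outputs}
```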

6. The link between Regular Quality Reviews (RQRs) and other quality initiatives

Quality Centre is responsible for managing a series of complementary quality initiatives that support quality assurance, quality reporting and quality improvement. The Regular Quality Review forms an important part of this portfolio, as it provides a mechanism to conduct a light-touch methodological review for each statistical output. RQRs complement the more in-depth National Statistics Quality Reviews [7], which offer a much more detailed insight into the methodology in place for a statistical output. National Statistics Quality Reviews (NSQRs) involve external experts and are carried out for up to two statistical outputs per year. The programme of NSQRs was relaunched in 2012; two reviews have been published to date, on the Labour Force Survey (ONS, 2014) and National Accounts (Barker & Ridgeway, 2014), and two more NSQRs are currently underway.

[7] http://www.ons.gov.uk/ons/guide-method/method-quality/quality/quality-reviews/index.html

Linking the existing quality assurance walkthroughs into the Regular Quality Review process has helped to further streamline the quality framework in place at ONS. Making an explicit link between the Quality and Methodology Information report and the RQR also further supports a set of linked quality initiatives. This streamlining of the existing quality initiatives forms an important foundation for the renewal of the ONS Quality Management Strategy, which is currently underway.

7. Relevance for National Statistics Institutes and other government departments

Over the last six months Quality Centre has presented RQRs at a number of cascades and seminars across the ONS and GSS. These presentations have shown that the Regular Quality Review is a flexible model that is easily portable to other National Statistics Institutes and government departments responsible for producing official statistics. It offers a relatively light-touch approach to reviewing a statistical output, so it is not appropriate for in-depth methodological review but can be used to provide a health check. Where senior methodologists are not available, a senior statistician could carry out the review, or support could be sought from the Methodology Advisory Service [8]; the reviewer could be someone within the same department or from a different department. Fundamentally, this is a meeting, but some guidance has been developed on how such a meeting may be carried out, based on the experiences at ONS.

[8] https://gss.civilservice.gov.uk/courses-and-events/statistical-training-unit/methodology-advisory-service/

7.1. Guidance on how to carry out a Regular Quality Review

1. Identify participants and arrange for the meeting to take place. The meeting should involve the manager responsible for the statistical output and a senior methodologist or statistician who will lead the review. Although it can be useful to have someone facilitate the meeting, this is not a requirement. Where departments are small, the senior methodologist or statistician could come from another department or a different part of the organisation.

2. Collate background documentation. The information that forms the basis of the meeting should be compiled and circulated to all those involved ahead of time. This should include any methodological articles or quality reviews.

3. The senior methodologist studies the background information and identifies questions. The senior methodologist or statistician carrying out the review needs time to study the background material and to develop a set of questions in readiness for the review. Up to one day is set aside for this stage.

4. The senior methodologist or statistician leads the review. The meeting takes place and the reviewer works through the questions that have been prepared; some responses may need to be provided after the meeting. The success of the review depends on all parties being open and transparent about the practices and methods that are in place. The meeting should be an open forum, not an interview or a test: there are no right or wrong answers, and the aim is simply to get the best value out of the recommendations that are made.

5. Recommendations are made within a report and a RAG status is assigned. The senior methodologist or statistician writes their recommendations in a report, which is sent back to the output manager. The output manager has a right to reply, to ensure that the information recorded is accurate. The recommendations can be prioritised; this has proved a useful approach where many outputs are reviewed, both to monitor cross-cutting recommendations and to ensure consistency when multiple reviewers are involved. A RAG status is applied to the statistical output, reflecting the discussions and the recommendations made.

There then needs to be some facility to implement, and to follow up on the implementation of, the recommendations that are made. It has also been beneficial to carry out regular lessons-learned sessions to make sure that the process is working as intended. As RQRs become more embedded, it is expected that these will be required less frequently.
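Since the five steps above are strictly sequential, a review can be tracked with nothing more than an ordered checklist. The sketch below is one possible way to enforce that ordering; the step names paraphrase the guidance, and nothing here is prescribed by ONS.

```python
RQR_STEPS = (
    "identify participants and arrange the meeting",
    "collate and circulate background documentation",
    "reviewer studies the material and prepares questions",
    "reviewer leads the review meeting",
    "write report, prioritise recommendations, assign RAG status",
)


class ReviewTracker:
    """Tracks a single Regular Quality Review through the five steps, in order."""

    def __init__(self, output_name: str):
        self.output_name = output_name
        self.completed = 0  # number of steps finished so far

    def complete_next_step(self) -> str:
        """Mark the next step as done and return its description."""
        if self.completed >= len(RQR_STEPS):
            raise ValueError("all steps complete; only follow-up remains")
        step = RQR_STEPS[self.completed]
        self.completed += 1
        return step

    @property
    def done(self) -> bool:
        return self.completed == len(RQR_STEPS)
```

The follow-up on recommendations deliberately sits outside the tracker, mirroring the paper's point that implementation is owned by the output area and monitored separately.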

8. Conclusions

The new Regular Quality Review (RQR) process has been developed at ONS in response to the requirements of the UK Code of Practice for Official Statistics. Principle 4 of the Code requires that official statistics are produced to a level of quality that meets users' needs and that producers of official statistics seek to achieve continuous improvement in statistical processes. The Quality, Methods and Harmonisation Tool (QMHT) previously used for conducting reviews at ONS was not meeting the needs of statistical output managers. The RQR process was developed in response to feedback received from statistical output managers during a series of focus groups. The suggestion of a face-to-face meeting with a methodological expert was made at one of the focus groups, and the RQR has been developed around this principle.

The RQR process was rolled out at ONS in November 2014. It is working well and has been accepted by statistical output managers as more straightforward and less burdensome than QMHT. The RQR is a facilitated meeting in which a senior methodologist asks the statistical output manager questions about the methods in place for the output at each stage of the Generic Statistical Business Process Model (UNECE, 2013) and about the quality of the output against the European Statistical System dimensions of quality (Eurostat, 2014). The meeting is based on existing documentation, which is collated by ONS Quality Centre for the senior methodologist to consider beforehand; no preparation is required on the part of the statistical output manager. The outcome of the meeting is a set of bespoke recommendations for the statistical output, which are monitored by Quality Centre.

A consideration of process quality has been incorporated into the review by bringing the existing quality assurance (or Divisional Director (DD)) walkthrough, carried out by the senior manager responsible for the statistical output, into the meeting. This further minimises the burden placed on statistical output managers whilst providing assurance that both methods quality and process quality have been reviewed.

The roll-out represents the start of a three year programme of Regular Quality Reviews. Quality Centre is continuing to communicate the benefits of RQRs and to refine and improve the process through feedback from statistical output areas and the methodologists involved. The RQR offers a light-touch methodological review and complements the existing quality initiatives in place at ONS. It is a flexible model that can be used by official statistics producers; guidance has been developed for carrying out this approach in other statistical institutes and in other government departments.

9. Acknowledgements

The authors would like to acknowledge the important and substantial contribution of Victoria Chenery to the design, piloting and development of Regular Quality Reviews.

References

Barker, K. and Ridgeway, A., 2014, National Accounts and Balance of Payments, National Statistics Quality Review Series 2 (2), available from: http://www.ons.gov.uk/ons/guide-method/method-quality/quality/quality-reviews/list-of-current-national-statistics-quality-reviews/nsqr-series--2--report-no--2--review-of-national-accounts-and-balance-of-payments.pdf

Eurostat, 2011, European Statistics Code of Practice, available from: http://ec.europa.eu/eurostat/web/quality/european-statistics-code-of-practice

Eurostat, 2014, ESS Handbook for Quality Reports, available from: http://ec.europa.eu/eurostat/web/products-manuals-and-guidelines/-/ks-gq-15-003

ONS, 2014, Labour Force Survey, National Statistics Quality Review Series 2 (1), available from: http://www.ons.gov.uk/ons/guide-method/method-quality/quality/quality-reviews/list-of-current-national-statistics-quality-reviews/nsqr-series--2--report-no--1/report---review-of-the-labour-force-survey.pdf

Sharp, G., 2014, The application of Value Engineering tools to risk assess the outputs of an NSI, paper presented at the Q2014 conference, Vienna, available from: http://www.q2014.at/fileadmin/user_upload/Vienna_Q_Conference_-_VE_paper_v.1.1.docx

UK Statistics Authority, 2009, Code of Practice for Official Statistics, available from: http://www.statisticsauthority.gov.uk/assessment/code-of-practice/code-of-practice-for-official-statistics.pdf

UNECE, 2013, Generic Statistical Business Process Model, version 5.0, available from: http://www1.unece.org/stat/platform/display/GSBPM/GSBPM+v5.0