UK Standards for Microbiology Investigations




UK Standards for Microbiology Investigations

Quality assurance in the diagnostic virology and serology laboratory

Issued by the Standards Unit, Microbiology Services, PHE

Quality Guidance Q 2 | Issue no: 7 | Issue date: 07.12.15 | Page: 1 of 29

Crown copyright 2015

Acknowledgments

UK Standards for Microbiology Investigations (SMIs) are developed under the auspices of Public Health England (PHE) working in partnership with the National Health Service (NHS), Public Health Wales and with the professional organisations whose logos are displayed below and listed on the website https://www.gov.uk/uk-standards-for-microbiology-investigations-smi-quality-and-consistency-in-clinical-laboratories. SMIs are developed, reviewed and revised by various working groups which are overseen by a steering committee (see https://www.gov.uk/government/groups/standards-for-microbiology-investigations-steering-committee).

The contributions of many individuals in clinical, specialist and reference laboratories who have provided information and comments during the development of this document are acknowledged. We are grateful to the medical editors for editing the medical content.

For further information please contact us at:

Standards Unit
Microbiology Services
Public Health England
61 Colindale Avenue
London NW9 5EQ
E-mail: standards@phe.gov.uk
Website: https://www.gov.uk/uk-standards-for-microbiology-investigations-smi-quality-and-consistency-in-clinical-laboratories

PHE publications gateway number: 2015010

UK Standards for Microbiology Investigations are produced in association with the organisations listed above. Logos correct at time of publishing.

Contents

ACKNOWLEDGMENTS ... 2
AMENDMENT TABLE ... 4
UK SMI: SCOPE AND PURPOSE ... 5
SCOPE OF DOCUMENT ... 7
INTRODUCTION ... 7
1 EXTERNAL QUALITY ASSURANCE ... 10
2 INTERNAL QUALITY ASSURANCE ... 10
3 INTERNAL QUALITY CONTROL ... 12
4 EQUIPMENT MONITORING ... 15
5 AUDIT ... 16
APPENDIX 1: DOCUMENTATION USED IN IQA ... 18
APPENDIX 2: DOCUMENTATION USED IN IQC ... 21
APPENDIX 3: EQUIPMENT CONTROL SHEETS ... 24
APPENDIX 4: DOCUMENTATION AND DATA HANDLING ASSOCIATED WITH AUDIT OF SPECIMEN TURNAROUND TIME ... 25
APPENDIX 5: STATISTICS USED WITH IQC ... 27
REFERENCES ... 29

Amendment table

Each SMI method has an individual record of amendments. The current amendments are listed on this page. The amendment history is available from standards@phe.gov.uk. New or revised documents should be controlled within the laboratory in accordance with the local quality management system.

Amendment no/date: 9/07.12.15
Issue no. discarded: 6.2
Insert issue no.: 7

Section(s) involved: Whole document.
Amendment: Hyperlinks updated to gov.uk. Information on uncertainty of measurement added to strengthen the document. Minor textual changes throughout; scientific content remains unchanged.

Section(s) involved: Page 2.
Amendment: Updated logos added.

UK SMI#: scope and purpose

Users of SMIs

Primarily, SMIs are intended as a general resource for practising professionals operating in the field of laboratory medicine and infection specialties in the UK. SMIs also provide clinicians with information about the available test repertoire and the standard of laboratory services they should expect for the investigation of infection in their patients, as well as providing information that aids the electronic ordering of appropriate tests. The documents also inform commissioners of healthcare services about the appropriateness and standard of microbiology investigations they should be seeking as part of the clinical and public health care package for their population.

Background to SMIs

SMIs comprise a collection of recommended algorithms and procedures covering all stages of the investigative process in microbiology, from the pre-analytical (clinical syndrome) stage to the analytical (laboratory testing) and post-analytical (result interpretation and reporting) stages. Syndromic algorithms are supported by more detailed documents containing advice on the investigation of specific diseases and infections. Guidance notes cover the clinical background, differential diagnosis and appropriate investigation of particular clinical conditions. Quality guidance notes describe laboratory processes which underpin quality, for example assay validation. Standardisation of the diagnostic process through the application of SMIs helps to assure the equivalence of investigation strategies in different laboratories across the UK and is essential for public health surveillance, research and development activities.

Equal partnership working

SMIs are developed in equal partnership with PHE, the NHS, the Royal College of Pathologists and professional societies. The list of participating societies may be found at https://www.gov.uk/uk-standards-for-microbiology-investigations-smi-quality-and-consistency-in-clinical-laboratories. Inclusion of a logo in an SMI indicates participation of the society in equal partnership and support for the objectives and process of preparing SMIs. Nominees of professional societies are members of the Steering Committee and working groups which develop SMIs. The views of nominees cannot be rigorously representative of the members of their nominating organisations, nor of the corporate views of their organisations. Nominees act as a conduit for two-way reporting and dialogue. Representative views are sought through the consultation process; SMIs are developed, reviewed and updated through a wide consultation process.

Quality assurance

NICE has accredited the process used by the SMI working groups to produce SMIs. The accreditation is applicable to all guidance produced since October 2009. The process for the development of SMIs is certified to ISO 9001:2008. SMIs represent a good standard of practice to which all clinical and public health microbiology# laboratories in the UK are expected to work. SMIs are NICE accredited but represent neither minimum standards of practice nor the highest level of complex laboratory investigation possible. In using SMIs, laboratories should take account of local requirements and undertake additional investigations where appropriate. SMIs help laboratories to meet accreditation requirements by promoting high quality practices which are auditable. SMIs also provide a reference point for method development. The performance of SMIs depends on competent staff and appropriate quality reagents and equipment. Laboratories should ensure that all commercial and in-house tests have been validated and shown to be fit for purpose. Laboratories should participate in external quality assessment schemes and undertake relevant internal quality control procedures.

Patient and public involvement

The SMI working groups are committed to patient and public involvement in the development of SMIs. By involving the public, health professionals, scientists and voluntary organisations, the resulting SMI will be robust and meet the needs of the user. An opportunity is given to members of the public to contribute to consultations through our open access website.

Information governance and equality

PHE is a Caldicott compliant organisation. It seeks to take every possible precaution to prevent unauthorised disclosure of patient details and to ensure that patient-related records are kept under secure conditions. The development of SMIs is subject to PHE Equality objectives (https://www.gov.uk/government/organisations/public-health-england/about/equality-and-diversity). The SMI working groups are committed to achieving the equality objectives by effective consultation with members of the public, partners, stakeholders and specialist interest groups.

Legal statement

While every care has been taken in the preparation of SMIs, PHE and any supporting organisation shall, to the greatest extent possible under any applicable law, exclude liability for all losses, costs, claims, damages or expenses arising out of or connected with the use of an SMI or any information contained therein. If alterations are made to an SMI, it must be made clear where and by whom such changes have been made. The evidence base and microbial taxonomy for the SMI is as complete as possible at the time of issue. Any omissions and new material will be considered at the next review. These standards can only be superseded by revisions of the standard, legislative action, or by NICE accredited guidance. SMIs are Crown copyright, which should be acknowledged where appropriate.

Suggested citation for this document

Public Health England. (2015). Quality assurance in the diagnostic virology and serology laboratory. UK Standards for Microbiology Investigations. Q 2 Issue 7. https://www.gov.uk/uk-standards-for-microbiology-investigations-smi-quality-and-consistency-in-clinical-laboratories

# Microbiology is used as a generic term to include the two GMC-recognised specialties of Medical Microbiology (which includes Bacteriology, Mycology and Parasitology) and Medical Virology.

Scope of document

This SMI describes aspects of quality assurance in the diagnostic virology and serology laboratory. It covers laboratory aspects of quality assurance, specifically the analytical phase. However, laboratories are urged to pay equal attention to ensuring that the pre- and post-examination phases are reviewed and controlled to minimise the risk of errors occurring.

Quality must start with the selection of the appropriate analytical method and equipment. Practice should, where possible, be supported by World Health Organization (WHO)/National Institute for Biological Standards and Control (NIBSC) reference methods and materials. However, where there are no recommended WHO/NIBSC reference methods, materials or standards for a process, laboratories must carry out their own validation. Standardisation of methods, and thence of results, is key to the use of pathology data between laboratories and for wider purposes such as surveillance.

Quality assurance is the term used to describe the procedures used to monitor the performance of all aspects of work in the laboratory. These include external and internal quality assessment, internal quality assurance, monitoring of equipment and auditing of processes. Quality assurance aims to ensure that tests requested on clinical specimens are processed, analysed and interpreted in agreement with defined professional standards and in accordance with the patient's needs for diagnosis, treatment and management.

Note: the scope of this document is limited to use in the diagnostic virology and serology laboratory. See the Pathology quality assurance review report for information on how to harmonise and embed quality assurance in the different disciplines in Pathology. This SMI should be used in conjunction with other SMIs.

Introduction

ISO 9000:2005 defines quality as the degree to which a set of inherent characteristics fulfils requirements. In pathology, a quality product or service can be defined as the right result on the right specimen from the right patient that is accurate, timely and properly interpreted. The objective of a diagnostic laboratory should be to produce cost-effective, accurate, reproducible and timely results which are comparable with the results obtained in a similar laboratory elsewhere, and which are promptly, effectively and appropriately communicated to the users of the service. In this way the quality of the product or service can be guaranteed.

Laboratories achieve a quality service through quality assurance (QA), which comprises all the different measures taken to ensure the reliability of investigations. Relevant measurable variables include the quality of training and education of staff, the quality of reagents, apparatus and specimens, and the suitability of the techniques in use. Quality assurance therefore relates to the entire process of diagnosis, which starts and ends with the patient. If an error has occurred at the pre- or post-examination phase which might have resulted in the wrong patient being identified, the wrong specimen being taken, the specimen's storage conditions during transit to the laboratory being incorrect, a data entry error occurring at specimen reception, an incorrect interpretation of the results or the result having been sent to the wrong address, then the accuracy of the test itself becomes irrelevant.

Quality assurance in the examination phase is the collective term for several distinct procedures used to monitor the performance of all aspects of work in the laboratory. A quality assurance scheme should be used to identify procedural and technical problems, check the adequacy of current techniques and calculate the frequency of errors. A comprehensive quality assurance programme should be an integral part of the procedures of a diagnostic microbiology laboratory and is necessary for compliance with accreditation standards assessed by the United Kingdom Accreditation Service (UKAS).

Quality assurance procedures (figure 1) include:

quality assessment
- External quality assessment (EQA)1
- Internal quality assessment (IQA)2
quality control
- Internal quality control (IQC)3
equipment monitoring
audit

The use of EQA, IQC and equipment monitoring procedures should be mandatory, and participation in IQA and audit schemes is regarded as best practice according to ISO standards4. UKAS encourages laboratories to define their own acceptance criteria for EQA, IQA and IQC. Participation in EQA schemes is voluntary, but is a requirement for UKAS accreditation5. However, where no EQA is available, section 5.6.3.2 of ISO 15189:2012 recommends testing appropriate alternative materials such as samples previously examined, samples exchanged with other laboratories, certified reference materials, etc6.

Figure 1: Overview of quality assurance. [Diagram: quality assurance in diagnostic virology and serology comprises quality assessment (internal: IQAS; external: NEQAS, EQAS ad hoc), quality control (internal: kit controls, internal QC, performed in the individual laboratory/in-house), equipment monitoring (routine maintenance and cleaning, in-house monitoring, servicing, calibration procedures) and audit (in-house and UKAS).]

The use of these procedures is designed to increase confidence in the handling and testing of a specimen, in the validity of the assay results and in the final report. However, a laboratory-based quality assurance scheme may only monitor the procedures over which the laboratory has control. Joint audits should be conducted to monitor all processes associated with the diagnosis of infections.

Quality assurance should be an integrated system in which results obtained from one of its constituent parts are confirmed by another, eg:

- the coefficient of variation of an assay determined in the QC scheme can be used to determine whether there is a significant difference in the results obtained in the IQA scheme
- reciprocally, IQA is a useful procedure when the quality of control material is in doubt
- reduced assay sensitivity, indicated by violations of the Westgard QC warning rules (4 1SD or 10 x rules), may be investigated through equipment monitoring, eg a reduced incubator temperature would indicate equipment failure (whereas correct temperatures may suggest reagent deterioration)

Assay kits and equipment should undergo thorough evaluation before being introduced for routine use in the laboratory. The use of EQA, IQA or QC procedures will not improve the innate performance of assays or equipment.

Quality assurance can only be undertaken effectively if the data obtained in the scheme, comments on that data and the remedial actions taken are recorded. This should be the designated responsibility of a senior and experienced member of staff.
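The first bullet above uses the coefficient of variation (CV) of an assay as a yardstick for judging other results. As an illustration only (the function and data below are hypothetical, not taken from the SMI), the CV of a series of control measurements can be computed like this:

```python
# Illustrative sketch (not from the SMI): summarising repeated control
# results as mean, standard deviation and coefficient of variation (CV),
# the statistics referred to throughout this document.
import statistics

def summarise_control(results):
    """Return (mean, sample SD, CV%) for a series of control measurements."""
    mean = statistics.mean(results)
    sd = statistics.stdev(results)   # sample standard deviation (n - 1)
    cv = 100.0 * sd / mean           # CV expressed as a percentage of the mean
    return mean, sd, cv

# Hypothetical optical-density values for one control tested in 20 assay runs
od_values = [0.52, 0.55, 0.50, 0.53, 0.54, 0.51, 0.49, 0.56, 0.52, 0.53,
             0.50, 0.55, 0.54, 0.51, 0.52, 0.53, 0.50, 0.54, 0.52, 0.53]
mean, sd, cv = summarise_control(od_values)
print(f"mean={mean:.3f}  SD={sd:.3f}  CV={cv:.1f}%")
```

A low CV indicates a reproducible assay; a CV that grows over time may point to operator, reagent or equipment variation, as discussed in section 3.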

1 External quality assurance

External quality assessment (EQA) schemes facilitate comparison of laboratories and detection systems, followed by comprehensive discussion of results and discrepancies. Clinical specimens and artificially spiked samples are distributed from external sources or reference laboratories to assess a broad range of techniques and assays performed in microbiology.

EQA distributions are normally limited in number and clearly identifiable, which could allow laboratories to process them in a special manner. However, ISO 15189:2012, section 5.6.3.3 specifies that inter-laboratory comparison samples shall be examined by personnel who routinely examine patient samples, using the same procedures as those used for patient samples6. UKAS assessment ensures that this standard is met during inspection audits. Laboratories should not communicate with other participating laboratories, or refer their EQA samples for confirmatory examination elsewhere, until after the closing date for submission of results.

Analysis and comparison of results will be performed externally. The findings should be widely disseminated within the laboratory in order to encourage staff and/or to allow full investigation of any problems identified. Laboratory managers should monitor the results of EQA and ensure the implementation of corrective actions when required.

EQA schemes are available for the majority of assays and organisms routinely tested. Table 1 gives examples of some EQA providers. Many other EQA providers can be identified via the UKAS website, http://www.ukas.com/services/accreditation-services/clinical-pathology-accreditation/cpa-accredited-eqa-providers/. EQA providers are regarded as suppliers by UKAS and so must meet suitable acceptance criteria. For more information on participation in EQA testing, refer to TPS 47: UKAS policy on participation in proficiency testing.

Table 1: Examples of external quality assessment schemes

United Kingdom National External Quality Assessment Service (UK NEQAS): www.ukneqasmicro.org.uk
Quality Control for Molecular Diagnostics (QCMD): www.qcmd.org
Labquality, Helsinki, Finland: www.labquality.fi/in_english/
Randox International Quality Assessment Scheme (RIQAS): www.riqas.com
Wales External Quality Assessment Scheme (WEQAS): www.weqas.co.uk/

2 Internal quality assurance

Internal quality assessment (IQA) schemes are used to monitor all activities involved in the passage of specimens through the laboratory, from first receipt to issuing of the final report. In an IQA scheme, approximately 0.5-1.0% of specimens received in the laboratory are anonymised and resubmitted for testing (figure 2). Specimen selection in an IQA scheme for general serology should be random, but should reflect the proportion of the total workload submitted for each test.

IQA schemes for monitoring culture, antigen detection, genome detection or electron microscopy may be more difficult to construct. Loss of viability or degradation during storage, inadequate specimen volume, or a low frequency of positive specimens may all complicate the analysis. Such IQA schemes are often supplemented by the use of spiked specimens.

Figure 2: Internal quality assessment scheme. [Flow diagram: a blood sample from a patient (eg John Smith) arrives at laboratory reception and is divided into two aliquots. One aliquot undergoes routine testing under the patient's name; for the other, a new request form is created with the name and sender replaced by an IQA number*. Each aliquot is numbered with its request form, allocated a test code, tested and reported. The tests performed and the results are then compared, and any discrepancies are analysed and acted upon.]

* Paired samples previously tested and stored under optimal conditions can be resubmitted for testing to ensure that samples are not easily paired with the named aliquot during testing, and to monitor test variation and operator variation.

Discrepancies between the results obtained for the original named sample and the anonymised sample should be recorded and reviewed by a senior member of staff.
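The IQA workflow above (random selection of roughly 0.5-1.0% of specimens, retesting under an anonymised number, then comparison of the paired results) can be sketched in code. This is a minimal illustration; the function names and result values are hypothetical, not part of the SMI:

```python
# Illustrative sketch (hypothetical names, not from the SMI): selecting
# ~1% of incoming specimens for anonymised IQA resubmission and flagging
# discrepancies between original and anonymised results for senior review.
import random

def select_for_iqa(specimen_ids, fraction=0.01, seed=None):
    """Randomly choose approximately `fraction` of specimens for retesting."""
    rng = random.Random(seed)
    n = max(1, round(len(specimen_ids) * fraction))
    return rng.sample(specimen_ids, n)

def find_discrepancies(original_results, iqa_results):
    """Return specimen IDs whose original and anonymised results differ."""
    return [sid for sid, result in original_results.items()
            if iqa_results.get(sid) not in (None, result)]

# Example: two specimens retested under IQA numbers; one result disagrees
original = {"S001": "HBsAg negative", "S002": "HBsAg positive"}
iqa      = {"S001": "HBsAg negative", "S002": "HBsAg negative"}
print(find_discrepancies(original, iqa))   # → ['S002']
```

In practice the comparison would be weighted to reflect the proportion of the workload submitted for each test, as the section notes, and every discrepancy would be recorded for trend review.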

This staff member may comment on the discrepancy and may retest the two samples in parallel if appropriate. Again, there should be a wide circulation of the results and staff should be encouraged to discuss any discrepancies found. Results from the IQA scheme should be recorded and reviewed so that recurring problems or trends can be identified. See Appendix 1 for examples. 3 Internal quality control Internal quality control (IQC) samples should be included in all assays performed in the laboratory and may be used to validate test procedures, test kits and equipment. Most serology kits now include IQCs and in such cases, users should adhere to the manufacturer s instructions. However, the use of independent IQC material is also encouraged in ISO 15189:2012 to provide an independent assessment of system performance. IQC samples can be created from international, national or local standard sera or pooled sera, which is well characterised in previous assays, and produces values within clinically significant ranges. The IQC procedure is performed as follows: select suitable control material test the control material in 20 separate assay runs (see Appendix 2) determine the mean (target value) and standard deviation (SD) of the control material establish a Shewart control chart with the mean and +1SD, +2SD and +3SD delineated include the control material in each subsequent assay run and plot the result obtained on the control chart determine the validity of each assay run by applying the Westgard rules (Table 2) 7,8 Monitoring the testing process and QC results Interpretation of QC results involves both statistical and graphical methods. They are as follows: Westgard rules 8,9 These are a set of 6 statistical rules which can be used individually, or in combination, to detect both random and systematic errors. 
Such errors create uncertainty of measurement, which must be taken into account when testing procedures and / or testing results are compared with each other or against specifications. A random error is any deviation away from an expected result. Any positive or negative deviation away from the calculated mean of QC results is regarded as random error. Causes include errors in pipetting and changes in incubation period. Random errors can be minimised by training, supervision and adherence to standard operating procedures. A systematic error is an error which occurs consistently, when a number of measurements are made under the same conditions, or varies predictably when Quality Guidance Q 2 Issue no: 7 Issue date: 07.12.15 Page: 12 of 29

conditions change. Systematic errors may occur gradually, as demonstrated by a trend in control values or may be abrupt as demonstrated by a shift in control values. Example causes are variations in incubation temperature, blockage of plate washer, aging of the reagents, gradual deterioration of control materials, and change in the reagent batch or modifications in testing method. The Westgard rules are used to define specific performance limits for a particular assay and verify the reliability of test results. The most commonly used Westgard rules, A-C, are warning rules whose violation should trigger a review of the test procedures, reagent performance and equipment calibration. Westgard rules D-F are mandatory or alarm rules whose violation should result in the rejection of the results obtained from specimens in that assay run. Table 2: The Westgard rules 8,9 A 1 2SD This rule is used as a warning rule to trigger careful inspection of the control data. If one control measure exceeds the mean ±2SD, control values in the previous run should be considered to rule out a trend. B 2 2SD This rule detects systematic errors and is violated when two consecutive control values (on the same side of the mean) exceed the same mean +2SD or mean -2SD limit. C 4 1SD This rule detects systematic error. The rule is violated when four consecutive values exceed the same mean +1SD or mean -1SD limit. The run does not need to be rejected if this rule is violated but should trigger recalibration or equipment maintenance. D 1 3SD This control rule detects random error. Violation of this rule may also point to systematic error. The assay run is considered to be out of control when one control value exceeds the mean ±3SD. E R 4SD This is a range rule which detects random error only. This rule is applied only within the current run. The rule is violated when one control measurement in a group exceeds the mean +2SD and another exceeds the mean -2SD. 
F  10-x   This rule detects systematic error and is violated when 10 consecutive values fall on the same side of the mean. Its violation often indicates deterioration of the assay reagents. The 10-x rule is usually applied across runs, and often across control materials.

Note: Often the 4-1SD and 10-x rules must be applied across runs in order to accumulate the number of control measurements needed. There are some modifications to the 10-x rule that make it fit more easily with the 4-1SD rule:

G  8-x    This rule is violated when 8 consecutive values fall on the same side of the mean.

H  12-x   This rule is violated when 12 consecutive values fall on the same side of the mean.

Quality Guidance Q 2 Issue no: 7 Issue date: 07.12.15 Page: 13 of 29

Some other control rules that fit better, and are easier to apply, in situations where 3 different control materials are being analysed include:

I  3-1SD  This rule is violated when 3 consecutive control measurements exceed the same mean +1SD or mean -1SD limit, falling on the same side of the mean.

J  6-x    This rule is violated when 6 consecutive control measurements fall on the same side of the mean.

K  9-x    This rule is violated when 9 consecutive control measurements fall on the same side of the mean.

In the event of a violation of a Westgard rule, there are three potential actions that may be taken:
- accept the test run in its entirety - this usually applies when only a warning rule is violated
- reject the whole test run - this applies only when a mandatory rule is violated
- enlarge the grey zone, and thus the re-test range, for that particular assay run - this option can be considered in the event of a violation of either a warning or a mandatory rule

The mean or target value, the coefficient of variation and the SD of the proposed control are calculated from the results obtained after testing the control material on 20 separate occasions (see Appendix 2). This process may be accelerated by testing 4 aliquots of the sample on each of 5 occasions; however, the mean and SD should then be recalculated after 20 assay runs. The values obtained are used to set acceptable limits for the results obtained subsequently with the assay control. The concentration of analyte in the control material should be within the clinically significant range. For example, in an ELISA the control's OD value should lie within the linear part of the dose-response curve. Control material that is strongly positive, and therefore saturates the assay, should not be used.

Control charts
Shewhart or Levey-Jennings plots should be drawn for each control used, with the target value (mean) and the limit values of ±1SD, ±2SD and ±3SD delineated. Subsequent values obtained with the assay controls are plotted and the Westgard rules applied to determine the validity of each assay run.
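The warning and mandatory rules above lend themselves to mechanical evaluation against a series of control results. The sketch below implements rules A-F; the function and variable names are this example's own, not from the guidance, and rule E (R-4SD) is checked only within the current run's group of controls, as the text requires.

```python
# Illustrative Westgard rule checker (rules A-F of Table 2).
# `history` holds earlier control values, oldest first; `run_group`
# holds the control values obtained in the current run.

def westgard_violations(history, run_group, mean, sd):
    """Return the list of rules violated by the current run."""
    z_all = [(v - mean) / sd for v in history + run_group]
    z_run = [(v - mean) / sd for v in run_group]
    current = z_all[-1]
    hits = []
    if abs(current) > 2:                                       # A: 1-2SD (warning)
        hits.append("1-2SD")
    last2 = z_all[-2:]
    if len(last2) == 2 and (min(last2) > 2 or max(last2) < -2):  # B: 2-2SD
        hits.append("2-2SD")
    last4 = z_all[-4:]
    if len(last4) == 4 and (min(last4) > 1 or max(last4) < -1):  # C: 4-1SD
        hits.append("4-1SD")
    if abs(current) > 3:                                       # D: 1-3SD (reject)
        hits.append("1-3SD")
    if z_run and max(z_run) > 2 and min(z_run) < -2:           # E: R-4SD (within run)
        hits.append("R-4SD")
    last10 = z_all[-10:]
    if len(last10) == 10 and (min(last10) > 0 or max(last10) < 0):  # F: 10-x
        hits.append("10-x")
    return hits
```

For example, with a target of 100 and an SD of 5, a control value of 118 violates both the 1-2SD warning rule and the mandatory 1-3SD rule, so the run would be rejected.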
(See Appendix 2 for examples of documentation used with internal quality controls.) The coefficient of variation (CV) is a measure of variability and is expressed as a percentage. Variation may be caused by the assay, the equipment used to perform the assay and/or the assay operator.

Uncertainty of measurement
Uncertainty of measurement is a quantitative indication of the quality of a result. It can be defined as a parameter, associated with the result of a measurement, that characterises the dispersion of the values that could reasonably be attributed to the measurand (the quantity intended to be measured). It is essential for the correct interpretation of a result and becomes important when test results are close to

a specified limit. Both ISO 17025:2005 and ISO 15189:2012 outline specific requirements for laboratories to evaluate and report uncertainty of measurement. Quantitation of uncertainty also allows an assessment of reliability when comparing results from different laboratories, within a laboratory, or with standard reference values. This information can prevent unnecessary repetition of tests. Laboratories are encouraged to measure uncertainty in their routine work in order to assess the quality of their test results and to become aware when results are close to a specified limit. This process also highlights the aspects of the test or calibration that produce the highest uncertainties and where improvements could be made. Some sources of uncertainty include: the sample itself; its storage conditions; the test method; the reagents and media used; the equipment used; and different staff operators carrying out the same process. These uncertainties can potentially be minimised by training, supervision and recalibration of equipment.

4 Equipment monitoring
Laboratories should verify upon installation, and before use, that equipment is capable of achieving the necessary performance and that it complies with requirements relevant to any examinations concerned. This applies to equipment used in the laboratory, on loan, or in associated facilities when authorised by the laboratory. The performance of equipment (for example the maintenance of appropriate temperature in water baths, incubators and freezers) should be checked and recorded at regular, predetermined intervals (see Appendix 3). Spectrophotometers and balances should be checked against available standards and recalibrated if necessary. Adjustable and fixed-volume pipettes should be calibrated when new and while in use. The introduction of automated liquid handling devices, capable of preparing or performing assays, may offer increased efficiency and reproducibility.
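As one illustration of a routine pipette calibration check, dispensed water can be weighed and the mean compared with the nominal volume. The sketch below is hypothetical: the 2% tolerance and the 1 mg ≈ 1 µL conversion are assumptions chosen for the example, not figures from this guidance.

```python
# Illustrative gravimetric pipette check. Assumptions: 1 mg of water
# is taken as ~1 uL, and the 2% default tolerance is an example value.

def pipette_within_tolerance(nominal_ul, weights_mg, tolerance_pct=2.0):
    """True when the mean dispensed volume is within the stated
    percentage tolerance of the nominal volume."""
    mean_vol_ul = sum(weights_mg) / len(weights_mg)   # mg of water ~ uL
    return abs(mean_vol_ul - nominal_ul) <= nominal_ul * tolerance_pct / 100
```

A 100 µL pipette delivering 99.5-100.2 mg of water would pass such a check; one delivering around 95 mg would be flagged for recalibration.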
Procedures and routines should be established for the control and maintenance of computer-controlled liquid handling devices. The scope and frequency of these procedures will depend on the automated system used. Recommended schedules of maintenance are usually an integral part of the manufacturer's instructions. QC measurements may also be a part of the assay protocols, especially with random access or closed systems. Such systems may have in-built monitoring procedures. Open systems may also have in-built procedures for monitoring mechanical and electronic parameters, but procedures for validating assay results may have to be programmed by the operator. Operators should be familiar with, and carry out, these procedures. In-house QA procedures should be devised for open access machines running assays sourced from different manufacturers. Protocols can be created to measure the CV of sample dilution, sample addition and reagent addition by performing dummy runs with dye substituted for the clinical samples or reagents. Acceptable limits can be set after 20 runs in the same way as for QC samples (±3SD limits). A minimum of 30 aliquots should be included in the QA procedure, which should mimic an assay currently performed on the equipment. A simple QA procedure can be created by copying a current assay protocol and modifying it to perform the tasks required (see Table 3 for an example of two processes amended for QA purposes). Uncertainty

should be reassessed at intervals and whenever a significant change in a procedure occurs, such as the introduction of new equipment or modification of the assay. Error messages and faults occurring with automated equipment should be recorded for each machine. The record should also document visits by maintenance engineers, routine servicing, remedial action and changes to software and hardware. Responsibility for monitoring equipment performance and carrying out cleaning and maintenance should be clearly defined in the SOP for the equipment. Faults should be reported to senior staff if they recur or if repairs are required.

Table 3: QA procedures for use with automated liquid handling devices

Test protocol          QA protocol (dilution)   QA protocol (reagent addition)
Pick up sample         Pick up dye              Add dye (as reagent)
Pre-dilute in a tube   Pre-dilute in a tube     Read
Dilute in a plate      Dilute in a plate
Incubate               Read
Wash
Add conjugate
Incubate
Wash
Add substrate
Add stop solution
Read

5 Audit
According to ISO 9000:2005, an audit is defined as a systematic, independent and documented process for obtaining evidence and evaluating it objectively to determine the extent to which audit criteria are fulfilled. It can also be defined as the process used to evaluate, amend and improve procedures in a systematic way in order to enhance quality. It is often used to highlight difficulties in those procedures or to identify bottlenecks. The selection of tests for specific clinical syndromes and specimen turnaround times are examples of procedures that can be subject to audit in the clinical laboratory. An audit can be categorised as internal or external. Internal audits are organised and carried out by laboratory staff and management. External audits are carried out by external inspection bodies such as UKAS, and also through participation in external quality assessment schemes.
Note: Audits of the quality management system should be conducted by personnel trained to assess the performance of managerial and technical processes.

Audits should be:
- planned
- scheduled
- structured - following similar patterns
- independent - if possible
- reported and recorded on time
- followed up and acted upon

The documentation required will depend on the procedure being audited. An example of the documentation associated with an audit of specimen turnaround time is shown in Appendix 4. There are three main audit types which can be performed as part of the internal audit process:

5.1 Horizontal audit
A horizontal audit assesses one element of the quality system, for example staff training or equipment calibration. This type of audit ensures that individual elements of the quality system are in place and functioning properly. However, it does not assess how the whole system functions.

5.2 Examination audit
An examination audit is also known as a witness or observation audit. In an examination audit, a member of staff undertaking a task is assessed by the auditor. This assesses whether an SOP is being followed and whether the work is competent and safe. It provides an opportunity to ascertain whether the staff member is satisfied with their training; has the correct level of supervision; understands all aspects of the procedure they are audited against; and is aware of the impact that their work has.

5.3 Vertical audit
A vertical audit assesses all the activities associated with processing one item or sample, to ensure that all parts of the system are functioning. This audit may be prospective or retrospective. As well as tracking the sample itself, a vertical audit should include aspects such as the training records of personnel involved in testing the sample, records of the equipment and reagents used to perform assays, and IQA and IQC results relevant to the time the test was performed. This type of audit assesses a whole process but is normally limited to a small number of samples passing through the laboratory.

Appendix 1: Documentation used in IQA

1 Example record sheet
1.1 Internal quality assessment - virus serology

                          IQA 26                 IQA 27                 IQA 28
                      Original   IQA         Original   IQA         Original   IQA
Laboratory No.        10223      10345       10635      10664       10728      10800
Date                  12/6/94    12/6/94     18/6/94    18/6/94     20/6/94    20/6/94

CFT assay
Influenza A           64         16
Influenza B
Chlamydia species     <8         <8          <8         <8
Coxiella burnetii     <8         <8
Adenovirus            32         64
Mycoplasma pneumoniae 256        256
Measles                                                             8          128

Hepatitis serology
HAV IgM
HBsAg
Anti-HBs                                     96 mIU/mL  106 mIU/mL

Three example discrepancies are shown:
- in IQA 26, there is a four-fold difference in influenza A titre between the named and anonymised samples
- in IQA 27, the results of the anti-HBs assay might have altered patient management (a booster dose of vaccine being, or not being, advised)
- in IQA 28, a significant measles antibody titre was detected in only one of the duplicate samples

Discrepancy reports should be produced for all three samples. Laboratories are encouraged to evaluate the impact of any discrepancy and to consider how best to avoid such issues or situations arising in the future. Actions might involve retesting of samples, retraining of staff, assessing pre- and post-analysis steps, and re-evaluating the reagents/equipment used in the tests.
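A four-fold or greater titre difference between the named and anonymised samples, as in IQA 26 above, can be flagged automatically. The sketch below is illustrative only; CF titres are doubling dilutions, so a four-fold difference is two dilution steps, and below-detection results (e.g. "<8") would need separate handling.

```python
# Illustrative discrepancy check for paired IQA titres.
from math import log2

def titre_discrepant(named, anonymised, fold_limit=4):
    """True when paired CF titres differ by fold_limit or more
    (e.g. 64 vs 16 is a four-fold difference)."""
    return abs(log2(named) - log2(anonymised)) >= log2(fold_limit)
```

Applied to the record sheet above, the influenza A pair (64 vs 16) and the measles pair (8 vs 128) would be flagged, while the adenovirus pair (32 vs 64) would not.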

2 Example discrepancy reports

2.1 Non-significant discrepancy

INTERNAL QUALITY ASSESSMENT SCHEME DISCREPANCY REPORT
Date: 18/06/94                              IQA Number: 26
Original named sample:   Number: 10223   Test: Resp. CFT
Anonymised IQA sample:   Number: 10354   Test: Resp. CFT
Report (named): Influenza A 64
Report (IQA): Influenza A 16
Comment: On repeat, both samples had CF titres of 32
Date: 02/07/94   Initials: JJG

2.2 Diagnostically significant discrepancy

INTERNAL QUALITY ASSESSMENT SCHEME DISCREPANCY REPORT
Date: 24/06/94                              IQA Number: 28
Original named sample:   Number: 10728   Test: Rash group
Anonymised IQA sample:   Number: 10800   Test: Rash group
Report (named): Measles CF 8. Not significant.
Report (IQA): Measles CF 128. May indicate recent measles infection.
Comment: On repeat, both samples had CF titres of 8
Date: 04/07/94   Initials: JJG

2.3 Discrepancy leading to different advice

INTERNAL QUALITY ASSESSMENT SCHEME DISCREPANCY REPORT
Date: 24/06/94                              IQA Number: 27
Original named sample:   Number: 10635   Test: anti-HBs
Anonymised IQA sample:   Number: 10664   Test: anti-HBs
Report (named): 96 mIU/mL. Give booster dose of vaccine.
Report (IQA): 108 mIU/mL. Indicates immunity to HBV.
Comment: Both results within the CV for this assay
Date: 04/07/94   Initials: JJG

Appendix 2: Documentation used in IQC

1 Determining the target value and SD of a QC sample and setting acceptable limits for its use (see Appendix 5)

Assay run   Antibody concentration (arbitrary units/mL)
 1          68
 2          68
 3          64
 4          72
 5          61
 6          65
 7          66
 8          65
 9          65
10          70
11          72
12          73
13          63
14          67
15          69
16          66
17          63
18          67
19          70
20          65

QC data:
Number tested = 20
Mean (target value) = 66.95 AU/mL
SD = 3.33 AU/mL
Acceptable range (±3SD) = 56.9-76.9 AU/mL

A Shewhart plot should now be constructed with the mean, ±1SD, ±2SD and ±3SD values delineated. The result obtained with the QC sample should be plotted after each assay run and must lie between 56.9 AU/mL and 76.9 AU/mL for the run to be valid.
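The figures above can be reproduced in a few lines. This is a sketch only; note that the usual n−1 sample SD of these data is about 3.28 AU/mL rather than the 3.33 AU/mL quoted, so the published value may reflect a different divisor or rounding.

```python
# Re-computing the Appendix 2 QC figures from the 20 assay runs.
from statistics import mean, stdev

runs = [68, 68, 64, 72, 61, 65, 66, 65, 65, 70,
        72, 73, 63, 67, 69, 66, 63, 67, 70, 65]

target = mean(runs)                            # 66.95 AU/mL, as published
sd = stdev(runs)                               # ~3.28 AU/mL (n - 1 divisor)
low, high = target - 3 * sd, target + 3 * sd   # acceptable +/-3SD range
```

Every one of the 20 establishing runs falls inside the ±3SD range, as expected for in-control data.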

2 Examples of Shewhart control charts

[Shewhart chart: HAV IgM ELISA QC, 1994, LMR assay - QC/CO ratio (y-axis, approximately 1.1 to 12.0, with SD gridlines from -3 to +4) plotted against assay run, illustrating random errors]

The chart above shows examples of random errors, or operator errors. The 1-3SD rule has been violated on two occasions. The results obtained in both these assay runs are invalid.

Note: The results of the QC sample are plotted as the ratio of the OD of the QC sample to the OD of the assay cut-off value. This compensates for the small differences in assay performance seen day to day. Assays incubated at room temperature are particularly susceptible to small, but acceptable, changes in performance.

The QC charts below are an example of systematic errors. A change in assay performance was detected with both controls and was associated with a new batch of reagents. Violations of the 10-x rule were detected with both controls, and of the 4-1SD rule with the intermediate control. The QC procedures indicate an increase in the sensitivity of the assay. If this is acceptable, then the QC limits should be recalculated using 20 control values obtained with this batch of reagents. This process can be speeded up by testing 4 aliquots of each control in 5 assay runs. A recalculation should be made after the results of 20 runs are available, in order to check the accuracy of the values used.

[Shewhart chart: anti-HBs QC, intermediate control, 1994 - antibody concentration (approximately 38 to 164, with SD gridlines from -4 to +7) plotted against assay run, showing systematic error after a new reagent batch was introduced]

[Shewhart chart: anti-HBs QC, low control, 1994 - antibody concentration (approximately 0 to 18.5, with SD gridlines from -4 to +4) plotted against assay run, showing systematic error after the new batch was introduced]

Appendix 3: Equipment control sheets

Temperature control sheet

Inventory number: CAMB001234
Incubator type (CO2 or general purpose): CO2
Water bath: N/A
Refrigerator: N/A
Set temperature: 37°C ±2°C (5% CO2)
Acceptable range: 35°C - 39°C
Uncertainty of measurement: 0.2°C (as an example)

Date      Temp °C   Action / initials
4.9.15    37        JJG
5.9.15    36        JJG
6.9.15    37        JJG
7.9.15    36        JJG
8.9.15    36        Replaced CO2 cylinder. JJG
11.9.15   37        JJG
12.9.15   35        JJG
13.9.15   37        JJG
14.9.15   37        JJG
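A temperature log like the one above can also be screened automatically against its acceptable range. In the sketch below the names are illustrative and the final entry (34 on 15.9.15) is invented purely to show an out-of-range result being flagged.

```python
# Illustrative screen of a temperature log against the acceptable
# range on the control sheet (35-39 C in the example above).

def out_of_range(log, low=35.0, high=39.0):
    """Return the (date, temperature) entries that need action."""
    return [(d, t) for d, t in log if not (low <= t <= high)]

# Entries from the sheet above, plus one invented out-of-range reading.
log = [("4.9.15", 37), ("5.9.15", 36), ("6.9.15", 37), ("12.9.15", 35),
       ("15.9.15", 34)]
```

Only the invented 34 °C reading is returned; the genuine entries, including the 35 °C boundary value, all pass.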

Appendix 4: Documentation and data handling associated with audit of specimen turnaround time

1 Vertical audit of specimen processing time
- select approximately 20 consecutive specimens requiring the same test or combination of tests
- record the time taken to perform each task as the specimen passes through the laboratory
- complete this form for every sample enrolled in the audit

Figure 2: Vertical audit recording form

Sample Lab No: ..........   Test code: ..........

Stage                             Date   Time (24hr clock)   Time elapsed (Days-Hours)   Differential time (Days-Hours)
Sample taken
Arrival at lab
Coding for test
Registration
Work sheet generated
1st test started
Last result entered on computer
Result authorisation
Report sent out

In the example given above, the fields are designed to reflect workflow. This may vary from laboratory to laboratory.
- calculate the mean and median time for each task and identify the bottlenecks
- if necessary, make changes to procedures or the frequency of testing
- repeat the audit several months after the changes have been instituted
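The elapsed and differential times on the recording form can be derived directly from the date/time stamps. The sketch below uses illustrative stage names and invented timestamps; the same per-stage results across 20 specimens would feed the mean and median calculations described above.

```python
# Illustrative elapsed/differential time calculation for a
# vertical-audit recording form.
from datetime import datetime, timedelta

def stage_times(stamps):
    """stamps: list of (stage, 'DD/MM/YY HH:MM') in workflow order.
    Returns (stage, elapsed_from_start, differential_from_previous)."""
    times = [(s, datetime.strptime(t, "%d/%m/%y %H:%M")) for s, t in stamps]
    start = times[0][1]
    out, prev = [], start
    for stage, t in times:
        out.append((stage, t - start, t - prev))
        prev = t
    return out

rows = stage_times([
    ("Sample taken",    "04/09/15 09:10"),
    ("Arrival at lab",  "04/09/15 11:40"),
    ("Report sent out", "05/09/15 16:00"),
])
```

With these invented stamps, transport to the laboratory took 2 h 30 min and the laboratory stages a further 1 day 4 h 20 min, which is where attention would focus.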

Figure 3: Areas of responsibility that may be examined by vertical audit

Area of responsibility                    Audit area           Outcome
Ward, clinic, practice etc.               Patient handling     Patient correctly identified
Medical and nursing staff, porters, etc.  Specimen collection  Sample appropriately obtained, contained, labelled, transported and stored
Laboratory                                Specimen analysis    Results calculated correctly and expressed without ambiguity
Laboratory                                Report               Destination correct, interpretation given when required
Medical and nursing staff, etc.           Report delivery      Results entered in notes and acted on if required

Appendix 5: Statistics used with IQC

These are examples of the most common methods of statistical analysis, but it should be noted that other methods are also available.

1 Mean
The mean is defined as the arithmetic average of a set of data points:

Mean = (Σx) / n

where Σx = the sum of all data points and n = the number of data points in the set. The mean identifies the target value of a set of QC data points.

2 Standard deviation
The standard deviation (SD) is a measure of the distribution of data points above and below the mean. It is used to set acceptable limits for values obtained with IQC samples.

SD = √[ (Σx² − (Σx)²/n) / (n − 1) ]

where Σx² = the sum of the squares of each data point, (Σx)² = the square of the sum of all data points, and n = the total number of data points in the set.

Quality control data exhibit a normal distribution, therefore:
- 68.3% of values lie within ±1SD of the mean
- 95.5% of values lie within ±2SD of the mean
- 99.7% of values lie within ±3SD of the mean

3 Coefficient of variation
The coefficient of variation (CV) is a measure of the variability of an assay and is expressed as a percentage:

CV (%) = (SD / mean) × 100

The CV is useful for determining whether values obtained with duplicate samples, which lie either side of an arbitrary cut-off value, are within experimental error. For example, in anti-HBs antibody determination a value of 105 mIU/mL would indicate satisfactory immunity, whereas a value of 98 mIU/mL would signal the need for a

booster dose of vaccine. If the assay has a CV of 10%, both values would be acceptable.

4 Uncertainty of measurement
Uncertainty of measurement (UM) is estimated using the formula:

UM = ± s·k

where s = the combined standard uncertainty (a population standard deviation) and k = the coverage factor. Multiplication by a coverage factor gives the confidence interval for the distribution of values which could be derived from the quantities measured. This is known as the expanded uncertainty of measurement. For 30 or more measurements, a confidence level of 95% is obtained using k = 2. For fewer than 30 measurements, a confidence level of 95% is obtained when k = the two-tailed Student's t value for that number of measurements and for the level of confidence required.
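Both of these calculations lend themselves to short helper functions. In the sketch below the names are illustrative: the duplicate check is one possible way of applying the CV criterion from section 3 (the guidance does not prescribe an exact formula), and the t values are the standard two-tailed 95% figures for a few selected degrees of freedom only.

```python
# Illustrative CV-based duplicate check and expanded uncertainty.

def within_experimental_error(a, b, cv_percent):
    """True when duplicate results a and b differ by no more than the
    assay CV applied to their mean (one possible acceptance criterion)."""
    return abs(a - b) <= ((a + b) / 2) * cv_percent / 100

# Two-tailed 95% Student's t values for selected degrees of freedom (n - 1).
T_95 = {4: 2.776, 9: 2.262, 19: 2.093, 29: 2.045}

def expanded_uncertainty(s, n):
    """Expanded uncertainty UM = s.k at ~95% confidence: k = 2 for
    n >= 30 measurements, otherwise the tabulated t value for n - 1
    degrees of freedom (only selected values are included here)."""
    k = 2.0 if n >= 30 else T_95[n - 1]
    return s * k
```

With a 10% CV, the anti-HBs pair 105 and 98 mIU/mL above passes the duplicate check, so both results would be treated as within experimental error of the 100 mIU/mL cut-off.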

References

1. Snell JJS. External quality assessment. In: Snell JJS, Brown DFJ, Roberts C, editors. Quality Assurance: Principles and Practice in the Microbiology Laboratory. London: Public Health Laboratory Service; 1999. p. 77-89.
2. Gray JJ, Wreghitt TG, McKee TA, McIntyre P, Roth CE, Smith DJ, et al. Internal quality assurance in a clinical virology laboratory. I. Internal quality assessment. J Clin Pathol 1995;48:168-73.
3. Gray JJ, Wreghitt TG, McKee TA, McIntyre P, Roth CE, Smith DJ, et al. Internal quality assurance in a clinical virology laboratory. II. Internal quality control. J Clin Pathol 1995;48:198-202.
4. Sharp IR. Quality audit and quality system review in the laboratory. In: Snell JJS, Brown DFJ, Roberts C, editors. Quality Assurance: Principles and Practice in the Microbiology Laboratory. London: Public Health Laboratory Service; 1999. p. 105-17.
5. Barnes I. Pathology Quality Assurance Review. Skipton House, 80 London Road, London SE1 6LH; 2014. p. 1-40.
6. European Committee for Standardization. Medical laboratories - Requirements for quality and competence (ISO 15189:2012). British Standards Institution; 2012. p. 1-50.
7. Westgard JO, Barry PL, Hunt MR, Groth T. A multi-rule Shewhart chart for quality control in clinical chemistry. Clin Chem 1981;27:493-501.
8. Westgard JO, Groth T, Aronsson T, Falk H, de Verdier CH. Performance characteristics of rules for internal quality control: probabilities for false rejection and error detection. Clin Chem 1977;23:1857-67.
9. Westgard JO. Westgard Rules. Tools, Technologies and Training for Healthcare Laboratories, Madison, Wisconsin 53717. https://www.westgard.com/mltirule.htm. Accessed 28/01/15.