QUALITY CONTROL AND QUALITY ASSURANCE IN CLINICAL RESEARCH




Martin Valania, Executive Director, Corporate QA and Compliance

Introduction

Pharmaceutical companies recognize the benefits of carefully managing the quality of data from their drug development and clinical trials. When these companies evaluate CROs as possible outsourcing partners, quality is a key factor in their decision. The ideal candidate will be able to ensure that all clinical data are reviewed, vetted for outlying data points, and carefully documented for query identification and resolution throughout the study.

Maintaining accuracy and quality throughout a clinical study is a continual, dynamic process. Although study requirements are carefully set forth at the outset in detailed documents, such as the approved clinical protocol and accompanying statistical analysis plan, expectations and requirements can change during a study. This ongoing process requires that the mechanisms for quality assurance be revised and communicated to all investigators and support staff.

Defining the Terminology

Quality: the total set of characteristics of a product or service that affect its ability to satisfy a customer's stated or implied needs.

Quality system: the organizational structure, responsibilities, procedures, processes, and resources for implementing quality management.

Quality Assurance (QA): the systematic and independent examination of all trial-related activities and documents. These audits determine whether the evaluated activities were appropriately conducted and whether the data were generated, recorded, analyzed, and accurately reported according to the protocol, Standard Operating Procedures (SOPs), and Good Clinical Practice (GCP).

Quality Control (QC): periodic operational checks within each functional department to verify that clinical data are generated, collected, handled, analyzed, and reported according to the protocol, SOPs, and GCP.

The Quality Challenge

The ongoing challenge in managing the quality of clinical data is to continually monitor data collection protocols and data management practices at every level of the study. This includes:

- Ensuring that data generated during the study reflect what is specified in the protocol (case report form [CRF] vs. protocol)
- Comparing data in the CRF with data collected in source documents (CRF vs. source documents) for accuracy
- Ensuring that the data analyzed are the data recorded in the CRF (database vs. CRF)

Quality surveillance continues after the trial has ended and plays an important role in ensuring that:

- Data presented in tables, listings, and graphs (TLGs) correctly match data in the database (TLGs vs. database)
- Data reported in the clinical study report (CSR) are the data analyzed (CSR vs. TLGs)
- All aspects of the data management process are compliant with SOPs and GCP

The Quality Plan

The quality plan describes how the overall quality process guidelines will be applied to the clinical trial. It definitively states what is meant by the various quality-related tasks in the study. A quality plan documents the specific quality practices, resources, and activities relevant to a specific project, including both operational QC and QA activities.

Operational QC

It is critical that trial managers develop a QC plan for each key operational stage of the study, defining the standards against which QC will be conducted, including:

- The sampling plan to be used (if applicable)
- The metrics to be documented
- Appropriate methods to report and distribute results

During the study design phase, QC oversees an independent review of the approved proposed protocol. The QC plan also compares the study's CRF to the objectives set forth in the protocol to ensure that the CRF is designed to collect all necessary data. As part of this QC plan, CRF completion guidelines are also reviewed.

For overall site management, a complete QC plan addresses:

- Investigator selection and qualifications
- Study conduct (monitoring)
- Source document verification

During the clinical trial, the QC plan verifies the accuracy of data entry as well as database testing parameters, from automated program logic checks to manual review of database entries against source documentation and CRFs. During statistical analysis, QC monitors the relevant TLGs, as well as their incorporation into clinical study reports.

QA activities

Many aspects of QA and data surveillance are typically organized according to an overarching strategy that is formally developed in a QA audit plan. The QA audit plan sets forth guidelines and expectations for the study and states the purpose and scope of the audit schedule. It also spells out which internal processes of the study will be audited, from initial study design through site and data management and statistical analysis to assembly of the final clinical study report. It specifies the organizational structure of audit team members and auditees for each study stage, as well as the standards against which the audit will be conducted: the original intent of the protocol, CRF completion guidelines, SOPs, and ICH GCP.

Audits must also consider standards set by regulatory directives. For example, US audits must consider FDA's 21 CFR Part 11 requirements for electronic records, electronic signatures, and audit trails for data changes. The equivalent European regulation is the EU Clinical Trials Directive, 2001/20/EC.

A thorough QA audit plan also clearly sets expectations for the documents to be provided by the auditee, as well as the location, date, time, and expected duration of audits. Preparation for QA audits should include reviews of the approved protocol and amendments, SOPs (both general and study-specific), and any specialized training associated with the study. Annotated CRFs and the statistical analysis plan (SAP) should also be reviewed before QA audits.

Internal process audits are another important QA responsibility. Internal audits review the drug development processes employed across several studies to determine whether there are systemic problems within a process. This includes a review of employee training, compliance with SOPs and regulatory requirements, and documented evidence that QC was appropriately conducted on the output of each internal process as well as on the final deliverable to a client.

Site management metrics

Internal process audits at the site management level ensure that the defined process for investigator selection was followed by reviewing pre-study visits, as well as facilities, equipment, and staff qualifications. Other audits include examining FDA's public lists of firms or persons debarred or restricted under the Federal Food, Drug, and Cosmetic Act (21 USC 335a, 335b) and ensuring that site staff training was conducted. Several metrics commonly evaluated by internal process audits after the study has begun are:

- Percentage of monitoring visits completed on time
- Percentage of assessable subjects (no protocol violations)
- Percentage of serious adverse events (SAEs) reported to the IRB and sponsor within 24 hours
- Percentage of acceptable informed consent forms
- Number of queries per number of CRF pages reviewed
- Number of missing entries per number of CRF pages reviewed

Data management metrics

Internal process audits examining all aspects of data management are required to ensure that computer systems (both hardware and software) are adequately validated, from initial installation through the procedures that document how changes to a computer system are made and controlled by audit trails. This data management process audit begins with examining user requirements and initial hardware installation and qualification. After hardware validation, software installation and qualification are audited, along with the requisite user acceptance testing.

QC with either manual or electronic data capture (EDC) requires data review against source documentation. In addition to monitoring and verification of audit trails, details of any problem logs, maintenance logs, and all user training are reviewed and compared with the required data management SOPs. Other QA data management activities include reviews of the relevant security controls, system back-up, disaster recovery, configuration change control, and problem reporting and corrective action.

Examples of data management metrics for QA are:

- Percentage of database errors
- Percentage of queries manually generated
- Time from last patient out to database lock
- Percentage difference between the SAE database and the study database
- Percentage of SAS programs adequately validated
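The ratio-style metrics above reduce to straightforward arithmetic once the underlying counts are tracked. A minimal sketch in Python (all counts and names here are hypothetical, for illustration only):

```python
# Sketch of computing ratio-style QC metrics; all counts are hypothetical.

def pct(numerator, denominator):
    """Percentage, guarding against an empty denominator."""
    return 100.0 * numerator / denominator if denominator else 0.0

counts = {
    "visits_on_time": 18, "visits_total": 20,   # monitoring visits
    "saes_on_time": 5, "saes_total": 5,         # SAEs reported within 24 h
    "queries": 42, "missing": 7, "pages": 600,  # per CRF page reviewed
}

print(f"Monitoring visits on time: {pct(counts['visits_on_time'], counts['visits_total']):.1f}%")
print(f"SAEs reported within 24 h: {pct(counts['saes_on_time'], counts['saes_total']):.1f}%")
print(f"Queries per CRF page reviewed: {counts['queries'] / counts['pages']:.3f}")
print(f"Missing entries per CRF page: {counts['missing'] / counts['pages']:.3f}")
```

Tracking the denominators (pages reviewed, visits scheduled) alongside the numerators is the practical prerequisite; the arithmetic itself is trivial.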

Data entry QA is another critical area of data management internal process audits. These audits examine and ensure that data entry is verified, evaluate the accuracy of independent (double) data entry, and review the programmed data logic checks and procedures for the manual review of data. They also ensure that all data queries are resolved and verify overall database QC.

Statistical analysis metrics

Statistical analysis QA ensures that SAS programs are validated, including all TLGs. When the clinical study report is drafted, QA verifies that both the CSR and statistical analysis plan (SAP) development SOPs are followed. It ensures that both internal and expert reviews are completed and documented, and finally that all statistical analysis protocols are approved by the appropriate authority.

Examples of statistical analysis metrics include:

- Percentage of TLGs with numerical or formatting errors
- Percentage of SAS programs adequately validated

Examples of statistical analysis reporting metrics include:

- Percentage of errors in in-text data
- Number of CSR versions before the final version

Study site audits

QA conducts site audits throughout the course of any clinical trial, documenting and evaluating problems reported by study monitors. QA's criteria for site selection include:

- High patient enrollment
- High staff turnover
- An abnormal number of AEs (high or low)
- Subject enrollment rates that are unexpectedly high or low given the research site's location and demographics
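The programmed data logic checks mentioned above are, at heart, simple field-level rules that raise queries automatically. A minimal sketch in Python (the field names, plausibility ranges, and query texts are hypothetical, not drawn from any particular EDC system):

```python
# Sketch of programmed data logic checks run against one entered CRF
# record. Field names, ranges, and query texts are hypothetical.

def run_edit_checks(record):
    """Return a list of query texts for one CRF record."""
    queries = []

    # Range check: systolic blood pressure plausibility.
    sbp = record.get("sbp_mmhg")
    if sbp is not None and not 60 <= sbp <= 250:
        queries.append(f"SBP {sbp} mmHg outside plausible range 60-250")

    # Consistency check: visit must not precede informed consent
    # (ISO-8601 date strings compare correctly as text).
    visit, consent = record.get("visit_date"), record.get("consent_date")
    if visit and consent and visit < consent:
        queries.append("Visit date precedes informed consent date")

    # Completeness check: subject identifier is required.
    if not record.get("subject_id"):
        queries.append("Missing subject ID")

    return queries

record = {"subject_id": "001", "sbp_mmhg": 300,
          "consent_date": "2014-01-10", "visit_date": "2014-01-05"}
print(run_edit_checks(record))  # two queries: SBP range, visit-before-consent
```

Checks of this kind run on every record as it is entered, while the manual review procedures handle what rules cannot express.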

Site audits ensure compliance with the approved protocol and GCP, as well as adequate documentation of case histories (source documents) such as medical records, progress notes, and hospital charts. Audits examine whether all clinical tests were performed at the times specified in the clinical study protocol, including specimen collection, storage, and shipping (if applicable), and the timeliness of the review of clinical test results.

Another important aspect of QA site audit evaluation of source documentation is its timeliness (i.e., how long after the visit were data entered into the CRF?). Audits also examine the accuracy of source documentation to support CRF data (e.g., comparing CRFs against source documents for a sample of "square root of n + 1" subjects). All on-site documentation must account for:

- All dropouts and withdrawals
- SAEs (including the timeliness of SAE reporting to the IRB and sponsor)
- Deaths
- All safety and efficacy data

Finally, sites are audited for an adequate drug-accountability process.

Corrective and preventive action process

The role of this QA function is to ensure that complaints, discrepancies, and non-compliances are visible, prioritized, and tracked, and that their root cause is determined and resolved. This process establishes guidelines for assigning a priority status and a management review date to issues of non-conformity during the study, to assess responsibility and accountability, followed by a well-articulated proposal for corrective action that is tracked to completion.
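The "square root of n + 1" sampling rule mentioned in the site-audit discussion above can be applied mechanically. A minimal sketch in Python, assuming the common reading ceil(sqrt(n)) + 1 (rounding conventions vary between organizations):

```python
import math
import random

# Sketch of the "square root of n + 1" audit sampling rule, assuming
# the reading ceil(sqrt(n)) + 1; rounding conventions vary.

def audit_sample(subject_ids, seed=None):
    """Randomly select ceil(sqrt(n)) + 1 subjects for CRF-vs-source audit."""
    n = len(subject_ids)
    k = min(n, math.ceil(math.sqrt(n)) + 1)
    return random.Random(seed).sample(subject_ids, k)

subjects = [f"S{i:03d}" for i in range(1, 51)]  # 50 enrolled subjects
print(len(audit_sample(subjects, seed=1)))      # ceil(sqrt(50)) + 1 = 9
```

Seeding the random selection makes the sample reproducible for the audit record; risk-based approaches may instead weight the sample toward subjects with known issues.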

Continual process improvement

QA also has a critical introspective role in continually monitoring and evaluating its own activities to improve QA processes for future studies. This continual improvement process tracks and reports on metrics for key drug development activities and processes, keeping in mind the adage that what gets measured gets managed. Other inputs to process improvement include a formal debriefing after project close and customer satisfaction surveys, either of which may lead to changes in SOPs and training.

Summary

The benefits of managing the quality of clinical data are numerous. Doing so:

- Ensures compliance with the protocol, SOPs, and GCP
- Enables systemic problems to be resolved before the end of the study
- Helps reduce data queries (industry average = $120/query) and identifies ways to reduce cycle times for various processes
- Ensures data continuity throughout the course of the study: that the data collected are the data required by the protocol
- Ensures the accuracy and consistency of data, from source document entry into the CRF to the final datasets reported in the final CSR
- Plays a critical role in dealing with instances of non-conformity during clinical trials
- Continually evaluates all QA functions in an effort to improve QA methodologies and metrics
