ISSUES IN CLINICAL TRIALS MANAGEMENT

Quality time: The art of QA program development for research sites

Carolynn J. Thomas, RN, BSN, MSPH; T. Beth Dean, RN, CCRC; and Donna R. Fowler, RN, BSN, CCRC

For more than a decade, both public and private sponsors have worked diligently to improve quality while maximizing speed and minimizing mounting costs for clinical trial processes. International Conference on Harmonisation good clinical practice (GCP) guidance documents require sponsors to have standard operating procedures and quality assurance (QA) systems in place; however, guidance for sites on methods for developing QA systems is lacking. As a result, many research sites do not mandate these systems as part of their site mission to fulfill GCP commitments. Despite operational challenges, sites that adopt quality systems find that the time investment is minimal compared with the resulting rewards. By providing a simple approach to (1) identify process and quality indicators, (2) plan a method to train, and (3) self-evaluate through a cyclic approach for ongoing improvement, sites can easily benchmark from within and transform site performance. Now more than ever, sites have an ethical responsibility to conduct clinical trials with well-trained staff, documented procedural systems, and, most importantly, time set aside to implement realistic QA processes.

Key Words: clinical trials, process improvement, quality assurance, site management, standard operating procedures

"And if you don't know where you are going, any road will get you there." — George Harrison (1)

Introduction

Clinical researchers seek to identify safe and effective drugs and devices to prevent and treat human diseases. Safety and efficacy measurements are the critical parameters researchers use to assess a product's therapeutic value in the conduct of clinical investigations.
At the investigative site, daily clinical trial activities and processes must adhere to good clinical practice (GCP) to ensure patient safety and the accuracy of reported data. However, clinical research sites are challenged by several realities: competitive enrollment among sites, the high cost of site operations, shortages of qualified staff, increasing trial complexity, and shrinking study budgets. Moreover, physician investigators are trying to meet monthly non-research clinical relative value unit (RVU) requirements and often must focus dually on the business of site management and the ethical commitments listed in FDA Form 1572. It generally falls to the site manager to ensure that the site's performance reflects the quality goals elucidated in most site mission statements.

Author Disclosure: According to ACCME and Research Practitioner policy, authors must disclose all associations with proprietary entities that may have a direct relationship to the subject matter of their presentation. Carolynn J. Thomas, RN, BSN, MSPH; T. Beth Dean, RN, CCRC; and Donna R. Fowler, RN, BSN, CCRC, have stated they have no such associations.

Three essential elements for adequate quality assurance (QA) of site performance are: (a) management's commitment to quality, (b) dynamic site standard operating procedures (SOPs), and (c) an ongoing site QA program.

W. Edwards Deming, of The W. Edwards Deming Institute, provides organizations with guidelines for creating a more efficient workplace, higher profits, and overall satisfaction of stakeholders. (1,2) Along with his mentor Walter Shewhart, Deming is credited with developing the plan-do-check-act (PDCA) cycle as a simple template for quality improvement of processes and organizations (see Figure 1):

- PLAN: Recognize an opportunity, and plan the change.
- DO: Test the change.
- CHECK: Review the test, analyze the results, and identify learnings.
- ACT: Take action based on what was learned in the check step.
If the change was successful, incorporate the learnings from the test into wider changes. If not, go through the cycle again with a different plan. (3)

Additionally, Deming stresses the importance of management's overall commitment to quality and generally condemns inspection alone as inefficient, while stressing training and cooperation as effective tools for gaining process improvement.

© 2003 Center for Clinical Research Practice. Research Practitioner, Volume 4, Number 6.

Figure 1: The PDCA cycle (Plan, Do, Check, Act).

In this article, we use the PDCA cycle as a model to illustrate the process of adopting and using continuous quality improvement at the clinical research site to enhance ethical, quality benchmarking activities.

PLAN: Develop/plan a QA program at your site

Having a QA program at the site is an essential first step, the realized opportunity for strides in process improvement. Management must adopt quality as the absolute measure for assuring ethical, safe, and efficient conduct of clinical studies. Authorities agree that without support for a quality system from the top level downward, it ultimately will remain only a high-stakes endeavor. In his book Quality Without Tears, Philip Crosby offers a checklist that organizations can use to profile their need for quality systems and SOPs. (4) This checklist has been modified for a research site in Table 1. Sites must actively pursue best practices, especially given the charge to perform ethical, safe, and effective research. Crosby suggests that although management may not realize the root of quality problems, there is a "quality vaccine" that can be used to build up antibodies to potential pitfalls. His vaccine includes the following elements: (1) determination, (2) education, and (3) implementation. (4)

In addition to management assuming responsibility for quality, compliance also must be achieved with GCP through the development of measurable, descriptive SOPs. All site staff should be included in GCP training and in the SOP development and training process. When beginning, the pivotal key is generating a list of quality indicators that fulfill GCPs through SOP compliance. Quality indicators should be organized according to specific headings such as:

- informed consent
- eligibility
- patient visits/follow-up
- data management
- study drug/device
- endpoints
- adverse events
- regulatory documents

This list should establish the basis of the primary research SOPs developed by the site. All parties at the research site should agree to the measurable indicators for specific elements in each category. For example, the following quality indicators are the minimal criteria to be assured and measured regarding informed consent:

- Patient signed the consent prior to study procedures
- Correct version of the stamped consent was used
- All original signatures and dates required for the consent are proper
- All pages of the consent have been initialed by the subject
- Documentation of the consent process includes:
  - date and time of consent and signature
  - indication of any and all witnesses to the process
  - names of the parties who gave informed consent
  - any subject questions and additional information exchanges

Site training on the quality indicators expected for informed consent is supported by the SOP documenting the site standard for this process. Management would endorse this expectation and incorporate it in the job descriptions of coordinators and designated research personnel.
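The scoring scheme of the adapted Crosby checklist in Table 1 can be expressed as a short sketch. This is illustrative only: the function name and rating labels are assumptions, while the point values and condition thresholds come from the table itself.

```python
# Sketch of the Table 1 self-assessment scoring; point values and
# condition thresholds come from the adapted Crosby checklist.
SCORES = {"that's us all the way": 5, "some is true": 3, "we're not like that": 1}

CONDITIONS = [  # (minimum total, condition, prescription)
    (21, "Critical", "Needs intensive care immediately"),
    (16, "Guarded", "Needs life support system hookup"),
    (11, "Resting", "Needs medication and attention"),
    (6, "Healing", "Needs regular check-up"),
    (5, "Whole", "Needs ongoing counseling"),
]

def assess(ratings):
    """Total the ratings for the five statements and map the sum to a condition."""
    total = sum(SCORES[r] for r in ratings)
    for minimum, condition, prescription in CONDITIONS:
        if total >= minimum:
            return total, condition, prescription

# A site that agrees with most of the problem statements scores high (worse):
print(assess(["that's us all the way", "that's us all the way",
              "some is true", "some is true", "we're not like that"]))
# -> (17, 'Guarded', 'Needs life support system hookup')
```

Note that because the statements describe quality problems, a higher score indicates a greater need for a quality system.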

DO: Implementing a QA program

Once a site has defined, established, and trained staff on all quality indicators for informed consent, it can then implement a process of quality control (QC) and QA to ensure that the Informed Consent SOP (and consistent adherence to GCPs) has been followed. QC is a real-time activity (5) through which all individuals at the site perform daily checks during and immediately after the informed consent process to ensure that the quality indicators were achieved correctly. This QC checklist should promote adherence to the SOP and GCP guidelines. However, evaluating overall site performance relative to quality indicators requires a process that includes a statistical measure of quality achievement.

The site should develop an audit tool that reflects specific and measurable quality indicators. An audit tool allows the site to verify that it is consistently performing at expected levels, while also providing a means to document any deviations from full compliance with quality indicators. Data from the QA audit forms can be compiled using software such as Excel and Access. It is a challenge for sites to dedicate time for training and QA processes, and management support is the key to establishing a strong QA system. Questions that should be answered by a site's QA SOP include:

- Do you have to perform a QA audit on all subjects and all studies?
- What is the minimum number to be audited?
- How often should the audits take place?
- Who will perform the audits?

Table 1: Indications for a quality system at the clinical research site

Rate each statement: "That's us all the way" (5 points), "Some is true" (3 points), "We're not like that" (1 point).

1. Our data and regulatory document management normally contain waivers, deviations, queries, missing items, and other indications of not conforming to requirements.
2. Our site spends a lot of time in "fix it" mode, correcting deviations.
3. Our coordinators do not know specifically what management wants from them concerning quality.
4. Management does not know what the price of nonconformance really is.
5. Management believes that quality is a problem caused by something other than management action.

Point count condition:
21-25  Critical  Needs intensive care immediately
16-20  Guarded   Needs life support system hookup
11-15  Resting   Needs medication and attention
6-10   Healing   Needs regular check-up
5      Whole     Needs ongoing counseling

Adapted from: Crosby PC. Quality Without Tears. New York: McGraw-Hill; 1984.

Examples of QA SOP evolution

Example A. Donna Fowler, BSN, CCRC directs a private practice research site. At our site, we started with an SOP that required auditing 20% of all enrolled subjects per study at least monthly. Our initial tool audited 46 items, some of which were repeated across several visits. We also had a checklist with 32 items for the review of regulatory binders. The initial plan included a training session with inter-rater reliability testing to ensure consistent data collection. Since we designed this as a peer-review SOP, the entire staff participated and were

randomly assigned cases to review. We kept the assignments blinded to alleviate bias and to create a self-improvement endpoint. The QA SOP was well-designed and worked well initially, when enthusiasm was high. As we got busier and experienced staff turnover, priorities inevitably shifted to orientation and the launch of new studies, moving the SOP to the back burner. Having an operational QA process was a high priority, so we had to take action to get it back on track. We did a consensus review of the sponsor monitoring reports for our site and of all prior QA peer audit reports, and compared them to our current SOP auditing tool. This resulted in reducing the number of items on our audit tool to 15 critical indicators. We also determined that we could do our peer audits less frequently and increase our training of newly hired staff. In 6 months, we will do another review of our auditing reports and the SOP, which may result in more frequent audits. If so, we will look at further innovative ways to incorporate QA into our daily juggling routine.

We also found it beneficial to build in priority time for coordinators to complete the QA chart reviews. To ensure the audits are completed in a timely manner, each coordinator attends a 1-hour staff meeting that provides uninterrupted time for completing chart audits. This protects time for the QA process, and coordinators do not feel they must sacrifice daily work quality. Chart audits also serve as a learning tool for less-experienced coordinators: through the hands-on QA process they become more aware of the errors that can occur, and the message remains as a visual reminder. When scoring another coordinator's work, they also are reviewing critical items for appropriate trial conduct.
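Example A's monthly 20% sample with blinded peer assignment could be operationalized in many ways; the sketch below is one illustration. The subject IDs, coordinator names, and the `monthly_audit_assignments` helper are hypothetical, not part of any cited SOP.

```python
import random

def monthly_audit_assignments(subjects, coordinators, fraction=0.2, seed=None):
    """Pick a random fraction of enrolled subjects (at least one chart per
    study) and assign each chart to a peer reviewer other than the
    coordinator who manages that chart."""
    rng = random.Random(seed)
    n = max(1, round(len(subjects) * fraction))
    sample = rng.sample(sorted(subjects), n)
    return {sid: rng.choice([c for c in coordinators if c != subjects[sid]])
            for sid in sample}

# Hypothetical roster mapping subject ID -> managing coordinator
roster = {f"S{i:03d}": c for i, c in enumerate(["Ann", "Ben", "Cam"] * 5)}
picks = monthly_audit_assignments(roster, ["Ann", "Ben", "Cam"], seed=1)
for sid, reviewer in picks.items():
    print(sid, "audited by", reviewer)
```

Excluding a chart's own coordinator from the candidate reviewers is what makes the review a peer audit; keeping the draw random (and the `seed` unset in practice) is what keeps assignments blinded and unbiased.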
Example A illustrates how a site committed to excellence by establishing a QA SOP, training staff on its execution, and then evaluating results to drive change. An important point is that sites should not aim so high that they cannot achieve the quality milestones set forth in their SOPs. Use the GCPs as a guide, adopt measurable quality indicators, and be realistic about frequency and sample sizes. Moreover, group consensus and training are the major means to a QA SOP success story.

Example B. Another illustration of site QA systems comes from the academic perspective, specifically sites that are part of large, multicenter grants such as those funded by the National Institutes of Health (NIH). Beth Dean, RN, CCRC reflects on the QA experience of managing an AIDS Clinical Trial Unit (ACTU). For more than a decade, these groups have gradually raised the bar on quantifying QA methods at the site; a QA program was a mandated part of the NIH application and approval as a clinical trials unit.

Working within a multi-site network made clear that the by-products of a good QA program were multifold. First, since it was mandated by the funding agency, lack of a fully operational system was not an option. Second, it instilled an excellent staff understanding of specific performance evaluation criteria within the operation, thus providing a means for sites to define local SOPs. Finally, QA data were compiled centrally and distributed to the sites in ongoing reports. The review of QA data allowed sites to benchmark against each other and to reverse negative trends early, thereby reducing chronic QC activities. Ultimately, such a mandated QA process led to more product ownership, job satisfaction, and retention of experienced staff. The unit viewed this as a team effort, and interestingly, we never had staff miss the quarterly QA sessions we conducted as a group.
The concept of protected time for these QA management systems was critical, as was anonymous reporting of quality improvement trends. When staff saw that QA systems could be time efficient and reduce frustrations in the daily routine, it was easy to achieve consensus.

The International Conference on Harmonisation (ICH) guideline for clinical trial conduct requires SOPs and QA processes for sponsors. (6) Although the ICH guideline does not specify that research sites have SOPs or QA systems in place, ICH GCP principle 2.13 states that systems with procedures that assure the quality of every aspect of the trial should be implemented. This clearly reflects the agreement of experts that clinical trials should not be conducted without defined SOPs and quality systems. Sponsors and contract research organizations continue to consider these systems an indicator of a benchmarking site when selecting and investing in

Sample QA tool for clinical GCP SOP quality indicators

Study: ____  Patient Number: ____  Enrollment Date: ____
Current Visit Date: ____  QA Audit Date: ____

Clinical SOP quality indicators (answer Yes/No for each):

Informed Consent
1. ICF Documentation Check Sheet completed*
2. Patient signed consent prior to study procedures
3. Correct ICF version used

Data Management
4. Source and CRF neat and legible
5. CRFs completed and up-to-date

Eligibility/Follow-up
6. Patient qualified for study enrollment
7. HIPAA notice, and if required, appropriate release forms signed
8. All visit-specific tests/procedures performed per protocol
9. Missed tests/procedures documented adequately
10. Patient only on allowed concomitant medications
11. Visits occurred within visit windows per protocol

Study Drug
12. Correct study drug/dosage dispensed and documented per protocol

Endpoints
13. All endpoints are captured and reported per protocol

Adverse Events
14. SAEs are reported to the sponsor and to the IRB within 24-48 hours of knowledge of the SAE, according to specific requirements
15. Subject-specific SAE log maintained in site source**

Comments:
Description of Deficiencies:
Action Items:
QA Auditor Signature:

*The ICF Checklist contains all elements required for the consent and consent process and is kept with the subject's signed ICF.
**The SAE log documents all SAEs and the reporting process for each SAE, including disposition.
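The yes/no data captured by a tool like this sample form can be tallied with a spreadsheet, a database, or a few lines of code. Below is a minimal sketch that computes per-indicator compliance rates across audit forms; the dictionary layout and the specific indicator numbers are illustrative assumptions, not a prescribed format.

```python
from collections import Counter

def compliance_rates(audits):
    """Percentage of 'yes' answers per indicator across completed audit forms.

    Each audit form is a dict mapping indicator number -> 'yes' or 'no'.
    """
    yes, total = Counter(), Counter()
    for form in audits:
        for item, answer in form.items():
            total[item] += 1
            yes[item] += (answer == "yes")
    return {item: 100.0 * yes[item] / total[item] for item in sorted(total)}

# Three hypothetical audit forms (only a few indicators shown)
audits = [
    {2: "yes", 3: "yes", 14: "yes"},
    {2: "yes", 3: "no",  14: "yes"},
    {2: "yes", 3: "yes", 14: "no"},
]
print({item: round(rate, 1) for item, rate in compliance_rates(audits).items()})
# -> {2: 100.0, 3: 66.7, 14: 66.7}
```

Because each indicator is a simple yes/no field, these rates map directly onto the check boxes of the audit tool and can feed the tables and graphs discussed in the CHECK step.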

sites to perform clinical studies. Having SOPs and QA processes is not a bureaucratic exercise but a necessary means to achieving excellence. Moreover, having a QA system should not be an optional part of any research plan; it is an ethical responsibility.

When implementing a peer-review QA system at the site, a form of governance should be outlined in the SOP to identify roles and assure that someone will spearhead the effort. The site manager, a committee manager, or a regulatory manager may make assignments to peer teams, but someone must be in charge of pushing the program forward. In any organizational system, the pitfalls of poor governance surface quickly: lack of focus, lack of decisiveness, lack of learning, and lack of credibility. (7) Therefore, a finalized QA SOP clearly would identify all corresponding roles, with an accountability flow that provides support through governance.

CHECK: What to do with results of QA audits

It is important to develop simple, time-efficient audit tools, such as forms collecting objective data in yes/no check-box fields. Spreadsheet or database software is essential for reporting and analyzing these data and identifying trends. Reports should display current and past results for feedback comparisons, using tables and graphs to display trends. All QA reports should be shared with the group and with management. In addition to tracking results from internal QA auditing, feedback from external monitors during routine site visits can provide additional information for periodic QA reports. Since the principal investigator (PI) ultimately is responsible for site operations, all site PIs should be heavily involved in reviewing QA reports. Monthly staff meeting agendas should incorporate QA topics to report on and discuss things done consistently right as well as any deviations, whether positive or negative.
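Comparing current with past results to catch negative trends can be as simple as diffing per-indicator compliance between two audit periods. The sketch below flags any indicator that slipped by more than a chosen number of points; the indicator names, percentages, and the 5-point threshold are hypothetical.

```python
def flag_trends(previous, current, threshold=5.0):
    """Compare per-indicator compliance (%) between two QA periods and
    flag any indicator that slipped by more than `threshold` points."""
    flags = []
    for item in sorted(set(previous) & set(current)):
        delta = current[item] - previous[item]
        if delta < -threshold:
            flags.append((item, previous[item], current[item], delta))
    return flags

# Hypothetical quarterly compliance percentages per indicator
q1 = {"consent version": 95.0, "visit windows": 90.0, "SAE reporting": 100.0}
q2 = {"consent version": 96.0, "visit windows": 78.0, "SAE reporting": 100.0}
for item, was, now, delta in flag_trends(q1, q2):
    print(f"{item}: {was:.0f}% -> {now:.0f}% ({delta:+.0f} points) - review at staff meeting")
# prints: visit windows: 90% -> 78% (-12 points) - review at staff meeting
```

The short flagged list, rather than the full report, is what a monthly staff meeting agenda item can act on.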
QA-related action items should focus on opportunities for improvement, improvement scores for prior negative deviations at the site, and the benefits of particular trainings. This QA reporting method provides site personnel with important measurable assessments related to performance, needed training, or valuable SOP edits.

ACT: Quality improvements and consistency

It is important to evaluate the QA system's effectiveness even while using it for process improvement and SOP evolution. Do a quarterly checkup on the QA system SOP as a whole. Management's commitment to the process, staff enthusiasm and dedication to QA tasks, and timeliness of reporting should be measured and discussed. The important thing is to have a system that remains operational; as illustrated in Example A, a system that looks good on paper may not work efficiently at the site.

SOPs describe a site's procedures, and a major pitfall of having SOPs is that a site that does not follow its own SOPs is setting itself up for regulatory non-compliance. Some sites avoid having documented SOPs to avoid scrutiny by sponsors and regulatory agencies. However, the benchmarking site demonstrates accountability to subjects and to fellow sites by stating its quality indicators and constructing a roadmap for getting the job done efficiently, effectively, and ethically. SOPs can be revised to be more operationally efficient, and sites can elect to start a QA system on a smaller scale, adding quality indicators to existing SOPs at annual SOP editing workshops. The essential element of continuous quality improvement is to have a method of consistently seeking ways to achieve adherence to SOPs and, hopefully, excellence.

Conclusion

"Quality is the weapon you hone to achieve speed. Quality cuts out repetition and slices off delays."
— W.L. Thompson (8)

If this quote is true, then it is essential that sites develop QA system SOPs and adhere to them consistently, using the results of audits to improve processes and identify educational opportunities for staff development. Even though site staff are time-challenged, this element of site operations is critical for long-run efficiency and is the only ethical means by which sites can operate. Study subjects should be able to enroll in trials feeling secure that their safety is being assured at each visit and that the data collected will be a quality means to the end: safe and effective new therapies.

Carolynn J. Thomas, RN, BSN, MSPH
Clinical Research Consultant
Birmingham, Alabama

T. Beth Dean, RN, CCRC
Research Nurse Coordinator, Site Consultant
University of Alabama at Birmingham
Birmingham, Alabama

Donna R. Fowler, RN, BSN, CCRC
Director, Office of Clinical Research
CardioVascular Associates, PC
Birmingham, Alabama

References
1. Harrison G. "Any Road." Brainwashed. Capitol Records, Inc.; 2002.
2. Deming WE. Out of the Crisis. Cambridge, MA: Massachusetts Institute of Technology, Center for Advanced Engineering Study; 1986.
3. W. Edwards Deming. Available at: www.pathmaker.com/resources/leaders/deming.asp. Accessed September 5, 2003.
4. Crosby PC. Quality Without Tears. New York, NY: McGraw-Hill; 1984.
5. Low M. Quality assurance at the clinical research site. Res Nurse. 1999;5:12-15.
6. ICH Harmonised Tripartite Guideline. Guideline for Good Clinical Practice E6. May 1, 1996. Available at: www.ich.org/mediaserver.jser?@_id=482&@_mode=glb. Accessed October 24, 2003.
7. Waife RS. Who's in charge here? Appl Clin Trials. April 2003:34-35.
8. Thompson WL. Clinical trials must be faster, more streamlined. The Monitor. 1999;13:25-28.