A white paper: Data Management Strategies




Data Management Strategies
Part 1: Before the Study Starts
By Jonathan Andrus, M.S., CQA, CCDM

Electronic Data Capture (EDC) systems should be more than just a means to an end. Quality EDC systems can enable the entire clinical trials information management process. But if the system enables, it is the data managers who drive. They are burdened with designing processes that make the transition from paper to EDC efficient while maintaining the integrity of the data. And while such efficiency sounds simple in theory, it requires extensive preparation prior to the start of an electronic clinical study.

Preparation is the most integral and significant function of a data manager, and electronic studies actually require more upfront preparation than paper studies. Because I believe preparation is so vital to the success of an electronic study, I have outlined several basic but vital best practices for the start-up phase of a clinical study. When these best practices are followed, the result is a more efficient study with far fewer errors and complications during the critical latter phases. Your diligence in implementing thorough preparation, including edit check specifications and standardized eCRF design, will improve the efficiency of your study and help you avoid some of the most common pitfalls even an experienced data manager can encounter.

Best Practice #1: Set realistic expectations.

It all starts with a plan, doubly so for an electronic study. World-leading personal-time-management expert Alan Lakein defined planning as "bringing the future into the present so that you can do something about it now." Whether approaching EDC for the first time or changing to a new EDC vendor, the most crucial element of the plan actually takes place before the plan: setting achievable goals is the most fundamental component of the study's start-up phase.
To frame realistic expectations, standard operating procedures (SOPs) must be updated, including identifying metrics and performance targets and performing a gap analysis between current SOPs and the requirements of the new system. Ideally, the performance targets set for EDC projects will be based on the sponsor's foundational reasons for switching to EDC, and should represent the first-level objectives for EDC projects. The next set of objectives can be developed during rollout of the EDC solution and should include feedback from all stakeholders.

Data management must also identify any additional metrics that may not be applicable to paper-based studies but will be used for EDC projects. Examples of EDC metrics may include average time for discrepancy resolution by site, average number and severity of help desk calls, and percent of EDC system downtime. Data management may also establish goals for EDC projects based on calculated ROI.

However, most organizations will find it necessary to modify their processes to accommodate EDC during the start-up phase. It should be expected that the start-up phase will be iterative, and therefore impacted by many variables. The development of a clear set of realistic expectations will be influenced by:
- Complexity of the projects implemented
- Variation between the projects
- Requirements for user training
- Type of EDC system implemented
- Number of staff affected by the EDC transition
- Preparation required by each site
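As a minimal sketch of how EDC-specific metrics like these might be tracked, the Python fragment below computes average discrepancy-resolution time by site from a query log. The record layout ("site", "opened", "closed") is an invented assumption for illustration, not any EDC vendor's export format:

```python
# Hypothetical sketch: computing one example EDC metric (average days from
# query open to close, per site) from a discrepancy log. The field names
# below are assumptions, not a real EDC system's export format.
from datetime import date
from collections import defaultdict

def avg_resolution_days(queries):
    """Average days from query open to close, grouped by site."""
    per_site = defaultdict(list)
    for q in queries:
        if q["closed"] is not None:  # skip queries that are still open
            per_site[q["site"]].append((q["closed"] - q["opened"]).days)
    return {site: sum(days) / len(days) for site, days in per_site.items()}

queries = [
    {"site": "101", "opened": date(2008, 3, 1), "closed": date(2008, 3, 5)},
    {"site": "101", "opened": date(2008, 3, 2), "closed": date(2008, 3, 4)},
    {"site": "102", "opened": date(2008, 3, 3), "closed": None},  # still open
]
print(avg_resolution_days(queries))  # {'101': 3.0}
```

Whatever form such reports take, the point is that these measurements only exist if the metrics are defined before the study starts.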

Setting realistic expectations also requires performing a gap analysis between current SOPs and the requirements of EDC. It is critical to determine how implementation of the EDC system will necessitate changes to the sponsor's current set of SOPs and other controlled documentation. Identifying these gaps is a technical and clinical operations effort that must be shared among all stakeholders. At a minimum, requirements should be written for each new process that has the potential to intersect study data. These requirements must later be tested and will form the basis of validation efforts. Both functional and business requirements must be developed: the former to test the overall functionality of the solution, and the latter to test how the solution meets the needs of the sponsor. Examples of procedures and processes for which requirements, testing, and validation should be performed include data entry, data verification, discrepancy management, data lock, user roles, and user security.

Setting lofty goals or best-in-class objectives is fine, too. But recognize them as such and don't overlook the fundamentals. A good system can make your trials run more smoothly, but inadequate processes, poor communication, or inappropriate goals will surely make everyone's lives miserable.

Best Practice #2: Create a training program to address change.

Even a perfect plan can fail if your staff is not up to speed. If it takes a village to raise a child, then it takes a village of well-trained and educated clinical staff to effectively conduct an electronic study. But it is the responsibility of the data manager to create effective training materials. User training on both the system and the study setup within the system is essential. The data manager in charge needs to thoroughly understand the workings of the EDC system in order to assign functional requirements to the appropriate personnel.
Tasks done in a paper environment, such as verification, locking, data review, safety review, and investigator signatures, must be mapped to the EDC environment and documented. This type of training needs to be completed before performing the SOP gap analysis. Emphasis on user training can vary, but at a minimum, each user with the ability to enter or modify study data, issue queries, and electronically sign within the EDC system should have documented training on basic system functionality, such as logging on, opening an eCRF, entering data, and responding to a query. Regardless of experience, each user should have documented training prior to being granted access to the live system. User training can be provided through:
- A generic trainer, self-read of materials, and a test of competency using sample cases in a training environment
- Web-based instruction or a demo
- Training CRAs to provide system training during site visits (i.e., training the trainer)
- Training cases that are generic or customized to the study-specific workflow

Experience has proven that it's best for the data manager to conduct a review of the data entered and actions taken by study coordinators shortly after first use of the system to minimize potential errors. This step is often overlooked, even though it supplies immediate, customized feedback to each user about how they use the system and how to better support data collection. It should be used more frequently. As the data manager, you should evaluate your staff to decide which combination of training methods will be the most effective. For instance,

users with read-only access may need only minimal instruction, while a study coordinator may need in-depth training on most system functions. Users who perform the majority of study tasks should also be trained on applying the functionality of the EDC system to the study-specific workflow. For example, if a study allows for the capture of unscheduled visits but must apply insert-page functions, the user entering this data must understand that a page or a visit block must be inserted or activated in a certain location.

It is critical that the data manager decide who will be responsible for training staff and maintaining training records throughout the duration of the study. Additionally, the data manager needs to consider how often training will be required. Continuing education for clinical organization staff is imperative to ensure that your staff is well educated not just in the EDC software, but also in the processes and communication procedures that are expected.

Best Practice #3: Plan studies to avoid last-minute system modifications and to ensure adequate attention to the collection of safety data.

The resource that always seems to be in short supply is time. It is imperative that you manage it well, for yourself as well as your staff. This can be overlooked during the study start-up phase, when the data manager needs to juggle staff training and everything that study planning encompasses. Plan the time to clearly and specifically lay out your forms and the logic behind them. With paper studies, this type of logic could wait, as the only people affected by it were your internal staff. With EDC, the sites are directly impacted by the thoroughness and completeness of your study logic. By maximizing preparation, the data manager can significantly decrease the number of errors, revisions, and queries that will affect the end phase of the trial, and set the study on a more efficient path. A study that is front-loaded with edit checks will run more efficiently.
It will guide data entry better and provide data managers the unique opportunity to resolve data issues by interacting directly with clinical site coordinators. With the ability to see data as it is entered, the data manager can resolve an issue with site staff by phone. This type of direct contact with the site staff also promotes an active approach to completing an eCRF and its edit checks together. Moreover, during study development, data managers can truly collaborate with database developers when programming edit checks. This collaboration is essential to the edit checks functioning correctly in the EDC system. Data managers should:
- Develop edit check specifications concurrently with eCRF specifications
- Finalize the study protocol prior to finalizing the study's edit checks

One of the most common mistakes is moving forward with a study without first finalizing the electronic case report forms (eCRFs). To ensure suitable attention to the collection of safety and efficacy data, data managers should design a standardized and efficient eCRF before any data are collected. Doing so will facilitate data exchange, enable the merging of data between studies, increase efficiency in the processing and analysis of clinical data, and enhance monitoring activity and investigator staff efficiency. If paper CRFs are available, modifications will be needed to implement those CRFs in an EDC system. The data manager should review the forms and decide how they should be represented electronically. The use of drop-down lists, yes/no buttons, and conditional fields can eliminate the need for many checks, but these elements should be addressed early on.
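To make the conditional-field idea concrete, here is a minimal Python sketch of how a field definition might replace an edit check: a comment field is enterable only when medical history is abnormal, so no "comment present but history normal" query can ever arise. The spec format and field names are invented for illustration, not any EDC vendor's configuration:

```python
# Hypothetical eCRF field definitions: a drop-down ("select"), a yes/no
# button, and a comment field conditional on the medical history result.
# This spec format and these field names are invented for illustration.
FIELDS = {
    "mh_result": {"type": "select", "choices": ["normal", "abnormal"]},
    "sae_reported": {"type": "yesno"},
    "mh_comment": {
        "type": "text",
        # Enterable only when medical history is abnormal, so a separate
        # "comment present but history normal" edit check is unnecessary.
        "enabled_if": lambda record: record.get("mh_result") == "abnormal",
    },
}

def enterable_fields(record):
    """Fields the site can complete, given what has been entered so far."""
    return [
        name for name, spec in FIELDS.items()
        if spec.get("enabled_if", lambda r: True)(record)
    ]
```

In an EDC system the same rule would typically be expressed in the vendor's form designer rather than in code, but the design decision, field logic decided at CRF creation instead of cleaned up by queries later, is the same.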

Since the EDC system output will go to the programming group, it is also important to involve them in this process. Field names, lengths, types, and codelists should all be decided during the CRF creation phase. It is the role of the data manager to facilitate a balance between designing forms that accommodate efficient data entry and providing usable output for the statisticians. By developing a standard eCRF, the data manager provides capabilities to a study that are not traditionally available when paper-based CRFs are used. Unfortunately, industry-wide standards for eCRFs are still developing. Any of the leading EDC systems should be flexible enough to capture the questions as they would appear on a paper CRF. But it is crucial that the data manager lead the charge for the design and functionality of eCRF standards.

The following real-life scenario illustrates some of the best practices I've discussed. Company A had standard paper CRFs and insisted that the electronic CRFs exactly mirror them. This became problematic because many of the features of the EDC system that ease data entry went unused. Many annotated CRFs are designed to pool multiple studies together and carry extraneous information that must be added to the back end of the EDC system but never gets used. For example, a codelist for "Reason for not completing the study" contains ten items, while the CRF for a particular study only needs to display five. This makes sense in a paper-based study, since the codelist can be applied after the data is entered. But for an EDC study, where the codelist should represent the data on the CRF, this doesn't make sense.

Companies that are accustomed to paper-based CRFs often see many advantages with data collected electronically. Since there is no need to save paper by showing multiple types of data on one paper page, the status of each form is reliable and complete.
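The codelist scenario above can be sketched in a few lines of Python: a shared master codelist is trimmed to the subset the study's eCRF actually displays, and entries are validated against that subset rather than the full pooled list. The codes and labels below are invented for illustration:

```python
# Hypothetical sketch of the codelist scenario: a pooled master codelist
# for "Reason for not completing the study" versus the five-item subset
# one study's eCRF displays. Codes and labels are invented examples.
MASTER_CODELIST = {
    1: "Adverse event", 2: "Lost to follow-up", 3: "Withdrew consent",
    4: "Protocol violation", 5: "Lack of efficacy", 6: "Death",
    7: "Physician decision", 8: "Pregnancy", 9: "Site closed", 10: "Other",
}
STUDY_CODES = {1, 2, 3, 5, 10}  # the five reasons this study's form shows

def study_choices():
    """Drop-down options for the eCRF: only the study-relevant subset."""
    return {code: MASTER_CODELIST[code] for code in sorted(STUDY_CODES)}

def validate_reason(code):
    """An entered code must be one the form actually displays."""
    return code in STUDY_CODES
```

The paper-era habit of carrying the full ten-item list into the EDC back end is exactly the extraneous baggage the scenario warns against; in an EDC study the codelist and the form should match.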
Many edit checks can be eliminated by using EDC features. For example, the check ensuring there is no comment when medical history is normal can be eliminated by only allowing entry in the comment field when "abnormal" is indicated. Pages will no longer need to be tracked or logged in by hand; the EDC system can provide a count of pages entered, verified, locked, and so on. Data will be cleaner, because the site will be aware of how the data manager expects to see the data. This is done by using edit checks as a data entry guide: the site will know immediately, as data is entered, whether it is of the correct format or type and whether it is outside a given reference range and requires an additional explanation.

In conclusion, I would never suggest that the success of a study should be based on luck, but I cannot help but think of the words of golf legend Arnold Palmer, who said, "The harder I work, the luckier I get." I have learned that experience in paper-based studies does not automatically translate into expertise with electronic studies. It is especially important for data managers, even those experienced in paper studies, to work harder when deploying a new system. By taking extra precautions prior to and during the study start-up phase, data managers will significantly increase the efficiency of the study and decrease the risk of potential errors. Data managers approaching EDC for the first time frequently operate under the notion that their workload will be diminished in an electronic study as compared to a paper study. These best practices are offered here to illustrate that, contrary to common belief, workload is not always diminished; in fact, preparation is even more extensive for electronic studies. And the first few times one uses any new system will very likely entail more work than the old tried and true. The big payoff of more efficiently run trials comes after these processes are refined.
I hope that you can use these practices to find yourself successful and lucky in all your upcoming studies.

Overheard: Top Five Warning Signs

1. "You don't need to do any UAT, because the vendor tests the system."
2. "The sites only need a quick 15-minute orientation on the system and they'll be fine."
3. "Moving forward with the development of study logic/edits without first finalizing the electronic case report forms shouldn't be a problem."
4. "We can just convert our paper forms to eCRFs. We don't have the time to optimize them or figure out how the sites will react at this point."
5. "Any decent site will likely have Internet access. I don't think we have to qualify them before the study starts."

901 East Eighth Avenue, Suite 201, King of Prussia, PA 19406
P: 866.737.4332
www.phoenixdatasystems.net
© 2008 Phoenix Data Systems, Inc. All rights reserved.