The Effect of Computer Automation on Institutional Review Board (IRB) Office Efficiency


Karl Oder and Stephanie Pittman
Rush University

ABSTRACT

Companies purchase computer systems to make their processes more efficient through automation. Some academic medical centers (AMCs) have purchased computer systems for their institutional review boards (IRBs) to increase efficiency and compliance with regulations. IRB computer systems are expensive to purchase, deploy, and maintain, so an AMC should expect a positive return on investment from the purchase of an IRB system. An IRB system should decrease the processing time for IRB applications and affect the number of IRB staff members necessary to handle IRB processes adequately. This study examined these and other factors at AMCs to determine the value of IRB computer systems.

INTRODUCTION

There is a lack of studies specific to the results of implementing a computer system to support IRBs. These systems are expensive and challenging to implement. The reward for this technology should be a more efficient operation that places less of an administrative burden on the investigators who use IRB services. The outcome of this project provides evidence that larger IRBs are aided by computerized systems.

IRBs need to process protocol submissions efficiently. Delays in processing submissions frustrate both investigators and research sponsors. In fact, for several years there has been a movement toward using a centralized IRB rather than a local IRB (Wechsler, 2007). One reason for this preference is that central IRBs are often quicker to process protocols than local IRBs (Wechsler, 2007). This is particularly true for multi-center studies, where a centralized IRB provides a single, more efficient review rather than multiple reviews by each local IRB associated with the project (Food and Drug Administration, 2011).

A study conducted by Whitney et al. (2008) surveyed principal investigators' (PIs') views of IRBs. The PIs offered both negative and positive responses about the IRB system: the negative responses portrayed the IRB process as a research burden, with investigators perceiving IRB processing of research protocols as slow and inefficient. Andrews et al. (2012) described how a local IRB was made more efficient, at no additional cost, through workflow redesign; the redesign was accomplished without adding computer systems, which undoubtedly would have increased the cost.

Computerized systems have been implemented in many industries to increase efficiency and eliminate process bottlenecks that can lead to employee or customer dissatisfaction. A few papers have described successful system implementations at research institutions to help track budgets and increase the productivity of research administrative offices. The Medical University of South Carolina (MUSC) effectively implemented a web-based budgeting system (Glenn & Sampson, 2011). The Mayo Clinic increased the quality of the services offered by its research administration offices by implementing a new pre-award and IRB system (Smith & Gronseth, 2011). A study from Imperial College London described the implementation of both a pre-award and a post-award system (Rutherford & Langley, 2007). All of these papers offer insights into the implementation of computerized research administration systems, but each is a case study of a single institution. They lack data from multiple institutions that would support the cost of purchasing and implementing new systems. Our study presents productivity data from multiple institutions that can be used to inform the decision to acquire and implement an IRB computer system.

The 2012 Metrics on Human Research Protection Program Performance report issued by the Association for the Accreditation of Human Research Protection Programs (AAHRPP) indicated that 90% of AAHRPP-accredited institutions reported using a database to track IRB protocols, while only 40% reported using an online system for the actual IRB review functions (AAHRPP, 2013). Additionally, the AAHRPP metrics showed that the use of computer systems for the distribution of materials "has increased consistently since 2009" (p. 15). A caveat to these statistics, however, is that the data come from institutions that meet the high standards for excellence in their human research protection programs required for AAHRPP accreditation. Therefore, these data may not be applicable to all IRB programs in academic medical centers.

METHODS

This study used a cross-sectional survey design. Data were collected via a 12-item survey instrument distributed to senior institutional review board (IRB) administrators at institutions belonging to the University HealthSystem Consortium (UHC), an organization comprising 120 academic medical centers and 299 of their affiliated hospitals whose mission is to facilitate performance improvement for member institutions.

Of the 120 UHC-member AMCs, contact information for a senior IRB administrator (defined as an individual within the IRB office with a job title equivalent to manager or above) was publicly available for 92 institutions. An email with a letter of introduction and a link to the survey in Google Docs was sent to the 92 identified IRB administrators. Two weeks after the initial contact, a reminder email and survey link were sent to the same cohort. The survey was kept open for a total of four weeks from the initial point of contact to maximize response rates after each of the two contacts. Responses were returned anonymously; however, respondents could include contact information at the end of the survey if they wished to receive a copy of the aggregate data as an incentive for participation. All research activities were reviewed and approved by the Rush University Medical Center Institutional Review Board.

RESULTS

The data were analyzed for consistency and statistical significance using a standard statistical software package. Two survey responses were removed from the analyzed data set because of inconsistent answers: both incorrectly stated or omitted the percentage of studies the organization sent to an external IRB for review. Table 1 provides the overall statistics for the survey, including the 41.3% response rate (38 of 92 surveys returned).

Table 1
Response Rate for Survey

  Total Surveys Sent       92
  Total Surveys Received   38
  Response Rate            41.3%

Table 2 presents the descriptive statistics for the data collected via the survey. The vast majority of respondents have computer systems to help automate the IRB process.

Table 2
Descriptive Statistics for Survey Responses

  Variable                                  N (%)
  Computer System for IRB         Yes      29 (76)
                                  No        9 (24)
  Institutions with External IRB  Yes      24 (63)
                                  No       14 (37)
  AAHRPP Accreditation            Yes      27 (71)
                                  No       11 (29)

  Variable                                            Median (25th, 75th Percentile)
  Information Services FTE (with computer system)     1.5 (1, 3)
  Total Number of Active Studies at the Institution   2100 (1275, 4000)
  Number of IRBs                                      4 (2, 4.25)
  Percent of Studies Sent to External IRB             5 (2, 13.75)
  Percent of Studies Sent to Full IRB Board Review    30 (24, 40)

  Variable                                    Mean ± SD
  IRB Staff (FTE)                             11.64 ± 7.07
  Percent Expedited Studies per Institution   57 ± 18
  Days to Process Expedited Studies           22.5 ± 13.5
  Days to Process Full Board Studies          40.2 ± 20.1

Comparison of IRB FTEs vs. IRB Computer Systems

Figure 1 plots the mean IRB FTEs for organizations with and without a computerized IRB system. A one-way ANOVA was used to compare IRB FTEs for organizations with (12.224 ± 6.487, n = 29) and without (9.778 ± 8.871, n = 9) a computer system. The analysis indicated no difference in the mean number of FTEs between organizations with and without a computer system (F(1, 36) = 0.819, p = 0.372).

Figure 1. Number of IRB Full-Time Equivalents (FTE) for Organizations with and without a Computer System
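The paper does not name its statistical package; as a hedged illustration, the same two-group comparison could be run in Python with scipy.stats.f_oneway. The FTE arrays below are simulated placeholders drawn near the reported group means and standard deviations, since the raw survey responses are not published.

```python
# Illustrative sketch (not the study's actual analysis or data): a one-way
# ANOVA comparing IRB FTEs between offices with and without a computer system.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

# Simulated FTE counts: n=29 offices with a system, n=9 without,
# drawn near the reported means and SDs.
fte_with = rng.normal(loc=12.224, scale=6.487, size=29)
fte_without = rng.normal(loc=9.778, scale=8.871, size=9)

f_stat, p_value = stats.f_oneway(fte_with, fte_without)
print(f"F(1, 36) = {f_stat:.3f}, p = {p_value:.3f}")
# With two groups, df = (1, n1 + n2 - 2) = (1, 36), matching the text.
```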

Comparison of IRB FTEs vs. Number of Studies

A Spearman correlation test was conducted on the data (see Figure 2). The results indicated a statistically significant positive relationship between IRB FTEs and number of studies in this sample (rs = .831, p < .001). The effect size for this correlation is large, suggesting that practitioners should take the relationship between IRB FTEs and number of studies into account when developing staffing models for their IRB offices.

Figure 2. Number of IRB FTEs vs. Number of Annual Studies
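A minimal sketch of the same rank-correlation test, again assuming Python/scipy; the paired values below are illustrative stand-ins for the survey data.

```python
# Illustrative sketch: Spearman's rank correlation between IRB staffing
# and annual study volume. Values are hypothetical, not the study data.
from scipy import stats

irb_ftes = [4, 6, 5, 10, 12, 15, 14, 22, 25]                  # staff FTEs
active_studies = [500, 900, 1500, 2100, 2400, 3200, 4000, 5200, 6000]

rho, p_value = stats.spearmanr(irb_ftes, active_studies)
print(f"rs = {rho:.3f}, p = {p_value:.4f}")
```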

Comparison of Number of Studies vs. IRB Computer Systems

Figure 3 depicts the difference in mean number of active studies for IRBs with and without an electronic system. The results of a Mann-Whitney U test (U = 89, p = 0.154) indicated no difference in the median number of studies between respondents without a computer system [1800 (425, 3750), n = 9] and those with a computer system [2500 (1600, 4300), n = 29].

Figure 3. Mean Number of Studies for Offices with and without a Computer System

Comparison of Number of IRBs vs. IRB Computer Systems

Figure 4 depicts the difference in mean number of IRBs for institutions with and without an electronic system. The results of a Mann-Whitney U test (U = 102.5, p = 0.161) indicated no difference in the median number of IRBs between respondents without a computer system [3 (1, 4.5), n = 9] and those with a computer system [4 (2, 4.5), n = 29].

Figure 4. Mean Number of IRBs for Offices with and without a Computer System
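Both comparisons above are two-sample Mann-Whitney U tests. A hedged sketch with simulated placeholder counts rather than the study's responses:

```python
# Illustrative sketch: Mann-Whitney U test comparing active-study counts
# for offices without (n=9) and with (n=29) a computer system.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
studies_without = rng.lognormal(mean=7.3, sigma=0.9, size=9).round()
studies_with = rng.lognormal(mean=7.8, sigma=0.8, size=29).round()

u_stat, p_value = stats.mannwhitneyu(studies_without, studies_with,
                                     alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")
```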

Comparison of IRBs Grouped by Number of Studies vs. IRB Computer Systems

The data were grouped into the study-volume categories defined by AAHRPP (see Table 3) (AAHRPP, 2013). A Mann-Whitney U test was performed on the categorized data to test the alternative hypothesis that an organization with a large number of studies is more likely to have a computer system. The test (U = 77, p = 0.0275) indicated a statistically significant difference in the categorized data between IRBs with and without a computer system. Figure 5 graphically depicts the increase in the percentage of computerized IRBs as a function of the number of protocols processed per IRB.

Table 3
Study Volume Categories

  Number of Studies   N (%)     % with Computer Systems
  1-100                0 (0)      0
  101-500              4 (10)    25
  501-1,000            3 (8)     67
  1,001-2,000         11 (29)    82
  2,001-4,000         11 (29)    82
  4,000+               9 (24)    89
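The grouping step amounts to binning each office's study count into the Table 3 ranges and then comparing the resulting ordinal categories between offices with and without a system. Below is a sketch of one plausible implementation with hypothetical offices; the authors' exact procedure is not described beyond "a Mann-Whitney test on the categorized data."

```python
# Illustrative sketch: bin study counts into the AAHRPP volume categories
# of Table 3, then compare ordinal category codes across the two groups.
import pandas as pd
from scipy import stats

bins = [0, 100, 500, 1000, 2000, 4000, float("inf")]
labels = ["1-100", "101-500", "501-1,000", "1,001-2,000", "2,001-4,000", "4,000+"]

# Hypothetical offices: annual study count and whether a system is in place.
df = pd.DataFrame({
    "studies":    [250, 600, 1800, 2100, 4300, 350, 900, 1500, 5200, 2600],
    "has_system": [0,   1,   1,    1,    1,    0,   0,   1,    1,    1],
})
df["category"] = pd.cut(df["studies"], bins=bins, labels=labels)

codes = df["category"].cat.codes  # ordinal codes 0..5
u_stat, p_value = stats.mannwhitneyu(codes[df["has_system"] == 0],
                                     codes[df["has_system"] == 1],
                                     alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")
```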

Percent IRBs with Computer Systems Research Management Review, Volume 20, Number 2 (2015) 100 90 80 70 60 50 40 30 20 10 0 1-100 101-500 501-1,000 1,001-2,000 2,001-4,000 4,000+ Number of Protocols Processed Annually by IRB Figure 5. IRBs Categorized by Protocol Volume vs. the Percent with a Computer System Comparison of Percent of IRBs with Computer Systems vs. IRBs with AAHRPP Accreditation Figure 6 compares the percentage of IRBs with computer systems with those that have AAHRPP accreditation. The chi square analysis for the association between having a computer system (yes, no) and AAHRPP accreditation (true, false) indicated no significant association between having a computer system and AAHRPP accreditation (χ 2 (1) = 1.377, p<.241). 100% 90% 80% 70% 60% 50% 40% 30% 20% 10% 0% Electronic System AAHRPP Accredited Yes No Figure 6. Percent of IRBs with Computer Systems and Percent of IRBs with AAHRPP Accreditation
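The association test can be reproduced from a 2x2 contingency table. The individual cell counts are not reported in the paper; the values below are a reconstruction that sums to the Table 2 marginals (29/9 for computer systems, 27/11 for accreditation) and reproduces the reported statistic.

```python
# Illustrative sketch: chi-square test of association between having a
# computer system and AAHRPP accreditation. Cell counts are reconstructed
# from the reported marginals, not taken directly from the paper.
import numpy as np
from scipy import stats

# Rows: computer system (yes, no); columns: AAHRPP accredited (yes, no).
observed = np.array([[22, 7],
                     [5,  4]])

# correction=False disables the Yates continuity correction so the result
# matches the uncorrected chi-square value reported in the text.
chi2, p_value, dof, expected = stats.chi2_contingency(observed, correction=False)
print(f"chi2({dof}) = {chi2:.3f}, p = {p_value:.3f}")  # chi2(1) = 1.377
```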

Comparison of Application Processing Time vs. IRB Computer Systems

The data were also analyzed to determine whether the processing time for expedited or full board studies was affected by the presence of a computer system. The difference in mean processing days for an expedited study (without a computer system, 23.8 ± 14.8, n = 8; with a computer system, 22.2 ± 13.4, n = 29) was examined. The data indicated that a computer system does not affect the mean number of days to process an expedited study (independent t(35) = 0.289, p = 0.775). The difference in mean processing days for a full board study (without a computer system, 43.56 ± 24.9, n = 9; with a computer system, 39.2 ± 18.8, n = 29) was also examined. The data similarly indicated that a computer system does not affect the mean number of days to process a full board study (independent t(36) = 0.566, p = 0.575).
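The reported degrees of freedom (35 and 36, each n1 + n2 - 2) indicate pooled-variance independent-samples t-tests. A sketch of the expedited-study comparison with simulated placeholder data:

```python
# Illustrative sketch: pooled-variance independent t-test on expedited-study
# processing days. Simulated data near the reported means and SDs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=2)
days_without = rng.normal(loc=23.8, scale=14.8, size=8)   # n=8, no system
days_with = rng.normal(loc=22.2, scale=13.4, size=29)     # n=29, with system

# equal_var=True (the default) pools the variances: df = 8 + 29 - 2 = 35.
t_stat, p_value = stats.ttest_ind(days_without, days_with, equal_var=True)
print(f"t(35) = {t_stat:.3f}, p = {p_value:.3f}")
```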

DISCUSSION

The null hypothesis for the study was that there is no difference in the mean number of IRB FTEs between organizations with and without a computer system. The results indicate that the presence of a computer system has no effect on the number of IRB FTE staff members. However, the statistical analysis of the effect of the number of studies on IRB FTE count revealed that organizations with more studies have higher FTE counts (see Figure 2).

This study also examined whether the number of studies influenced an organization's decision to have a computer system. Figure 3 seems to indicate that organizations with more studies are more likely to have a computer system, but the statistical analysis did not support that hypothesis; the large variation in the number of studies [median = 2100 (1275, 4000), n = 38] and the small sample size appear to have limited this analysis. To smooth the variation in the data, the data were placed into the categories defined by AAHRPP (AAHRPP, 2013). A statistical analysis of these categorized data revealed that the number of studies processed by an IRB was an indicator of whether an organization used an IRB computer system; Figure 5 graphically depicts this result. An IRB that processes more studies annually is more likely to have an IRB computer system.

The study results were limited by several factors. Because the survey sample included only IRBs belonging to the University HealthSystem Consortium, the association of an IRB with a UHC hospital may influence its ability to adopt computer systems for processing protocols. Additionally, because the sample was small and drawn from a single membership group, the results may not be indicative of the larger population of IRB offices. Therefore, future studies of the impact of computer systems on IRB staffing and efficiency should include a larger, more diverse population of IRB offices. Future studies should also collect more detailed information about the functions required of the computer system used by a particular institution. For instance, is the computer system used only to collect protocol information, or does it also support the workflow for processing review and approval of the protocol? This distinction would indicate the level of automation in the IRB office.

CONCLUSION

This study concluded that the number of staff in an IRB office is a function of the number of protocols the office processes. The study also found that IRB offices with more protocols are more likely to have a computer system. Further study is needed to determine whether these results apply to the larger population of IRB offices. A future study should also explore the impact of specific computer system functions (e.g., collection of protocol information, automated workflows, virtual meetings) on IRB office staffing levels and protocol processing times.

LITERATURE CITED

Andrews, J. E., Moore, J. B., Means, P., & Weinberg, R. (2012). An IRB transformation: Increasing quality and efficiency using existing resources. Journal of Research Administration, 43(2), 69–81.

Association for the Accreditation of Human Research Protection Programs (AAHRPP). (2013). 2012 metrics on human research protection program performance. Retrieved from http://www.aahrpp.org/tipsheetdownload.ashx?filename=2012 Metrics (2).pdf

Food and Drug Administration. (2011). Human subjects research protections: Enhancing protections for research subjects and reducing burden, delay, and ambiguity for investigators. Federal Register, 76(143), 44512–44531.

Glenn, J. L., & Sampson, R. R. (2011). Developing an institution-wide web-based research request and preliminary budget development system. Research Management Review, 18(2), 39–57.

Rutherford, S., & Langley, D. (2007). Implementation of systems to support the management of research: Commentary from a UK university perspective. Journal of Research Administration, 38(1), 49–60.

Smith, S. C., & Gronseth, D. L. (2011). Transforming research management systems at Mayo Clinic. Research Management Review, 18(2), 27–38.

Wechsler, J. (2007). Central vs. local: Rethinking IRBs. Applied Clinical Trials Online. Retrieved from http://www.appliedclinicaltrialsonline.com/appliedclinicaltrials/article/articledetail.jsp?id=401619

Whitney, S., Alcser, K., Schneider, C., McCullough, L., McGuire, A., & Volk, R. (2008). Principal investigator views of the IRB system. International Journal of Medical Sciences, 5(2), 68–72.

ABOUT THE AUTHORS

Karl Oder is an instructor in the Health Systems Management Department and an information technology manager at Rush University. He holds a master of science in engineering from the University of Illinois at Chicago and a master of science in research administration from Rush University. His main research interest is the use of information technology in academic medical centers.

Stephanie Pittman works in IRB administration at Edward-Elmhurst Healthcare in Naperville, IL. Previously, she was the IRB manager in the Office of Research Affairs at Rush University. She received her bachelor of science in biochemistry from the University of Illinois at Urbana-Champaign and her master of science in research administration from Rush University.