Measuring the Performance of Telephone-Based Disease Surveillance Systems in Local Health Departments




David J. Dausey, PhD, Anita Chandra, DrPH, Agnes G. Schaefer, PhD, Ben Bahney, MPIA, Amelia Haviland, PhD, Sarah Zakowski, BA, and Nicole Lurie, MD, MSPH

Objectives. We tested telephone-based disease surveillance systems in local health departments to identify system characteristics associated with consistent and timely responses to urgent case reports.

Methods. We identified a stratified random sample of 74 health departments and conducted a series of unannounced tests of their telephone-based surveillance systems. We used regression analyses to identify system characteristics that predicted fast connection with an action officer (an appropriate public health professional).

Results. Optimal performance in consistently connecting callers with an action officer in 30 minutes or less was achieved by 31% of participating health departments. Reaching a live person upon dialing, regardless of who that person was, was the strongest predictor of optimal performance both in being connected with an action officer and in consistency of connection times.

Conclusions. Health departments can achieve optimal performance in consistently connecting a caller with an action officer in 30 minutes or less, and they may improve performance by using a telephone-based disease surveillance system in which the phone is answered by a live person at all times. (Am J Public Health. 2008;98:1706–1711. doi:10.2105/AJPH.2007.114710)

Performance measurement and improvement in US health departments has received increased attention in recent years. Several factors, including the development of the National Public Health Performance Standards,1 increased interest in the accreditation of health departments,2 and the need to measure and report to Congress and the public on progress made in public health preparedness, have contributed to this attention.3 The goal of Congress and others is to augment the level of accountability in the public health system while supporting a process for quality improvement.4

Disease surveillance is a high priority in public health practice, yet it often lacks adequate performance measurement and improvement strategies.5 Assessing information about threats to public health, including those caused by infectious disease, and ensuring that adequate services are provided to meet these threats are core functions of health departments.6 The telephone-based disease surveillance (TBDS) systems that local health departments have in place to receive case reports from the field are among the first lines of defense in identifying these threats.

For more than 20 years, the Centers for Disease Control and Prevention (CDC) has provided health departments with guidelines to evaluate the performance of their TBDS systems.7,8 In 2003, the CDC expanded its guidelines and developed performance standards to evaluate the ability of health departments to receive urgent case reports 24 hours a day, 7 days a week.9 These standards, although not binding performance obligations, emphasized the need for TBDS systems to consistently receive urgent case reports in a timely manner. Because of concerns about the reliability of self-assessments not based on test results, the CDC encouraged health departments to regularly test their TBDS systems to assess their compliance with these standards. The standards, however, were not accompanied by guidance on how health departments should measure their performance, and it was unclear at the time whether the goals were achievable.

To address this gap, Dausey et al. developed a method to assess whether local health departments could meet these standards and pilot-tested it in a convenience sample of 19 health departments.10 The pilot tests found dramatic variations both in the response capabilities of TBDS systems and in their structure.11 These findings suggested that certain types of TBDS systems may perform better than others. They also raised questions about whether TBDS systems could consistently achieve optimal performance as outlined by the CDC and whether quality improvement in these systems was possible.

No research has described how health departments might improve their performance in receiving and responding to urgent case reports or which components of TBDS systems contribute to better performance. Literature does exist on telephone response systems in other sectors that operate 24 hours a day, 7 days a week, ranging from emergency medicine to environmental hazard control. For example, studies have examined the effectiveness of the emergency response infrastructure in these areas,12-15 as well as the evaluation of emergency response in the field of emergency management.16,17 Factors found to be associated with successful response in other sectors include structuring the system so that callers reach a live person, using a single telephone number instead of multiple numbers, building redundancies into the system in case of failure, requiring telephone operators to go through extensive training, and using formal protocols for call triage.

We sought to identify the characteristics of TBDS systems associated with the ability of health departments to meet the CDC's standard requiring that all urgent case reports be connected to a trained public health professional in 30 minutes or less. We tested the TBDS systems of a random sample of 74 local health departments from across the United States.

METHODS

Sample
We used data from the National Association of County and City Health Officials directory of health departments,18 merged with data from the US Census Bureau, to construct a sampling frame of all local health departments.

Our previous work suggested that very small health departments (i.e., those serving fewer than 7200 people) are fundamentally different from their larger counterparts.19,20 We therefore excluded 369 very small health departments (which together covered 0.05% or less of the total US population), giving us a target population of 2095 health departments. We created region-size strata by dividing this population into the 4 US census regions (Northeast, South, Midwest, West) and into 4 population-size categories, small (7200–149 250 people), medium (149 251–465 000), large (465 001–1 145 000), and extra large (more than 1 145 000), such that 25% of the US population was served by health departments in each size category. We selected 100 health departments for our sample by simple random sampling within each stratum. An equal number of health departments was selected in each of the 4 population-size categories (n = 25), with the number selected in each region varying proportionally to the population of the region. In the resulting sample, each selected health department represented health departments covering an equal fraction (1/100th) of the population of interest; in the largest population-size category, a selected health department could represent only itself, whereas in the smallest category, a selected health department might represent as many as 60 other health departments. We replaced health departments that we could not contact after 4 attempts, or that declined to participate, with another randomly selected health department from the same region-size stratum. (A code sketch of this stratified draw appears below, after the Data Collection subsection.)

Data Collection
We adapted our test-call method from Dausey et al.10 Prior to test calling, we obtained consent from health department directors and conducted a short interview to obtain information on their TBDS systems. We asked the consenting health department officials not to share the details of the project with their staff, to ensure that test calls were not anticipated and that we could assess realistic call connection times.

To assess whether health departments could connect medical personnel to an action officer, defined as a public health professional such as a public health physician, nurse, or epidemiologist, a trained test caller contacted participating health departments, asserting that he or she was a doctor or nurse at a local health care facility calling with an urgent case report regarding an infectious disease. Sample caller scripts can be found elsewhere.10 Callers were instructed to respond to inquiries about cases (prior to reaching an action officer) by saying that the case was confidential and that specific case information could be provided only to the action officer.

If a department responded in more than 30 minutes to either all or none of its first 5 test calls, it received no more calls, because statistical calculations from previous research indicated a low probability that additional calls would yield different results (the probability that the sixth test call result would differ was approximately P = .003).11 All other departments received 10 test calls; this stopping rule is also sketched in code below. Calls were placed both during business hours (Monday to Friday, 8 AM to 5 PM local time) and after hours (all other times) from May to October 2006.
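As promised above, here is a minimal sketch of the region-by-size stratified draw and the resulting design weights. This is our illustration, not the authors' code: the column names (region, size_cat, population), the file name, and the rounding of regional allocations are all assumptions.

```python
# Sketch of the region-by-size stratified sample described under Sample.
import pandas as pd

frame = pd.read_csv("lhd_frame.csv")  # hypothetical: one row per eligible health department

sample_parts = []
for size_cat, size_grp in frame.groupby("size_cat"):
    # 25 departments per population-size category, allocated across census
    # regions in proportion to regional population (rounded for simplicity;
    # the study fixed the total at exactly 25 per category).
    region_pop = size_grp.groupby("region")["population"].sum()
    alloc = (25 * region_pop / region_pop.sum()).round().astype(int)
    for region, n in alloc.items():
        stratum = size_grp[size_grp["region"] == region]
        sample_parts.append(stratum.sample(n=min(n, len(stratum)), random_state=0))

sample = pd.concat(sample_parts)

# Design weight: how many frame departments each sampled department
# represents within its region-size stratum (N_stratum / n_sampled).
stratum_sizes = frame.groupby(["region", "size_cat"]).size().rename("N_stratum")
sample = sample.join(stratum_sizes, on=["region", "size_cat"])
n_sampled = sample.groupby(["region", "size_cat"])["region"].transform("size")
sample["weight"] = sample["N_stratum"] / n_sampled
```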
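The adaptive stopping rule is similarly easy to pin down in code. A minimal sketch, with our own function name and data layout:

```python
def calls_to_place(first_five_connect_times_min, threshold_min=30):
    """Apply the adaptive stopping rule to one health department.

    Each entry is the minutes taken to reach an action officer on one of
    the first 5 test calls; None means the call was never connected and is
    treated as exceeding the threshold. Returns the total number of test
    calls the department receives: 5 if the first 5 calls were uniformly
    fast or uniformly slow (a sixth call would differ with probability of
    roughly .003, per the study's earlier work), otherwise 10.
    """
    over = [t is None or t > threshold_min for t in first_five_connect_times_min]
    if all(over) or not any(over):  # uniformly slow or uniformly fast
        return 5
    return 10                       # mixed results: place 5 more calls

# Example: four fast connections and one 45-minute response -> 10 calls total
print(calls_to_place([4, 12, 45, 8, 20]))  # prints 10
```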
Measures
For each test call, we recorded measures of the characteristics of the TBDS system that we had identified through our literature review and previous work, as well as the TBDS system components currently required by CDC standards. These measures included whether the caller initially reached a live person or an automated system, whether the caller had to hang up and dial a second number before reaching an action officer, and whether the caller was transferred to the action officer with a warm transfer (i.e., immediately transferred to an action officer) or required a callback (i.e., left a number to be called back). We also identified whether the phone number called was the department's general number, a general communicable disease line, or a dedicated all-hours health department line for urgent case reports. In instances in which departments had multiple numbers, we placed calls to all of them.

Our measures of system performance were developed to capture, for each call, whether a connection to an action officer occurred and the speed of that connection, and to provide a benchmark for evaluating the consistency of health department connection times. These measures included whether the caller was connected to an action officer in 30, 60, or 240 minutes or less, or not connected at all. We aggregated these call-level measures, such as the average time to call connection, to the health department level. From the call-level findings, we categorized health departments as excellent if all calls were connected in 30 minutes or less (the CDC standard), fair if 1 or more calls took more than 30 minutes but none took more than 240 minutes, and poor if 1 or more calls took more than 240 minutes or was never connected; this categorization is sketched in code at the end of this subsection.

At the conclusion of the test calls, we interviewed the directors of 5 health departments that answered all calls in 30 minutes or less to obtain their perspectives on what may have contributed to the optimal performance of their TBDS systems.
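A minimal sketch of the excellent/fair/poor categorization defined above; the function name and input layout are ours, while the thresholds mirror the definitions in the text:

```python
def categorize_department(connect_times_min):
    """Categorize one health department from its test-call connection times.

    connect_times_min: minutes to reach an action officer on each call,
    with None meaning the call was never connected.
      excellent - all calls connected in <= 30 min (the CDC standard)
      fair      - 1+ calls took > 30 min, but none took > 240 min
      poor      - 1+ calls took > 240 min or were never connected
    """
    if any(t is None or t > 240 for t in connect_times_min):
        return "poor"
    if any(t > 30 for t in connect_times_min):
        return "fair"
    return "excellent"

print(categorize_department([5, 10, 28, 3, 15]))         # excellent
print(categorize_department([5, 90, 28, 3, 15, 7, 40]))  # fair
print(categorize_department([5, None, 28, 3, 15]))       # poor
```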

Data Analysis
We placed 596 calls to 74 health departments. We began our analysis by comparing the percentage of calls connected with an action officer in 30 minutes or less at each health department with the percentage predicted by the department director before testing. We analyzed the determinants of call connection at both the health department level and the individual call level. We used sampling weights to calculate descriptive statistics and to report the data in a nationally representative manner.

Our analyses at the health department level allowed us to determine which factors contributed to a fast connection with an action officer (call connection) and which factors were associated with the ability of a health department to consistently connect the test caller with an action officer from call to call (call consistency). The individual-call-level analysis allowed us to further examine which call-level factors predicted the amount of time to reach an action officer and the probability of ever reaching an action officer, and to investigate whether these factors differed between the business day and after hours.

To assess call connection, we modeled the mean time to connect to an action officer at the health department level, and the associations between TBDS system variables and connecting with an action officer in 30 minutes or less (or not connecting at all) at the call level. We used linear ordinary least squares regression for the health department model and logistic regression with random effects specified at the health department level for the call-level models. For the call-level outcome models, we also analyzed our data separately by time of call initiation (business hours vs after hours). The call-level models used the full set of test calls in the sample; health departments whose performance was inconsistent in the first 5 calls received more calls, providing more variability in the outcome within those health departments.

To assess call consistency at the health department level, we used a multinomial logit regression. We tested determinants of excellent consistency (connection in 30 minutes or less for all test calls), fair consistency (1 or more calls taking 31–240 minutes), and poor consistency (1 or more calls in which the action officer was reached in more than 240 minutes or was never reached).
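The article does not name the software used for these analyses. As a rough illustration only, the sketch below shows analogous specifications of the three models in Python with statsmodels; all column and file names are hypothetical, and the variational Bayes approximation used here for the random-effects logistic model may differ from the authors' estimator.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

df_dept = pd.read_csv("departments.csv")  # hypothetical: one row per health department
df_call = pd.read_csv("calls.csv")        # hypothetical: one row per test call

# 1. Department level: OLS on mean time to connect with an action officer.
ols_fit = smf.ols(
    "mean_connect_min ~ pct_live + pct_automated + pct_multi_number + large_pop",
    data=df_dept,
).fit()

# 2. Department level: multinomial logit on the consistency category
#    (coded 0 = excellent, 1 = fair, 2 = poor).
mnl_fit = smf.mnlogit(
    "consistency ~ pct_live + pct_automated + pct_multi_number + large_pop",
    data=df_dept,
).fit()

# 3. Call level: logistic regression on connection within 30 minutes (0/1)
#    with a department-level random intercept, fit here by variational Bayes.
re_logit = BinomialBayesMixedGLM.from_formula(
    "connected_30 ~ live_person + automated + cd_line + all_hours_line",
    vc_formulas={"dept": "0 + C(dept_id)"},
    data=df_call,
).fit_vb()

print(ols_fit.summary())
print(mnl_fit.summary())
print(re_logit.summary())
```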
RESULTS
We contacted 124 health departments. Of those, 25 did not respond to repeated attempts to contact them, 4 had recently merged with another health department and were no longer responsible for handling urgent case reports, 3 agreed to participate but could not do so within the study time window, and 18 declined to participate. The resulting sample consisted of 74 health departments (response rate = 62%). Nonresponse weights were developed to account for slight differentials in nonresponse rates across strata; they were employed for the descriptive statistics but not in the regression models.

Before conducting our test calls, we asked health department directors to predict the percentage of calls in which they thought our test caller would connect with an action officer in 30 minutes or less. Figure 1 shows the percentage of directors who accurately predicted, overestimated, or underestimated their actual call connection according to our test-call data, as well as the percentage of directors who stated that they did not have enough information to predict their performance. Forty-seven percent of department directors (n = 35) predicted that their performance would be substantially better than it actually was, 24% (n = 13) predicted their performance fairly accurately, and 13% (n = 14) underestimated their performance.

FIGURE 1. Number of local health departments under- or overestimating their call connection success (i.e., connecting calls within 30 minutes): United States, 2006.

Call Connection and Consistency at the Health Department Level
Among the calls that were responded to at all, the average time health departments took to connect a test caller with an action officer was 63 minutes (range = 0–1003 minutes). Taking the median connection time for each health department and averaging it across all agencies, the mean of the health department median times was 8 minutes, reflecting the influence of outliers on the call-level mean. Nearly 40% of health departments (n = 28) had 1 or more calls that ended without ever connecting with an action officer (Table 1).

TABLE 1. Predictor and Outcome Variables in Performance Evaluation of Telephone-Based Disease Surveillance Systems in Local Health Departments: United States, 2006

Predictors
  Percentage of calls connected to a live person, mean (SD)            0.75 (0.20)
  Percentage of calls responded to by an automated system, mean (SD)   0.13 (0.22)
  Percentage of calls requiring callers to call > 1 number, mean (SD)  0.18 (0.16)
  Call placed to general health department line,^a %                   19.0
  Call placed to communicable disease line,^a %                        33.0
  Call placed to 24/7 response line,^a %                               48.0
Outcomes
  Time to connect call, min, mean (SD)                                 63.11 (157.16)
  All calls connected in ≤ 30 min, %                                   30.1
  All calls connected in ≤ 60 min, %                                   36.5
  All calls connected in ≤ 240 min, %                                  48.3
  Call connected in ≤ 30 min,^a,b %                                    77.9
  Call connected,^a,c %                                                91.4
  Consistency^d
    Excellent, %                                                       30.1
    Fair, %                                                            36.1
    Poor, %                                                            33.7

Note. All results are means at the local health department level (n = 74) unless otherwise noted (call level, n = 596). All department-level results were weighted with sampling weights to be representative of the nationwide sampling pool.
^a Measured at the call level.
^b For all calls. When we analyzed only the first 5 calls to each local health department, 78.14% were connected in 30 minutes or less and 21.86% were not.
^c For all calls. When we analyzed only the first 5 calls to each local health department, 92.08% of calls were connected and 7.92% were not.
^d Excellent consistency = all calls connected in 30 minutes or less; fair consistency = 1 or more calls connected in more than 30 minutes, but none in more than 240 minutes; poor consistency = 1 or more calls connected in more than 240 minutes or never connected.

We analyzed factors hypothesized to influence faster mean call connection time (Table 2). We found that for each 10% increase in the share of calls fielded by a live person (instead of a machine), there was an average reduction of approximately 37 minutes in response time (P < .01). For example, health departments that fielded 20% more of their test calls with a live person had an average connection time approximately 74 minutes shorter (P < .01). The timeliness of call connection with an action officer did not differ between systems that required callers to call more than 1 telephone number before reaching an action officer and those that were automated. After preliminary bivariate analyses, we included the 2 largest population-size categories in the model to compare them with the 2 smallest, but this attribute was also not significantly associated with average connection time.

We also tested which system characteristics were associated with call consistency. Having a live person answer the phone first was a significant predictor of positive and consistent outcomes (Table 2) and significantly decreased the probability that a health department would be categorized as having poor consistency. Health departments in which 1 more test call out of 10 connected first with a live person had a 43% reduction in the likelihood of being rated as having poor consistency (P < .01).

TABLE 2. Characteristics Associated With Call Connection Time and Call Consistency in Telephone-Based Disease Surveillance Systems in Local Health Departments: United States, 2006

                                              Call Connection:^a              Call Consistency:^b
                                              Average Difference in           Excellent vs Fair,    Fair vs Poor,
                                              Connection Time, min (95% CI)   RRR (95% CI)          RRR (95% CI)
Calls responded to by a live person^c         -32.15** (-50.3, -14.0)         0.94 (0.63, 1.38)     0.57** (0.39, 0.84)
Calls responded to by an automated system^c   -13.94 (-28.9, 1.0)             1.28 (0.96, 1.72)     1.12 (0.82, 1.53)
Calls requiring callers to call > 1 number^c  -14.89 (-37.4, 7.7)             1.22 (0.79, 1.90)     0.76 (0.49, 1.18)
Population served > 465 000^d                 -50.21 (-120, 19)               1.95 (0.58, 6.58)     0.92 (0.26, 3.20)
Constant                                      373.4** (214, 533)              0.48 (0.01, 20.60)    74.51* (2.57, 2163.60)

Note. CI = confidence interval; RRR = relative risk ratio. Analyses were conducted at the local health department level; n = 74.
^a R-squared = 0.223.
^b Excellent consistency = all calls connected in 30 minutes or less; fair consistency = 1 or more calls connected in more than 30 minutes, but none in more than 240 minutes; poor consistency = 1 or more calls connected in more than 240 minutes or never connected. Call consistency coefficients are reported as relative risk ratios for 10% changes in the predictors. For example, a 10% increase in calls responded to by a live person was associated with a 43% (1.00 - 0.57 = 0.43) reduction in the probability of having poor versus fair consistency.
^c Coefficients should be interpreted as the result of a 10% change in the independent variable.
^d Large and extra-large population categories. In this analysis we collapsed the small and medium population categories and the large and extra-large categories.
*P < .05; **P < .01.

Call Connection at the Call Level
Table 3 presents information on factors contributing to connection time at the call level, overall and separately for calls made during business hours and after hours. For all calls, having a person answer the phone increased the odds of having a call connected in 30 minutes or less and of having a call connected at all (both results, P < .01). For calls placed during business hours (n = 391), we again found that systems using a live person to answer the phone connected callers with an action officer faster (P < .01).

After hours (n = 205), having a call answered by a live person was a strong predictor of the call being connected in 30 minutes or less; calls connected to a live person had 6 times the odds of being connected with an action officer in 30 minutes or less compared with calls placed through other (nonlive) systems (P < .01). The use of an automated system after hours contributed to poor call connection; calls to an automated system had one tenth the odds of ever being responded to by an action officer compared with calls placed to other types of systems (P < .05).

In these models, we included both call-level independent variables and department-level random effects, which allowed us to estimate the degree to which the outcomes varied within health departments compared with between health departments (as indicated by the ρ values in Table 3, which measure the intercluster correlation). We found little of the variability in timely call response during business hours to be between health departments; rather, much of the variability in this outcome was within departments. By contrast, for calls made after hours, nearly half the variation for both outcomes was between health departments, suggesting much more similar outcomes within departments. Thus, at all hours, a large component of call connection success was determined at the department level; the department determined a large component of call timeliness only after hours.

TABLE 3. Characteristics Associated With Individual Call Connection During Business Hours and After Hours in Telephone-Based Disease Surveillance Systems in Local Health Departments: United States, 2006

All calls (n = 596)                       Connected in ≤ 30 min, OR (95% CI)   Connected, OR (95% CI)
  Presence of live person^b               4.90** (3.02, 7.95)                  7.06** (3.54, 14.35)
  Presence of automated system^c          0.62 (0.31, 1.24)                    0.50 (0.20, 1.24)
  Caller used communicable disease line   1.20 (0.59, 2.42)                    2.15 (0.74, 6.16)
  Caller used 24/7 line                   1.00 (0.50, 2.00)                    1.32 (0.48, 3.63)
  Intercluster correlation (ρ)            0.15 (0.06, 0.35)                    0.32 (0.15, 0.56)

Calls during business hours^a (n = 391)
  Presence of live person^b               5.15** (2.67, 9.93)                  8.09** (2.64, 24.80)
  Presence of automated system^c          0.94 (0.44, 2.02)                    1.30 (0.31, 5.41)
  Caller used communicable disease line   2.08 (0.97, 4.47)                    3.22 (0.74, 14.05)
  Caller used 24/7 line                   1.07 (0.56, 2.06)                    1.68 (0.46, 6.10)
  Intercluster correlation (ρ)            0.03 (0.00, 0.88)                    0.36 (0.14, 0.66)

Calls after hours (n = 205)
  Presence of live person^b               6.57** (2.15, 20.06)                 6.60** (1.69, 25.85)
  Presence of automated system^c          0.23 (0.04, 1.26)                    0.10** (0.01, 0.72)
  Caller used communicable disease line   1.18 (0.09, 15.12)                   1.09 (0.06, 21.33)
  Caller used 24/7 line                   2.46 (0.19, 31.21)                   1.70 (0.10, 28.89)
  Intercluster correlation (ρ)            0.46 (0.23, 0.71)                    0.45 (0.18, 0.75)

Note. OR = odds ratio; CI = confidence interval. Analyses were conducted at the call level. 24/7 = 24 hours a day, 7 days a week.
^a Business hours were Monday through Friday, 8 AM to 5 PM local time; all other times were after hours.
^b Defined as the caller having reached a live person without hanging up (which could include having first reached an automated system).
^c Defined as the caller having gone through an automated phone tree.
**P < .01.
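For reference, the ρ values in Table 3 measure how much of the outcome variation lies between departments rather than within them. Under the common latent-variable formulation of a random-effects logistic model (the article does not state which definition was used), the intercluster correlation can be written as

\[
\rho = \frac{\sigma_u^2}{\sigma_u^2 + \pi^2/3},
\]

where \(\sigma_u^2\) is the variance of the department-level random effect and \(\pi^2/3 \approx 3.29\) is the residual variance of the standard logistic distribution.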
Brief Interviews With Health Department Directors
In brief follow-up interviews conducted at the departments that performed optimally (n = 5), the department directors described several factors that they felt may have contributed to their departments' success. All directors indicated having high performance expectations for their staffs. Four of the 5 interviewees reported that they regularly tested their systems, provided feedback to their staff members, and updated call lists (which included 1 or more backup persons for occasions when the primary action officer could not be reached). One health department employed an automated call system activated by the person answering the phone; this system sequentially called down a list until a live person was reached. Four directors indicated that they were stimulated to improve their telephone response systems by performance expectations and measures set forth by their states; 1 of these indicated that his budget was, in part, contingent on performance, along with a series of other measures. One director stated that the stimulus came from a survey of community physicians, which indicated dissatisfaction with the responsiveness of the health department.

DISCUSSION
We contacted the TBDS systems of health departments, pretending to be a doctor or nurse from a local health care facility, to study whether health departments were able to respond to urgent case reports by connecting the caller with an action officer in a timely fashion. Our goal was to identify factors associated with optimal performance. Overall, we found that nearly one third of participating health departments were able to consistently connect the caller with an action officer in 30 minutes or less. This finding confirms that consistent and timely responses are achievable by health departments; the large percentage of departments that were not yet able to meet this standard shows that substantial progress is needed to fully achieve this goal.

In our quantitative analyses, 1 key factor, whether callers reached a live person when they called (regardless of who that person was), was a strong predictor of optimal performance, both for time to reach an action officer and for consistency in doing so. Participating health departments used a variety of mechanisms to ensure that a live person would answer the phone, including hiring an answering service or forwarding calls to another local entity, such as the sheriff's dispatch or a local poison control center.

The importance of reaching a live person was particularly strong in protecting against poor performance. This indicates that it may be especially helpful for departments with a high prevalence of slow connections or unreturned calls to devote their resources to updating their TBDS systems so that callers are directed to live respondents. Although the CDC has focused attention on the presence of a single all-hours line for urgent case reports, this feature of a health department's TBDS system was less critical to optimal performance than the ability to connect directly with a live person. However, having a live person answer the phone did not guarantee perfect performance, suggesting that other, unmeasured attributes of system performance may also have played an important role.

Qualitative interviews suggested that several other factors, including practice, routine performance measurement compared with standards, department leadership, and clear expectations for performance, may have played a role in the ability of some health departments to achieve a high level of performance.

Our results contribute to existing knowledge in several ways. First, they documented performance related to the ability to connect a caller with an action officer in a representative sample of health departments. They also identified a consistent and readily modifiable factor, whether a live person answered the phone, that was associated with optimal performance. Finally, our data indicated that perfect performance was achievable. This finding was complemented by qualitative interviews indicating that improvement was possible. Indeed, all directors from optimally performing health departments with whom we spoke were able to point to a time when performance was not optimal and to identify a set of changes they made that led to improved performance.
Limitations
To our knowledge, this study is the first to systematically investigate the types of phone systems employed by health departments and their effect on call response. These findings should nonetheless be interpreted in light of the study's limitations. First, although the sample was drawn to represent health departments throughout the United States (excluding those serving fewer than 7200 people), the small sample size limited our statistical power to identify other potential success factors.

Second, although we based our selection of predictor variables on previous work, the literature, and case studies, there may be other relevant predictors that we did not measure. Third, we did not assess the nature and quality of the response by the action officer; we are not aware of any existing guidelines or standards for what an action officer should say in response to an urgent case report. Fourth, it is possible that callers in some health departments were anticipating the test calls and acted accordingly. We doubt this was the case, however, because our early developmental work on testing procedures suggested that most respondents were surprised by the test. Finally, this study did not examine the role of state health departments, which have a contractual obligation to meet CDC standards. Although local health departments receive federal funding that is passed through state health departments, they are not technically required to meet an all-hours standard to receive those funds.

Conclusions
Our findings suggest that many health departments have not yet begun to regularly test their TBDS systems despite CDC recommendations. The poor correlation between health department directors' expectations of performance and actual performance highlights the need for objective measurement. This study did not, however, clearly indicate the process by which such testing should be done. Options include having the CDC test state and local health departments, having state public health departments test the local health departments in their jurisdictions, and having local departments conduct self-evaluations.

Our findings also suggest that it may be prudent to revisit some aspects of the current CDC recommendations regarding TBDS systems. For example, we did not find any significant difference between health departments that had a separate, dedicated all-hours telephone line to receive urgent case reports and those that did not. It is also not clear what length of time is acceptable for a caller to reach an action officer. The CDC set a standard of 30 minutes and has considered changing this standard to 15 minutes. Many health departments maintain that either standard is unrealistic; even if the bar were set at 60 minutes, a significant number of health departments would not have met it.

The field of public health is moving toward performance measurement, accountability, and quality improvement. This study provides an example of objective performance measurement and suggests methods of quality improvement. Improving the performance of TBDS systems is clearly amenable to classical quality improvement approaches, which stress the use of multiple small-cycle tests of change and improvement, followed by regular assessments of performance to ensure that the improvements are maintained and goals are reached. Developing and improving measurement of other core local health department processes and functions will likely be necessary to achieve such improvements.

About the Authors
David J. Dausey is with the RAND Corporation and the Heinz School of Public Policy and Management, Carnegie Mellon University, Pittsburgh, PA. Anita Chandra, Ben Bahney, and Sarah Zakowski are with the RAND Corporation, Arlington, VA. Agnes G. Schaefer, Amelia Haviland, and Nicole Lurie are with the RAND Corporation, Pittsburgh.
Requests for reprints should be sent to David J. Dausey, RAND Corp, 4570 Fifth Ave, Pittsburgh, PA 15213 (e-mail: dausey@rand.org).
This article was accepted June 11, 2007.

Contributors
D. J. Dausey originated the study, led the writing of the article, and supervised all aspects of the study implementation. A. Chandra synthesized analyses and led the writing of the methods and results. A. G. Schaefer assisted with the study and the writing of the article. B. Bahney, A. Haviland, and S. Zakowski assisted with the study and completed the analyses. N. Lurie was the senior adviser for the project, conducted the brief interviews, and assisted in managing the study and writing the article.

Acknowledgments
This study was supported by the Office of the Assistant Secretary for Preparedness and Response in the US Department of Health and Human Services (contract HHSP233200500214U). We would like to thank all of the health departments that participated in the project.
Note. The views expressed are solely those of the authors and do not necessarily reflect those of HHS.

Human Participant Protection
This study was approved by the RAND Corporation's human subjects protection committee.

References
1. Ellison JH. National Public Health Performance Standards: are they a means of evaluating the local public health system? J Public Health Manag Pract. 2005;11:433–436.
2. Beitsch LM, Thielen L, Mays G, et al. The multistate learning collaborative, states as laboratories: informing the national public health accreditation dialogue. J Public Health Manag Pract. 2006;12:217–231.
3. Nelson C, Lurie N, Wasserman J. Assessing public health emergency preparedness: concepts, tools, and challenges. Annu Rev Public Health. 2007;28:1–18.
4. Tilson H, Berkowitz B. The public health enterprise: examining our twenty-first-century policy challenges. Health Aff. 2006;25:900–910.
5. Jajosky RA, Groseclose SL. Evaluation of reporting timeliness of public health surveillance systems for infectious diseases. BMC Public Health. 2004;4:29.
6. Institute of Medicine. The Future of Public Health. Washington, DC: National Academy Press; 1988.
7. Centers for Disease Control and Prevention. Guidelines for evaluating surveillance systems. MMWR Morb Mortal Wkly Rep. 1988;37(suppl 5):1–18.
8. Centers for Disease Control and Prevention. Updated guidelines for evaluating public health surveillance systems: recommendations from the Guidelines Working Group. July 27, 2001. Available at: http://www.cdc.gov/mmwr/preview/mmwrhtml/rr5013a1.htm. Accessed October 21, 2007.
9. Centers for Disease Control and Prevention. Improving surveillance infrastructure for terrorism detection: the 8-cities project resource materials. Available at: http://www.cdc.gov/epo/dphsi/8city.htm. Accessed March 11, 2007.
10. Dausey DJ, Lurie N, Diamond A, et al. Tests to Evaluate Public Health Disease Reporting Systems in Local Public Health Agencies. Santa Monica, Calif: RAND Corp; 2005.
11. Dausey DJ, Lurie N, Diamond A. Public health response to urgent case reports. Health Aff (Millwood). 2005;W5:412–419.
12. Blackwell T, Kaufman J. Response time effectiveness: comparison of response time and survival in an urban emergency medical services system. Acad Emerg Med. 2002;9:288–295.
13. Campbell J, Gridley T, Muelleman R. Measuring response intervals in a system with a 911 primary and an emergency medical services secondary public safety answering point. Ann Emerg Med. 1997;29:492–496.
14. Davis S. Virtual emergency operations. Risk Manage. 2002;49:46–52.
15. Hale J. A layered communications architecture for the support of crisis response. J Manage Inf Syst. 1997;14:235–255.
16. Jain S, McLean C. Simulation for emergency response: a framework for modeling and simulation for emergency response. In: Chick SE, Sánchez PJ, eds. Proceedings of the 35th Conference on Winter Simulation: Driving Innovation, New Orleans, La, December 7–10, 2003 [CD-ROM]. Omnipress; 2003.
17. Mrvos R, Krenzelok EP. Real-time testing of a regional poison information center's disaster plan. J Homeland Secur Emerg Manage. 2005;2:1–3.
18. Who's Who in Local Public Health. Washington, DC: National Association of County and City Health Officials; 2005.
19. Lurie N, Wasserman J, Nelson CD. Public health preparedness: evolution or revolution? Health Aff (Millwood). 2006;25:935–945.
20. Lurie N, Wasserman J, Stoto M, et al. Local variation in public health preparedness: lessons from California. Health Aff (Millwood). 2004;W4:341–353.