
Comparative Assessment of Software Programs for the Development of Computer-Assisted Personal Interview (CAPI) Applications

Executive Summary

1. INTRODUCTION

The World Bank, national statistics offices, and other institutions are keenly interested in moving from household surveys based on paper-and-pen interviewing (PAPI) to computer-assisted personal interviewing (CAPI). However, organizations migrating from PAPI to CAPI are often not fully aware of their options regarding which software package best meets their needs. This dilemma underscores the paucity of consolidated information on CAPI software packages, which has several dimensions: (i) there are no standards for CAPI software packages; (ii) there are few established CAPI user communities; (iii) there are no prior systematic public comparisons of CAPI software packages; and (iv) while there is an abundance of publicly available information of varying quality and quantity, only a limited share of it is informed by hands-on experience.

Consequently, the World Bank Living Standards Measurement Study (LSMS) team partnered with the IRIS Center at the University of Maryland to prepare a detailed report that aims to partly fill this informational gap in the short run, while serving as a foundation for medium-term decisions concerning the transition from PAPI to CAPI as part of the surveys supported under the LSMS-Integrated Surveys on Agriculture (LSMS-ISA) initiative.[1] The challenge was not simply to identify CAPI software packages, but to find ones that are robust enough in performance and broad enough in functionality to support a complex multi-topic household survey.

This document serves as an executive summary for the main comparative report, providing a synopsis of the approach to the evaluation and the main findings. The main report is accompanied by software-specific (i) compact checklists of features, (ii) detailed checklists of features, (iii) a meta-evaluation with numerical scoring of the comparative assessment, and (iv) a history of known deployments, in Appendices A, B, C, and D, respectively. In addition, screenshots from various stages of questionnaire development in each featured software package and a glossary of technical terms commonly used throughout the report are provided in Appendices E and F, respectively.

[1] The LSMS-ISA project is an innovative household survey program that was started with a grant from the Bill and Melinda Gates Foundation and that is being led by the LSMS team. Under the LSMS-ISA initiative, the World Bank is supporting countries in Sub-Saharan Africa to establish systems of multi-topic, integrated panel household surveys with a strong focus on agriculture. Visit www.worldbank.org/lsms-isa for more information.

2. APPROACH TO EVALUATION

To select the software packages to be assessed, a set of objective criteria was developed and applied consistently across the CAPI software packages available in the market. It was determined that the CAPI software packages should be capable of generating applications that meet minimal criteria in the areas of data capture, questionnaire navigation, skipping/branching, data quality control, data management, and case management. More specifically, it was required that the CAPI applications:

i. allow a reasonable variety of question types, capture a large number of data points, capture data points at different levels of hierarchy, and collect and store large numbers of survey observations;

ii. include basic questionnaire navigation abilities, e.g. the ability to move backwards, to pause and resume at the last answered question, and to complete a questionnaire in a non-linear way;

iii. include both basic and complex skip instruction capabilities;

iv. provide data consistency checks within and across sections of a questionnaire;

v. generate data sets, ideally with value/variable labels, in SAS/Stata/SPSS or at least in CSV format; and

vi. include comparable sets of tools for enumerators, supervisors, and survey managers to employ in the management of their work.

Based on the above criteria, the following eight packages were chosen for the assessment:

Software | Developer | Version | Status
Blaise | Statistics Netherlands | Blaise 4.8 | -
CASES | Computer-Assisted Survey Methods Program at the University of California, Berkeley | CASES 5.5 for CAPI; CASES 6.0 for Case Management | -
CSProX | Serpro, S.A. | CSProX 3.3 | -
Entryware | Techneos | Entryware 6.4 | -
MMIC | RAND Labor and Population | MMIC v0.7.2b | Public Domain (Open-Source)
Open Data Kit | The University of Washington's Department of Computer Science and Engineering | Aggregate v0.9.4, Build v0.8.5, Collect v1.1.5, Manage v1.0, Validate v1.2 | Public Domain (Open-Source)
Pendragon Forms | Pendragon Software Corporation | Pendragon Forms VI | -
Surveybe | Economic Development Initiatives | Surveybe (Pre-Release Beta Version) | -
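
To make criteria iii and iv above more concrete, the minimal sketch below shows, in generic Python, the kind of skip/branching and consistency logic a CAPI application has to encode. It is purely illustrative: it is not the syntax of any of the eight evaluated packages, and the questions and the plausibility threshold are invented for the example.

```python
# Illustrative sketch only: a toy rendering of skip instructions (criterion iii)
# and consistency checks (criterion iv). The questions and the 16-hours-per-day
# threshold are invented for this example.

def ask(question, cast=str):
    """Prompt for one answer on the console (stands in for a CAPI screen)."""
    return cast(input(question + " "))

def interview_household_member():
    answers = {}
    answers["age"] = ask("How old is this person (in completed years)?", int)

    # Basic skip instruction: employment questions apply only to members aged 15+.
    if answers["age"] >= 15:
        answers["worked_last_week"] = ask("Did this person work last week (y/n)?")
        if answers["worked_last_week"].lower() == "y":
            answers["hours_worked"] = ask("How many hours did this person work?", int)

    # Within-section consistency check: flag implausible workloads for review.
    if answers.get("hours_worked", 0) > 16 * 7:
        print("Consistency warning: reported hours exceed 16 hours per day.")

    return answers

if __name__ == "__main__":
    print(interview_household_member())
```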

The software packages were evaluated with respect to functionality expectations of the ideal CAPI application, which must be:

i. intuitive and easy to use during questionnaire development, field deployment, and data management;

ii. flexible for the broadest possible circumstances, or best for a particular one; and

iii. open to redesign and improvement, whether through regular updates or an in-built means for extending current commands.

As a result, each package was evaluated:

i. through the three complementary lenses of survey designers, field staff, and survey managers;

ii. with respect to the human, computing, and financial resources that it requires; and

iii. by its potential for expanding and extending current functionalities.

With regard to the tools for survey designers, the evaluation considered the power and simplicity of the programming language, the ease of design within the development environment, and the diversity of questionnaire implementation options. For survey interviewers, the evaluation took into account the appropriateness and user-friendliness of the application interface, in addition to the power and convenience of the questionnaire navigation abilities. Lastly, for survey managers, the evaluation examined the tools for case management, data transfer, and data export, with regard to both their power and their simplicity of use.

With respect to the human, computing, and financial resources they require, CAPI packages were evaluated based on their breadth of support and documentation, their minimum hardware and software requirements for operation, and their pricing structure for software, training, and technical support. Finally, each package was assessed based on the extent to which it provides users with the means of expanding and extending current functionalities.

It should be noted that the assessment was first and foremost a stock-taking exercise that evaluated the eight software packages in terms of the functionalities that were readily available to end users by January 31, 2011. While the assessment team solicited as much information as possible from the software developers concerning their priorities for future development, and incorporated this knowledge into the main report, functionalities that were claimed to be in the pipeline did not factor into the analysis of how each package performed in each evaluation area. Readers should refer to the software developers for any functionality that may have been made available since the conclusion of the technical evaluation.

3. OVERVIEW OF MAIN FINDINGS

The main findings of the assessment span a range of practical and technical features of the software packages. Here we have greatly simplified the assessment into several broad-stroke observations.

i. No single software package is an unequivocal frontrunner in all twelve of the evaluation areas.

ii. From a questionnaire design standpoint, the ideal software package would be a hybrid product that couples a user-friendly, menu-driven environment for core design tasks with a more efficient, albeit more demanding, command-line driven questionnaire development environment, which offers the same features as the menu-driven counterpart but also makes more complex functionality available. This package, in which menus serve as a pedagogical tool for command-line coding, does not yet exist. The evaluated software packages that are primarily menu-driven have a clear intuitive appeal for the less technically proficient audience. However, currently only the software packages that offer a command-line driven development environment allow development tasks to be divided among multiple programmers working simultaneously on a given survey, with their contributions later integrated into a single application.
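
The division-of-labor point in finding ii can be illustrated with a minimal sketch in generic Python; it is not the language of Blaise, MMIC, or any other evaluated package, and the module names and questions are invented for the example. When a questionnaire is expressed as code, independently drafted pieces can be merged mechanically.

```python
# Illustrative sketch only (not the syntax of any evaluated package): separate
# programmers each draft a questionnaire module, and a coordinator merges them
# into one instrument.

# --- drafted by programmer A (hypothetical demographics module) ---------------
DEMOGRAPHICS = [
    ("name", "What is the household member's name?"),
    ("age", "How old is this person (in completed years)?"),
]

# --- drafted by programmer B (hypothetical agriculture module) ----------------
AGRICULTURE = [
    ("plots", "How many plots did the household cultivate last season?"),
    ("main_crop", "What was the main crop on the largest plot?"),
]

def build_questionnaire(*modules):
    """Merge independently written modules, rejecting duplicate question names."""
    combined, seen = [], set()
    for module in modules:
        for name, text in module:
            if name in seen:
                raise ValueError(f"Duplicate question name: {name}")
            seen.add(name)
            combined.append((name, text))
    return combined

if __name__ == "__main__":
    for name, text in build_questionnaire(DEMOGRAPHICS, AGRICULTURE):
        print(f"{name}: {text}")
```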

iii. The existing public-domain, open-source software packages have sparse documentation and sometimes limited user communities. Conversely, most proprietary software packages have excellent support and documentation.

iv. Among the software packages identified by the assessment as capable of generating CAPI applications that can reliably accommodate the needs of LSMS-type household surveys, three surface as the top contenders: Blaise, MMIC, and Surveybe.

Among these, Surveybe currently relies on a primarily menu-driven questionnaire development environment and is the most user-friendly option. The package boasts an intuitive interface for field users, an intuitive environment for survey developers, and a powerful set of features, most notably effortless non-linear questionnaire navigation, without the usual cost of complexity. However, Surveybe's immediate ease of use currently comes at a cost: it imposes a sequential workflow for questionnaire development, such that either a single programmer develops an application in its entirety or a team of programmers makes its contributions each in turn. This limitation may have nontrivial implications for human resource management within the survey implementing agency, particularly with lengthy and complex questionnaire instruments.

Blaise and the open-source MMIC feature a powerful command-line driven questionnaire development environment, which is a mixed blessing. On the one hand, the packages offer both an impressive array of programmable features and the undeniable utility of reusable computer code. The latter provides survey developers with the ability to recreate arbitrary questionnaire elements (e.g. questions, answers, modules) in future surveys by reusing the associated code. On the other hand, Blaise and MMIC require survey developers to learn proprietary programming languages that are expansive in scope and often complex in their implementation. Learning and mastering these languages may be nontrivial and may require substantial staff time, particularly where software documentation is not clear. In this respect, although MMIC either matches or surpasses Blaise in 11 of the 12 realms of evaluation, Blaise may be preferred over MMIC in part due to its strong documentation, a relatively large and active user community, and a robust record of previous deployments.

v. The relative cost-effectiveness of adopting a CAPI software package depends on the technical capacity of the survey team as well as the scale of the survey work. Fee-based software, which is often accompanied by technical support and training, may present a better value for organizations lacking the capacity necessary to successfully adopt less user-friendly public-domain software packages. Furthermore, pricing models vary between packages: some charge per instance of software installation on field equipment, while others charge based on the number of data points collected or on interview time. All of these factors should be taken into account in determining the most cost-effective package for a given survey effort.
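
As a purely hypothetical illustration of how these pricing models interact with survey scale, the sketch below compares per-installation and per-data-point charging under invented figures; the fees, device counts, and data volumes are not drawn from the report or from any vendor.

```python
# Hypothetical arithmetic only: the fees and survey parameters below are invented
# for illustration and do not reflect the pricing of any package evaluated here.

def per_device_cost(fee_per_installation, num_field_devices):
    """Total license cost when a vendor charges per installed copy."""
    return fee_per_installation * num_field_devices

def per_data_point_cost(fee_per_1000_points, households, points_per_household):
    """Total license cost when a vendor charges by the volume of data collected."""
    return fee_per_1000_points * households * points_per_household / 1000

if __name__ == "__main__":
    # A small survey with few devices but a long questionnaire.
    print(per_device_cost(500, 30))              # 15000 under per-device pricing
    print(per_data_point_cost(10, 3000, 1500))   # 45000.0 under per-data-point pricing

    # A large survey with many devices and a short questionnaire reverses the ranking.
    print(per_device_cost(500, 400))             # 200000 under per-device pricing
    print(per_data_point_cost(10, 30000, 300))   # 90000.0 under per-data-point pricing
```

The break-even point between such models shifts with questionnaire length, sample size, and the number of field devices, which is why the summary recommends weighing all of these factors together when selecting a package.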