The American Time Use Survey: Test of Coding Procedures and Lexicon




American Association for Public Opinion Research 2002: Strengthening Our Community - Section on Government Statistics

The American Time Use Survey: Test of Coding Procedures and Lexicon

Sid Schneider, Jennifer Crafts, and David Cantor, Westat
Tina Shelley, Diane Herz, and Lisa Schwartz, Bureau of Labor Statistics

Abstract

The American Time Use Survey (ATUS) telephone interviews are scheduled to begin in January 2003. Respondents will report all of their activities over a 24-hour period. Once these data are collected, coders will assign a three-tiered code to each activity, based upon a lexicon and coding rules developed by the Bureau of Labor Statistics (BLS). Our poster presents the results of a test of the ATUS coding procedures, including the lexicon, the documentation, and the Blaise software. For the test, nine coders received four hours of training. Then, for the next four days, they coded respondents' activities. Periodically while they worked, they filled out questionnaires and rating forms and participated in focus groups. Four of the coders were individually videotaped while coding in a usability laboratory and while participating in think-aloud interviews. Our poster presents the accuracy and productivity of the coders, the coders' improvement over time, and the sources of the coders' confusion and errors. It also presents the coders' opinions of the lexicon, software, documentation, and coding rules; the manner in which these opinions changed as the coders gained experience; and potential enhancements to the lexicon and the software that might increase coder accuracy, consistency, and efficiency.

Background

A time-use survey collects data on the amount of time people spend on various activities, such as paid work, childcare, volunteering, commuting, and socializing. About 50 countries have fielded, or will soon field, time-use surveys. The Bureau of Labor Statistics is currently developing the American Time Use Survey (ATUS). One purpose of the ATUS is to value unpaid work and measure its contribution to the national economy (Stinson, 2000; Herz, Schwartz, & Shelley, 2001). Researchers will be able to use ATUS data to compare time-use patterns by such characteristics as industry, occupation, marital status, and age of children, and to contrast time use by Americans with similar data from other countries.

ATUS interviewers will contact respondents and ask them to recall their previous day's activities and estimate the duration of each activity. Coders will then code each activity within these 24-hour intervals into aggregate time-use categories by assigning a six-digit code. The ATUS coding system, or "lexicon," has a three-tier structure. The first tier consists of 19 major activity categories. Under each of these categories is a second tier of subcategories, and under each subcategory are the specific third-tier activities; many of these activities have examples listed with them to help coders make coding decisions.

Westat conducted a test of the ATUS coding system to evaluate the usability of the software that the coders will use, to study how coder training could be enhanced, to evaluate the ATUS coding lexicon and coding rules, and to suggest ways in which the coding process could be improved.

Method

Nine experienced coders with varying demographic characteristics were recruited for this test, which had the following components:

- All nine coders participated in a half-day training session presented by BLS staff.
- The coders then coded activities over a period of four days. They worked with some activities that were collected in actual test time-use surveys and some that were invented for this evaluation.
- The coders completed a questionnaire after each interview to indicate the activities that were most difficult to code and the steps they took to make their coding decisions.
- Four coders were individually videotaped while coding in Westat's usability laboratory. They also participated in "think-aloud" interviews, articulating their thoughts while they coded.
- The coders filled out questionnaires about their attitudes toward the lexicon and the software at three points: after training, after four hours of coding, and after the four days of coding.
- The coders' attitudes and opinions were also assessed in a debriefing discussion after two days of coding, and in two focus groups after the four days of coding.

Results

The main findings of this evaluation were as follows:

Coding accuracy and speed varied considerably among the coders. On average, the nine coders were correct on all three tiers for 70 percent of the activities they coded. Coders who were relatively fast also tended to be relatively accurate, and coders who were relatively slow tended to be relatively inaccurate. Both accuracy and speed tended to increase as coders gained experience.

Coders were most accurate when coding commonly occurring activities, such as those in the Tier 1 categories of Personal care; Eating and drinking; Socializing, relaxing, and leisure; and Household activities. Coders were least accurate when coding activities within the Tier 1 category of Traveling and when coding infrequently occurring activities (such as those in the Tier 1 categories Arts and entertainment, Volunteer activities, and Child and adult care services [paid]).

Coders' attitude data indicated that:

- As the coders gained experience, they found the coding task progressively easier. Their self-rated speed, accuracy, and confidence improved over time.
- Coders had generally positive attitudes toward the overall usability of the ATUS coding software, both after training and after becoming more experienced. They thought that the software was easy to use and allowed them to navigate easily among the tiers and to conduct searches.
- The coders found the categories of the lexicon to be logical and comprehensible.

Conclusions and Recommendations

Screen Design. Although coders were generally satisfied with the screen layout and content, they had several suggestions:

- Make the lexicon and coding rules available on the screen on command, with the capability of conducting searches of these materials.
- Allow experienced coders to adjust and customize features of the screen set-up (e.g., the position of the panes, the text colors, and the text fonts).
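As a concrete illustration of the three-tier, six-digit structure described in the Background, the sketch below models a lexicon as nested dictionaries and assembles a code from three two-digit tiers. All tier names and code values here are hypothetical examples, not the official ATUS lexicon (which has 19 first-tier categories; only two are shown).

```python
# A minimal sketch of a three-tier activity lexicon. Entries are hypothetical
# illustrations; the real ATUS lexicon is far larger.
LEXICON = {
    "01": ("Personal care", {
        "01": ("Sleeping", {
            "01": "Sleeping",
            "02": "Sleeplessness",
        }),
    }),
    "11": ("Eating and drinking", {
        "01": ("Eating and drinking", {
            "01": "Eating and drinking",
        }),
    }),
}

def assemble_code(tier1: str, tier2: str, tier3: str) -> str:
    """Validate each tier against the lexicon, then concatenate the three
    two-digit tiers into one six-digit activity code."""
    _, subcategories = LEXICON[tier1]     # KeyError if tier 1 is unknown
    _, activities = subcategories[tier2]  # KeyError if tier 2 is unknown
    activities[tier3]                     # KeyError if tier 3 is unknown
    return tier1 + tier2 + tier3

print(assemble_code("01", "01", "02"))  # -> 010102
```

Nesting the tiers this way makes the validation mirror the coder's own workflow: a tier-2 choice is only meaningful under the tier-1 category already selected.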

Trigram Search Function. Although the coders reported that the search function was easy to use, the search too often yielded conspicuously off-target results or no results at all. In these circumstances, the software offered no guidance for alternative actions, which confused novice users. The following changes might improve the search feature:

- Populate the database with many more examples of activities. Examples could be added using the results from ATUS interviews conducted with hundreds of respondents, or from another country's time use survey.
- Add a thesaurus that would automatically perform searches based on synonyms.
- Train coders when and how to take the context of an activity into account when conducting trigram searches.
- Add a feature that conducts the search automatically and alerts the coder when high-confidence hits are available.

Documentation. A dictionary accessible on screen would help coders look up word meanings as needed. Coders would also benefit from having the coding rules presented in the form of flow diagrams (e.g., series of branching questions). Such diagrams would reduce the need for coders to remember the coding rules.

Navigation. Coders found the software easily navigable, but had some suggestions:

- Provide a feature that allows coders to examine the examples associated with Tier 3 activities at any time, and permit automatic entry of the six-digit code when the coders locate a satisfactory example.
- Make examples accessible and visible at the Tier 2 level as well as at the Tier 3 level.
- In situations where coders decide to change the Tier 1 code they assigned, blank out any codes they have already entered for Tiers 2 and 3.

Training. The group practice exercise in the training session was very well received, and coders felt that more of that type of practice was needed. By enhancing and lengthening the training program, coder speed, accuracy, and confidence may reach higher levels.
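As background for the trigram search findings above, the sketch below shows one common way such a search can rank lexicon example phrases: compare character trigrams of the query against each phrase and report only hits above a similarity threshold, so the interface can tell the coder when no high-confidence match exists. The example phrases, codes, and threshold are hypothetical; this is not the ATUS software's actual algorithm.

```python
def trigrams(text: str) -> set:
    """Character trigrams of a normalized string, padded so short words
    still produce boundary trigrams."""
    t = f"  {text.lower().strip()} "
    return {t[i:i + 3] for i in range(len(t) - 2)}

def trigram_search(query: str, examples: dict, threshold: float = 0.25):
    """Rank example phrases by trigram Jaccard similarity to the query.
    Returns (code, phrase, score) tuples above the threshold, best first;
    an empty result signals that no high-confidence hit is available."""
    q = trigrams(query)
    hits = []
    for phrase, code in examples.items():
        p = trigrams(phrase)
        score = len(q & p) / len(q | p)
        if score >= threshold:
            hits.append((code, phrase, score))
    return sorted(hits, key=lambda h: -h[2])

# Hypothetical example database mapping phrases to six-digit codes.
EXAMPLES = {
    "washing dishes": "020203",
    "walking the dog": "020601",
    "watching television": "120303",
}

results = trigram_search("washed the dishes", EXAMPLES)
print(results[0][0])  # -> 020203
```

Because trigrams tolerate inflection ("washed" vs. "washing"), this kind of matching finds near-miss phrasings; adding many more example phrases, as suggested above, directly increases the chance of a high-scoring hit.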
Coder comments suggested the following:

- Coders need feedback during training -- from an instructor or the software itself -- so that they can learn when they have been accurate and inaccurate, and gradually sharpen their coding skills. In this test, feedback was deliberately omitted from the training; feedback will be important, however, when coders train for the ATUS.
- Coders need to be trained in the use of standard strategies for handling common difficult-to-code activities.
- Coders need refresher training, e.g., by reviewing a DVD with video of the training session.

Lexicon. Software coding "wizards" might help guide coders as they work with difficult-to-code activities. Wizards could help with problematic areas such as activities related to travel, vehicles, and communication. The wizards would provide a series of branching questions leading to the accurate codes.

Reducing the Error Rate. The coding software was not designed with features specifically intended to prevent coders from making errors. Eight common sources of coder errors were identified:

1) Coders sometimes attended to extraneous information in the activity descriptions that they should have ignored. To prevent these errors, interviewers should be trained to record only key information, regardless of the wording that respondents use to describe their activities. Coder training should include methods for identifying the relevant information in ambiguously worded activities.

2) In instances where interviewers had recorded concurrent activities as one activity, coders sometimes had difficulty identifying the primary activity. To prevent this error, coders should be trained in when and how to use the context of an activity to assign the six-digit code.

3) In instances where respondents had described sequential activities as a single activity, coders sometimes attended to the wrong activity. To prevent this error, interviewers should be trained not to record sequential activities as single activities, and coders need to be trained in methods for identifying the salient activity.

4) Coders had difficulty understanding and applying the rules for coding activities related to traveling. To prevent this error, coders should have a diagram or set of branching questions that lead to the accurate code, and training should focus more on traveling activities. Also, a software wizard could be developed to assist coders with travel-related activities.
5) Coders had difficulty when coding very trivial activities. To prevent these errors, coders could be trained not to code activities of less than five minutes' duration; alternatively, rules for trivial activities could be developed.

6) Coders were not familiar with some of the words and phrases interviewers used to record activities. To prevent these errors, coders could be provided with dictionaries -- both on-screen and hard copy. Also, interviewers should be trained to avoid slang.

7) In instances where the verb used to describe an activity was inconsistent with the purpose of that activity (e.g., "talked to my children" was usually coded as child care, not socialization), coders sometimes made incorrect coding choices. To avoid these errors, coder training should focus on identifying sequences of activities that have a specific purpose, and on coding such sets according to their purpose.

8) Coders sometimes found several codes that seemed to apply equally well to an activity. To avoid these errors, training should cover these situations, and the database should contain enough examples that the trigram search yields unequivocal results.

The results also suggested that coder accuracy could be improved with these steps:

- Reduce the number of categories in all three tiers. Merge little-used categories into other categories, unless doing so would compromise the goals of the ATUS.
- Select only the most accurate coders. Recruit and train more coders than needed, then retain the most accurate and fastest coders.
- Code each interview more than once, using different coders, and then compare the results to identify errors for correction. Have experts resolve any discrepancies. This technique is often used by social scientists when studying subjective judgments.

References

Herz, D. E., Schwartz, L. K., & Shelley, K. J. (2001). Coding Activities in the American Time Use Survey. U.S. Department of Labor, Bureau of Labor Statistics.

Stinson, L. (2000). Fourth Digit Reporting: A Data Collection Problem, Not a Coding Scheme Problem. Internal memorandum, Bureau of Labor Statistics, July 12, 2000.
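The double-coding recommendation above -- coding each interview twice and referring disagreements to an expert -- can be sketched as a simple comparison routine. The six-digit codes below are hypothetical placeholders, not actual ATUS codes.

```python
def compare_double_coding(codes_a, codes_b):
    """Compare two independent coders' six-digit codes for the same interview.
    Returns the exact-agreement rate and the positions of discrepant
    activities, which would be referred to an expert for adjudication."""
    if len(codes_a) != len(codes_b):
        raise ValueError("both coders must code the same activity list")
    discrepancies = [i for i, (a, b) in enumerate(zip(codes_a, codes_b))
                     if a != b]
    agreement = 1 - len(discrepancies) / len(codes_a)
    return agreement, discrepancies

# Hypothetical codes for a five-activity interview, coded independently twice.
coder1 = ["010101", "110101", "180501", "020203", "120303"]
coder2 = ["010101", "110101", "180502", "020203", "120303"]
rate, flagged = compare_double_coding(coder1, coder2)
print(rate, flagged)  # agreement rate and indices needing expert review
```

Flagging only the discrepant positions keeps the expert's workload proportional to disagreement, and the agreement rate itself can be tracked over time as a measure of coder consistency.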