Survey research
Chaiwoo Lee, Postdoctoral Associate, MIT AgeLab
chaiwoo@mit.edu | agelab.mit.edu

Contents
- Key characteristics of survey research
- Designing your questionnaire
  - Writing the questions
  - Putting them together
  - Measurement issues
  - Evaluating the questionnaire
- Things to consider when collecting survey data
  - Ensuring quality
  - Paper vs. online
  - IRB approval
- Resources

Key characteristics
- Collecting data by asking a series of questions to be answered in a prescribed format
- A useful method for describing the characteristics of a large population using a small sample (Babbie, 2007; Schuman and Presser, 1996)
- Strengths
  - Easier to collect data from a larger sample compared to interviews, focus groups or observations
  - Allows one to discover patterns and make generalizable statements through the use of statistical techniques (Gable, 1994)
  - Reliability and consistency: all respondents are presented with the same format (Babbie, 2007)

Three types of information are typically collected in surveys:
- Demographic: descriptive information about respondents (e.g. income, age, gender, employment status, household size, location)
- Behavioral: information about respondents' behaviors (e.g. mode of transportation, time needed to complete a program, frequency, usage)
- Attitudinal: information about respondents' opinions and thoughts (e.g. preference for a community program, thoughts about a proposed policy, satisfaction, agreement)

Writing the questions: general principles
- Specifically identify the measures of interest; write questions to match your objectives
- Accuracy: give clear directions
  - Q. How long is your commute on a normal weekday?
- Options provided should not overlap: make them mutually exclusive
  - Q. Are you young, middle-aged or old?
  - Q. What is the age of your oldest child? a. 0-5 b. 5-10 c. 10-15 d. 15+ (categories overlap)
- Should not be ambiguous: avoid double-barreled or multi-purpose questions
  - Q. In the past year, have you visited any of the counseling and tutoring centers in your city?
- Ask specific questions
  - Q. How would you say things are these days: would you say you are very happy, pretty happy or not too happy?

Writing the questions: simplicity and clarity
- Should be easy to understand: avoid jargon and abbreviations
- The type of desired answer should be indicated clearly
  - Q. When did you move to the United States? (ambiguous) vs. Q. In what year did you move to the United States? or Q. How old were you when you moved to the United States?
- Should be capable of being answered easily and reasonably quickly
  - Q. What was your household income in 1999?
- Should be sufficiently clear to elicit the desired responses
- Should be designed so that the results are easy to analyze
  - Q. On occasion, I am unable to express how interested in politics I am. Agree / Disagree (double negative)

Writing the questions: attitudes and perceptions
- Should not contain biased or leading questions
  - Q. You wouldn't say you are in favor of the new costly health benefit plan, would you?
  - A leading sequence:
    Q1. Do you think cars cause pollution in the city center? Yes / No
    Q2. Do you think cars cause traffic hold-ups in the city center? Yes / No
    Q3. Do you think cars are a danger to pedestrians in the city center? Yes / No
    Q4. Do you think cars should be banned from the city center? Yes / No
- Be careful when asking about sensitive matters

Completeness
- Should sufficiently cover a reasonably complete range of alternatives
  - Q. What is your primary medium for getting local news? a. TV b. Newspaper c. Email (incomplete set of options)
- Include "other", "not applicable" and "prefer not to respond" options if necessary

Writing the questions: open vs. closed
- Open-ended: the respondent provides their own answer (short answer, long answer)
  - A rich and detailed source of data
  - Useful when too many options exist (e.g. college major)
  - Can be used to collect continuous data (e.g. age)
  - Long answers are difficult to analyze
- Closed-ended: the respondent chooses from provided options (dichotomous, multiple choice, ranking, rating)
  - Easy to record, code and analyze; efficient; can be more specific
- Suggestions
  - Ask open-ended questions prior to designing the final survey
  - Ask both open and closed questions on the same topic
  - Add open-ended follow-ups to closed questions (e.g. "other - please specify")

Putting the questions together: ordering
- Follow a logical order
- List from most familiar to least familiar
- List from easier questions to more difficult questions
- List from more interesting to less interesting: demographics at the end

Randomization
- A way of preventing negative effects of survey fatigue and order bias
- Put questions and/or response options in random order
- Randomize only within a separate section: randomization should not change the overall order of your questionnaire
- Split your sample when the survey is paper-based or otherwise difficult to randomize

Research example
- Topic: describing the factors that affect older adults' technology adoption and use decisions, and comparing results with other age groups
- Ordering and randomization:
  1. Technology experience and knowledge: different types of technology presented in a random order
  2. Open-ended questions on experience with adopting and using a specific technology: questions followed a logical order in time
  3. Closed-ended questions on perceptions toward various decision criteria: list of decision criteria presented in a random order
  4. Life events and living arrangement
  5. Demographics
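The within-section randomization described above can be sketched in code. This is a minimal illustration, with hypothetical section names and question labels loosely modeled on the research example: questions are shuffled inside sections flagged for randomization, while the overall section order stays fixed.

```python
import random

# Hypothetical questionnaire: section order is fixed, but the
# questions inside some sections may be shown in random order.
questionnaire = [
    {"section": "Technology experience", "randomize": True,
     "questions": ["Q1. Computers", "Q2. Smartphones", "Q3. Home appliances"]},
    {"section": "Adoption experience (open-ended)", "randomize": False,
     "questions": ["Q4. What did you adopt?", "Q5. How did you decide?"]},
    {"section": "Demographics", "randomize": False,
     "questions": ["Q6. Age", "Q7. Gender"]},
]

def ordered_questions(sections, rng=random):
    """Yield questions section by section, shuffling only within
    sections that are flagged for randomization."""
    for sec in sections:
        qs = list(sec["questions"])  # copy so the template stays untouched
        if sec["randomize"]:
            rng.shuffle(qs)          # randomize within this section only
        for q in qs:
            yield q

for q in ordered_questions(questionnaire):
    print(q)
```

Passing a seeded `random.Random` instance as `rng` makes a given respondent's ordering reproducible, which helps when debugging a survey-logic implementation.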

Measurement issues: measurement levels
- Nominal / categorical: responses are differentiated by given categories or classifications (e.g. gender, country of residence). Central tendency is determined by the mode; report percentages.
- Ordinal: responses are ordered or ranked (e.g. rating scales, ranking items). The median can be calculated.
- Ratio: responses are provided on an equal-distance scale that has a meaningful zero value (e.g. age, duration, number of people). All statistical measures are allowed.

Number of options
- Don't list too many for multiple-choice or ranking questions: "The Magical Number Seven, Plus or Minus Two" (Miller, 1956)

Rating scales
- Offering a neutral midpoint vs. a forced choice
- 5-point, 7-point or 9-point?

Evaluating the questionnaire: class exercise
- Critique and improve the questionnaire based on the principles covered so far
- The questionnaire
  - Researcher: a consulting firm with a new marketing tool ("the program")
  - Participants: employees of various companies that have participated in introductory briefing sessions given by the consulting firm
  - Objectives: evaluating the effectiveness of the program developed by the consulting firm, understanding participants' perceptions of the briefing sessions, and understanding the companies that have participated in the briefing sessions
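The measurement levels above determine which summary statistics are meaningful. A minimal sketch, using hypothetical response data, of matching the statistic to the level:

```python
from collections import Counter
from statistics import mean, median, mode

# Hypothetical responses illustrating the three measurement levels.
country   = ["US", "US", "Canada", "US", "Mexico"]  # nominal
agreement = [1, 3, 3, 4, 5]  # ordinal: 1 = strongly disagree ... 5 = strongly agree
age       = [34, 67, 45, 71, 58]                    # ratio

# Nominal: only the mode and percentages are meaningful.
print("mode:", mode(country))
shares = {k: 100 * v / len(country) for k, v in Counter(country).items()}
print("percentages:", shares)

# Ordinal: the median is also meaningful (the mean generally is not,
# since the distance between scale points is not guaranteed to be equal).
print("median:", median(agreement))

# Ratio: all statistical measures are allowed, e.g. the mean.
print("mean:", mean(age))
```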

Evaluating the questionnaire
- Objectives
  - Ensuring questionnaire quality: clarity, completeness, accuracy, etc.
  - Estimating time and effort
- Expert reviews: testing the questionnaire with researchers who have knowledge of and experience with survey research and/or your topic
- Cognitive interviews: one-on-one interviews with potential respondents (small n), thinking aloud as participants fill out the questionnaire
- Pilot surveys: a small-scale launch (larger n than cognitive interviews), using the procedures and designs of the main study

Research example: cognitive interviews (n=6)
- Topic: describing the factors that affect older adults' technology adoption and use decisions, and comparing results with other age groups
- Wording and description
  - Original: conceptual fit: "The degree to which a technology's symbols and languages match the words that I normally use is important"
  - Revised: conceptual fit: "It's important for me to feel comfortable with the labels and words used in the technology"
- Response options
  - Original: 7-point Likert scale
  - Revised: added "don't know / not sure"
- General direction
  - Original: "Please indicate how much you agree or disagree with the following statements."
  - Revised: added "Remember to think about your own experiences as you answer these questions."
- Response options
  - Original: technology types: a list of nine types
  - Revised: technology types: added home appliances

Research example (continued): pilot survey (n=39)
- Gender and age balance, as desired for the main survey
- Used the online form designed for the main survey, with a comments section added
- Wording and directions
  - Original: misunderstanding of "technology adoption" ("I thought it had to do with adopting children") and of using Web sources for information
  - Revised: "Survey on User Perceptions and Experiences around Purchase and Use of New Technologies"
- Wording and directions
  - Original: marital status ("I am widowed and remarried but it's a radio box")
  - Revised: rephrased to be answered only according to the current status
- Layout and design
  - Original: small text size, no status bar
  - Revised: used a different survey design tool with more design options
- Groups and sections
  - Original: a list of questions repeated for three different decision stages
  - Revised: one question asked three times for the different stages, followed by the next question repeated three times
- Question contents
  - Original: examples in addition to brief descriptions of various technologies
  - Revised: removed the examples, as answers were anchored to them rather than to the descriptions

Maintaining data quality
- Monitoring and screening
  - Issues: straight-lining, random answers, lying
  - Insert screeners and attention filters, and/or remove respondents who do not qualify
  - Example attention filter: Q. Regardless of how you feel, please choose "very positive" for this question. a. Very positive b. Positive c. Neutral d. Negative e. Very negative
- Data collection environment
  - Consistency and representativeness
  - Consider the conditions that may affect responses (e.g. weekday vs. weekend, weather, cultural differences)
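The monitoring-and-screening step above can be automated. Below is a minimal sketch, with hypothetical respondent data, that flags straight-liners (identical answers to every rating item) and respondents who fail the attention filter:

```python
# Hypothetical screening pass over collected responses.
ATTENTION_ITEM = "att1"             # the "please choose 'very positive'" question
EXPECTED_ANSWER = "very positive"

responses = [
    {"id": "r1", "ratings": [4, 2, 5, 3, 4], "att1": "very positive"},
    {"id": "r2", "ratings": [3, 3, 3, 3, 3], "att1": "very positive"},  # straight-liner
    {"id": "r3", "ratings": [5, 4, 4, 2, 1], "att1": "neutral"},        # failed filter
]

def screen(resp):
    """Return a list of data-quality flags for one respondent."""
    flags = []
    if len(set(resp["ratings"])) == 1:          # same answer to every item
        flags.append("straight-lining")
    if resp[ATTENTION_ITEM].lower() != EXPECTED_ANSWER:
        flags.append("failed attention filter")
    return flags

clean = [r["id"] for r in responses if not screen(r)]
print(clean)  # → ['r1']
```

In practice a stricter rule (e.g. flagging near-identical runs as well as exact ones) may be needed; the exact threshold is a design choice for the researcher.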

The survey sample
- Sampling and recruiting
  - Define the population: who has the answers to your questions?
  - Find your sample: where are they? How can you reach them?
  - Probability vs. non-probability sampling
  - Self-selection bias: people who choose to participate may differ from the rest of the population
- Response rate
  - May be affected by how respondents were selected and contacted
  - Incentives and personalized messages may increase the response rate
  - Survey fatigue may negatively impact the response rate (and response quality)
  - Example: a total of 39 people completed my $10/15 min pilot survey, out of 90 who were contacted (43.3%)

Paper vs. online
- Web-based
  - Easier to distribute
  - Enables continuous data monitoring
  - Enables the use of advanced survey logic (e.g. conditional questions)
  - Easier to ensure anonymity
  - Easier to collect, manage and process data
  - Cheaper and less time-consuming
  - Higher response rate
- Paper
  - When location, environment and place matter
  - When surveying a technology-illiterate population
  - Easier to control the visual design and keep it consistent
  - In general, longer questions and more response options are tolerable
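The response-rate figure in the example above is simply completes divided by people contacted; a one-line sketch makes the arithmetic explicit:

```python
def response_rate(completed, contacted):
    """Completed responses as a percentage of the people contacted."""
    return 100 * completed / contacted

# Pilot survey from the example: 39 completes out of 90 contacted.
print(round(response_rate(39, 90), 1))  # → 43.3
```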

IRB approval
- Survey research conducted at MIT should be approved by COUHES, the Committee on the Use of Humans as Experimental Subjects (https://couhes.mit.edu/)
- Researchers must complete a web-based training course: the University of Miami CITI program
- Many survey studies qualify for exempt status and expedited review
- Submitting a survey research proposal to COUHES
  - Specify the desired sample size (the maximum that you will need)
  - Describe compensation
  - Describe any plans for follow-ups
  - Attach a copy of the questionnaire
  - Attach a copy of the participant invitation
  - Submit amendment forms if you change anything
- Information to participants
  - Describe the purpose of the survey
  - Include information on the length of the survey
  - Ensure that responses are confidential (and anonymous, if applicable)
  - Explain how data will be used and kept
  - Ensure that participation is voluntary
  - Provide contact info
  - Describe open and close dates, if applicable
  - Thank them in advance

Resources
- Online Survey Design Guide, University of Maryland: http://lap.umd.edu/survey_design/index.html
- Initiative on Survey Methodology, Duke University: http://dism.ssri.duke.edu/resources.php
- Survey Research Writing Guide, Colorado State University: http://writing.colostate.edu/guides/guide.cfm
- Qualtrics: http://www.qualtrics.com/
- Ask me! chaiwoo@mit.edu, E40-287