Survey Research
Chaiwoo Lee, Postdoctoral Associate, MIT AgeLab
chaiwoo@mit.edu | agelab.mit.edu

Contents
- Key characteristics of survey research
- Designing your questionnaire
  - Writing the questions
  - Putting them together
  - Measurement issues
  - Evaluating the questionnaire
- Things to consider when collecting survey data
  - Ensuring quality
  - Paper vs. online
  - IRB approval
- Resources
Key characteristics
- Collecting data by asking a series of questions to be answered in a prescribed format
- A useful method for describing the characteristics of a large population using a small sample (Babbie, 2007; Schuman and Presser, 1996)
- Strengths
  - Easier to collect data from a larger sample than with interviews, focus groups or observations
  - Allows one to discover patterns and make generalizable statements through the use of statistical techniques (Gable, 1994)
  - Reliability and consistency: all respondents are presented with the same format (Babbie, 2007)

Three types of information are typically collected in surveys:
- Demographic: descriptive information about respondents (e.g. income, age, gender, employment status, household size, location)
- Behavioral: information about respondents' behaviors (e.g. mode of transportation, time needed to complete a program, frequency, usage)
- Attitudinal: information about respondents' opinions and thoughts (e.g. preference for a community program, thoughts about a proposed policy, satisfaction, agreement)
Writing the questions

General principles
- Specifically identify the measures of interest
- Write questions to match your objectives

Accuracy
- Give clear directions
  (Unclear: Q. How long is your commute on a normal weekday? "How long" could mean time or distance)
- Avoid ambiguous categories
  (Ambiguous: Q. Are you young, middle-aged or old?)
- Options provided should not overlap: make them mutually exclusive
  (Overlapping: Q. What is the age of your oldest child? a. 0-5 b. 5-10 c. 10-15 d. 15+)
- Avoid double-barreled or multi-purpose questions
  (Double-barreled: Q. In the past year, have you visited any of the counseling and tutoring centers in your city?)
- Ask specific questions
  (Vague: Q. How would you say things are these days: would you say you are very happy, pretty happy or not too happy?)

Simplicity and clarity
- Questions should be easy to understand: avoid jargon and abbreviations
- The type of desired answer should be indicated clearly
  (Unclear: Q. When did you move to the United States?
   Better: Q. In what year did you move to the United States?
   Better: Q. How old were you when you moved to the United States?)
- Questions should be capable of being answered easily and reasonably quickly
  (Hard to recall: Q. What was your household income in 1999?)
- Questions should be sufficiently clear to elicit the desired responses
- Questions should be designed so that the results are easy to analyze
  (Double negative: Q. On occasion, I am unable to express how interested in politics I am. Agree / Disagree)
Writing the questions

Attitudes and perceptions
- Should not contain biased or leading questions
  (Leading: Q. You wouldn't say you are in favor of the new costly health benefit plan, would you?)
  (A leading sequence, where Q1-Q3 prime respondents to answer "yes" to Q4:
   Q1. Do you think cars cause pollution in the city center? Yes / No
   Q2. Do you think cars cause traffic hold-ups in the city center? Yes / No
   Q3. Do you think cars are a danger to pedestrians in the city center? Yes / No
   Q4. Do you think cars should be banned from the city center? Yes / No)
- Be careful when asking about sensitive matters

Completeness
- Response options should cover a reasonably complete range of alternatives
  (Incomplete: Q. What is your primary medium for getting local news? a. TV b. Newspaper c. Email)
- Include "other", "not applicable" and "prefer not to respond" options if necessary

Open vs. closed questions
- Open-ended: respondent provides own answer (types: short answer, long answer)
  - Rich and detailed source of data
  - Useful when too many options exist (e.g. college major)
  - Useful for collecting continuous data (e.g. age)
  - Long answers are difficult to analyze
- Closed-ended: respondent chooses from provided options (types: dichotomous, multiple choice, ranking, rating)
  - Easy to record, code and analyze
  - Efficient
  - Can be more specific
- Suggestions
  - Ask open-ended questions prior to designing the final survey
  - Ask both open and closed questions on the same topic
  - Add open-ended follow-ups to closed questions (e.g. "other, please specify")
Putting the questions together

Ordering
- Have a logical order
- List from most familiar to least familiar
- List from easier questions to more difficult questions
- List from more interesting to less interesting: demographics at the end

Randomization
- A way of preventing negative effects of survey fatigue and order bias
- Put questions and/or response options in random order
- Randomize only within a separate section: randomization should not change the overall order of your questionnaire
- Split your sample when the survey is paper-based or difficult to randomize

Research example
- Topic: describing factors that affect older adults' technology adoption and use decisions, and comparing results with other age groups
- Ordering and randomization:
  1. Technology experience and knowledge: different types of technology presented in a random order
  2. Open-ended questions on experience with adopting and using a specific technology: questions followed a logical order of time
  3. Closed-ended questions on perceptions toward various decision criteria: list of decision criteria presented in a random order
  4. Life events and living arrangement
  5. Demographics
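The within-section randomization described above can be sketched in a few lines of Python. This is a minimal illustration with hypothetical section names and question IDs, not an actual survey-tool configuration: questions are shuffled inside the sections marked for randomization, while the overall section order stays fixed.

```python
import random

# Hypothetical questionnaire: section order is fixed; only some
# sections have their questions shuffled per respondent.
questionnaire = {
    "Technology experience": ["Q1a", "Q1b", "Q1c"],  # randomized
    "Adoption experience":   ["Q2a", "Q2b", "Q2c"],  # keep logical (time) order
    "Decision criteria":     ["Q3a", "Q3b", "Q3c"],  # randomized
    "Demographics":          ["Q4a", "Q4b"],         # fixed, at the end
}
randomized_sections = {"Technology experience", "Decision criteria"}

def question_order(questionnaire, randomized_sections, seed=None):
    """Return a per-respondent question order, shuffling only within
    the sections marked for randomization."""
    rng = random.Random(seed)
    order = []
    for section, questions in questionnaire.items():
        qs = list(questions)
        if section in randomized_sections:
            rng.shuffle(qs)  # shuffle within this section only
        order.extend(qs)
    return order

print(question_order(questionnaire, randomized_sections, seed=42))
```

Seeding per respondent ID makes each respondent's order reproducible, which helps when you later need to analyze order effects.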
Measurement issues

Measurement levels
- Nominal / categorical: responses are differentiated by given categories or classifications
  (Examples: gender, country of residence. Analysis: central tendency is determined by the mode; percentages)
- Ordinal: responses are ordered or ranked
  (Examples: rating scales, ranking items. Analysis: the median can be calculated)
- Ratio: responses are provided on an equal-distance scale that has a meaningful zero value
  (Examples: age, duration, number of people. Analysis: all statistical measures are allowed)

Number of options
- Don't list too many for multiple-choice or ranking questions: "The Magical Number Seven, Plus or Minus Two" (Miller, 1956)

Rating scales
- Offering a neutral midpoint vs. forced choice
- 5-point, 7-point or 9-point?

Evaluating the questionnaire: class exercise
- Critique and improve the questionnaire based on the principles covered so far
- The questionnaire
  - Researcher: a consulting firm with a new marketing tool ("the program")
  - Participants: employees of various companies that have participated in introductory briefing sessions given by the consulting firm
  - Objectives: evaluating the effectiveness of the program developed by the consulting firm; understanding participants' perceptions of the briefing sessions; understanding the companies that have participated in the briefing sessions
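The measurement levels above determine which summary statistics are valid. A minimal sketch with hypothetical response data, using Python's standard `statistics` module:

```python
import statistics

# Nominal (e.g. mode of transportation): only the mode and
# percentages are meaningful.
transport = ["car", "bus", "car", "bike", "car", "bus"]
print(statistics.mode(transport))               # most common category
print(transport.count("car") / len(transport))  # share of "car" responses

# Ordinal (e.g. a 5-point rating scale): the median can be calculated.
ratings = [4, 5, 3, 4, 2, 4, 5]
print(statistics.median(ratings))

# Ratio (e.g. age): all statistical measures are allowed.
ages = [34, 61, 47, 29, 73]
print(statistics.mean(ages), statistics.stdev(ages))
```

Computing a mean of nominal codes (or, strictly, of ordinal ratings) would run without error but produce a number with no valid interpretation, which is why matching the statistic to the measurement level matters.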
Evaluating the questionnaire

Objectives
- Ensuring questionnaire quality: clarity, completeness, accuracy, etc.
- Estimating time and effort

Methods
- Expert reviews: testing the questionnaire with researchers who have knowledge of and experience with survey research and/or your topic
- Cognitive interviews: one-on-one interviews with potential respondents (small n); participants think aloud as they fill out the questionnaire
- Pilot surveys: small-scale launch (larger n than cognitive interviews), using the procedures and designs of the main study

Research example
- Topic: describing factors that affect older adults' technology adoption and use decisions, and comparing results with other age groups
- Cognitive interviews (n=6) revealed problems:
  - Wording and description. Original: "Conceptual fit: The degree to which a technology's symbols and languages match the words that I normally use is important". Revised: "Conceptual fit: It's important for me to feel comfortable with the labels and words used in the technology"
  - Response options. Original: 7-point Likert scale. Revised: added "don't know / not sure"
  - General direction. Original: "Please indicate how much you agree or disagree with the following statements." Revised: added "Remember to think about your own experiences as you answer these questions."
  - Response options. Original: technology types, a list of nine types. Revised: added home appliances
Evaluating the questionnaire

Research example (continued)
- Pilot survey (n=39)
  - Gender and age balance, as desired for the main survey
  - Used the online form designed for the main survey, with a comments section added
- Problems found and revisions:
  - Wording and directions. Original: "technology adoption" was misunderstood ("I thought it had to do with adopting children and using Web sources for information"). Revised: retitled as "Survey on User Perceptions and Experiences around Purchase and Use of New Technologies"
  - Wording and directions. Original: marital status ("I am widowed and remarried but it's a radio box"). Revised: rephrased to be answered only according to the current status
  - Layout and design. Original: small text size, no status bar. Revised: used a different survey design tool with more design options
  - Groups and sections. Original: a list of questions repeated for three different decision stages. Revised: one question asked three times for the different stages, followed by the next question repeated three times
  - Question contents. Original: examples in addition to brief descriptions of various technologies. Revised: removed examples, as answers were anchored to them rather than to the descriptions

Maintaining data quality

Monitoring and screening
- Issues: straight-lining, random answers, lying
- Insert screeners and attention filters, and/or remove respondents who do not qualify
  (Attention filter example: Q. Regardless of how you feel, please choose "very positive" for this question. a. Very positive b. Positive c. Neutral d. Negative e. Very negative)

Data collection environment
- Consistency and representativeness
- Consider the conditions that may affect responses (e.g. weekday vs. weekend, weather, cultural differences)
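The screening step above can be automated. A minimal sketch with hypothetical response data: flag respondents who fail the attention filter or who straight-line (give the identical rating to every item in a scale battery), and keep only the rest.

```python
def failed_attention(response, expected="very positive"):
    """The attention-filter question asks everyone to choose 'very positive'."""
    return response != expected

def straight_lined(ratings):
    """True if every rating in the battery is identical."""
    return len(set(ratings)) == 1

# Hypothetical respondents for illustration.
respondents = [
    {"id": 1, "attention": "very positive", "ratings": [4, 5, 3, 4, 2]},
    {"id": 2, "attention": "neutral",       "ratings": [4, 4, 3, 5, 4]},  # failed filter
    {"id": 3, "attention": "very positive", "ratings": [3, 3, 3, 3, 3]},  # straight-liner
]

clean = [r for r in respondents
         if not failed_attention(r["attention"])
         and not straight_lined(r["ratings"])]
print([r["id"] for r in clean])  # only respondent 1 survives screening
```

In practice you would apply a less blunt straight-lining rule (e.g. flag only long batteries, or combine it with completion-time checks), since identical answers on a short battery can be legitimate.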
The survey sample

Sampling and recruiting
- Define the population: who has the answers to your questions?
- Find your sample: where are they? How can you reach them?
- Probability vs. non-probability sampling
- Self-selection bias: people who choose to participate may be different from the rest of the population

Response rate
- May be affected by how respondents were selected and contacted
- Incentives and personalized messages may increase the response rate
- Survey fatigue may negatively impact the response rate (and quality)
- Example: a total of 39 people completed my $10/15-min pilot survey, out of 90 who were contacted (43.3%)

Paper vs. online

Web-based
- Easier to distribute
- Enables continuous data monitoring
- Enables use of advanced survey logic (e.g. conditional questions)
- Easier to ensure anonymity
- Easier to collect, manage and process data
- Cheaper and less time-consuming
- Higher response rate

Paper
- When location, environment and place matter
- When surveying a technology-illiterate population
- Easier to control visual design and keep it consistent
- In general, longer questions and more response options are tolerable
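The response-rate arithmetic from the pilot example above (39 completes out of 90 contacted) is simply:

```python
# Response rate = completed surveys / people contacted.
contacted = 90
completed = 39
response_rate = completed / contacted
print(f"{response_rate:.1%}")  # 43.3%
```

Note that the denominator matters: contacted, eligible, and started-but-abandoned respondents give different (and differently interpretable) rates, so report which one you used.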
IRB approval

Committee on the Use of Humans as Experimental Subjects (COUHES, https://couhes.mit.edu/)
- Survey research conducted at MIT should be approved by COUHES
- Researchers must complete a web-based training course: the University of Miami CITI program
- Many survey studies qualify for exempt status and expedited review

Submitting a survey research proposal to COUHES
- Specify the desired sample size (the maximum that you will need)
- Describe compensation
- Describe any plans for follow-ups
- Attach a copy of the questionnaire
- Attach a copy of the participant invitation
- Submit amendment forms if you change anything

Information to participants
- Describe the purpose of the survey
- Include information on the length of the survey
- Ensure that responses are confidential (and anonymous, if applicable)
- Explain how data will be used and kept
- Ensure that participation is voluntary
- Provide contact info
- Describe open and close dates, if applicable
- Thank them in advance
Resources
- Online Survey Design Guide, University of Maryland: http://lap.umd.edu/survey_design/index.html
- Initiative on Survey Methodology, Duke University: http://dism.ssri.duke.edu/resources.php
- Survey Research Writing Guide, Colorado State University: http://writing.colostate.edu/guides/guide.cfm
- Qualtrics: http://www.qualtrics.com/
- Ask me! chaiwoo@mit.edu, E40-287