Survey Research: Designing an Instrument. Ann Skinner, MSW Johns Hopkins University


This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike License. Your use of this material constitutes acceptance of that license and the conditions of use of materials on this site. Copyright 2007, The Johns Hopkins University and Ann Skinner. All rights reserved. Use of these materials permitted only in accordance with license rights granted. Materials provided "AS IS"; no representations or warranties provided. User assumes all responsibility for use, and all liability related thereto, and must independently review all materials for accuracy and efficacy. May contain materials owned by others. User is responsible for obtaining permissions for use from third parties as needed.


Section A Choice of Method

Characteristics of a Survey
- It applies systematic methodology
- It involves obtaining information directly from individuals
- It involves selecting a subgroup from a larger group
- It is done for the purpose of description, exploration, or explanation

Tailored Survey Design
Sampling issues
- Whether to use probability sampling
- The sampling frame
- Size of the sample
- Sampling design/strategy
- Expected response rate
Question design issues
- Reliability and validity of items
- Pre-testing or pilot work
Interviewing issues
- Selection of interviewers
- Supervision
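Two of the sampling bullets above, size of the sample and expected response rate, can be tied together with a quick worked calculation. The sketch below uses the standard normal-approximation formula for estimating a single proportion, n = z^2 * p(1-p) / e^2; the 5% margin of error, 95% confidence level, and 60% response rate are illustrative assumptions, not figures from the lecture.

```python
from math import ceil
from statistics import NormalDist

def sample_size_for_proportion(margin_of_error: float,
                               expected_proportion: float = 0.5,
                               confidence: float = 0.95) -> int:
    """Normal-approximation sample size for estimating a single proportion."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # about 1.96 for 95% confidence
    p = expected_proportion
    n = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    return ceil(n)

# Example: estimate a prevalence to within +/- 5 percentage points at 95% confidence.
print(sample_size_for_proportion(0.05))                    # -> 385 completed questionnaires
# Inflating for an assumed 60% response rate (illustrative only):
print(ceil(sample_size_for_proportion(0.05) / 0.60))       # -> 642 people to approach
```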

Questionnaire Design
Quantitative surveys in health services research

Traditional Survey Model
A carefully standardized physical stimulus (i.e., a question) passes from the interviewer to the respondent, and a response (i.e., an answer) comes back expressed in terms of a standardized format provided by the researcher.

Symbolic Interactionist View
- The interviewer encodes the question, shaped by his or her own purposes and perceptions of the respondent
- The respondent decodes the question, shaped by his or her own purposes and perceptions of the interviewer
- The respondent encodes the answer, shaped by his or her own presumptions and perceptions of the interviewer
- The interviewer decodes the answer, shaped by his or her own presumptions and perceptions of the respondent

Methods of Data Collection
Self-administered
- Individually
- By mail
- In groups
- Internet or email
Interviewer-administered
- By telephone
- In person

Methods of Data Collection: Combinations
- Self-administered with interviewer instructions
- Mail with telephone follow-up
- Interviewer-administered with an embedded self-administered section

Factors in Choice of Method
Characteristics of the study population
- Literacy
- Physical and mental abilities
- Motivation

Factors in Choice of Method
Access to the sample
- Location
- Time available for data collection
- Infrastructure available (telephones, mail service, internet access)

Factors in Choice of Method
Availability of information about the study population
- Telephone numbers
- Completeness of addresses
- Tracing information

Factors in Choice of Method
Survey objectives
- Complexity of questioning
- Difficulty of the reporting task
- Topic sensitivity

Factors in Choice of Method
Question forms to be used
Open-ended
- Fill in a number
- Write in text
Closed-ended
- Multiple choice
- Scalar (Likert-type)
- Dichotomous (yes/no)
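The question forms listed above also imply different data structures for the eventual dataset: closed-ended items carry their response categories with them, while open-ended items do not. A purely illustrative sketch (the class names and example wording are hypothetical, not from the lecture):

```python
from dataclasses import dataclass, field

@dataclass
class OpenEndedItem:             # hypothetical: answer is free text or a fill-in number
    text: str

@dataclass
class ClosedEndedItem:           # hypothetical: multiple choice or dichotomous (yes/no)
    text: str
    choices: list[str]

@dataclass
class LikertItem:                # hypothetical: scalar (Likert-type) item
    text: str
    scale: list[str] = field(default_factory=lambda: [
        "Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"])

seat_belt = ClosedEndedItem(
    text="When you are riding in the back seat of a car, how often do you wear a seat belt?",
    choices=["All of the time", "Most of the time", "Some of the time",
             "Once in a while", "Never"])
```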

Factors in Choice of Method
Expected response rates. Example (assuming a group with interest in the topic):
- Mail survey with no follow-up: 30%
- Mail survey with mail follow-up: 50%
- Mail survey with telephone follow-up: 60% to 80%
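Read in reverse, those response rates determine how many questionnaires must be sent out to end up with a target number of completed ones. A minimal sketch using the slide's figures (the target of 400 completes and the 70% midpoint for the telephone follow-up range are illustrative assumptions):

```python
from math import ceil

# Approximate response rates quoted on the slide for an interested group.
response_rates = {
    "mail, no follow-up": 0.30,
    "mail, mail follow-up": 0.50,
    "mail, telephone follow-up": 0.70,  # assumed midpoint of the 60-80% range
}

target_completes = 400  # illustrative target, not from the lecture

for strategy, rate in response_rates.items():
    mailed = ceil(target_completes / rate)
    print(f"{strategy}: mail about {mailed} questionnaires")
# mail, no follow-up: mail about 1334 questionnaires
# mail, mail follow-up: mail about 800 questionnaires
# mail, telephone follow-up: mail about 572 questionnaires
```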

Section B Guidelines for Writing Questions

Guidelines for Writing Questions
- Restrain the impulse to write specific questions until you have thought through your research questions
- Write down your research questions and keep them handy while you are working on the questionnaire
- Every time you write a question, ask yourself: Why do I want to know this? How will it help answer a research question?

Open-Ended or Closed-Ended Questions?
Type of information sought
- Facts
- Opinions or attitudes
- Exploratory
Complexity of the information or difficulty of the reporting task

Open-Ended or Closed-Ended Questions?
Feasibility
- Range of possible answers
- Coding capabilities
- Sample size
- Data collection method

Advantages of Open-Ended Items
- Can get unanticipated answers
- May describe the respondent's real views better
- Respondents can answer in their own words
- Appropriate when the range of possible answers is long

Advantages of Closed-Ended Items
- The task of answering is easier
- Interpretation of the answer is easier
- Avoids rare answers

Scales, Indices, and Questionnaires
Seek out appropriate scales, indices, and questionnaires that have been used and tested by others. Examples:
- Health status scales
- Quality of life
- Mental health status
- Health services utilization
- Satisfaction ratings

Scales, Indices, and Questionnaires
- Several can be combined in one questionnaire
- Be careful about copyrighted instruments
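When several scale items are combined into a single index, one common reliability check, not covered on the slide but closely related to the earlier point about reliability and validity of items, is Cronbach's alpha. A minimal sketch with made-up Likert responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 Likert responses: 6 respondents x 4 satisfaction items.
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
    [4, 4, 5, 4],
])
print(round(cronbach_alpha(scores), 2))  # -> about 0.96 for these made-up, highly consistent items
```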

Writing Questions
- Fully scripted, so that, as written, the question prepares the respondent to answer
- The question means the same thing to every respondent
- The respondent understands what an appropriate answer should be

Common Pitfalls: Fuzzy Wording
Vague and general questions produce vague and general answers.
Bad: "What do you like best about this neighborhood? We're interested in anything, like houses, the people, the parks, or whatever."

Poor Question Organization
Bad: "I would like you to rate different features of your neighborhood as very good, good, fair, or poor. Please think carefully about each item as I read it. Public schools. Parks. Other."

Poor Question Organization
Better: "I am going to ask you to rate different features of your neighborhood. I want you to think carefully about your answers. How would you rate the public schools? Would you say very good, good, fair, or poor? How would you rate the parks? Would you say very good, good, fair, or poor?"

Difficult Words
Difficult words risk miscommunication with the respondent.
Example: "Do you think TV programs are impartial about politics?"

Of 56 Respondents...
- 26 interpreted "impartial" correctly
- 10 overlooked the word altogether
- 9 thought it meant tending to spend too much time on politics
- 5 thought it meant unfair or biased
- 2 thought it meant giving too little time to politics
- 7 had no idea

Negatives and Sneaky Double Negatives
Bad: "What is your view about the statement that conservationists should not be so uncooperative with the government?"

Negatives and Sneaky Double Negatives
Better: "What is your view about the statement that conservationists should be cooperative with the government?"

Asking Two or More Questions at Once
Bad: "When riding in the back seat of a car, do you wear a seat belt all of the time, most of the time, some of the time, once in a while, or never?"

Backseat Belt Question (figure not transcribed)

Asking Two or More Questions at Once
Better: "In the past year, have you ridden in the back seat of a car?" followed by "When you are riding in the back seat of a car, do you wear a seat belt all of the time, most of the time, some of the time, once in a while, or not at all?"

Backseat Belt Question (figure not transcribed)

Section C Questionnaire Construction and Critique

Putting the Questionnaire Together
Item order
- Self-administered: capture the respondent's interest
- Interviewer-administered: allow for practice with the question/response format (information the interviewer needs to know)

Putting the Questionnaire Together
Ideal length
- Self-administered: 15 to 20 minutes
- Interviewer-administered: 20 to 30 minutes
Testing
- Read it aloud to yourself
- Read it aloud to co-workers and friends
- Ask two to three others to fill it out themselves

Systematic Pilot Testing
- Small sample (10 to 15)
- Similar to the study population
- Use all study procedures
- Discussion of problem questions

Standard 10-Step Critique
1. Use simple, unambiguous language that can be understood in the same way by all respondents
2. Avoid long and complex sentences
3. Avoid hypothetical questions
4. Avoid double-barreled questions (asking two questions at once, and questions that include assumptions)

Standard 10-Step Critique (continued)
5. Do not ask questions that ask respondents for information they do not have
6. Avoid questions that ask about causality
7. The time frame referred to in the question should be unambiguous and explicit

Standard 10-Step Critique (continued)
8. For fixed-response questions, response categories must be exhaustive and mutually exclusive
9. Make sure the context of the question does not inappropriately affect its meaning
10. Define terms as needed (complex definitions and instructions should be given in a preamble or introduction, not in the question itself)