Designing an Effective Questionnaire
Laura Colosi

Where to Begin

Questionnaires are the most commonly used method for collecting information from program participants when evaluating educational and extension programs. When designing a questionnaire, it is important to keep some key points in mind in order to best meet the evaluation needs of a specific program or intervention. First, what type of information is needed, both to capture the important objectives of your program and to fulfill the purpose of your evaluation? Second, what type of question(s) and response(s) will best capture the information you seek? Finally, in what format should the questionnaire be designed to make it user-friendly and to capture the breadth of information needed to measure a program's impact?

Kinds of Information

The first question to ask is: What type of information do I need to collect? The answer depends on the goals and content of your program. For example, if you have a program that teaches parents age-appropriate discipline for children, you will need to design a questionnaire that collects information on the basic content areas covered by this curriculum. A good way to start is to go through your curriculum and make a list of the key concepts or skills that parents should learn in each section. Then, you can design questions to capture whether parents learned new skills, changed their attitudes, and/or ultimately changed their parenting practices in each of the areas covered by your program.

Frequently, survey questions capture information in several categories, including, but not limited to, knowledge, attitudes, and behavior.

Knowledge refers to what participants understand about program content, ideas, or concepts. When capturing this information, it is usually best to offer response choices that are factual in nature to measure content comprehension. For example, "It is true that..."

Attitudes can include participants' perceptions, feelings, or judgments, collected in an effort to assess participant views on a subject or on the program itself. For example, "The biggest challenge parents face is disciplining their child(ren)."

Behavior can include what people do, will do, or have done in areas related to the topic of a program. For example, in a parent program about appropriate discipline, one behavior to measure is the frequency of yelling at home. Behavior change is often hard to capture, especially if the duration of a program or intervention is short. Accurate assessment of behavior change frequently requires a pre- and post-test and, ideally, some follow-up data after the program has been completed. In essence, behavioral changes illustrate the translation of knowledge and attitudinal changes into skill sets and actions, which can take some time depending on the area of intervention.

Types of Questions

Once you have identified the type of information you want to collect, based on the content areas and intended outcomes of your program, the next step is to draft questions for each piece of relevant information you hope to capture with your questionnaire.

Open-ended questions are those that do not place restrictions on the answers respondents can provide. One such example is, "What are the biggest challenges you face when disciplining your child?" This question allows participants to answer in their own words.
As a result, open-ended questions yield more varied responses than closed-ended questions, and may highlight responses that evaluators could not have anticipated. However, the information yielded by open-ended questions takes much longer to read through and to code in order to identify common responses, which can slow down the reporting of results. As such, one must weigh the importance of free expression of responses against the time required to get useful information from each question.
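To illustrate why coding open-ended responses takes effort, here is a minimal Python sketch that sorts free-text answers into themes using keyword lists. The themes, keywords, and sample answers are all invented for the example; in practice, evaluators typically develop themes by reading the answers and code them by hand rather than by keyword matching.

```python
from collections import Counter

# Hypothetical themes and keyword lists, invented for this sketch.
THEMES = {
    "child will not listen": ["listen", "ignores", "ignore"],
    "losing temper": ["temper", "yell", "angry"],
}

def code_response(text):
    """Assign a free-text answer to the first matching theme, else 'other'."""
    lowered = text.lower()
    for theme, keywords in THEMES.items():
        if any(word in lowered for word in keywords):
            return theme
    return "other"

answers = [
    "Getting my son to listen when I ask him to do something",
    "I lose my temper and end up yelling",
    "Bedtime is a constant struggle",
]

print(Counter(code_response(a) for a in answers))
# Counter({'child will not listen': 1, 'losing temper': 1, 'other': 1})
```

Even in this toy version, someone must first read enough responses to invent the themes and keywords, and the "other" bucket shows how easily unanticipated answers escape any pre-built scheme.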
In addition, because open-ended questions capture only one person's experience on a given question, it is difficult to report results for the entire group. This is due to the many possible interpretations of each respondent's views, and the inability to state with certainty that each participant's interpretation of both the question and their response is consistent. However, open-ended data can be useful when exploring the range of possible responses to a question or topic, especially when pilot testing a program prior to its implementation.

Another type of question is the closed-ended question, in which respondents must choose among specific response options. For example, when asking "What are the biggest challenges you face when disciplining your child?" a closed-ended question may offer the following choice of answers: 1) getting my child to listen; 2) losing my temper; or 3) there are no challenges to disciplining my child. There are many possible ways to structure responses to closed-ended questions, including forced choices, agree/disagree items, and Likert scales (discussed below).

An advantage of closed-ended questions is that carefully chosen response options give all participants the same frame of reference when choosing an answer. The answers to a closed-ended question are pre-determined; as a result, they are both more specific than open-ended responses and more likely to promote consistency among respondents in understanding both the question and the responses. For example, Converse and Presser (1986) provide a classic scenario: consider the open-ended question, "People look for different things in a job. What would you most prefer in a job?" They found that the most common answer was "pay." However, upon further analysis it appeared that some respondents meant high pay and some meant steady pay. Because the wording of the answers was not specific enough, it was not possible to know which person meant which. Contrast this with a closed-ended question, "People look for different things in a job. Which one of the five following things would you most prefer in a job?" that offers the following response choices: 1. work that pays well; 2. work that I am proud of; 3. autonomy; 4. friendly co-workers; or 5. steady pay/job security. Because the respondents who meant steady pay chose "steady pay/job security" and those who meant high pay could choose "work that pays well," the distinction between them was apparent from the answer choices available. As a result, areas in which an answer to an open-ended question can be unclear are easily resolved by using a closed-ended question with greater specificity in its response choices.

It is the greater specificity and consistency yielded by closed-ended questions that allows one to generalize results across respondents. In addition, the use of closed-ended choices allows for a timelier and more systematic analysis of the data collected: one only needs to compute the frequency of responses for each question, or means on a Likert scale, to gauge the impact of the program itself. The disadvantage is that there is no room for participants to express responses in their own words, as they must use only pre-determined choices to answer a question. When deciding whether to use open- or closed-ended questions, it is important to think about your goals.
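As a minimal sketch of the kind of tallying that closed-ended data allows, using only Python's standard library (the question and response options are the hypothetical discipline example from above):

```python
from collections import Counter

# Closed-ended responses come from a fixed set of options,
# so group-level reporting reduces to a simple frequency count.
responses = [
    "getting my child to listen",
    "losing my temper",
    "getting my child to listen",
    "there are no challenges",
]

counts = Counter(responses)
total = len(responses)
for choice, n in counts.most_common():
    print(f"{choice}: {n} ({n / total:.0%})")
# getting my child to listen: 2 (50%)
# losing my temper: 1 (25%)
# there are no challenges: 1 (25%)
```

Because every answer is one of the pre-determined choices, no reading or interpretive coding is needed before reporting.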
A general rule of thumb to follow: if an educator knows the specific information needed to answer a question and requires a single frame of reference among respondents, closed-ended responses are preferred (Converse and Presser, 1986, p. 33). If, however, an educator is not sure what the range of possible responses to a question is, and hopes to conduct a preliminary exploration of a topic, open-ended questions will work better. Note also that previous research¹ shows that respondents are more willing to offer sensitive information on a survey using an open-ended response.

Common Pitfalls in Designing Questions

Leading questions: Regardless of whether a question is open- or closed-ended, it is important that it not lead a participant to your preferred response. An example of a leading question is, "Do you believe it is okay to spank young children despite the AAP recommendations not to do so?" in which it is clear that the acceptable answer is "no."

Double-barreled questions: Make sure each question addresses only one issue. For example, do NOT ask, "Did this program teach you how to discipline your child and manage your home finances?" It is possible that the program was successful in one domain but not the other, and this question does not allow for that distinction to be made.

¹ Converse and Presser, 1986; Taylor-Powell, 1998; and Bradburn et al., 2004.
Unclear or ambiguous questions: Each question needs to be written very clearly. For example, asking "Do you think children require strict discipline?" requires each respondent to answer based on their own definition of "strict." In contrast, asking "Is it appropriate to spank a child who breaks your rules?" is much more specific. It is also important to provide as much information as possible to make sure that each question is clear and interpreted the same way by everyone. For example, "Do you think youth should be spanked?" is not as clear as "Do you think it is appropriate to spank a child between ___ and ___ years old?"

Invasive/personal questions: Questions about personal information are difficult to phrase in a non-intrusive way. One possible solution is to provide broad categories of responses from which to choose. For example, a participant may be more comfortable reporting his or her income as part of a category (e.g., $20,000-$30,000) rather than as a specific number.

It is helpful to re-read your questionnaire to see each question through the survey respondent's eyes, asking whether each question is: too personal; judgmental; ambiguous or unclear; leading; or capturing more than one idea.

Finally, it is very important to collect demographic data (such as age, sex, educational level, or race) from participants when possible, preferably at the end of the survey instrument. These data allow educators to better understand their audience, and also allow researchers to examine whether a program may affect different groups in different ways. For example, one program may have better outcomes for women than for men, or for older versus younger parents. This ability to correlate outcomes and participant characteristics allows for the identification of shortcomings or strengths in a given curriculum for certain audiences, and thus can guide any needed modifications of the curriculum.

Choosing Response Options

When using closed-ended questions, you will need to carefully consider what response options to provide to respondents. Generally, responses should be a set of mutually exclusive categorical choices. For example, when asking about income, a range of choices may be presented such as: (1) less than $20,000 a year; (2) $21,000-$40,000 a year; (3) $41,000-$60,000 a year; or (4) more than $61,000 a year. These types of categories remove the possibility that more than one answer may apply to a participant. Note also that Bradburn (2004) cautions against the use of categories such as (1) never, (2) rarely, (3) occasionally, (4) fairly often, and (5) often, as these types of choices can confuse and annoy respondents, and because a researcher has no way of knowing what "often" means to each participant. It is very important that the categories chosen as responses capture all possible choices, but are also carefully worded to avoid ambiguity of interpretation, reducing the likelihood of invalid responses on your questionnaire.
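A minimal sketch of what mutually exclusive categories buy you at analysis time, using the income brackets above (the helper function and sample incomes are invented for illustration; note that the printed brackets leave a small gap between $20,000 and $21,000, which the sketch closes by treating each upper bound as inclusive):

```python
def income_category(income):
    """Map a reported annual income to exactly one of the example brackets."""
    if income < 20_000:
        return "less than $20,000 a year"
    elif income <= 40_000:
        return "$21,000-$40,000 a year"  # label mirrors the brief's example
    elif income <= 60_000:
        return "$41,000-$60,000 a year"
    else:
        return "more than $61,000 a year"

# Mutually exclusive categories mean each respondent lands in exactly one
# bucket, so group counts sum to the number of respondents.
for income in (18_500, 35_000, 52_000, 75_000):
    print(income, "->", income_category(income))
```

Because the branches are checked in order and each return is exclusive, no income can fall into two buckets, which is exactly the property the response options themselves should have.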
Another popular way to structure response choices is to use a Likert scale, which asks respondents to rate each item on a response scale, for instance 1-to-3, 1-to-4, 1-to-5, 1-to-6, or 1-to-7. In general, a 5-point Likert scale is an appropriate choice and would include the options: strongly agree, agree, neutral, disagree, and strongly disagree. Using this type of scale allows you to pose many questions as statements, such as "Children should be yelled at as infrequently as possible," with participants choosing how strongly they agree or disagree with each statement. For example:

Circle the number that corresponds to your feelings about each statement.

1. Children should be yelled at as infrequently as possible.
   Strongly agree / Agree / No opinion / Disagree / Strongly disagree

2. Reading books with a child under age 3 is important to foster early literacy skills.
   Strongly agree / Agree / No opinion / Disagree / Strongly disagree

One primary reason to use a Likert scale is that the data are easy to code and report: simply assign codes to the responses (for example, strongly agree = 5, agree = 4, neutral = 3, disagree = 2, and strongly disagree = 1) so that a higher score reflects a higher level of agreement with each item. This is important because, after you enter the individual scores, you can easily calculate an average or mean score for the whole group on each survey question. When higher values are assigned to stronger agreement, higher mean scores for a question translate into higher levels of agreement with that item, and lower scores reflect participants' disagreement with the item asked.²

² For more information, see Measuring Evaluation Results with Microsoft Excel, available at RED3.pdf
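A minimal sketch of this coding-and-averaging step in Python (the response labels and codes follow the example above; the item names and sample responses are invented):

```python
from statistics import mean

# Codes follow the example above: a higher score = stronger agreement.
CODES = {"strongly agree": 5, "agree": 4, "neutral": 3,
         "disagree": 2, "strongly disagree": 1}

# Hypothetical responses from four participants to the two example statements.
responses = {
    "yelling statement": ["strongly agree", "agree", "agree", "neutral"],
    "reading statement": ["strongly agree", "agree", "agree", "disagree"],
}

for item, answers in responses.items():
    scores = [CODES[a] for a in answers]  # e.g. 5, 4, 4, 3
    print(f"{item}: group mean = {mean(scores):.2f}")
# yelling statement: group mean = 4.00
# reading statement: group mean = 3.75
```

The same mapping-then-averaging can be done just as well in a spreadsheet, as the Excel tutorial cited in the footnote describes.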
Questionnaire Format

The layout of a questionnaire determines how easy it is for respondents to read, understand, and answer each question you ask. As such, questionnaire format heavily influences the quality of the data you collect. Specific things to consider in design include general appearance, typeface, blank space, order of questions, and the placement and clarity of instructions. It is also recommended that, if time permits, the instrument be piloted before implementation to identify any problem areas in both the format and the content of the questionnaire. Here are some general tips³ on format:

1. Start with a clear introduction that explains the purpose of the questionnaire and how the information collected will be used (e.g., to improve the program), and that assures respondents their personal information will remain confidential. (Alternatively, you could number instruments and remove names altogether; you must, however, match the pre- and post-instruments by participant number if using that design.)
2. Use an easy-to-read typeface, and allow for some blank space between questions.
3. Do not break question text or instructions across pages; keep all text together for each question.
4. Use italics or bold for instructions to distinguish them from the question itself.
5. Arrange the answers vertically under each question. For example:
   How often do you read to your elementary-school-aged child?
      Every day
      4-6 times a week
      1-3 times a week
      Less than once a week
6. If needed, place any explanatory text or definitions in parentheses immediately after the question.
7. The first questions should be easy to read, cover the more important topics of interest, and, if possible, be closed-ended.
8. Start with more general questions and move toward greater specificity at the end of the questionnaire.
9. Pay attention to the flow of questions; the instrument should be logical and easy to follow. If possible, keep questions on similar topics close together. Check and re-check all instructions and/or skip patterns.
10. When switching to a different topic area, use a transitional statement to help guide the respondent through the questionnaire.

³ For more detail, see the reference by Ellen Taylor-Powell (1998) cited at the end of this brief.

Note also that asking too many questions is burdensome both to the program's participants and to the staff who must analyze and report the evaluative data. As a result, it is important that each question included gathers essential information (Taylor-Powell, 1998). Remember also to think about the educator's time constraints for implementing the questionnaire, as well as any variation in literacy levels or language barriers among participants that may affect the educator's ability to administer the survey.

Summary

When writing questions for an evaluative survey, it is important to recall the objectives of the program delivered and the information required to measure success in meeting those objectives. Educators also need to focus on the type of information needed to assess program impact, deciding whether to include knowledge, attitudes, skills, and/or behavior.
Good survey design also includes attention to the audience completing the questions (e.g., literacy, language) and to the purpose of each question (outcome data, qualitative feedback, satisfaction with the program, and/or audience demographics). The flow of the entire questionnaire is also critical, and educators need to ensure that each question is clear and that the directions are equally easy to understand. Most importantly, remember that the quality of the information gathered from a survey instrument depends on the clarity of each question asked. This reinforces the importance of thoughtful questionnaire design to capture and report the true experiences of, and changes in, program participants.
Steps to Developing an Effective Questionnaire

1. Decide what information you need to collect for your program.
2. Search for previous evaluative efforts on programs similar to yours to see what others may have done (and review their questionnaires if possible, but be sure to obtain permission if using a questionnaire or items developed by someone else).
3. Draft your questions, or modify questions on an existing survey to fit your own program needs.
4. Place the questions in logical order.
5. Re-read the entire questionnaire and add specific instructions, transitions, and any clarifying information for questions or responses (in parentheses where applicable).
6. Focus on the format of the questionnaire, with attention to layout, readability, time demands on respondents, and the logic and clarity of content.
7. If time allows, have a few co-workers complete the questionnaire to identify problem areas in your survey.
8. Revise the instrument as needed based on the feedback provided.
9. Prepare educators on the protocol for implementing the questionnaire.
10. You are ready to go!
References

Bradburn, N., Sudman, S., and Wansink, B. (2004). Asking Questions. San Francisco, CA: Jossey-Bass.

Converse, J., and Presser, S. (1986). Survey Questions: Handcrafting the Standardized Questionnaire. Newbury Park, CA: Sage.

Schwarz, N., and Oyserman, D. (2001). Asking Questions about Behavior: Cognition, Communication, and Questionnaire Construction. American Journal of Evaluation, 22(2).

Taylor-Powell, E. (1998). Questionnaire Design: Asking Questions with a Purpose. University of Wisconsin-Extension.

For more information about the Parenting in Context project at Cornell, visit our website at:

Laura Colosi is an Extension Associate in the Department of Policy Analysis and Management at Cornell University. Cornell Cooperative Extension.