Data Collection for Program Evaluation. By Dornell Pete & Kevin English Albuquerque Area Southwest Tribal Epidemiology Center

Presentation Objectives: Making the Connection between Program Evaluation and Data; Data Collection Methods; Data Collection Tools; Data Collection Quality Control; Data Analyses

Making the connection between Program Evaluation and Data Community

Roadmap: Where are we going? How will we get there? What will show that we've arrived? How will you define success?

Making the Data Connection. Process measures cover inputs (program plans, staff/expertise, curriculum) and outputs (activities, participation); outcome measures cover outcomes: learning (immediate term), action (intermediate term), and impacts (long term). Example: the number of dietitians hired to develop and implement a cooking class is an input. The number of youth 13-17 years old who participated in a cooking class is an output. The percentage of youth who are able to prepare healthier meals is an outcome.

Process/Outcomes Measures. Process measures focus on what activities were implemented, the quality of the implementation, and the strengths and weaknesses of the implementation. Outcomes measures assess the immediate, intermediate, and long-term effects of your program on the target population.

Why include Process Measures? They help you determine whether your program is operating as intended. They focus on: what activities were implemented, the quality of the implementation, and the strengths and weaknesses of the implementation.

Sample Process Measures: number of workshops held; meetings held; materials produced & distributed; program participation rates; participant satisfaction; # of assessments conducted; # of health education activities completed; others??

Outcomes Measures. Determine whether the program contributed to beneficial effects on your specific target population. Assess the immediate, intermediate, and long-term effects of your program on the target population.

Sample Outcomes Measures: changes in knowledge, attitudes & beliefs; behavior change (nutrition, physical activity); improvement in clinical measures (BMI, blood pressure/cholesterol, HbA1c); changes in services & systems (e.g., new policies & protocols); others??

Example: indicators by stage.
Participation: number & characteristics of people reached; frequency of contact. Data indicators: # staff, # partners, # sessions held; #, % attended per session.
Reactions: degree of satisfaction with program; level of interest; feelings toward activities and methods.
Learning: changes in knowledge, attitudes, skills. Data indicators: #, % demonstrating increased knowledge/skill; additional outcomes.
Actions: changes in behaviors and practices. Data indicators: #, % demonstrating changes; types of changes.
Impacts: net effects, final consequences; to what extent can changes be attributed to the program? Data indicators: #, % demonstrating improvements; types of improvements.
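A participation indicator such as "#, % attended per session" is simple to compute. A minimal sketch in Python, using made-up sign-in counts (session names and numbers are illustrative, not from the program):

```python
enrolled = 20  # hypothetical number of enrolled participants
attended = {"Session 1": 18, "Session 2": 15, "Session 3": 17}  # sign-in counts

for session, count in attended.items():
    pct = 100 * count / enrolled  # percentage of enrollees attending this session
    print(f"{session}: {count}/{enrolled} ({pct:.0f}% attended)")
```

The same pattern extends to any count-based indicator: keep the raw count and report it alongside the percentage, so low-enrollment sessions are not misread.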

Approaches to Collecting Data: a quantitative-to-qualitative continuum. Quantitative: survey/questionnaire, pre-/post-tests. Mixed methods: interviews, focus groups. Qualitative: community forum, Photovoice, digital storytelling.

What data collection method shall I use? There is no simple answer. There is no ONE best method. It all depends.

Survey/Questionnaire. Definition: a way of collecting information that represents the views of the community or the group in which you are interested.

Survey/Questionnaire When? You are starting or evaluating a project or activity; you need a quick and efficient way of getting information; you need to reach a large number of people; the information you need isn't readily available through other means; you need statistically valid information about a large number of people.

Steps for Conducting Survey/Questionnaire 1. Purpose 2. Audience/target population 3. Select method: a. Face-to-face b. Mail c. Telephone d. Internet e. Email 4. Write your questions 5. Pilot test and refine 6. Administer 7. Analyze the data and plan for action

Survey/Questionnaire Design Tips. Writing effective survey questions: Remember the survey's purpose. If in doubt, throw it out. If a question can be misinterpreted, it will be. Include only one topic per question. Consider alternative ways to ask sensitive questions. Keep open-ended questions to a minimum. Avoid leading questions. Find a balance in the number of questions to include.

Survey/Questionnaire Design Tips. Plan carefully: what variables to measure, what relationships might exist, what data type(s) to use, and what analyses you will conduct: descriptive analysis (numbers/frequency, proportions, percentages) or multivariate/in-depth analysis (statistical testing).
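A minimal sketch of the descriptive-analysis step (numbers/frequency, proportions, percentages), using hypothetical responses to a single satisfaction question; the answer categories and counts are assumptions for illustration, not real survey data:

```python
from collections import Counter

# Hypothetical responses to one closed, ordinal survey question
responses = ["very satisfied", "satisfied", "satisfied",
             "dissatisfied", "very satisfied", "satisfied"]

counts = Counter(responses)  # frequency of each answer category
n = len(responses)           # total number of respondents

for answer, count in counts.most_common():
    print(f"{answer}: {count} ({100 * count / n:.1f}%)")  # count and percentage
```

For multivariate or statistical testing of a larger survey, a statistics package would take over, but the frequency/percentage summary above is usually the first pass.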

Survey/Questionnaire Design Tips. Surveys/questionnaires lend themselves best to closed questions: list (nominal), categorical (nominal), ranking (ordinal), scale (interval).

Example Data Types:
Continuous: age (44, 45, 56, 76)
Discrete: number of times a week you went running
Ordinal: very satisfied, satisfied, dissatisfied
Nominal: physical activity (run, walk, Zumba, aerobics)
Interval: After learning about Zumba, how likely are you to consider Zumba as a physical activity? Very likely - 1, 2, 3, 4, 5 - Least likely
Open: describe barriers to engaging in physical activity

Survey/Questionnaire Pilot Testing. After developing your questionnaire: ask a small sample to complete it; examine their responses to see whether they interpreted and answered the questions as intended; analyze the data; revise, or look for another pre-tested question.

Pre-/Post-Tests. Definition: tests administered to program participants (youth, parents) at the beginning and end of a program to measure knowledge. Pre-test, then curriculum/activities, then post-test.

Pre-/Post-Tests Why? Establish a base measure of knowledge and understanding of a topic. Determine if the activities or curriculum have had an impact. Understand which concepts were well taught and which ones need additional time. Quantify the extent of any changes in knowledge or understanding.

Steps for Conducting Pre-Post Tests 1. Determine what participants must know 2. Create questions that focus on the learning objectives 3. Develop tests 4. Validate tests with staff for clarity and response a) Rewrite and retest with different staff, if necessary 5. Administer the pre-test before the curriculum 6. Administer the post-test after the curriculum 7. Analyze data and make any adjustments to the curriculum

Pre-/Post-Test Question Types: open-ended (specific and understandable); True/False (simply worded and to the point); multiple-choice (develop responses that are distinct from one another); fill-in-the-blank.

Pre-/Post-Tests. Advantages: measures the value added by your program; establishes baseline knowledge of participants, which is helpful for guiding future activities. Disadvantages: hard to determine whether a positive change is due to the program or to natural maturation; floor/ceiling effects: participants who score very low can only improve, while participants who score very high can show no improvement.
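To quantify the extent of change a pre-/post-test measures, one common summary is the per-participant score change. A sketch with hypothetical quiz scores (same participants, listed in the same order; the numbers are invented):

```python
# Hypothetical knowledge-quiz scores (0-10) before and after the curriculum,
# with each position corresponding to the same participant.
pre = [4, 5, 3, 6, 4]
post = [7, 6, 5, 8, 6]

changes = [after - before for before, after in zip(pre, post)]  # per-participant change
mean_change = sum(changes) / len(changes)
improved = sum(1 for c in changes if c > 0)  # how many participants gained

print(f"Mean change: {mean_change:+.1f} points; "
      f"{improved}/{len(changes)} participants improved")
```

Pairing scores by participant (rather than comparing group averages alone) is what lets you see who improved, and flags the floor/ceiling cases noted above.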

Interviews. Definition: a structured one-on-one conversation with key stakeholders to gather in-depth information on a particular health topic, issue, or concern. Types of interview: structured, semi-structured, unstructured; standardized, non-standardized; respondent, informant.

Interviews Why? Gathering in-depth information on local obesity and nutrition health issues. Asking open-ended questions about what people think, believe, desire & feel. Gathering feedback on program activities, events, etc. Observing nonverbal & spontaneous behavior on specific topics. Developing rapport with experts, leaders & key stakeholders.

Interviews Who? Community leaders Affected community members Youth Parents Health care providers Project staff Local experts Other important stakeholders

Steps for Conducting Interviews 1. Determine what you want to know 2. Discuss and draft interview questions 3. Determine who to interview 4. Train interviewers 5. Pilot test for clarity and appropriateness 6. Contact interviewees and schedule appointments 7. Take notes and record the interview 8. Analyze data, provide feedback to interviewees, and plan for action

Interviews Common Challenges: putting people on the defensive; two-in-one questions; complex questions; question order; bias (interviewer bias, response bias).

Interviews Tips: Practice. Make small talk. Be natural. Listen. Keep your goals in mind. Avoid yes/no questions. Show respect. Keep responses confidential. Record the data.

Focus Groups Definition: a small-group discussion used to learn more about the opinions, attitudes & beliefs of a specific group about a designated topic Components: Specific discussion topic Facilitated to keep the group on course Conducted in an open & nonthreatening environment

Focus Group Why? Learn about group/community opinions, strengths and needs Get to what people are really thinking and feeling

Focus Groups When? If you're considering a new program or service; to understand attitudes, opinions, beliefs or perceptions, rather than simply whether people agree or disagree; to ask questions that can't easily be asked or answered on a written survey; to add to the knowledge you gain from written surveys.

Focus Group Steps to Planning 1. Assemble planning committee 2. Develop focus group guide 3. Find a facilitator/leader 4. Use a recorder 5. Decide about incentives 6. Decide on meeting details 7. Identify participants 8. Recruit participants

Focus Group - Results. Examine written and tape-recorded information: What patterns emerge? What are the common themes? What new questions arise? What conclusions seem realistic? Have more than one person review the data and compare interpretations and conclusions.

Focus Group Results/Findings. Honor participants' time and commitment. Visit, call or mail participants the results. Consider conducting a second gathering to review results and allow participants to get involved and plan next steps!

Photovoice Definition: a process where community members can identify and address a topic through photographs and storytelling 3 aims of Photovoice: To record and reflect upon community strengths and concerns To promote dialogue, knowledge, and reflection about individual, family and community issues through large/ small group discussions To initiate change and promote action

Photovoice Advantages: Flexible Community-owned Participatory Image-centered Action-oriented Empowerment

Steps for a Photovoice Project 1. Recruit participants 2. Conduct training 3. Obtain informed consent 4. Determine initial theme 5. Conduct camera operation training 6. Take pictures 7. Discuss photographs 8. Participatory analysis 9. Disseminate findings 10. Plan for action

Photovoice Analysis SHOWeD Questions: 1. What do you See here? 2. What is really Happening here? 3. How does this relate to Our lives? 4. Why does this situation, concern or strength Exist? 5. What can we Do about it?

Digital Storytelling. Definition: a process of using computers or mobile devices to create and tell stories that focus on a specific topic; contains a mixture of computer-based images and text with recorded audio narration. Aim: to express ideas through multiple forms of media to enhance and deepen the meaning of a topic.

Secondary Data. Your project will involve primary data collection, but you can also use secondary data. What is secondary data? Data collected by someone other than you. Examples: published/unpublished data, survey data, multiple sources.

Secondary Data. Advantages: may require fewer resources; provides comparative or contextual data. Disadvantages: may not match your needs; access may be difficult or costly; unsuitable sample size or representativeness.

Secondary Data - Examples. Local programs and organizations: Tribal Special Diabetes Programs, schools, Indian Health Service. State health departments: chronic disease programs (diabetes, obesity), Youth Risk and Resiliency Survey, Behavioral Risk Factor Surveillance System, nutrition programs. Federal agencies: IHS, Centers for Disease Control, School Health Policies and Programs. Others?

Data Collection Tools Paper Video Audio tape Tablets Mobile devices Computer Telephone

Data Collection Quality Control (QC). Develop a data QC plan to ensure data are collected, processed, validated, and reported in a standardized way. Implement training (e.g., for interviewing, transcribing, data entry). Conduct double data entry. Clean data. Conduct preliminary analyses (e.g., response rates, calculations). Always save and back up data!
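Double data entry means keying the same paper forms twice, independently, and comparing the two files; mismatches are resolved against the originals. A minimal sketch, with invented participant IDs and field names:

```python
# Two independent entries of the same paper forms, keyed by participant ID.
# IDs and field names are hypothetical.
entry_1 = {"P01": {"age": 15, "sessions": 4}, "P02": {"age": 16, "sessions": 3}}
entry_2 = {"P01": {"age": 15, "sessions": 4}, "P02": {"age": 16, "sessions": 5}}

# Any field where the two entries disagree is flagged for checking
# against the original paper form.
discrepancies = [
    (pid, field, record[field], entry_2[pid][field])
    for pid, record in entry_1.items()
    for field in record
    if record[field] != entry_2[pid][field]
]

print(discrepancies)
```

In practice the two entries would be spreadsheet or database exports rather than literals, but the comparison logic is the same.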

Data Analyses. Formative analysis: determines ways to change the program to be more focused; assesses the curriculum or activities. Summative analysis: assesses the learning or uptake of the overall program curriculum or activities. Results: data interpretations and conclusions.

Reflection. What collection method aligns with your program? How can you improve your data collection process? Best practices? Questions?

Contact Information Dornell Pete, Epidemiologist dpete@aaihb.org Kevin English, Director kenglish@aaihb.org Phone: (505) 764-0036