Evaluating Breastfeeding Programs and Initiatives




Breastfeeding in Ontario

What is evaluation?

Evaluation means different things to different people. Fundamentally, evaluation means getting feedback about a program or initiative in a deliberate, systematic way in order to help make decisions about the program: Is it effective? Is it achieving results? Where does it need troubleshooting? Programs are the most common subject of evaluation, but other types of initiatives, including resources, can be evaluated too. In this fact sheet, any program or initiative to be evaluated is referred to as a "program."

Why is it important?

Learning about evaluation can give you the knowledge, tools and confidence needed to:
- Determine how your program can be improved (quality improvement).
- Determine if your program is achieving the results it set out to achieve (accountability).
- Interpret the evaluation work of others and the lessons they have learned (shared knowledge).

How can I make an evaluation plan?

Evaluation can be complicated in some situations, and you may need specific expertise. However, you and your colleagues can learn to develop and implement a basic evaluation plan. Part of that is knowing when to get help along the way. For those with no evaluation experience, the biggest hurdles are knowing how to start and keeping the plan realistic. You will gain confidence by doing it!

When do I start?

It is wise to start evaluation planning while the ideas for your program are still being formulated. If you are submitting a proposal for funding, include your evaluation plan. As part of your considerations, imagine the program is operational and begin to brainstorm the questions you and other stakeholders will want answered. Consider the following as you begin planning your evaluation:
- Which questions can you ask to get the feedback you need?
- Who do you need to ask to be sure you have heard from the right people?
- What methods can you use to get answers to your questions?

"I thought I was good to go, but I needed help with the focus groups and the analysis."

What are the basic steps?

To guide your evaluation plan, use the Evaluation Planning Steps that follow. You may need to learn other skills or get help from colleagues with relevant experience (e.g., running a focus group, designing a questionnaire, or analysing and presenting information in a way that gets the maximum benefit for your work).

"I asked around and found great tools to work with in the development of the evaluation plan and collection of data."

Evaluation Planning Steps

Step 1: Involve the right people.
Step 2: Assess your resources.
Step 3: Describe your program for evaluation purposes.
Step 4: Undertake a systems or contextual analysis.
Step 5: Define the evaluation questions.
Step 6: Determine the evaluation measures and design.
Step 7: Develop a plan to use the results.
Step 8: Reevaluate your evaluation plan.

Step 1: Involve the right people

One of the most important steps is to involve other people. The people you include may be current or potential clients, colleagues, supervisors, community partners, opinion leaders, or people with specific expertise. By including others, the evaluation plan will be more relevant, everyone will learn more, the process will be more enjoyable, and the results will be more useful in assessing your program. Each person you involve will bring different experiences with evaluation and different ideas about the most important aspects to evaluate. As a group, allow time to develop a plan for decision-making. Make this early process as participatory as possible, and include people who will have access to the program or who will benefit from it.

Step 2: Assess your resources

Your evaluation plan needs to be realistic and in proportion to the nature and scope of your program. To achieve this, it is helpful to set aside a percentage of your budget for evaluation and plan accordingly. Also take into account other in-kind resources and expertise that may be available from colleagues, your organization, or community partners. As your evaluation plan evolves, develop a detailed evaluation budget, just as you will have done for the program itself (a small illustrative sketch follows this list). Your budget may include:
- Gifts for focus group participants.
- Cost of telephone calls.
- Photocopying or printing of surveys, handouts, or recruitment posters.
- Consulting an evaluation expert.
- Data collection or analysis software.
- Remuneration for someone to complete data collection or data analysis.
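As a rough planning aid, the budget lines above can be tallied against the share of the program budget reserved for evaluation. The following Python sketch is purely illustrative: the dollar figures, the 10% share, and the line-item amounts are hypothetical assumptions, not recommendations from this fact sheet.

```python
# Illustrative evaluation-budget check; all figures are hypothetical.
program_budget = 40_000      # total program budget (assumed)
evaluation_share = 0.10      # assumed share reserved for evaluation

evaluation_budget = {
    "gifts for focus group participants": 400,
    "telephone calls": 150,
    "photocopying and printing": 300,
    "evaluation consultant": 1_500,
    "data collection/analysis software": 600,
    "data collection and analysis help": 1_200,
}

total = sum(evaluation_budget.values())
ceiling = program_budget * evaluation_share
print(f"Planned evaluation spending: ${total:,} of ${ceiling:,.0f} available")
if total > ceiling:
    print("Over budget: trim line items or revisit the allocation (see Step 8).")
```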

Step 3: Describe your program for evaluation purposes

It is most common to organize your evaluation around the intended objectives of the program. Developing a program logic model is a great way to clarify the logic that links the process objectives to the outcome objectives. In other words, the logic model links what your program will do, or is intending to do, with the intended results in the short, intermediate, and longer term. For this reason, the logic model is also a great tool to include in your program proposal; it then becomes a good stepping stone to your evaluation plan. There are many online resources and articles about logic models, some of which appear in the Resources and References section at the end of this fact sheet; pick the format that works for you. A paper by Rush and Ogborne (1991) may be helpful, as well as the webinar Rush conducted for the Best Start Resource Centre breastfeeding community projects (see the Webcast entry under Resources and References).

Following the approach suggested by Rush (1991), a practical tip for developing a program logic model is to use small, repositionable sticky notes. Give each major component of your program its own sticky note and label it (e.g., Volunteer Recruitment and Training, Peer Support, Breastfeeding Café). Lay these sticky notes out horizontally. Next, put a second row of sticky notes underneath the first and list what the program is doing, or will do, in each component. For example, under Peer Support you might include "match peer support volunteer and the mother" or "provide one-on-one telephone support." These are your process objectives; they state what should be undertaken in each component of your program, and they are action oriented.

Under the row of process objectives, put a third row of sticky notes for short-term outcomes, a fourth row for intermediate outcomes, and a final row for long-term outcomes. For example, under Peer Support the short-term outcomes may be increased knowledge in handling practical challenges, increased confidence in breastfeeding, or increased knowledge of community supports. An intermediate outcome might be extended duration of breastfeeding. A long-term goal might be a noticeable trend of extended breastfeeding duration within the community. Notice these are all change oriented.
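If it helps to move from sticky notes to something shareable, the same rows can be captured in a simple data structure. The Python sketch below reuses the Peer Support example above; the structure and field names are illustrative assumptions rather than part of any prescribed format.

```python
# A logic model captured as nested data: one entry per program component,
# with rows mirroring the sticky notes (process objectives, then outcomes).
logic_model = {
    "Peer Support": {
        "process objectives": [
            "Match peer support volunteer and the mother",
            "Provide one-on-one telephone support",
        ],
        "short-term outcomes": [
            "Increased knowledge in handling practical challenges",
            "Increased confidence in breastfeeding",
            "Increased knowledge of community supports",
        ],
        "intermediate outcomes": ["Extended duration of breastfeeding"],
        "long-term outcomes": [
            "Noticeable trend of extended breastfeeding duration in the community"
        ],
    },
    # Add further components (Volunteer Recruitment and Training,
    # Breastfeeding Café, ...) as additional top-level entries.
}

# Print the model top to bottom, one component at a time.
for component, rows in logic_model.items():
    print(component)
    for row_name, items in rows.items():
        print(f"  {row_name}:")
        for item in items:
            print(f"    - {item}")
```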

Other types of programs might have more complicated logic models. One purpose of the logic model is to test the logic, or underlying rationale, of your program. If the underlying logic doesn't make sense, put the evaluation on hold and rethink the design of the program itself. The following are examples of questions you may uncover when developing your logic model:
- Will physicians allocate time to training sessions?
- Do high-priority mothers within some populations have regular Internet access?

Two more tips on logic models:
1. Create your model in a group process to get a diversity of ideas.
2. Be sure to include your intermediate and long-term objective(s) even if your project funding is short-term. For example, your intentions may include changes in awareness, attitudes, breastfeeding behaviour, breastfeeding rates, healthier children, etc.

At this stage, it is important to use the logic model to clarify your thinking and to communicate the overall design and expectations for your program.

Step 4: Undertake a systems or contextual analysis

A logic model, by its very nature, does not consider outside variables. However, your program exists, or will exist, in a community and organizational context, and your evaluation planning should consider the context in which your program will operate. For example: Do you have sufficient resources for the program itself? Are key opinion leaders within the organization or community aware and supportive? Will your program duplicate other services? Has something similar been tried before with limited success? If needed, will you be able to recruit the right staff to manage the program? These types of questions can be answered through discussions with colleagues and community partners, interviews with key opinion leaders, or a more formal community needs assessment.

Step 5: Define the evaluation questions

When you travel to an unfamiliar place, it is helpful to have a roadmap in hand and some preliminary ideas about the important places to visit given your available time and budget. In evaluation, this is the purpose of your evaluation questions. It is a common mistake for an evaluation to be too ambitious at the outset. If evaluation priorities are not established and documented from the beginning, you and your collaborators can easily be overwhelmed with more information than is necessary. Framing evaluation questions can be challenging because it requires those involved in project planning and implementation to think past the setup of the project and to focus on results. In evaluation, you ask questions about how something went in the past; when planning the program and its evaluation at the same time, you are imagining the future when the program is operational and looking back.

The following are examples of well-framed evaluation questions:
- Did we reach the people we intended with the community breastfeeding café? If not, why not?
- Were our peer support volunteers able to resolve breastfeeding challenges? Did their support increase the confidence of breastfeeding mothers? Did their support increase the duration of breastfeeding for some mothers?
- Did we get the expected participation from health service providers in our training program? If not, why not?
- Did we effect changes in the breastfeeding rates in our community?

Step 6: Determine the evaluation measures and design

There is a close connection between the question(s) to be answered and the evaluation design that will provide the most useful answers, including the indicators and measures to be tracked. The following examples show how evaluation measures correspond to the evaluation design.

Evaluation measures: Questions about reach and operations
- Is the breastfeeding café engaging mothers from our target population (e.g., new Canadians)?
- Are participants' expectations and expressed needs being met?
Evaluation design: Process Evaluation
- Use registration forms to track demographics.
- Use interviews or focus groups to identify challenges accessing the program.
- Use an anonymous questionnaire to assess participants' satisfaction.

Evaluation measures: Questions about results achieved
- Did participation in the training workshops increase health care providers' consistent messaging and support for breastfeeding?
- Was there an increase in the rate of breastfeeding among new mothers following health care provider training?
Evaluation design: Outcome Evaluation
- Use a questionnaire pre- and post-workshop to assess participants' attitudes and practices with respect to breastfeeding. Supplement these with semi-structured interviews.
- Conduct retrospective random chart reviews of new mothers involved in the program to ensure their breastfeeding practices were documented.

The following table, which uses a peer-support program as an example, can help organize your thinking as well as identify and prioritize options. It is best used as a brainstorming tool while working your way through the logic model. Once you have your evaluation plan, go back and prioritize what is most important for future decisions and what is most feasible to accomplish with your allocated resources.

Evaluation measures and design, using a peer-support program as an example

Measuring PROCESS effectiveness

Program component (from your logic model): Recruit peer support volunteers.
- Evaluation question: Has our recruitment process been successful?
- Outputs, measures, or indicators: Records for volunteers, including demographic information.
- Data collection strategies: Peer support volunteer application form; tracking of applications.

Program component: Train peer support volunteers.
- Evaluation question: Do volunteers feel adequately trained to handle common questions (e.g., latching, sore nipples, and breastfeeding after caesarean)?
- Outputs, measures, or indicators: Peer support volunteer documentation shows common questions were discussed.
- Data collection strategies: Peer support volunteer documentation forms.

Program component: Provide one-on-one telephone support.
- Evaluation question: Are volunteers actively providing one-on-one telephone support?
- Outputs, measures, or indicators: Number of peer volunteers matched with mothers; number of mothers being supported; number of calls completed and recorded between mothers and volunteers.
- Data collection strategies: Tracking of volunteers in the program and peer support matches made; tracking of mothers matched; peer support volunteer documentation forms.

Measuring OUTCOME effectiveness

Program objective (from your logic model): Increased knowledge in handling practical challenges.
- Evaluation question: Do participating mothers say that information needs were met?
- Outputs, measures, or indicators: Expressed satisfaction with program participation.
- Data collection strategies: Brief survey questionnaire on entry to and exit from the program.

Program objective: Increased confidence in breastfeeding.
- Evaluation question: Do participants report an increase in breastfeeding confidence?
- Outputs, measures, or indicators: Self-reported breastfeeding confidence.
- Data collection strategies: Brief survey questionnaire.

Program objective: Increased knowledge of community supports.
- Evaluation question: Do participants report an increase in knowledge of community supports?
- Outputs, measures, or indicators: Self-reported increase in knowledge of community supports.
- Data collection strategies: Brief survey questionnaire.

Program objective: Increased duration of breastfeeding.
- Evaluation question: Have participants continued breastfeeding beyond their expectations, previous breastfeeding experiences, or the community norm?
- Outputs, measures, or indicators: Self-reported increase in breastfeeding duration beyond expectations, previous breastfeeding experiences, or the community norm.
- Data collection strategies: A follow-up survey that asks about planned and actual breastfeeding duration.
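The countable process indicators in the example above (number of peer volunteers matched with mothers, number of mothers being supported, number of calls completed and recorded) reduce to simple tallies over the call documentation forms. Below is a minimal sketch of those tallies in Python, assuming a hypothetical call log with made-up column names; it requires the pandas library.

```python
# Computing the telephone-support process indicators from a call log.
# Column names and records are hypothetical.
import pandas as pd

# Each record: one completed, documented call between a volunteer and a mother.
call_log = pd.DataFrame(
    {
        "volunteer_id": ["V01", "V01", "V02", "V03", "V02"],
        "mother_id": ["M10", "M11", "M12", "M10", "M12"],
        "call_date": pd.to_datetime(
            ["2015-03-02", "2015-03-09", "2015-03-10", "2015-03-11", "2015-03-17"]
        ),
    }
)

indicators = {
    "peer volunteers matched with mothers": call_log["volunteer_id"].nunique(),
    "mothers being supported": call_log["mother_id"].nunique(),
    "calls completed and recorded": len(call_log),
}
for name, value in indicators.items():
    print(f"Number of {name}: {value}")
```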

Depending on the data collection strategies proposed, you may need to get specialized help or learn some additional skills. For example, semi-structured interviews or focus groups will require qualitative analysis. Specific methods are available for this, as well as software for analysing qualitative information. If there is too much information to sort through manually when identifying common themes, consider using such software; if not, you can organize similar information into themes by hand or with a word processing program. Similarly, a pre- and post-intervention design may require statistical analysis to see whether the differences observed are greater than what might be attributable to chance. Analysis and presentation of evaluation data can often be handled with a spreadsheet program, creating frequency counts (e.g., confidence level by age of the mothers) and presenting them as bar charts or other graphs (a short, hypothetical sketch of these analyses follows the lessons learned below).

Step 7: Develop a plan to use the results

As you develop your evaluation priorities and plans, it is important to prepare a written evaluation plan that can be referred to and updated from time to time. The plan is also useful for documenting why you chose to measure some things and not others, which can be extremely helpful if there is turnover among key stakeholders, partners, or those responsible for the evaluation.

Step 8: Reevaluate your evaluation plan

Before finalizing your plans and beginning data gathering, it is wise to revisit the budget and ensure that the time and resources are available to complete the work successfully. You may need to reevaluate your evaluation plan more than once as your program unfolds. If, during your reevaluation, you recognize problems with your evaluation plan, return to and repeat previously completed steps. Repeat this process until you are ready to proceed.

Common lessons learned from others

The following common lessons learned may provide some additional guidance:

"I wish I had prepared my evaluation plans and information collection tools to be ready for the launch. Once I got going, it was hard to develop my evaluation plan when the program was already running."

"I reviewed the demographics of the participants weekly, and also where they heard about the peer support and café. Knowing this helped us adjust our promotional efforts."

"It can be difficult to get real outcomes in a short project. We supplemented our resources to do a longer tracking of actual behavioural change."

"I didn't get a good follow-up rate to my online questionnaire, so I found a volunteer student to make some follow-up calls."

"Our first priority was reach. If we don't have good participation, then the whole program is at risk. We focused our efforts on tracking use of the online resource and the success of promotional efforts."

"It's best to rely on interventions that have a proven track record from research, with documented outcomes. This can save time when creating your own outcome evaluation. It is helpful to consider the uniqueness of your community context."
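Here is the sketch promised under Step 6: a spreadsheet-style frequency count (confidence level by age of the mothers) and a pre/post comparison checked against chance. The data, column names, and the choice of a paired t-test are all illustrative assumptions, not a prescribed method; the sketch requires the pandas and scipy libraries.

```python
# Illustrative analyses for Step 6, using made-up survey data.
import pandas as pd
from scipy import stats

survey = pd.DataFrame(
    {
        "age_group": ["<25", "25-34", "25-34", "35+", "<25", "35+"],
        "confidence": ["low", "high", "medium", "high", "medium", "high"],
        "pre_score": [2, 3, 2, 4, 1, 3],   # breastfeeding confidence on entry (1-5)
        "post_score": [3, 4, 4, 5, 3, 4],  # breastfeeding confidence on exit (1-5)
    }
)

# Frequency counts: confidence level by age group (a spreadsheet-style table).
print(pd.crosstab(survey["confidence"], survey["age_group"]))

# Paired t-test: is the pre/post change larger than chance alone would suggest?
t_stat, p_value = stats.ttest_rel(survey["post_score"], survey["pre_score"])
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

With a real data set, the same two calls scale unchanged; the p-value indicates how likely a pre/post difference this large would be if chance alone were at work.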

Resources and References

Practical Tools: Canada

Quality Improvement and Innovation Partnership's (QIIP) Evaluation, Needs Assessment and Performance Measures Working Group. (2010, January). Logic model resource guide. Mississauga, ON: Author. Retrieved from www.hqontario.ca/portals/0/documents/qi/qi-rg-logic-model-1012-en.pdf

Treasury Board of Canada Secretariat. (2010). Supporting effective evaluations: A guide to developing performance measurement strategies. Retrieved from www.tbs-sct.gc.ca/cee/dpms-esmr/dpms-esmr05-eng.asp

Treasury Board of Canada Secretariat. (2012). Theory-based approaches to evaluation: Concepts and practices. Retrieved from www.tbs-sct.gc.ca/cee/tbae-aeat/tbae-aeat-eng.pdf

Weaver, L., Born, P., & Whaley, D. L. (2010). Approaches to measuring community change indicators. Waterloo, ON: Tamarack. Retrieved from tamarackcommunity.ca/downloads/vc/measuring_community_change.pdf

Practical Tools: US

Preskill, H., Gopal, S., Mack, K., & Cook, J. (2014). Evaluating complexity: Propositions for improving practice. Boston, MA: FSG. Retrieved from www.fsg.org/publications/evaluating-complexity#download-area

Preskill, H., Parkhurst, M., & Splansky Juster, J. (2013). Building a strategic learning and evaluation system for your organization. Boston, MA: FSG. Retrieved from www.fsg.org/publications/building-strategic-learning-and-evaluation-system-your-organization#download-area

Preskill, H., Parkhurst, M., & Splansky Juster, J. (n.d.). Guide to evaluating collective impact. Boston, MA: FSG. Retrieved from www.fsg.org/publications/guide-evaluating-collective-impact#download-area

Organizations

Best Start Resource Centre: www.beststart.org
HC Link: www.hclinkontario.ca
Public Health Ontario, Health Promotion Capacity Building Services: www.publichealthontario.ca; www.publichealthontario.ca/en/servicesandtools/healthpromotionservices/pages/default.aspx

Articles

Kaplan, S. A., & Garrett, K. E. (2005). The use of logic models by community-based initiatives. Evaluation and Program Planning, 28(2), 167-172. Retrieved from www.uic.edu/sph/prepare/courses/chsc433/kaplan.pdf

Mayne, J., & Rist, R. C. (2006). Studies are not enough: The necessary transformation of evaluation. The Canadian Journal of Program Evaluation, 21(3), 93-120. Retrieved from www.evaluationcanada.ca/secure/21-3-093.pdf

Porteous, N. L., Sheldrick, B. J., & Stewart, P. J. (2002). Introducing program teams to logic models: Facilitating the learning process. The Canadian Journal of Program Evaluation, 17(3), 113-141. Retrieved from www.research.familymed.ubc.ca/files/2012/03/faciliter_modele_logiques_cjpe-2002_f.pdf

Renger, R., & Renger, J. (2007). Research and practice note: What an eight-year-old can teach us about logic modelling and mainstreaming. The Canadian Journal of Program Evaluation, 22(1), 195-204. Retrieved from www.evaluationcanada.ca/secure/22-1-195.pdf

Rush, B. R., & Ogborne, A. C. (1991). Program logic models: Expanding their role and structure. The Canadian Journal of Program Evaluation, 6(2), 93-105.

Books

McLaughlin, J. A., & Jordan, G. B. (2004). Using logic models. In J. S. Wholey, H. P. Hatry, & K. E. Newcomer (Eds.), Handbook of practical program evaluation (pp. 7-30). San Francisco, CA: Jossey-Bass.

Wyatt Knowlton, L., & Phillips, C. C. (2013). The logic model guidebook: Better strategies for great results (2nd ed.). Thousand Oaks, CA: Sage.

Webcast

Rush, B. R. (2014). Populations with lower rates of breastfeeding: Using a logic model to design an effective evaluation strategy. Available at http://healthnexus.adobeconnect.com/p1s7kwifhg6/

This document has been prepared with funds provided by the Government of Ontario. The information herein reflects the views of the authors and is not officially endorsed by the Government of Ontario. The resources and programs cited throughout this resource are not necessarily endorsed by the Best Start Resource Centre. 2015.