Developing a Monitoring and Evaluation Work Plan
CORE MODULE 3: Developing a Monitoring and Evaluation Work Plan

Monitoring HIV/AIDS Programs: A Facilitator's Training Guide
A USAID Resource for Prevention, Care and Treatment
In July 2011, FHI became FHI 360. FHI 360 is a nonprofit human development organization dedicated to improving lives in lasting ways by advancing integrated, locally driven solutions. Our staff includes experts in health, education, nutrition, environment, economic development, civil society, gender, youth, research, and technology, creating a unique mix of capabilities to address today's interrelated development challenges. FHI 360 serves more than 60 countries, all 50 U.S. states, and all U.S. territories. Visit us at
Monitoring HIV/AIDS Programs: A Facilitator's Training Guide
A USAID Resource for Prevention, Care and Treatment

Core Module 3: Developing a Monitoring and Evaluation Work Plan

September 2004
Family Health International
© 2004 Family Health International (FHI). All rights reserved. This book may be freely reviewed, quoted, reproduced, or translated, in full or in part, provided the source is acknowledged. This publication was funded by USAID's Implementing AIDS Prevention and Care (IMPACT) Project, which is managed by FHI under Cooperative Agreement HRN-A
Table of Contents

Session Overview and Schedule
Welcome and Review
Overview of Monitoring and Evaluation Work Plans
Overview of Monitoring and Evaluation Work Plans (Continued)
Creating a Monitoring and Evaluation Work Plan
Presentation of Monitoring and Evaluation Work Plans
Disseminating and Using Monitoring and Evaluation Results
Wrap-Up
Appendix
CORE MODULE 3: Developing a Monitoring and Evaluation Work Plan

This Monitoring and Evaluation series is based on the assumption that Core Module 1 (Introduction to Monitoring and Evaluation) is always the first module, that it is followed directly by Core Module 2 (Collecting, Analyzing, and Using Monitoring Data), which is followed by one or more of the optional technical area modules (Modules 4 through 10), and that in all cases the final module is Core Module 3 (Developing a Monitoring and Evaluation Work Plan). The specified sequence is shown below:

1. Core Module 1: Introduction to Monitoring and Evaluation
2. Core Module 2: Collecting, Analyzing, and Using Monitoring Data
3. Optional Technical Area Modules 4 through 10
4. Core Module 3: Developing a Monitoring and Evaluation Work Plan

Learning Objectives

The goal of the workshop is to increase participants' capacity to develop and implement comprehensive monitoring and evaluation work plans for country/site-specific HIV/AIDS/STI prevention, care, and impact mitigation programs. At the end of this session, participants will be able to:

- Understand the rationale, key elements, and steps required to develop a Monitoring and Evaluation Work Plan
- Apply program goals and objectives in developing a Monitoring and Evaluation Work Plan
- Develop program monitoring and evaluation questions and indicators and review the issues related to program evaluation, including selection of data collection methodologies
- Review Monitoring and Evaluation Work Plan implementation issues: Who will carry out the work? How will existing data and past evaluation studies be used?
- Identify internal and external evaluation resources and capacity required for implementation of a Monitoring and Evaluation Work Plan
- Develop and review a Monitoring and Evaluation Work Plan matrix and timeline
- Understand monitoring and evaluation data flow
- Develop and/or review and implement a monitoring and evaluation work plan for a country/site program, taking into consideration donor, FHI, and country/site (government) requirements
- Apply the M&E Work Plan Template in developing an individual country/site/program activity work plan (semi-annual plans, annual plans)

Session Overview and Schedule

TIME | TOPIC | TRAINING METHOD
8:30-8:45 (15 min) | A. Welcome and Review | Facilitator Presentation
8:45-10:00 (75 min) | B. Overview of Monitoring and Evaluation Work Plans | Facilitator Presentation, Discussion, Small Group Activity
10:00-10:15 (15 min) | BREAK
10:15-10:30 (15 min) | B. Overview of Monitoring and Evaluation Work Plans (cont'd) | Facilitator Presentation
Session Overview and Schedule (continued)

TIME | TOPIC | TRAINING METHOD
10:30-12:00 (90 min) | C. Creating a Monitoring and Evaluation Work Plan | Small Group Activity
12:00-1:00 (60 min) | LUNCH
1:00-2:30 (90 min) | C. Creating a Monitoring and Evaluation Work Plan (cont'd) | Small Group Activity
2:30-2:45 (15 min) | BREAK
2:45-3:15 (30 min) | C. Creating a Monitoring and Evaluation Work Plan (cont'd) | Small Group Activity
3:15-4:30 (75 min) | D. Presentation of Monitoring and Evaluation Work Plans | Small Group Activity, Discussion
4:30-4:50 (20 min) | E. Disseminating and Using Monitoring and Evaluation Results | Full Group Discussion
4:50-5:00 (10 min) | F. Wrap-Up | Full Group Activity

Materials

- Flipchart paper and stand
- Markers
- Pens or pencils
- Tape or Blue-Tac
- Evaluation form
- Half-sheets of colored paper, one for each participant
- Handout: Key Elements of M&E Work Plan
- Handout: Seven Steps to Developing an M&E Work Plan
- Overhead: Key Elements of M&E Work Plan
- Overhead: Seven Steps to Developing an M&E Work Plan
- Overhead: Assessing How Well the Evaluation Plan Works
- Handout: M&E Work Plan Template (distributed separately by facilitator)
- Facilitator Reference: Developing a Monitoring and Evaluation Work Plan
A. Welcome and Review

8:30-8:45 (15 min) | A. Welcome and Review | Facilitator Presentation

Facilitator Note: Before participants arrive, place small signs with country names on them at each table/table cluster. As participants enter the classroom, welcome them and ask them to sit at the table identifying the country where their program is located. This seating arrangement will facilitate cross-fertilization of ideas among participants from countries with and without current M&E work plans.

8:30-8:40 (10 min)

1. Welcome Participants and Group Introductions

Thank participants for arriving on time and remind them (in a humorous way) that anyone who arrives late will be subject to shame and humiliation from the whole group.

Because this module (Core Module 3: Developing a Monitoring and Evaluation Work Plan) is the last module, delivered after Core Module 1 (Introduction to Monitoring and Evaluation), Core Module 2 (Collecting, Analyzing, and Using Monitoring Data), and any of the optional technical area modules (4 through 10), participants will already be familiar with each other. Therefore, each morning during this time, the facilitator can take about 15 minutes to review with participants the material they learned in the preceding modules. This provides an excellent opportunity to generate energy in the group by asking participants to question and quiz each other and to see who has the answer. This review activity can be light, energetic, and even humorous. Encourage participants to stand up or do something else physical as they ask or answer their questions.

8:40-8:45 (5 min)

2. Overview of Workshop Objectives and Agenda

The goal of this workshop is to build your skills in developing and implementing comprehensive monitoring and evaluation work plans for country/site-specific HIV/AIDS/STI prevention, care, and impact mitigation programs. At the end of this session, participants will be able to:

- Understand the rationale, key elements, and steps required to develop a Monitoring and Evaluation Work Plan
- Apply program goals and objectives in developing an M&E Work Plan
- Develop program M&E questions and indicators and review the issues related to program evaluation, including selecting data collection methodologies
- Review M&E Work Plan implementation issues: Who will carry out the work? How will existing data and past evaluation studies be used?
- Identify internal and external evaluation resources and capacity required for implementation of an M&E Work Plan
- Develop and review an M&E Work Plan matrix and timeline
- Understand M&E data flow
- Develop and/or review and implement an M&E Work Plan for a country/site program, taking into consideration donor, FHI, and country/site (government) requirements
- Apply the M&E Work Plan Template in developing an individual country/site/program activity work plan (semi-annual plans, annual plans)
There will be a 15-minute mid-morning break, lunch will be from 12:00 to 1:00, and there will be a 15-minute mid-afternoon break. We will finish the workshop by 5:00 p.m.

B. Overview of Monitoring and Evaluation Work Plans

8:45-10:00 (75 min) | B. Overview of Monitoring and Evaluation Work Plans | Facilitator Presentation, Discussion, Small Group Activity

Materials
- Handout/Overhead: Key Elements of Monitoring and Evaluation Work Plan
- Handout: The Seven Steps to Developing a Monitoring and Evaluation Work Plan

8:45-8:50 (5 min)

1. Overview of Monitoring and Evaluation Work Plans

Tell participants: You have learned about the critical elements of monitoring and evaluation in previous modules, and you have shared and learned knowledge and skills about the what, why, how, and when of monitoring and evaluation. Today, the final day of the training, we will practice developing a complete Monitoring and Evaluation Work Plan.

The Monitoring and Evaluation Work Plan is a flexible guide to the steps you can use to document project activities, answer evaluation questions, and show progress toward project goals and objectives. As a guide, the M&E Work Plan explains the goals and objectives of the overall plan as well as the evaluation questions, methodologies, implementation plan, matrix of expected results, proposed timeline, and M&E instruments for gathering data.

To ensure that M&E activities produce useful results, it is essential to incorporate M&E at the program design stage. Planning an intervention and designing an M&E strategy are inseparable activities. To ensure the relevance and sustainability of M&E activities, project designers must collaborate with stakeholders and donors to develop an integrated and comprehensive M&E plan.

Projects at all levels, whether single interventions or multiple integrated projects, should have an M&E plan in place to assess the project's progress toward achieving its goals and objectives and to inform key stakeholders and program designers about M&E results. Such plans will guide the design of monitoring and evaluation, highlight what information remains to be collected and how best to collect it, and suggest how to use the results to achieve greater effectiveness and efficiency.

Comprehensive M&E plans should describe the overall goals and objectives of the country program (i.e., they should be site-specific); the specific M&E questions, methods, and designs to be used; what data will be collected and how; the required resources; who will implement the various components of the M&E work plan; and the timeline of the M&E plan.

Monitoring and evaluation work plans are often written to cover a four- to five-year period because they may involve numerous M&E efforts on multiple interventions for different target populations. Some of these M&E activities require time to observe intervention or program outcomes (immediate or short-term effects) as well as overall program impact (long-term effects).
8:50-9:05 (15 min)

2. Key Elements of a Monitoring and Evaluation Work Plan

The facilitator should review the Handout: Key Elements of Monitoring and Evaluation Work Plan with participants.

As we learned earlier, it is important to involve program planners, evaluators, donors, and other stakeholders (e.g., the National AIDS and STD Control Program, the National AIDS Commission, and other multisectoral program partners) throughout the development and design of the M&E work plan. Stakeholder involvement in the early phases helps ensure that results obtained from an M&E effort will be used in an ongoing manner. Involving members of the target community also helps inform the process.

3. Seven Steps to Developing a Monitoring and Evaluation Work Plan

The facilitator should review the Handout: The Seven Steps to Developing a Monitoring and Evaluation Work Plan with participants.

9:05-9:20 (15 min)

4. Opportunities and Barriers for M&E Work Plan Development and Implementation

You will often find opportunities to develop and implement monitoring and evaluation plans for your programs, projects, and interventions. You will also encounter barriers to conducting monitoring and evaluation efforts. The idea is, as a team, to identify opportunities and barriers in the planning stage with a view toward problem-solving and maximizing opportunities. You can design some of these solutions into your work plan and remain flexible so that you can make adjustments within the context of your work plan to account for issues that arise during the M&E process.

Group Activity

Divide the participants into four groups of about five people each. Then ask:
- Group 1 to list opportunities for developing a Monitoring and Evaluation Work Plan
- Group 2 to list barriers to developing a Monitoring and Evaluation Work Plan
- Group 3 to list opportunities for implementing a Monitoring and Evaluation Work Plan
- Group 4 to list barriers to implementing a Monitoring and Evaluation Work Plan

Ask each group to create these lists in the context of faith-based interventions.

Facilitator Note: When doing this activity, the groups often lean toward more generic opportunities and barriers; encourage them to come up with responses specific to faith-based interventions.
9:20-9:40 (approx. 20-25 min)

Give the groups 10 minutes to generate their lists.

Facilitator Note: You can help the groups create and prepare their lists in a timely manner by walking around, visiting the different groups, and asking how they're doing ("Have you decided on the barrier? Have you begun your list?"). If a group is falling behind the others, encourage them to move on to the next step, reminding them that this is an example of what they might include and that an exhaustive or extensive list is not necessary.

Give the groups another five minutes to select one barrier, propose a way to overcome it, and prepare to act out the barrier and solution they identified for the rest of the class. Invite them to do so by singing a song, writing a poem, engaging in role-playing, or using any other creative approach.

Facilitator Note: When it is time for the groups to decide how they will present, ask them if they would like to use markers to create signage or if there is anything else they need. Also encourage creativity, letting them know that a song, skit, TV commercial, or anything else they think of is entirely welcome. Keep the groups moving forward from concept to rehearsal or materials development so they are ready in time.

9:40-10:00 (approx. 15-20 min)

Convene the full group and give each group five minutes to demonstrate the barrier and the solution they identified. After each group's presentation, encourage the full class to note what they saw and learned.

10:00-10:15 (15 min) | BREAK

B. Overview of Monitoring and Evaluation Work Plans (cont'd)

10:15-10:30 (15 min) | B. Overview of Monitoring and Evaluation Work Plans (cont'd) | Facilitator Presentation

5. Assessing How Well the Monitoring and Evaluation Plan Works

Planning an HIV/AIDS/STI Monitoring and Evaluation Work Plan is a dynamic process because the key issues, including indicators for M&E programs, evolve over time. The contextual basis of programming also changes. Therefore, it is important to periodically assess how well the Monitoring and Evaluation Work Plan is working, with a view toward making changes as appropriate.

Ask participants to note changes that have taken place within and outside the parameters of their programs and how these changes are affecting their M&E plans. Example: Another donor is funding a new international NGO to work within the same sites and with the same target groups in a district in Malawi. How will this affect the existing M&E plan?

Take comments from the participants and sum up using the overhead Assessing How Well the Evaluation Plan Works.
Key questions to ask to determine whether an M&E plan is working include the following:

- Are the M&E activities progressing as planned?
- Are M&E questions being answered sufficiently? Are other data needed to answer these questions? How can such data be obtained? Should the M&E questions be re-framed?
- Have other M&E questions arisen that should be incorporated into the plan?
- Are there any methodological or evaluation design issues that need to be addressed?
- Are there any practical or political factors that need to be considered?
- Are any changes in the M&E plan needed at this time? How will these changes be made? Who will implement them?
- Are appropriate staff and funding still available to complete the evaluation plan?
- How are findings from M&E activities being used and disseminated? Should anything be done to enhance their application to programs?

C. Creating a Monitoring and Evaluation Work Plan

10:30-12:00 (90 min) | C. Creating a Monitoring and Evaluation Work Plan | Small Group Activity

Materials
- Workbook for M&E Work Plan Development (distributed separately by facilitator)
- M&E Work Plan Template (distributed separately by facilitator)

Small Group Activity

The participants should be sitting at (or should return to) the tables/table clusters where they sat earlier in the morning. Each group's task for the rest of the training period is to develop a Monitoring and Evaluation Work Plan for a group member's program. Groups should follow the steps outlined in the previous sections and produce a comprehensive work plan that covers all aspects discussed today.

Remind participants that key stakeholders would normally be present when they carry out this planning activity. Today they have a chance to go through the process by themselves. When they return home, they should take others through this process and incorporate their ideas into the work plan they started this week.

12:00-1:00 (60 min) | LUNCH

C. Creating a Monitoring and Evaluation Work Plan (cont'd)

1:00-2:30 (90 min) | C. Creating a Monitoring and Evaluation Work Plan (cont'd) | Small Group Activity

2:30-2:45 (15 min) | BREAK
C. Creating a Monitoring and Evaluation Work Plan (cont'd)

2:45-3:15 (30 min) | C. Creating a Monitoring and Evaluation Work Plan (cont'd) | Small Group Activity

D. Presentation of Monitoring and Evaluation Work Plans

3:15-4:30 (75 min) | D. Presentation of Monitoring and Evaluation Work Plans | Small Group Activity, Discussion

Ask each group to present its work to the whole class, using 5-10 minutes each. At the end of each presentation, ask the presenting group to explain what they really like about their plan, and then ask the full group what they also like about the plan. Also ask the presenting group what they do not like about it, or ask for questions, and invite ideas from the entire group for possible solutions and recommendations.

E. Disseminating and Using Monitoring and Evaluation Results

4:30-4:50 (20 min) | E. Disseminating and Using Monitoring and Evaluation Results | Full Group Discussion

4:30-4:40 (10 min)

Facilitate a brief discussion on the dissemination and use of M&E results using the following key questions:
- What information should be distributed?
- Who needs the information?
- How does the information get distributed?

Summarize by saying something like: Disseminating evaluation findings supports the usefulness of the evaluation and of future activities, and it reduces the duplication of evaluation efforts that can occur when others are not aware of the findings of previous evaluations. Furthermore, disseminating results enables us to show others that monitoring and evaluation is indeed a tool for improving programs. For these reasons, a plan for disseminating the evaluation findings is a crucial part of your work plan.

4:40-4:50 (10 min)

Ask participants to list the technical assistance needs required to develop, review, and/or implement country/site/project-specific work plans. Give participants sheets of paper to record these needs in order of priority. Technical assistance needs for implementing existing work plans will also be discussed. This information will be part of the recommendations for action/next steps for facilitators.
F. Wrap-Up

4:50-5:00 (10 min) | F. Wrap-Up | Full Group Activity

Materials
- Half-sheets of colored paper
- Markers
- Masking tape or Blue-Tac
- Evaluation Form

4:50-4:55 (5 min)

Distribute the Workshop Evaluation Form to participants and ask them to take five minutes to fill it out before you lead them in a closing activity.

4:55-5:00 (5 min)

Distribute a half-sheet of colored paper to each individual, and say something like the following: This has been an amazing week, in which we have learned and shared so much with each other. I know the other facilitators and I have learned much from you, and I hope that with our workshop we were able to teach you more and support you in your good work. We are all here to do the same thing: to reduce prevalence, alleviate pain and suffering, and mitigate the impact of HIV/AIDS in our communities, our countries, and our world. Each of you came into this training with skills, passion, and knowledge, and each of you is full of life experience and more knowledge to share. Please think of one word that summarizes what you feel you contribute to the fight against HIV/AIDS. Write this one word on your paper.

Then, as you walk toward each participant and indicate him or her, have them stand, say their one word, and tape it to the wall. After everyone has taped their word to the wall, give a closing statement, such as: We are honored and privileged to be working side by side with such passionate, committed, articulate, and professional colleagues as you. Thank you.
Appendix

Developing a Monitoring and Evaluation Work Plan (Facilitator Reference)
Key Elements of a Monitoring and Evaluation Work Plan (Handout)
Seven Steps to Developing a Monitoring and Evaluation Work Plan (Handout)
Key Elements of an M&E Work Plan (Overhead)
Seven Steps to Developing an M&E Work Plan (Overhead)
Assessing How Well the Evaluation Plan Works (Overhead)
M&E Work Plan Template (Handout)
Facilitator Reference

Understand the rationale, key elements, and steps required in developing a Monitoring and Evaluation Work Plan
The Monitoring and Evaluation Work Plan is a flexible guide to the steps used to document project activities, answer evaluation questions, and show progress toward project goals and objectives.

Apply program goals and objectives in developing a Monitoring and Evaluation Work Plan
The first step in developing a Monitoring and Evaluation Work Plan requires writing a clear statement that identifies program goals and objectives and describes how the program expects to achieve them.

Develop and select (as appropriate) program monitoring and evaluation questions and indicators, and review the issues related to program evaluation, including selection of data collection methodologies
Evaluation questions should be determined based on input from all stakeholders, including the program managers, donors, and members of the target population.

Determine monitoring and evaluation methods, including identification of outcome indicators, data sources, and plans for data analysis
The design methodology should include monitoring and evaluation methods (as appropriate), data collection methods, an analysis plan, and an overall timeline for the comprehensive plan.
Facilitator Reference (continued)

Review Monitoring and Evaluation Work Plan implementation issues: Who will carry out the work? How will existing data and past evaluation studies be used?
Roles and responsibilities for each component of the work plan should be clearly detailed.

Identify internal and external monitoring and evaluation resources and capacity required for implementation of a monitoring and evaluation plan
Identifying evaluation resources means identifying not just the funds for the evaluation, but also experienced personnel who can assist in planning and conducting the evaluation activities.

Develop and review a Monitoring and Evaluation Work Plan matrix and timeline
The matrix provides a format for presenting the inputs, outputs, outcomes, and impacts and their corresponding activities for each program activity.

Understand monitoring and evaluation data flow

Develop and/or review and implement a monitoring and evaluation work plan for a country/site program, taking into consideration donor and country/site (government) requirements
The Monitoring and Evaluation Work Plan guides the process that will document project activities, answer evaluation questions, and identify progress toward goals and objectives. This guide will contain the objectives, evaluation questions, methodologies, implementation plan, matrix of expected results, proposed timeline, and data collection instruments.
Key Elements of a Monitoring and Evaluation Work Plan

The scope of the monitoring and evaluation
Specifying program goals and developing a conceptual framework that integrates the inputs, activities, outputs, outcomes, and impact and establishes realistic expectations for what monitoring and evaluation can produce.

The methodological approach
Determining monitoring and evaluation methods, including identification of outcome indicators, data sources, and plans for data analysis.

The implementation plan
Delineating activities, roles, responsibilities, and a timetable for identified activities, with realistic expectations of when data will be analyzed and results will be available.

A plan for disseminating and using the results
Determining who will translate the results into terms understandable to program designers, managers, and decision-makers; how findings will be shared and used (e.g., written papers, oral presentations, program materials, community and stakeholder feedback sessions); and the implications for future monitoring and evaluation.
Seven Steps to Developing a Monitoring and Evaluation Work Plan

If these elements have not already been developed, the following steps may help in creating a monitoring and evaluation plan.

1. Identify Program Goals and Objectives

The first step requires writing a clear statement that identifies country (site) program goals and objectives (and sometimes sub-objectives) and describes how the program expects to achieve them. A program logical model or results framework can then be easily diagrammed to establish a monitoring and evaluation plan.

The country evaluation matrix in the appendix illustrates a results framework (with sample goals, objectives, activities, indicators, sources of data and methods, periodicity, and persons responsible) for gathering monitoring and evaluation data at the country level. This framework illustrates how the role of national governments in monitoring and planning HIV prevention and care activities complements the strengths of individual projects at the local level. For example, individual projects do not often conduct impact evaluations because their results are hard to separate from those of other projects working toward the same goals. Impact evaluation, most appropriately measured in large geographic areas, examines whether the collective efforts of numerous projects are producing the desired effect. These impacts can be measured through sero-surveillance systems (which monitor trends in HIV and STI prevalence) and through repeated behavioral risk surveys. Local organizations in direct contact with target groups should evaluate the program's implementation, rather than its outcome or impact. This demands greater concentration on quality inputs, such as training and the pre-testing of communication messages.

This framework also illustrates the time required to show progress at various levels, ranging from several months for process-level accomplishments (the training of staff) to several years for outcome- and impact-level goals.

2. Determine Monitoring and Evaluation Questions, Indicators, and Their Feasibility

In this step, monitoring and evaluation specialists and program managers identify the most important evaluation questions, which should link directly to the stated goals and objectives. Questions should come from all stakeholders, including the program managers, donors, and members of the target populations. The questions should address each group's concerns, focusing on these areas: What do we want to know at the end of this program? What do we expect to change by the end of this program?

Framing and prioritizing monitoring and evaluation questions is sometimes difficult, especially when resources, time, and expertise are limited and multiple stakeholders are present. Monitoring and evaluation questions may require revision later in the plan development process.
3. Determine Monitoring and Evaluation Methodology: Monitoring the Process and Evaluating the Effects

This step should include the monitoring and evaluation methods, data collection methods and tools, analysis plan, and an overall timeline. It is crucial to clearly spell out how data will be collected to answer the monitoring and evaluation questions. The planning team determines the appropriate monitoring and evaluation methods, outcome measures or indicators, information needs, and the methods by which the data will be gathered and analyzed. A plan must be developed to collect and process data and to maintain an accessible data system. The plan should address the following issues:

- What information needs to be monitored?
- How will the information be collected? How will it be recorded? How will it be reported to the central office? What tools (forms) will be needed?
- For issues that require more sophisticated data collection, what study design will be used? Will the data be qualitative, quantitative, or a combination of the two?
- Which outcomes will be measured?
- How will the data be analyzed and disseminated?

4. Resolve Implementation Issues: Who Will Conduct Monitoring and Evaluation? How Will Existing Monitoring and Evaluation Results and Past Findings Be Used?

Once the data collection methods are established, it is important to clearly state who will be responsible for each activity. Program managers must decide: How are we really going to implement this plan? Who will report the process data, and who will collect and analyze it? Who will oversee any quantitative data collection, and who will be responsible for its analysis?

Clearly, the success of the plan depends on the technical capacity of program staff to carry out monitoring and evaluation activities. This invariably requires technical assistance. Monitoring and evaluation specialists might be found in the planning and evaluation units of the National AIDS Commission, the National AIDS and STD Control Program, the Census Bureau, the Office of Statistics, multisectoral government ministries including the Ministry of Health, academic institutions, non-governmental organizations, and private consulting firms.

It is important to identify existing data sources and other monitoring and evaluation activities, whether they have been done in the past, are ongoing, or are sponsored by other donors. At this step, country office monitoring and evaluation specialists should determine whether other groups are planning similar evaluations and, if so, invite them to collaborate.

5. Identify Internal and External Monitoring and Evaluation Resources and Capacity

Identifying monitoring and evaluation resources means identifying not just the funds for monitoring and evaluation, but also experienced personnel who can assist in planning and conducting monitoring and evaluation activities. It also means determining the program's capacity to manage and link various databases and computer systems.
6. Develop the Monitoring and Evaluation Work Plan Matrix and Timeline

The matrix provides a format for presenting the inputs, outputs, outcomes, and impacts and their corresponding activities for each program objective. It summarizes the overall monitoring and evaluation plan by including a list of methods to be used in collecting the data. (An example of a matrix is presented in the appendix.) The timeline shows when each activity in the Monitoring and Evaluation Work Plan will take place.

7. Develop a Plan to Disseminate and Use Evaluation Findings

The last step is planning how monitoring and evaluation results will be used, translated into program policy language, disseminated to relevant stakeholders and decision-makers, and applied to ongoing program refinement. This step is not always performed, but it should be. It is extremely useful in ensuring that monitoring and evaluation findings inform program improvement and decision-making. A mechanism for providing feedback to program and evaluation planners should be built in so that lessons learned can be applied to subsequent efforts.

This step often surfaces only when a complication at the end of the program prompts someone to ask, "How has monitoring and evaluation been implemented, and how have the results been used to improve HIV prevention and care programs and policies?" If no plan was in place for disseminating monitoring and evaluation results, this question often cannot be answered because the monitoring and evaluation specialists have forgotten the details or have moved on. The absence of a plan can undermine the usefulness of current monitoring and evaluation efforts and future activities. Inadequate dissemination might lead to duplicate monitoring and evaluation efforts because others are not aware of the earlier effort. It also reinforces the negative stereotype that monitoring and evaluation are not truly intended to improve programs. For these reasons, programs should include a plan for disseminating and using monitoring and evaluation results in their overall Monitoring and Evaluation Work Plan.
Overhead: Key Elements of an M&E Work Plan

- Scope of the monitoring and evaluation
- Methodological approach
- Implementation plan
- Plan for disseminating and using the results
Overhead: Seven Steps to Developing an M&E Work Plan

1. Identify program goals and objectives.
2. Determine M&E questions, indicators, and their feasibility.
3. Determine M&E methodology for monitoring the process and evaluating the effects.
4. Resolve implementation issues: Who will conduct the monitoring and evaluation? How will existing M&E data and data from past evaluation studies be used?
5. Identify internal and external M&E resources and capacity.
6. Develop an M&E Work Plan matrix and timeline.
7. Develop a plan to disseminate and use evaluation findings.
Overhead: Assessing How Well the Evaluation Plan Works

- Are the evaluation activities progressing as planned?
- Are the evaluation questions being answered sufficiently? Are other data needed to answer these questions? How can such data be obtained? Should the evaluation questions be re-framed?
- Have other evaluation questions arisen that should be incorporated into the plan?
- Are there any methodological or evaluation design issues that need to be addressed?
- Are there any practical or political factors that need to be considered?
- Are any changes in the plan needed at this time? How will these changes be made? Who will implement them?
- Are appropriate staff and funding still available to complete the evaluation plan?
- How are findings from the evaluation activities being used and disseminated? Should anything be done to enhance their application to programs?
M&E Work Plan Template

DRAFT

(COUNTRY) (PROGRAM NAME)
HIV/AIDS/STI MONITORING AND EVALUATION WORK PLAN
(DATE)
M&E Work Plan Template: Contents

Introduction
Goals and Objectives of Country Program Monitoring and Evaluation Questions
Illustrative Monitoring and Evaluation Questions
Methodology
  Monitoring
  Evaluation Research
  Monitoring Quality of Services
  Special Studies
Data Flow
Management Information System and Data Feedback
Management Information System Cycle
Implementation
Evaluation Matrix and Proposed Timeline
Follow-Up System on Reporting Requirements
Data Dissemination and Use
Process Monitoring Tools
Introduction [1]

Monitoring and evaluation (M&E) should be an essential element of every program, providing a way to assess the progress of the program in achieving its goals and objectives and informing key stakeholders and program designers about the results. For M&E to be successful and provide useful results, it must be incorporated into the program at the design stage. That is, planning an intervention and developing an M&E strategy should be inseparable, concurrent activities. To ensure the relevance and sustainability of M&E activities, project designers, in collaboration with national and local stakeholders and donors, must work in a participatory manner to develop an integrated and comprehensive M&E work plan.

Projects at all levels, whether they consist of multiple integrated projects or single interventions, should include an M&E work plan. Such plans will guide the design of M&E, highlight what information or data need to be collected, describe how best to collect them, and specify how to disseminate and use the results of M&E. This comprehensive M&E Work Plan Template describes the overall purpose of M&E; presents specific M&E questions, methods, and tools; shows how to determine what data should be collected and how; describes M&E data flow; specifies the necessary resources and who will implement M&E; and presents a basic M&E plan timeline and plans for dissemination and data use.

Goals and Objectives of Country Program Monitoring and Evaluation Questions

An important aspect of the monitoring and evaluation plan is to clearly state the crucial questions of interest that can be answered through M&E activities. By stating these questions at the beginning, M&E specialists are better prepared to design the tools, instruments, and methodologies that will gather the needed information. The following pages contain a list of the most important questions to be answered by a monitoring and evaluation plan. In addition, methods for answering some types of questions have been identified and illustrated, along with indicators, in the Country Monitoring and Evaluation Matrix (see below).

The questions should be a combination of (1) questions that your organization would like to be able to answer, (2) questions that, when answered, will allow you to show progress toward your program goals and objectives, and (3) questions that, when answered, can be reported to USAID (or other donors, as appropriate) in response to their Strategic Objectives, Results, and Intermediate Results. The questions listed below represent some core indicators for USAID. Your program will probably have additional questions that you want to pose in order to manage your program, to improve your program, or to answer some of the "why" questions about your target group and program. In addition, these questions may lead your program into new areas.

[1] Taken from Monitoring and Evaluation Guidebook, Chapter 2, "Developing an integrated and comprehensive monitoring and evaluation plan," by Deborah Rugg and Stephen Mills, in process.
Illustrative Monitoring and Evaluation Questions (USAID Core Indicators)

Capacity-Building:
- How many training sessions were held to build organizational capacity, and how many people have been trained?
- How many commodities and/or drugs have been procured?
- How many proposals have been developed, and how many projects have been designed?
- How many organizations have been worked with to strengthen their organizational management?
- How many technical assistance activities have been carried out (by type of TA), and who has received the TA?

Coordination, Leadership, and Collaboration:
- How many guidelines have been developed, and how many standards have been established?
- How many conferences have been coordinated?
- How many in-country collaborative events have been held?

Policy Development:
- How many training sessions have been held for policy development, and how many people have attended?
- How many advocacy activities have been carried out, and how many people have been reached?
- How many networks, NGOs, and coalitions have been formed?
- How many new organizations have become involved in advocacy efforts?
- How many policy and advocacy tools have been developed and disseminated?
- How many policy and advocacy strategies and guidelines have been developed?

Condom Sales/Distribution:
- How many condoms have been sold and/or distributed?

Sexually Transmitted Infections:
- How many service providers have been trained?
- How many people were referred for STI diagnosis and treatment?
- How many people were served (by gender and age)?
- How many STI clinics were established using USAID funds?

Behavior Change Communication:
- How many training sessions on the comprehensive BCC approach were held, and how many participants were there?
- How many peer education trainings were held, and how many peers were trained (by target group)?
- How many people were reached?
- How many materials were developed and disseminated?
- How many community BCC events were held?
- How many formative studies/assessments were conducted?
- Are there differences in cost per person reached in different BCI projects? What explains these differences?
Prevention of Mother-to-Child Transmission:
- How many health facilities offer PMTCT services?
- How many service providers were trained?
- How many women attended the PMTCT sites for a new pregnancy in the past 12 months?
- How many women with known HIV infection were among all the women seen at PMTCT sites in the past year?
- How many infants received medication?
- What are the costs incurred to start PMTCT services at a site? What is the annual budget required to continue the PMTCT services?

Voluntary Counseling and Testing:
- How many counselors have been trained?
- How many clients (by gender and age) came to the VCT sites?
- How many clients (by gender and age) were tested?
- How many clients (by gender and age) received their test results?
- How many new VCT sites were established?
- How many VCT centers exist that are supported with USAID funds?
- Are there differences in cost per client tested across VCT projects? What explains these differences?

Tuberculosis:
- How many HIV-positive people are receiving TB prophylaxis?
- How many HIV-positive people are receiving TB treatment?
- How many people are referred to a TB clinic/department?
- How many service providers have been trained?
- How many people have been reached?

Intravenous Drug Users:
- How many service providers are trained?
- How many people (by gender and age) were reached?

Blood Safety:
- How many service providers are trained?
- How many people were reached?
- What percent of all donated blood is screened?

Antiretroviral Therapy:
- How many clinicians have been trained in ART management?
- How many HIV-positive persons are receiving ART?
- What percent of HIV-positive persons receiving ART are adhering to the drug regimen at the 95 percent level?
- How many ART programs are supported with USAID funds?

Clinic-Based Care:
- How many service providers have been trained?
- How many people were served?
- What are the costs incurred to start HIV/AIDS clinical services at a site?
- How much does it cost to scale up HIV/AIDS clinical services to other sites?
- What is the cost of the first-line ARV drug regimen per patient per year?
- What is the cost of basic laboratory tests per patient per year?

Home-Based Care:
- How many HBC providers have been trained?
- How many households are receiving services?
- How many HIV-positive individuals were reached by community- and home-based care programs?
- How many community- and home-based care programs are supported with USAID funds?
- Is home-based care using approach A more cost-effective than approach B?
- What is the cost of a basic HBC supplies kit?

Orphans and Vulnerable Children:
- How many service providers/caretakers have been trained in caring for OVC?
- How many OVC were reached (by type of service)?
- How many OVC programs are funded with USAID money?
- Are there differences in cost per OVC reached in different OVC projects? What explains these differences?

Delivery of Opportunistic Infection Services:
- How many service providers have been trained?
- How many people were served?

Methodology

Monitoring

Explain how the work plan will record the activities and the processes followed in implementing activities. Each program will need a means of collecting and reporting on the process that the activities followed. This system will not only report the specific steps that were undertaken to implement each activity; it will also document all of the activities in the general program in order to better explain any changes in risk behaviors in the target populations. The system should be flexible and should not overburden the implementing agencies, yet it must also provide comprehensive enough information to clearly document what is happening in each project.

Evaluation Research

Several methods can be used to complete this evaluation plan. To answer the questions about the effect the program has had on the target population's knowledge, attitudes, and behaviors, a detailed quantitative survey has to be implemented. (Describe the methods that will be used or have been used to answer these questions.) For example: The initial survey was conducted with 750 randomly selected youth from the three target areas: Georgetown, New Amsterdam, and Linden. This survey will be administered a second time approximately 30 months after the initial assessment.

The access-to-services questions will be answered by conducting ____. (Describe the methods that will be used or have been used to answer these questions.)
Monitoring Quality of Services

The following examples illustrate some methods that can be used to address the quality of services.

(As an illustrative example:) An organizational assessment has been conducted with three of the six NGOs participating in this program; the other three will also undergo the same assessment prior to program start-up. These six NGOs will be re-assessed approximately two years after start-up of the program. In addition, answers to questions from the quantitative youth survey will be used to determine the attitudes of the youth toward these NGOs and the services that they provide. To determine the effect of the collaborative framework on the provision of HIV/AIDS services by the NGOs, a qualitative assessment will also be conducted. Through in-depth interviews, a case study of each NGO will be created to describe the effect that working together in this collaborative manner has had on their ability to do their work.

(As an illustrative example:) As a program activity, many of the services used by the youth in the target areas, including STI services, are being assessed. The assessment is being conducted to determine what services providers are currently offering, to get an initial picture of the quality of these services, and to determine the youth-friendliness of these services. Several service providers (clinics, pharmacies, and private physicians) will be selected, and the degree to which youth access these services and feel positive about their interactions there will be assessed through exit interviews, mystery clients, interviews with the providers, and the quantitative youth assessment. This information will be used to design potential interventions to improve the services and to leverage the funds to conduct needed interventions.

In an attempt to evaluate the effectiveness of the trainings that will take place during this program, trainees will be monitored and interviewed at several points. (As an illustrative example:) Peer educators are an integral part of this program, and the quality of their work is key to the success or failure of the program. As a result, it is important to ensure that they are providing the highest quality education and support to their peers. To assess this, youth who participate in the peer educator training will have their knowledge, attitudes, and skills assessed at multiple intervals throughout the program. The first assessment will be carried out when they enter the training program. This will be followed by an assessment at the end of the training to determine its effectiveness at improving knowledge, attitudes, and skills. In addition, peer educators will be observed and assessed by their supervisors monthly and will participate in a formal assessment every six months.

Special Studies

Data Flow

In this section, describe how the data will flow from the frontline workers who are responsible for carrying out the activities (e.g., clinic staff, home-based care providers, and peer educators) to your funders, as well as how feedback will be given to the frontline staff. The following is an illustrative diagram of a data flow system.

[Diagram: FHI/Country Office Data Collection and Reporting Scheme. Data flow from the implementing agencies and partners to the FHI/Country Office, which maintains the programmatic database and reports onward to the USAID Mission, the FHI Arlington office, and other funders.]
Management Information System and Data Feedback

FHI/Nepal has developed a uniform reporting system for each program component. Each implementing agency (IA) reports its program information monthly. The information is then compiled and analyzed quarterly for the quarterly review, with a semi-annual compilation to support the reporting requirements of the program. FHI/Nepal has an effective system of gathering, processing, and feeding data back into ongoing programs.

Management Information System Cycle

[Diagram: Management Information System Cycle. Monthly data gathering feeds into input, analysis, dissemination, and feedback (with a link to GIS), which in turn support quarterly review and monitoring visits, better planning, and the ongoing program. Tools to support the MIS cycle include expectations, PIFs, MS Excel, MS PowerPoint, MS Access, and GIS.]
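The monthly-to-quarterly compilation step described above is simple tabulation, and some teams find it useful to see it written out. The following is a minimal sketch in Python of how monthly IA reports might be rolled up into quarterly totals for the review meetings; the record layout and the indicator names (people_reached, condoms_distributed) are hypothetical placeholders for illustration, not part of FHI's actual MIS.

from collections import defaultdict

# Hypothetical monthly reports: one record per implementing agency (IA) per month.
# Field names are illustrative only; substitute the indicators your program tracks.
monthly_reports = [
    {"ia": "NGO A", "month": "2004-01", "people_reached": 120, "condoms_distributed": 800},
    {"ia": "NGO A", "month": "2004-02", "people_reached": 150, "condoms_distributed": 950},
    {"ia": "NGO B", "month": "2004-01", "people_reached": 90, "condoms_distributed": 400},
    {"ia": "NGO B", "month": "2004-03", "people_reached": 110, "condoms_distributed": 500},
]

def quarter(month):
    # Map a "YYYY-MM" string to a quarter label such as "2004-Q1".
    year, mm = month.split("-")
    return "%s-Q%d" % (year, (int(mm) - 1) // 3 + 1)

def compile_quarterly(reports):
    # Sum each indicator by IA and quarter for the quarterly review.
    totals = defaultdict(lambda: {"people_reached": 0, "condoms_distributed": 0})
    for report in reports:
        key = (report["ia"], quarter(report["month"]))
        totals[key]["people_reached"] += report["people_reached"]
        totals[key]["condoms_distributed"] += report["condoms_distributed"]
    return dict(totals)

for (ia, qtr), indicators in sorted(compile_quarterly(monthly_reports).items()):
    print(ia, qtr, indicators)

Feedback to the IAs can then be as simple as returning each agency's quarterly totals alongside the combined program totals during the quarterly review.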
Implementation

Most information can be gathered through the program itself. However, when examining changes in HIV and STI prevalence among the target population, the national HIV/AIDS program must be involved. This activity requires establishing sero-surveillance sites equipped to systematically monitor the epidemic. This is not a task that the NGOs should take on; instead, they should work in collaboration with the National AIDS Program Secretariat and the Ministry of Health to ensure that the needed information is included in the data collection system for biological data.

For each of the main data collection systems, organizations and/or individuals must be identified to oversee the collection of the data. The following table illustrates what is needed. As shown, there are five evaluation areas (representing attitude and behavior change, NGO capacity improvement, and quality of service) that should be considered.

Activity | Organization or Individuals to Oversee Data Collection
Follow-up quantitative survey with youth | Local university
NGO baseline and follow-up assessments | Local consultant
NGO in-depth interviews and case studies | Local consultant
Other service provider assessments | Local university and local consultant
Peer educator knowledge, attitudes, and skills | Consultant developing curricula and trainers

Carrying out and completing components of the monitoring and evaluation in phases provides the M&E planning group with an opportunity to assess how well the M&E plan is working. After the first or second M&E activity is completed, it may become apparent that the plan needs to be revised or corrected. At this point, the M&E group should spend time ensuring that the plan is still useful. This process does not need to be overly complicated or involved. The group can begin merely by discussing the following questions:

- Are monitoring and evaluation activities progressing as planned?
- Are the initially posed M&E questions being sufficiently answered? Are other data needed to answer these questions? How can such data be obtained? Do the M&E questions themselves need to be reframed?
- Have other M&E questions arisen since the initial planning that should be incorporated into the plan?
- Are there any methodological issues that need to be addressed or changes that need to be made to the evaluation designs?
- Are there any factors, practical or political, that need to be considered in the M&E activities yet to be implemented?
- Are any changes in the plan needed at this time? How will these changes be made? Who will implement them?
- Is the right mix of personnel and fiscal resources still available to carry out the rest of the evaluation plan?
- How are the findings so far from the M&E activities being used and disseminated? Does anything need to be done to enhance their application to programs?

This assessment of the M&E plan is most helpful if it occurs at least annually. Keeping the plan current and relevant will help ensure the usefulness and quality of the remaining components. It will also foster the overall success and usefulness of all of the evaluation activities by the end of the project period. Reviewing the evaluation plan and performing mid-course corrections as needed also facilitates the connection between the evaluation activities and the programs, as well as the design of subsequent plans.

Some additional sources of data currently exist that may be useful in the M&E of a project. Such sources include existing datasets, ongoing efforts (activities) to collect other sets of data, and planned data collection efforts. These other sources should be examined for their usefulness in monitoring and evaluating this program, and, where appropriate, the organizations collecting these data should be invited to participate in the process. Following is a list of some other data sources. (The following list is for illustrative purposes only.)

1) BCCP has conducted an assessment of the capacity of three of the NGOs in the project.
2) The Green Report will provide countrywide epidemiology data before project start-up.
3) Population Services International (PSI) has conducted a feasibility study of condom social marketing.
4) The European Union (EU) will conduct an assessment of current condom distribution and storage activities and methods. (Here, "storage" refers to warehousing and general storage as it relates to condom quality.)
5) UNICEF has conducted a feasibility study among youth for a family life education program.
6) UNFPA has conducted two KABP studies with youth at project sites (1998 and 2000).
7) CAREC is conducting an evaluation of VCT services in Region 6 and will expand the VCT sites, as well as the evaluation of the sites.
8) NAPS is conducting an assessment of condom procurement and distribution.
9) The Ministry of Health has proposed conducting second-generation surveillance with numerous groups, including youth, nationwide.
In addition to these sources of data, there are a number of resources and sources of assistance for completing the monitoring and evaluation plan. These resources exist at a number of levels, from the NGOs themselves, to the national government, to groups and individuals in the country, to international donors. (The following matrix is provided for illustrative purposes only.)

Matrix of Monitoring and Evaluation Resources

[The original matrix lists illustrative resources in four columns: Internal (NGOs), National Programs, Country-wide, and External. Resources named include basic computer skills and computers, staff with knowledge of M&E methods, access to statistics, analytic software, the University of Guyana, CESRA, the Guyana Voluntary Consultancy, GRPA, the Bureau of Statistics, the Epidemiology Unit, the Health Science Education Unit, private sector support for grassroots projects, Dr. Janice Jackson, Bonita Harris, FHI, USAID, CAREC, PAHO, UNICEF, CIDA, and JICA/Grants.]

Evaluation Matrix and Proposed Timeline

The following matrix depicts the objectives, their related activities, the variables to be monitored and evaluated, and the methods for conducting the evaluations. This matrix is built on a conceptual monitoring and evaluation framework that draws on all stages of M&E to provide an overall picture of the program.

The variables listed in the Input column are those that account for the existing resources within the NGOs carrying out this program. The items listed in the Output column are variables expected to occur as a result of the program; they are considered immediate and short-term results that may ultimately reduce the risk of contracting HIV (e.g., knowledge of HIV transmission, condom availability, number of youth participating in peer education programs, and percent of services with improved quality). The Outcomes are short-term and immediate effects that may result from this program; it is assumed that they have the most direct effect on the risk of contracting HIV (e.g., condom use, STI prevalence, and reduction in number of partners). For prevention programs, the items in the Impact column refer directly to the prevalence of HIV in target populations.

(Information in the following matrix is for illustrative purposes only.)
Country Monitoring and Evaluation Matrix

STRATEGIC OBJECTIVE 1: Increased number of young people (ages 8-25) having access to quality services from USAID-assisted indigenous NGOs.
USAID Intermediate Objective:

The matrix columns are: Activities/Resources; Outputs; Key Output Indicators; Key Outcomes; Definition of Key Outcome Indicators; Sources of Data and Collection Methods; Frequency of Data Collection; and Responsible Person(s) & Team. In this illustrative template, columns not shown below are left blank for the country team to complete.

1) Building NGO capacity (trainings, administrative support, TA, ensuring funds)
Outputs: increased access to NGO services by youth
Key outcomes: increased quality of services
Definition of key outcome indicators: determined by client satisfaction surveys
Sources of data and collection methods: service statistics; exit interviews with youth at specific service providers
2) Outreach to youth (street fairs, NGO trainings for peers, peer workshops, peers educating peers)
Outputs: increased knowledge levels among targeted youth; increased skill levels of peer educators
Key outcomes: change in attitudes toward HIV/AIDS and associated risks
Sources of data and collection methods: baseline and follow-up quantitative youth assessment; pre- and post-training assessment of skills; six-month post-training assessments; supervisor observation of activities

3) Public communications (TV, radio, posters and flyers)
Outputs: increased awareness of risk among targeted youth
Key outcomes: quality media campaigns and interpersonal communication materials
Sources of data and collection methods: baseline and follow-up quantitative youth assessment; focus groups to pilot-test materials
Illustrative Timeline

The proposed timeline lays out, quarter by quarter from the first quarter of Year One through the second quarter of Year Three, when each activity will be assessed. The activities to assess are:

Changes in levels of knowledge and risk behaviors among youth
NGOs' capacity to provide HIV/AIDS services
Quality of services being provided by NGOs
Monitoring the evolution of the Executive Steering Committee
Access to NGO services
Use of STI and support services
Quality of STI healthcare providers
Media campaigns and interpersonal communication materials
Skill levels of peer educators
Secondary data

In the illustrative timeline, most of these activities are scheduled for one or two assessments over the project period; skill levels of peer educators are assessed in nearly every quarter, and secondary data are reviewed at three points.
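Teams that maintain the matrix in a spreadsheet sometimes also want it in a machine-readable form so that rows can be sorted, filtered, or checked programmatically. The short sketch below is not part of the template; it simply shows one way a single matrix row could be represented, using the matrix columns as field names. The example values are taken from the illustrative matrix above, and the frequency and responsible-person entries are hypothetical placeholders.

```python
# Minimal sketch: one illustrative row of the country M&E matrix as structured data.
# Field names mirror the matrix columns; frequency and responsibility are hypothetical.
from dataclasses import dataclass
from typing import List

@dataclass
class MatrixRow:
    activity: str                              # Activities/Resources
    outputs: List[str]                         # Outputs
    key_outcomes: List[str]                    # Key Outcomes
    outcome_indicator_definitions: List[str]   # Definition of Key Outcome Indicators
    data_sources: List[str]                    # Sources of Data and Collection Methods
    collection_frequency: str = ""             # Frequency of Data Collection (to be completed)
    responsible: str = ""                      # Responsible Person(s) & Team (to be completed)

row = MatrixRow(
    activity="Building NGO capacity (trainings, administrative support, TA, ensuring funds)",
    outputs=["Increased access to NGO services by youth"],
    key_outcomes=["Increased quality of services"],
    outcome_indicator_definitions=["Determined by client satisfaction surveys"],
    data_sources=["Service statistics",
                  "Exit interviews with youth at specific service providers"],
    collection_frequency="Semi-annual",   # hypothetical placeholder
    responsible="NGO M&E officer",        # hypothetical placeholder
)
print(row.activity, "->", "; ".join(row.key_outcomes))
```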
Follow-Up System on Reporting Requirements

Each implementing agency (IA) agrees to specific reporting requirements, and these requirements may differ for each IA. The country office has developed a follow-up system that has effectively ensured that IAs submit the required information and that they submit it on time.

Data Dissemination and Use

Based on an established country program management information system (MIS) framework, different reporting systems have been developed. IAs submit their program information monthly. Analyzing program information and providing feedback to improve ongoing activities and to plan for upcoming activities are important contributions of the MIS. During regular meetings with IAs, the analyzed program information is used in the program review process. This regular system of sharing program information and interacting with field program staff also builds local capacity for monitoring and evaluation.

An important function of the MIS is to disseminate program information with the intent of helping improve field programs. Disseminating monitoring and evaluation findings supports the usefulness of M&E and of future activities, and it reduces the redundancy that can occur when others are unaware of the findings of previous M&E efforts. Furthermore, disseminating results helps break down one of the negative stereotypes about M&E: that it is not intended to help improve programs. For these reasons, a plan for disseminating and using M&E findings is crucial. Such a plan should address three questions: What information should be distributed? Who needs the information? How does the information get distributed?
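The guide does not prescribe a particular tool for the follow-up system; a paper log or spreadsheet works equally well. As one possible illustration, the sketch below flags implementing agencies whose monthly reports for a given month have not yet been received. The agency names and the reporting month are hypothetical.

```python
# Illustrative sketch only: flag implementing agencies (IAs) whose monthly
# program reports have not been received. Names and month are hypothetical.
expected_ias = ["NGO A", "NGO B", "NGO C"]

# Reports received so far, keyed by reporting month.
received_reports = {"2004-06": ["NGO A", "NGO C"]}

def missing_reports(month: str) -> list:
    """Return the IAs that have not submitted a report for the given month."""
    submitted = set(received_reports.get(month, []))
    return [ia for ia in expected_ias if ia not in submitted]

print("Missing for 2004-06:", missing_reports("2004-06"))  # -> ['NGO B']
```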
PROCESS MONITORING TOOLS

(The following monitoring tools are presented for illustrative purposes only and should be adapted to reflect the activities and reporting needs of each country. The information from these forms is intended to be used by the in-country program manager to monitor the activities and benchmarks of programs. The forms are useful for identifying areas of weakness and strength, as well as for creating biannual and annual reports to be sent to headquarters.)

Process Monitoring of Public Communications Activities: TV, Radio, Posters, and Flyers

Month:
1) Number of TV spots aired: #
2) Number of radio spots aired: #
3) Number of posters distributed: #
4) Number of flyers distributed: #
Process Monitoring of Outreach to Youth Activities: NGO Trainings for Peers, Peer Workshops, Peers Educating Peers

Month:
1) Number of peer educator trainings conducted: #
2) Number of peer educators trained: # females, # males
3) Number of supervisor trainings conducted: #
4) Number of peer educator supervisors trained: # of females, # of males
5) Number of peer educators supervised/observed: #
6) Number of follow-up trainings held: #
7) Number of events/activities held for peer educators: #
Process Monitoring of Outreach to Youth Activities: Street Fairs

Month:
1) Number of events conducted: #
2) Number of peers contacted: # of females, # of males
Process Monitoring of NGO Referrals to Identified Service Providers

Month:
1) Number of referrals made to VCT centers: #
2) Number of referrals made to STI centers: #
3) Number of referrals made to other types of youth services: Total #
   Family planning clinic: #
   Alcohol support: #
   Drug support: #
   Rape crisis: #
   Educational support: #
   Social services: #
   Other: #
Process Monitoring of Building NGO Capacity Activities

Month:
1) Trainings for capacity-building received: # conducted
   Type of training:
2) Administrative support received
   Type of support:
3) Technical assistance received
   Type of TA received:
4) Diversification of funds
   Types of additional funds received:
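Because the monthly process monitoring forms feed the biannual and annual reports sent to headquarters, a program manager may want a simple way to total the monthly counts across a reporting period. The sketch below illustrates one possible approach; the indicator names are drawn from the forms above, but the figures themselves are hypothetical.

```python
# Illustrative sketch only: total monthly process-monitoring counts into a
# semi-annual summary. Indicator names come from the forms; figures are hypothetical.
from collections import Counter

monthly_counts = [
    {"TV spots aired": 12, "Radio spots aired": 30, "Peer educators trained": 15},
    {"TV spots aired": 8,  "Radio spots aired": 25, "Peer educators trained": 0},
    {"TV spots aired": 10, "Radio spots aired": 28, "Peer educators trained": 20},
]

semiannual_totals = Counter()
for month in monthly_counts:
    semiannual_totals.update(month)   # adds counts indicator by indicator

for indicator, total in sorted(semiannual_totals.items()):
    print(f"{indicator}: {total}")
```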
Family Health International
Institute for HIV/AIDS
2101 Wilson Blvd., Suite 700
Arlington, VA
Tel:
Fax: