PART III: DME CAPACITY ASSESSMENT TOOLKIT

IMPACT GUIDELINES PART III: DME CAPACITY ASSESSMENT TOOLKIT VER. 2.04

PROJECT DME CAPACITY ASSESSMENT TOOL DISCUSSION GUIDE AND QUESTIONNAIRE 1

Introduction

CARE projects in each Country Office (CO) are encouraged to assess their capacities for program Design, Monitoring and Evaluation (DME). As a means of helping them to do this, the following list of principles and related questions is offered to help project and CO staff think through some of the many elements involved in good D+M&E. This list is based on the Impact Evaluation Checklist contained within the CARE Impact Guidelines. 2 The participants in a DME Capacity Assessment exercise are referred to that document. It should be read, or, at a minimum, the PowerPoint overview of DME principles contained in the CARE Impact Guidelines should be viewed and discussed before the following questions are addressed.

All project staff with any responsibilities related to design and/or monitoring and evaluation should be invited to be a part of this assessment. It is recommended that at least a full day be set aside for this purpose. The intent of posing these questions is to stimulate reflection and discussion about what it takes to design good programs and projects, to set up and operate effective monitoring systems, and to plan for and conduct/coordinate evaluations which meet acceptable standards of quality.

There are at least two purposes for using this DME Capacity Assessment Tool (CAT). The first is to guide project staff in reflecting on how their project was designed, and on its systems for monitoring and evaluating progress towards achieving impact. This can help to bring a clearer picture of what good DME principles are, and how well the project is doing in light of the standards set forth in the CARE Impact Guidelines. It is hoped that this "reality check" will help project staff to focus on those issues which appear to be especially revealing or pertinent to their project.
These responses can then be used to guide the development of strategies to strengthen the DME capacity of that project.

1 This updated version of the DME CAT was developed with CARE Bangladesh during the DME CA Workshop, May.
2 The CARE Impact Guidelines are based on work of the Impact Evaluation Initiative Working Group (IEI WG) at a workshop held at the CARE USA Headquarters, Atlanta, April.

CARE Project DME Capacity Assessment Tool ver

The second objective for using this instrument is to use a standard format to measure the strengths and weaknesses of each project in meeting the DME standards. The assessments of all projects will then be aggregated and analyzed to develop a profile of the DME capacity of the Country Office as a whole. This objective information should then be used to inform the creation of CO-level strategies for strengthening the DME capacities of all programs and projects.

The pattern which follows for each stage of the DME cycle includes the following three subsections:
1. a brief review of the DME principles and sub-principles;
2. an open discussion of how this was done / is being done / is planned to be done in this project; and then
3. written responses to specific questions (numbered Q).

It may be useful to record the narrative comments made during the focus group discussion, as this can be part of the project's documentation of lessons learned for historic purposes. While only the responses to the specific questionnaire (Q questions) need to be reported to and aggregated at the CO level, the narrative notes can be useful to provide more in-depth contextual content.

Finally, after completing this project assessment, individual staff with some form of responsibility for Design and/or Monitoring and/or Evaluation should be asked to rate their current skills using the Assessment of Individual Staff Skills / Training Needs in DME, which begins on page 19.
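Since the Q-question responses are meant to be aggregated at the CO level, that roll-up can be sketched in code. This is a minimal illustration only; the numeric scoring (Yes=2, Somewhat=1, No=0) and the question labels are assumptions, not part of the toolkit:

```python
# Minimal sketch: aggregate per-project Yes/Somewhat/No answers into a
# Country Office profile. The scores (Yes=2, Somewhat=1, No=0) and the
# question keys below are illustrative assumptions only.
SCORES = {"Yes": 2, "Somewhat": 1, "No": 0}

def co_profile(project_responses):
    """Average each question's score across all projects (0.0-2.0)."""
    totals = {}
    for responses in project_responses:          # one dict per project
        for question, answer in responses.items():
            totals.setdefault(question, []).append(SCORES[answer])
    return {q: sum(v) / len(v) for q, v in totals.items()}

# Example: two projects answering the same two questions
profile = co_profile([
    {"Q2.0 formal diagnosis": "Yes", "Q3.0 logframe": "Somewhat"},
    {"Q2.0 formal diagnosis": "No",  "Q3.0 logframe": "Yes"},
])
# profile["Q2.0 formal diagnosis"] == 1.0; profile["Q3.0 logframe"] == 1.5
```

Low average scores then point to the DME stages most in need of a CO-level strengthening strategy.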

PROJECT DME CAPACITY ASSESSMENT TOOL (DME CAT)

1. Concept

The CARE Impact Guidelines begin with diagnosis. But even before a diagnosis (needs assessment) is undertaken, someone has made the decision to consider the possibility of designing a project to address the needs of a particular community. The source of the concept can make a big difference on whether or not there is scope for a holistic diagnosis and thorough design process before the proposal is written to obtain funding for a project.

1.1. Where did the initial idea come from that gave birth to this project? Who initiated the concept?

Q1.0 Source of initial concept for project (check appropriate category box, then, if useful, give name of person[s] who came up with the idea). Category / Name(s): Donor (i.e. RFP / RFA); CARE International Member HQ; CARE Country Office; CARE Project Staff; Other:
Q1.1 Was there an initial concept paper prepared for this project? Yes / No
Q1.2 Who wrote the concept paper? Name / Position
Q1.3 When was the concept paper completed? (date)
Q1.4 Was this project based on the results of a pilot project or previous phase? Yes / No

1.2. What were the main considerations which led to the decision to seek funding for this follow-on phase?

2. Diagnosis

Effective measurement of impact begins with sound diagnosis, analysis, and design. Begin with clear problem identification. [The following questions should help lead a discussion on the process which led to the design of this project.]

2.1. What was the process of problem identification and project design before the proposal for this project was written?

2.2. Who was involved? Were the major stakeholders (including community representatives, partners and donors, as well as CARE staff) involved in the diagnosis and problem identification process?
2.3. Was there a specific survey conducted as a part of an assessment/diagnosis process used prior to the designing of this project?
2.4. Looking back, would you say that the appropriate level of rigor was used in data collection and analysis during the assessment/diagnosis process?
2.5. Was the project diagnosis and design process holistic, open and creative? Did it explore the causes of the identified problem? Or were interventions selected just because staff had previous experience with them?
2.6. Was a problem tree developed? Was it helpful?
2.7. Were opportunities explored for working with available CBO, NGO and government institutions as potential partners to help address the identified problem?

Use holistic Household Livelihood Security (HLS) as the analytic framework for every project's problem analysis. However, HLS should be adapted and interpreted for local conditions.

2.8. Was the HLS framework / perspective used during the diagnosis of the needs to be addressed and interventions to be used by this project? Did the diagnosis leading to the design of this project include assessments of needs, resources, opportunities, risks, and leverage points?
2.9. What methodology was used (e.g. HLS Assessment, appreciative inquiry, benefit-harm risk analysis, gender analysis, etc.)? Was there a participatory process that involved community members?

Have donor priorities influenced the design of this project? If so, in what ways? Were government priorities, interests, and capabilities taken into consideration? If so, in what ways?

[Summarize the above discussions by selecting the relevant categories on the following table.]

Q2.0 What form of needs & opportunity diagnosis (assessment) was used to identify the problems to be addressed by this project?
(circle all appropriate responses)
- Was any formal diagnosis conducted? Yes / Somewhat / No
- Project design based on staff's own prior knowledge of community: Yes / Somewhat / No
- Informal key informant discussions: Yes / Somewhat / No
- Sector-specific assessment: Yes / Somewhat / No
- Was secondary data used to inform the diagnosis? Yes / Somewhat / No
- Full HLS Assessment: Yes / Somewhat / No
- Other form of holistic, multi-sectoral assessment: Yes / Somewhat / No
- Institutional mapping: Yes / Somewhat / No
- Did members of the target community participate in the diagnosis? Yes / Somewhat / No
- Was the diagnosis conducted before the proposal was submitted? Yes / No
- Was the diagnosis conducted after funding was secured but before the DIP (detailed plan) was written? Yes / No

Select the project target population based upon problem identification and analysis before project design.

Who is the target group for this project? What criteria were used to identify the target population(s)? Was it guided by the CO LRSP (Long-Range Strategic Plan), the diagnosis, or what? Were communities, donor(s), and partners involved in the identification and selection of the target population? Since project start-up, has the identification of the target population been reviewed, reconfirming the relevance of the problem and the project goals for the target population(s)?

For this project, are beneficiaries different from participants? If so, how and why? Are there clear definitions of: primary (direct) participants? Secondary adopters (multiplier effect)? Direct beneficiaries? Indirect beneficiaries?

Q2.1 Describe the intended direct beneficiaries of this project. [The API will ask for this definition.]
Q2.2 Describe the intended indirect beneficiaries of this project.

3. Design I: Logic

Sound project design involves a clear and logical cause-effect framework. There should be a logical linkage between proposed outputs and outcomes. More specifically, there should be a clear cause-effect relationship between inputs, activities, outputs, effects and impact.

Q3.0 Has the project design been summarized in a logical framework (logframe)? Yes / kind-of / No

3.1. Does the logframe include clear goals, objectively verifiable indicators, means of verification, and relevant assumptions?
3.2. Looking at it now, would you say that there is a clear, logical linkage between the different levels?

3.3. Does the logframe or text clearly state the hypothesis which underlies the causal links in the project design itself, especially the relationship between outputs and outcomes?
3.4. Does the project design (proposal) cite relevant research and evaluation reports that support the hypothesis linking outputs and outcomes? Have the proposed interventions been tried elsewhere, under similar conditions? If so, was it proven that they led to desirable effects and impact?
3.5. If this project was a follow-on from a pilot or a previous phase, how were lessons learned incorporated into the design?

Q3.1 How confident are you that there is a clear, logical linkage between outputs and outcomes (inputs, activities, outputs, effects, impact)?

4. Design II: Goals Link to Higher Goals

A project Final Goal must be clearly and explicitly linked to higher-level program and strategic goals. Projects should not be isolated, but clearly embedded in long-term program and strategic frameworks.

4.1. In what ways were the goals of this project influenced by the CO LRSP?
4.2. Does the CO have strategic plans for Sectors and/or Areas (geographically focused regions)?
4.3. If so, is there coordination of the logframes and M&E systems (including plans for impact evaluation) for multiple projects working with the same population?

Program impact should be ultimately measurable at the household level.

Is there a program goal to which this project contributes? What is that program (ultimate impact) goal? Does it address household-level impact? Is it clear how the project will contribute to that higher goal? If so, where and how is this hypothesis stated?

Q4.0 Does the project design state how this project contributes to (fits within) a higher Program goal or strategy? Yes / kind-of / No

A project Final Goal should be significant, yet achievable and measurable within the life of the project.

Q4.1 What is the stated Final Goal of this project?
3 For questions asking for a Likert ranking, 1=lowest, 5=highest; i.e. 1=very poor, 2=poor, 3=fair, 4=fairly good, 5=extremely good.

Q4.2 Does the project Final Goal address household-level impact? Yes / kind-of / No

4.1. In the light of the issues raised in the Impact Evaluation Checklist, would you say that the project Final Goal is significant? What criteria would you use to judge whether or not it is significant? Does the project Final Goal address the appropriate level of change? Would an objective DME expert define it as output, effect or impact?
4.2. Is it reasonable for the project to achieve its Final Goal before its time runs out (final evaluation)? Is it technically feasible and cost-effective to measure the indicators related to the achievement of the Final Goal during the life of the project?

Place primacy upon the definition of high-quality goals before determining appropriate indicators.

4.3. We've already asked whether or not a problem tree was developed during the diagnosis phase of this project. Were the Final and Intermediary Goals and Output Objectives based on such a problem tree?
4.4. Do the Intermediate Goals clearly and directly relate to the achievement of the Final Goal?

Q4.3 Do the project Intermediate Goals address effect-level changes in behaviors or systems? Yes / kind-of / No

4.5. Are the planned outputs stated in terms of Output Objectives (i.e. goals aimed at achieving outputs)? Is there a clear link between them and the Intermediate (effect) Goals?
4.6. Examine all the goals and objectives for this project. Are they well stated? Consider whether or not they seem to meet the SMART criteria (Specific, Measurable, Appropriate, Reliable, Time-bound).

Q4.4 Do the goals meet the SMART test? Yes / kind-of / No

4.7. Who was involved in the definition of the goals? Specialists (consultants, CARE International or within the CO)? Partners? Donors?
4.8. Have there been changes in goal magnitudes (targets) based upon baseline and/or evaluation data? Have these changes been documented?
All projects should address the development and measurement of capacity strengthening of institutions and individuals (social capital and human capital).

Does this project address institutional capacity strengthening? How?

Was analysis of social and human capital included in diagnosis and design? Were they addressed by any goals? Will they be (have they been) included in evaluations?

Q4.5 Summary comments/recommendations on how future DME strategy should address issues related to improving project logic and program and project goals.

5. Design III: Indicators

Indicators should be relevant to the goals they represent, qualitatively or quantitatively measurable, objectively verifiable, reliable, and meet international professional standards, yet be understandable and appreciated by project participants and stakeholders.

Select indicators that are appropriate to each goal level (impact indicators for the Final Goal, effect indicators for Intermediate Goals).

5.1. Were the indicators selected from existing lists of tested impact and effect indicators? What sources were used? In what ways were they adapted to local conditions and project capabilities? Does the set of indicators for the project Final Goal permit measurement of impact at the HH level for the target population?

Use standard criteria for evaluating indicators: valid, reliable, relevant, technically feasible, specific, sensitive, cost-efficient.

5.2. Do the project's indicators meet these criteria?

Use indicators that can be understood by and are meaningful to the primary project stakeholders, including participants and project staff, as well as donors.

5.3. Within the set of indicators, are there those that are understood and meaningful to: project donors? Partners? Project management? Project field staff? Project participants?

Develop and use a combination of relevant indicators.

5.4. Among the indicators being measured by this project (either during monitoring or evaluation), are there those which will:

- Measure impact?
- Measure effect?
- Measure outputs?
- Monitor assumptions?
- Measure social capital?
- Measure societal factors?
- Measure beneficiary perceptions, with their participation?
- Measure sustainability?
- Measure partner institutional capacity?
- Measure intended and unintended effects on different segments of the population (e.g. marginalized social classes, women & girls, etc.)?
- Measure gender dynamics?
- Measure intended and unintended effects on the environment?
- Measure the capacity of CARE staff?

Ensure that indicators and the means for measuring them are culturally sensitive (with minimal social trespass on local values).

5.5. Do the indicators meet these criteria? I.e., are there ethical or cultural problems involved in measuring them?

Q5.0 Are you satisfied with the indicators proposed to measure impact, effect and outputs of this project?
- Indicators of impact: Yes / Somewhat / No
- Indicators of effects: Yes / Somewhat / No
- Indicators of outputs: Yes / Somewhat / No

5.6. If you do not feel the indicators are as appropriate as they should be, why not? How do you feel they could be improved? Indicator / Why inappropriate? / How to improve?

6. Design IV: Proposal

Q6.0 Was there a formal proposal written and submitted to donor(s)? Yes / No
Q6.1 How long was the process from initial concept to submission of proposal? (I.e., was there a short deadline to meet?) weeks / months

Q6.2 Who was (were) the primary author(s) of the proposal? Name / Position
Q6.3 Who else was directly involved? Name / Position

6.1. Was the project design reviewed by a review panel of experts for logical consistency and adherence to sectoral best practices? Was this a panel made up of internal CARE CO staff, or did it include external experts as well? Is there a mechanism for proposal review in your CO? Was it followed during the preparation of the proposal for this project? Did it influence the design? In what ways?

Q6.4 Who (internal or external) was involved in the proposal review process? Name / Position
Q6.5 Were their comments and suggestions helpful? Yes / somewhat / No
Q6.6 Is it clear what guidelines they use when reviewing project proposals? Yes / No
Q6.7 When was the proposal submitted to the donor(s)? (month/year)
Q6.8 When did the project officially get started? (month/year)
Q6.9 What suggestions do you have for improving the project design review process?
Q6.10 Project DIP (Detailed Implementation Plan, Inception Review Plan or similar project document): Was there sufficient detail in the original proposal? Yes / No Or, was there a separate DIP developed for this project? Yes / No
Q6.11 In general, how would you assess the quality of the original project proposal submitted for your project (e.g. clear, comprehensive, realistic, etc.)?

Q6.12 In general, how would you assess the quality of the current working document (DIP) for your project (e.g. clear, comprehensive, realistic, etc.)?

Maintain problem analysis and project design as an on-going process through the life of a project. Based upon new information, this can lead to project re-design or adjustments (if needed).

Is there flexibility (on the part of the donor and/or management) to review the project's problem analysis and design based on findings from the baseline, monitoring or midterm evaluation? Has there been any such re-design? What were the major changes? Were they documented? Is there a process for ensuring the timely use of information that may have redesign implications?

Q6.13 Has the project design (including logframe) been modified in response to information obtained from baseline, monitoring or mid-term evaluation (reality check)? Yes (in major ways) / Somewhat (in a minor way) / No (not at all)
Q6.14 Have these changes in project design been systematically documented? Yes / somewhat / No
Q6.15 Summary comments/recommendations regarding how DME strategy should address Design issues:

7. M&E Methodologies

Project M&E plans should provide sufficient detail to clearly identify evaluation design, sources of data, means of measurement, timelines, data processing and analysis, dissemination of information to and utilization by key stakeholders, and responsibilities for each of these processes.

Base the M&E plan on an evaluation (or research) design that includes key questions, the plan for the baseline study, whether or not a comparison group is called for, preliminary plans for the final evaluation, and whether or not there should (can) be a post-project evaluation.

7.1. Is there a clearly planned evaluation design that guides the M&E plan of this project? Describe the evaluation design. Where is this plan documented? In what format is the evaluation design and/or the M&E plan presented?

What are the key evaluation questions?

7.2. Do the M&E plans call for a baseline? With or without a comparison group?
7.3. Will there be (has there been) a mid-term evaluation? Will (did) it measure progress towards effect and/or impact goals?
7.4. Will there be a final evaluation of this project? If so, will the final evaluation include a HH survey? Will the survey measure progress towards indicators of the final goal? Will it be comparable with the baseline (same methodology and level of rigor)? Will there be a comparison group?
7.5. Are there plans (and funds) for conducting a post-project evaluation? Or, are there plans for conducting post-intervention evaluations? (Discuss the difference.) [There may be multiple phases of interventions with different individuals/groups within the life of a project.] If so, after how many years is the follow-up evaluation to be conducted?

Q7.0 Does a project document or M&E plan include some form of evaluation plan or design, including key questions to be addressed by, and preliminary plans for, the baseline, midterm and final evaluations? Yes / somewhat / No

Employ a balance of methodologies during assessments, monitoring and evaluation.

7.6. What methodologies are included in this project's M&E plan? Do the M&E plans call for the use of a reasoned mix of both qualitative and quantitative methods? Describe. For example, do they include any of the following, and, if so, in what ways? Participatory methods? Quantitative surveys? Multi-disciplinary focus? Secondary and primary data sources?

Ensure that findings from different methodologies used in monitoring and evaluation result in triangulation, combinations of methods, and participatory perspectives.

7.7. Do the methods used provide adequate triangulation and cross-checks (getting closer to the truth by looking from different angles, through different lenses)?
7.8. Are there examples of M&E tools which have been modified as a result of the learnings from other methodologies, e.g., PRA exercises that lead to changes in survey items?
7.9. In what ways are participants involved in monitoring and/or evaluation?

Use appropriate levels of rigor, collecting useful information in cost-effective and timely ways. This includes choice of methods involving sample design and sample size, precision, indicator validity, etc.

7.10. Has there been a conscious choice of levels of rigor for planning for and conducting various M&E events, e.g. needs assessment, baseline, monitoring, midterm and final evaluations? Are these specified in the M&E plan? Has there been a review of degrees of methodological rigor, including indicator validity (e.g., direct observation, proxy, recall)?

M&E PLANS

Where are the detailed M&E plans for this project documented? (E.g. in the proposal, in the Detailed Implementation Plan, or in a separate M&E plan.)

Q7.1 Does the project have specific M&E plans which contain sufficient detail to guide those who are responsible for implementing them? Yes (the M&E plan is a useful tool for all responsible) / Somewhat (M&E plan OK, but lacks sufficient details) / No (too vague or non-existent)
Q7.2 Who designed the detailed M&E plans for the project? [Name, then give category.] Name / Position
- The author of the project proposal? Yes / No
- Member(s) of project implementation staff? Yes / No
- Somebody from the Country Office? Yes / No
- Someone from a CI Member HQ? Yes / No
- An outside consultant/technical advisor? Yes / No
- Donor(s)? Yes / No
- Partners? Yes / No

COUNTING BENEFICIARIES

Earlier we asked for the definition of direct beneficiaries. Was there a target number of beneficiaries proposed in the proposal?

Q7.3 If so, how many target direct beneficiaries? Does the M&E system of this project count actual direct beneficiaries? If so, how?
Q7.4 Method for counting direct beneficiaries (check the one method most closely representing that used by this project's M&E system):
A. A central register of beneficiaries keeps track of individual persons (and/or households) reached by all interventions. This can eliminate double counting (and analyze synergy) of those persons reached more than one time or in more than one way.
B. Records are kept of beneficiaries for each intervention, but there is no way to avoid double counting (persons reached in more than one way).
C. Occasional counts are made of a sample of beneficiaries. Net totals have to be estimated.
D. Numbers reached are just estimates.
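Method A above amounts to keeping a register keyed by a unique person or household ID, so that a beneficiary reached by several interventions is counted only once in the net total. A minimal sketch, assuming illustrative field names and IDs (it also records gender, anticipating the Q7.7 disaggregation question later in the questionnaire):

```python
# Minimal sketch of counting method A: a central register keyed by a
# unique person/household ID eliminates double counting across
# interventions. IDs, intervention names and fields are illustrative.
class BeneficiaryRegister:
    def __init__(self):
        self._people = {}  # person_id -> {"interventions": set, "gender": str}

    def record(self, person_id, intervention, gender=None):
        entry = self._people.setdefault(
            person_id, {"interventions": set(), "gender": gender})
        entry["interventions"].add(intervention)

    def net_total(self):
        """Net direct beneficiaries: each person counted once (method A)."""
        return len(self._people)

    def gross_total(self):
        """Per-intervention contacts summed; contains the double counting
        that methods B-D cannot remove."""
        return sum(len(e["interventions"]) for e in self._people.values())

    def by_gender(self):
        """Disaggregate the net total by recorded gender (Q7.7, method A)."""
        counts = {}
        for e in self._people.values():
            counts[e["gender"]] = counts.get(e["gender"], 0) + 1
        return counts

reg = BeneficiaryRegister()
reg.record("HH-001", "water", gender="F")
reg.record("HH-001", "credit", gender="F")   # same person, second intervention
reg.record("HH-002", "water", gender="M")
# reg.net_total() == 2, while reg.gross_total() == 3
```

The gap between the gross and net totals is exactly the double counting that method A removes, and the same register supports the synergy analysis mentioned under A.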

Q7.5 How many net direct beneficiaries did this project reach during the past year? [Note: this is the number that should appear in the summary table in the API (Annual Project Information) report.]

Earlier we also asked for the definition of indirect beneficiaries. Was there a target set in the project proposal? If so, what was the target number of indirect beneficiaries to be reached by this project? Does the M&E system of this project have a way to actually count indirect beneficiaries (i.e. during the final evaluation survey)? Or do you have a formula for calculating indirect beneficiaries? If so, what is it?

Q7.6 Do you have a number (measured or estimated) of indirect beneficiaries reached by this project last year? If so, what was it?

Does your project's M&E system disaggregate by different segments of the population? If so, what are those segments? Why are you interested in analyzing these separately?

Q7.7 Does your project's M&E system disaggregate by gender? Yes / No If so, how? (Check the method closest to that used by this project's M&E system.)
A. An actual register of participants records gender.
B. Gender ratios are based on a sample.
C. Figures are estimated.
Q7.8 Overall, how would you rate the quality and details of the M&E plans (contained in the project proposal, DIP, or separate M&E plan)?

8. Implementing M&E: Baseline

8.1. Are you clear what the difference is between a diagnosis (assessment of needs and opportunities) and a baseline? (Note that the following questions relate specifically to the definition of a baseline study, i.e. focused on the indicators of effect and impact, done as the initial part of the evaluation plan.)

Q8.0 Was there a baseline study done for this project? Yes / No
Q8.1 If so: Was it a quantitative survey? Yes / No Was it different from the diagnosis/needs assessment? Yes / No If a quantitative survey, what was the sample design? If a quantitative survey, what was the sample size?

8.2. In this project, was there a distinction between project start-up and the start-up of implementation? (This can make a difference in answering the following question.)
8.3. When did the M&E plans call for the initial baseline study to be conducted?

Q8.2 If there was a baseline, when, and how long before or after the start-up date of project implementation, was it conducted?
- Was the baseline conducted before or after the start-up of implementation? Before / After
- Date baseline study was completed:
- Date project implementation began:
- Months between baseline and start-up of implementation:

Q8.3 Did the baseline include quantitative data on indicators that will be used to evaluate achievement of the goals? Indicators of Final Goal (impact): Yes / No Indicators of Intermediary Goals (effect): Yes / No

8.4. Did the baseline include qualitative indicators of impact and effect goals? Were these done in a way that they will be comparable with the final evaluation?

Q8.4 Did the rigor of sample design, sample size and data collection used for the baseline survey match that which will be required of the final evaluation? Yes / No
Q8.5 What is your assessment of the quality and utility of the baseline survey?
Q8.6 Comments on baseline:

Identify and monitor key assumptions in project designs.

Does the logframe or text of the proposal clearly state key assumptions (elements identified in problem analysis as vital to the achievement of the project goal, yet external to what the project has direct control over)? Does the M&E system include plans for monitoring these assumptions?

Q8.7 Are project staff involved in the process of collecting and using information related to indicators of outcomes (effect/impact) and assumptions (and not just outputs/activities)? Yes / somewhat / No

9. INFORMATION PROCESSING AND USE

9.1. What happens to data after it is collected? Does the project have efficient methods for entering, cleaning, managing, analyzing and reporting information?
9.2. Is a software program used to manage the data? If so, what program?

Q9.0 Management Information System (MIS): What software program is used to manage and analyze the data?
Q9.1 Have you found this software program to be satisfactory? Yes / somewhat / No
Q9.2 Does the data management system contribute to accurate reports? Yes / somewhat / No
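The "entering, cleaning, managing" step asked about in question 9.1 can be illustrated with a small sketch. The field names (hh_id, income) and the rejection rules are assumptions for illustration, not a prescribed MIS design:

```python
# Minimal sketch of a cleaning step before analysis: drop records with
# missing household IDs and coerce a numeric indicator field, setting
# aside bad records for follow-up. Field names are illustrative only.
def clean_records(raw_records):
    clean, rejected = [], []
    for rec in raw_records:
        if not rec.get("hh_id"):
            rejected.append(rec)           # unusable without an ID
            continue
        try:
            rec = dict(rec, income=float(rec["income"]))
        except (KeyError, TypeError, ValueError):
            rejected.append(rec)           # indicator value not usable
            continue
        clean.append(rec)
    return clean, rejected

clean, rejected = clean_records([
    {"hh_id": "A1", "income": "1200"},
    {"hh_id": "",   "income": "900"},      # missing ID
    {"hh_id": "A2", "income": "n/a"},      # unparseable value
])
# len(clean) == 1, len(rejected) == 2
```

Keeping the rejected records (rather than silently dropping them) is what lets the data management system contribute to the accurate, timely reports asked about in Q9.2 and Q9.3.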

Q9.3 Does the data management system contribute to timely reports? Yes / somewhat / No
Q9.4 Are the reports generated by the M&E system meeting the needs of key stakeholders?
- Field Supervisors: Yes / Somewhat / No
- Project management: Yes / Somewhat / No
- Sector Coordinator: Yes / Somewhat / No
- CO senior management: Yes / Somewhat / No
- CI Member HQ: Yes / Somewhat / No
- Donor(s): Yes / Somewhat / No
- Partners: Yes / Somewhat / No
- Participants: Yes / Somewhat / No
- Other key stakeholders: Yes / Somewhat / No
Q9.5 How would you rate the overall functional effectiveness of the M&E system?
Q9.6 If you do not think the M&E plans/system are adequate, why not? How do you feel they could be improved?

10. Training in D+M&E

Do project staff generally feel adequately knowledgeable and skilled in D and/or M&E? Consider what forms of training can help project planners and implementers be more effective in various aspects of project design, monitoring and/or evaluation. (These could include internal or external 1-3-day workshops, 1-2-week workshops, on-the-job training, or longer-term academic study.)

Q10.0 Has there been formal training of project staff in Design and/or M&E during the past 2 years? Yes and adequate / Yes but not sufficient / No specific training (If none, skip the next question.)
Q10.1 If yes, on what topics has there been training for project staff, for how many days, for whom, and led by whom? [Add columns if more than one.] Recent topics of training (related to D/M&E) / Duration of workshop(s) (days) / Participants / Leadership of workshop
Q10.2 (Whether or not there has been any training in the past) are there currently plans for future trainings and/or on-the-job training? Yes / No
Q10.3 If there are plans for future training, how many, for whom, by whom? Planned topics of training (related to D/M&E) / Duration of workshops (days)

Participants / Leadership of workshop

Q10.4 How would you rate the overall adequacy of the training available in D+M&E currently provided by CARE?
Q10.5 Comments on D+M&E training: (Note: A subsequent analysis of the responses of individual project staff on the Assessment of Individual Staff Skills / Training Needs in DME will provide further details on training needs. See attached form.)

11. QUALITY OF EVALUATION REPORTS

Q11.0 Have you read evaluation reports on a previous phase of this project, or of other similar projects in your country? Yes / No
Q11.1 If so, how would you rate the quality, credibility and usefulness of project evaluation reports?
Q11.2 Which evaluation reports have you found to be particularly useful? Please list them (so that we can be sure they are included in the Evaluation Library).

12. SUMMARY COMMENTS

Q12.0 What do you see to be the main issues facing D+M&E in your project / the major weaknesses which need strengthening?
Q12.1 Recommendations to the CO DME strategy development: Please note any specific recommendations you may have to contribute to the development of a strategy by the Country Office to promote the strengthening of DME capacity.

13. This DME Capacity Assessment Process

Q13.0 Who was involved in this discussion and in responding to this questionnaire?
Name | Position | Was this person involved in the design or in developing the M&E plan? (If so, designate by D or M&E.)

Q13.1 How much time was needed to complete this DME Capacity Assessment Tool?

Q13.2 Did all (or at least some) of the participants in this assessment read the IEI Checklist in the CARE Impact Guidelines first? All / most / a few / none

Q13.3 Did participants view the PowerPoint overview of the CARE Impact Guidelines (DME Principles) before responding to the questionnaire? Yes / No

Q13.4 How many project staff (CARE or partners) completed the Assessment of Individual Staff Skills / Training Needs in DME form?

ASSESSMENT OF INDIVIDUAL STAFF SKILLS / TRAINING NEEDS IN DME
(Based on a list of some of the key elements of D+M&E)

Separate copies of this form should be filled out by individuals who have responsibilities for one or more aspects of project Design, Monitoring and/or Evaluation. This may include project senior staff, D/M&E specialists, field staff, and staff of partner organisations related to this project.

Consider each of the topics listed below. For each, consider how important the subject is to your job, and then how adequately you know it. (This instrument can be used both to self-assess current skill levels and to indicate specific training needs.)

Column A: IMPORTANCE TO MY JOB: How important is this skill to the work you are responsible for? (Rate: 1 = not at all; 2 = very little; 3 = somewhat; 4 = fairly important; 5 = very important)

Column B: CURRENT LEVEL OF MY SKILL: How well do you know this subject? (Rate: 1 = not at all; 2 = somewhat; 3 = moderately, but not sufficient; 4 = adequate for this job; 5 = skilled professional)

Individual's name (5): | Position: | Date:
Organization I work for (circle one): CARE / Partner organization
Staff category (4) (check one): D/M&E specialist / Senior project staff / Field supervisor / Field staff

For each topic, enter ratings in Column A (importance to my job) and Column B (current level of my skill), with comments, including what specific aspect of the topic you feel you need to learn more about.

1. The big picture, including a broad perspective on D+M&E cycles at CO, program and project levels; principles and standards related to D+M&E in CARE
2. Project design (including diagnosis/assessments of needs and opportunities, problem analysis)
3. LogFrames (including the cause-and-effect hierarchy, criteria for good goals of impact and effect, broad indicators)
4. Evaluation design (including key questions for mid-term and final evaluations; need for baseline, control groups)
5. Components of a detailed project M&E plan (including objectively verifiable indicators, means of measurement, schedules, responsibilities, etc.)
6. Planning for data collection (including knowledge of when and how to plan and use the appropriate mix of methods)
7. Qualitative techniques for data collection (e.g. key informant interviews, focus group discussions, other PRA tools)
8. Quantitative techniques for data collection (e.g. survey plans, observations, questionnaire design and testing, interview techniques, training and coordination of enumerators, data quality control, triangulation, etc.)
9. Sampling (including sample design, sample frame, sample size)
10. Project management information systems (MIS) (e.g. computer software programs like MER, data cleaning, data management)
11. Data analysis (including quantitative and qualitative techniques, GIS [Geographical Information Systems, i.e. mapping], interpretation, tables, graphs)
12. Techniques for presenting information (e.g. narrative report writing, customizing reports for different stakeholders, use of visual aids, oral presentations, etc.)

Personal overall comments/recommendations on what is needed to strengthen my capacity to do better D/M&E:

Now go back and circle the four (4) DME skill areas that you consider to be the top priority training needs for you.

Notes:
(4) The list of categories should be customized for the particular CO's or project's needs.
(5) Since the main purpose of this survey is to determine what training needs to be organized for different categories of staff (rather than to assess individuals' current knowledge and skills), it may be decided that individuals' names not be asked for on this form.
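Footnote (5) notes that the main purpose of this survey is to determine training needs for different categories of staff. One possible way to roll up the Column A (importance) and Column B (skill) ratings into training priorities is an average gap score per topic and staff category. This is purely an illustrative sketch under that assumption; the gap-score method, the function, and all field names are hypothetical and are not prescribed by the toolkit:

```python
# Illustrative sketch only: rolling up Column A (importance) and Column B
# (skill) ratings into mean "gap" scores per staff category and topic.
# The gap score (A minus B) is an assumption for this example, not a
# method prescribed by the toolkit.
from collections import defaultdict

def training_gaps(responses):
    """responses: list of dicts with 'category', 'topic',
    'importance' (Column A, 1-5) and 'skill' (Column B, 1-5).
    Returns the mean gap (A - B) for each (category, topic) pair."""
    sums = defaultdict(lambda: [0, 0])  # (category, topic) -> [gap total, count]
    for r in responses:
        key = (r["category"], r["topic"])
        sums[key][0] += r["importance"] - r["skill"]
        sums[key][1] += 1
    return {key: total / n for key, (total, n) in sums.items()}

# Hypothetical responses from two field staff for one topic
responses = [
    {"category": "Field staff", "topic": "Sampling", "importance": 5, "skill": 2},
    {"category": "Field staff", "topic": "Sampling", "importance": 4, "skill": 3},
]
print(training_gaps(responses))  # {('Field staff', 'Sampling'): 2.0}
```

A higher mean gap would suggest a topic that staff in that category rate as important but feel weak in, which is one plausible way to shortlist workshop topics.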

4. COUNTRY OFFICE DME CAPACITY RATING TOOL

What is the CO's policy on proposal review? Include who should be involved and what criteria they are to follow. (Attach document if available.)

Who is the main point-person for D+M&E in the CO? Name | Position

Who else plays a leadership role in promoting DME at the CO level (e.g. sector coordinators, program managers, etc.)? Names | Positions

Are these people formally organized as a CO DME Task Force? Yes / No

If there is such a CO DME Task Force, how active is it? Give examples of recent initiatives and progress made.

DME Capacity Assessment Toolkit 21 Country Office DME Capacity Rating Tool

Has the CO organized training in D and/or M&E for staff and/or partners during the past two years? Yes / No
If yes, describe: Date | Subject | Facilitator | Participants

Has the CO enabled individual staff to participate in training or other professional development opportunities in other countries? Yes / No
If yes, describe: Date | Subject | Facilitator | Participants

Is the CO promoting a standard computer-based management information system (MIS) for project M&E purposes? Yes / No
If so, what system? If not, what systems are being used by projects?

Are evaluation reports routinely being read by senior management, and lessons learned utilized in future strategy development and project design? Yes / No

Are evaluation reports systematically saved in a designated place? Yes / No

How would you rate the overall quality, credibility and utility of project evaluations?

List project or program final evaluations conducted during the past three years, and note which of them have been abstracted and entered onto the Evaluation Electronic Library (EEL):
PN and title of project evaluation report | On EEL?

Further instructions to the CO-level leadership team: In the right-hand column of the following table, please fill in your ratings for the subjects listed. For the questions related to these subjects, see the Project DME Capacity Self-Assessment Tool. (The roll-up of the responses given by individual projects will appear when using the linked tables in the Excel version of this form.)

5. SUMMARY RATING OF CO'S CURRENT DME CAPACITY

(Enter the CO staff self-rating for each item in the right-hand column.)

Ref. | Diagnosis & Design
2.6 Are there clear proposal review guidelines?
3 Holistic needs assessment: Full HLS Assessment / Other form of holistic assessment
4 Do projects fit within Program goals?
4.1 Are Final Goals at impact level?
4.2 Are Intermediate Goals at effect level?
5 Do projects have logframes?
5.1 Do logframes show clear logical linkage?
* 6 Rate quality of project design

Ref. | M&E Plans
7.1 Do projects have detailed M&E plans?
7.2 Do they have satisfactory indicators of: outputs / effects / impact?
7.3 Do projects have central registries for measuring beneficiaries?
7.4 Do projects disaggregate data by gender?
* 7.6 Rate overall quality of M&E plans

Ref. | Implementing M&E
8 Do projects have baseline studies?
* 8.4 Rate quality of baselines

Ref. | Information Management Systems
9 Do projects have satisfactory M&E software?
9.1 Do they produce timely & accurate reports?
9.2 Do project reports meet the needs of: Project management / CO senior management / Donors / Partners / Participants?
* 9.4 Overall quality of M&E system

Ref. | Training in DME
10 Have project staff received training in D/M&E during the past 2 years?
* 10.4 Adequacy of DME training

Ref. | Evaluations
11 How would you rate the quality, credibility and usefulness of project evaluation reports?

CO DME COMPOSITE INDEX (6) (Source: CO leadership)

(6) The CO DME Composite Index averages the six starred (*) rating scores.
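Per footnote (6), the composite index is simply the average of the six starred rating scores (refs 6, 7.6, 8.4, 9.4, 10.4 and 11). As a minimal sketch, assuming the ratings are stored as numbers keyed by their reference numbers (the Python representation and key names are illustrative, not part of the toolkit):

```python
# Illustrative sketch: computing the CO DME Composite Index as the average
# of the six starred rating scores (refs 6, 7.6, 8.4, 9.4, 10.4, 11).
# Dictionary keys and the numeric scale are assumptions for this example.

STARRED_REFS = ["6", "7.6", "8.4", "9.4", "10.4", "11"]

def composite_index(ratings: dict) -> float:
    """Average the six starred CO self-rating scores."""
    missing = [ref for ref in STARRED_REFS if ref not in ratings]
    if missing:
        raise ValueError(f"Missing ratings for refs: {missing}")
    return sum(ratings[ref] for ref in STARRED_REFS) / len(STARRED_REFS)

# Hypothetical CO self-ratings
scores = {"6": 3, "7.6": 4, "8.4": 2, "9.4": 3, "10.4": 3, "11": 4}
print(round(composite_index(scores), 2))  # prints 3.17
```

Guarding against missing references keeps an incomplete self-assessment from silently producing a composite score based on fewer than six ratings.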


More information

Module 17: EMS Audits

Module 17: EMS Audits Module 17: EMS Audits Guidance...17-2 Figure 17-1: Linkages Among EMS Audits, Corrective Action and Management Reviews...17-5 Tools and Forms...17-7 Tool 17-1: EMS Auditing Worksheet...17-7 Tool 17-2:

More information

reporting forma plan framework indicators outputs roject management Monitoring and Evaluation Framework 2009 / UNISDR Asia-Pacific Secretariat

reporting forma plan framework indicators outputs roject management Monitoring and Evaluation Framework 2009 / UNISDR Asia-Pacific Secretariat plan reporting forma framework indicators rformancem&e Fram outputs roject management Monitoring and Evaluation Framework 2009 / UNISDR Asia-Pacific Secretariat Introduction Monitoring and evaluation enhance

More information

An Overview of Key Planning, Monitoring and Evaluation Concepts A Participatory Learning Approach

An Overview of Key Planning, Monitoring and Evaluation Concepts A Participatory Learning Approach An Overview of Key Planning, Monitoring and Evaluation Concepts A Participatory Learning Approach - Training Materials - Version - March 2001 Developed by Jim Woodhill IUCN Facilitator for East and Southern

More information

PROPOSAL ACRONYM - ETN / EID / EJD (delete as appropriate and include as header on each page) START PAGE MARIE SKŁODOWSKA-CURIE ACTIONS

PROPOSAL ACRONYM - ETN / EID / EJD (delete as appropriate and include as header on each page) START PAGE MARIE SKŁODOWSKA-CURIE ACTIONS START PAGE MARIE SKŁODOWSKA-CURIE ACTIONS Innovative Training Networks (ITN) Call: H2020-MSCA-ITN-2015 PART B PROPOSAL ACRONYM This proposal is to be evaluated as: [ETN] [EID] [EJD] [delete as appropriate]

More information

Monitoring and Evaluation

Monitoring and Evaluation OVERVIEW Brief description This toolkit deals with the nuts and bolts (the basics) of setting up and using a monitoring and evaluation system for a project or an organisation. It clarifies what monitoring

More information

VOLUME 12: FEED THE FUTURE EVALUATION DESIGN TEMPLATE NOVEMBER 2015

VOLUME 12: FEED THE FUTURE EVALUATION DESIGN TEMPLATE NOVEMBER 2015 VOLUME 12: FEED THE FUTURE EVALUATION DESIGN TEMPLATE NOVEMBER 2015 The attached Feed the Future Evaluation Design Template provides guidance to third party evaluators and USAID missions on recommended

More information

Monitoring and evaluation of walking and cycling (draft)

Monitoring and evaluation of walking and cycling (draft) Sustrans Design Manual Chapter 16 Monitoring and evaluation of walking and cycling (draft) November 2014 September 2014 1 About Sustrans Sustrans makes smarter travel choices possible, desirable and inevitable.

More information

Best Practices Statement Project Management. Best Practices for Managing State Information Technology Projects

Best Practices Statement Project Management. Best Practices for Managing State Information Technology Projects State of Arkansas Office of Information Technology 124 W. Capitol Ave. Suite 990 Little Rock, AR 72201 501.682.4300 Voice 501.682.4020 Fax http://www.cio.arkansas.gov/techarch Best Practices Statement

More information

Northwestern Michigan College Supervisor: Employee: Department: Office of Research, Planning and Effectiveness

Northwestern Michigan College Supervisor: Employee: Department: Office of Research, Planning and Effectiveness General Information Company: Northwestern Michigan College Supervisor: Employee: Department: Office of Research, Planning and Effectiveness Employee No: Detail Job Title: Coordinator of Data Reporting

More information

EUROPEAN COMMISSION JOINT RELEX SERVICE FOR THE MANAGEMENT OF COMMUNITY AID TO NON-MEMBER COUNTRIES (SCR)

EUROPEAN COMMISSION JOINT RELEX SERVICE FOR THE MANAGEMENT OF COMMUNITY AID TO NON-MEMBER COUNTRIES (SCR) EUROPEAN COMMISSION JOINT RELEX SERVICE FOR THE MANAGEMENT OF COMMUNITY AID TO NON-MEMBER COUNTRIES (SCR) Resources, relations with the other institutions, evaluation, and information Evaluation Project

More information

Formative Evaluation of the Midwifery Education Programme. Terms of Reference

Formative Evaluation of the Midwifery Education Programme. Terms of Reference Formative Evaluation of the Midwifery Education Programme Terms of Reference 1.0 BACKGROUND Investment in midwifery is crucial for national development and is of international interest. It has strong links

More information

Department of Social Work. MSW MCMP Student Learning Agenda and Assessment

Department of Social Work. MSW MCMP Student Learning Agenda and Assessment Department of Social Work MSW MCMP Student Learning Agenda and Assessment Agency Field Instructor Name Licensure Task Supervisor (if applicable) Student Field Faculty Field Placement Duration Typical Weekly

More information

ASPH Education Committee Master s Degree in Public Health Core Competency Development Project

ASPH Education Committee Master s Degree in Public Health Core Competency Development Project ASPH Education Committee Master s Degree in Public Health Core Competency Development Project Version 2.3 Word Format Domains and Competencies Only May 2007 Discipline-specific Competencies A. BIOSTATISTICS

More information

Criminal Justice Graduate Program (M.S.) Assessment Yearly Report. Submitted: December 1, 2011 Reporting Year: 2010-2011

Criminal Justice Graduate Program (M.S.) Assessment Yearly Report. Submitted: December 1, 2011 Reporting Year: 2010-2011 Criminal Justice Graduate Program (M.S.) Assessment Yearly Report Submitted: December 1, 2011 Reporting Year: 2010-2011 Part I: Criminal Justice Graduate Program Mission The Department of Criminal Justice

More information

Health Check Results - Retrak

Health Check Results - Retrak Health Check Results - Report, November 2014 1. Introduction Bond s Health Check tool is intended to provide organisations working in international development with insights into their strengths and weakness

More information

Using Evaluation to Support a Results- Based Management System

Using Evaluation to Support a Results- Based Management System Using Evaluation to Support a Results- Based Management System The E in M & E Evaluation Evaluation is the systematic and objective assessment of an ongoing or completed project, program or policy, including

More information

Practice guide. quality assurance and IMProVeMeNt PrograM

Practice guide. quality assurance and IMProVeMeNt PrograM Practice guide quality assurance and IMProVeMeNt PrograM MarCh 2012 Table of Contents Executive Summary... 1 Introduction... 2 What is Quality?... 2 Quality in Internal Audit... 2 Conformance or Compliance?...

More information

MSc Finance & Business Analytics Programme Design. Academic Year 2014-15

MSc Finance & Business Analytics Programme Design. Academic Year 2014-15 MSc Finance & Business Analytics Programme Design Academic Year 2014-15 MSc Finance & Business Analytics The MSc Financial Management programme is divided into three distinct sections: The first semester

More information

TEAM PRODUCTIVITY DEVELOPMENT PROPOSAL

TEAM PRODUCTIVITY DEVELOPMENT PROPOSAL DRAFT TEAM PRODUCTIVITY DEVELOPMENT PROPOSAL An initial draft proposal to determine the scale, scope and requirements of a team productivity development improvement program for a potential client Team

More information

Impact of youth business loan scheme on enterprise development: A case study from Pakistan

Impact of youth business loan scheme on enterprise development: A case study from Pakistan Impact of youth business loan scheme on enterprise development: A case study from Pakistan Asif Chowdhury Ross Mcintosh Akhtar Nadeem Javed Younas Group-6 Background Youth in Pakistan are the largest population

More information

Social Return on Investment

Social Return on Investment Social Return on Investment for Microenterprise Development A Presentation at the AEO Conference, Portland, May 2005 Elaine Edgcomb The Aspen Institute/FIELD Julie Abrams Women s Initiative for Self Employment

More information

MONITORING AND EVALUATION TOOLKIT. HIV, Tuberculosis, Malaria and Health and Community Systems Strengthening

MONITORING AND EVALUATION TOOLKIT. HIV, Tuberculosis, Malaria and Health and Community Systems Strengthening MONITORING AND EVALUATION TOOLKIT HIV, Tuberculosis, Malaria and Health and Community Systems Strengthening Part 1: The Global Fund M&E requirements Fourth Edition November 2011 Disclaimers The geographical

More information

Corruption Risk Assessment Topic Guide

Corruption Risk Assessment Topic Guide Corruption Risk Assessment Topic Guide Contents What is corruption risk assessment? Purpose and context of the assessments Assessment approaches Data sources Key issues and challenges Examples of promising

More information

pm4dev, 2007 management for development series Introduction to Project Management PROJECT MANAGEMENT FOR DEVELOPMENT ORGANIZATIONS

pm4dev, 2007 management for development series Introduction to Project Management PROJECT MANAGEMENT FOR DEVELOPMENT ORGANIZATIONS pm4dev, 2007 management for development series Introduction to Project Management PROJECT MANAGEMENT FOR DEVELOPMENT ORGANIZATIONS PROJECT MANAGEMENT FOR DEVELOPMENT ORGANIZATIONS A methodology to manage

More information

Teacher Leader Masters in Education Masters Action Research Project Rubric: 300 Points 2015-2016

Teacher Leader Masters in Education Masters Action Research Project Rubric: 300 Points 2015-2016 Teacher Leader Masters in Education Masters Action Research Project Rubric: 300 Points 2015-2016 Name Date of MARP: SCHOOL/LOCATION of MARP Title of Masters Action Research Project: ORIGINALITY SCORE (

More information

Social and behavior change communication (SBCC) Quality Assessment Tool

Social and behavior change communication (SBCC) Quality Assessment Tool Social and behavior change communication (SB) Quality Tool Organizational Profile ate of : / / Y MONTH YER. Name of Organization:. ontact etails:. Technical Focus of the : ll reas Only one technical area:.

More information

Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire. P3M3 Project Management Self-Assessment

Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire. P3M3 Project Management Self-Assessment Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire P3M3 Project Management Self-Assessment Contents Introduction 3 User Guidance 4 P3M3 Self-Assessment Questionnaire

More information

Criminal Justice Internship Handbook CRJU 3398

Criminal Justice Internship Handbook CRJU 3398 Criminal Justice Internship Handbook CRJU 3398 Department of Sociology and Criminal Justice Contact: Prof. Peter Fenton Office Phone 470 578-2292 pfenton@kennesaw.edu PREPARING FOR YOUR INTERNSHIP Read

More information

Step 6: Report Your PHDS-PLUS Findings to Stimulate System Change

Step 6: Report Your PHDS-PLUS Findings to Stimulate System Change Step 6: Report Your PHDS-PLUS Findings to Stimulate System Change Sharing information from the PHDS-PLUS can help you launch or strengthen partnerships and efforts to improve services, policies, and programs

More information