UNICEF Global Evaluation Report Oversight System (GEROS) Review Template

Colour Coding (CC): Dark green / Green / Amber / Red / White
Questions: Outstanding / Yes / Not Applicable
Section & Overall Rating: Very Confident to Act / Confident to Act / Confident to Act / Not Confident to Act

The key questions are highlighted as shown here, and are important questions in guiding the analysis of the section. The Cornerstone questions are in column J and are questions that need to be answered for rating and justification of each of the six sections.

UNEG Standards for Evaluation in the UN System / UNEG Norms for Evaluation in the UN System / UNICEF Adapted UNEG Evaluation Report Standards / Response

Title of the Evaluation Report: Impact Assessment: HIV Education for Adolescent Response, Motivation and Empowerment (HEAR-ME) Project
Report sequence number: 2011/001
Date of Review: 21 May 2012
Year of the Evaluation Report:
Region: Eastern and Southern Africa Regional Office
Country(ies): Lesotho
Type of Report: Evaluation
TORs: Present
Name of reviewer: IOD PARC

Classification of Evaluation Report

Geographical (Coverage of the programme being evaluated & generalizability of evaluation findings): 1.1 Sub-national: The programme and evaluation cover selected sub-national units (districts, provinces, states, etc.) within a country, where results cannot be generalized to the whole country.

Management (Managerial control and oversight of evaluation decisions): 2.5 Not clear from Report.

Purpose (Speaks to the overarching goal for conducting the evaluation; its raison d'être): 3.2 At scale: The evaluation examines the efficacy of a programme that is being implemented at or near its maximum intended extent, with the intention of providing feedback on efficiency and the overall effectiveness of the programme at scale.

Result (Level of changes sought, as defined in RBM; refer to substantial use of highest level reached): 4.1 Output: Causal effects deriving directly from programme activities, and assumed to be completely under programme control.

MTSP Correspondence (Alignment with MTSP focus area priorities: (1) Young child survival and development; (2) Basic education and gender equality; (3) HIV/AIDS and children; (4) Child protection from violence, exploitation and abuse; and (5) Policy advocacy and partnerships for children's rights): 5.1 Sectoral: Addresses issues within only one of the five MTSP focus areas (3. HIV/AIDS & children); 5.3 Cross-cutting: Addresses issues that are named as cross-cutting strategies of the MTSP or otherwise known to operate within all MTSP areas. Includes but is not limited to the human rights-based approach to programming, gender equity, knowledge management, evaluation, & communication for development. Comments: The project has a strong focus on gender and the empowerment of women.

Level of Independence (Implementation and control of the evaluation activities): 6.4 Not clear from Report.

Timing / Stage: 7.2 Summative: An evaluation that examines the effects or outcomes of the object being evaluated and summarizes them by describing what happened subsequent to delivery of the programme. Comments: While the purpose of the evaluation is largely summative, it makes recommendations on future HIV and Gender programming for adolescents and young people in Lesotho.
SECTION A: OBJECT OF THE EVALUATION

Question cc Remarks

Object and context

1 Is the object of the evaluation well described? This needs to include a clear description of the interventions (project, programme, policies, or otherwise) to be evaluated, including how the designer thought that it would address the problem identified, implementing modalities, and other parameters including costs, relative importance in the organization and (number of) people reached.

2 Is the context explained and related to the object that is to be evaluated? The context includes factors that have a direct bearing on the object of the evaluation: social, political, economic, demographic, institutional. These factors may include strategies, policies, goals, frameworks & priorities at the international level, national Government level, and individual agency level.

3 Does this illuminate findings? The context should ideally be linked to the findings so that it is clear how the wider situation may have influenced the outcomes observed.

Remarks: The project is described mainly in terms of its intended results and activities. It is not clear what the intervention logic is, i.e. how the designers thought the project would address the problem identified. Explanation of the context deals mainly with a selection of epidemiological, social and demographic issues relevant to AIDS. While this is useful to get a general sense of the implementation context, it does not address issues inherent to the intervention logic of the project, e.g. explaining the relationship between Gender-Based Violence and AIDS and explaining the policy context and policy implementation, which is needed to make sense of the relationship between outputs and outcome. There is insufficient information about the dynamics of gender and Gender-Based Violence, specifically as it relates to HIV & AIDS, in the Lesotho context. More information on these issues is required to illuminate findings around their influence on the project's results.
A/ Does the report present a clear & full description of the 'object' of the evaluation? The report should describe the object of the evaluation including the results chain, meaning the theory of change that underlies the programme being evaluated. This theory of change includes what the programme was meant to achieve and the pathway (chain of results) through which it was expected to achieve this. The context of key social, political, economic, demographic, and institutional factors that have a direct bearing on the object should be described: for example, the partner government's strategies and priorities; international, regional or country development goals, strategies and frameworks; and the concerned agency's corporate goals & priorities, as appropriate.

Theory of Change

4 Is the results chain or logic well articulated? The report should identify how the designers of the evaluated object thought that it would address the problem that they had identified. This can include a results chain or other logic models such as theory of change. It can include inputs, outputs and outcomes; it may also include impacts. The models need to be clearly described and explained.

Remarks: The project goal, strategic objectives and outcome areas are identified. There is no explanation of the underlying logic in terms of how inputs and activities in the identified outcome areas will achieve the strategic objective, and how the strategic objective will contribute to the goal. Based on contextual information presented in the Executive Summary, there appears to be existing knowledge about drivers of the HIV epidemic in Lesotho that is not clearly reflected in the intervention logic of the project.

Rating: Confident to Act

Remarks: The project's results, strategic objectives and goal are cited in the report. Project interventions are clearly described. However, key information pertaining to the implementation context is not dealt with in a manner that explains and enhances understanding of the project design and intervention logic. Information about the implementation context therefore does not add interpretative value to the findings.

Constructive feedback for future reports (including how to address weaknesses and maintaining good practice): The object of an evaluation should be clearly and systematically described in terms of its intervention logic, including inputs, activities, outputs, outcomes and impact. The role of stakeholders in the project should be clearly identified. The implementation context should be described in a manner that adds interpretative value to the findings.

Stakeholders and their contributions

5 Are key stakeholders clearly identified? These include: implementing agency(ies); development partners; rights holders; primary duty bearers; secondary duty bearers.

6 Are key stakeholders' contributions described? This can involve financial or other contributions and should be specific. If a joint programme, also specify the UNICEF contribution; if basket funding, the question is not applicable.

7 Are UNICEF contributions described? This can involve financial or other contributions and should be specific.

Implementation Status

8 Is the implementation status described? This includes the phase of implementation and significant changes that have happened to plans, strategies, performance frameworks, etc., including the implications of these changes.

Remarks: There is no rights-based analysis of key stakeholders. The two main implementing partners are identified, but their roles and contributions are not described. Although schools, teachers, churches, etc. were closely involved in the implementation of the project, their roles and responsibilities are not explained. It is not clear at all what UNICEF's involvement or role in the project or the evaluation entailed. It is clear that the evaluation is being conducted at the end of a two-year project.
There is no discussion of any changes that have occurred within the project or in the implementation context that may have affected the project's implementation, and which therefore needed to be taken into account in the final evaluation.
Executive Feedback on Section A
Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences.
Not all activities, elements and aspects of the project are described in a clear and coherent manner, and the implementation context is not described in a manner that adds insight to the project or meaning to the findings. Information about the gender dynamics of the HIV & AIDS epidemic in Lesotho would be crucial to deal with in an evaluation of the project, but relevant information is not presented in a manner that explains the intervention logic of the project and is therefore of limited value in illuminating the findings.
SECTION B: EVALUATION PURPOSE, OBJECTIVES AND SCOPE

Purpose, objectives and scope

9 Is the purpose of the evaluation clear? This includes why the evaluation is needed at this time, who needs the information, what information is needed, and how the information will be used.

10 Are the objectives and scope of the evaluation clear and realistic? Objectives should be clear and explain what the evaluation is seeking to achieve; scope should clearly describe and justify what the evaluation will and will not cover; evaluation questions may optionally be included to add additional detail.

11 Do the objectives and scope relate to the purpose? The reasons for holding the evaluation at this time in the project cycle (purpose) should link logically with the specific objectives the evaluation seeks to achieve and the boundaries chosen for the evaluation (scope).

Remarks: The purpose of the evaluation is not clearly identified; it is assumed to be "to review project progress and impact against strategic objectives and intended results". This fails to indicate who needs the information, what information is needed and how it will be used. The scope of the evaluation is not clear. No evaluation objectives are specified. The evaluation is guided by a number of questions. These questions do not incorporate important aspects of the project design and results, e.g. there is no reference to issues of gender and the empowerment of women/girls in the evaluation questions, even though the project appears to have a specific focus on this. The evaluation questions are appropriate, but there are no questions that relate to the project's impact in terms of its goal. Key concepts in the evaluation objectives and questions are not defined, so it is not clear what the appropriate indicators would be, e.g. what is meant by "an enabling and supportive environment" and what would be the indicators of such an environment in the context of the project and its evaluation?
It would appear that three evaluation questions, which relate to the project results, together constitute the foundation on the basis of which the fourth evaluation question, which relates to the project's strategic objective, is to be addressed. However, this ranking/grading of objectives in terms of the project results is not evident in the evaluation questions.

B/ Are the evaluation's purpose, objectives and scope sufficiently clear to guide the evaluation? The purpose of the evaluation should be clearly defined, including why the evaluation was needed at that point in time, who needed the information, what information is needed, and how the information will be used. The report should provide a clear explanation of the evaluation objectives and scope, including main evaluation questions, and describe and justify what the evaluation did and did not cover. The report should describe and provide an explanation of the chosen evaluation criteria, performance standards, or other criteria used by the evaluators.

Evaluation framework

12 Does the evaluation provide a relevant list of evaluation criteria that are explicitly justified as appropriate for the purpose? It is imperative to make the basis of the value judgements used in the evaluation transparent if it is to be understood and convincing. UNEG evaluation standards refer to the OECD/DAC criteria, but other criteria can be used, such as human rights and humanitarian criteria and standards (e.g. SPHERE Standards), though this needs justification. Not all OECD/DAC criteria are relevant to all evaluation objectives and scopes. The TOR may set the criteria to be used, but these should be (re)confirmed by the evaluator.
Standard OECD DAC criteria include: Relevance; Effectiveness; Efficiency; Sustainability; Impact. Additional humanitarian criteria include: Coverage; Coordination; Coherence; Protection. (This is an extremely important question to UNICEF)

Remarks: No evaluation criteria are specified. An evaluation of the project's effectiveness is implicit in the first evaluation question, but the criterion is not identified as such. There is no explanation why the DAC criteria were not applied to this evaluation.

Rating: Not Confident to Act

Remarks: The evaluation is not guided by a coherent framework that clearly sets out its purpose, criteria, objectives and questions. One of the key shortcomings of the evaluation is the absence of clear evaluation criteria, with associated indicators, baselines and targets.

Constructive feedback for future reports (including how to address weaknesses and maintaining good practice): The purpose of an evaluation should clearly state why an evaluation is being conducted, who needs the information, what information is needed and how the information will be used. An evaluation should be guided and structured according to a coherent framework that requires a clear indication of its purpose, criteria, objectives and questions. The scope of the evaluation should identify which elements and aspects of the object will be included in and excluded from the evaluation.

13 Does the evaluation explain why the evaluation criteria were chosen and/or any standard DAC evaluation criteria (above) rejected? The rationale for using each particular criterion and rejecting any standard OECD-DAC criteria (where they would be applicable) should be explained in the report.
Executive Feedback on Section B
Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences.
The evaluation is not guided by a coherent framework that clearly sets out its purpose, criteria, objectives and questions. One of the key shortcomings of the evaluation is the absence of clear evaluation criteria, with associated indicators, baselines and targets, which undermines the overall logic and coherence of the evaluation.
SECTION C: EVALUATION METHODOLOGY, GENDER, HUMAN RIGHTS AND EQUITY

Data collection

14 Does the report specify data collection methods, analysis methods, sampling methods and benchmarks? This should include the rationale for selecting methods and their limitations, based on commonly accepted best practice.

15 Does the report specify data sources, the rationale for their selection, and their limitations? This should include a discussion of how the mix of data sources was used to obtain a diversity of perspectives, ensure accuracy & overcome data limits.

Ethics

16 Are ethical issues and considerations described? The design of the evaluation should contemplate: how ethical the initial design of the programme was; the balance of costs and benefits to participants (including possible negative impact) in the programme and in the evaluation; the ethics of who is included and excluded in the evaluation and how this is done.

17 Does the report refer to ethical safeguards appropriate for the issues described? When the topic of an evaluation is contentious, there is a heightened need to protect those participating. These should be guided by the UNICEF Evaluation Office Technical Note and include: protection of confidentiality; protection of rights; protection of dignity and welfare of people (especially children); informed consent; feedback to participants; mechanisms for shaping the behaviour of evaluators and data collectors.

Remarks: The report specifies data collection and analysis methods. Sampling methods are not specified. Benchmarks specific to project interventions are identified for some of the findings (e.g. UNICEF's Minimum Package guide for standardizing HIV prevention intervention programs for adolescents). Data sources, the rationale for their selection and their limitations are not specified. The initial design of the programme in terms of ethical considerations is not assessed.
There is no description of ethical issues or safeguards relevant to the evaluation.

C/ Is the methodology appropriate and sound? The report should present a transparent description of the methodology applied to the evaluation that clearly explains how the evaluation was specifically designed to address the evaluation criteria, yield answers to the evaluation questions and achieve the evaluation purposes. The report should also present a sufficiently detailed description of the methodology in which methodological choices are made explicit and justified, and in which limitations of the methodology applied are included. The report should give the elements to assess the appropriateness of the methodology. Methods as such are not good or bad; they are only so in relation to what one tries to get to know as part of an evaluation. Thus this standard assesses the suitability of the methods selected for the specifics of the evaluation concerned, assessing whether the methodology is suitable to the subject matter and whether the information collected is sufficient to meet the evaluation objectives.

Results Based Management

18 Is the capability and robustness of the evaluated object's monitoring system adequately assessed? The evaluation should consider the details and overall functioning of the management system in relation to results: from the M&E system design, through individual tools, to the use of data in management decision making.

19 Does the evaluation make appropriate use of the M&E framework of the evaluated object? In addition to articulating the logic model (results chain) used by the programme, the evaluation should make use of the object's logframe or other results framework to guide the assessment. The results framework indicates how the programme design team expected to assess effectiveness, and it forms the guiding structure for the management of implementation.
cc: Yes

Human Rights, Gender and Equity

20 Did the evaluation design and style consider incorporation of the UN and UNICEF's commitment to a human rights-based approach to programming, to gender equality, and to equity? This could be done in a variety of ways, including: use of a rights-based framework; use of CRC, CCC, CEDAW and other rights-related benchmarks; analysis of rights holders and duty bearers; and focus on aspects of equity, social exclusion and gender. Style includes: using human-rights language; gender-sensitive and child-sensitive writing; disaggregating data by gender, age and disability groups; disaggregating data by socially excluded groups.

Remarks: The project monitoring system and its shortcomings are described under "Challenges and Lessons Learnt". Although data from the project M&E systems is cited in the evaluation, it does not add value to the findings. This should arguably have been identified as a limitation of the evaluation. The project has a strong focus on rights awareness and gender, including Gender-Based Violence. Because these issues are embedded in the project outcomes, the evaluation findings reflect on the contribution of the project in terms of raising awareness about Gender-Based Violence and child rights. However, it does not do so within a rights-based analytical framework. It does not explain how issues of gender and equity were incorporated in the evaluation design and methodology, and gender-disaggregated data is not used to inform the findings.

Rating: Confident to Act

Remarks: Not all aspects of the evaluation methodology are dealt with. Important methodological choices, e.g. sampling and data sources, and their limitations are not explained. While baseline data for the project is available, it is not clear why the possibility of using this as a historical counterfactual is not followed through. Shortcomings around the baseline appear to be a major limitation of the evaluation, but are not identified as such.
Because no other counterfactual is used, e.g. schools or areas not targeted by the project, there is no conclusive evidence of the project's contribution to change envisaged in its results. Instead, judgements about the project's performance are largely anecdotal, based mainly on stakeholder opinions and perceptions.

Constructive feedback for future reports (including how to address weaknesses and maintaining good practice): Impact evaluations should make use of methodologies and designs that would generate the strongest possible evidence for addressing the evaluation objectives/questions. Where possible, counterfactuals should be used to provide a basis for comparison that would provide more conclusive evidence of the difference a project or programme has made, rather than simply relying on anecdotal information based on experiences and opinions of stakeholders that have been directly involved in the project, either as implementers or beneficiaries.
21 Does the evaluation assess the extent to which the implementation of the evaluated object was monitored through human rights (inc. gender, equity & child rights) frameworks? UNICEF commits to go beyond monitoring the achievement of desirable outcomes, and to ensure that these are achieved through morally acceptable processes. The evaluation should consider whether the programme was managed and adjusted according to human rights and gender monitoring of processes.

22 Do the methodology, analytical framework, findings, conclusions, recommendations & lessons provide appropriate information on HUMAN RIGHTS (inc. women's & child rights)? The inclusion of human rights frameworks in the evaluation methodology should continue to cascade down the evaluation report and be obvious in the data analysis, findings, conclusions, any recommendations and any lessons learned. If identified in the scope, the methodology should be capable of assessing the level of: identification of the human rights claims of rights-holders and the corresponding human rights obligations of duty-bearers, as well as the immediate, underlying & structural causes of the non-realisation of rights; and capacity development of rights-holders to claim rights, and duty-bearers to fulfil obligations.

Remarks: The evaluation does not present gender-disaggregated data as evidence of the project's impact and does not incorporate key contextual issues related to gender and social exclusion relevant to the project implementation environment in Lesotho. The evaluation does not reflect on the extent to which the project was managed and adjusted according to human rights and gender monitoring processes. Inputs/activities around training on HIV, gender and rights are described and the outcomes of these activities are dealt with in the findings. However, these findings are not followed through in conclusions and recommendations about strengthening gender equality and women's empowerment.
While dealing with some issues of stigma and discrimination related to HIV & AIDS, as well as the need for equal support to all project sites due to the topography of Lesotho, equity is not a recurrent analytical theme throughout the evaluation.

23 Do the methodology, analytical framework, findings, conclusions, recommendations & lessons provide appropriate information on GENDER EQUALITY AND WOMEN'S EMPOWERMENT? The inclusion of gender equality frameworks in the evaluation methodology should continue to cascade down the evaluation report and be obvious in the data analysis, findings, conclusions, any recommendations and any lessons learned. If identified in the scope, the methodology should be capable of assessing the immediate, underlying & structural causes of social exclusion, and capacity development of women to claim rights, and duty-bearers to fulfil their equality obligations.

24 Do the methodology, analytical framework, findings, conclusions, recommendations & lessons provide appropriate information on EQUITY? The inclusion of equity considerations in the evaluation methodology should continue to cascade down the evaluation report and be obvious in the data analysis, findings, conclusions, any recommendations and any lessons learned. If identified in the scope, the methodology should be capable of assessing the capacity development of rights-holders to claim rights, and duty-bearers to fulfil obligations, & aspects of equity.

Stakeholder participation

25 Are the levels and activities of stakeholder consultation described? This goes beyond just using stakeholders as sources of information and includes the degree of participation in the evaluation itself. The report should include the rationale for selecting this level of participation.
Roles for participation might include:
o Liaison
o Technical advisory
o Observer
o Active decision making
The reviewer should look for the soundness of the description and rationale for the degree of participation rather than the level of participation itself.

Remarks: It appears that stakeholders were mainly used as sources of information. There is no indication that the evaluation considered the rationale of using stakeholders in roles other than as sources of information. A higher degree of stakeholder participation in the verification of findings, as well as in the formulation of recommendations, would have been desirable.

26 Are the levels of participation appropriate for the task in hand? The breadth & degree of stakeholder participation feasible in evaluation activities will depend partly on the kind of participation achieved in the evaluated object. The reviewer should note here whether a higher degree of participation may have been feasible & preferable.
Methodological robustness

27 Is there an attempt to construct a counterfactual? The counterfactual can be constructed in several ways, which can be more or less rigorous. It can be done by contacting eligible beneficiaries that were not reached by the programme, or a theoretical counterfactual based on historical trends, or it can also be a comparison group.

28 Can the methodology answer the evaluation questions in the context of the evaluation? The methodology should link back to the purpose and be capable of providing answers to the evaluation questions.

29 Are methodological limitations acceptable for the task in hand? Limitations must be specifically recognised and appropriate efforts taken to control bias. This includes the use of triangulation, and the use of robust data collection tools (interview protocols, observation tools, etc.). Bias limitations can be addressed in three main areas: bias inherent in the sources of data; bias introduced through the methods of data collection; bias that colours the interpretation of findings.

Remarks: It is not clear why baseline data is discussed in the report, especially since the evaluation methodology does not attempt to replicate or follow up on the baseline. There is no comparison of evaluation data with the baseline. Shortcomings around the baseline appear to be a major limitation of the evaluation, but are not identified as such. Conclusive judgements about the project's contribution to changes as identified in the evaluation questions would have required comparisons with a baseline for the project target areas or with a counterfactual, e.g. comparable schools in geographical areas not targeted by the project. Therefore, although the methodology generates useful qualitative information, based largely on stakeholder opinions and perceptions, on changes that are claimed to be brought about by the project, it does not present conclusive evidence of this.
Executive Feedback on Section C
Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences.
The report contains insufficient information about all aspects of the methodology and methodological choices, and it is therefore not clear how the methodology was designed to overcome limitations of the evaluation. As this is an impact evaluation, the use of a counterfactual would have added rigour to the evaluation and more credibility to its findings and recommendations.
SECTION D: FINDINGS AND CONCLUSIONS

Completeness and logic of findings

30 Are findings clearly presented and based on the objective use of the reported evidence? Findings regarding the inputs for the completion of activities or process achievements should be distinguished clearly from results. Findings on results should clearly distinguish outputs, outcomes and impacts (where appropriate). Findings must demonstrate full marshalling and objective use of the evidence generated by the evaluation data collection. Findings should also tell the 'whole story' of the evidence and avoid bias.

31 Do the findings address all of the evaluation's stated criteria and questions? The findings should seek to systematically address all of the evaluation questions according to the evaluation framework articulated in the report.

32 Do findings demonstrate the progression to results based on the evidence reported? There should be a logical chain developed by the findings, which shows the progression (or lack of it) from implementation to results.

33 Are gaps and limitations discussed? The data may be inadequate to answer all the evaluation questions as satisfactorily as intended; in this case the limitations should be clearly presented and discussed. Caveats should be included to guide the reader on how to interpret the findings. Any gaps in the programme or unintended effects should also be addressed.

34 Are unexpected findings discussed? If the data reveals (or suggests) unusual or unexpected issues, these should be highlighted and discussed in terms of their implications.

Cost Analysis

35 Is a cost analysis presented that is well grounded in the findings reported? Cost analysis is not always feasible or appropriate. If this is the case then the reasons should be explained.
Otherwise the evaluation should use an appropriate scope and methodology of cost analysis to answer the following questions:
o How programme costs compare to other similar programmes or standards
o The most efficient way to get expected results
o Cost implications of scaling up or down
o Cost implications for replicating in a different context
o Whether the programme is worth doing from a cost perspective
o Costs and the sustainability of the programme

cc: Yes; Yes

Remarks: The findings combine a description of project activities with stakeholders' experience and perceptions of how these activities contributed to results in each of the project outcome areas. There is therefore an attempt to demonstrate progression from implementation to results. Evidence of project achievement in the four outcome areas is mainly presented in the form of qualitative information. Without clear indicators of success, including baselines and targets for each of the outcome areas, and without comparison to a counterfactual, the findings on the project's contribution to outcomes (performance) are largely anecdotal. This is a key limitation of the evaluation that is not discussed. With the exception of a summative reflection on the extent to which the project has created an enabling and supportive environment for adolescents in project areas to protect themselves from HIV infection, the evaluation questions are largely addressed in the context of project outcome areas. Unexpected findings are not discussed. There are no evaluation questions related to project cost and cost-effectiveness. The methodology prevents a full cost analysis from being conducted, and this is acknowledged as a limitation of the evaluation.

D/ Are the findings and conclusions clearly presented, relevant, and based on evidence & sound analysis? Findings should respond directly to the evaluation criteria and questions detailed in the scope and objectives section of the report.
They should be based on evidence derived from data collection and analysis methods described in the methodology section of the report. Conclusions should present reasonable judgments based on findings and substantiated by evidence, providing insights pertinent to the object and purpose of the evaluation. Confident to Act The findings are based on reported evidence and address all of the evaluation questions. However, the evaluation questions relate mainly to the project's contribution towards achieving results at the output and outcome levels. Being a final/impact evaluation, a major shortcoming of the evaluation is that the findings do not address the project's contribution to its identified goal. Without reflecting on causal reasons for project accomplishments and failures, an incomplete foundation for targeted recommendations is established. Conclusions are a very concise summary of selected strengths of the project. There is no reflection on the weaknesses of the project. The insight and meaning the conclusions add to the findings and the usefulness of the evaluation for stakeholders are therefore limited. Constructive feedback for future reports Including how to address weaknesses and maintaining good practice Impact evaluations should include some reflection on the extent to which a programme or project has contributed to its identified goal (impact level). Conclusions should reflect on the accomplishments/strengths, as well as shortcomings/failures of a project and this should be clearly linked to causes and contributions by stakeholders.
Contribution and causality 36 Does the evaluation make a fair and reasonable attempt to assign contribution for results to identified stakeholders? For results attributed to the programme, the result should be mapped as accurately as possible to the inputs of different stakeholders. 37 Do conclusions take due account of the views of a diverse cross-section of stakeholders? As well as being logically derived from findings, conclusions should seek to represent the range of views encountered in the evaluation, and not simply reflect the bias of the individual evaluator. Carrying these diverse views through to the presentation of conclusions (considered here) is only possible if the methodology has gathered and analysed information from a broad range of stakeholders. Yes The evaluation does not attempt to assign contribution for identified results to specific stakeholders. The conclusions are a very concise summary of the findings regarding the project's achievements in terms of its stated outcome areas. As such, the conclusions are reflective of the stakeholder views incorporated in the findings. Causal reasons for accomplishments and failures are identified, but these are not followed through in the conclusions and recommendations. 38 Are causal reasons for accomplishments and failures identified as much as possible? These should be concise and usable. They should be based on the evidence and be theoretically robust. (This is an extremely important question to UNICEF) Strengths, weaknesses and implications 39 Are the future implications of continuing constraints discussed? The implications can be, for example, in terms of the cost of the programme, ability to deliver results, reputational risk, and breach of human rights obligations. 40 Do the conclusions present both the strengths and weaknesses of the evaluated object? 
Conclusions should give a balanced view of both the stronger aspects and weaker aspects of the evaluated object with reference to the evaluation criteria and human rights based approach. Completeness and insight of conclusions 41 Do the conclusions represent actual insights into important issues that add value to the findings? Conclusions should go beyond findings and identify important underlying problems and/or priority issues. Simple conclusions that are already well known do not add value and should be avoided. 42 Are the conclusions pitched at a level that is relevant to the end users of the evaluation? Conclusions should speak to the evaluation participants, stakeholders and users. These may cover a wide range of groups and conclusions should thus be stated clearly and accessibly: adding value and understanding to the report (for example, some stakeholders may not understand the methodology or findings, but the conclusions should clarify what these findings mean to them in the context of the programme). Future implications of continuing constraints are not discussed - some constraints are identified in the findings, but these are not followed through in the conclusions and recommendations. The conclusions reflect mainly on the project's strengths, but fail to acknowledge weaker aspects of the project. The conclusions are a very concise summary of the findings and focus mainly on the strengths and successes of the project. As such, they do not add insight or value to the findings in a manner that would be meaningful and relevant to the end-users of the evaluation. Executive Feedback on Section D Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences Findings are based on reported evidence and address all the evaluation questions. 
However, because the evaluation questions do not include reflection on the project's contribution to its stated goal, the evaluation fails to address an important requirement of a final/impact evaluation. The conclusions do not add insight or value to the findings and would be of limited use to end-users of the evaluation.
SECTION E: RECOMMENDATIONS AND LESSONS LEARNED Question cc Remarks Relevance and clarity of recommendations 43 Are the recommendations well-grounded in the evidence and conclusions reported? Recommendations should be logically based in findings and conclusions of the report. 44 Are recommendations relevant to the object and the purpose of the evaluation? Recommendations should be relevant to the evaluated object. 45 Are recommendations clearly stated and prioritised? If the recommendations are few in number (up to 5) then this can also be considered to be prioritised. Recommendations that are over-specific or represent a long list of items are not of as much value to managers. Where there is a long list of recommendations, the most important should be ordered in priority. The recommendations raise issues and present information that are not dealt with in the findings. Important issues around the implementation and performance of the project that are identified in the findings are not followed through in the recommendations. The evaluation is specifically required "to make recommendations on future HIV and Gender programming for adolescents and young people in Lesotho". While useful recommendations are made around delivery mechanisms and the design of future programmes, there are no clear recommendations that specifically address the strengthening of future programmes around gender or the gender-related issues raised by the evaluation. The recommendations are clearly stated, but are not prioritised. E/ Are the recommendations and lessons learned relevant and actionable? Recommendations should be relevant and actionable to the object and purpose of the evaluation, be supported by evidence and conclusions, and be developed with involvement of relevant stakeholders. 
Recommendations should clearly identify the target group for each recommendation, be clearly stated with priorities for action, be actionable and reflect an understanding of the commissioning organization and potential constraints to follow up. Confident to Act The report contains some clearly stated recommendations. However, it is not clear how the recommendations relate to the purpose of the evaluation. Also, there are many findings that are not followed through in the recommendations, so potentially there are significant ways in which future projects could be improved or strengthened that are not reflected in the recommendations. The same applies to lessons learnt - only two lessons related to "challenges" are identified. However, the findings of the evaluation suggest that there are more lessons that could be learnt from this project that could meaningfully inform future projects of this nature. There is a lack of coherence between the lessons learnt and recommendations. Constructive feedback for future reports Including how to address weaknesses and maintaining good practice Care should be taken that all recommendations relevant to the purpose of an evaluation are extracted from the evaluation evidence. Care should be taken that all the possible lessons to be learnt from a project are extracted from the evaluation evidence. Usefulness of recommendations 46 Does each recommendation clearly identify the target group for action? Recommendations should provide clear and relevant suggestions for action linked to the stakeholders who might put that recommendation into action. This ensures that the evaluators have a good understanding of the programme dynamics and that recommendations are realistic. 47 Are the recommendations realistic in the context of the evaluation? This includes: o an understanding of the commissioning organisation o awareness of the implementation constraints o an understanding of the follow-up processes 48 Does the report describe the process followed in developing the recommendations? The preparation of recommendations needs to suit the evaluation process. Participation by stakeholders in the development of recommendations is strongly encouraged to increase ownership and utility. Responsibility for implementing recommendations is not clear. Most recommendations are underpinned by evidence, even though not all evidence is followed through into recommendations. Without discussion of implementation constraints and future plans for the ongoing implementation of the project, it is not clear whether the recommendations are realistic. The process followed in the development of the recommendations is not described, so it is not clear to what extent they reflect the views and requirements of stakeholders. Appropriate lessons learned 49 Are lessons learned correctly identified? Lessons learned are contributions to general knowledge. They may refine or add to commonly accepted understanding, but should not be merely a repetition of common knowledge. Findings and conclusions specific to the evaluated object are not lessons learned. 50 Are lessons learned generalised to indicate what wider relevance they may have? Correctly identified lessons learned should include an analysis of how they can be applied to contexts and situations outside of the evaluated object. Challenges and lessons learned are used interchangeably in the evaluation. While two important lessons are identified, there appear to be various other lessons to be learnt from this project that were not identified. Lessons learnt, especially where they reflect on shortcomings and weaknesses of the project, are not followed through in recommendations. Lessons are not generalised to indicate their wider relevance and implications beyond the project.
Executive Feedback on Section E Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences The report contains some clearly stated recommendations. However, the evidence presented in the findings and lessons learnt suggests that more recommendations could be identified that would be relevant and meaningful to the purpose of the evaluation.
SECTION F: REPORT IS WELL STRUCTURED, LOGICAL AND CLEAR Question cc Remarks Style and presentation 51 Do the opening pages contain all the basic elements? Basic elements include all of: Name of the evaluated object; Timeframe of the evaluation and date of the report; Locations of the evaluated object; Names and/or organisations of evaluators; Name of the organisation commissioning the evaluation; Table of contents including tables, graphs, figures and annex; List of acronyms 52 Is the report logically structured? Context, purpose, methodology and findings logically structured. Findings would normally come before conclusions, recommendations & lessons learnt 53 Do the annexes contain appropriate elements? Appropriate elements may include: ToRs; List of interviewees and site visits; List of documentary evidence; Details on methodology; Data collection instruments; Information about the evaluators; Copy of the evaluation matrix; Copy of the Results chain. Where they add value to the report 54 Do the annexes increase the usefulness and credibility of the report? The only omission in the opening pages of the report is the date of the evaluation. The coherence of the report is undermined by the absence of a clear framework for the evaluation, as well as the fact that important information related to the project and implementation context is dealt with in various places throughout the report. Also, key findings are not followed through in the identification of lessons learnt, conclusions and recommendations, which is indicative of gaps in the logic of the evaluation. Structurally, conclusions are dealt with at the end of the report, after the recommendations. It would have been more meaningful to deal with conclusions directly following the findings to illustrate coherence between the findings and conclusions. 
The ToR for the evaluation are not attached as an annex, which makes it difficult to assess the extent to which the evaluation has met all its expectations and requirements. It would also be useful to include a copy of the evaluation matrix, as well as a list of site visits and documents that were reviewed as annexes. The annexes that are provided (data collection instruments) somewhat increase the credibility of the report. F/ Overall, do all these elements come together in a well structured, logical, clear and complete report? The report should be logically structured with clarity and coherence (e.g. background and objectives are presented before findings, and findings are presented before conclusions and recommendations). It should read well and be focused. Confident to Act The structure of the report could be improved by distinguishing more clearly between sections dealing with a description of the implementation context and the project. Also, conclusions should follow the findings, and should in turn be followed by the recommendations and lessons learnt. The logic of the report could be improved by ensuring greater coherence between the evidence generated on the one hand, and the findings, conclusions and recommendations on the other hand. All relevant findings should be followed through in the conclusions, recommendations and lessons learnt. The ToR are not attached as an annex, which makes it difficult to assess the extent to which the evaluation has addressed all its expectations and requirements. Constructive feedback for future reports Including how to address weaknesses and maintaining good practice The structure of evaluation reports should follow the UNICEF-adapted UNEG standards and these should be made available to evaluators. Care should be taken that all relevant findings are followed through coherently into conclusions, recommendations and lessons learnt. ToR should always be attached as an annex to evaluation reports. Executive Summary 55. 
Is an executive summary included as part of the report? If the answer is No, questions 56 to 58 should be N/A 56 Does the executive summary contain all the necessary elements? Necessary elements include all of: Overview of the evaluated object; Evaluation objectives and intended audience; Evaluation methodology; Most important findings and conclusions; Main recommendations 57 Can the executive summary stand alone? It should not require reference to the rest of the report documents and should not introduce new information or arguments 58 Can the executive summary inform decision making? It should be short (ideally 2-3 pages), and increase the utility for decision makers by highlighting key priorities. Executive Feedback on Section F Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences Yes The Executive Summary does not contain any information regarding the project design, implementation context or evaluation framework. Also, it does not contain a summary of the key findings and conclusions of the evaluation. It presents only an overarching finding regarding the project's contribution at the level of its strategic objective. The Executive Summary does not provide an overview of the recommendations. For these reasons, the Executive Summary cannot stand alone and cannot inform decision-making. The structure of the report does not follow the UNICEF-adapted UNEG standards. In terms of content, the logic of the report is undermined by the lack of coherence between findings, conclusions, recommendations and lessons learnt. There is a sense that the evaluation generated sufficient evidence for more/deeper conclusions, recommendations and lessons learnt than what is presented.
Additional Information Question i/ Does the evaluation successfully address the Terms of Reference? If the report does not include a TOR then a recommendation should be given to ensure that all evaluations include the TOR in the future. Some evaluations may be flawed because the TORs are inappropriate, allow too little time, etc. Or, they may succeed despite inadequate TORs. This should be noted under vii in the next section Remarks NO TERMS OF REFERENCE WERE ATTACHED TO THE REPORT. ALL FUTURE EVALUATIONS SHOULD HAVE THE ORIGINAL TOR INCLUDED AS AN ANNEX. ii/ Identify aspects of good practice of the evaluation In terms of evaluation iii/ Identify aspects of good practice of the evaluation In terms of programmatic, sector specific, thematic expertise OVERALL RATING Question cc Remarks OVERALL RATING Informed by the answers above, apply the reasonable person test to answer the following question: Ω/ Is this a credible report that addresses the evaluation purpose and objectives based on evidence, and that can therefore be used with confidence? This question should be considered from the perspective of UNICEF strategic management. i/ To what extent does each of the six sections of the evaluation provide sufficient credibility to give the reasonable person confidence to act? Taken on their own, could a reasonable person have confidence in each of the five core evaluation elements separately? It is particularly important to consider: o Is the report methodologically appropriate? o Is the evidence sufficient, robust and authoritative? o Do the analysis, findings, conclusions and recommendations hold together? ii/ To what extent do the six sections hold together in a logically consistent way that provides common threads throughout the report? The report should hold together not just as individually appropriate elements, but as a consistent and logical whole. Each of the sections of the evaluation displays key shortcomings that undermine its individual credibility. 
The logic and coherence of the report are undermined by both its structure and its content. Issues and themes central to the evaluation and its purpose are not logically and comprehensively followed through from the findings to the conclusions, recommendations and lessons learnt. There is a sense that the evaluation fails to reach its full conclusion and potential. Confident to Act With some improvement and additions, the evaluation could be used with confidence. The evaluation framework should be strengthened by clarifying the purpose and identifying clear evaluation criteria, objectives and questions relevant to the purpose. As a final project evaluation, some reflection on the project's contribution towards its goal would be expected. Care should be taken that issues and themes that are central to the evaluation purpose and questions are coherently and comprehensively followed through from the findings to the conclusions, recommendations and lessons learnt. The structure of the report should be revised in accordance with the UNICEF-adapted UNEG standards. iii/ Are there any reasons of note that might explain the overall performance or particular aspects of this evaluation report? This is a chance to note mitigating factors and/or crucial issues apparent in the review of the report. Executive Feedback on Overall Rating Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences ToRs Other With some improvements and additions, the evaluation could be used with confidence. The main improvements relate to the clarification of the evaluation framework, including a clear purpose statement and coherent criteria, objectives and questions. Care should be taken that key themes and issues raised by the evaluation purpose and questions are coherently followed through from the findings to the conclusions, recommendations and lessons learnt. 
The structure of the report should adhere to the UNICEF-adapted UNEG standards.
POST DISTRIBUTION MONITORING: - Guidelines to Monitor processes, outputs and outcomes A guide for the Afghanistan Cash & Voucher Working Group George Bete: - August 2013 Funded by European Community Humanitarian
Practice Manager Children s Social Care Role Profile: Practice Manager Grade: Grade 12 Accountable to: Accountable for: Role Context & Purpose Group Manager Line management of a local team of 5-7 fte staff
Page 1 of 6 JOB TITLE: WASH Manager, Zimbabwe LOCATION: Harare - Zimbabwe SALARY: JOB FAMILY: Programme Technical LEVEL: C1 National OXFAM PURPOSE: To work with others to overcome poverty and suffering
Guide for Documenting and Sharing Best Practices in Health Programmes Guide for Documenting and Sharing Best Practices in Health Programmes WORLD HEALTH ORGANIZATION Regional Office for Africa Brazzaville
LONDON SCHOOL OF COMMERCE Programme Specifications for the Cardiff Metropolitan University MSc in International Hospitality Management 1 Contents Programme Aims and Objectives 3 Programme Learning Outcomes
Lead the performance management of care service provision Overview This standard identifies the requirements when leading and managing the performance of the care service provision. It includes identifying
Performance Management Date: November 2012 SSBA Background Document Background 3 4 Governance in Saskatchewan Education System 5 Role of School Boards 6 Performance Management Performance Management Overview
II. Figure 5: 6 The project cycle can be explained in terms of five phases: identification, preparation and formulation, review and approval, implementation, and evaluation. Distinctions among these phases,
Australian ssociation Practice Standards for Social Workers: Achieving Outcomes of Social Workers Australian Association of Social Workers September 2003 Contents Page Introduction... 3 Format of the Standards...
MONITORING & EVALUATION: Some Tools, Methods & Approaches The World Bank Washington, D.C. www.worldbank.org/html/oed N Copyright 2002 The International Bank for Reconstruction and Development/THE WORLD
WORKING IN PARTNERSHIP FOR QUALITY HEALTHCARE IN HAWKE S BAY A QUALITY IMPROVEMENT & SAFETY FRAMEWORK DECEMBER 2013 2 1 IMAGE NEEDED CONTENTS INTRODUCTION 2 WHAT DO WE WANT TO ACCOMPLISH 4 QUALITY HEALTH
Monitoring, Evaluation, Accountability and Learning (MEAL) Keywords: Monitoring, evaluation, accountability, learning, plan, budget Introduction In this session on MEAL Planning and Budgeting, you will
The University of Adelaide Business School MBA Projects Introduction There are TWO types of project which may be undertaken by an individual student OR a team of up to 5 students. This outline presents
Purpose TOOL SUMMARY: LOGFRAME CHEAT SHEET The purpose of the is to compile information from the analysis done by project participants, partners and LWR country staff about the progress or advances the
DAC Guidelines and Reference Series Quality Standards for Development Evaluation DAC Guidelines and Reference Series Quality Standards for Development Evaluation DEVELOPMENT ASSISTANCE COMMITTEE ORGANISATION