1 EVALUATION ID /003 UNICEF Global Evaluation Report Oversight System (GEROS) Review Template Colour Coding CC Dark green Green Amber Red White s Outstanding Satisfactory No Not Applicable Section & Overall Rating Satisfactory Unsatisfactory Outstanding, best practice Highly Satisfactory The Cornerstone questions are in column J and are questions that need to be answered for rating and justification of each of the six sections UNEG Standards for Evaluation in the UN System Response UNEG Norms for Evaluation in the UN System UNICEF Adapted UNEG Evaluation Report Standards Title of the Evaluation Report External Evaluation of the Pacific Enable Project Report sequence number 2015/003 Date of Review 2015/12/14 Year of the Evaluation Report 2015 Region Type of Report Name of reviewer Geographic Scope ( Coverage of the programme being evaluated & generalizability of evaluation findings) Management of Evaluation (Managerial control and oversight of evaluation decisions) Purpose (Speaks to the overarching goal for conducting the evaluation; its raison d'être) East Asia and the Pacific Regional Office Evaluation Universalia Management Group Classification of Evaluation Report Country TORs Present 1.4 Regional: Where one programme is implemented in several countries, or different programmes of a similar theme are implemented in several countries, the evaluation covers multiple countries within the region and the sampling is adequate to make the results generalizable to the region. 2.1 UNICEF managed: Working with national partners of different categories UNICEF is responsible for all aspects of the evaluation. 
3.7 Programme: An evaluation of a sectorial programme to determine its overall effectiveness and efficiency in relation to the stated goals and objectives Fiji (Pacific Islands) Comments Result (Level of changes sought, as defined in RBM: refer to substantial use of highest level reached) SPOA Correspondence (Alignment with SPOA focus area priorities: (1) Health; (2) HIV-AIDS; (3) WASH; (4) Nutrition; (5) Education; (6) Child Protection; (7) Social Inclusion; (8) Cross-Cutting - Gender Equality; and (9) Cross-cutting - Humanitarian Action) 4.3 Impact: Final results of a programme or policy on the intended beneficiaries and, where possible, on comparison groups. Reflects the cumulative effect of donor supported programmes of cooperation and national policy initiatives Touches more than one outcome. Please specify in the comments. The object of evaluation promotes the rights of people with disabilities and touches upon the SPOA areas of social inclusion, education, and child protection. Level of Independence (Implementation and control of the evaluation activities) Approach 6.3 Independent external: The evaluation is implemented by external consultants and/or UNICEF Evaluation Office professionals. The overall responsibility for the evaluation lies outside the division whose work is being evaluated. 7.3 Summative and formative: An evaluation that combines the elements of a formative and a summative evaluation.
2 SECTION A: OBJECT OF THE EVALUATION cc Object and context 1 Is the object of the evaluation well described? This needs to include a clear description of the interventions (project, programme, policies, otherwise) to be evaluated including how the designer thought that it would address the problem identified, implementing modalities, other parameters including costs, relative importance in the organization and (number of) people reached. 2 Is the context explained and related to the object that is to be evaluated? The context includes factors that have a direct bearing on the object of the evaluation: social, political, economic, demographic, institutional. These factors may include strategies, policies, goals, frameworks & priorities at the: international level; national Government level; individual agency level 3 Does this illuminate findings? The context should ideally be linked to the findings so that it is clear how the wider situation may have influenced the outcomes observed. Outstanding Outstanding The object of evaluation is thoroughly described on pp. 8 and 9 and from pp and includes information on the project's rationale, development and implementation process, objectives, geographic scope, and coverage. Information on project expenses is provided on p. 42. Additionally, expected project outputs and outcomes are presented in Appendix 4. Five program outcome areas are presented by lead implementing agency/agencies on p. 14. The evaluation demonstrates best practices by clearly describing the activities and outputs of each implementing agency and partner organization per outcome area from pp This provides the reader with a thorough understanding of the project before learning about the evaluation findings. The project context is discussed in detail from pp and on p. 
15 and includes not only information on the geographic context of the Pacific, information on the Human Development Index of each country, and definitions for key disability-related concepts but also provides information on the particular challenges often faced by organizations working on issues of disabilities within the region. The evaluation also includes a very thorough discussion on how disability is included in various regional policies and frameworks. Both the detailed information about the project and its context help to illuminate the findings. A/ Does the report present a clear & full description of the 'object' of the evaluation? The report should describe the object of the evaluation including the results chain, meaning the theory of change that underlies the programme being evaluated. This theory of change includes what the programme was meant to achieve and the pathway (chain of results) through which it was expected to achieve this. The context of key social, political, economic, demographic, and institutional factors that have a direct bearing on the object should be described. For example, the partner government s strategies and priorities, international, regional or country development goals, strategies and frameworks, the concerned agency s corporate goals & priorities, as appropriate. Constructive feedback for future reports Including how to address weaknesses and maintaining good practice Theory of Change 4 Is the results chain or logic well articulated? The report should identify how the designers of the evaluated object thought that it would address the problem that they had identified. This can include a results chain or other logic models such as theory of change. It can include inputs, outputs and outcomes, it may also include impacts. The models need to be clearly described and explained. The program logic is explained on pp. 
8 and 9 and includes a general discussion on how the program was designed to meet its overall goal of "improving the lives and opportunities for persons with disabilities" (p. 8) by addressing 4 major themes embedded in the CRPD. Although the report does not provide a visual Theory of Change model, Appendix 4 provides a results chain that identifies project outcome areas and outputs per area that clearly indicates how the project was designed to reach its larger goal. The quality of the results chain is assessed on p. 29. The report is very strong at providing a The report reflect best practices in its detailed and thorough description of the thorough description of the object of object of evaluation and its context. evaluation and stakeholder contribution by Information is provided on the project's including a breakdown of financial rationale, development and implementation distributions and contributions per process, objectives, geographic scope, and implementing partner and a detailed coverage. The report demonstrates best description of the project activities practices by providing a detailed description executed by each implementing partner. of the project activities executed by each Although no visual Theory of Change is implementing partner (including UNICEF). presented, good practices are seen in the The programme logic is explained, a results evaluation's presentation and analysis of chain with outputs and outcomes is the project's programme logic and results presented, and the evaluator provides an chain. Even though the evaluation report assessment of the quality of the results indicates that the evaluation is taking place chain. 
Even though the evaluation report after Phase I of the project in order to indicates that the evaluation is taking place inform decision-making relevant to Phase after Phase I of the project in order to inform II, it is necessary to provide the dates of decision-making relevant to Phase II, no the two phases in order to establish the dates are provided to establish the timing of timing of each phase. each phase. Changes made to the five
3 5 Are key stakeholders clearly identified? These include o implementing agency(ies) o development partners o rights holders o primary duty bearers o secondary duty bearers 6 Are key stakeholders' contributions described? This can involve financial or other contributions and should be specific. If joint program also specify UNICEF contribution, but if basket funding question is not applicable 7 Are UNICEF contributions described? This can involve financial or other contributions and should be specific Stakeholders and their contributions Outstanding Outstanding The funding agency (UNPRPD), implementing agencies, and regional organisations implicated in the project are clearly identified on p. 8 and their roles and responsibilities are clearly outlined on p. 9. The lead organisations per project outcome area are presented on p. 14. The distribution of UNPRPD funds by implementing agency along with information on contributions from each agency and organisation is outlined in detail on p. 42. A detailed description of the activities and outputs of each implementing agency and partner organization per outcome area is presented from pp Highly Satisfactory assessment of the quality of the results indicates that the evaluation is taking place chain. Even though the evaluation report after Phase I of the project in order to indicates that the evaluation is taking place inform decision-making relevant to Phase after Phase I of the project in order to inform II, it is necessary to provide the dates of decision-making relevant to Phase II, no the two phases in order to establish the dates are provided to establish the timing of timing of each phase. each phase. Changes made to the five outcome areas during the project implementation are discussed but no analysis of how these changes may have affected the project outcomes is provided. Implementation Status 8 Is the implementation status described? 
This includes the phase of implementation and significant changes that have happened to plans, strategies, performance frameworks, etc that have occurred - including the implications of these changes The evaluation report clearly indicates on p. 16 that that the evaluation is taking place after Phase I of the project in order to inform decision-making relevant to Phase II. However, no dates are provided to establish the timing of each phase. Changes made to the five outcome areas during the project implementation are discussed on p. 14. However, an analysis of how these changes may have affected the project outcomes was not provided. Executive Feedback on Section A Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences The evaluation reflects best practices in its clear and detailed description of the object of evaluation, its context, and stakeholder contributions. The programme logic and results chain is well presented and assessed but information on the implementation status could be strengthened.
4 SECTION B: EVALUATION PURPOSE, OBJECTIVES AND SCOPE Purpose, objectives and scope cc 9 Is the purpose of the evaluation clear? This includes why the evaluation is needed at this time, who needs the information, what information is needed, how the information will be used. 10 Are the objectives and scope of the evaluation clear and realistic? This includes: Objectives should be clear and explain what the evaluation is seeking to achieve; Scope should clearly describe and justify what the evaluation will and will not cover; Evaluation questions may optionally be included to add additional details 11 Do the objective and scope relate to the purpose? The reasons for holding the evaluation at this time in the project cycle (purpose) should link logically with the specific objectives the evaluation seeks to achieve and the boundaries chosen for the evaluation (scope) The purpose of the evaluation is clearly described on p. 16 as to provide accountability and feedback on the impact of the implementation of Phase 1 of Pacific Enable in meeting the targets set out in the proposal documentation. The report also specifies that "the primary audience is the donor, with secondary audience as the participating agencies" (p. 16). The evaluation is meant to inform future programming, especially considering the development and implementation of Phase II of the program. The objectives of the evaluation are presented in a generalized fashion in the Executive Summary on p. 5. However, they are not clearly discussed again within the body of the report. The scope of the evaluation is not clearly identified in terms of time period or geographic coverage nor is there any discussion around sampling choices. The report would be stronger if it included evaluation questions. It is difficult to link the objectives and scope of the evaluation back to the purpose when the objectives and scope have not been clearly identified. 
B/ Are the evaluation's purpose, objectives and scope sufficiently clear to guide the evaluation? The purpose of the evaluation should be clearly defined, including why the evaluation was needed at that point in time, who needed the information, what information is needed, and how the information will be used. The report should provide a clear explanation of the evaluation objectives and scope including main evaluation questions and describes and justifies what the evaluation did and did not cover. The report should describe and provide an explanation of the chosen evaluation criteria, performance standards, or other criteria used by the evaluators. Constructive feedback for future reports Including how to address weaknesses and maintaining good practice Evaluation framework 12 Does the evaluation provide a relevant list of evaluation criteria that are explicitly justified as appropriate for the Purpose? It is imperative to make the basis of the value judgements used in the evaluation transparent if it is to be understood and convincing. UNEG evaluation standards refer to the OECD/DAC criteria, but other criteria can be used such as Human rights and humanitarian criteria and standards (e.g. SPHERE Standards) but this needs justification.. Not all OECD/DAC criteria are relevant to all evaluation objectives and scopes. The TOR may set the criteria to be used, but these should be (re)confirmed by the evaluator. Standard OECD DAC Criteria include: Relevance; Effectiveness; Efficiency; Sustainability; Impact Additional humanitarian criteria include; Coverage; Coordination; Coherence; Protection; timeliness; connectedness; appropriateness. (This is an extremely important question to UNICEF) 13 Does the evaluation explain why the evaluation criteria were chosen and/or any standard DAC evaluation criteria (above) rejected? 
The rationale for using each particular non-oecd-dac criterion (if applicable) and/or rejecting any standard OECD-DAC criteria (where they would be applicable) should be explained in the report. The evaluation specifies on p. 16 that the evaluation criteria included the areas of efficiency, timeliness, and sustainability. The report explicitly states that these criteria were identified in the ToRs. The ToRs also mention that the project is to be evaluated against the specified impact indicators and means of verification set out in the project proposal. In the findings section on p. 30, the report goes on to include additional evaluation criteria of "relevance, timeliness, effectiveness, impact, efficiency, and sustainability" of the project components, and assess the extent to which the crosscutting issue of gender was addressed in the different outcome areas. The report would be stronger if this additional information were was also presented under the section on Methodology. Satisfactory The report clearly describes the evaluation's purpose and its intended audience. The scope of the evaluation is not clearly identified in terms of time period or geographic coverage nor is there any discussion around sampling choices. The evaluation criteria are clearly stated and justified as called for in the Terms of Reference. However, the section on Methodology identifies the evaluation criteria of "efficiency, timeliness, and sustainability" while at the beginning of the findings section, the criteria also includes "relevance, effectiveness, and impact", along with the cross-cutting issue of gender. The report would be stronger and easier to understand if all of the evaluation criteria were presented at once under the section on Methodology. Evaluation objectives should be clearly identified within both the Executive Summary and the body of the report. The scope should also be clearly identified by providing the time period that the evaluation is reviewing (i.e. 
the dates of Phase I), the geographic coverage included in the evaluation, and the chosen sampling methods. Although standard OECD/DAC criteria are used for the evaluation, all of the criteria are not presented within the Methodology Section.
5 Executive Feedback on Section B Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences The report clearly describes the evaluation's purpose and its intended audience but information is missing on the evaluation objectives, and evaluation scope. Although OECD/DAC criteria are used in the evaluation, all of the criteria are not clearly presented in the Methodology Section.
6 SECTION C: EVALUATION METHODOLOGY, GENDER, HUMAN RIGHTS AND EQUITY Data collection 14 Does the report specify data collection methods, analysis methods, sampling methods and benchmarks? This should include the rationale for selecting methods and their limitations based on commonly accepted best practice. 15 Does the report specify data sources, the rationale for their selection, and their limitations? This should include a discussion of how the mix of data sources was used to obtain a diversity of perspectives, ensure accuracy & overcome data limits Ethics 16 Are ethical issues and considerations described? The design of the evaluation should contemplate: How ethical the initial design of the programme was; The balance of costs and benefits to participants (including possible negative impact) in the programme and in the evaluation; The ethics of who is included and excluded in the evaluation and how this is done 17 Does the report refer to ethical safeguards appropriate for the issues described? When the topic of an evaluation is contentious, there is a heightened need to protect those participating. These should be guided by the UNICEF Evaluation Office Technical Note and include: protection of confidentiality; protection of rights; protection of dignity and welfare of people (especially children); Informed consent; Feedback to participants; Mechanisms for shaping the behaviour of evaluators and data collectors cc No The report identifies data collection methods (desk review, survey, and key informant interviews) on pp. 16 and 17. However, the report does not mention how open-ended survey questions or information from key informant interviews was analyzed. Sampling methods are not discussed. Appendix 7 provides a list of documents consulted and Appendix 1 provides a list of stakeholders consulted. Although the report indicates that a mix of data sources was used, it does not explain why this approach was used (i.e. 
to obtain a diversity of perspectives, ensure accuracy, or overcome data limits) or what limitations each source may contain. Ethical issues are addressed on pp. 18 and 19 and include information on ethical safeguards such as ensuring the voluntary participation of interviewees, only interviewing participants who are 18 years of age and older, assuring confidentiality, and identifying any conflict of interest. The evaluation also specifies that the evaluator found no evidence of unethical conduct during program implementation. There is little discussion around the balance of costs and benefits to participants, and the ethics around who is included and who is excluded in the evaluation. C/ Is the methodology appropriate and sound? The report should present a transparent description of the methodology applied to the evaluation that clearly explains how the evaluation was specifically designed to address the evaluation criteria, yield answers to the evaluation questions and achieve the evaluation purposes. The report should also present a sufficiently detailed description of methodology in which methodological choices are made explicit and justified and in which limitations of methodology applied are included. The report should give the elements to assess the appropriateness of the methodology. Methods as such are not good or bad, they are only so in relation to what one tries to get to know as part of an evaluation. Thus this standard assesses the suitability of the methods selected for the specifics of the evaluation concerned, assessing if the methodology is suitable to the subject matter and the information collected are sufficient to meet the evaluation objectives. Constructive feedback for future reports Including how to address weaknesses and maintaining good practice Results Based Management 18 Is the capability and robustness of the evaluated object's monitoring system adequately assessed? 
The evaluation should consider the details and overall functioning of the management system in relation to results: from the M&E system design, through individual tools, to the use of data in management decision making. The evaluation report assesses the project's M&E system (or lack thereof) on pp. 28 and 29. The report explains that although the proposal indicates that a monitoring framework would be developed, the logframe was used instead. The evaluator could not identify any staff responsible for the project's M&E system (p. 28) and although some midterm reports were submitted, some appear to be missing. The evaluator provides an assessment of the quality of the project's logframe on p. 29 and explains why the project's indicators, baseline, and means of verification are inadequate for a monitoring framework. Even though the project appears not to have used an adequate M&E Framework to measure results, the evaluation report uses the project's outcome areas to guide the assessment. Evaluation findings are structured around each outcome area. Overall, the evaluation methodology facilitates answers to the evaluation questions. However, some key information on the evaluation methodology is missing. The report identifies data collection methods but does not discuss how data was analyzed. Sampling methods are also not discussed. Although the report indicates that a mix of data sources was used, it does not explain why this approach was used or what limitations each source may contain. Ethical issues are addressed and include some information on ethical safeguards. The evaluation report assesses the project's M&E system and provides an assessment of the quality of the programme's logframe. Even though the project appears to not have used an adequate M&E Framework to measure results, the evaluation report uses the project's outcome areas to guide the assessment. 
The evaluation is very strong at incorporating a human-rights and equity Although the methodology appears to facilitate answers to the evaluation questions, detailed information should be provided on sampling methods, how information was analyzed, and why certain data sources were chosen and their potential limitations. Even though stakeholders are identified, their contributions and level of participation throughout the evaluation process should be discussed. Justification for why people with disabilities were not consulted in the evaluation should also be provided.
7 19 Does the evaluation make appropriate use of the M&E framework of the evaluated object? In addition to articulating the logic model (results chain) used by the programme, the evaluation should make use of the object's logframe or other results framework to guide the assessment. The results framework indicates how the programme design team expected to assess effectiveness, and it forms the guiding structure for the management of implementation. 29. The report explains that although the proposal indicates that a monitoring framework would be developed, the logframe was used instead. The evaluator could not identify any staff responsible for the project's M&E system (p. 28) and although some midterm reports were submitted, some appear to be missing. The evaluator provides an assessment of the quality of the project's logframe on p. 29 and explains why the project's indicators, baseline, and means of verification are inadequate for a monitoring framework. Even though the project appears not to have used an adequate M&E Framework to measure results, the evaluation report uses the project's outcome areas to guide the assessment. Evaluation findings are structured around each outcome area. on the evaluation methodology is missing. The report identifies data collection methods but does not discuss how data was analyzed. Sampling methods are also not discussed. Although the report indicates that a mix of data sources was used, it does not explain why this approach was used or what limitations each source may contain. Ethical issues are addressed and include some information on ethical safeguards. The evaluation report assesses the project's M&E system and provides an assessment of the quality of the programme's logframe. Even though the project appears to not have used an adequate M&E Framework to measure results, the evaluation report uses the project's outcome areas to guide the assessment. 
The evaluation is very strong at incorporating a human-rights and equity approach, as it is the core of the evaluated programme's objectives. Gender issues are discussed, since gender as a cross-cutting issue was identified in the ToRs, but even though significant areas for improvements were identified in the findings, the report does not provide any gender-specific conclusions or recommendations. Although stakeholders are identified, their contributions during the evaluation process are not entirely discussed. provided on sampling methods, how information was analyzed, and why certain data sources were chosen and their potential limitations. Even though stakeholders are identified, their contributions and level of participation throughout the evaluation process should be discussed. Justification for why people with disabilities were not consulted in the evaluation should also be provided. Satisfactory
8 Human Rights, Gender and Equity 20 Did the evaluation design and style consider incorporation of the UN and UNICEF's commitment to a human rights-based approach to programming, to gender equality, and to equity? This could be done in a variety of ways including: use of a rights-based framework, use of CRC, CCC, CEDAW and other rights related benchmarks, analysis of right holders and duty bearers and focus on aspects of equity, social exclusion and gender. Style includes: using human-rights language; gender-sensitive and child-sensitive writing; disaggregating data by gender, age and disability groups; disaggregating data by socially excluded groups. Promote gender-sensitive interventions as a core programmatic priority, To the extent possible, all relevant policies, programmes and activities will mainstream gender equality. 21 Does the evaluation assess the extent to which the implementation of the evaluated object was monitored through human rights (inc. gender, equity & child rights) frameworks? UNICEF commits to go beyond monitoring the achievement of desirable outcomes, and to ensure that these are achieved through morally acceptable processes. The evaluation should consider whether the programme was managed and adjusted according to human rights and gender monitoring of processes. 22 Do the methodology, analytical framework, findings, conclusions, recommendations & lessons provide appropriate information on HUMAN RIGHTS (inc. women & child rights)? The inclusion of human rights frameworks in the evaluation methodology should continue to cascade down the evaluation report and be obvious in the data analysis, findings, conclusions, any recommendations and any lessons learned. 
If identified in the scope the methodology should be capable of assessing the level of: Identification of the human rights claims of rights-holders and the corresponding human rights obligations of duty-bearers, as well as the immediate underlying & structural causes of the non realisation of rights.; Capacity development of rights-holders to claim rights, and duty-bearers to fulfil obligations. Support for humanitarian action achieving faster scaling up of response, early identification of priorities and strategies, rapid deployment of qualified staff and clear accountabilities and responses consistent with humanitarian principles in situations of unrest or armed conflict. 23 Do the methodology, analytical framework, findings, conclusions, recommendations & lessons provide appropriate information on GENDER EQUALITY AND WOMEN'S EMPOWERMENT? The inclusion of gender equality frameworks in the evaluation methodology should continue to cascade down the evaluation report and be obvious in the data analysis, findings, conclusions, any recommendations and any lessons learned. If identified in the scope the methodology should be capable of assessing the immediate underlying & structural causes of social exclusion; and capacity development of women to claim rights, and duty-bearers to fulfil their equality obligations. Outstanding The core programming area of the evaluated object is one based on respecting and promoting the human rights of persons with disabilities and ensuring that they are treated equitably. The evaluation reflects UNICEF's commitment to human rights and equity throughout the report through the use of a rights-based framework, sensitive analysis, writing style, and language. The evaluator demonstrates attention to human-rights by highlighting a potential problem with the ILO's approach of advocating selfemployment over the right to be hired by an employer. Gender as a cross-cutting issue is analyzed on pp. 44 and 45, as per the ToRs. 
The evaluation findings specify that significant room for improvement could be made by program partners in identifying and addressing gender-specific issues. The report also calls for increased disaggregated data on peoples with disabilities. However, the conclusions do not mention anything about gender issues and no recommendations are provided for improving the project's performance in terms of gender considerations. Satisfactory project's outcome areas to guide the assessment. The evaluation is very strong at incorporating a human-rights and equity approach, as it is the core of the evaluated programme's objectives. Gender issues are discussed, since gender as a cross-cutting issue was identified in the ToRs, but even though significant areas for improvements were identified in the findings, the report does not provide any gender-specific conclusions or recommendations. Although stakeholders are identified, their contributions during the evaluation process are not entirely discussed. 24 Do the methodology, analytical framework, findings, conclusions, recommendations & lessons provide appropriate information on EQUITY? The inclusion of equity considerations in the evaluation methodology should continue to cascade down the evaluation report and be obvious in the data analysis, findings, conclusions, any recommendations and any lessons learned. If identified in the scope the methodology should be capable of assessing the capacity development of rights-holders to claim rights, and duty-bearers to fulfil obligations & aspects of equity.
9 Stakeholder participation

25 Are the levels and activities of stakeholder consultation described? This goes beyond just using stakeholders as sources of information and includes the degree of participation in the evaluation itself. The report should include the rationale for selecting this level of participation. Roles for participation might include: liaison; technical advisory; observer; active decision making. The reviewer should look for the soundness of the description and rationale for the degree of participation rather than the level of participation itself.

26 Are the levels of participation appropriate for the task in hand? The breadth & degree of stakeholder participation feasible in evaluation activities will depend partly on the kind of participation achieved in the evaluated object. The reviewer should note here whether a higher degree of participation may have been feasible & preferable.

The body of the evaluation report indicates that stakeholders were consulted and information gathered through a survey and key informant interviews. A list of the consulted stakeholders is presented in Appendix 1. The ToRs also mention that the consultant will work with an evaluation reference group made up of programme partners. Overall, the level of participation seems appropriate, even though the evaluation specifies that no persons with disabilities were included in the evaluation process (p. 47) and recommends that they be better included in the future.

Methodological robustness

27 Is there an attempt to construct a counterfactual or address issues of contribution/attribution? The counterfactual can be constructed in several ways, which can be more or less rigorous: by contacting eligible beneficiaries who were not reached by the programme, through a theoretical counterfactual based on historical trends, or with a comparison group.

28 Does the methodology facilitate answers to the evaluation questions in the context of the evaluation?
The methodology should link back to the Purpose and be capable of providing answers to the evaluation questions.

29 Are methodological limitations acceptable for the task in hand? Limitations must be specifically recognised and appropriate efforts taken to control bias. This includes the use of triangulation and the use of robust data collection tools (interview protocols, observation tools, etc.). Bias limitations can be addressed in three main areas: bias inherent in the sources of data; bias introduced through the methods of data collection; and bias that colours the interpretation of findings.

N/A

The methodology uses key informant interviews, a survey, and a desk review of key documents. The methodology facilitates answers to the evaluation questions and links back to the evaluation purpose. Since the evaluated object is part of a regional programme to promote the rights of people with disabilities, it would have been very difficult to construct a counterfactual. Evaluation limitations are presented on pp. 17 and 18 and are acceptable for the task at hand. The evaluator mentions that only open-ended survey questions were included in the assessment in order to reduce any potential bias that might emerge due to the limited number of people included in the survey.

Executive Feedback on Section C Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences

The report is strong at including a human rights and equity approach and at assessing the programme's M&E framework and monitoring and data collection system. Some key information on the methodology, ethics, and stakeholder participation is missing, along with gender-specific conclusions and recommendations.
10 SECTION D: FINDINGS AND CONCLUSIONS

Completeness and logic of findings

30 Are findings clearly presented and based on the objective use of the reported evidence? Findings regarding the inputs for the completion of activities or process achievements should be distinguished clearly from results. Findings on results should clearly distinguish outputs, outcomes and impacts (where appropriate). Findings must demonstrate full marshalling and objective use of the evidence generated by the evaluation data collection. Findings should also tell the 'whole story' of the evidence and avoid bias.

31 Do the findings address all of the evaluation's stated criteria and questions? The findings should seek to systematically address all of the evaluation questions according to the evaluation framework articulated in the report.

32 Do findings demonstrate the progression to results based on the evidence reported? There should be a logical chain developed by the findings, which shows the progression (or lack of it) from implementation to results.

33 Are gaps and limitations discussed? The data may be inadequate to answer all the evaluation questions as satisfactorily as intended; in this case the limitations should be clearly presented and discussed. Caveats should be included to guide the reader on how to interpret the findings. Any gaps in the programme or unintended effects should also be addressed.

34 Are unexpected findings discussed? If the data reveals (or suggests) unusual or unexpected issues, these should be highlighted and discussed in terms of their implications.

Cost Analysis

35 Is a cost analysis presented that is well grounded in the findings reported? Cost analysis is not always feasible or appropriate. If this is the case then the reasons should be explained.
Otherwise the evaluation should use an appropriate scope and methodology of cost analysis to answer the following questions: how programme costs compare to other similar programmes or standards; the most efficient way to get expected results; the cost implications of scaling up or down; the cost implications of replicating in a different context; whether the programme is worth doing from a cost perspective; and costs and the sustainability of the programme.

N/A

Findings are clearly presented around evaluation criteria and project outcome areas. Information concerning inputs is clearly distinguished from that on results. The findings appear to be objective and to tell the "whole story" of the evidence. The findings systematically address all of the evaluation's stated criteria and logically demonstrate a progression to results by first describing the activities of implementing partners and then assessing their results. Gaps and limitations are regularly discussed throughout the findings section and help guide the reader on how to interpret the findings. Examples include the lack of data on people with disabilities, the project's weak M&E system, and the difficulties in comparing stakeholder contributions due to different systems of measuring project contributions. Several unexpected findings are highlighted and discussed in terms of their implications. Examples include a missing project monitoring structure and disparities between the values of organizations according to their mission statements (including the ILO and UNICEF) and the values advocated during programme implementation. A cost analysis is not required by the ToRs. The evaluation was unable to assess the efficiency of the initiative (even though this was called for in the ToRs), stating that "since each organization has its own methods for calculating its own contribution amounts, it is not possible to compare efficiency by using the contribution amounts cited by each organization" (p. 41).
Issues around project sustainability per implementing partner are discussed from pp.

D/ Are the findings and conclusions clearly presented, relevant and based on evidence & sound analysis? Findings should respond directly to the evaluation criteria and questions detailed in the scope and objectives section of the report. They should be based on evidence derived from data collection and analysis methods described in the methodology section of the report. Conclusions should present reasonable judgments based on findings and substantiated by evidence, providing insights pertinent to the object and purpose of the evaluation.

Highly Satisfactory

Findings are clearly presented around evaluation criteria and project outcome areas. The findings systematically address all of the evaluation's stated criteria and logically demonstrate a progression to results by first describing the activities of implementing partners and then assessing their results. The evaluation demonstrates best practices in its ability to assign contribution for results to identified stakeholders by clearly describing the inputs and results of each implementing partner and how they contributed to the larger project. The conclusions are very well written, highlight the underlying problems and priority issues identified within the findings, and provide useful insights into important issues. The conclusions appear to reflect the range of views encountered in the evaluation. However, they do not necessarily represent the most diverse cross-section of stakeholders possible, since persons with disabilities and people under the age of 18 were not consulted as part of the evaluation.

Constructive feedback for future reports Including how to address weaknesses and maintaining good practice

The conclusions could represent a more diverse cross-section of stakeholders by including persons with disabilities and people under the age of 18. Additionally, conclusions should highlight all of the important areas identified in the findings, including those related to gender equality.
11 Contribution and causality

36 Does the evaluation make a fair and reasonable attempt to assign contribution for results to identified stakeholders? For results attributed to the programme, the result should be mapped as accurately as possible to the inputs of different stakeholders.

37 Are causal reasons for accomplishments and failures identified as much as possible? These should be concise and usable. They should be based on the evidence and be theoretically robust. (This is an extremely important question to UNICEF.)

Outstanding Outstanding

The evaluation demonstrates best practices in its ability to assign contribution for results to identified stakeholders by clearly describing the inputs and results of each implementing partner and how they contributed to the larger project. Findings include a detailed account explaining the factors leading to different results and why they occurred.

Highly Satisfactory

Strengths, weaknesses and implications

38 Are the future implications of continuing constraints discussed? The implications can be, for example, in terms of the cost of the programme, ability to deliver results, reputational risk, and breach of human rights obligations.

39 Do the conclusions present both the strengths and weaknesses of the evaluated object? Conclusions should give a balanced view of both the stronger aspects and weaker aspects of the evaluated object with reference to the evaluation criteria and human rights based approach.

40 Do the conclusions represent actual insights into important issues that add value to the findings? Conclusions should go beyond findings and identify important underlying problems and/or priority issues. Simple conclusions that are already well known do not add value and should be avoided.

41 Do conclusions take due account of the views of a diverse cross-section of stakeholders?
As well as being logically derived from findings, conclusions should seek to represent the range of views encountered in the evaluation, and not simply reflect the bias of the individual evaluator. Carrying these diverse views through to the presentation of conclusions (considered here) is only possible if the methodology has gathered and analysed information from a broad range of stakeholders.

Completeness and insight of conclusions

42 Are the conclusions pitched at a level that is relevant to the end users of the evaluation? Conclusions should speak to the evaluation participants, stakeholders and users. These may cover a wide range of groups, and conclusions should thus be stated clearly and accessibly, adding value and understanding to the report (for example, some stakeholders may not understand the methodology or findings, but the conclusions should clarify what these findings mean to them in the context of the programme).

The future implications of continuing constraints are discussed in the section on sustainability from pp. Conclusions are presented on pp. 45 and 46, are structured around project outcome areas, and present both the strengths and weaknesses of the evaluated object. The conclusions are very well written, highlight the underlying problems and priority issues identified within the findings, and provide useful insights into important issues. The conclusions are succinct, organised around outcome areas, and pitched at a level that is relevant to the end users of the evaluation. The conclusions appear to reflect the range of views encountered in the evaluation and not only the bias of the evaluator. However, they do not necessarily represent the most diverse cross-section of stakeholders possible, since persons with disabilities and people under the age of 18 were not consulted as part of the evaluation. Important findings related to gender equality in terms of the design and implementation of the programme are not addressed in the conclusions.
Executive Feedback on Section D Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences

The findings are strong and reflect best practices in assigning contribution for results to identified stakeholders and describing causal reasons for accomplishments and failures. Although the conclusions are insightful and pitched at a level that is relevant to the end users of the evaluation, they could have represented a more diverse cross-section of stakeholders.
12 SECTION E: RECOMMENDATIONS AND LESSONS LEARNED

Relevance and clarity of recommendations

43 Are the recommendations well-grounded in the evidence and conclusions reported? Recommendations should be logically based in the findings and conclusions of the report.

44 Are recommendations relevant to the object and the purpose of the evaluation? Recommendations should be relevant to the evaluated object.

45 Are recommendations clearly stated and prioritised? If the recommendations are few in number (up to 5) then they can also be considered to be prioritised. Recommendations that are over-specific or represent a long list of items are not of as much value to managers. Where there is a long list of recommendations, the most important should be ordered in priority.

The recommendations are logically based in the findings and conclusions of the report and are relevant to the evaluated object. They are clearly stated in two groups: general recommendations for the programme and recommendations specific to the second phase of the initiative. The recommendations are numbered and prioritized.

E/ Are the recommendations and lessons learned relevant and actionable? Recommendations should be relevant and actionable to the object and purpose of the evaluation, be supported by evidence and conclusions, and be developed with the involvement of relevant stakeholders. Recommendations should clearly identify the target group for each recommendation, be clearly stated with priorities for action, be actionable, and reflect an understanding of the commissioning organization and potential constraints to follow-up.

Constructive feedback for future reports Including how to address weaknesses and maintaining good practice

Usefulness of recommendations

46 Does each recommendation clearly identify the target group for action? Recommendations should provide clear and relevant suggestions for action linked to the stakeholders who might put that recommendation into action.
This ensures that the evaluators have a good understanding of the programme dynamics and that recommendations are realistic.

47 Are the recommendations realistic in the context of the evaluation? This includes: an understanding of the commissioning organisation; awareness of the implementation constraints; and an understanding of the follow-up processes.

48 Does the report describe the process followed in developing the recommendations? The preparation of recommendations needs to suit the evaluation process. Participation by stakeholders in the development of recommendations is strongly encouraged to increase ownership and utility.

Appropriate lessons learned

49 Are lessons learned correctly identified? Lessons learned are contributions to general knowledge. They may refine or add to commonly accepted understanding, but should not be merely a repetition of common knowledge. Findings and conclusions specific to the evaluated object are not lessons learned.

50 Are lessons learned generalised to indicate what wider relevance they may have? Correctly identified lessons learned should include an analysis of how they can be applied to contexts and situations outside of the evaluated object.

No No N/A N/A

While the recommendations are realistic and demonstrate an awareness of the implementing partners and their constraints, they do not clearly identify the target group for action. There is also no mention of how the recommendations were developed and whether the process included the participation of any stakeholders. Lessons learned are not included as part of the evaluation report. While they are not requested as part of the ToRs, they are a useful component of any evaluation, especially in designing the second phase of this specific evaluated programme. Both the conclusions and recommendations include information that could easily be re-formatted and used as part of a section on lessons learned.
Satisfactory

The recommendations are logically based in the findings and conclusions of the report, are relevant to the evaluated object, and are numerically prioritized. While the recommendations are realistic and demonstrate an awareness of the implementing partners and their constraints, they do not clearly identify the target group for action. There is also no mention of how the recommendations were developed and whether the process included the participation of any stakeholders. Lessons learned are not included as part of the evaluation report. While they are not requested as part of the ToRs, they are a useful component of any evaluation.

Constructive feedback for future reports: The recommendations should identify the target group for action, and an explanation should be provided as to how the recommendations were developed and whether the process included the participation of any stakeholders. While lessons learned are not requested as part of the ToRs, they are a useful component of any evaluation, especially in designing the second phase of this specific evaluated programme. Both the conclusions and recommendations include information that could easily be re-formatted and used as part of a section on lessons learned.

Executive Feedback on Section E Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences

While the recommendations are logically based on the findings and conclusions, are relevant, and are numerically prioritized, they do not identify the target group for action and the process used to develop them is not explained. No lessons learned are included in the report as they were not requested by the ToRs.
13 SECTION F: REPORT IS WELL-STRUCTURED, LOGICAL AND CLEAR

Style and presentation

51 Do the opening pages contain all the basic elements? Basic elements include all of: name of the evaluated object; timeframe of the evaluation and date of the report; locations of the evaluated object; names and/or organisations of evaluators; name of the organisation commissioning the evaluation; table of contents including tables, graphs, figures and annexes; list of acronyms.

52 Is the report logically structured? Context, purpose, methodology and findings logically structured. Findings would normally come before conclusions, recommendations & lessons learnt.

53 Do the annexes contain appropriate elements? Appropriate elements may include: ToRs; list of interviewees and site visits; list of documentary evidence; details on methodology; data collection instruments; information about the evaluators; copy of the evaluation matrix; copy of the results chain. Where they add value to the report.

54 Do the annexes increase the usefulness and credibility of the report?

Executive Summary

55 Is an executive summary included as part of the report? If the answer is No, questions 56 to 58 should be N/A.

56 Does the executive summary contain all the necessary elements? Necessary elements include all of: overview of the evaluated object; evaluation objectives and intended audience; evaluation methodology; most important findings and conclusions; main recommendations.

57 Can the executive summary stand alone? It should not require reference to the rest of the report documents and should not introduce new information or arguments.

58 Can the executive summary inform decision making? It should be short (ideally 2-3 pages), and increase the utility for decision makers by highlighting key priorities.

Executive Feedback on Section F Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating. Up to two sentences

Additional Information
The opening pages contain all of the necessary elements. The report is logically structured and presents the report contents in the standard order. The annexes contain useful information that adds to the usefulness and credibility of the report, including a list of consulted stakeholders, a list of Round 1 UNPRPD projects, the ToRs, the results chain of outputs and outcomes, interview guides and survey tools, and a list of consulted documents. The Executive Summary contains all of the necessary elements and reflects the structure of the evaluation report. It can stand alone, as it does not reference the body of the report or introduce any new information. It is short, well summarized with clear sub-headings, and highlights key priorities, thus making it useful for decision makers.

F/ Overall, do all these elements come together in a well-structured, logical, clear and complete report? The report should be logically structured with clarity and coherence (e.g. background and objectives are presented before findings, and findings are presented before conclusions and recommendations). It should read well and be focused.

Highly Satisfactory

The report is well written, clear, logically structured, and presents the report contents in the standard order. The annexes contain useful information that adds to the usefulness and credibility of the report. The Executive Summary contains all of the necessary elements and reflects the structure of the evaluation report. It can stand alone, as it does not reference the body of the report or introduce any new information. It is short, well summarized with clear sub-headings, and highlights key priorities, thus making it useful for decision makers.

Constructive feedback for future reports Including how to address weaknesses and maintaining good practice

The Executive Summary reflects good practices by including all of the necessary elements and by reflecting the structure of the evaluation report.
It can stand alone, as it does not reference the body of the report or introduce any new information. It is short, well summarized with clear sub-headings, and highlights key priorities, thus making it useful for decision makers.

The report is well written, clear, logically structured, and includes appropriate information in the annexes. The Executive Summary contains all of the necessary elements, can stand alone, and is useful to decision makers.

i/ Does the evaluation successfully address the Terms of Reference? If the report does not include a ToR then a recommendation should be given to ensure that all evaluations include the ToR in the future. Some evaluations may be flawed because the ToRs are inappropriate, too little time, etc. Or they may succeed despite inadequate ToRs. This should be noted under vii in the next section.

The evaluation successfully addresses the Terms of Reference. It appears, however, that some changes were made to the evaluation approach after the ToRs were created, and these changes were not thoroughly discussed within the report. For instance, the ToRs request that "consideration should be given to gathering views from people with different types of disabilities representing different constituencies" (p. 57), yet people with disabilities were not explicitly included as participants in the evaluation. The reasoning behind this change should be provided.

ii/ Identify aspects of good practice in the evaluation In terms of evaluation

The evaluation reflects best practices in its excellent description of the object of evaluation and its context; assigning contribution for results to different stakeholders; assessing the reasons for successes and failures; and integrating a human rights based analysis and writing style.
14 iii/ Identify aspects of good practice of the evaluation In terms of programmatic, sector specific, thematic expertise

The evaluator demonstrates excellent technical knowledge regarding advocacy and promotion of the rights of persons with disabilities through his careful assessment of the programme logic and approach. An example of this is when the evaluator identified areas where the activities of the implementing partners did not correctly align with the organization's values and a rights-based approach to the promotion of the rights of persons with disabilities.

OVERALL RATING

Informed by the answers above, apply the reasonable person test to answer the following question: Ω/ Is this a credible report that addresses the evaluation purpose and objectives based on evidence, and that can therefore be used with confidence? This question should be considered from the perspective of UNICEF strategic management.

i/ To what extent does each of the six sections of the evaluation provide sufficient credibility to give the reasonable person confidence to act? Taken on their own, could a reasonable person have confidence in each of the six core evaluation elements separately? It is particularly important to consider: Is the report methodologically appropriate? Is the evidence sufficient, robust and authoritative? Do the analysis, findings, conclusions and recommendations hold together?

ii/ To what extent do the six sections hold together in a logically consistent way that provides common threads throughout the report? The report should hold together not just as individually appropriate elements, but as a consistent and logical whole.

Overall, the report is methodologically appropriate, using a mixed-methods approach, although more detailed information on sampling and analysis methods is required. The decision not to include children or persons with disabilities in the evaluation should also have been justified.
Additionally, the timeframe of the project and the scope of the evaluation should have been more clearly identified. The findings, conclusions, and recommendations are well described, insightful, and hold together well. However, the target audience for each recommendation should be better identified, and the inclusion of lessons learned would have strengthened the report. The report holds together in a logically consistent way that provides common threads throughout.

Highly Satisfactory

This is a credible report that addresses the evaluation purpose and objectives based on evidence and can be used with confidence. Overall, the report is methodologically appropriate, using a mixed-methods approach. More detailed information regarding the evaluation and methodology, including the evaluation scope and the sampling and analysis methods, would strengthen the report. In addition, the decision not to include children or persons with disabilities in the evaluation should have been justified. The findings, conclusions, and recommendations are well described, insightful, and hold together well. The report is strong at attributing results to different stakeholders and describing causality for successes and weaknesses. Although the recommendations are realistic, the target audience for each recommendation should be better identified. The inclusion of lessons learned would also have strengthened the report.

iii/ Are there any reasons of note that might explain the overall performance or particular aspects of this evaluation report? This is a chance to note mitigating factors and/or crucial issues apparent in the review of the report.

The evaluator cited a limited timeframe to prepare the evaluation report among a list of evaluation limitations, which may explain why some descriptive detail on the methodology is missing.

Executive Feedback on Overall Rating Issues for this section relevant for feedback to senior management (positives & negatives), & justify rating.
Up to two sentences The report is strong at describing the object of evaluation, its context, stakeholder contributions, attributing results to different stakeholders, describing causality for successes and failures, assessing the programme's M&E system, and integrating a human-rights and equity approach throughout. The report is weaker at describing the evaluation scope and objectives, identifying the level of participation of stakeholders in the evaluation process, providing actionable recommendations, and including lessons learned.
7. ASSESSING EXISTING INFORMATION 6. COMMUNITY SYSTEMS AND LEVEL INFORMATION MONITORING NEEDS: OF THE INFORMATION RIGHT TO ADEQUATE GAP ANALYSIS FOOD 7. ASSESSING EXISTING INFORMATION SYSTEMS AND INFORMATION
DFAT MONITORING AND EVALUATION STANDARDS December 2016 CONTENTS INTRODUCTION TO THE MONITORING AND EVALUATION (M&E) STANDARDS 2 STANDARD 1 INVESTMENT DESIGN (REQUIRED ELEMENTS RELATING TO M&E) 6 STANDARD
Monitoring and Evaluation of Sports in Development (SiD) Interventions presented by: Pamela K. Mbabazi, Ph.D. Faculty of Development Studies, Department of Development Studies Mbarara University of Science
Published: 17 March 2015 Background M&E/Learning Guidelines for IPs (To be used for preparation of Concept Notes and Proposals to LIFT) LIFT's new strategy (2015-2018) envisions LIFT as a knowledge platform.
Guidelines for Evaluation Terms of Reference 1. The present Guidelines for Evaluation Terms of Reference (TOR) form part of a common set of ITC Evaluation Guidelines that operationalize the ITC Evaluation
National Occupational Standards National Occupational Standards for Youth Work Contents Introduction 5 Section 1 S1.1.1 Enable young people to use their learning to enhance their future development 6 S1.1.2
How to Measure and Report Social Impact A Guide for investees The Social Investment Business Group January 2014 Table of contents Introduction: The Development, Uses and Principles of Social Impact Measurement
PARTICIPATORY SELF-EVALUATION REPORTS: GUIDELINES FOR PROJECT MANAGERS 1 Table of Contents Background A. Criteria B. Definition C. Rationale D. Initiator E. Timing F. Planning G. Involvement of the Independent
01 NRC PROTECTION POLICY Photo: NRC / Christian Jepsen 2014 OUR VISION: RIGHTS RESPECTED PEOPLE PROTECTED Our Mission Statement: NRC works to protect the rights of displaced and vulnerable persons during
Group Manager Children s Social Care Services Role Profile: Grade: Accountable to: Accountable for: Senior Manager Hay B Service Leader 3-6 direct line reports, plus circa 48 staff that these direct reports
Appendix 1: Performance Management Guidance The approach to Performance Management as outlined in the Strategy is to be rolled out principally by Heads of Service as part of mainstream service management.
WIPO 2010 2015 EVALUATION STRATEGY IAOD The damage does not come from findings from evaluations but from people trying to hide it. (Michael Quinn Patton, organizational development and evaluation consultant
Annex 1 Call for proposals EACEA No 34/2015 Erasmus+ Key Action 3: Support for policy reform - Initiatives for policy innovation European policy experimentations in the fields of Education, Training and
Part 1. Concepts, Tools and Principles 3 Overview Part 1. MfDR Concepts, Tools and Principles M anaging for Development Results (MfDR) is multidimensional, relating back to concepts about how to make international
: Evaluator and Planning Team Job Descriptions I. Overview II. Sample Evaluator Job Description III. Evaluator Competencies IV. Recruiting members of your strategic evaluation planning team V. Recruiting
LSI YW00 Youth Work National Occupational Standards Introduction Youth Work National Occupational Standards Introduction Contents: Suite Overview...2 Glossary......8 Functional Map.11 List of Standards..15
ETHICAL GUIDELINES FOR EVALUATION 21 July 2007 The Ethical Guidelines for Evaluation were formally approved by UNEG members at the UNEG Annual General Meeting 2008. UNEG2008 United Nations Evaluation Group
Monitoring and Evaluation Framework and Strategy GAVI Alliance 2011-2015 NOTE TO READERS The 2011-2015 Monitoring and Evaluation Framework and Strategy will continue to be used through the end of 2016.
The criteria and guidelines Best Belgian Sustainability Report 2013 - Edition 2014 In order to progressively move to G4 guidelines as adopted in May 2013 by the Global Reporting Initiative (GRI) and to
Performance audit report Ministry of Education: Monitoring and supporting school boards of trustees Office of the Auditor-General Private Box 3928, Wellington 6140 Telephone: (04) 917 1500 Facsimile: (04)
Policy for Monitoring and Evaluation of Compacts and Threshold Programs May 1, 2012 Version: Final Submitted by: Department of Policy and Evaluation Millennium Challenge Corporation 875 15th Street N.W.
Part 1: Mainstreaming equity and inclusion in WASH programmes and influencing Introduction Why do we need a toolkit? Equity and inclusion is a core principle of WaterAid s work. To ensure we achieve our
PROJECT AUDIT METHODOLOGY 1 "Your career as a project manager begins here!" Content Introduction... 3 1. Definition of the project audit... 3 2. Objectives of the project audit... 3 3. Benefit of the audit
For Official Use DCD/DAC/EV(2000)6 DCD/DAC/EV(2000)6 English text only For Official Use Organisation de Coopération et de Développement Economiques OLIS : 25-Oct-2000 Organisation for Economic Co-operation
Ref. Ares(2014)571140-04/03/2014 DG ENLARGEMENT SECTOR BUDGET SUPPORT GUIDELINES EXECUTIVE SUMMARY January 2014 TABLE OF CONTENTS Introduction 1. RATIONALE FOR BUDGET SUPPORT 1.1 What is Budget Support?
Australian ssociation Practice Standards for Social Workers: Achieving Outcomes of Social Workers Australian Association of Social Workers September 2003 Contents Page Introduction... 3 Format of the Standards...
Commonwealth of Australia 2014 This work is copyright. In addition to any use permitted under the Copyright Act 1968, all material contained within this work is provided under a Creative Commons Attribution
The University of Adelaide Business School MBA Projects Introduction There are TWO types of project which may be undertaken by an individual student OR a team of up to 5 students. This outline presents
HPF Tool Template for the Performance and Accountability Statement Second round reviews Amended December 2012 The High Performance Framework was developed by the Public Sector Performance Commission. This
THE DCED STANDARD FOR MEASURING RESULTS IN PRIVATE SECTOR DEVELOPMENT CONTROL POINTS AND COMPLIANCE CRITERIA Version VII, April 2015 The DCED Standard for Measuring Results in Private Sector Development
Research Design Service London Writing a Qualitative Research Proposal When to use qualitative methods Qualitative methods should be used when the aim is to: Investigate complex phenomena that are hard
Gender-Responsive Results-Based Management Results-based management (RBM) is defined as a management strategy by which all development actors on the ground ensure that their processes, products and services
National Standards for Disability Services DSS 1504.02.15 Version 0.1. December 2013 National Standards for Disability Services Copyright statement All material is provided under a Creative Commons Attribution-NonCommercial-
Preparing for Inspection - an aide memoire for governors This aide memoire is to support governors in preparing for an Ofsted inspection. It includes a tool at the end to summarise your preparation ready
pm4dev, 2016 management for development series Project Scope Management PROJECT MANAGEMENT FOR DEVELOPMENT ORGANIZATIONS PROJECT MANAGEMENT FOR DEVELOPMENT ORGANIZATIONS A methodology to manage development
Quick Guide For Administrators Based on TIP 52 Clinical Supervision and Professional Development of the Substance Abuse Counselor Contents Why a Quick Guide?...2 What Is a TIP?...3 Benefits and Rationale...4
Foundation Document UNEG Ethical Guidelines for Evaluation UNEG, March 2008 The Ethical Guidelines for Evaluation were formally approved by UNEG members at the UNEG Annual General Meeting 2008. These Guidelines
Crosswalk of the New Colorado Principal Standards (proposed by State Council on Educator Effectiveness) with the Equivalent in the Performance Based Principal Licensure Standards (current principal standards)
HAP 007 Standard in Humanitarian Accountability and Quality Management Adopted by HAP on th 0 January 007 Humanitarian Accountability Partnership-International Prepared by: HAP Editorial Steering Committee
EVALUATION REPORT EVALUATION OF UNICEF S EMERGENCY PREPAREDNESS SYSTEMS Executive Summary EVALUATION OFFICE DECEMBER 2013 EVALUATION REPORT EVALUATION OF UNICEF S EMERGENCY PREPAREDNESS SYSTEMS Executive
ASAE s Job Task Analysis Strategic Level Competencies During 2013, ASAE funded an extensive, psychometrically valid study to document the competencies essential to the practice of association management
the role of the head of internal audit in public service organisations 2010 CIPFA Statement on the role of the Head of Internal Audit in public service organisations The Head of Internal Audit in a public
Planning, Monitoring, Evaluation and Learning Program Program Description Engineers Without Borders USA EWB-USA Planning, Monitoring, Evaluation 02/2014 and Learning Program Description This document describes
IFRC Framework for Evaluation Planning and Evaluation Department (PED) IFRC Secretariat February 2011 1 IFRC Framework for Evaluation 1. INTRODUCTION The purpose of this IFRC Framework for Evaluation (hereafter
PACIFIC ISLANDS FORUM SECRETARIAT PIFS(12)FEMK.05 FORUM ECONOMIC MINISTERS MEETING Tarawa, Kiribati 2-4 July 2012 SESSION 2 FEMM BIENNIAL STOCKTAKE 2012 The attached paper, prepared by the Forum Secretariat,
AQTF Audit Handbook The following publication was endorsed by the former Standing Council for Tertiary Education Skills and Employment (SCOTESE). On 13 December 2013, COAG agreed that its council system
TOR - Consultancy Announcement Final Evaluation of the Cash assistance and recovery support project (CARSP) Organization Project Position type Adeso African Development Solutions and ACTED - Agency for
DAC Guidelines and Reference Series Quality Standards for Development Evaluation DAC Guidelines and Reference Series Quality Standards for Development Evaluation DEVELOPMENT ASSISTANCE COMMITTEE ORGANISATION
Practice Manager Children s Social Care Role Profile: Practice Manager Grade: Grade 12 Accountable to: Accountable for: Role Context & Purpose Group Manager Line management of a local team of 5-7 fte staff
Sponsorship in Programming tool Child Selection Objective: A tool for selection of children in World Vision child sponsorship We ve learned some things about selecting children. It is not a separate sponsorship
0 INTRODUCTION The development of the Merlin Standard has been progressed as a joint exercise between the Department for Work and Pensions (DWP) and its providers operating in the Welfare to Work (W2W)
DRIVING FORWARD PROFESSIONAL STANDARDS FOR TEACHERS The Standards for Registration: mandatory requirements for Registration with the General Teaching Council for Scotland December 2012 Contents Page The
Performance Management Date: November 2012 SSBA Background Document Background 3 4 Governance in Saskatchewan Education System 5 Role of School Boards 6 Performance Management Performance Management Overview
NATIONAL STANDARDS FOR DISABILITY SERVICES Full version web accessible FOR MORE INFORMATION CONTACT - QualityAssurance@dss.gov.au NATIONAL STANDARDS FOR DISABILITY SERVICES Copyright statement All material
United Nations General Assembly Distr.: General 12 April 2011 A/HRC/RES/16/21 Original: English Human Rights Council Sixteenth session Agenda item 1 Organizational and procedural matters Resolution adopted
SDF 8/2-NM-4 CARIBBEAN DEVELOPMENT BANK SPECIAL DEVELOPMENT FUND (UNIFIED) STATUS REPORT ON THE IMPLEMENTATION OF THE GENDER POLICY AND OPERATIONAL STRATEGY Revised APRIL 2012 ABBREVIATIONS AMT - Advisory
LEVEL 5 Advanced Diploma in Purchasing and Supply Risk Management and Supply Chain Vulnerability L5-02 Senior Assessor s Report July 2012 L5-02 Senior Assessor Report July 2012 FV 1/8 SECTION A Candidates
PERFORMANCE EXPECTATION 1: Vision, Mission, and Goals PERFORMANCE EXPECTATION 1: Vision, Mission, and Goals Education leaders ensure the achievement of all students by guiding the development and implementation