Independent Evaluation: Principles, Guidelines And Good Practice


The World Bank Development Grant Facility (DGF)
Technical Note, November 2003

Introduction

Global and regional partnership programs (GPPs) comprise multiple partners that operate across borders to tackle issues that cannot be addressed at the country level alone. They are increasingly important in achieving the Millennium Development Goals (MDGs) and in complementing and reinforcing the World Bank's country activities. As GPPs become mainstreamed into the Bank's business,1 and as the Bank puts new emphasis on learning and on demonstrating accountability for results, independent evaluation is taking on new importance. Independent evaluation is also used to inform the Bank's funding decisions for GPPs through the DGF. Evaluation provides critical evidence of the outcomes of global programs and their roles in enhancing the development impacts of partner and country efforts on the ground.

Box 1: Principles of Independent Evaluation
- Usefulness: For evaluation to affect decision making, findings must be perceived as useful and as geared to current operational concerns.
- Independence: For evaluation to be impartial, it must be free from bias in findings, analysis and conclusions. In turn, this means independence from line management.
- Credibility: Evaluation must be perceived as objective, rigorous and impartial. Credibility rests on the professional quality of evaluators and the rigor of methods.
- Transparency: Credibility and usefulness will also depend on the transparency of the evaluation: the ready availability of findings to all stakeholders.
Drawn from Lessons & Practices, Number 10, OED, World Bank, July 1997.

This note is intended for GPP managers and development partners. It summarizes the principles of good evaluation (Box 1), reviews guidelines (Box 2),2 and discusses good practices in the conduct and design of independent evaluations.
The DGF is the World Bank's grant mechanism for global and regional programs. It provides $175 million annually to some 50 GPPs. This note was prepared by Randall Purcell of the DGF Secretariat. The author is grateful to OED for drawing on its work on global programs. rpurcell@worldbank.org, tel: 202-473-5571

1 See The World Bank's Approach to Global Programs: An Independent Evaluation Phase 1, OED, World Bank, May 8, 2002.
2 The guidelines in this note (and in Box 2) are for DGF-supported programs but are broadly applicable to GPPs. The DGF approach to monitoring and evaluation is described in The Development Grant Facility: FY99 DGF Annual Review and Proposed FY00 Budget, July 26, 1999. In addition to independent evaluation, it requires programs to undertake the following:
- Progress/Grant Completion Reporting: The annual DGF application includes a progress report on activities and accomplishments. Bank task team leaders are also required to report on outputs and impacts of grant programs within six months of the closing date of the grant.
- Financial Reporting/Audit: The DGF requires from an external grant recipient the provision of an audited financial statement together with a statement detailing the specific use of DGF funds. DGF-supported programs may be audited by the Bank's Internal Audit Department. The results are reported to the Bank's Senior Management and the Audit Committee of the Board.

Box 2: Evaluation Guidelines 3

Evaluations should:
- review measurable indicators of performance, including outputs, outcomes and impacts, and assist programs to develop improved indicators (see Box 4).
- be conducted every three to five years (for programs that receive total DGF funding of $300,000 or more).
- be owned and utilized by all partners and not be driven solely by the needs of the Bank. Therefore, timing and coverage should take into account partner preferences.
- be overseen by the program's governing body and conducted with the assistance of program management. Evaluators should report to the governing body, not to management.
- be conducted by individuals or companies that have been at arm's length from the initiative.
- report on outputs, outcomes and impacts.
- address the performance of the governing body as well as of management.
- make recommendations for financial sustainability as well as program delivery.
- be discussed by the relevant Bank Sector Board and used by them as the basis for considering DGF funding requests.
- have terms of reference and selection of consultants cleared with the DGF Secretariat.
The DGF grant can cover a part of evaluation costs, which should also be borne by other partners.

The Purpose of Independent Evaluation

An independent evaluation should serve both as a learning tool and as a means to demonstrate accountability.

Learning. An independent evaluation is an effective means for learning how GPPs can be strengthened and how resources can be used more effectively. It provides an understanding of the following: 1) how well a program has articulated its vision and is achieving its mission; 2) the quality of activities and how useful they are in meeting clients' needs; 3) how capacities such as financial and executive management, governance and country-based resources affect quality. Evaluation that is driven by the need to make better program decisions should clarify with some specificity who needs to learn what. This frames the exercise and gives it greater legitimacy and focus. Thus, a primary audience for the evaluation should be agents for change within programs, including task team leaders and managers. Funders will also need assurance that they are supporting program learning. It is important that the evaluation go beyond asking, "Is this being done well?" to ask, "How can this be done better?"

Accountability. Because development resources are limited and the time frame to achieve the MDGs is short, donors are increasingly focusing on results. GPPs are under particular scrutiny because grant resources are especially scarce and because these programs are largely independently governed. Independent evaluation is crucial for demonstrating how resources have been spent and what results have been achieved. At the same time, it is important that the evaluation go beyond results and their attribution to ask: What would have happened if the program did not exist? What would happen if the program ceased to exist?

3 Requirements for DGF-supported programs. Examples of good practice for evaluation terms of reference and evaluation reports can be found on the DGF website.

Conduct and Form of the Evaluation

Independent evaluations should be carried out in ways that support the principles of good evaluation shown in Box 1. Important questions that need to be addressed are the following:

Who governs the evaluation process? The governing body, in collaboration with management, should formulate key questions in consultation with partners. Final decisions on terms of reference and selection of consultants should be made by the governing body; the evaluators should report directly to that body.

Who pays for the evaluation? Costs should be borne equitably by the various donors. No one donor should pay for or drive the process.

How much should the evaluation cost? Costs will vary depending mostly on the size of the program and the need for field work. Evaluations of newer programs that do not require extensive field investigation may typically cost between $75,000 and $150,000. Older programs may have more development impacts to demonstrate and would be more costly to evaluate. To contain costs (and to focus attention), evaluations should generally be limited to three to six months, depending on the size and complexity of the program.

How should evaluators be selected? Evaluators should be selected on the basis of their expertise in the field, the diverse points of view they bring to a team and their objectivity. Evaluators should have knowledge of evaluation methodology and be familiar with development issues and international organizations. To avoid conflict of interest, evaluators should have an arm's-length relationship with the program: they should have had no substantive prior involvement in its establishment or operation. Competitive bidding for consultants is encouraged but not required.

How should the evaluation be carried out? The evaluation will typically involve a desk review of documents, interviews with stakeholders and, where applicable, collection of data and evidence to determine development impacts. Because of the importance of assessing results at the country level (see below), the involvement of country counterparts and country clients is often crucial (see Box 3).4

4 For DGF-supported programs, TOR and consultants should be cleared with the DGF Secretariat.

Box 3: Approaches for Integrating Country Partners in Evaluations
- Invite recipient-country representatives to participate on the evaluation team.
- Form an advisory panel comprised of recipient-country nationals to advise the evaluation team and/or review its findings.
- Involve recipient-country nationals as key informants, focus-group members, researchers, interviewers, country/regional experts, etc.

Instead of a full-fledged evaluation, programs in the early stages of implementation might first undertake a process review in order to take stock of operating procedures and the adequacy of controls. Such an exercise is often limited to a study of documents and to interviews. As implementation progresses, subsequent evaluations would focus more on outcomes and impacts through assessments in the field.

What should be the scope of the evaluation? An independent evaluation should normally be done every three to five years. It should examine past performance, focusing on the period after the last evaluation. Based on past performance, it should develop lessons that can improve performance going forward. The evaluation should not be conducted simply to determine the extent to which a program worked but should also develop lessons that can improve the way it and future programs will function.

What should be evaluated? The evaluation should examine how activities have been carried out and how they could be done differently. The focus should be on the following:
- program performance, including an assessment of its mission, objectives, governance, policies, management, operational modalities and impacts;
- the performance of other partners, and coordination needed among them;4

4 The success of most GPPs will ultimately be measured by progress towards achievement of the MDGs. It is vital, therefore, that the evaluation examine the coordination needed among partners to achieve results at the country level.

- the perspectives of other stakeholders, including beneficiaries;
- the key exogenous or contextual factors that affect implementation of the program.

Programs that principally involve networking or advocacy should pay particular attention to how successfully the program has facilitated the participation and commitment of partners. Evaluations should examine progress against performance indicators established by the program (see Box 4). The World Bank's Operations Evaluation Department has developed a set of core evaluation questions designed to assess performance against such indicators (Box 5). Evaluations should focus on responding to these questions (or some version of them) and should provide evidence to substantiate findings and conclusions.

How are evaluation findings reported and made accessible? Evaluation reports typically include the following sections:
- summary and conclusions;
- description of program objectives, components and activities evaluated;
- methodology used to develop findings;5
- implementation experience;
- responses to core questions;
- summary of key findings, including strategy for financial sustainability or disengagement;
- conclusions and lessons learned;
- recommendations.

A completed evaluation should be sent first to the program's governing body. Key partners should have an opportunity to provide feedback before the evaluation is released to the public. With the approval of the governing body, the evaluation should be disseminated widely, including on the Internet. World Bank task team leaders should ensure that evaluations are discussed at Sector Board meetings and that a record of that discussion is made available to the DGF.

Box 4: Developing and Using Performance Indicators 6

What are performance indicators? Performance indicators are measures of program impacts, outcomes, outputs, and inputs. They are monitored throughout implementation to assess progress towards program objectives.

How should indicators be developed? Performance indicators should be based on an underlying logical framework that links program objectives with program components and their respective inputs, activities and outputs at different stages of implementation.

What do performance indicators look like? Examples of performance indicators are given below for a hypothetical research program.7
- Inputs: funding received, number of staff hired, equipment purchased
- Outputs: studies conducted, training courses organized
- Outcomes: government policies improved, donor investments committed, technologies developed
- Impacts: improved access to goods and services (e.g. vaccines, water, energy), health and nutrition status improved
It is useful to think of the results chain of indicators as a continuum mirroring the logical means-end relationship of the program: inputs yield outputs that contribute to outcomes and impacts.

How should indicators be utilized? The development of performance indicators should be an integral part of a program's strategic plan. Indicators should be utilized periodically in progress reports 8 and should be used to measure success in an independent evaluation.

5 The evaluation should include information on who commissioned it, what process was followed, how evaluators were selected and to whom they reported, who was consulted and what documents were reviewed.
6 Drawn from Performance Monitoring Indicators: A Handbook for Task Managers, OED, World Bank, 1996.
7 These are direct indicators. Indirect measures of performance might include indicators for risk, efficiency, effectiveness and sustainability (see Box 5).
8 The annual progress report to DGF should describe progress against previously developed indicators.
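The logical framework behind Box 4 lends itself to a simple data structure. The sketch below models the results chain for the hypothetical research program and flags levels that still lack indicators, which is one gap an evaluation would look for. All names here are illustrative and not part of the DGF guidance.

```python
from dataclasses import dataclass, field

# The four levels of the results chain, ordered from means to ends.
RESULTS_CHAIN = ("inputs", "outputs", "outcomes", "impacts")

@dataclass
class LogFrame:
    """A minimal logical framework: indicator lists grouped by results-chain level."""
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    outcomes: list = field(default_factory=list)
    impacts: list = field(default_factory=list)

    def missing_levels(self):
        """Levels with no indicators yet -- gaps an evaluation would flag."""
        return [level for level in RESULTS_CHAIN if not getattr(self, level)]

# The hypothetical research program from Box 4, with impact-level
# indicators not yet developed.
frame = LogFrame(
    inputs=["funding received", "staff hired", "equipment purchased"],
    outputs=["studies conducted", "training courses organized"],
    outcomes=["government policies improved", "donor investments committed"],
)
print(frame.missing_levels())  # → ['impacts']
```

The ordering of `RESULTS_CHAIN` preserves the means-end continuum the box describes: checking it in sequence shows how far along the chain a program's measurement framework currently reaches.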

Box 5: Key Evaluation Issues and Questions 9

Global Relevance
1. International consensus: To what extent is there an international consensus, either formal or informal, concerning global challenges and concerns in the sector, and that global collective action is required to address these challenges and concerns?
2. Relevance: To what extent is the program addressing global challenges and concerns in the sector, consistent with client countries' current development priorities and with the missions and strategies of partners?
3. Eligibility criteria: To what extent is the program providing global and regional public goods, supporting international advocacy to improve policies at the national level, producing and delivering cross-country, relevant lessons to client countries, and mobilizing substantial incremental resources?
4. Compatibility: To what extent do the activities of the program complement, duplicate or compete with other development instruments?

Outcomes, Impacts and their Sustainability
5. Efficacy: To what extent has the program achieved, or is it expected to achieve, its stated objectives?
6. Value added: To what extent is the program adding value to what partners are doing in the sector, and to what developing countries are doing in the sector in accordance with their own priorities?
7. Sustainability of outcomes and impacts: How resilient are the outcomes and impacts of the program?
8. Monitoring and evaluation: To what extent does the program have clear program and component objectives with verifiable indicators, a structured set of quantitative or qualitative indicators, systematic and regular processes for data collection and management, independence of program-level evaluations, and effective feedback from monitoring and evaluation?

(continued below)

9 Adapted from OED, Global Public Policies and Programs Study Background Paper, Annex A, October 8, 2003.

Box 5 (continued)

Organization, Management and Financing
9. Efficiency: To what extent has the program achieved, or is it expected to achieve, an efficient allocation of resources, benefits that are more cost effective than those that could be achieved by providing the same service on a country-by-country basis, and benefits that are more cost effective than those that could be achieved if individual contributors to the program acted alone?

Governance and management
10. Legitimacy: To what extent is the authority exercised by the global program effectively derived from those with a legitimate interest in the program (including donors, developing and transition countries, clients and other stakeholders), taking into account their relative importance?
11. Governance and management: To what extent is the governance and management of the program transparent in providing information about the program, clear with respect to roles and responsibilities, fair to clients, and accountable to donors, clients, scientists/professionals and other stakeholders?

Partnerships and participation
12. Partnerships and participation: To what extent do developing and transition country partners, clients and beneficiaries participate and exercise effective voice in the various aspects of program design, governance, implementation, monitoring and evaluation?

Financing
13. Financing: To what extent is funding positively or negatively affecting the strategic focus of the program, the governance and management of the program, and the sustainability of the program?
14. Bank's presence: To what extent has the Bank's presence as a partner catalyzed non-Bank resources for the program?

Risks and risk management
15. Risks and risk management: To what extent have the risks associated with the program been identified and effectively managed?

Partner Performance (examples given are for the World Bank)
16. Comparative advantage: To what extent are partners maximizing their comparative advantages in support of the program at the global level (global mandate and reach, convening power, mobilizing resources) and at the country level (multi-sector capacity, analytical expertise, country-level knowledge)?
17. Linkages to country operations: To what extent are there effective and complementary linkages, where needed, between global program activities and partner country activities, to the mutual benefit of each?
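Since the note asks evaluations to respond to Box 5's questions (or some version of them) and to substantiate each answer with evidence, a team may find it useful to track coverage of the question set mechanically. The sketch below is a hypothetical aid, not part of the OED framework; the area names follow the box, and everything else is illustrative.

```python
# Box 5's question numbers grouped by issue area.
AREAS = {
    "Global Relevance": [1, 2, 3, 4],
    "Outcomes, Impacts and their Sustainability": [5, 6, 7, 8],
    "Organization, Management and Financing": list(range(9, 16)),
    "Partner Performance": [16, 17],
}

def coverage(answered):
    """For each issue area, the share of its questions with documented evidence."""
    return {
        area: sum(q in answered for q in questions) / len(questions)
        for area, questions in AREAS.items()
    }

# A team that has so far documented evidence for questions 1 through 5
# has fully covered Global Relevance but none of Partner Performance:
print(coverage({1, 2, 3, 4, 5}))
```

A report drafted against such a checklist makes it easy to see, before release to the governing body, which issue areas still lack the evidence the note calls for.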