Guidelines for Evaluation Terms of Reference




1. The present Guidelines for Evaluation Terms of Reference (TOR) form part of a common set of ITC Evaluation Guidelines that operationalize the ITC Evaluation Policy. The Evaluation Guidelines are separate documents containing more detailed explanations of the process and methodologies to be used for conducting evaluations. They set evaluation standards for planning, conducting and using evaluations, for developing and disseminating methodology, and for establishing the institutional mechanism for their implementation.

2. The Guidelines for Evaluation TOR are intended to assist evaluation managers (the programme/project manager, evaluation officer or other person in charge of the evaluation and monitoring function for the programme/project) in writing clear and detailed TOR for the evaluation of projects/programmes and of other objects that are not supported by a Logframe[1]. The TOR should respect and follow the Key Principles as defined in the ITC Evaluation Policy.

3. TOR are the starting point of the evaluation exercise. They reflect the understanding of the parties to the object being evaluated regarding the framework for the evaluation, and they constitute the basic agreement on the purpose of the evaluation. They sum up available knowledge, specify the scope of the evaluation and give an overview of management's requirements and expectations. TOR should include clear and detailed information on why and for whom the evaluation is being done and what it intends to accomplish.

4. TOR define the work to be carried out by the Evaluation Team. They suggest an evaluation method, delineate the procedures to be followed and form an integral part of the contract with the evaluators. TOR guide the evaluation process until the submission of the Evaluation Report. In particular, they should define how the evaluation will be accomplished, who will be involved in it and when the main requirements will be met.

5. Evaluation managers should use the TOR format given below. Major headings should be retained, but sub-headings may be added to reflect particular requirements and expectations for the evaluation.

6. Attached is the standard format for Evaluation TOR, starting with a sample cover page. The TOR document should not exceed 10 pages; supporting and relevant information should be placed in annexes. The layout and order of contents should follow those in the guidelines. The document should be typed in 1 1/2 line spacing, using Arial font size 11, on A4 format, and should be submitted electronically. Pages should be numbered consecutively with Arabic numerals appearing at the top right of each page, and paragraphs should be numbered.

[1] As defined in the ITC Evaluation Policy, evaluation objects include projects and programmes and also other objects that are not supported by a Logframe, such as ITC tools, methodologies, policies, strategies, work in specific countries and regions, and critical internal processes.

Date:

EVALUATION TERMS OF REFERENCE

Title of the Object of Evaluation
(Project or programme, thematic area, work in specific countries or regions, tools and methodologies, policies and strategies, critical internal processes, with the corresponding code/number)

Country(ies) / Region / Area of work

INTERNATIONAL TRADE CENTRE
THE JOINT AGENCY OF THE WORLD TRADE ORGANIZATION AND THE UNITED NATIONS
Geneva, Switzerland

CONTENTS

LIST OF ACRONYMS
1. BACKGROUND INFORMATION
2. PURPOSE OF THE EVALUATION
3. SCOPE OF THE EVALUATION
4. EVALUATION METHODS
5. EVALUATION TEAM COMPOSITION
6. PLANNING AND IMPLEMENTATION ARRANGEMENTS

Annexes
1. Evaluation questions grid
2. Evaluation process timetable
3. Relevant materials

1. BACKGROUND INFORMATION

This section provides information on why, when and how the object being evaluated was established, including information on its concept, design and budget. The background should reflect any amendments or revisions of the original document(s) establishing the object being evaluated. The purpose, main objectives and expected results must be clearly stated. (The same applies to the corresponding performance indicators in the case of programmes/projects. The findings of the evaluation should be clearly compared to the expected results of the object being evaluated.) Finally, this section provides information on the authority and mandate that established the object being evaluated (decisions of the SMC, the Executive Director, or others). The background should also explicitly state that the UNEG Ethical Guidelines for Evaluation[2] apply to the conduct of evaluations in ITC in general and to the present evaluation in particular.

2. PURPOSE OF THE EVALUATION

Why is the evaluation being undertaken? The purpose of an evaluation is to measure achievements, outcomes and impacts, both positive and negative. In general, the overall purpose of the evaluation is to learn from the implementation of the object being evaluated, so that lessons can be drawn as a basis for improving the planning, design and management of the object in question, or of similar objects. The following questions should be considered when preparing this section:

- Who initiated the evaluation?[3]
- Why is the evaluation being undertaken, and why is it being undertaken now?
- What will the evaluation seek to accomplish?
- Who are the main stakeholders of the evaluation?

TOR need to be specific about the purpose of the evaluation, both to ensure that the evaluation serves the needs of its intended users and to enable its actual use through systematic follow-up on the implementation of the recommendations that are accepted.

3. SCOPE OF THE EVALUATION

This section is the core of the TOR. It describes what the evaluation will focus on and clearly states what should be evaluated. It contextualizes the evaluation exercise by explicitly identifying the issues considered crucial and strategic at the time. The scope should never lose sight of ITC's overall corporate development perspective: it should systematically require an assessment of whether, and to what extent, the object being evaluated contributes to a priority area or comparative advantage of ITC. The scope also includes the timeframe to be covered by the evaluation, its geographical coverage and its thematic coverage.

[2] The UNEG Ethical Guidelines for Evaluation are available at: http://www.uneval.org/papersandpubs/documentdetail.jsp?doc_id=102
[3] The evaluation might have been foreseen in the project document, or may have been requested by a donor. It may also have been inserted into the Evaluation Annual Plan by the Executive Director, or identified for strategic reasons by EMU, etc.

The scope details the issues to be covered and the corresponding evaluation criteria used to assess performance. It includes a list of specific and targeted evaluation questions that substantiate the focus of the evaluation. The evaluation questions guide the Evaluation Team in its task of producing the relevant findings, lessons learned and recommendations. To ensure due quality, it is fundamental that the number of evaluation questions be limited and proportionate to the resources available for conducting the evaluation. The Evaluation Report will clearly address the evaluation questions. Typically, the scope will build on a combination of the following evaluation questions:

Relevance
Does the object being evaluated address the identified needs / problems? Are the concept and design of the evaluation object an appropriate solution to these needs / problems? Were the objectives[4] of the evaluation object attainable? Are they still relevant? What is the value of the object in relation to other priority needs and efforts? Are the problems addressed still major problems?

Effectiveness
Is the evaluation object achieving satisfactory progress towards its stated objectives? Have the results[5] been achieved and, if not, has progress been made towards their achievement? Have the anticipated activities[6] and outputs[7] been delivered on time and according to specifications? What problems and constraints were encountered during implementation?

Efficiency
Are the effects[8] being achieved at an acceptable cost, compared with alternative approaches to accomplishing the same objectives? This will usually include an analysis of how efficiently planning and implementation are carried out, and an assessment of the extent to which the organizational structure, managerial support and coordination mechanisms used by ITC support the object being evaluated.

Impact
How useful were the results and outcomes[9]? What is the assessment of ITC's contribution to human and institutional capacity building in terms of knowledge and competencies? What difference has the evaluation object made to beneficiaries / clients / stakeholders? What are the trade, economic, social, technical, environmental and other effects on individuals, communities and institutions, whether short-, medium- or long-term, intended or unintended, positive or negative, at a micro or macro level?

Sustainability
Will the results continue after funding ends? Has ITC's capacity building led to a change in the behaviour of beneficiaries? Are they willing / able to continue? Are the beneficiary institutions developing the capacity and motivation to use / deliver new competences? Can this become financially self-sustaining?

[4] Objectives: Intended impact contributing to physical, financial, institutional, social, environmental, or other benefits to a society, community or group of people via one or more development interventions. Glossary of Key Terms in Evaluation and Results Based Management, OECD-DAC, 2002.
[5] Results: The output, outcome or impact (intended or unintended, positive or negative) of a development intervention. Idem.
[6] Activities: Actions taken or work performed through which inputs are mobilized to produce specific outputs. Idem.
[7] Outputs: The products and services which result from the completion of activities within a development intervention. UNDG Results-Based Management Terminology, http://www.undg.org/index.cfm?p=224.
[8] Effects: Intended or unintended change due directly or indirectly to an intervention. Glossary of Key Terms in Evaluation and Results Based Management, OECD-DAC, 2002.
[9] Outcomes: The intended or achieved short-term and medium-term effects of an intervention's outputs. Outcomes represent changes in development conditions which occur between the completion of outputs and the achievement of impact. Idem.

4. EVALUATION METHODS

This part suggests the key elements of the methodology to be used by the Evaluation Team. Methodology is the approach used to identify information sources, to collect information during an evaluation and to analyze those data. The quality of the evaluation depends to a large extent on the methods used: they have to ensure that the evaluation findings and the corresponding recommendations are based on robust and reliable data and information. Evaluation methods also have to be consistent with the evaluation questions to be answered, the skills and competencies required of the Evaluation Team, and time and budget constraints. Evaluation methods may include:

- Document review (desk study), covering all major documents such as the project document and its revisions, progress and monitoring reports, terminal reports, self-evaluations, etc.;
- Interviews with all key informants and key players;
- Field visits;
- Questionnaires;
- Participation of partners and stakeholders;
- Benchmarking.

In general, based on the TOR and in consultation with EMU, the Evaluation Team takes the final decision on the evaluation methods most appropriate to the purpose of the evaluation. The TOR must clearly state that the Evaluation Team should present a detailed statement of evaluation methods, including a description of data collection instruments and procedures, information sources and procedures for analyzing the data.

5. EVALUATION TEAM COMPOSITION

Based on the character of the object being evaluated and the purpose of the evaluation, this section details the number of evaluators in the Evaluation Team and their areas of expertise. The required qualifications should be consistent with the suggested evaluation methods and the requested deliverables. The section also suggests a division of labour between team members. An evaluation expert, normally nominated by EMU, should lead the team, while the other team members may be specialists in their respective areas. The team leader suggests specialist team members according to the overall purpose of the evaluation. While in some cases evaluators may be nominated by other stakeholders, the TOR should make clear that evaluators shall have no past connection with the object of the evaluation, that they will not act as representatives of any party, and that they will remain independent and impartial, so that conflicts of interest are avoided and the credibility of the evaluation process and product is not undermined.

6. PLANNING AND IMPLEMENTATION ARRANGEMENTS

This section sets out a detailed programme for the planning of the evaluation. It explains how the evaluation is going to be carried out and details the main phases of the evaluation exercise.

Management arrangements
This sub-section describes who does what in the evaluation process. The roles and responsibilities of all parties involved in conducting the evaluation (the Evaluation Team, EMU and other concerned parties, such as the donor(s)) should be clearly stated.

Timeframe for the evaluation process
The programme of the evaluation should be detailed with a precise timeframe. A table summarizing the timeframe for the evaluation process should be used.

Resources required and logistical support
This sub-section provides information on the budget required to undertake the evaluation. Evaluations have to be financially feasible: expectations have to be achievable given the amount of available resources. This sub-section should clearly show the balance between the required level of effort and the assigned costs. Levels of effort and costs are planned on a per diem, weekly or monthly basis, or as a lump sum, depending on the agreed basis for payment. The following tables might be used as templates to describe planned costs and levels of effort.

Projected level of effort

    Activity        Number of days/weeks/months
                    Headquarters    Field    Total

Projected cost

    Type[10]        Cost ($)
                    Headquarters    Field    Total
    Total

Source: How to Perform Evaluations - Model TOR, CIDA.

This sub-section also identifies the main types of administrative and logistical support to be provided to the Evaluation Team and points out the roles and responsibilities for providing such support.

Expected deliverables of the evaluation
Finally, the TOR list the requested deliverables with the corresponding target dates. One or more deliverables may be requested from the evaluator(s) at the end of each phase. Examples of deliverables are the evaluation workplan and the Evaluation Report.

[10] Common types of cost are personnel, travel and out-of-pocket expenses (supplies, equipment and materials, communication, etc.).

10 September 2008