1 Monitoring and Evaluation of Sports in Development (SiD) Interventions presented by: Pamela K. Mbabazi, Ph.D. Faculty of Development Studies, Department of Development Studies, Mbarara University of Science and Technology (MUST)
2 M & E in Sport in Development Focus of the Presentation What is Monitoring & Evaluation (M & E)? Why do M & E? M & E in Sport & Development Interventions Conventional Versus Participatory M & E Performance Monitoring & Evaluation Creative M & E Dos & Don'ts of M & E Some Lessons Learnt
3 M & E in Sport in Development Sport contributes to the personal development and well-being of people and brings wider benefits to the community. Comic Relief Research Initiative It is important to enhance the skills and knowledge of agencies using sport as a tool to improve the lives of especially vulnerable populations, and to develop sport strategies that are tailored to the needs of these individuals and their communities. Equipping them with the knowledge and insight to monitor and evaluate their interventions contributes greatly to the success of their projects and programs. Comic Relief Research Initiative
4 Definition of Monitoring & Evaluation Monitoring is about systematically collecting information that will help you answer questions about your project. You can use this information to report on your project and to help you evaluate. Evaluation is about using monitoring and other information you collect to make judgments about your project. It is also about using the information to make changes and improvements.
5 What is Monitoring & Evaluation (M&E)? Monitoring... is the systematic and routine collection of information from projects and programs for four main purposes: To learn from experience to improve practices and activities in the future; To ensure internal and external accountability for the resources used and the results obtained; To take informed decisions on the future of the initiative; To promote empowerment of beneficiaries of the initiative. Monitoring is a periodically recurring task beginning as early as the planning stage of a project or program. It allows results, processes and experiences to be documented and used as a basis to steer decision-making and learning processes. It is checking progress against plans. The data acquired through monitoring is used for evaluation.
6 What is Monitoring & Evaluation (M&E)? Evaluation...is assessing, as systematically and objectively as possible, a completed project or program (or a phase of an ongoing project or program that has been completed). Evaluations appraise data and information that inform strategic decisions, thus improving the project or program in the future. Evaluations should help to draw conclusions about five main aspects of the intervention: relevance effectiveness efficiency impact sustainability Information gathered in relation to these aspects during the monitoring process provides the basis for the evaluative analysis.
7 Monitoring & Evaluation (M&E) M&E is an embedded concept and constitutive part of every project or program design (a "must be"). M&E is not a control instrument imposed by the donor or an optional accessory ("nice to have") of any project or program. M&E is ideally a dialogue on development and its progress between all stakeholders. Monitoring is integral to evaluation. During an evaluation, information from previous monitoring processes is used to understand the ways in which the project or program developed and stimulated change.
8 Monitoring & Evaluation (M&E) Monitoring focuses on measuring the following aspects of an intervention: Quantity and quality of the implemented activities (outputs: What do we do? How do we manage our activities?) Processes inherent to a project or program (outcomes: What were the effects/changes that occurred as a result of the intervention?) Processes external to an intervention (impact: Which broader, long-term effects were triggered by the implemented activities in combination with other environmental factors?) The evaluation process is an analysis or interpretation of the collected data which delves deeper into the relationships between the results of the project/program, the effects produced by the project/program and its overall impact.
9 M&E for Credibility and Transparency Adopt and use appropriate M&E tools to measure SiD programs, focusing on three issues: Efficiency (e.g. cost per activity, cost per participant, ratio of staff to participants) Effectiveness (outputs and outcomes, e.g. impact of participation on individuals in areas of physical health, life/leadership skills, AIDS awareness and behavior, improved self-esteem/confidence) Strategic outcomes (e.g. broader impact of programs on the community, improved perception of quality of life, improved attitude toward another culture or group in society)
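As an illustration, the efficiency measures named on this slide can be computed directly from basic program records. The sketch below is a minimal example, not part of any real SiD program's toolkit; the function name, field names and all figures are invented for illustration.

```python
# Minimal sketch: computing the efficiency indicators listed above
# (cost per activity, cost per participant, staff-to-participant ratio)
# from basic program records. All figures are invented examples.

def efficiency_metrics(total_cost, n_activities, n_participants, n_staff):
    """Return the three efficiency indicators as a dictionary."""
    return {
        "cost_per_activity": total_cost / n_activities,
        "cost_per_participant": total_cost / n_participants,
        "staff_to_participant_ratio": n_staff / n_participants,
    }

# Example: a hypothetical sports program with invented numbers.
metrics = efficiency_metrics(
    total_cost=12000,     # total program spend
    n_activities=40,      # training sessions held
    n_participants=300,   # young people reached
    n_staff=15,           # coaches and volunteers
)
print(metrics)
```

Tracking these ratios over successive reporting periods makes trends visible (for example, a rising cost per participant can flag falling attendance).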
10 Why M&E is important M & E provides the only consolidated source of information showcasing project progress. It allows actors to learn from each other's experiences, building on expertise and knowledge. It contributes to transparency and accountability, and allows lessons to be shared more easily. It reveals mistakes and offers paths for learning and improvement. It provides a basis for questioning and testing assumptions. It provides a means for agencies to learn from their experiences and incorporate them into policy and practice. It provides a way to assess the crucial link between implementers and beneficiaries on the ground and decision-makers. It adds to the retention and development of institutional memory. It provides a more robust basis for raising funds and influencing policy.
11 Why M & E (cont'd)? Managers of Programs: Need to know whether organisational objectives and planned program results are being accomplished in order to make adjustments, re-plan and reassign resources. Need to know the long-term or broader impacts of their work (on clients etc.) in order to make judgements about the importance of continuing the work, changing strategies and developing new products/programs. Stakeholders & other donors: Need to know whether organisational objectives and planned program results are being accomplished in order to make funding and licensing decisions. Need to know about the long-term or broader effects of the work (on clients etc.) in order to make decisions about continuing funding, expansion or replication of the program/project. Clients or Customers: Need to know about the long-term or broader impacts of a program, product or service in order to decide whether to use the product/service.
12 M&E - Sport in Development M&E of Sport-for-Development interventions is of high priority. The relatively recent recognition of sport as a tool in development requires thorough assessment of the value of sport in development and humanitarian disaster contexts. Sport can add value to the development of individuals, organizations and whole communities irrespective of the level of development. Despite this broadly shared conviction, there is still a lack of substantiated evidence to support the purported potential of sport. Effective, transparent and (if possible) comparable M&E must therefore take place to further determine the inherent benefits, risks and limitations of sport and physical activity in development.
13 M&E In Sport for Development Sports-based projects in the community address direct needs for diversification of free time and development of physical capacity, and have other long-term social impacts like: Social inclusion Solidarity and team spirit Leadership skills etc. According to Cunningham & Beneforti (2005), the social impact of sport in development comprises eight indicators: Health and quality of life Physical environment Economy and employment Crime and security Education and training Physical activity Leisure and sport Social relations and networking
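To make such indicators operational, a program can record a baseline and an endline score for each one and report the change. The sketch below assumes a simple 0-10 self-assessment scale, which is an invention for this example; the indicator names follow Cunningham & Beneforti (2005), but the scores themselves are made up.

```python
# Sketch: tracking baseline vs endline scores for the eight
# Cunningham & Beneforti (2005) social-impact indicators.
# The 0-10 scale and all scores are invented for illustration.

INDICATORS = [
    "Health and quality of life",
    "Physical environment",
    "Economy and employment",
    "Crime and security",
    "Education and training",
    "Physical activity",
    "Leisure and sport",
    "Social relations and networking",
]

def indicator_changes(baseline, endline):
    """Return the change per indicator (endline minus baseline)."""
    return {name: endline[name] - baseline[name] for name in INDICATORS}

baseline = {name: 4 for name in INDICATORS}   # invented uniform baseline
endline = dict(baseline, **{
    "Physical activity": 7,
    "Social relations and networking": 6,
})

for name, change in indicator_changes(baseline, endline).items():
    print(f"{name}: {change:+d}")
```

In practice the scores would come from surveys or group self-assessments rather than a single number, but the baseline/endline comparison is the core of the reporting.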
14 Conventional Versus Participatory Tools of M & E Several practitioners distinguish between conventional and participatory M & E. Narayan (1993) offers a useful summary of the differences between the two:
Why: Conventional evaluation: accountability, usually summary judgements about the project to determine if funding continues. Participatory evaluation: to empower local people to initiate, control and take corrective action.
Who: Conventional: external experts. Participatory: community members, project staff, facilitator.
What: Conventional: predetermined indicators of success, principally cost and production output. Participatory: people identify their own indicators of success.
How: Conventional: focus on scientific objectivity; distancing of evaluators from other participants; uniform complex procedures; delayed, limited access to results. Participatory: self-evaluation; simple methods adapted to local culture; open, immediate sharing of results through local involvement in evaluation processes.
When: Conventional: mid-term and completion. Participatory: any assessment for program improvement; merging of monitoring and evaluation, hence frequent small evaluations.
15 Performance Monitoring & Evaluation (Tips) "The self-assessment approach is important because it is a self-guided approach. If people discover for themselves what they did wrong and what they did well, then they can do better and change. Their attitude is impacted in the process." Jose de Souza Silva Performance Monitoring & Evaluation provides for active involvement in the M & E process of those with a stake in the program, i.e. Providers Partners Customers/clients/beneficiaries Any other interested parties Participation takes place throughout all stages of the M & E process, i.e. Planning & design Gathering & analysing the data Identifying findings, conclusions & recommendations
16 Performance Monitoring & Evaluation (Tips) Making M & E Participatory Design: Gather views of intended beneficiaries on how the program should operate and what the success criteria (indicators and targets) should be. Implementation: Involve beneficiaries in planning the performance monitoring and evaluation, i.e. what questions to ask, what trends to track. Ask some beneficiary representatives to be on the project M & E team, or at least ask them to help as data collectors, perhaps in a mid-term evaluation. Involve beneficiaries in the analysis of monitoring & evaluation findings and in the formulation of evaluation recommendations. Transfer post-project M & E responsibility to beneficiaries: beneficiaries should continue with M & E.
17 Performance Monitoring & Evaluation (Tips) Characteristics of Participatory Evaluation Participant focus & ownership: oriented to the needs of stakeholders and not those of the donor. The donor agency only helps participants conduct their own evaluation, thus building ownership of and commitment to results and facilitating follow-up action. Scope of participation: the range of participants and roles varies. Some studies include a few beneficiaries while others include all stakeholders. Participant negotiations: participating groups meet to communicate, negotiate and reach a consensus on evaluation findings, solve problems and make plans to improve performance. Diversity of views: views of all participants are sought and recognised. The most powerful stakeholders allow participation of the less powerful. Learning process: the process is a learning experience for participants. Emphasis is on identifying lessons learned that will help participants improve program implementation and assess whether targets have been achieved.
18 Performance Monitoring & Evaluation (Tips) (cont'd) Flexible design: while some preliminary planning for the M & E may be necessary, design issues are decided (as much as possible) in the participatory process. Generally, evaluation questions and data collection and analysis methods are determined by the participants, not by outside evaluators. Empirical orientation: good participatory evaluations are based on empirical data. Old saying: "What gets measured gets done." Use of facilitators: participants actually conduct the evaluation, not outside evaluators as is traditional. However, one or more outside experts usually serve as facilitators, i.e. in supporting roles as mentor, trainer, group processor, negotiator or methodologist. Information > Decisions > Actions > Results > Information (individuals in a strong learning organisation will aggressively seek out information from all relevant sources).
19 Another Tool: Creative Monitoring & Evaluation Creative M&E is understood as a participatory approach which combines traditional, standardized M&E tools with alternative, innovative M&E tools. Such innovative instruments are not at all intended to substitute traditional M&E tools, but should complement them. The creative M&E approach allows for monitoring gaps to be filled, and to appraise projects/programs from different perspectives. Creative monitoring can provide a more complete image and understanding of what the project achieved.
20 Creative Monitoring & Evaluation (cont'd) It is widely recognized that the effects of Sport & Development programs may not be easily measured and evaluated due to their focus on social and psychosocial change triggered by sports. Methods used in creative monitoring can help to monitor, and consequently evaluate, the achieved change in knowledge or behavior. Challenges in measuring social and psychosocial impact: outcomes and impact in the social and psychosocial field are notoriously hard to measure and assess. The so-called soft outcomes, such as changes in attitudes, self-perception or certain skill areas, are typically defined as crucial but are often intangible and hard to measure.
21 Creative Monitoring & Evaluation (cont'd) It is increasingly acknowledged that a combination of qualitative and quantitative approaches to M&E has the potential to capture a wider set of outcomes (especially unexpected outcomes) and is more likely to provide a complete picture of the effects of the intervention. To combine approaches accordingly, the range of commonly accepted M&E tools needs to be broadened and adapted to innovative methods. Therefore, besides traditional, commonly standardized M&E tools (such as questionnaires, focus group discussions, interviews, etc.), alternative M&E tools (e.g. storytelling, performing arts, fine arts, photo monitoring, video documenting, etc.) should be considered.
22 Creative Monitoring & Evaluation (cont'd) Traditional approaches to M&E, often involving highly technical and specialized methods, may: not be fully conducive to implementation on the ground, due to lack of knowledge or understanding of M&E among non-specialist implementers; not be seen as integral to the intervention; not be entirely capable of reflecting both intended and unintended outcomes of the intervention; not be participatory or inclusive enough.
23 Creative Monitoring & Evaluation (cont'd) In light of the peculiarities of sport and play, focusing on recreational approaches towards learning, building relationships, developing understanding and generating greater autonomy, creative M&E techniques can offer a useful additional means of engaging with quality control and program improvement that is especially in line with the playful nature of sport and play activities. Furthermore, innovative M&E tools could be motivating for the staff as well as cost-effective.
24 Examples of Creative M&E Examples of a creative M&E approach currently being used in Sport & Development programs or projects include: Photo monitoring Storytelling Participatory video Problem tree Poetry club (International Platform on Sport and Development)
25 Dos & Don'ts of M & E Identify who the stakeholders are: persons involved in or affected by the evaluation should be identified so that their needs can be addressed. Do not use only external experts: the persons conducting the evaluation should be both trustworthy and competent so that the findings achieve maximum credibility and acceptance. Information collected should be broadly selected to address pertinent questions about the program and be responsive to the needs and interests of clients and other stakeholders. Findings should be disseminated to intended users in a timely fashion; do not delay with the results. Evaluations should be planned, conducted and reported in ways that encourage follow-up by stakeholders.
26 Dos & Don'ts of M & E (cont'd) Ensure that the data used in M & E are: Accurate: what evidence exists that the data are valid? Close to beneficiaries: from whom are these data collected? Credible: to what degree are the data believable and convincing? Dependable: what evidence exists that the data are reliable? Relevant: are the data well aligned with important questions? Representative: does the scale of the study match the program scale? Understandable: how comprehensible are the data to intended users? Well-collected: were the data collected without bias or collection errors? Evaluation procedures should be practical and keep disruption to a minimum.
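Several of these data-quality criteria (in particular accuracy and collection errors) can be checked mechanically before analysis begins. The following sketch shows simple completeness and range checks on monitoring records; the record fields, the expected age band and the percentage range are assumptions invented for this example, not a prescribed standard.

```python
# Sketch: basic completeness and range checks on monitoring records,
# illustrating the "accurate" and "well-collected" criteria above.
# Field names and valid ranges are invented for this example.

REQUIRED_FIELDS = ["participant_id", "age", "attendance_pct"]

def check_record(record):
    """Return a list of data-quality problems found in one record."""
    problems = []
    for field in REQUIRED_FIELDS:
        if record.get(field) is None:
            problems.append(f"missing {field}")
    age = record.get("age")
    if age is not None and not (5 <= age <= 30):
        problems.append(f"age out of expected range: {age}")
    pct = record.get("attendance_pct")
    if pct is not None and not (0 <= pct <= 100):
        problems.append(f"attendance_pct out of range: {pct}")
    return problems

records = [
    {"participant_id": "P01", "age": 14, "attendance_pct": 85},
    {"participant_id": "P02", "age": 140, "attendance_pct": None},
]
for r in records:
    print(r["participant_id"], check_record(r))
```

Running such checks at data-entry time, rather than at evaluation time, keeps collection errors from propagating into the final analysis.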
27 Some Lessons Learnt Use of evaluation results and processes does not come automatically; it must be planned and cultivated throughout the evaluation process. I have seen many M & E reports go unutilised. Use of participatory self-assessment methods creates an environment where participants are encouraged to express themselves freely and to be forthcoming with opinions and ideas; this enables follow-up on M & E recommendations. Using a learning-by-doing approach helps individuals in an organisation strengthen their own evaluation capacities and pass this knowledge on to others. Initial training in participatory M & E, however, hastens the process. Training in performance monitoring & evaluation, as well as in Results Based Management (RBM), is essential for all practitioners. In the process of working jointly with different stakeholders on M & E, partners often strengthen their relationships and build more networks. Need to build an evaluation culture in your organisations (RBM).
28 Some Links & References Dutch Platform Sport & Development Aaker J (1994): Looking Back and Looking Forward: A Participatory Approach in Evaluation. New York: PACT. Aubel J (1993): Participatory Program Evaluation: A Manual for Involving Stakeholders in the Evaluation Process. New York: PACT. Feuerstein M (1996): Partners in Evaluation: Evaluating Development and Community Programmes. Gosling L (ed.) (1995): Toolkits: A Practical Guide to Assessment, Monitoring, Review and Evaluation. London: Save the Children.