Results-Based Management (RBM) Guiding Principles
INDEX
1. PREFACE
2. BRIEF HISTORICAL BACKGROUND
3. WHAT IS RBM?
4. WHAT IS A RESULT?
5. HOW TO FORMULATE AN EXPECTED RESULT?
6. WHAT IS THE RELATIONSHIP BETWEEN INTERVENTIONS, OUTPUTS AND RESULTS?
7. MONITORING OF IMPLEMENTATION
8. ANNEXES

This document is based on the UNESCO Results-Based Programming, Management and Monitoring (RBM) Guiding Principles, UNESCO Paris, Bureau of Strategic Planning, January 2008, and translated into Russian by the UNESCO Cluster Office in Almaty.
1. PREFACE

It is said that if you do not know where you are going, any road will take you there. This lack of direction is what results-based management (RBM) is meant to avoid. It is about choosing a direction and destination first, deciding on the route and the intermediate stops required to get there, checking progress against a map and making course adjustments as required in order to realise the desired objectives.

For many years, the international community of organizations has been working to deliver services and activities and to achieve results in the most effective way. Traditionally, the emphasis was on managing inputs and activities, and it has not always been possible to demonstrate results in a credible way and to the full satisfaction of taxpayers, donors and other stakeholders. Their concerns are straightforward and legitimate: they want to know what use their resources are being put to and what difference these resources are making to the lives of people. In this vein, RBM was especially highlighted in the 2005 Paris Declaration on Aid Effectiveness as part of the efforts to work together in a participatory approach to strengthen country capacities and to promote accountability of all major stakeholders in the pursuit of results.

It is often argued that complex processes such as development are about social transformation, processes which are inherently uncertain, difficult and not fully controllable, and for which, therefore, no one can be held responsible. Nonetheless, these difficult questions require appropriate responses from the professional community and, in particular, from multilateral organizations, so that they can report properly to stakeholders, learn from experience, identify good practices and understand where improvement is needed. The RBM system aims to respond to these concerns by setting out clear expected results for programme activities, by establishing performance indicators to monitor and assess progress towards the expected results, and by enhancing the accountability of the organization as a whole and of the persons in charge. It helps to answer the "so what?" question, recognizing that successful implementation of programmes is not necessarily equivalent to actual improvements in the development situation.

This paper is intended to assist in understanding and using the basic concepts and principles of results-based management. The general information on the RBM concept in this document is based on materials of several UN agencies. Bearing in mind that different actors use different RBM terminology, reflecting their specific contexts, it is important to ensure that the definitions are clear to everyone involved in results-based management within an organization. In spite of the different terminology used by different actors, the chain itself proceeds from activities to results and impacts.
2. BRIEF HISTORICAL BACKGROUND

As such, the concept of RBM is not really new. Its origins date back to the 1950s. In his book The Practice of Management, Peter Drucker introduced for the first time the concept of Management by Objectives (MBO) and its principles:
- cascading of organizational goals and objectives;
- specific objectives for each member of the organization;
- participative decision-making;
- an explicit time period;
- performance evaluation and feedback.

As we will see further on, these principles are very much in line with the RBM approach. MBO was first adopted by the private sector and then evolved into the Logical Framework (logframe) for the public sector. Originally developed by the United States Department of Defense and adopted by the United States Agency for International Development (USAID) in the late 1960s, the logframe is an analytical tool used to plan, monitor and evaluate projects. It derives its name from the logical linkages set out by the planners to connect a project's means with its ends.

During the 1990s, the public sector was undergoing extensive reforms in response to economic, social and political pressures. Public deficits, structural problems, growing competitiveness and globalization, shrinking public confidence in government, and growing demands for better and more responsive services as well as for more accountability were all contributing factors. In the process, the logical framework approach was gradually introduced in the public sector in many countries (mainly member States of the Organisation for Economic Co-operation and Development (OECD)). During the same decade this morphed into RBM as an aspect of the New Public Management, a label used to describe a management culture that emphasizes the centrality of the citizen or customer as well as the need for accountability for results. This was followed by the establishment of RBM in international organizations. Most of the United Nations system organizations were facing similar challenges and pressures from Member States to reform their management systems and to become more effective, transparent, accountable and results-oriented. A changeover to a results-based culture is, however, a lengthy and difficult process that calls for the introduction of new attitudes and practices as well as for sustainable capacity-building of staff.
3. WHAT IS RBM?

Results-based management (RBM) can mean different things to different people and organizations. A simple explanation is that RBM is a broad management strategy aimed at changing the way institutions operate, by improving performance, programmatic focus and delivery. It reflects the way an organization applies processes and resources to interventions targeted at commonly agreed results.

Results-based management is a participatory and team-based approach to programme planning that focuses on achieving defined and measurable results and impact. It is designed to improve programme delivery and to strengthen management effectiveness, efficiency and accountability. RBM helps move the focus of programming, managing and decision-making from inputs and processes to the objectives to be met. At the planning stage it ensures that the sum of the interventions is necessary and sufficient to achieve an expected result. During the implementation stage RBM helps to ensure and monitor that all available financial and human resources continue to support the intended results.

To maximize relevance, the RBM approach must be applied, without exception, to all organizational units and programmes. Each is expected to define anticipated results for its own work which, in an aggregated manner, contribute to the achievement of the overall or high-level expected outcomes for the organization as a whole, irrespective of the scale, volume or complexity involved.

RBM seeks to overcome what is commonly called the activity trap, i.e. getting so involved in the nitty-gritty of day-to-day activities that the ultimate purpose or objectives are forgotten. This problem is pervasive in many organizations: project or programme managers frequently describe the expected results of their project or programme as "We provide policy advice to partners", "We train journalists for the promotion of freedom of expression" or "We do research in the field of fresh water management", focusing more on the type of activities undertaken than on the ultimate changes that these activities are supposed to induce, e.g. in relation to a certain group of beneficiaries.

An emphasis on results requires more than the adoption of new administrative and operational systems; above all it needs a performance-oriented management culture that supports and encourages the use of new management approaches. While, from an institutional point of view, the primary purpose of the RBM approach is to generate and use performance information for accountability reporting to external stakeholders and for decision-making, the first beneficiaries are the managers themselves. They will have much more control over the activities they are responsible for, be in a better position to take well-informed decisions, and be able to learn from their successes or failures and to share this experience with their colleagues and all other stakeholders.

The processes or phases of RBM

The formulation of expected results is part of an iterative process along with the definition of a strategy for a particular challenge or task. The two concepts, strategy and expected results, are
closely linked, and both have to be adjusted throughout a programming process so as to obtain the best possible solution. In general, organizational RBM practices can be cast in twelve processes or phases, of which the first seven relate to results-oriented planning:

1) Analyzing the problems to be addressed and determining their causes and effects;
2) Identifying key stakeholders and beneficiaries, involving them in identifying objectives and in designing interventions that meet their needs;
3) Formulating expected results in clear, measurable terms;
4) Identifying performance indicators for each expected result, specifying exactly what is to be measured along a scale or dimension;
5) Setting targets and benchmarks for each indicator, specifying the expected or planned levels of result to be achieved by specific dates;
6) Developing a strategy by providing the conceptual framework for how expected results shall be realized, identifying the main modalities of action in the light of constraints and opportunities, and the related implementation schedule;
7) Balancing the expected results and the strategy foreseen with the resources available;
8) Managing and monitoring progress towards results with appropriate performance monitoring systems, drawing on data about the actual results achieved;
9) Reporting and self-evaluating, comparing actual results with the targets and reporting on the results achieved, the resources involved and any discrepancies between the expected and the achieved results;
10) Integrating lessons learned and the findings of self-evaluations, interpreting the information coming from the monitoring systems and finding possible explanations for any discrepancies between the expected and the achieved;
11) Disseminating and discussing results and lessons learned in a transparent and iterative way;
12) Using performance information coming from performance monitoring and evaluation sources for internal management learning and decision-making as well as for external reporting to stakeholders on the results achieved.
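Phases 3 to 5 above produce a small amount of structured planning data: a results statement, one or more performance indicators, and a baseline and target for each indicator. The following minimal sketch is purely illustrative and is not part of the UNESCO guidelines; the class and field names, and the example values, are hypothetical and merely show one way such a planning record could be captured.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Indicator:
    """A performance indicator with its baseline and target (phases 4 and 5)."""
    description: str   # what is measured, e.g. a percentage of teachers or schools
    baseline: float    # level observed before the intervention starts
    target: float      # planned level to be reached by the target date
    target_date: str   # e.g. "end of the biennium"

@dataclass
class ExpectedResult:
    """An expected result formulated in clear, measurable terms (phase 3)."""
    statement: str                                   # change-language statement
    indicators: List[Indicator] = field(default_factory=list)

# Hypothetical record based on the education example used later in this document.
result = ExpectedResult(
    statement=("Competent authorities in Country X adopted the new education plan "
               "and teachers and school personnel implement it."),
    indicators=[
        Indicator("% of teachers following the new curriculum", 0, 60, "end of biennium"),
        Indicator("% of schools using quality teaching material", 10, 90, "end of biennium"),
    ],
)
```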
4. WHAT IS A RESULT?

A result is the raison d'être of an intervention. A result can be defined as a describable and measurable change in state due to a cause-and-effect relationship induced by that intervention. Expected results are answers to the problems identified and focus on the changes that an intervention is expected to bring about. A result is achieved when the outputs produced further the purpose of the intervention. It often relates to the use of outputs by intended beneficiaries and is therefore usually not under the full control of an implementation team.

5. HOW TO FORMULATE AN EXPECTED RESULT?

Formulate the expected results from the beneficiaries' perspective

Formulating expected results from the beneficiaries' perspective helps to focus on the changes expected rather than on what is planned to be done or the outputs to be produced. This is particularly important at the country level, where UNESCO seeks to respond to the national development priorities of a country. Participation is key to improving the quality, effectiveness and sustainability of interventions. When defining an intervention and its expected results one should therefore ask: Who participated in the definition of the expected results? Were key project stakeholders and beneficiaries involved in defining the scope of the project and the key intervention strategies? Is there ownership of and commitment from project stakeholders to work together to achieve the identified expected results?
Use change language instead of action language

The expected result statement should express a concrete, visible, measurable change in state or in a situation. It should focus on what is to be different rather than on what is to be done, and should express it as concretely as possible. Completed activities are not results; results are the actual benefits or effects of completed activities.

Action language expresses results from the provider's perspective ("To promote literacy by providing schools and teaching material"), can often be interpreted in many ways ("To promote the use of computers") and focuses on the completion of activities ("To train teachers in participatory teaching").

Change language describes changes in the conditions of beneficiaries ("Young children have access to school facilities and learn to read and write"), sets precise criteria for success ("People in undersupplied areas have increased knowledge of how to benefit from the use of a computer and have access to a computer") and focuses on results, leaving open the options on how to achieve them, which will be clarified in the activity description ("Teachers know how to teach in a participatory way and use these techniques in their daily work").

Make sure your expected results are SMART

Although the nature, scope and form of expected results differ considerably, an expected result should meet the following criteria (be "SMART"):

Specific: It has to be exact, distinct and clearly stated. Vague language or generalities are not results. It should identify the nature of the expected changes, the target, the region, etc. It should be as detailed as possible without being wordy.
Measurable: It has to be measurable in some way, involving qualitative and/or quantitative characteristics.
Achievable: It has to be achievable with the human and financial resources available ("realistic").
Relevant: It has to respond to specific and recognized needs or challenges and to be within the mandate.
Time-bound: It has to be achievable in a stated time-frame or planning period.
Once a draft expected results statement has been formulated, it is useful to test its formulation against the SMART criteria. This process enhances the understanding of what is pursued and helps refine an expected result in terms of its achievability and meaningfulness. For example, if we consider a work plan to be undertaken in a specific country that includes the expected results statement "Quality of primary education improved", the SMART questioning could proceed as follows:

1. Is it Specific? What does "quality" actually mean in this context? What does an improvement of quality in primary education amount to concretely? Who are the relevant stakeholders involved? Are we working at a global level, or are we focusing on a particular region or country? In responding to the need to be specific, a possible expected result formulation could finally be: "Competent authorities in Country X adopted the new education plan reviewed on the basis of international best practices, and teachers and school personnel implement it."

2. Is it Measurable? Can I find manageable performance indicators that can tell about the level of achievement? Possible performance indicators could be:
- % of teachers following the curriculum developed on the basis of the new education plan (baseline 0%, benchmark 60%)
- % of schools using quality teaching material (baseline 10%, benchmark 90%)

3. Is it Achievable? Do I have enough resources available to attain the expected result? I need to consider both financial and human resources. If the answer is negative, I have to either reconsider and adjust the scope of the project or mobilize additional resources.

4. Is it Relevant? Is the expected result coherent with the upstream programming element it contributes to, and does it answer recognized needs? If the answer is negative, I should drop the activity.

5. Is it Time-bound? The expected result should be achievable within the timeframe given by the relevant programming period.
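One simple, hedged way to read such baseline/benchmark pairs during monitoring (not prescribed by the guidelines, but a common convention) is to express progress as the fraction of the planned change already achieved. The current values below are invented for illustration; the baselines and benchmarks are those of the example above.

```python
def progress_towards_benchmark(baseline: float, benchmark: float, current: float) -> float:
    """Fraction of the planned change achieved so far:
    0.0 means still at the baseline, 1.0 means the benchmark has been reached."""
    if benchmark == baseline:
        raise ValueError("benchmark must differ from baseline")
    return (current - baseline) / (benchmark - baseline)

# % of teachers following the new curriculum: baseline 0%, benchmark 60%, observed 30%.
print(progress_towards_benchmark(0, 60, 30))    # 0.5 -> halfway to the benchmark
# % of schools using quality teaching material: baseline 10%, benchmark 90%, observed 50%.
print(progress_towards_benchmark(10, 90, 50))   # 0.5
```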
Improving the results formulation: the SMART process

Starting point: "Quality of primary education improved."
- Specific: What does quality mean? Are we working at a global level or are we focusing on the region or community?
- Measurable: Can I find manageable performance indicators that can tell about the level of achievement?
- Achievable: Do I have enough resources to attain it?
- Relevant: Is it contributing to the overall goal? Is it answering perceived needs?
- Time-bound: Is it feasible within the given timeframe?

Improved formulation: "Competent authorities in Country X adopted the new education plan reviewed on the basis of international best practices, and teachers and school personnel implement it."
Performance indicators:
- % of teachers following the curriculum developed on the basis of the new education plan (baseline 0%, benchmark 60%)
- % of schools using quality teaching material (baseline 10%, benchmark 90%)

Find a proper balance among the three Rs

Once an intervention is formulated, it can be useful to check and improve its design against yet another concept, namely establishing a balance between three variables: Results (the describable and measurable change in state derived from a cause-and-effect relationship), Reach (the breadth and depth of influence over which the intervention aims to spread its resources) and Resources (the human, organisational, intellectual and physical/material inputs that are directly or indirectly invested in the intervention). Unrealistic project plans often suffer from a mismatch among these three key variables. It is generally useful to check the design of a project by verifying the three Rs, moving back and forth along the project structure and ensuring that the logical links between the resources, the results and the reach are respected. It is rather difficult to construct a results-based design in one sitting. Designs usually come together progressively, and assumptions and risks have to be checked carefully and constantly along the way.
6. WHAT IS THE RELATIONSHIP BETWEEN INTERVENTIONS, OUTPUTS AND RESULTS?

Interventions, outputs and results are often confused. Interventions describe what we do in order to produce the changes expected. The completion of interventions leads to the production of outputs. Results are, finally, the effects of the outputs on a group of beneficiaries. For example, the implementation of training workshops (intervention) will lead to trainees with new skills or abilities (outputs). The expected result identifies the behavioural change among the people that were trained, leading to an improvement in the performance of, say, the institution the trainees are working in, which is the ultimate purpose of the activity. If we move our focus from what we do to what we want the beneficiaries to do after they have been reached by our intervention, we may realize that additional types of activities are necessary to make sure we will be able to achieve the expected results. It is important that a project is driven by results and not by activities.

Defining expected results:
- is not an exact science;
- requires a solid understanding of the socio-economic, political and cultural context;
- is influenced by the available resources, the degree of beneficiary reach and potential risk factors;
- requires the participation of key stakeholders.

The following examples may help to understand the relationship between interventions, outputs and results, but should not be seen as a generally applicable template, as every intervention is different from another.

TITLE OF THE ACTIVITY: Capacity building for sustainable development in the field of water management
Interventions:
- Compilation of best practices in the field of water information management.
- Development of Internet-based information systems/web pages, databases and other tools (software, guidelines, databases) to transfer and share water information.
- Organization of Water Information Summits.
- Provision of technical assistance.
Outputs:
- Best practices for the management of water information collected and disseminated.
- Software, guidelines, databases, etc. available to the institutions concerned.
- Water Information Summits attended by policy-makers and representatives of the institutions concerned.
- Relevant institutions assisted in the implementation of best practices.
Results:
- Relevant institutions have adapted and implemented best practices for water information management.

TITLE OF THE ACTIVITY: Contribution to a reconstruction programme for the education system in country X
Interventions:
- Assessment of the situation of the national education system.
- Organization of workshops for decision-makers and experts to discuss and select a new education plan.
- Definition of an education plan ensuring an adequate learning environment in country X.
- Provision of technical assistance.
- Organization of training workshops for teachers and school personnel.
- Development of teaching and learning material.
Outputs:
- Situation analysis report completed.
- Workshops attended by relevant decision-makers and experts.
- Education plan developed and available to local counterparts.
- Teachers and school personnel trained.
- Implementation capacities of local counterparts improved.
- Teaching and learning materials delivered.
Results:
- Relevant authorities have adopted the new education plan, and teachers and school personnel implement it.

TITLE OF THE ACTIVITY: UNESCO Prize for Tolerance
Interventions:
- Selection and information of the jury.
- Preparation of brochures and information material.
- Development and organization of an information campaign; advertising of the prize.
- Development of partnerships for the identification of a list of candidates.
- Organization of the award ceremony.
- Organization of press conferences.
- Follow-up and assessment of media coverage.
Outputs:
- Jury nominated and agreeable to main stakeholders.
- Brochures, leaflets and videos produced and disseminated.
- Information campaign implemented.
- List of candidates completed and agreeable to main stakeholders.
- Prize-winner nominated.
- Press conference organized and attended by identified journalists.
- Media coverage of the event.
Results:
- Concept of tolerance spread among the general public in a country/region/globally.

TITLE OF THE ACTIVITY: Promotion of the UNESCO Universal Declaration on Cultural Diversity
Interventions:
- Development of guidelines on the application of the Declaration in different fields.
- Information about the Declaration.
- Organization of workshops on the Declaration for policy-makers and key decision-makers.
Outputs:
- Guidelines on the application of the Declaration in different fields produced.
- Workshops on the Declaration organized and guidelines on the Declaration distributed to persons attending the workshops.
Results:
- Decision-makers put into practice the guidelines on the application of the Declaration in different fields.

TITLE OF THE ACTIVITY: Increasing access to quality basic education for children through community learning centres
Interventions:
- Discussions with local authorities.
- Assessment of the feasibility of a community-based learning centre in community X.
- Preliminary discussions with local stakeholders.
- Sensitization seminars for local leaders and community members.
- Curriculum development for community-based learning centres.
- Selection of training personnel from within the local community.
- Adaptation of training material and training of personnel.
- Production of information material for local authorities.
- Meetings with local authorities for the replication of such centres.
- Provision of technical assistance to local authorities for replicating such centres in other communities.
Outputs:
- Agreement in principle by local authorities.
- Feasibility study completed and gender analysis produced and disseminated.
- Agreement in principle by community leaders.
- Community centre proposal completed and submitted to local authorities and community leaders.
- Personnel selected.
- Curriculum and training material for community-based learning centres developed.
- Managers and teachers have the necessary skills to carry out their functions.
- Brochures and videos developed and disseminated.
- Local leaders and community members informed, sensitized and convinced.
Results:
- The centre is operational and is an integral part of community life.
- Steps are taken by local authorities for replicating this initiative in other communities.
We can anticipate a few challenges in this process:
- The nature of expected results: it is obvious that the nature, magnitude and meaning of expected results cannot be the same at the different levels. Nevertheless, it is crucial that all these results build a chain of meaningful achievements, bridging the gap between the mandate and strategic objectives of the organisation and what it actually achieves in its daily operations.
- Reconciling global and local dimensions: RBM stresses results and greater focus; this should be done without sacrificing the organisation's global mandate and its commitment to decentralisation and responsiveness to country needs and priorities. A good balance has to be found between global and field-oriented approaches.

7. MONITORING OF IMPLEMENTATION

Monitoring can be described as "a continuing function that uses systematic collection of data on specified indicators to provide management and the main stakeholders of an ongoing (development) intervention with indications of the extent of progress and achievement of objectives and progress in the use of allocated funds" (Source: OECD RBM Glossary). The function of a monitoring system is to compare the planned with the actual. A complete monitoring system needs to provide information about the use of resources, the activities implemented, the outputs produced and the results achieved.

What we are focusing on here is a results-based monitoring system: at the planning stage, the officer in charge has to translate the objectives of the intervention into expected results and related performance indicators and to set the baseline and targets for each of them. During implementation, he or she needs to routinely collect data on these indicators, compare the actual levels of the performance indicators with the targets, report progress and take corrective measures whenever required. As a general rule, no extra resources (human or financial) will be allocated for monitoring tasks; the responsible person therefore has to ensure that these tasks can be undertaken within the budget foreseen (as a general rule of thumb, about 5% of the resources should be set aside for this purpose).

Selection and formulation of performance indicators

Monitoring is done through the use of appropriate performance indicators. When conceiving an intervention, the responsible officer is also required to determine appropriate performance indicators that make it possible to track progress and assess the effectiveness of the intervention, i.e. whether it was capable of producing the intended results. Indicators support effectiveness throughout the processes of planning, implementation, monitoring, reporting and evaluation. Indicators may be used at any point along the chain of inputs, interventions, outputs and results, but a results-based monitoring system does not focus on compliance with the rate of expenditure or with the implementation plan (answering the question "have we done it?"), but on the actual benefits that our interventions were able to bring to the targeted populations (answering the question "we have done it, so what?").

Performance indicators are aimed at giving indications of change caused or induced by the intervention. This core purpose does not require sophisticated statistical tools, but reliable signals that tell, directly or indirectly, about the real situation one is trying to influence. A fair balance has to be sought between the cost, both in time and money, of collecting the information required and its capacity to reflect the desired changes. Even a carefully selected, clearly defined indicator is of little use unless it is actually put to use. A critical test of an indicator is how practical it is to monitor: thinking about an indicator is one thing, actually finding, recording and presenting the data is another. Indicators need to be approached as a practical tool, not merely as a conceptual exercise.

Performance indicators are signposts of change. They enable us to verify the changes that the interventions we are dealing with seek to achieve. The purpose of indicators is to support effective programme planning, management and reporting. Indicators not only make it possible to demonstrate results; they can also help produce results by providing a reference point for monitoring, decision-making, stakeholder consultations and evaluation. We should bear in mind, however, that indicators are only intended to indicate, not to provide scientific proof or detailed explanations about change. In addition, we should avoid the temptation to transform the measurement of change into a major exercise with a burdensome workload: measuring change should not take precedence over the programme activities that generate the changes to be measured.

The critical issue in selecting good indicators is credibility, not the number of indicators, the volume of data or precision in measurement. The challenge is to meaningfully capture key changes by combining what is substantively relevant with what is practically feasible to monitor. At the end of the day, it is better to have indicators that provide approximate answers to some important questions than exact answers to many unimportant questions.

Selecting substantively valid and practically feasible performance indicators presupposes an in-depth understanding of the situation and of the mechanisms underlying change. The use of pre-designed or standardised performance indicators is therefore not recommended, as they often do not address the specificities of the situation in which the intervention is carried out. Performance indicators have to be designed on the basis of the ambition of an intervention, its scope and the environment in which it is implemented. Failure to design good indicators often means that the results are not clearly defined or that they are too wide-ranging.
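At its core, a results-based monitoring system compares the planned with the actual for each indicator and flags where corrective action may be needed, within the resources set aside for monitoring (roughly 5% of the budget, as the rule of thumb above suggests). The sketch below is a simplified illustration of that comparison, not an implementation of any UNESCO system; the indicator names, figures and the 80% "on track" threshold are hypothetical.

```python
# Planned targets and actually observed values for two hypothetical indicators.
targets = {
    "% of teachers following the new curriculum": 60.0,
    "% of schools using quality teaching material": 90.0,
}
actuals = {
    "% of teachers following the new curriculum": 35.0,
    "% of schools using quality teaching material": 88.0,
}

def monitoring_report(targets: dict, actuals: dict, threshold: float = 0.8) -> None:
    """Compare actual indicator values with targets and flag those below the threshold."""
    for name, target in targets.items():
        actual = actuals.get(name)
        on_track = actual is not None and actual >= threshold * target
        status = "on track" if on_track else "consider corrective measures"
        print(f"{name}: target {target}, actual {actual} -> {status}")

monitoring_report(targets, actuals)

# Rule of thumb from the text: set aside about 5% of the activity budget for monitoring.
total_budget = 200_000  # hypothetical activity budget
print(f"Suggested monitoring reserve (~5%): {0.05 * total_budget:,.0f}")
```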
The process of selecting indicators can help identify the core issues of the intervention and translate often intangible concepts into more concrete and observable elements.
A result and its indicator should not be mixed up: the result is the achievement; the indicators tell about the achievement.

Indicators for soft assistance activities

The experience of a number of development cooperation agencies with the shift to a results-based approach has shown that, unless guarded against, country operations may tend to focus more explicitly on quantifiable initiatives. It is therefore critical that the organization guard against the development of any disincentives that would prevent it from focusing on capacity-building and advocacy work, both of which are complex and long-term and for which it may be much more difficult to assess results than in other sectors. The term capacity in this framework refers to the abilities, skills, understandings, attitudes, values, relationships, knowledge, conditions and behaviours that enable organizations, groups and individuals in a society to generate benefits and achieve their objectives over time. Capacity also reflects the abilities of these actors to meet the needs and demands of the stakeholders for whom they were established or to whom they are accountable. These attributes cover formal, technical and organizational abilities and structures as well as the more human, personal characteristics that allow people to make progress.

Quantitative vs. qualitative indicators

Indicators can comprise a variety of types of signals, such as numbers, ranking systems or changes in the level of user approval. A signal also features a scale of observation. For example, the indicator "65 per cent of enrolled students graduate from primary school" features a percentage signal with a scale of 65 per cent. Signals and scales lend themselves to indicators that express qualitative and/or quantitative information. Quantitative indicators are numerical. Qualitative indicators use categories of classification based on individual perceptions.

The concept of quantitative versus qualitative indicators has been a subject of frequent discussion over the past few years. The common belief is that quantitative indicators are measurements that stick to cold, hard facts and rigid numbers, whose validity, truth and objectivity are beyond question, while qualitative indicators are seen as subjective, unreliable and difficult to verify. In fact, no one type of indicator or observation is inherently better than another; its suitability depends on how it relates to the result it intends to describe. There should be a shift away from the assumption that indicators should be quantitative rather than qualitative: the type of indicator selected should be the one most appropriate for the result being measured. If a qualitative indicator is determined to be most appropriate, one should clearly define each term used in the measure, document all definitions and find possible ways (such as using rating scales) to minimize subjectivity.

For example, if the result under consideration is in the area of improving the functioning of the government, in particular its readiness to respond to local needs, we could measure the degree of achievement of the result through indicators measuring the change in levels of end-user approval (or client satisfaction).
Possible indicators could therefore be:
- Average rate of perceived responsiveness of the government to the needs of the population, on a scale from 1 to 10.
- Proportion of people who perceive local government management as very participatory. If this proportion increases from 40 per cent to 65 per cent over a certain period of time, this increase provides some measure of the degree of qualitative change. This kind of numerical expression of qualitative considerations may also be obtained through indicators that use rating systems that rank, order or score given categories of attributes.
- Proportion of people who rate the central government's responsiveness to their own needs at 6 or more.

Qualitative indicators are particularly helpful, for example, when the actions involve capacity development for service delivery. The perceptions of end-users regarding service delivery get straight to the issue of whether the services are wanted, useful and effectively delivered. The satisfaction of end-users (or clients) has the advantage of some comparability: results may be compared and data disaggregated by kind of service, location, time, etc. This approach is not without its problems, however. The only way of obtaining this information may be through a survey that may prove too costly, clients may not always be easy to identify, and their perceptions of satisfaction with services are subject to influences other than the service itself.
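Once the rating scale is defined, qualitative indicators of this kind reduce to simple arithmetic over survey responses. The sketch below illustrates how the indicators listed above could be computed; the survey data and the 1-to-10 scale threshold are invented for the example.

```python
from statistics import mean

# Hypothetical survey: each respondent rates the government's perceived
# responsiveness to local needs on a scale from 1 (lowest) to 10 (highest).
ratings = [7, 4, 8, 6, 9, 5, 6, 7, 3, 8]

# Average rate of perceived responsiveness (scale 1 to 10).
average_rating = mean(ratings)

# Proportion of people who rate responsiveness to their needs at 6 or more.
share_six_or_more = sum(r >= 6 for r in ratings) / len(ratings)

print(f"Average perceived responsiveness: {average_rating:.1f}/10")
print(f"Proportion rating 6 or more: {share_six_or_more:.0%}")

# A change over time in such a proportion (e.g. from 40% to 65% of people who
# perceive local government management as 'very participatory') gives a
# numerical expression of qualitative change.
baseline_share, current_share = 0.40, 0.65
print(f"Change in 'very participatory' share: {current_share - baseline_share:+.0%}")
```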
Types of Performance Indicators

Several types of performance indicators can be used to assess progress towards the achievement of results:

a) Direct Statistical Indicators
Direct statistical indicators show progress when results are cast in terms of readily quantifiable short-term changes. For example, if the result is "Nominations of cultural and natural properties from regions or categories of heritage currently under-represented or not represented on the World Heritage List increased", it should not be difficult to secure direct quantifiable data about the number of new nominations over the time period of a biennium (or less). Care must be taken to ensure that the time span of the result lends itself to the collection of such data for use in review.

b) Proxy Indicators
Proxy indicators are normally quantitative, but do not directly relate to the result. A proxy is used to show progress. It should be used when getting the full data is too time-consuming, or when the timeliness of complete data would fall outside the need for review. However, there must be a prima facie connection between the proxy and the result. For example, if the result is "Public recognition improved of the importance of the mathematical, physical and chemical sciences for life and societal development", a good proxy indicator might be the improvement of media coverage of these issues.

c) Narrative Indicators
When the results are not easily quantifiable (changing attitudes, building capacities, etc.) over the time period of the biennium, and the number of recipients is not too big, a non-statistical approach can be envisaged to develop an indication of progress. Narrative indicators largely focus on the process of change. This technique works especially well where capacity building, training, conferences, network development and workshops are the planned interventions. However, when dealing with stakeholders, care needs to be taken to avoid a focus simply on satisfaction. Rather, the focus should be on what happened (or at least on what the recipients have planned to do) as a result of the intervention or participation. For example, if the expected result is "National capacities in educational planning and management strengthened", a valid narrative indicator might draw on a follow-up questionnaire circulated among those individuals who participated in training, conferences or other activities, asking them what they did (or what they plan to do) in their countries. Narrative indicators enable an organization to begin to explore complex interrelationships among factors without recourse to extremely expensive statistical research.

Risks when identifying performance indicators

There are a number of risks when defining and using performance indicators. The most frequent are:
- Oversimplification and misunderstanding of how development results occur, and confusion over accountability for results;
- Overemphasis on results that are easy to quantify at the expense of less tangible, but no less important, results;
- Mechanical use of indicators for reporting purposes in ways that fail to feed into strategic thinking and organizational practices.

In the following table, possible performance indicators are added to several of the previously introduced examples, completing the RBM planning and monitoring cycle.

TITLE OF THE ACTIVITY: Capacity building for sustainable development in the field of water management
Outputs:
- Best practices for the management of water information collected and disseminated.
- Software, guidelines, databases, etc. available to the institutions concerned.
- Water Information Summits attended by policy-makers and representatives of the institutions concerned.
- Relevant institutions assisted in the implementation of best practices.
Results:
- Relevant institutions have adapted and implement best practices for water information management.
Performance indicators:
- Number and importance of institutions represented at the Water Information Summits.
- Profile of representatives participating in the Water Information Summits.
- Number of institutions where there is evidence that best practices are being implemented.
- Access (disaggregated per country) to Internet-based information systems/web pages, databases, etc.
- Number of institutions officially requesting technical assistance for the implementation of best practices.

TITLE OF THE ACTIVITY: Contribution to a reconstruction programme for the education system in country X
Outputs:
- Situation analysis report completed.
- Workshops attended by relevant decision-makers and experts.
- Education plan developed and available to local counterparts.
- Teachers and school personnel trained.
- Implementation capacities of local counterparts improved.
- Teaching and learning materials delivered.
Results:
- Relevant authorities have adopted the new education plan, and teachers and school personnel implement it.
Performance indicators:
- New education plan adopted.
- Percentage of schools in country X where the new education plan is being implemented.
- Attendance rates of schools implementing the new education plan.

TITLE OF THE ACTIVITY: UNESCO Prize for Tolerance
Outputs:
- Jury nominated and agreeable to main stakeholders.
- Brochures, leaflets and videos produced and disseminated.
- Information campaign implemented.
- List of candidates completed and agreeable to main stakeholders.
- Prize-winner nominated.
- Press conference organized and attended by identified journalists.
- Media coverage of the event.
Results:
- Concept of tolerance spread among the general public in a country/region/globally.
Performance indicators:
- Media coverage of the prize.
22 8. ANNEXES RBM GLOSSARY DEFINITIONS ОПРЕДЕЛЕНИЯ ACHIEVEMENT: Result(s), or part of the result(s) accomplished at a given point in time BACKGROUND: Context and history (local, national, sectoral etc...) of the intervention and its specified area and beneficiaries (local, institutional,...). BASELINE DATA: Data describing the situation before the implementation of the intervention, related to each result, at each level. It is the starting point from which progress towards expected results will be measured. ДОСТИЖЕНИЕ - результат(ы) или часть результат(ов), которые получены в определенный момент времени. ИСХОДНЫЕ ДАННЫЕ контекст и исторические сведения (местные, национальные, секторальные т.д.) об интервенции, ее конкретной сфере и бенефициарах (местных, институциональных,). БАЗОВЫЕ ДАННЫЕ - данные, описывающие ситуацию до осуществления интервенции, связанные с каждым результатом и на каждом уровне. Они являются отправной точкой для измерения движения к ожидаемому результату. BENCHMARK(S): Reference points or Standards set in the recent past by an institution or by other comparable organizations against which performance or achievements can be assessed. КОНТРОЛЬНЫЕ ПОКАЗАТЕЛИ - ориентиры или стандарты, созданные в недавнем прошлом учреждением или другими организациями, с помощью которых можно оценивать успешность достижений. В ЮНЕСКО мы используем термин «контрольные показатели» как цели, достижимые за двухлетний период и измеряемые с помощью соответствующего индикатора эффективности. BENEFICIARIES: The individuals, groups, or organizations, whether targeted or not, that benefit, directly or indirectly, from the БЕНЕФИЦИАРИЙ человек, группа людей, организация, которые являются прямыми или косвенными получателями выполнения 23 This document is based on the UNESCO Results-Based Programming, Management and Monitoring (RBM) Guiding Principles, UNESCO Paris, Bureau of Strategic Planning, January 2008, and translated into Russian by the UNESCO Cluster Office in Almaty
23 intervention. программы\ работы. CONSTRAINTS: Factors external to the intervention that could jeopardize its success and that need to be taken into account when defining the implementation strategy as well as when assessing the results achieved. EVALUATION: The systematic and objective assessment of an on-going or completed project, programme or policy, its design, implementation and results. The aim is to determine the relevance and fulfillment of objectives, development efficiency, effectiveness, impact and sustainability. EX-ANTE EVALUATION: An evaluation that is performed before implementation of a development intervention. EX-POST EVALUATION: Evaluation of a development intervention after it has been completed. Note: It may be undertaken directly after or long after completion. The intention is to identify the factors of success or failure, to assess the sustainability of results and impacts and to draw conclusions that may inform other interventions. EXTERNAL EVALUATION: The evaluation of a development intervention conducted by entities and/or individuals outside the donor and implementing organizations. ОГРАНИЧЕНИЯ внешние по отношению к интервенции факторы которые могут помешать успеху и которые необходимо учитывать при выработке стратегии, а также при оценке полученных результатов. ОЦЕНКА - систематическая и объективная оценка осуществляемого или завершенного проекта, программы или политики, того, как он построен, как выполняется и с какими результатами. Цель заключается в том, чтобы определить, насколько верно поставлены задачи и как они выполнены, а также действенность их развития, эффективность, их воздействие и устойчивость. ПРЕДВАРИТЕЛЬНАЯ ОЦЕНКА - оценка, которая осуществляется до осуществления интервенции или разработки интервенции. ОКОНЧАТЕЛЬНАЯ ОЦЕНКА - оценка разработки интервенции после ее завершения. Примечание: ее можно провести сразу после завершения или через определенное время после завершения. Цель заключается в том, чтобы выявить факторы успешности или неуспешности, оценить устойчивость результатов и воздействий и сделать выводы, полезные для других интервенций. ВНЕШНЯЯ ОЦЕНКA (External evaluation) оценка разработанной интервенции и осуществленной предприятиями или отдельными лицами, не являющимися донорами или организациями-подрядчиками. This document is based on the UNESCO Results-Based Programming, Management and Monitoring (RBM) Guiding Principles, UNESCO Paris, Bureau of Strategic Planning, January 2008, and translated into Russian by the UNESCO Cluster Office in Almaty 24
24 SELF-EVALUATION: An evaluation by those who are entrusted with the design and/ or planning and/ or delivery of a programme, project, activity. FUNCTIONS: The range of functions that the organization performs. САМООЦЕНКА оценка теми, кому была доверена разработка, или планирование, или осуществление программы, проекта, мероприятия. ФУНКЦИИ - ряд функций, которые выполняет организация. IMPACT(S): The positive and/or negative, long-term effects produced by an intervention, directly or indirectly, intended or unintended. ВОЗДЕЙСТВИE(Я) позитивный или негативный долговременный эффект, произведенный интервенцией прямой или опосредованный, намеренный или непреднамеренный. IMPLEMENTATION STRATEGY: The plan, method and the series of interventions designed to achieve a specific expected result. СТРАТЕГИЯ ОСУЩЕСТВЛЕНИЯ план, метод или серия интервенций, нацеленных на то, чтобы достигнуть каких-либо определенных результатов. INDICATOR: see Performance Indicator. ИНДИКАТОР - см. показатель эффективности выполнения MILESTONE: A significant stage or event in the progress or development of a society or a project. ЭТАП - важная часть или событие в развитии общества или выполнении проекта. OPPORTUNITIES: The factors external to the intervention that can be utilized by the intervention in order to improve the effectiveness of the activities implemented. Need to be taken into account when defining the implementation strategy. ВОЗМОЖНОСТИ внешние по отношению к интервенции факторы, которые можно использовать при интервенции для повышения эффективности осуществляемых мероприятий. Необходимо учитывать при выработке стратегии This document is based on the UNESCO Results-Based Programming, Management and Monitoring (RBM) Guiding Principles, UNESCO Paris, Bureau of Strategic Planning, January 2008, and translated into Russian by the UNESCO Cluster Office in Almaty 25
25 осуществления. OUTCOMES: The medium-term effects/ results РЕЗУЛЬТАТЫ среднесрочные эффекты программ OUTPUTS: The products, capital goods and services produced by the intervention, which are necessary for the achievement of results. ПРОДУКТЫ продукты, капиталы, товары и услуги, произведенные в результате интервенции, которые необходимы для достижения результатов. PERFORMANCE INDICATORS: The objectively verifiable units of measurement that are used for assessing and measuring progress or lack thereof - towards a Result. Using indicators is essential for managing, monitoring and reporting as well as for evaluating the implementation of programming elements. ИНДИКАТОРЫ ЭФФЕКТИВНОСТИ ВЫПОЛНЕНИЯ поддающиеся объективной проверке единицы измерения, которые используются для оценки и измерения прогресса или отсутствия такового необходимого для получения результатов. Использовать индикаторы важно с точки зрения управления, мониторинга и отчетности, а также для оценки осуществления элементов программы. RATIONALE: The justification for the intervention itself as well as for its specific characteristics (e.g. timing, resources allocated, delivery mechanisms chosen, defined expected results). ОБОСНОВАНИЕ обоснование самого вмешательства, а также его специфических характеристик (напр. график осуществления, ресурсы, выбранные механизмы поставок и определенные конечные результаты). This document is based on the UNESCO Results-Based Programming, Management and Monitoring (RBM) Guiding Principles, UNESCO Paris, Bureau of Strategic Planning, January 2008, and translated into Russian by the UNESCO Cluster Office in Almaty 26
26 REACH: The people, groups or organizations who will benefit directly or indirectly from, or who will be affected by the results of the intervention. ОХВАТ люди, группы или организации, которые получат прямую или непрямую выгоду или те, на кого будут воздействовать результаты интервенции. RESOURCES: Organisational, intellectual, human and financial inputs necessary for the implementation of an intervention. RESULT: The describable and measurable change in state that is derived from a cause and effect relationship. РЕСУРСЫ организационный, интеллектуальный и финансовый вклады, необходимые для выполнения интервенций. РЕЗУЛЬТАТ изменения состояния, которые можно измерить и описать и которые обусловлены причинно-следственной связью. EXPECTED RESULTS: The expected results statement should refer to a result expected to be achieved by the organization interventions, within the working period and with the available resources. ОЖИДАЕМЫЕ РЕЗУЛЬТАТЫ - определение ожидаемых результатов должно относиться к результатам, которые ожидаются вследствие интервенций организации в течение рабочего периода и при наличии ресурсов. RESULTS CHAIN: The design of an expected result is required for each programming element, at any programming level. These results should form part of a chain of results: those of downstream elements combine to produce the result of the element to which they relate, a mechanism that swirls bottom up throughout the programming tree. STAKEHOLDERS: Agencies, organisations, groups or individuals who have a direct or indirect interest in the development intervention or its evaluation. ЦЕПЬ РЕЗУЛЬТАТОВ - модель «ожидаемые результаты» необходима для каждого элемента программы и на любом уровне программы. Эти результаты должны составлять часть цепочки результатов: нижестоящие элементы скомбинированы и дают результат элемента, к которому они относятся механизм, который поворачивает все вверх дном на протяжении всего дерева программы. ЗАИНТЕРЕСОВАННЫЕ СТОРОНЫ агентства, организации, группы или отдельные лица, которые заинтересованы прямо или косвенно в разработке интервенции или в ее оценке. This document is based on the UNESCO Results-Based Programming, Management and Monitoring (RBM) Guiding Principles, UNESCO Paris, Bureau of Strategic Planning, January 2008, and translated into Russian by the UNESCO Cluster Office in Almaty 27
TARGETS: The quantitative and qualitative levels of performance indicators that an intervention is meant to achieve at a given point in time (e.g. at the end of the biennium).
RBM MODULES

(The module summaries that follow are based on UNIFEM presentation materials.)
Module 1: Background and RBM Fundamentals

What is RBM? a) A management tool; b) A frame of mind; c) RBM = Really Boring Monologue; d) A waste of time. RBM is a management tool to improve management effectiveness and accountability in achieving results.

What does RBM involve? a) Identifying results (inputs, outputs, outcomes, impacts) and their causal relationships; b) Developing indicators to measure success; c) Identifying assumptions or risks that may influence success or failure; d) Measuring performance to inform decision-making and for reporting; e) All of the above.
RBM involves identifying strategic elements: results (inputs, outputs, outcomes, impacts) and their causal relationships; indicators to measure success; assumptions or risks that may influence success or failure; measuring performance; and decision-making and reporting.

Why RBM? (1) There is an increasing demand to better demonstrate results: donors want efficiency and effectiveness of aid, and the public wants better services.

Why RBM? (2) Problems frequently encountered in projects and programmes: objectives are not achieved; there is no clear statement of objectives and performance indicators; roles and responsibilities are unclear; the operating levels do not participate in project design; projects are not related to programme objectives, or programmes are not related to national objectives.

Advantages of RBM (1): Provides an integrated management strategy (project - programme - organisation; sector - country). Shows commitment to partners/donors by presenting the programme and its rationale in a concise manner.
Advantages (2): Clarifies for all stakeholders what the project aims to achieve; provides a common framework to discuss project performance; by necessity intensifies communication between all project partners; and should lead to shorter reports with better content.

True or False? RBM can be brought into an organization quickly, in a short time frame. There is only one way to apply RBM.

Challenges: It requires fundamental change in organizational culture and in technical skills/systems; it takes years to complete; it has financial implications; it is a tool, not a panacea; and it should not be applied rigidly.

Some lessons learned: common vision; ownership and partnership; capacity building; reform of budget processes and financial management; realistic outcomes and simplicity; devolution of authority; learning and decision-making.
Management Cycle (RBM Fundamentals): the cycle runs from Analysis & Needs Assessment through Strategic Planning, Operational Planning, Implementation, Monitoring, and Reporting and Review, with Continuous Learning at the centre.

MfDR (Managing for Development Results) in the Management Cycle: at the Analysis & Needs Assessment stage, the MfDR contribution is Context & Client Needs Analysis.
At the Strategic Planning stage, the MfDR contribution is the Results Chain, Risks and Indicators. At the Operational Planning stage, it is Refined Results & Indicators, Detailed Outputs & Activities, and Performance Measurement System Set-up. At the Implementation stage, it is Activities Undertaken and Outputs Delivered. At the Monitoring stage, it is Indicator Data Collected & Analyzed.
At the Reporting and Review stage, the MfDR contribution is Decision-making, Accountability and HR Management. Taken together, the MfDR contributions across the management cycle are: Context & Client Needs Analysis; Results Chain, Risks and Indicators; Refined Results & Indicators, Detailed Outputs & Activities, and Performance Measurement System Set-up; Activities Undertaken and Outputs Delivered; Indicator Data Collected & Analyzed; and Decision-making, Accountability and HR Management, with Continuous Learning at the centre of the cycle.

Learning in MfDR - Using Performance Information: often the last consideration, but it should be the first. The need for this information should drive the MfDR process: management decision-making, resource allocation, learning, and reporting and accountability.
Developing Results. Results should be: situated in a framework that shows the causal relationships between them; developed in a participatory process; linked to the project/programme context and overall goals; and clearly worded, specifying what is expected to be achieved.

Measuring Results. Results are measured by indicators. Indicators are not useful unless they are linked to results. Indicators provide data that can be analyzed to gauge performance against expected results. Measurement also includes unexpected or unplanned results.

Performance Measurement. A system is needed to collect data; processes need to be developed; human and financial resources need to be assigned; and measurement activities need to be planned, with time allocated for them.

Module Summary. RBM has particular advantages and disadvantages and must be applied appropriately. Each stage of the management cycle requires particular RBM interventions.
Module 2: Context Analysis

Module objective: understand the role of the context analysis in formulating results and identifying assumptions and risks.

A context analysis is a qualitative and quantitative description of a given issue. It helps identify: strategic areas of intervention and capacity gaps for 1) rights holders and 2) duty bearers; the specific issues to be addressed and hence the results to be achieved; major challenges or risks; indicators and targets; and assumptions and risks. It can be used as a baseline (but not always).

Module 2 summary: a brief description of a situation; should include gaps in the capacities of rights holders and duty bearers; should refer to conventions, etc.; helps identify results, indicators, assumptions and risks; sometimes serves as a baseline.
Module 3: Results Definition - The Results Chain

Module objectives: to understand the different levels of results and the linkages between them; to understand and practise how to create results chains; to learn how to represent complex results chains in a logic model or results tree.

Articulating Results: "If you don't know where you are going, any road will take you there" (Unknown).

A word on results:
Outputs - short-term change, beyond completed activities, usually specified annually (e.g. skills and knowledge acquired).
Outcomes - medium-term change, directly influenced by outputs, drawn from/linked to the MYFF, around the end of the time period (e.g. skills and knowledge applied).
Impacts - long-term change, contributed to by outcomes (in addition to many other factors), taken from the MYFF, beyond the time period, at country/sector level (e.g. increased rights for women in ...).
The Results Chain: a series of expected achievements linked by causality - a continuum from inputs/resources to final impact, divided into segments/links and expressed horizontally or vertically. Moving along the chain, each link is characterized by increased importance of the achievement with respect to the programme goal, and by decreased control, accountability and attribution.

The generic chain runs: Inputs - Activities - Outputs - Outcomes - Impacts.

Training (by itself): Outputs = knowledge, skills, abilities; Outcomes = application in the organization; Impacts = organizational performance. Example: training in RBM (output) - use of RBM by trainees (outcome) - more successful programming (impact).

Policy advocacy: Outputs = knowledge, awareness, buy-in; Outcomes = policy change; Impacts = results of policy implementation.
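To make the chain concrete, the sketch below represents the RBM training example as a small data structure. This is an illustrative sketch only; the class and field names (ResultsChain, ChainLink, etc.) are assumptions for this document, not part of the UNESCO/UNIFEM guidance.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChainLink:
    """One link in a results chain (input, activity, output, outcome or impact)."""
    level: str        # e.g. "Output", "Outcome", "Impact"
    statement: str    # the expected achievement at this level

@dataclass
class ResultsChain:
    """An ordered series of expected achievements linked by causality."""
    links: List[ChainLink] = field(default_factory=list)

    def add(self, level: str, statement: str) -> "ResultsChain":
        self.links.append(ChainLink(level, statement))
        return self

    def describe(self) -> str:
        # Moving up the chain, importance increases while control and attribution decrease.
        return " -> ".join(f"{l.level}: {l.statement}" for l in self.links)

# The "Training in RBM" example from the module:
rbm_training = (
    ResultsChain()
    .add("Output", "Training in RBM delivered; knowledge and skills acquired")
    .add("Outcome", "Use of RBM by trainees in their work")
    .add("Impact", "More successful programming")
)
print(rbm_training.describe())
```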
Capacity building with a single organization: Outputs = knowledge and skills improved, systems, processes, policies; Outcomes = organizational capacity and performance; Impacts = results of that performance. Example: increased/improved knowledge and skills in programme management (output) - enhanced ability to manage the programme - improved management of the programme (outcome) - improved programme results for women (impact).

Wording should show change: improved, increased (e.g. increased technical capacity of ...), enhanced, greater, higher, diminished, presence/absence. Do not use "through", "for" (in order to) or "by" (how); i.e. there should be no causality within the statement itself.

Wording should include: what (e.g. ensure equality provisions within inheritance laws); the duty bearer or rights holders (e.g. Ministry of Revenue); the location - in country X (where not obvious); a normative term (e.g. appropriate equality provisions); and a timeframe (e.g. by 2008). These elements may not be appropriate for all results; the outcome timeframe is the end of the project.
Module 4: Risks and Assumptions

What are assumptions? Positive conditions that are necessary to ensure programme success and that we assume will be in place for: the planned activities to produce the expected outputs; the outputs to directly influence the outcomes; and the outcomes to contribute to the impact. Our programme design is based on the assumption that these things will remain true.

What is risk? Risk refers to the uncertainty that surrounds future events. Risks are negative factors that would limit, reduce or interfere with project success. If a risk comes to pass, the success of the results is threatened. Risks can be internal or external to the programme.

Example: an assumption could be that rural women (rights holders) will participate in advocacy activities; a risk could be that, after the next elections, the local government (duty bearer) may not support the programme.
Identifying assumptions - when to include? Include assumptions that are key positive success factors you are depending on, and factors that have a high likelihood of remaining true throughout the programme. Do not include assumptions that are relatively minor in nature, or negative factors with a high likelihood of coming to pass (these should be incorporated in the programme design). Source: Results-Based Management in CIDA: An Introductory Guide to the Concepts and Principles.

When to include risks: include risks that are of low or medium likelihood and medium or high effect. Do not include risks that are of low likelihood and low effect. High-likelihood, high-effect risks should be incorporated in the programme design.

Risk analysis matrix (Effects: Significant = 3, Moderate = 2, Minor = 1; Likelihood: Low = 1, Medium = 2, High = 3): risks of low or medium likelihood with moderate or significant effect are included in the framework; if a risk is of low likelihood and minor effect, there is no need to state it; if a risk is of high likelihood and significant effect, it needs to be taken into account in the programme design.
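The include/design rule of this matrix can be written as a small decision function. The sketch below is illustrative only: the 1-3 scales follow the slide, but the function name and the return labels are hypothetical, and cells the slide does not spell out are left as judgement calls.

```python
def risk_handling(likelihood: int, effect: int) -> str:
    """Classify a risk using the 3x3 matrix from the module.

    likelihood: 1 = low, 2 = medium, 3 = high
    effect:     1 = minor, 2 = moderate, 3 = significant
    """
    if likelihood == 3 and effect == 3:
        return "address in programme design"   # high likelihood, significant effect
    if likelihood == 1 and effect == 1:
        return "no need to state"              # low likelihood, minor effect
    if likelihood <= 2 and effect >= 2:
        return "include in risk framework"     # low/medium likelihood, moderate/significant effect
    return "review case by case"               # remaining cells are judgement calls

# Example from the module: loss of local government support after elections
# might be judged medium likelihood (2) with significant effect (3).
print(risk_handling(likelihood=2, effect=3))   # -> include in risk framework
```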
Analysis and monitoring: assess the likelihood and impact of each risk; develop risk mitigation strategies; and periodically review assumptions and risks.
Module 5: Indicators - Developing and Selecting Indicators

"Not everything that counts can be measured, and not everything that can be measured counts" (A. Einstein).

What is an indicator? An instrument to measure evidence of progress towards a result, or that a result has been achieved. It establishes the level of performance necessary to achieve results and specifies the elements needed to establish whether the expected results were achieved.

Performance against results is measured through the use of indicators. Indicators play an important role by establishing the status of expected results: they tell us how we will know when we have made successful progress towards results.
Performance and results levels: indicators are defined for each level of stated results - goal/impact, outcomes and outputs.

Usefulness of indicators: they tell us how we will recognize success; they force us to clarify what we mean by our objectives/expected results; and they provide a measurable basis for monitoring and evaluation.

Considerations: indicators do not exist in a vacuum - they should always be related to results. Indicators should measure improvements in the capacities of rights holders and duty bearers to realize rights, and improvements in the enjoyment of rights. A balance of quantitative and qualitative indicators is needed, and some results are more suitable for indicators than others.

Quantitative vs. qualitative indicators: quantitative indicators can be directly counted and expressed as a number (% of, # of, frequency of, ratio of, amount of, timeliness of). Qualitative indicators involve perception but can be analysed quantitatively (level of, congruence with, satisfaction with, knowledge of, ability to, appropriateness of, importance of).
Examples of quantitative indicators: % of participants who are employed; # of women in decision-making positions; % of women parliamentarians. Examples of qualitative indicators: congruence of policy changes with advocacy messages; level of commitment to CEDAW; quality of GEL formulation.

Qualitative indicators require further measurement criteria. For "congruence of policy changes with advocacy messages", these could be: reflection of key words in the policy document; inclusion of key messages with different wording; key messages taken into account but not fully adopted - all different levels of congruence.

Example of a proxy/indirect indicator: use a proxy indicator when you cannot directly measure a result. Result: decreased trafficking of women and girls from country X. Direct indicator: # of women and girls moved across the border against their will or under false pretences per year. Proxy indicator: ?

Deconstructing indicators: in the statement "80% of NGO recommendations adopted", the measure is "NGO recommendations adopted", the indicator is "% of NGO recommendations adopted", and the target is "80%".
Measures, Indicators, Targets

Measures (the "what"): quantitative or qualitative result attributes that must be measured in order to determine the performance of a programme or initiative.
Indicators (the "how"): the quantification or qualification of a performance measure - a statistic or parameter that, tracked over time, provides information on trends in the condition of a phenomenon (e.g. "% of NGO recommendations adopted").
Targets or standards (the "how much"): specific quantitative or qualitative goals against which actual outputs or outcomes will be compared (e.g. "80% of NGO recommendations adopted"). Targets imply a desired goal that may be more ambitious than a standard. Targets are not appropriate for all types of indicators.

Examples:
Measure: satisfaction of citizens with the complaints mechanism. Indicator: % of citizens who are satisfied with the complaints mechanism. Target: the percentage of citizens who are very satisfied with the complaints mechanism is 50%.
Measure: female ownership of land. Indicator: % of land owned by women. Target: the proportion of land owned by women is 40%.
Measure: knowledge of NGO workers of advocacy campaign tactics. Indicator: level of knowledge of NGO staff of 3 key advocacy tactics. Target: 90% of NGO trainees can fully describe the 3 key advocacy tactics.
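The measure/indicator/target distinction can also be captured in a small structure, using the citizen-satisfaction example above. This is a minimal sketch under stated assumptions: the class and method names (PerformanceIndicator, met) are hypothetical, not part of any agency template.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PerformanceIndicator:
    measure: str                 # the "what": the attribute to be measured
    indicator: str               # the "how": the quantification of the measure
    target: Optional[float]      # the "how much": desired level (None = no target set)
    unit: str = "%"

    def met(self, observed: float) -> Optional[bool]:
        """Compare an observed value against the target, if one exists.

        Targets are not appropriate for all indicators, so None means
        "no target to compare against", not failure.
        """
        if self.target is None:
            return None
        return observed >= self.target

satisfaction = PerformanceIndicator(
    measure="Satisfaction of citizens with the complaints mechanism",
    indicator="% of citizens who are very satisfied with the complaints mechanism",
    target=50.0,
)
print(satisfaction.met(observed=43.0))   # False: below the 50% target
```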
Guidelines for developing indicators: brainstorm all possible indicators and then select the best; limit the number of indicators (2 to 3 per result); measure the realization and enjoyment of rights; always sex-disaggregate; develop indicators in a participatory fashion; indicators must be relevant to the needs of the user.

Criteria for selecting the best indicators: Does the indicator measure the result (accuracy, certainty and exactness)? Is it a consistent measure over time? When the result changes, will the indicator be sensitive to those changes? Will it be easy to collect and analyze the information? Can you use the data the indicator provides for decision-making and learning? Can you afford to collect the information?

Does the indicator measure the result? Questions to ask: 1. How would you know if the result was successfully achieved? What would you measure? (Is this the selected indicator?) 2. Is it possible that the result could change but there would be no change in the indicator? 3. Is it possible that the indicator could change due to something other than the result?

Baseline: a description (qualitative or quantitative) of the situation prior to the intervention, against which progress can be assessed or comparisons made. It is used as a benchmark for assessing programme-induced outcomes or impacts. Often the first data collected for an indicator is the baseline; baseline data are gathered before or shortly after project implementation begins.
Baseline, targets, performance: the concept of a performance indicator includes the notions of baseline (to establish the situation at the outset), target (to set the commitment for the result) and achievement (the actual result, measured at the end of the period).
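One common way to read these three values together is to express how far the measured value has moved from the baseline towards the target. The function below is a hedged sketch of that reading; the formula and function name are illustrative assumptions, not a prescribed UNESCO calculation.

```python
def progress_towards_target(baseline: float, target: float, achieved: float) -> float:
    """Share of the planned baseline-to-target change actually realised (0.0-1.0, or above if exceeded).

    baseline: situation at the outset
    target:   commitment for the result at the end of the period
    achieved: actual measurement at the end of the period
    """
    planned_change = target - baseline
    if planned_change == 0:
        raise ValueError("target equals baseline: no planned change to measure against")
    return (achieved - baseline) / planned_change

# Hypothetical figures for the land-ownership example: baseline 25%, target 40%, achieved 34%.
print(f"{progress_towards_target(25.0, 40.0, 34.0):.0%}")   # 60% of the planned change realised
```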
Module 6: The LFA - The Logical Framework

LFA = Logical Framework Analysis. It is widely adopted by the donor community; there are many different flavours, but the common features are causal logic, and objectives, indicators and assumptions.

Results-based LFA: the three levels of the results chain are aligned with the old goal, purpose and input levels. The RB-LFA should be iterative, modified regularly to reflect changes in the programme as it evolves (at output and outcome level, not impact). Adoption of the RB-LFA format was meant to shift the orientation of projects from managing by inputs/outputs to managing for results.

An example of an LFA structure (columns: expected results statement; indicators; means of measurement/verification; assumptions/risks):
Goal - results statement taken from the MYFF; indicators taken from the global MYFF; assumptions/risks cover the step from outcomes to impacts.
Outcome - drawn from/linked to the MYFF; indicators drawn from or linked to the global/regional MYFF or sub-regional strategy; assumptions/risks cover the step from outputs to outcomes.
Output - drawn from the regional MYFF, sub-regional strategy or programme document; assumptions/risks cover the step from activities to outputs.
Activity - no indicators needed; no assumptions/risks required.
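To show the row/column layout in a machine-readable form, the sketch below encodes the example LFA as plain Python data. The field names are assumptions made for illustration; an actual LFA would normally live in a document or spreadsheet template rather than in code.

```python
# Each row of the RB-LFA example: where the results statement and indicators come from,
# and which causal step its assumptions/risks cover.
lfa_example = {
    "Goal": {
        "results_statement": "taken from the MYFF",
        "indicators": "taken from the global MYFF",
        "assumptions_risks": "outcomes to impacts",
    },
    "Outcome": {
        "results_statement": "drawn from / linked to the MYFF",
        "indicators": "drawn from or linked to the global/regional MYFF or sub-regional strategy",
        "assumptions_risks": "outputs to outcomes",
    },
    "Output": {
        "results_statement": "drawn from the regional MYFF, sub-regional strategy or programme document",
        "indicators": "(not specified in the example)",
        "assumptions_risks": "activities to outputs",
    },
    "Activity": {
        "results_statement": "activity description",
        "indicators": "no indicators needed",
        "assumptions_risks": "none required",
    },
}

for level, row in lfa_example.items():
    print(f"{level}: assumptions/risks cover {row['assumptions_risks']}")
```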
Strengths: (almost) everything you need to know about the project is in one place; indicators line up easily with results; assumptions at each stage of the causal logic are clear; and risks are identified.

Weaknesses: the causal logic is hard to follow in the LFA; the first column is a bit redundant; it is hard to represent outputs that are inputs to other outputs; and it captures some, but not all, of the data collection information.
Module 7: Performance Measurement

A plan for data collection is needed; a Performance Monitoring Framework (PMF) contains the details of the plan.

The Performance Monitoring Framework is a tool to plan the collection of performance information. The PMF will help to ensure the regular and timely collection of comparable performance information. The PMF must be designed and completed by the project with input from stakeholders.

Build in adequate time and resources for the collection and analysis of performance measurement data, so that progress is reviewed and assessed, and tie performance monitoring into reporting processes.
PMF data gathering questions: From what source will the data be collected? What methodology will be used? How often will the data be collected? Who will be responsible for collecting the data?

An example of a PMF (planning for monitoring), with columns:
Expected results - from the LFA.
Indicators - from the LFA.
Means of measurement/verification - where is the data?
Collection methods (with indicative timeframe and frequency) - how are you going to get it, and when?
Baseline (with indicative timeframe) - what is the starting point?
Responsibilities - who is going to get it?
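A minimal sketch of one PMF row as structured data is shown below, assuming hypothetical field names; it simply groups the planning answers (source, method, frequency, baseline, responsibility) alongside an indicator taken from the LFA. The example values combine an indicator that appears earlier in this document with hypothetical collection details.

```python
from dataclasses import dataclass

@dataclass
class PMFRow:
    """One line of a Performance Monitoring Framework."""
    expected_result: str      # from the LFA
    indicator: str            # from the LFA
    data_source: str          # means of measurement/verification: where is the data?
    collection_method: str    # how are you going to get it?
    frequency: str            # and when / how often?
    baseline: str             # what is the starting point?
    responsible: str          # who is going to get it?

row = PMFRow(
    expected_result="Increased advocacy skills of 100 CSOs and 25 lawyers",
    indicator="Level of knowledge of NGO staff of 3 key advocacy tactics",
    data_source="Pre- and post-training assessments (hypothetical)",
    collection_method="Written test administered by trainers (hypothetical)",
    frequency="At each training round",
    baseline="Pre-training assessment scores, year 1",
    responsible="Project M&E officer (hypothetical)",
)
print(f"{row.indicator} -> collected {row.frequency.lower()} by {row.responsible}")
```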
Module 8: Performance Information - Reporting and Learning

Performance measurement should yield good information that can be used during project implementation to adjust strategies. If performance information is not being used actively, then RBM is not being applied!

Key questions: Who will be collating and analysing the data? To whom will performance information be reported? How will the data be reported, and how often (formats and frequencies)? How much detail needs to be reported?

Reasons for reporting results: to determine whether results have been achieved; accountability; and to learn and apply lessons for the future.
Results-based reporting serves evaluation and learning (stakeholder consultations and reviews, the annual MYFF review), donor reporting, and accountability; it contributes to (or detracts from) the organization's reputation and its ability to raise funds in the future.

Why reporting? It links evaluation to learning and feeds back into programming.
Example: Risks and Assumptions - Advancing Implementation of Gender Equality Legislation in Employment and the Public Workplace (two columns: expected results statement; assumptions/risks).

Impact: Increased equality for women in employment and the public workplace.
Assumptions: While there may be gaps in the provisions of existing GEL-WRI, such gaps do not preclude the implementation of GEL-WRI. GEL-WRI implementation improves workplace equality.
Risks: Political instability in Southeast Europe and Ukraine might prevent attention to gender equality in general, and to GEL-WRI implementation in particular. Immediate/short-term political priorities do not include gender equality and women's rights issues in employment and the workplace. There is a lack of reliable statistics to measure changes/progress/regress that could better inform decision-making for improved implementation of GEL-WRI.

Outcome 1: Improved CSO advocacy and outreach services for GEL-WRI implementation.
Risks: The political climate in Southeast Europe might not fully support a space for sustained NGO advocacy. CSOs do not have sufficient resources to undertake sustained advocacy campaigns.
Assumption: Women have sufficient confidence to access GEL-WRI enforcement mechanisms, based on the track record established for cases aided by trained CSOs and lawyers.

Output 1.1: Increased advocacy skills of 100 CSOs and 25 lawyers on GEL-WRI implementation.
Risk: Access to relevant mechanisms is limited for some groups of women (ethnic groups, rural women, low-skilled/low-educated women, poor women).
(Risks and assumptions example, continued.)

Activities: Design of training, based on a needs assessment, for 100 CSOs and 25 lawyers on advocacy skills for GEL-WRI implementation. Conduct of training for 100 CSOs and 25 lawyers on advocacy skills for GEL-WRI implementation.

Output 1.2: Increased knowledge and skills of 10 CSOs and 25 lawyers to provide outreach services to women whose rights on WRI are not protected or are violated.
Activities: Design of training, based on a needs assessment, for 10 CSOs and 25 lawyers on the provision of outreach services to women on GEL-WRI issues. Conduct of training for 10 CSOs and 25 lawyers on the provision of outreach services to women whose rights on WRI are not protected or are violated.

Outcome 2: More effective GEL-WRI implementation by ministries and the judiciary in Croatia, Bosnia & Ukraine.
Risks: Governments have difficulty in providing and sustaining adequate resource allocation to enable relevant ministries and the judiciary to implement programmes in support of GEL-WRI implementation.

Output 2.1: Increased capacity of relevant ministries and the judiciary for GEL-WRI implementation.
Risk: The geographical distribution/quality of these services at the local level is uneven, causing problems with access to services for some groups of women.
(Risks and assumptions example, continued.)

Activities: Review of the existing provisions of GEL-WRI, including the responsibilities of ministries and the judiciary. Design of training for a better understanding of GEL-WRI issues and of the importance of resource allocation for GEL-WRI implementation by relevant ministries and the judiciary. Conduct of training for relevant ministries and 200 public prosecutors and judges. Provision of technical support to relevant ministries in developing new services for women, in line with their responsibilities to implement GEL-WRI. Development, publication and updating of a knowledge base on GEL and CEDAW implementation relating to WRI.

Output 2.2: Increased capacity of relevant ministries and the judiciary to ensure their accountability for GEL-WRI implementation.
Activities: Develop monitoring & reporting tools to assess GEL-WRI implementation by relevant ministries and the judiciary; analyze reports; disseminate reports.
Annex I. Sample LFA (HBW_ENG.doc): Logical Framework Analysis, Phase II - Strengthening Organizations of Home Based Workers in South and Southeast Asia.

GOAL: Ensuring the full realization of the human rights of women home-based workers (HBWs) in Asia.

OUTCOME 1: Existence of sustainable organizations of HBWs and their networks at national and sub-regional levels in South and Southeast Asia.
INDICATORS: HomeNets South and Southeast Asia exist; increased visibility of HBWs and their networks; HBW networks are implementing programmes; HBW networks are meeting membership needs and demands; growing membership within HBW networks; financial resources for the sub-regional and national HBW networks are forthcoming; institutional procedures, systems and mechanisms exist within the sub-regional and national networks; recognition and inclusion of HomeNets in national, regional and international forums.
MEANS OF VERIFICATION (MOVs): Constitutions/annual reports of each sub-regional and national HBW network; extent of participation of HBW networks in consultations/policy forums etc.; HBW networks' work plans; feedback on HBW networks' programmes; comparison of current membership numbers with those of previous years; donor commitments/expressions of interest/agreements; references to HBW networks in media and government reports.

OUTPUT 1: Strong, representative, financially sustainable networks are legally established and able to successfully achieve their mandates at the sub-regional and national levels in South and Southeast Asia.
INDICATORS: HomeNet Southeast Asia is legally registered; an autonomous and consolidated HomeNet South Asia exists; HBW national networks participate in skills enhancement workshops; national governments recognize the national HBW networks by soliciting their feedback, inputs and participation on issues affecting HBWs and/or the informal sector; HBWs and associated groups continue their membership with the sub-regional and national networks; increased ownership of the networks by HBWs; increased representation of HBWs by HBWs themselves at national and international levels.
MOVs: Certification of legal registration for HomeNet Southeast Asia; government reports; NGO reports; reports and documents generated by HBW networks; minutes of government meetings, taskforces and committees; feedback from government officials.
ACTIVITIES: Legally establish HomeNet Southeast Asia at its new base in Manila and HomeNet South Asia (in Delhi), with appropriate structures in place (such as a bank account and sound accounting capacities) to undertake their mandates. Support the election of HBW network governing body representatives in all 8 countries and 2 sub-regions. Support the professionalisation and organizational development of the HBW networks in all 8 countries through training in organizational, personnel and project management, leadership, networking, lobbying, resource mobilization, etc., in an effort to develop the skills required to manage their own organizations. Conduct entrepreneurship and enterprise development training for organized home workers' groups in both sub-regions and all 8 countries. Support HomeNets South Asia and Southeast Asia in establishing appropriate policies, procedures and mechanisms to guide the operations of the networks, such as membership procedures and mechanisms for decision-making and representation. Forge linkages at local, national and regional levels with organizations such as trade unions, chambers of commerce, corporate houses, academics, women's groups and government agencies.
OUTPUT 2: HBWs in both sub-regions expand their regional and national membership bases towards institutional sustainability.
OUTPUT 3: Sub-regional and national HBW networks have skills in resource mobilization (RM) in order to build towards their own financial sustainability.
OUTPUT 4: Knowledge sharing, networking and cross-regional and national learning contribute to enhanced capacities of the sub-regional and national networks.
INDICATORS (Outputs 2-4): Increased number of members in the sub-regional and national HBW networks; new countries explored for potential membership; a new HomeNet is established in Laos; an RM strategy exists and is implemented for each of the HBW networks; increase in the amount of resources mobilized from sources other than UNIFEM; trainings in RM take place; staff/members of HBW networks participate in the trainings offered; tangible links between HomeNets South Asia and Southeast Asia exist; inter- and intra-regional study visits, trainings and research take place; regional databases on HBWs and associated groups exist; number of members that benefit from inter- and intra-regional knowledge sharing and networking events; number of hits on the HomeNet website; increased organizational effectiveness of the HBW networks.
MOVs (Outputs 2-4): Membership databases in all countries; reports from exploratory missions; physical observation that an RM strategy exists; HBW networks' budgets (and sources of funds); feedback from training and study tour participants; HBW network reports; HomeNet website counter; feedback from HBW network members, partners and donors on the effectiveness of the HBW networks; project progress and evaluation reports.
ACTIVITIES: 2.1 Strengthen expansion efforts among home workers' groups in Laos. 2.2 Conduct exploratory discussions with HBWs and associated groups in Vietnam and Cambodia (funding permitting) and in parts of South Asia. 2.3 Continue mapping the HBW sector and undertake research initiatives at both sub-regional and national levels in South and Southeast Asia, so that the results contribute to greater visibility, legislative reform and empowerment of HBWs. 3.1 Support HBW sub-regional and national networks in developing linkages with NGO, government, bilateral and multilateral partners, liaison and networking. 3.2 Support the HBW sub-regional and national networks in developing a resource mobilization strategy in all 8 countries. 4.1 Support sub-regional and national networks in all 8 countries in information sharing and cross-regional learning through regular updating of the HomeNet Southeast Asia website, HomeNet South Asia's and Southeast Asia's newsletters, study visits, and documentation and dissemination of best practices, tools, manuals and resources. 4.2 Conduct sub-regional and national workshops on social protection, lobbying, advocacy, networking, marketing, information technology (e.g. e-commerce) and other felt needs of home worker groups in both sub-regions. 4.3 Support the development and dissemination of a directory of HBWs and associated organizations in South Asia.

OUTCOME 2: Existence of an enabling policy environment for women HBWs / informal-sector workers in South and Southeast Asia.
INDICATORS: HBWs and their concerns are addressed in policies, laws and government programmes; increased willingness by government and policy makers to discuss issues relevant to women HBWs; local, provincial and national governments solicit feedback, inputs and participation on issues related to women HBWs; increase in the number and types of laws being discussed, proposed and/or passed affecting women HBWs.
MEANS OF VERIFICATION (MOVs): Existing and proposed laws/policies; feedback from government departments and officials; feedback from HBW networks regarding their participation in government-initiated committees, taskforces, reports, etc.

OUTPUT 5: Increased consensus amongst governments on the nature of policy and legislative measures for promoting the rights of women HBWs in all 8 countries and two sub-regions.
OUTPUT 6: HBWs and their networks are better able to advocate for legislative reform with regard to HBWs' economic and social rights in all 8 countries.
OUTPUT 7: Increased public and government awareness of and commitment to HBWs' rights to social protection and healthy working conditions.
INDICATORS (Outputs 5-7): Increased willingness by government to discuss issues pertaining to women HBWs; increased agreements by regional-level bodies such as SAARC, ASEAN and APEC; increased number of regional dialogues on issues affecting women HBWs, bringing together national governments, regional bodies and multilateral partners; increased number of HBWs participating in advocacy training; increased interest in and/or commitments by government for legislative reform vis-à-vis HBWs' social and economic rights; increased number of successful HBW-led advocacy and lobbying campaigns; increased media attention on HBWs' rights to social protection and better working environments; increased number of government-initiated dialogues on issues related to women HBWs' rights to social protection and better working environments; greater public demand to address women HBWs' rights to social protection and better working conditions; HBW issues appear as priorities in multilateral processes such as PRSPs, CCAs and UNDAFs; promises made publicly by government on HBW issues.
MOVs (Outputs 5-7): Government reports; feedback from government officials and departments; media reports; regional agreements; documents from SAARC, ASEAN and APEC; reports from training workshops; information from HBW networks; surveys/polls on public awareness of and sentiments about HBWs' rights to social protection and better working conditions; country-specific CCAs, UNDAFs and PRSPs.
ACTIVITIES: 5.1 Promote informal-sector-friendly microfinance policies and programs, child-care and other social services, as well as scholarships and other forms of assistance to benefit working children in the Philippines. 5.2 Advocate for more widespread implementation of the Country Program for the Informal Sector in the Philippines, particularly in local government units. 5.3 Strengthen the National Steering Committee on Home Workers in Indonesia. 5.4 Liaise and network with the Ministry of Labor and Transmigration for enabling HBW-friendly policies in Indonesia. 5.5 Through signature campaigns and similar initiatives, support the passage of laws favorable to HBWs, e.g. the Magna Carta for the Informal Sector in the Philippines and the draft Labour Protection law in Thailand. 5.6 Establish linkages in South Asia with regional bodies such as SAARC, ASEAN and APEC for the ratification and implementation of the ILO Convention. 5.7 Undertake advocacy on the enforcement of minimum wages for home workers and then enforce minimum wages at the national level in South Asia. 6.1 Build the capacity of HBW groups in the areas of effective lobbying, advocacy and networking in South and Southeast Asia. 6.2 Continue working with the Central Bureau of Statistics (CBS) to sustain the collection of home workers' statistics in the CBS household survey and the CBS socio-economic census in Indonesia. 6.3 Network with national agencies, local government units, banks and other private businesses for financial and technical assistance, as well as access to other resources, in South Asia. 6.4 Launch, maintain and expand home workers' access to microfinance programs in various areas in South and Southeast Asia. 6.5 Continue to build alliances and networks at local, national and regional levels to support policy advocacy on laws and policies concerning fair trade and other macro-economic issues in Southeast Asia. 6.6 Establish linkages with organizations supporting fair labour practices in South Asia, such as IFAT, ITF and ETI. 6.7 Establish issue-based linkages with various partners in areas such as occupational health, housing, identity cards, etc. 7.1 Develop strategic linkages and partnerships with census departments and national government statistics departments in an effort to produce more accurate information on the size and scope of HBWs and their contribution to the economy in Southeast Asia. 7.2 Document and disseminate the lessons from pilot projects in both sub-regions with NGOs, governments and other relevant groups to generate interest in and awareness of HBW issues. 7.3 Undertake advocacy campaigns around the issue of decent work, particularly in the areas of occupational safety and health and minimum wages. 7.4 Promote HBW issues in South Asia by connecting them with advocacy campaigns on related issues such as HIV/AIDS and trafficking. 7.5 Engage in multilateral processes such as the PRSPs, CCAs, UNDAFs and MDGs in an effort to raise awareness of HBW issues with multilateral partners in South Asia. 7.6 Support people's representatives in taking up home workers' issues and concerns in public hearings at local and provincial levels in Indonesia. 7.7 Organize exposure and study visits of government organizations to home workers' sites in Southeast Asia. 7.8 Raise awareness of various aspects of decent work (including occupational safety and health), towards their observance in all production processes. 7.9 Within the principles of decent work, promote a more holistic approach to One-Tambon-One-Product (OTOP) in Thailand, in an effort to include occupational safety and health and other basic rights.

OUTCOME 3: Improved response from government and the private sector on social protection measures and schemes for HBWs.
INDICATORS: Increased acknowledgment and involvement by government in addressing women HBWs' need for social protection; increased acknowledgement and involvement by the private sector on issues related to women HBWs' access to social protection schemes; increased interaction between government and the private sector on issues related to women HBWs and their access to social protection schemes; increased number of HBWs having access to and accessing formal and indigenous social protection schemes.
MEANS OF VERIFICATION (MOVs): Government reports; feedback from private sector organizations; government and private sector databases on the numbers of HBWs having access to and actually accessing social protection schemes; HBW network databases.

OUTPUT 8: Women HBWs and their associated groups are better informed regarding the different types of social protection schemes.
OUTPUT 9: Improved capacity of home-based workers to advocate for and participate in different social protection schemes in Southeast Asia.
OUTPUT 10: Improved consensus amongst government and the private sector on the need for social protection for HBWs in South Asia.
INDICATORS (Outputs 8-10): Relevant stakeholders (i.e. HBWs, their networks, policy makers, etc.) are (a) aware of the social protection schemes currently available in their respective countries, (b) able to articulate the pros and cons of each type of scheme, and (c) able to articulate which schemes would benefit HBWs the most; lobbying and advocacy campaigns highlight the above information to governments and private sector organizations; advocacy campaigns highlight the above information to HBWs and the general public; increased numbers of HBWs participating in social protection schemes; number of HBWs trained in the area of social protection and the various schemes available; increased number of policy dialogues between government and the private sector on the issue; increased number of partnerships forged between government and the private sector on the issue; government- and/or private-sector-led initiatives exist for the provision of social protection to HBWs.
MOVs (Outputs 8-10): Studies undertaken by HBWs, their networks, government, the private sector, etc.; information on advocacy and lobbying campaigns undertaken by HBWs and their networks; HBW network databases; feedback from HBWs; information from social protection scheme providers; information solicited from private-sector social protection scheme providers; information from governments.
ACTIVITIES: 8.1 Conduct and disseminate research on ways to improve HBWs' access to formal and indigenous social protection schemes in South Asia, particularly at the local level. 9.1 Promote the extended coverage of HBWs in national social security schemes, assert the participation of their organizations in decision-making bodies running these schemes, and wherever possible (e.g. the Philippines) work for their accreditation as collecting agents for such schemes. Organize a large, high-profile regional round table with stakeholders, a post-Katmandu meeting on social protection; lobbying, advocacy and networking; marketing, information technology and other felt needs of home-based worker groups in the South Asia region. Support the sub-regional and national HBW networks in South Asia in campaigning for new, user-friendly and demand-responsive social security schemes for HBWs. Promote linkages and strategic partnerships with different stakeholders in South Asia, including private-sector insurance companies and banks.
Develop a strategic plan. A tool for NGOs and CBOs M
Develop a strategic plan A tool for NGOs and CBOs M Developing Strategic Plans: A Tool for Community- and Faith-Based Organizations Developed by the International HIV/AIDS Alliance and Davies & Lee: AIDS
GAO PERFORMANCE MANAGEMENT SYSTEMS. IRS s Systems for Frontline Employees and Managers Align with Strategic Goals but Improvements Can Be Made
GAO United States General Accounting Office Report to Congressional Requesters July 2002 PERFORMANCE MANAGEMENT SYSTEMS IRS s Systems for Frontline Employees and Managers Align with Strategic Goals but
Performance Measurement
Brief 21 August 2011 Public Procurement Performance Measurement C O N T E N T S What is the rationale for measuring performance in public procurement? What are the benefits of effective performance management?
Annex 1. Call for proposals EACEA No 34/2015 Erasmus+ Key Action 3: Support for policy reform - Initiatives for policy innovation
Annex 1 Call for proposals EACEA No 34/2015 Erasmus+ Key Action 3: Support for policy reform - Initiatives for policy innovation European policy experimentations in the fields of Education, Training and
An introduction to impact measurement
An introduction to impact measurement Contents 1 Introduction 2 Some definitions 3 Impact measurement at BIG 4 Setting impact measures for programmes APPENDICES A External Resources (separate document)
REPORT. Public seminar, 10 November 2010, 1.30-5.00 p.m. Concordia Theatre, The Hague
REPORT Complexity-oriented oriented Planning, Monitoring and Evaluation (PME) From alternative to mainstream? Public seminar, 10 November 2010, 1.30-5.00 p.m. Concordia Theatre, The Hague 1. What was the
TIPS SELECTING PERFORMANCE INDICATORS ABOUT TIPS
NUMBER 6 2 ND EDITION, 2010 PERFORMANCE MONITORI NG & EVALUATION TIPS SELECTING PERFORMANCE INDICATORS ABOUT TIPS These TIPS provide practical advice and suggestions to USAID managers on issues related
By Jack Phillips and Patti Phillips How to measure the return on your HR investment
By Jack Phillips and Patti Phillips How to measure the return on your HR investment Using ROI to demonstrate your business impact The demand for HR s accountability through measurement continues to increase.
UNITED NATIONS OFFICE FOR PROJECT SERVICES. ORGANIZATIONAL DIRECTIVE No. 33. UNOPS Strategic Risk Management Planning Framework
UNOPS UNITED NATIONS OFFICE FOR PROJECT SERVICES Headquarters, Copenhagen O.D. No. 33 16 April 2010 ORGANIZATIONAL DIRECTIVE No. 33 UNOPS Strategic Risk Management Planning Framework 1. Introduction 1.1.
SLDS Workshop Summary: Data Use
SLDS Workshop Summary: Data Use Developing a Data Use Strategy This publication aims to help states detail the current status of their State Longitudinal Data System (SLDS) data use strategy and identify
Section Three: Ohio Standards for Principals
Section Three: Ohio Standards for Principals 1 Principals help create a shared vision and clear goals for their schools and ensure continuous progress toward achieving the goals. Principals lead the process
Evaluation Policy. Evaluation Office. United Nations Environment Programme. September 2009. Evaluation Office. United Nations Environment Programme
United Nations Environment Programme Evaluation Policy Evaluation Office September 2009 Evaluation Office United Nations Environment Programme P.O. Box 30552-00100 Nairobi, Kenya Tel: +(254-20)-7623387
Terms of Reference (TOR) For Impact Evaluation of ANN Project
Terms of Reference (TOR) For Impact Evaluation of ANN Project Post Title: Rural Aquaculture Development (Impact Evaluation) Expert (International) Location: Oshakati Extension Office and Omahenene Inland
Data Quality Assurance: Quality Gates Framework for Statistical Risk Management
Data Quality Assurance: Quality Gates Framework for Statistical Risk Management Narrisa Gilbert Australian Bureau of Statistics, 45 Benjamin Way, Belconnen, ACT, Australia 2615 Abstract Statistical collections
GUIDELINES FOR ENGAGING FAITH BASED ORGANISATIONS (FBOS) AS AGENTS OF CHANGE
GUIDELINES FOR ENGAGING FAITH BASED ORGANISATIONS (FBOS) AS AGENTS OF CHANGE These Guidelines provide a critical framework for engagement with faith based organisations (FBOs). They are not a blue print.
reporting forma plan framework indicators outputs roject management Monitoring and Evaluation Framework 2009 / UNISDR Asia-Pacific Secretariat
plan reporting forma framework indicators rformancem&e Fram outputs roject management Monitoring and Evaluation Framework 2009 / UNISDR Asia-Pacific Secretariat Introduction Monitoring and evaluation enhance
pm4dev, 2007 management for development series Introduction to Project Management PROJECT MANAGEMENT FOR DEVELOPMENT ORGANIZATIONS
pm4dev, 2007 management for development series Introduction to Project Management PROJECT MANAGEMENT FOR DEVELOPMENT ORGANIZATIONS PROJECT MANAGEMENT FOR DEVELOPMENT ORGANIZATIONS A methodology to manage
GEF/LDCF.SCCF.17/05/Rev.01 October 15, 2014. 17 th LDCF/SCCF Council Meeting October 30, 2014 Washington, DC. Agenda Item 5
17 th LDCF/SCCF Council Meeting October 30, 2014 Washington, DC GEF/LDCF.SCCF.17/05/Rev.01 October 15, 2014 Agenda Item 5 UPDATED RESULTS-BASED MANAGEMENT FRAMEWORK FOR ADAPTATION TO CLIMATE CHANGE UNDER
OUTLINE. Source: 36 C/Resolution 16, 190 EX/Decision 9 and 192 EX/Decision 6.
37th Session, Paris, 2013 37 C 37 C/57 4 November 2013 Original: English Item 5.19 of the provisional agenda PROPOSAL FOR A GLOBAL ACTION PROGRAMME ON EDUCATION FOR SUSTAINABLE DEVELOPMENT AS FOLLOW-UP
TOR - Consultancy Announcement Final Evaluation of the Cash assistance and recovery support project (CARSP)
TOR - Consultancy Announcement Final Evaluation of the Cash assistance and recovery support project (CARSP) Organization Project Position type Adeso African Development Solutions and ACTED - Agency for
UNICEF Global Evaluation Report Oversight System (GEROS) Review Template
EVALUATION ID 1430-2015/003 UNICEF Global Evaluation Report Oversight System (GEROS) Review Template Colour Coding CC Dark green Green Amber Red White s Outstanding Satisfactory No Not Applicable Section
Module 2: Setting objectives and indicators
Equal Access Participatory Monitoring and Evaluation toolkit Module 2: Setting objectives and indicators Outcomes from using this module You will understand: the concept of indicators and what they can
Partnership Satisfaction & Impact Survey
Partnership Satisfaction & Impact Survey Page 1 of TABLE OF CONTENTS Contents I INTRODUCTION... 3 II SATISFACTION SURVEY... 4 II.1 What?... 4 II.1.1 Definition... 4 II.1.2 Satisfaction survey in Practice...
Framework for Managing Programme Performance Information
Framework for Managing Programme Performance Information Published by the National Treasury Private Bag X115 Pretoria 0001 South Africa Tel: +27 12 315 5948 Fax: +27 12 315 5126 The Framework for Managing
Best Practice Asset Management
Case Study Best Practice Asset Management An example of an Activity Management Plan, published by the Working Group Case Study AMP 01-2014 The Working Group was established by the Road Efficiency Group
INDICATIVE GUIDELINES ON EVALUATION METHODS: EVALUATION DURING THE PROGRAMMING PERIOD
EUROPEAN COMMISSION DIRECTORATE-GENERAL REGIONAL POLICY Thematic development, impact, evaluation and innovative actions Evaluation and additionality The New Programming Period 2007-2013 INDICATIVE GUIDELINES
Work based learning. Executive summary. Background
Work based learning Executive summary Background The training contract stage of qualifying as a solicitor is a prime example of 'work based learning' (WBL), a phrase that generally describes the learning
Dimensions of Statistical Quality
Inter-agency Meeting on Coordination of Statistical Activities SA/2002/6/Add.1 New York, 17-19 September 2002 22 August 2002 Item 7 of the provisional agenda Dimensions of Statistical Quality A discussion
Workshop on Strategic Plan for Implementation of Quality Culture within Higher Learning Institutions in East Africa
Workshop on Strategic Plan for Implementation of Quality Culture within Higher Learning Institutions in East Africa January 21-22, 2010 General Strategic Planning Joseph A. Kuzilwa Module Objectives The
Resources for Implementing the WWF Project & Programme Standards. Step 4.3 Analyze Operational and Financial Functions/ Performance
Resources for Implementing the WWF Project & Programme Standards Step 4.3 Analyze Operational and Financial Functions/ Performance March 2007 Step 4.3 Analyze Operational and Financial Performance Contents
