RESULTS-BASED MANAGEMENT: TOWARDS A COMMON UNDERSTANDING AMONG DEVELOPMENT COOPERATION AGENCIES
Discussion Paper (Ver. 5.0)

Prepared for the Canadian International Development Agency, Performance Review Branch, for consideration by the DAC Working Party on Aid Effectiveness and Harmonisation.

Werner Meier, Director, Results-Based Management Group, Ottawa, Canada, istar.ca

Wednesday, October 15, 2003
TABLE OF CONTENTS

1.0 INTRODUCTION
2.0 AN EVOLUTION IN MANAGEMENT
  2.1 PUBLIC SECTOR CONTEXT
  2.2 MODERN MANAGEMENT APPROACHES
  2.3 GOAL SETTING IN DEVELOPMENT CO-OPERATION
3.0 RBM - WHAT IS IT?
  3.1 DEFINITION
  3.2 LANGUAGE AND CULTURE
  3.3 KEY PRINCIPLES FOR PUBLIC SECTOR ORGANISATIONS (Partnership; Accountability; Organisational Learning; Transparency; Simplicity; Flexible and Iterative Application)
  3.4 RBM MANAGEMENT CYCLE
4.0 RBM: WHAT IT'S NOT?
  4.1 CONTINUOUS EVALUATION
  4.2 COMPLIANCE AND CONTROLS
  4.3 PERFORMANCE-BASED BUDGETING
  4.4 PERFORMANCE REPORTING
ANNEX A: SIX BASIC STEPS TO MANAGING FOR RESULTS
1.0 INTRODUCTION

1.1 Background

The first meeting of the DAC Working Party on Aid Effectiveness and Donor Practices, held in Paris on May 19-21, 2003, acknowledged the importance of managing for results in order to enhance development co-operation effectiveness. It was agreed that further work should build on the international collaboration that began at the June 2002 Washington Roundtable on Better Measuring, Monitoring, and Managing for Development Results, and that an experts meeting should be organised to identify the scope of such work. As a first step, four bilateral agencies, i.e., CIDA, DANIDA, DFID and Foreign Affairs/DGIS (the Dutch co-operation), agreed to commission a discussion paper by a reputable consultant with broad international experience who would present some of the most current thinking on Results-Based Management (RBM).

1.2 Purpose

In accordance with the Terms of Reference provided to the consultant, the main purpose of this discussion paper was to address two questions: What is RBM? and What is it not? A secondary purpose was to identify some core principles to guide the development and implementation of poverty reduction strategies. Representatives of the four bilateral agencies expected that this paper would generate extensive discussion and debate on the concepts, principles and perspectives put forward, thus beginning an iterative process of dialogue and revision until some acceptable degree of consensus could be reached. A subsequent iteration of the paper could elaborate more fully on the implications for donor agencies and developing countries alike when applying the RBM approach in the context of developing and implementing poverty reduction strategies.

1.3 Structure

This discussion paper begins with a brief overview of the confluence of factors that have shaped the RBM approach and then presents its defining characteristics, as well as six basic steps to managing for results (Annex A).
2.0 AN EVOLUTION IN MANAGEMENT

2.1 Public Sector Context

For more than fifteen years, there has been constant pressure from taxpayers on governments around the world for greater transparency and accountability in the use of public resources. Public concern in the face of escalating national account deficits, declining confidence in political leadership and the need for more transparent and accountable governance have all been important factors contributing to the emergence of RBM in the public sector. Several books have documented the emergence of this new public sector management approach now prevalent in many OECD countries [1]. Historically, governments focused their attention on the human, technical and financial resources provided as inputs for their programs. The modern management agenda calls for a major shift in focus: public service managers are expected to define expected results, focus attention on result achievement, measure performance regularly and objectively, learn from performance information, make adjustments and improve the efficiency and effectiveness of their programmes. However, implementation of the RBM approach in government has been incremental and not without its challenges and disappointments, which central agencies have aptly documented.

[1] For more details see Osborne and Gaebler (1993), Reinventing Government: How the Entrepreneurial Spirit Is Transforming the Public Sector, and Peter Aucoin (1996), The New Public Sector Management: A Comparative Perspective.

Draft RBM Discussion Paper Page 2

2.2 Modern Management Approaches

A brief overview of the advances in management over the past forty years allows us to put the advent of RBM into perspective. Public sector management has evolved considerably since the Planning, Programming and Budgeting Systems (PPBS) approach of the late 1960s, with its emphasis on financial planning and cost accounting. The management of inputs, i.e., human resources, operating and capital costs, was of paramount importance in demonstrating management control over the allocation and use of financial resources.

Programme Management By Activity (PMBA) became prominent in the 1970s and 1980s, when donor organisations were heavily involved in physical infrastructure and industrial development projects. It combined several tools and techniques to plan and schedule activities, e.g., the Work Breakdown Structure (WBS), the Gantt chart, the Critical Path Method (CPM) and the Programme Evaluation and Review Technique (PERT). These conventional blue-print techniques emphasised the implementation of activities according to a planned schedule and were derived from the fields of construction engineering and systems management.

Although it has a much earlier history, Management-By-Objectives (MBO) enjoyed a resurgence of enthusiasm in the public sector in the mid-seventies.
It allowed managers to take responsibility for the design and implementation of a programme or project under controlled conditions by setting objectives and identifying performance indicators. It provided organisations with a modicum of control and predictability while still being able to delegate responsibility to individuals and teams. The most common application at the time was the Logical Framework Approach (LFA), used in the early seventies by the U.S. Agency for International Development (USAID). It has since been widely adopted and adapted by the international donor community and is used mostly as an analytic tool [2] for project design and approval, while its potential for performance monitoring and evaluation has never been fully realised. An alternative version, referred to as Objectives-Oriented Project Planning, included standard procedures for participatory analysis, problem solving and objectives setting with partner organisations and target groups [3]. The diverse use to which the LFA has been put over the years is a testimony to the enduring strength of the approach.

[2] Here we refer specifically to the sixteen-cell matrix, which is the product of using the Logical Framework Approach, and often referred to as the Logical Framework Analysis, logical framework, logframe or LFA.

[3] Hailey and Sorgenfrei (2003), Measuring Success? Issues in Performance Management. A keynote paper presented at the 5th International Evaluation Conference on Measurement, Management and Accountability, KDK Conference Centre, The Netherlands.
The new public management in the 1980s led to widespread government efforts at becoming client- and service-oriented, spawning the development of a multiplicity of quality service standards. A new wave of methods and techniques soon followed, including Quality Control/Quality Assurance, ISO accreditation and Total Quality Management (TQM), which, for the most part, focussed on service delivery processes, quality standards and the acceptance of goals for continuous improvement. Concurrently, a renewed interest in performance indicators arose to measure the efficiency and effectiveness of public service delivery, increase government control over quality, enhance accountability and improve client services.

Over the decades we have seen a shift in the focus of public sector management approaches from budgets to activities, to process controls, to objectives and now to results. Although the Australian government adopted it as early as the mid-1980s, managing for results became an increasingly important public sector management theme during the 1990s. In an effort to demonstrate value for money in public services, many of the OECD member states have reformed the way government does business by shifting their focus from inputs, activities and outputs to outcome achievement. Recent developments in information and communications technology have made integrated management information systems possible, opening the door to capturing and processing large amounts of quantitative financial data while analysing it in relationship to qualitative outcome data. RBM is clearly an evolution in management and not a revolution, with its origins firmly rooted in the management sciences and closely linked to previous efforts to implement the Management-By-Objectives approach.
While applying RBM to development projects has met with relative success [4], it is certainly much more complex when decentralised country programme-based approaches are envisaged in partnership with developing country governments.

2.3 Goal Setting in Development Co-operation

Global goal setting in development co-operation is a relatively recent phenomenon, beginning in the early 1990s with a series of UN conferences, held with the active participation of developing countries, on a wide array of development priorities, e.g., education, children, environment, human rights, social development and women. The targets originating from these conferences were subsequently consolidated in the publication Shaping the 21st Century: The Contribution of Development Cooperation (OECD 1996), which gave rise to the first integrated set of development goals and development effectiveness principles, of which managing for results was one. They have since been adopted globally, for the most part, as indicators of progress in the fields of economic well-being, social development and environmental sustainability, and represent the legitimate antecedents of the currently popular Millennium Development Goals. In September 2000 at the United Nations Millennium Summit, 189 Heads of State ratified the Millennium Declaration that outlined their collective commitment to poverty reduction and sustainable development. At the request of the UN General Assembly, the Secretary-General subsequently prepared a plan for its implementation that included eight Millennium Development Goals (MDGs), eighteen targets and forty-eight performance indicators.
This MDG framework is the culmination of over a decade of global goal setting and represents the shared commitment of donor and developing countries to eradicating extreme poverty by investing in primary education, improving the health of women and children, combating pandemic diseases, promoting gender equality, ensuring environmental sustainability and promoting global partnerships for development.

[4] For more details see Measuring and Managing Results: Lessons for Development Cooperation (UNDP 1997).

The Millennium Declaration has marked the beginning of an unprecedented era of collaboration within the development community, i.e., the OECD, UN agencies, MDBs and bilateral agencies, on issues of policy coherence, technical co-operation and donor co-ordination. The adoption in March 2002 of the Monterrey Consensus at the United Nations International Conference on Financing for Development exemplifies the new partnership between donor and developing countries. The conference succeeded in articulating the terms and conditions under which commitments by developing countries to transparency, good governance, and respect for human rights and the rule of law were matched by donor commitments towards policy coherence, increased foreign aid and accelerated support for good performers. Furthermore, the donor community was encouraged to undertake the following actions in support of all developing countries:

- Use development frameworks that are owned and driven by developing countries and that embody poverty reduction strategies;
- Harmonise their operational procedures to reduce transaction costs for recipient countries; and
- Improve ODA targeting to the poor, co-ordination of aid and the measurement of results. [5]

A powerful momentum has been building behind the MDGs for use by developing countries in the context of poverty reduction strategies. Similarly, the need for sustained development financing, donor harmonisation and co-operation in the measurement of results has also gained increasing recognition. The adoption of the MDGs by developing countries raises many policy priority issues and technical challenges, not the least of which is the use of RBM as a means for promoting good governance and results-oriented public sector management.
However, we should be reminded at this point that it was not so long ago that RBM was adopted by western democratic governments at the insistence of their citizenry, who demanded greater accountability for, and transparency in, the use of taxpayer contributions. While significant progress has been made in these countries, and among donor agencies, MDBs and UN agencies, in applying RBM, there remains considerable divergence of opinion as to what it is and how it can be effectively implemented.

[5] Partial list adapted from Final Outcome of the International Conference on Financing for Development, Conference Secretariat, March 2002, p. 10.
3.0 RBM - WHAT IS IT?

3.1 Definition [6]

Results-Based Management (RBM) is a management strategy aimed at achieving important changes in the way organisations operate, with improving performance in terms of results [7] as the central orientation. RBM provides the management framework and tools for strategic planning, risk management, performance monitoring and evaluation. Its primary purpose is to improve efficiency and effectiveness through organisational learning, and secondly to fulfil accountability obligations through performance reporting. Key to its success is the involvement of stakeholders throughout the management lifecycle in defining realistic expected results, assessing risk, monitoring progress, reporting on performance and integrating lessons learned into management decisions.

3.2 Language and Culture

The language of RBM is significantly different from that of its precursor, Management-By-Objectives, which suffered from a confusion of terminology. Take for example the term "objective", which has the following synonyms: aim, goal, intent, purpose and target, not to mention the use of the phrases "general" and "specific" objectives. The roles and relationships among these terms within the MBO approach were never really clear, with the exception of the hierarchy of objectives that was popularised by the use of the LFA, i.e., inputs, outputs, purpose and goal. While one can argue the semantic nuances between a well written objective and a well written result statement, the significant differences lie in how RBM terms are defined in relationship to one another. RBM terminology borrows heavily from systems theory and reflects the central role of causality, while taking into account the temporal dimension. The following selection of key RBM terms as defined by the OECD [8] illustrates these concepts quite nicely:

Input: The financial, human, and material resources used for the development intervention.

Activity: Actions taken or work performed through which inputs, such as funds, technical assistance and other types of resources, are mobilised to produce specific outputs (related term: development intervention [9]).

Output: The products, capital goods and services which result from a development intervention; may also include changes resulting from the intervention which are relevant to the achievement of outcomes.

Outcome: The likely or achieved short-term and medium-term effects of an intervention's outputs.

Impact: Positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended.

[6] Also defined as "a management strategy focusing on performance and achievement of outputs, outcomes and impacts". OECD (2002), Glossary of Key Terms in Evaluation and RBM.

[7] While the OECD has defined results as the outputs, outcomes or impacts of a development intervention, RBM is generally considered to be outcomes-oriented.

[8] OECD (2002), Glossary of Key Terms in Evaluation and RBM.

[9] The term "development intervention", or simply "intervention", is used synonymously with "development activity", but also denotes different types of support, e.g., policy advice, country or sector programme, project.
Based on these definitions we can conclude that inputs are needed to undertake activities in order to produce outputs, which in turn generate short- and medium-term outcomes leading to long-term impacts. A graphic representation of this results chain is presented below.

[Figure: The Results Chain - Inputs → Activities → Outputs → Short-Term and Medium-Term Outcomes → Long-Term Impacts, annotated with the questions "HOW should this be implemented?", "WHAT should be produced?", "WHAT results do we expect from this investment?" and "WHY should we do this?"]

The above RBM terms are defined in relationship to one another based on an accepted causal sequence and temporal dimension. These RBM terms cannot be used interchangeably, nor out of sequence, providing a stability in terminology that was otherwise lacking in other management approaches. The publication of the OECD glossary presents an opportunity for the international development community to adopt the same RBM language among themselves and especially with their developing country partners.

Successful implementation of RBM is dependent on the organisation's ability to create a management culture that is focussed on results. It requires more than the adoption of new administrative and operational systems. An emphasis on outcomes requires first and foremost a results-oriented management culture that will support and encourage the use of the new management approaches. The public sector traditionally has had an administrative culture that emphasises the management and measurement of inputs, activities and outputs, whereas a results-oriented culture is focussed on managing for the achievement of outcomes. This means that organisations have to establish a set of desired values and behaviours, and take actions to foster these while avoiding the undesirable ones, e.g., low-balling targets, inflating results, etc. The greater the difference between the existing culture and a results-oriented culture, the more effort the change will require.
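The causal sequence behind the results chain can be made concrete in code. The sketch below is purely illustrative (the class and level names are the author's OECD terms, but the structure and example statements are assumptions, not any agency's actual system); it shows how the rule that RBM terms "cannot be used interchangeably, nor out of sequence" could be enforced:

```python
from dataclasses import dataclass, field
from typing import List

# The OECD causal sequence: inputs -> activities -> outputs -> outcomes -> impacts.
CHAIN_ORDER = ["input", "activity", "output", "outcome", "impact"]

@dataclass
class ChainElement:
    level: str      # one of CHAIN_ORDER
    statement: str  # a results statement, e.g. "500 primary teachers trained"

@dataclass
class ResultsChain:
    elements: List[ChainElement] = field(default_factory=list)

    def add(self, level: str, statement: str) -> None:
        """Reject unknown levels and out-of-sequence additions, mirroring
        the rule that chain terms cannot be used out of causal order."""
        if level not in CHAIN_ORDER:
            raise ValueError(f"unknown level: {level}")
        if self.elements:
            prev = CHAIN_ORDER.index(self.elements[-1].level)
            if CHAIN_ORDER.index(level) < prev:
                raise ValueError(f"{level} cannot follow {self.elements[-1].level}")
        self.elements.append(ChainElement(level, statement))

# Hypothetical teacher-training intervention, for illustration only.
chain = ResultsChain()
chain.add("input", "Funds and technical assistance for teacher training")
chain.add("activity", "Deliver in-service training workshops")
chain.add("output", "500 primary teachers trained")
chain.add("outcome", "Improved classroom teaching practices")
chain.add("impact", "Higher primary completion rates")
```

Attempting `chain.add("output", ...)` after the impact raises a `ValueError`, just as an output stated "after" an impact would violate the causal logic of the chain.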
For example, it would take a well planned and funded change management programme to transform the many public sector organisations that have a hierarchical, control-and-compliance management culture into learning organisations that use performance information for management decision-making. The former requires public managers to be familiar with and apply the appropriate laws, regulations and procedures, while the latter requires managers to diagnose problems, design solutions and develop adaptive implementation approaches. Organisational change of this magnitude is difficult and time consuming, but necessary in order to create the enabling environment in which RBM can be effectively implemented [10].

[10] For more details see Implementing Results-Based Management: Lessons from the Literature. A research paper prepared by RBMG for the Office of the Auditor General of Canada (OAG 2000).

3.3 Key Principles for Public Sector Organisations

3.3.1 Partnership

In development co-operation, RBM has to be built on mutually beneficial partnership relationships based on trust among the stakeholders involved in a development intervention. It is expected that national authorities exercise leadership and ownership, while donor agencies play an actively supportive role. The extent of beneficiary participation as partners in the development intervention is encouraged but will vary according to circumstances and the prevalence of good governance and democracy.

3.3.2 Accountability

Determining accountability should take into consideration the nature of the partnerships in the development intervention. Where strong partnerships are present, a development intervention starts with shared performance expectations, continues with shared management decision-making and leads eventually to shared accountability. Under these ideal circumstances, national authorities and donor agencies could assume shared accountability for development results when reporting to their respective constituencies. However, there is considerable variability among donor countries regarding accountability requirements for ODA funding, nor are all development interventions alike. The characteristics of the funding arrangements, e.g., direct budget support, basket funding (e.g., SWAps and funds), donor-directed project interventions, etc., can also determine the degree of control over financial resources that each party exercises and thus their level of accountability. Other key factors include the number of partners involved, the extent of their participation in the design and planning of the development intervention and the degree of management decision-making authority they can exercise over implementation decisions. Establishing the appropriate level of accountability vis-à-vis the results chain is thus context dependent. As a general principle, the more control and ownership the national authorities, or the donor agencies, have over the development intervention, the greater the potential for demonstrating attribution and thus assuming accountability for development results, i.e., outcomes and impacts.

3.3.3 Organisational Learning

Organisational learning is the motivation behind the adoption of the RBM approach by highly effective organisations.
RBM facilitates organisational learning by channelling performance information to decision-makers through nested feedback loops from continuous performance monitoring, evaluation and audit activities. This creates opportunities for learning at the individual, group and system level to continuously transform the organisation or development intervention in a direction that is increasingly satisfying to its stakeholders. Management decision-making processes can then be informed by valid and reliable performance information, resulting in greater efficiency and effectiveness.

3.3.4 Transparency

Transparency is necessary to ensure that the benefits of the above RBM principles are fully realised. Clarity is needed in defining the respective roles and responsibilities of partners for the development intervention, and specifically for the implementation of the RBM approach. Appropriate disclosure of the methodologies used to collect valid and reliable performance indicator data is critical to fulfilling partners' accountability obligations for reporting to their respective constituencies. Broad dissemination and active discussion of performance information, including progress made toward the achievement of outcomes, lessons learned and proposed adjustments, will enhance country ownership and organisational learning. The RBM approach is significantly weakened in an environment that lacks transparency.
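The performance information that feeds these monitoring and transparency loops typically boils down to comparing an indicator's observed value against its baseline and target. A minimal sketch, assuming a simple numeric indicator (the field names and the example figures are hypothetical, not from any agency's measurement framework):

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    baseline: float  # value observed when the intervention started
    target: float    # value expected by the end of the reporting period
    actual: float    # value observed during monitoring

    def progress(self) -> float:
        """Share of the planned baseline-to-target change achieved so far."""
        planned = self.target - self.baseline
        if planned == 0:
            return 1.0  # no change was planned, so nothing is outstanding
        return (self.actual - self.baseline) / planned

# Hypothetical indicator for illustration: enrolment rose from 62% to 67%
# against a target of 72%, i.e. half of the planned change was achieved.
girls_enrolment = Indicator("Net primary enrolment rate, girls (%)",
                            baseline=62.0, target=72.0, actual=67.0)
print(f"{girls_enrolment.name}: {girls_enrolment.progress():.0%} of planned change")
```

Disclosing how `baseline`, `target` and `actual` were measured, as the transparency principle requires, is what makes such a figure credible to the partners who receive it.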
3.3.5 Simplicity

Simplicity is one of the keys to successful RBM implementation. Rather than focussing on a core set of expected results, the tendency has been to design complex results chains with numerous and finely differentiated outputs, outcomes and impacts. Of course, this increases exponentially the number of performance indicators required to produce reliable performance information. Before too long, data collection, analysis and reporting have absorbed a disproportionate amount of resources and management attention. The key is to keep the results and indicators to the vital few for continuous monitoring across the entire results chain, from inputs through outputs to outcomes and impacts, and to transform evaluation and internal audits into learning opportunities. So, try to keep it simple, but no simpler than it is.

3.3.6 Flexible and Iterative Application

Like any good management strategy, RBM should be sufficiently flexible to be applied in an iterative manner in a broad range of circumstances. Governments, donor agencies and civil society organisations have applied RBM in order to satisfy their unique organisational needs. In international co-operation, the RBM experience has been mostly at the project level, but increasingly with programme-based approaches, e.g., sector-wide adjustment programmes, basket funding, etc. Donor agencies have moved from blue-print to iterative approaches in designing development interventions. These types of interventions present new challenges that require flexible management approaches, while upholding the other RBM principles.

3.4 RBM Management Cycle

There are six steps to managing for results that constitute the RBM management cycle (Figure 1). For those who are unfamiliar with the RBM tools and techniques, more details describing this step-by-step approach are presented in Annex A.
Step one involves establishing the development intervention profile:
- Review mandate and objectives;
- Conduct stakeholder analysis;
- Determine governance structure, roles and responsibilities.

[Figure 1. Step-by-Step Approach: Development Intervention Mandate/Profile → Build Results-Based Logic Model → Risk Management Strategy → Performance Review Strategy → Data Collection, Analysis & Reporting → Performance Appraisal and Adjustments]

Step two involves building a results-based logic model with stakeholders:
- Determine appropriate stakeholder participation;
- Understand the results chain and articulate expected results;
- Answer the key questions: Why? What do we want? For whom? And how?;
- Use a logic model to illustrate causality.
Step three involves developing a risk management plan:
- Identify the underlying assumptions in the logic model;
- Conduct a risk analysis of the assumptions;
- Elaborate risk mitigation strategies where needed.

Step four involves preparing a performance review plan:
- Determine performance measurement, management audit and evaluation requirements;
- Select performance indicators and complete a performance measurement plan;
- Estimate performance review costs.

Step five involves measuring performance and reporting:
- Develop data collection instruments and systems;
- Establish baseline data and then set performance targets;
- Collect and analyse performance and risk data;
- Fulfil internal and external reporting requirements.

Step six involves stakeholders in the appraisal of performance information:
- Diagnose performance shortcomings;
- Design and develop solutions;
- Use performance information for organisational learning.

The RBM management cycle is complete when adjustments are made to annual implementation plans to improve effectiveness based on credible performance information.

4.0 RBM: WHAT IT'S NOT?

4.1 Continuous Evaluation

One of the popular misconceptions is that RBM and continuous evaluation, also known as performance measurement, are synonymous. It is true that the continuous process of collecting and analysing data to compare current performance with what was expected is an important component of the RBM approach. However, RBM is a much broader management strategy that incorporates aspects of strategic planning, risk management, monitoring, evaluation and even audit. This popular misconception has led some people to believe that RBM is designed to increase compliance and control in decentralised management environments.

4.2 Compliance and Controls

Government services have become increasingly decentralised with privatisation, outsourcing and alternative service delivery.
The same applies to international development interventions, with greater decision-making responsibility being delegated to in-country offices, increased contracting with local implementing organisations and direct funding to government bodies and agencies. The often advanced argument is that with delegated decision-making authority there must be increased accountability, not just for the stewardship of funds, but also for the achievement of results. When this rationale is applied to the individual, we observe a number of distortions in how RBM is implemented. In many cases, individual performance appraisals or incentives are linked to the achievement of short- and medium-term outcomes through the performance measurement system. This becomes problematic because the performance measurement system becomes the instrument by which senior management ensures compliance and exercises control over front-line managers under the guise of accountability. However, as we know, external factors can play havoc with the best laid plans despite the best efforts of front-line managers. Therefore, holding individuals accountable for the achievement of outcomes is unreasonable and unrealistic. A more appropriate RBM strategy would be to base individual performance appraisals on demonstrated skills in diagnosing problems, designing solutions and developing adaptive implementation plans [11].

4.3 Performance-Based Budgeting

Another popular misconception is that RBM and performance-based budgeting (PBB) are synonymous. Historically, PBB is described as directly linking performance levels to the budgeting process and allocating resources among competing programmes based on cost-effectiveness measures. It was believed that integrating performance information into budgetary decision-making and management practices would create incentives for improvement. Some have gone as far as to say that performance measurement in and of itself is not a strong incentive for improvement unless it is connected to budgetary decision-making. Many refer to the PBB approach as "managing by results". However, there is little evidence of success with PBB as it has been defined above. An explanation for this is simply that most organisations did not have a sufficiently well developed RBM system in place before they attempted to link performance levels with budgetary allocations. Consequently, performance-based budgeting may not be an appropriate adjunct to RBM in the field of international development.
However, performance information about results achievement should be taken into consideration as one among many other factors when budgetary allocations are determined.

4.4 Performance Reporting

Performance reporting is not the main purpose of RBM, just a secondary by-product of good results-oriented management. It is unfortunate that many organisations have attempted to introduce RBM through the back door, like some unwelcome visitor. In government, performance reporting requirements are sometimes the only point of leverage that central agencies have over ministries and departments. Similarly, donor agencies have exercised the same leverage over executing agencies and even developing country partners. Reporting requirements are changed to document results achievements at the output, outcome and impact levels, instead of the usual information about financial disbursements and activities accomplished. Little thought is given to the enormous capacity building effort needed to bring about the changes in management culture and practices that are required to generate credible and useful performance information. It is not surprising, then, that many people think RBM is just an exercise in performance reporting, rather than an essential component of good governance and democratic development programming.

[11] For more details see Zapico-Goni and Mayne (1997), Performance Monitoring in the Public Sector: Future Directions from International Experience.
ANNEX A: SIX BASIC STEPS TO MANAGING FOR RESULTS

1.0 Step One: Establish a Development Intervention Profile

1.1 Review Mandate and Objectives

Every development intervention should have an approved mandate, usually a clear statement of the intervention's purpose as it relates to the country's national development plans. Where a national Poverty Reduction Strategy exists, there should also be a tight link between its stated objectives and those of the development intervention. In this way, a coherent national policy framework for poverty reduction can be established for all development interventions, irrespective of the donor agencies involved.

1.2 Conduct Stakeholder Analysis

Development interventions may involve many stakeholders with different roles, perspectives and management information needs. An important task in preparing a profile is to develop a precise understanding of who is involved and how they relate to one another. Stakeholder mapping is a popular technique often used to illustrate and analyse the relationships among the various stakeholders involved in the intervention. A common and relatively easy distinction can be made between co-delivery partners and beneficiaries.

Co-Delivery Partners

In the first instance, the national government may partner with one or more donor agencies, non-governmental organisations (NGOs), research institutions, or private sector firms in order to implement a development intervention. Within the national government there may also be different Ministries, or provincial and local government bodies, with different concerns and interests with regard to their role in the implementation of the intervention.

Beneficiaries

In the second instance, a distinction can also be made between direct and indirect beneficiaries.
The direct beneficiaries are the users of the goods and services (outputs) produced by the co-deliverers, while the indirect beneficiaries are generally one step removed from the activities of the development intervention. Capacity-building interventions illustrate this principle quite well, in that an organisation is strengthened (direct beneficiary) so that it can provide better services to its clientele (indirect beneficiaries). In such cases, the roles of co-delivery partner and direct beneficiary will overlap.

1.3 Determine Governance, Roles and Responsibilities

A sound governance structure is one of the most important elements of the profile because many interventions involve multiple partners within and outside the national government structures, thus necessitating an oversight mechanism to ensure co-ordination. A graphic illustration of the management accountability structure is usually helpful when combined with a bulleted narrative text that describes the roles and responsibilities of key committees and partners. Since governance can be an area of considerable sensitivity and disagreement among the partners involved, it is advisable to hold individual and group consultations in order to build a consensus around critical decision-making parameters and accountabilities.
2.0 Step Two: Build a Results-Based Logic Model

The second step in managing for results involves building a results-based logic model that captures the performance vision of the partners in a development intervention. In this section we discuss both the process and the technical issues, including who should be involved, understanding the main components, using a participatory consensus-building approach, validating the logical cause-effect relationships, and undertaking activity-based costing. The underlying principle of our approach to this chapter is that the process is as important as the technical quality of the logic model.

2.1 Determine Appropriate Stakeholder Participation

Based on past experience and lessons learned from the implementation of RBM, the approach works best when a representative group of stakeholders¹ is involved in building the logic model, as well as in the subsequent steps of the RBM management cycle. The benefits of stakeholder participation include:

- expands the knowledge and information base;
- enhances the technical quality of logic models;
- ensures that the needs of beneficiaries are considered;
- strengthens accountability for the achievement of results;
- enhances commitment to ongoing performance measurement.

2.2 Understand the Results Chain

The basic logic model adopted by most organisations around the world, presented as Figure 2, is a graphic illustration of a series of cause-effect relationships from the investment of resources through to outcome achievement and impact.

Figure 2. Results Chain (inputs/resources and activities lie within the organisation's area of control; reach to direct beneficiaries, short-term (direct) and medium-term (indirect) outcomes, and long-term impact lie in its area of influence, external to the organisation and subject to external factors; efficiency and effectiveness describe these linkages)

These cause-effect linkages can be expressed with "if-then" phrases, representing the internal logic of any intervention. For example: "if" the resources are invested and the inputs mobilised, "then" the activities can be implemented; "if" the activities are implemented, "then" the outputs will be produced; "if" the outputs are produced and used by the direct beneficiaries as expected, "then" we should achieve the short-term outcomes; "if" the short-term outcomes are achieved, "then" we should achieve the medium-term outcomes, and so on. The basic logic model thereby provides the form, while the key stakeholders provide the content based on their past experience, research knowledge and understanding of the development intervention's mandate and objectives.

¹ The term stakeholders includes co-delivery partners as well as direct and indirect beneficiaries.
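The "if-then" chain described above can be sketched as a simple ordered structure. This is a hypothetical illustration, not part of the paper: the stage names follow Figure 2, and the helper function shows how one might trace where the internal logic of an intervention breaks down.

```python
# A minimal sketch of the results chain as an ordered series of
# cause-effect ("if-then") links. The helper walks the chain and
# reports the first link whose cause was achieved but whose effect
# was not, i.e., where the internal logic broke down.

RESULTS_CHAIN = [
    "inputs", "activities", "outputs",
    "short-term outcomes", "medium-term outcomes", "long-term impact",
]

def first_broken_link(achieved):
    """Return the first (cause, effect) pair where the effect failed."""
    for cause, effect in zip(RESULTS_CHAIN, RESULTS_CHAIN[1:]):
        if cause in achieved and effect not in achieved:
            return (cause, effect)
    return None

first_broken_link({"inputs", "activities"})  # ("activities", "outputs")
```

Walking the chain link by link mirrors the "if-then" reasoning stakeholders apply when validating a logic model.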
2.3 Answer the Key Questions

While there are several techniques used to develop logic models, we suggest a stakeholder working session facilitated by an evaluation specialist who will ask the questions illustrated in Figure 3. Reading the graphic from right to left, the first question to ask is "Why should we do this intervention?" The next questions are "What results (effects) do we expect from this investment?" and "Who are the beneficiaries of these results?" These questions should be answered in terms of expected outcomes that respond to the identified needs of the intended beneficiaries. The remaining questions can then be posed as to "How?" the intervention should be undertaken and "Who to work with?" In this way, the content of the logic model becomes demand-driven, by asking the "Why?" and "What?" questions before deciding on the "How?" All too often the reverse is true, which is typical of a "supply or activity driven" development process that doesn't meet the needs of the intended beneficiaries. However, once a draft of the logic model is completed, several iterations will be needed, moving back and forth to verify the cause-effect relationships, eliminate duplication and refine the result statements.

Figure 3. Answer the Key Questions (reading right to left: Why? long-term impacts for society; Who benefits? medium-term outcomes for indirect beneficiaries and short-term outcomes for direct beneficiaries; What do we want?; Who do we work with? project delivery partners; How? inputs/activities)

2.4 Use a Logic Model

While there are several types of results-based logic models, we recommend the format illustrated in Figure 4. The building of a logic model typically proceeds more smoothly when stakeholders focus on the core elements of the development intervention.

Figure 4. Results-Based Logic Model (mandate, e.g., to eradicate extreme poverty and hunger; objectives, e.g., to address the priority needs of the targeted direct and indirect beneficiaries; program activity sets producing internal results; reach to direct beneficiary groups; and short-term, medium-term and long-term external results)

There is only one basic rule with respect to how outcome statements are articulated. They are generally stated in the past tense, beginning with an active verb, e.g., increased, reduced, maintained, improved, enhanced, etc., so as to denote future expectations about change. The final task is to validate the model's internal logic. At the appropriate time during the working session, stakeholders should be asked to identify with arrows the cause-effect linkages from activities through to the long-term impacts.
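The articulation rule above lends itself to a quick automated check. The sketch below is hypothetical; the list of change verbs simply extends the examples given in the text.

```python
# A hypothetical check of the one articulation rule for outcome
# statements: begin with a change verb stated in the past tense.
# The verb list is illustrative, drawn from the examples in the text.

CHANGE_VERBS = ("increased", "reduced", "maintained",
                "improved", "enhanced", "strengthened")

def follows_rule(statement):
    """True if the outcome statement begins with a past-tense change verb."""
    return statement.strip().lower().startswith(CHANGE_VERBS)

follows_rule("Increased access to safe drinking water")    # True
follows_rule("Provide training to district health staff")  # False
```

A facilitator could run candidate outcome statements through such a filter before the validation exercise, catching activity-style phrasing early.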
Building a logic model is no easy task for any group of stakeholders involved in a development intervention. Development interventions defy simple standardisation and do not lend themselves easily to cookie-cutter approaches. Because of the differing circumstances from one intervention to the next, it is impossible to establish standard outcome statements. It is best to rely on the consensus of informed key stakeholder groups, working in partnership, to develop a realistic logic model.

3.0 Step Three: Develop a Risk Management Plan

Since development interventions are not implemented in a controlled environment, external factors can often be the cause of their failure. At the intervention design stage, the necessary conditions for success must be identified, risks analysed and mitigation strategies put in place as required. Most managers will find this third step useful in the context of dynamic and even volatile operating environments.

3.1 Identify Assumptions/Areas of Risk

The development of a logic model provides a solid basis upon which to build the risk management strategy. Examining the assumptions underlying the design of the development intervention identifies areas of risk. Assumptions describe the conditions that must exist if the cause-and-effect relationships are to behave as expected, i.e., from resources to activities, outputs, reach and outcomes. This conditional logic begins with some initial assumptions about the necessary preconditions for implementation start-up and continues across the logic model. For example, "if" the funding is available from external resources, "then" the inputs can be mobilised and the activities undertaken.

Figure 5. Identifying Assumptions (initial assumptions precede inputs/activities; internal risk areas concern program management; further assumptions underlie each link through short-term and medium-term outcomes to long-term impacts, which lie in external risk areas)
"If" the activities are delivered, "and" provided that the assumptions about the factors affecting the activity-output relationships hold true, "then" the outputs should be achievable. "If" the outputs are produced, "and" provided that the assumptions about the external factors affecting the outputs-outcomes relationships hold true, "then" the short term outcomes should be achievable. In short, if the assumptions hold true, the necessary conditions for success exist. However, program managers should assess the risks if these assumptions were not to hold true. Draft RBM Discussion Paper Page 15
Internal Areas of Risk:
- Governance structures
- Accountability and transparency
- Policies, procedures and processes
- Risk management culture
- Human resource capacity

External Areas of Risk:
- Political: the influence of government bodies
- Economic: international and national markets
- Social: demographic and social trends
- Technological: new communications and scientific technologies

3.2 Conduct Risk Analysis

Risk analysis involves an examination of internal and external risk areas (RA) to determine the probability or likelihood that the assumptions will not hold true and/or that events could occur that would compromise performance. This is combined with an analysis of the potential effect that these occurrences could have on the initiative's credibility and/or achievement of outcomes. This risk analysis is often based on the experience, insight and intuition of stakeholders and takes into consideration all existing risk mitigation strategies.

Figure 6. Conduct Risk Analysis (a matrix plotting likelihood (low = 1, medium = 2, high = 3) against effect (minor = 1, moderate = 2, significant = 3); each internal or external risk area RA1-RA4 is plotted in a cell, and the corresponding risk management actions range from "accept risks" and "accept, but monitor risks" through "manage and monitor risks" and "risk management required" to "extensive risk management essential", with an acceptable risk boundary separating tolerable from intolerable risks)

The final assessment for each risk area (see RA1-4) is plotted on the graph shown in Figure 6, and a level of risk is assigned to each based on the combination of likelihood and effect, i.e., Low (1-3), Moderate (4-6), High (7-9). Judging whether the level of risk is acceptable or not is a function of risk tolerance. If the risk is unacceptable, then program managers are advised to elaborate and implement suitable risk mitigation strategies.
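The scoring scheme just described can be sketched in a few lines. The risk areas and their ratings below are hypothetical examples, not taken from the paper; only the 1-3 scales and level thresholds follow the text.

```python
# A minimal sketch of the risk analysis above: each risk area is scored
# for likelihood and effect on 1-3 scales, and the product of the two
# determines the level of risk.

def risk_level(likelihood, effect):
    """Combine likelihood (1-3) and effect (1-3) into a risk level."""
    score = likelihood * effect
    if score <= 3:
        return "Low"
    elif score <= 6:
        return "Moderate"
    return "High"

risk_areas = {  # hypothetical (likelihood, effect) ratings
    "RA1: loss of a key national government partner": (2, 3),
    "RA2: sharp currency depreciation": (3, 2),
    "RA3: implementing partner staff turnover": (2, 1),
}
for area, (likelihood, effect) in risk_areas.items():
    print(area, "->", risk_level(likelihood, effect))
```

In practice the scores would come from the stakeholder session, but tabulating them this way makes the acceptable risk boundary explicit.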
3.3 Elaborate Risk Mitigation Strategies

Risk mitigation strategies should bring risk within acceptable levels for the partners by either reducing the probability of an undesirable event, e.g., the loss of a key national government partner, or limiting its effect on the development intervention. Risk management should be integrated with management decision-making. This can be done by defining risk tolerance levels and developing decision criteria and alternative plans for when assumptions do not hold true. Where internal risk areas are concerned, program managers should implement the necessary monitoring activities designed to lower risk at the operational level, e.g., financial and management audits, security procedures, public communication protocols, etc. Where external risk areas are concerned, risk mitigation strategies should be performance-oriented, used to enhance the achievement of expected outcomes or to provide alternate courses of action when assumptions about the external environment do not hold true. The regular monitoring of assumptions is of particular importance if pre-emptive action is to be taken and the effects of failed assumptions are to be mitigated. Partners should integrate risk mitigation strategies into the design of their interventions for those assumptions with the highest risk rating. Since this will require the reallocation of resources from normal programming activities, it will be important to understand the acceptable level of risk tolerance.

3.4 Monitor Assumptions/Risk Areas

High-risk assumptions over which program managers have little control or influence should be carefully monitored during implementation. As time passes, the necessary conditions for success may change and immediate corrective action will have to be taken. In some cases, the use of risk indicators to monitor the status of these assumptions is recommended, particularly for very large, complex, innovative, or risky interventions where the potential benefits could outweigh the additional cost of data collection and analysis.

Figure 7. Monitoring Risk (risk indicators track the initial assumptions and the assumptions at each subsequent link, from inputs/activities through short-term and medium-term outcomes to long-term impacts; program management control decreases as one moves from internal to external risk areas)

Very simply, such a technique would involve a regular scanning of the environment in which the intervention is operating to determine whether the assumptions are holding true and the necessary conditions for success remain present. Reporting on internal and external risk areas, along with performance, enhances continuous learning and pro-active decision-making. Implementing a continuous learning approach in risk management will create incentives for innovation while respecting organisational risk tolerances. While not all risks can be foreseen or totally avoided, having a risk management plan is an important step toward managing for results.
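A regular environmental scan of this kind can be sketched as follows. The assumptions, indicator values and tolerance thresholds are hypothetical; only the idea of tracking assumptions with risk indicators comes from the text.

```python
# A hypothetical sketch of monitoring high-risk assumptions with risk
# indicators: each assumption carries a current indicator value and a
# tolerance threshold, and a periodic scan flags those that no longer
# hold so that pre-emptive or corrective action can be taken.

assumptions = [
    # (assumption, current indicator value, acceptable threshold)
    ("Annual inflation remains manageable", 0.04, 0.10),
    ("Partner ministry staff turnover stays low", 0.25, 0.20),
]

def scan(assumptions):
    """Return the assumptions whose indicator exceeds its threshold."""
    return [name for name, value, limit in assumptions if value > limit]

print(scan(assumptions))  # flags the staff-turnover assumption
```

Repeating such a scan on each reporting cycle is one way to make the monitoring of assumptions routine rather than ad hoc.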
4.0 Step Four: Develop a Performance Review Plan

Step four in managing for results will assist in developing a performance review plan that will generate evidence-based information in order to answer the following question: "Are we achieving the outcomes expected for the targeted beneficiaries at a reasonable cost?" The foundation for this strategy begins with the set of expected outcomes illustrated in the logic model developed in Step 2, and a plan for reviewing performance against stated expectations. The plan should be designed to support efforts to manage for results, as well as to fulfil all requirements for accountability and performance reporting.
4.1 Describe Overall Approach

The overall approach to performance review should take advantage of the latest results-based management (RBM) techniques and tools in combination with traditional monitoring approaches. In the short term, program managers should develop a performance measurement system that will generate a timely flow of performance information to support ongoing management decision-making. In the medium and longer term, the more traditional performance review techniques, such as audits and formative and summative evaluations, should be planned so as to inform management of any problem areas, outcome achievement, and/or major adjustments that may be required. When used in combination, performance measurement, audit and evaluation can serve as effective means to monitor the performance of any intervention throughout its entire life-cycle.

4.2 Determine Performance Measurement Requirements

Performance measurement consists of an ongoing process of collecting data on performance indicators. A well-designed performance measurement system is customised to respond to the performance information needs of stakeholders. When performance measurement activities are undertaken on a continuous basis during implementation, they empower partners with timely information about the use of resources, the extent of reach and the achievement of results. Performance information should inform program managers about output, short-term and medium-term outcome achievement, as well as help to identify programming strengths and weaknesses. Consequently, it enhances learning and improves management decision-making. Long-term impacts are generally not included in performance measurement activities because they are typically difficult and costly to measure on an ongoing basis. Similarly, performance measurement will not address the questions of how or why results were, or were not, achieved.
For a complete performance story, these and other questions should be asked within the context of an evaluation.

4.3 Determine Audit Requirements

Audits can also be included as part of the performance review and risk management plans. They can provide reasonable assurance as to whether recipients are submitting accurate and complete financial reports and whether payments are managed in accordance with stated policies. Aside from the traditional financial and management audits, consideration should be given to undertaking risk or performance audits as well. The former can assist managers in monitoring risks, while the latter are designed to verify the validity and reliability of the information contained in performance reports.

4.4 Determine Evaluation Requirements

Evaluations are another important part of the performance review plan. Formative and summative evaluations are often required by donor agencies for grant contributions. In a multi-donor intervention, partners should have greater flexibility in planning how, when and what is to be evaluated in accordance with their particular performance information needs.

Formative evaluations have primarily a learning function and are, consequently, undertaken when a development intervention reaches the mid-point of its life-cycle, to allow time to take corrective action when required. The focus is generally on management issues, e.g., how the intervention is being implemented, whether risk is being managed and, especially, whether the performance measurement system is generating valid and reliable performance data. It is also an opportune time to determine if progress is being made toward the achievement of outputs and short-term outcomes. Risk management considerations can also be taken into account when prioritising the most important evaluation issues to address as part of a formative evaluation.

Summative evaluations have primarily an accountability function and are, consequently, undertaken when an intervention reaches the end of its life-cycle, or sometime thereafter, so as to allow time for the achievement of medium-term outcomes and impacts. The standard DAC evaluation criteria should be used to bring focus and structure to summative evaluations.

It is expected that the formative and summative evaluations will generate findings and answers to the questions raised for each of the evaluation issues identified. It is also expected that lessons learned will be derived from the evaluation findings and that recommendations will be made to guide future design and implementation. Program managers should prepare an evaluation framework that identifies the specific evaluation questions to be asked and the data to be collected, either as part of the performance measurement strategy or at the time of the evaluation.

4.5 Select Performance Indicators

Performance indicators are either qualitative or quantitative measures of resource use, extent of reach, output production and results achievement. Quantitative indicators are statistical measures such as numbers, frequencies, percentiles, ratios, variances, etc. Qualitative indicators are judgement and perception measures of congruence with established standards, the presence or absence of specific conditions, the extent and quality of participation, or the level of beneficiary satisfaction, etc.
There are six criteria that should be considered when selecting performance indicators. Each one is presented below along with an illustrative question by way of explanation.

1. Validity - Does it measure the result?
2. Reliability - Is it a consistent measure over time?
3. Sensitivity - When the result changes, will it be sensitive to those changes?
4. Simplicity - Will it be easy to collect and analyse the information?
5. Utility - Will the information be useful for decision-making and learning?
6. Affordability - Can the program/project afford to collect the information?

Selecting performance indicators involves at least one facilitated session with stakeholders. It is important that they agree a priori on the indicators that will be used to measure performance, or to answer specific evaluation questions. Begin the process by preparing a comprehensive list of indicators for each result statement or evaluation question. Then decide how many indicators are needed and apply the selection criteria to the list. Indicators that do not meet the criteria should be discarded. The best performance indicators among those remaining should be used and the rest kept in a reserve pool. Selecting good performance indicators is a trial-and-error process that improves over several cycles of data collection, analysis and appraisal. Some indicators may, after some use, prove not to generate useful information and must then be replaced.
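The selection process above can be sketched as a simple screen. The criteria names come from the text; the candidate indicators and the result statement are hypothetical, and the pass/fail filter stands in for what is, in practice, a facilitated group judgement.

```python
# A simplified sketch of indicator selection: each candidate is marked
# with the criteria it satisfies; candidates failing any criterion are
# discarded, the first few survivors are selected, and the rest are
# kept in a reserve pool.

CRITERIA = {"validity", "reliability", "sensitivity",
            "simplicity", "utility", "affordability"}

def screen(candidates, needed):
    """Split candidates into (selected, reserve), discarding failures."""
    passing = [name for name, met in candidates.items() if met >= CRITERIA]
    return passing[:needed], passing[needed:]

candidates = {  # hypothetical indicators for "increased access to safe water"
    "% of households within 500 m of a safe water point": CRITERIA,
    "number of community meetings held": CRITERIA - {"validity"},
    "litres of safe water consumed per person per day": CRITERIA,
}
selected, reserve = screen(candidates, needed=1)
```

The second candidate is dropped because it fails the validity test: counting meetings does not measure access to water.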
4.6 Complete Performance Measurement Tables

A realistic plan for the collection of data, for either performance measurement or evaluation purposes, is a necessary part of the performance review plan. Completing a set of Performance Measurement Tables serves this purpose. The basic principle underlying these tables is that performance must be monitored by collecting data on selected performance indicators. These summary tables lay out in a matrix format what is to be measured, the performance indicators to be used, how data should be collected, from whom, when and how frequently.

Figure 8. Performance Measurement Tables (rows follow the results chain from inputs and activities through outputs and short- and medium-term outcomes to long-term impacts; columns list the performance indicators, data sources, data collection methods, frequency and responsibility)

It is important to note that some data required for formative or summative evaluation purposes can be collected on an ongoing basis as part of the performance measurement system. Consequently, these tables are designed to allow for the identification of indicators that will be used for both performance measurement and evaluation purposes. Finally, the overall responsibility for data collection and analysis generally rests with the program manager of the intervention, but it can also be delegated to staff, delivery partners and other stakeholders as required.

4.7 Estimate Performance Review Costs

The costs associated with implementing the Performance Review Plan should be calculated and allocated to the appropriate cost centres.
Since performance measurement is the program manager's responsibility, the incremental costs will have to be budgeted as part of the intervention, while the costs associated with the conduct of audits and evaluations may or may not have to be budgeted as part of the intervention, depending on the particular funding circumstances. In all cases, the estimated costs should be identified and the appropriate cost centres informed.

5.0 Step Five: Measure Performance and Report

In the context of managing for results, responsibility for ongoing performance measurement and reporting is shared among the partners. To fulfil this responsibility, partners must undertake a series of tasks and make a number of decisions to develop an operational performance measurement and reporting system. Step five outlines these key tasks, raises a number of issues and guides partners through the decision-making process.
5.1 Develop Data Collection Systems and Tools

By virtue of having completed the Performance Measurement Tables, the appropriate data sources and methods for collecting performance data will have been identified. Data collection systems and tools will now have to be developed. However, since there is normally no single data collection system that can satisfy all performance measurement needs, each one must be developed and maintained with great care. While the common saying "garbage in, garbage out" is often associated with electronic databases, it can also apply to financial accounting systems, administrative files and records management, etc. Ensuring that these systems contain credible and accessible administrative data will be critical to developing a good performance measurement system. Try to integrate data collection into the overall management practices and record-keeping activities of the co-delivery partners responsible. When collecting primary data, i.e., undertaking original evaluation research, any one of a variety of data collection instruments and tools will have to be developed.

5.2 Establish Baseline Data

Before performance can be measured, there must first be a reference point for each indicator; baseline data and/or benchmarks can serve that purpose. If reliable historical data on the performance indicators exists, then it should be used; otherwise, a set of baseline data can be collected at the first opportunity. To avoid commissioning a costly baseline study, consider using the data collected during the first performance measurement cycle as a baseline. This is also an excellent opportunity to test any new data collection instruments.

5.3 Set Performance Targets

Performance targets are the basis from which measurement takes place and improvement begins. Without them, it is not possible to know whether performance is improving or falling behind.
Targets for each indicator are established in relation to the baseline data and thereby set the expectations for performance over a fixed period of time. End-of-year performance targets are generally established as part of the annual work planning exercise.

Figure 9. Set Performance Targets and/or Standards (for each expected result, every indicator is assigned a baseline, end-of-year targets and an end-of-program target)

If an intervention is achieving its annual performance targets at both the output and short-term outcome levels, this is an indication that it is on track to achieve the medium-term outcomes by the end of the programming period. In short, using targets allows performance to be measured in relation to the starting point and the end point.
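The relationship between baseline, target and actual values can be illustrated with a small calculation. The indicator and figures below are hypothetical.

```python
# A hypothetical illustration of measuring performance against a
# target: progress is expressed as the fraction of the distance from
# the baseline to the end-of-program target achieved so far.

def progress(baseline, target, actual):
    """Fraction of the baseline-to-target distance achieved."""
    return (actual - baseline) / (target - baseline)

# e.g., immunisation coverage with a 40% baseline and an 80% target
print(progress(baseline=40, target=80, actual=60))  # 0.5, i.e., halfway
```

Expressing progress relative to the baseline, rather than in absolute terms, is what makes the end-of-year targets comparable across indicators.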
5.4 Collect and Analyse Performance and Risk Data

To the extent possible, performance and risk data should be collected and analysed at the same time, in order to assess actual performance in light of any risk information available about the status of assumptions or areas of risk. The performance of an intervention may very well have been compromised because one or more of the assumptions about the operating environment did not hold true, e.g., political support, implementation and absorptive capacity, beneficiary co-operation, economic growth, etc. By integrating the collection and analysis of performance and risk information into the performance measurement activities, partners will be able to better understand any performance shortcoming and take corrective action. Management should review performance and risk information annually to assess the progress made toward the achievement of expected outcomes.

Figure 10. Collect and Analyse Performance and Risk Data (performance indicators for short-term and medium-term outcomes are collected alongside risk indicators drawn from the external risk areas)

5.5 Design Performance Report Formats

The overall approach to performance reporting should be viewed as a logical extension of the performance review strategy. All internal and external reporting should become increasingly results-based, presenting summary information based on the systematic analysis of data collected on performance indicators. Consequently, current contribution agreements with implementing partners, as well as progress report guidelines and formats, may have to be retooled so as to align more closely with these new performance information requirements. As the flow of information becomes increasingly more factual, and less anecdotal, so should the content of the reports.

Quality standards for performance reports:
1. clear description of operating environment and strategies employed;
2. meaningful performance expectations identified;
3. performance accomplishments reported against expectations;
4. methods described for collecting valid and reliable performance information;
5. demonstrated capacity to learn and adapt.

5.6 Identify Internal and External Reporting Requirements

There are many potential users of the performance information that a development intervention will generate. A plan should be developed that identifies what types of reports need to be produced, how frequently, who should produce them and who the recipients are. It is sometimes helpful to begin with the internal reporting requirements, since such reports represent a valuable source of performance information. Included here would be progress and financial reports from co-delivery partners and contractors, as well as from any other partners and even beneficiaries. Audit, evaluation and special study reports would also be included.

Figure 11. Internal and External Reporting Requirements Table (columns: report title, summary description, frequency, originator, recipient(s))

External reporting requirements, i.e., to recipients outside the scope of the intervention's partners, should be agreed upon during the planning phase. Depending on the public profile that the intervention enjoys, consideration should be given to publishing an Annual Performance Report to be tabled with the national government or Parliament of the country and made available to the public. All internal and external reports should be included in the Schedule of Reporting Requirements.

6.0 Step Six: Appraise Performance and Make Adjustments

In an RBM context, program managers are expected to have the requisite skills to manage for results. They must have the aptitude and ability to diagnose performance shortcomings, as well as to design and implement the necessary solutions to improve performance. For some, this may represent a new skill set, moving away from compliance-and-control management to a more analytic and iterative management style.

6.1 Diagnose Performance Shortcomings

The usefulness of viewing performance measurement and evaluation as complementary performance review strategies was stressed earlier. Performance measurement is best suited to collecting routine financial, administrative and beneficiary data, all of which can provide insights into the achievement of outputs and short-term outcomes. A well-designed performance measurement system can also track intermediate outcome indicators through annual primary research activities and provide reasonably reliable indications of outcome achievement. When performance on one or more outputs and related outcomes does not meet expectations, this may signal that there is a problem in the cause-effect chain.
As previously mentioned, the analysis and interpretation can be considerably strengthened when this performance data is combined with risk data collected from periodic environmental scans. Since performance measurement per se will not be able to say much about cause-effect relations, periodic evaluation studies can also be used to sort out the contributing factors and provide the explanatory material to enhance the diagnosis of the problem. By tracing and examining the cause-effect links backwards from medium-term to short-term outcomes, then outputs and activities, there is a good chance of revealing where the integrity of the logic model has been compromised. Formative evaluations are best suited to examining management issues around the quality of activity implementation and output production, which generally arise in the first few years of implementation. Summative evaluations, on the other hand, can produce extensive information on any issues of attribution around the achievement of medium-term outcomes and impacts. Furthermore, evaluations and management audits could also be used to improve the diagnostic potential of a performance measurement system, either by having them generate relevant and useful performance indicators or by verifying the quality and accuracy of performance data. Partners should be well positioned to diagnose performance problems if armed with good performance and risk data, as well as in-depth periodic evaluations.

6.2 Design and Develop Solutions

Performance information can be used to design and develop solutions in three key ways: where outcomes are being achieved, actions can be taken to strengthen them; where progress is difficult, different approaches can be tried or activities added; and where activities and outputs are considered obsolete, they should be abandoned. Performance information should also be used to examine strategic trade-offs between resource use, beneficiary reach and the achievement of outcomes. Program managers should ask themselves the following questions:

Figure 12. Design and Develop Solutions for Optimal Program Performance (diagram elements: project inputs and activities; short-term and medium-term performance indicators; management decisions to allocate or reallocate resources. Caption: Managing for Results: program managers are expected to diagnose performance shortcomings, design new management solutions and develop those solutions to ensure optimal program performance.)

1. Can outcomes be improved given the allocated resources available?
2. Should beneficiary reach be increased or decreased for better outcomes?
3. Can resources be decreased or re-allocated to improve cost-effectiveness?
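The trade-offs behind these questions can be given a minimal illustrative sketch, here ranking program components by a crude cost-effectiveness measure that combines resource use, reach and outcome achievement. All component names, figures and the scoring formula are invented for illustration; real interventions would need a far richer appraisal.

```python
# Illustrative trade-off analysis between resource use, beneficiary reach
# and outcome achievement (step 6.2). All names and figures are hypothetical.
components = [
    # (name, cost, beneficiaries reached, outcome score achieved, 0-100)
    ("community outreach", 200_000,  5_000, 70),
    ("radio campaign",     150_000, 20_000, 40),
    ("training centre",    400_000,  1_000, 85),
]

def cost_effectiveness(cost, reach, outcome):
    # Crude measure: outcome points delivered, weighted by reach, per unit cost.
    return outcome * reach / cost

ranked = sorted(components,
                key=lambda c: cost_effectiveness(c[1], c[2], c[3]),
                reverse=True)

# The lowest-ranked component is a candidate for redesign or re-allocation.
print(ranked[-1][0])  # training centre
```

In this invented example the radio campaign scores highest and the training centre lowest, prompting exactly the question the paper poses: should resources be re-allocated, or should the low scorer's reach or design change?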
The answers to these questions will certainly depend on the unique circumstances of each development intervention, but in each case they require close examination of, and decisions about, the strategic trade-offs between resources, beneficiary reach and outcomes. This process of diagnosis, design and development will enhance organisational learning.

6.3 Use Performance Information for Organisational Learning

Managing for results is an iterative management approach: there is constant feedback to the planning and management process as results are assessed. Based on this constant feedback of performance information from audits, management reviews, performance measurement activities and evaluations, the inputs and activities can be modified and other implementation adjustments made. This corresponds to the two management functions of continuous performance measurement and iterative implementation. These two management functions are represented in Figure 13 by nested feedback/action loops: one for the collection of performance information, the other for the management decisions based on the analysis of that information. By managing for results, all of the development intervention partners can contribute to organisational learning and improved management.

Figure 13. Use Performance Information for Organisational Learning & Management (diagram elements: Internal Audit & Management Reviews; Formative & Summative Evaluation; Continuous Performance Measurement; Managing for results: planning for results, implementation, performance review, learning & action; Short-Term, Med-Term and Long-Term Outcomes/Impacts; Iterative Programme Implementation)
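The nested feedback/action loops of section 6.3 amount to an iterative measure-diagnose-adjust cycle: collect performance information each reporting period, analyse it, and feed adjustments back into implementation. A minimal sketch follows; the staffing rule, output target and figures are invented, not part of the paper.

```python
def manage_for_results(plan, measure, diagnose, adjust, periods=4):
    """One pass per reporting period: collect performance information,
    diagnose any shortcoming, and feed the adjustment back into the plan."""
    for period in range(periods):
        performance = measure(plan, period)  # continuous performance measurement
        problem = diagnose(performance)      # analysis of performance information
        if problem:
            plan = adjust(plan, problem)     # iterative implementation
    return plan

# Invented example: add staff whenever outputs fall short of a target of 100.
plan = {"staff": 5}
measure = lambda p, period: {"outputs": p["staff"] * 12}
diagnose = lambda perf: "output shortfall" if perf["outputs"] < 100 else None
adjust = lambda p, problem: {**p, "staff": p["staff"] + 2}

final = manage_for_results(plan, measure, diagnose, adjust)
print(final["staff"])  # 9: staffing rose each period until the target was met
```

The point of the sketch is structural, not numerical: the loop stops adjusting once expectations are met, mirroring the paper's distinction between continuous measurement and iterative implementation.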
