METHODOLOGICAL APPROACHES TO EVALUATE SUPPORT TO CAPACITY DEVELOPMENT. Synthesis Report. April 2014 Stein-Erik Kruse Kim Forss



Table of contents

1. INTRODUCTION
   1.1. Background
   1.2. Purpose and questions
   1.3. Methods
   1.4. Limitations
CHAPTER 2: CAPACITY DEVELOPMENT - AN EVOLVING CONCEPT
   2.1. Dimensions and levels of capacity development
   2.2. Analysis of concepts and definitions
CHAPTER 3: THEORY AND PRACTICE OF EVALUATING CAPACITY DEVELOPMENT
   3.1. Approaches to evaluation design
   3.2. The choice of models
   3.3. Indicators
   3.4. Methods
CHAPTER 4: CONCLUSIONS AND RECOMMENDATIONS
   4.1. Conclusions
   4.2. Recommendations
Annex 1: Terms of reference
Annex 2: List of evaluations and studies
Annex 3: Entry points to capacity development analysis
Annex 4: Use of methods and tools

1. INTRODUCTION

1.1. Background

This study is part of a Scandinavian initiative to evaluate capacity development as a strategy in development cooperation. In a Joint Evaluation of Capacity Development, Sida, Norad and Danida plan to conduct a number of coordinated studies and activities. The inception phase consists of three thematic studies, of which this is one, followed by portfolio reviews of aid from each of the three donors. The main evaluation phase will consist of three stand-alone but closely coordinated evaluations to be carried out in 2014 and 2015, followed by a synthesis report and joint dissemination of the results.

1.2. Purpose and questions

The purpose of this study 1 is to assist Danida/Norad/Sida staff and evaluators to choose the most appropriate methodological approaches for evaluating aid to capacity development. More specifically, the study identifies and assesses promising approaches applied in previous evaluations or other relevant studies of support to capacity development by:

- Identifying indicators for measuring institutional change used in results frameworks and monitoring systems integrated in capacity development strategies.
- Identifying indicators and other data sources used in evaluations.
- Identifying methods used to assess the effectiveness of capacity development interventions in evaluations.
- Discussing the different approaches with regard to their applicability and usefulness in an evaluation of support to capacity development in Scandinavian development cooperation.

The following questions guided the study:

- Which data sources and indicators are used to measure changes in institutional capacity, whether in evaluations or other relevant studies, or in results frameworks and monitoring systems integrated in capacity development strategies?
- Which conceptual models (such as theories of change) have been used in evaluations or other studies to assess effectiveness and efficiency?
- Which methods have been applied in evaluations to assess causal mechanisms (contribution/attribution/counterfactual)?
- How do evaluations of support to organisational development in public and private institutions in Scandinavian countries compare to evaluations in development cooperation with regard to indicators and methodology?
- What are the pros and cons of the data sources, indicators and methods identified above with regard to reflecting actual improvements of institutional capacity and addressing causal mechanisms, based on a realistic perspective of the availability of baseline and monitoring data, as well as the time and resources made available for most evaluations?
- Which other lessons are found in the literature consulted that may be of relevance to the joint Scandinavian evaluation of capacity development, for instance for the evaluation of dimensions of capacity development other than effectiveness and efficiency, such as relevance, sustainability and impact?

1 See Annex 1: Terms of Reference

1.3. Methods

Evaluations of capacity development are available from multilateral and bilateral donors, civil society organisations, and the public and private sector in developed countries. The majority of studies and evaluations assess capacity development as part of a broader programme, while only a few focus more exclusively on capacity development. It was not feasible or desirable to select a random, representative sample of such reports, partly because we did not know the universe of reports, but more importantly because we were looking for interesting and promising examples. Hence, our findings and conclusions are limited to our sample, and any general conclusions can be questioned. When we state that, for instance, four out of fifteen reports have a certain methodological characteristic, we refer only to our sample and not to a broader universe of reports.

The search for reports was carried out using databases in multilateral, bilateral and civil society organisations, existing networks on capacity development and expert advice. The original plan was to select a sample of ten to fifteen evaluations. In the end, we analysed fifteen; summaries are available in Annex 5. The choice of such a sample was based on the assumption that if the reports were methodologically sophisticated and well implemented, they would yield sufficient insights for the purpose of this study. The sample of fifteen reports should fulfil the following criteria:

- Reflect different aspects of institutional capacity building (individual, organisational and system-wide).
- Reflect experiences from different sectors, such as water and health, research and education, infrastructure, trade and private sector development, agriculture and forestry, public administration, etc. Our aim was not to cover every sector, but to avoid all the cases coming from one and the same sector.
- Reflect different methodological approaches as far as possible.

There were several types of reports to search for: (a) results frameworks including monitoring systems, (b) evaluation studies, (c) handbooks and guidelines for evaluating capacity development, and (d) general literature on capacity development. Evaluation reports were the main units of analysis in this study, since we assumed they have the most to contribute to lessons learned. The proof of general principles and methods emerges from actually doing evaluations of capacity development. The other types of documents provided a backdrop for the study and informed the analytical and concluding chapters.

The reports were reviewed and summarised by the two consultants. The following questions guided the reading and provided the structure for the case studies:

1. What is understood and defined as capacity development? What are the levels and dimensions? This is an important question since the choice of definition has methodological implications, whether it focuses on individual, organisational or contextual dimensions. Each dimension requires separate methods and performance indicators.
2. What is the analytical framework, if any, used in the assessment of capacity development? This is important because it covers and demonstrates issues pertaining to explicit/implicit theories of change and contribution/attribution.

3. How is capacity development operationalised? What are the dimensions and indicators of capacity development at output and outcome levels? This is crucial because it explains what CD is in practice and what should be measured.
4. What are the methods for collecting data and information, and what sources are used? The choice and combination of methods (document review, interviews, surveys and observation) explain how the evaluators have been working and where they have looked for data.

1.4. Limitations

The most difficult challenge was to find evaluations with promising and interesting methodological approaches. We may not have found the optimal sample of reports; there are certainly other reports that we could, and possibly should, have included. Most of the informants we contacted became hesitant when we asked them to nominate methodologically interesting reports. It was more common to complain about the lack of innovation, stating that most evaluations of capacity development used a standard mix of desk studies and interviews. Interesting reports seem to be interesting because of what they have to say about capacity development. Whether a report is interesting or not is, in retrospect, mostly a question of whether it has been useful in policy formulation, in supporting decisions, or whether people have learnt something from it. The methodology seems to play a limited role and, apparently, it is possible to produce insightful evaluations even with a mediocre choice of methods.

There is also a huge number of reports assessing capacity development directly or indirectly - too many for drawing a representative sample. Most of them treat capacity development as part of a broader programme and not as a separate thematic impact category, sometimes making it difficult to decide whether a report is an evaluation of capacity development or not. The search was based on agreed selection criteria, but the actual choice of reports was also determined by what we found and what was available.

However, based on our experience from other studies, we believe that the selected reports reflect both mainstream and more innovative methodological approaches. Another sample would most likely not have led to different major conclusions and recommendations, but broad general conclusions in this report should nevertheless be treated with caution.

CHAPTER 2: CAPACITY DEVELOPMENT - AN EVOLVING CONCEPT

2.1. Dimensions and levels of capacity development

Our first question in the analysis of the reports was: What is the understanding of capacity development and how is it defined? There is a paradox between the widespread agreement on the importance of capacity development on the one hand, and the fact that the term is widely disputed and quite problematic in the literature and in practice on the other. Many use the term as synonymous with organisational development; others focus on markets, legislation, or even norms, values and aspects of culture in general. In some reports, capacity development even becomes synonymous with development itself.

Since the aim of the study is to look for promising methodological examples for assessing capacity development, we need to know how the authors define the term - what the main dimensions and levels of capacity development are. There is a clear link between the object of study (CD) and the selection of methods, indicators and sources of information - form follows function. The understanding and definition of capacity development have significant methodological implications. Hence, we have established a framework for analysing capacity development in evaluations. This is obviously not easy to achieve, since there are a large number of concepts and definitions around. A review of donor literature and current use of terms for capacity development presents a relatively large number of alternatives, but the underlying processes - or rather the main concerns within capacity development - share important similarities, despite considerable semantic pluralism. The aim has not been to invent a new set of terms, but to stay as close as possible to what key and influential actors use. Most concepts refer to, and can be classified along, two variables:

- The intended level of intervention (from individual and organisational to sectoral and systemic levels).
- The type and composition of activities (from training and organisational development activities to establishing/enforcing rules and regulations for the effective functioning of sectors and systems).

This means that the concepts refer to particular sets of activities at various levels of society, but there are seldom clear-cut borders between levels or activities. The terms express direction and intention more than conceptual rigidity. Using these variables, one can thus distinguish between 2:

- Human resource development (training and education), which is concerned with how people are educated and trained, how knowledge and skills are transferred to individuals, competence built up, and people prepared for their current or future careers.
- Organisational development, which seeks to change and strengthen structures, processes and management systems in specific organisations in order to improve organisational performance. There are variations between theories and strategies, but

2 This classification was used in the MFA Evaluation Report "Institutional development in Norwegian Bilateral Assistance".

they have, in "pure" form, the following characteristics: (a) a focus on individual formal organisations and particularly their internal functioning, (b) less attention paid to external contextual influences on performance, (c) a concern with internal organisational changes, and (d) activities that include education, training and advice.
- Systems development, which is a broader concept than organisational development. In addition to a concern with human resources and the development of particular organisations, it includes an emphasis on linkages between organisations and the context or environment within which organisations operate and interact. It is useful to make a distinction between three system levels:
  - The network and linkages among organisations, which include the networks and contacts between organisations that facilitate or constrain the achievement of particular tasks.
  - The regulatory environment, referring to the policy environment of the public, private and civil sectors that constrains or facilitates organisational activities and affects their performance, including e.g. laws, regulations and policies.
  - The value framework, including the economic, social, cultural and political milieu in which organisations operate, and the extent to which conditions in this broader environment facilitate or constrain the functional capacity of organisations.

This description of capacity development underlines the interaction between micro (internal) and macro (external) factors determining how organisations translate their capacities into actual performance. Capacity development refers to change processes - located at any of these three levels - that improve the capacity of a social system to achieve its goals and objectives.

The question is whether the definition contains too many levels. There will be five columns to categorise aspects of capacity development. That is quite a lot, and it would seem desirable to reduce the number.

There is no doubt that human resource development is often an aspect of organisational development. Perhaps these two categories could be merged? When analysing organisational development, one would always include an analysis of the human resource development of the organisation. However, human resource development can be many things other than organisational development. Many projects are geared towards general human resource development, for example through school systems, vocational training and the like. Their impact will be analysed with other methods than those used for organisational development.

On the other hand, an analysis of organisational development would often include an analysis of the organisational environment. So perhaps the two categories of organisational development and networks/linkages could be merged? Possibly not, as the criteria for evaluating a network are normally quite different from those applied when evaluating an organisation. The health and well-being of a network is a distinctly different question from that of the health and well-being of any one organisation that takes part in it. To base an understanding of networks on the assessment of one organisation could be very misleading. A market system of competing and cooperating enterprises would look different if analysed from the point of view of an existing firm than it would if analysed on its own merits.

Yet another possibility would be to merge the two categories of legislation and norms/values. There is a conceptual connection between the two: the legislation in a society can be expected

to reflect the norms and values of the citizens. On the other hand, there are a number of norms and values that are not expressed in legislation, and some of the tension and dynamism in institutional development arises when the two are not the same. It is worth noting that an evaluator assessing legislative development would use entirely different methods than when assessing norms and values.

It is largely a matter of preference and intellectual temperament how many categories one should work with in the definition of capacity development. The table below illustrates our preference.

Table 1. A scheme for understanding capacity development

  Process dimension            Level                      Focus
  Human resource development   Individuals and groups     Competence, attitudes, behaviour
  Organisational development   Organisations              Structures, systems and processes
  System development           Networks and linkages      Patterns of collaboration
                               Regulatory environment     Policies, rules, legislation
                               Value framework            Cultural values, norms, politics

According to the Terms of Reference, this study should also look at evaluations of capacity development in fields other than development cooperation. In all the OECD countries there are examples of interventions that address human resources, organisational development and systems development, and these interventions are often evaluated. But there is no systematic use of the term capacity development such as there is in the field of development cooperation. While parts of the analysis may be similar, the concepts used in the OECD context are often quite different.
The differences are interesting and at least in part explained by the fact that in the OECD setting the focus is on continuous processes of institutional change rather than on projects, programmes and interventions. Hence, studies focus more directly on issues of personnel turnover, comparative salaries, career patterns, life-long learning and other substantive aspects of human resource development, rather than on project administration.

2.2. Analysis of concepts and definitions

The following table presents a summary of our analysis of the reports 3. It lists which levels of capacity development they cover, helping us to map the entry points used for capacity development. However, it was not sufficient to simply note whether a level or dimension was treated or not. Hence, three different modes of treating capacity development were distinguished. When an issue of capacity development had been treated fully, it means that several pages, perhaps a chapter, were used to describe a situation and to analyse shortcomings, achievements and options. When an issue had received some treatment, it has been mentioned, but there is not a fully argued case either for how that aspect has affected an operation or for what the impact has been. When there is no reference to that aspect of capacity development, the issue is not referred to at all.

3 The full analysis of the reports can be found in Annex 5.

Table 2. Summary of entry points to capacity development

(Columns: Human resource development, Organisational development, Networks and linkages, Laws and rules, Values and norms. Rows: Full treatment, Some notes, No analysis, Total. The cell counts are not reproduced in this transcription.)

There was consensus in almost all reports that capacity development is a broad, multidimensional process taking place at various levels of society. Most refer to and use three levels: the individual, the organisational and the societal (systemic). The World Bank report on capacity development in Africa (2005) was a typical example, analysing three dimensions of public sector capacity: human capacity, organisational capacity and institutional capacity, i.e. the formal rules of the game and informal norms. It is also a shared concern that enhanced capacity should be treated as a goal in its own right, not merely as a means for achieving other development objectives. The USAID evaluation of the Tatweer programme in Iraq is the only study that deliberately excluded systemic capacities and focused on individual and organisational capacities.

However, the focus in the majority of the studies was at the organisational level. In principle, a broad definition of capacity development was applied, opening up for an analysis of all three levels, but in practice organisational issues received most attention. Capacity development was to some extent synonymous with organisational development. We will illustrate later that most analytical tools and models were developed for carrying out organisational analysis. There were fewer methods and tools for assessing individual capacity, and even fewer for the three systemic dimensions. The five dimensions of capacity development overlap, but are also distinctly different: individual attitudes and behaviour, organisational structures, network linkages, legal systems and cultural norms.
Even if the reports refer to the multi-level characteristics of capacity development, there is hardly any discussion of the methodological implication that the methods for collecting data will have to differ at the individual, organisational and systemic levels. There is not one set of tools for assessing all aspects of capacity development.

It was also interesting to note that human resource development was covered less in many of the reports. This is the classical form of capacity development, including training and technical cooperation. Capacitated individuals are necessary building blocks and important outputs in most types of capacity development, but this level is perhaps perceived as too basic and unsophisticated when assessing capacity development. Organisations were the units of analysis, even though it is difficult to envisage the strengthening of organisations without including human knowledge, motivation and behaviour. The DFID study on technical cooperation covered individual capacity, but organisational capacity for economic management was selected as the focus for analysis. The concern for higher-level organisational outcomes seems to have replaced or suppressed a proper assessment of changes at the individual level.

An exception is the Swedish evaluation of the National Institute of Statistics in Cambodia, where much of the support has consisted of various forms of training programmes: formal and informal, courses, on-the-job training and the possibility to call on technical advice from

Statistics Sweden. The evaluation also discusses how the project needs to progress towards organisational capacities, i.e. structures and processes around the production of statistical information, for example the management of a census and quality control. But even though these levels were all present in the report, the focus lies on individual capacity building: it is through increasing the capacities of individuals to plan and implement data collection, analysis and reporting that the institutional capacities are increased. There was no theory of change on how the individual capacities get translated into organisational or systemic capacities.

Networks and linkages were not covered systematically in the reports either. This was to some extent surprising, given the intuitive importance of networks as part of capacity development. Partnership has also become a buzzword for most donors (from CSOs to development banks). The strategy of achieving results through partnerships and alliances with like-minded organisations is widely adopted. Networks were included in several analytical models as a dimension, such as "Managing relations across government" (Leading Change in the Department of Education, UK) and in most of the civil society assessments: "The core capability to relate and to attract resources and support" and "Ability to retain effective partnerships" (Capacity, Change and Performance and Organisational Review of Save the Children). However, internal organisational issues were prioritised. We could not find any explicit tools for analysing networks and linkages.

There was unanimous agreement that context matters - the wider institutional, political and external context within which the organisation being supported operates (DFID Evaluation of Technical Cooperation). The Dutch study also specified contextual influences: operating space; politics and power; formal institutional factors and cultural values; supply and demand; and perceived legitimacy.
However, in practice context received marginal attention. The contextual box was possibly too large: nobody disputed the importance of context, but when it includes everything that can potentially influence an organisation, the analysis easily becomes too broad and unwieldy. Most of the reports have a relatively brief and rather general contextual analysis stating the importance of external factors, but often without a substantive elaboration of what and how. There were, for instance, no political science frameworks or tools for preparing a targeted contextual analysis. The themes suggested in the Dutch study represent a move in the right direction, but the terms remain too wide and unspecified.

All case reports in the Dutch study contained some description of the contexts in which the Southern organisations operated. However, half the case studies were weak in terms of relating contextual factors to the ways in which the capacity of the Southern organisations had changed. Reports often provided only static descriptions of the main features of the legal and institutional environment. They did not provide more elaborate descriptions of how changes in these features influenced the capacity of the Southern organisations.

Cultural values and norms as facilitating and constraining factors in capacity development were also only marginally discussed, with no analytical tools or models. Again, several reports referred to the importance of cultural values and norms, but did not follow up with a more in-depth analysis of why and how they make a difference in organisations. The DFID study on technical cooperation did emphasise the need for a deep understanding of the context (including both the wider political and institutional environment and the particular features of the organisation supported). The report also described the likely failure of initiatives that are

based on uncritical attempts to transfer generic best practice approaches without an understanding of the structure of incentives for capacity development. However, this was not much developed in the report.

The study of Resilience and Children in the Sahel was about neither individual training programmes nor organisational capacity building; it addressed a system-wide capability, namely the interaction between local and national governments, civil society organisations and the international community to protect the poorest of the poor from famine and malnutrition. It was thus about systemic capacity building - what it takes to achieve resilience and how international organisations can work towards this objective.

DFID took the lead in an evaluation of public sector governance reform (DFID Summary Report of the Public Sector Governance Reform Evaluation), a typical example in which context definitely matters. The point of departure for the study was that several donors had invested large amounts of money in supporting public sector governance reform, but often with disappointing progress and results. It was acknowledged in the report that reform of this kind is complex, long-term and rife with political-economic interests. Effective public sector governance depends on a complex system of interdependent institutions that all need reform, and it is difficult to monitor and measure progress at the aggregate level in such a system. An evaluation framework was developed, but its approach was rather conventional and did not involve any advanced methods or tools. The three steps in the evaluation were:

- Describing the reforms and their contexts - telling the story from the country perspective.
- Supporting and coordinating reforms - telling the story from the donor perspective.
- Learning what worked, what did not, and why.
The main units of analysis were: reforms of the role of the state, the central functions of government, accountability and oversight mechanisms, and civil service systems and the management of public service organisations. A broader search would possibly have identified more advanced analytical models and tools.

CHAPTER 3: THEORY AND PRACTICE OF EVALUATING CAPACITY DEVELOPMENT

This chapter discusses the methodological choices in evaluation, from the first overarching choice of design or approach, via models and methods, to the instruments for data collection. We start at the top and conclude with the practical instruments of evaluation, but most evaluations do not follow such a linear logic. Often the terms of reference push the evaluation towards a specific choice, or the evaluation is constrained by time and money to choose one design rather than another. But a choice there is, and it is better to know what the implications are. In particular, an experimental design requires that the whole CD intervention is designed for the evaluation, with a control group, preferably randomly selected.

It might be useful to first consider the difference between methods and designs 4. In the social sciences, a design refers to the overarching logic of how research is conducted. As the concept is used in research, a design would often entail the totality of research questions, theory, data and the use of data. In research, the author has command of the whole chain, but in evaluation the questions would usually be asked by a client, and the evaluator responds by defining the relevant theory and the data that need to be collected. Method is the word used for data collection and analysis; interviews and surveys are examples of methods, as are observations, focus groups and document analyses. A discussion of methods involves analysis of the size of samples, sampling strategies, how to ensure the reliability and validity of findings, and the appropriate combination of data collection instruments. Finally, a treatment of scientific methods arrives at the research instruments themselves: the survey questions, the interview protocols, the rating scales, the formulation of indicators and the benchmarks used in the judgement process.
This study covers all three of these aspects of scientific method, but the reader should recognise that they represent very different levels of abstraction. The quality of evaluation research depends on excellence at all levels. No matter how sophisticated the design, or how technically proficient the methods, if the interview questions are leading or the rating scales mistaken, the conclusions will not be valid or reliable. This section reviews the choices of approach, models, methods and some of the instruments for data collection that appear to prevail in the current practice of CD evaluation.

3.1 Approaches to evaluation design

The evaluation community is engaged in heated debates on which approach is best for conducting an evaluation. It is a rather strange debate, and it started with a lobbying group arguing that there is one, and only one, method that can rigorously conclude on how results are caused. This supposedly superior method (described as a "gold standard") favours experiments generally and randomised controlled trials in particular. Naturally, the idea that there is a single superior evaluation approach has been widely challenged (Ravallion 2009, Deaton 2010, Cartwright and Hardie 2012, Stern et al. 2012, Pawson 2013). There is no evaluation design that is inherently more rigorous than any other. An evaluation based on experimental methods can be misleading, just as a case study-based evaluation can be.

4 The word approach is often used as a synonym for design.

Designs and methods are fit for different purposes, and when well executed all have their strengths and weaknesses. The overall design of an evaluation needs to follow from the kinds of questions being asked. The CD evaluations that we have analysed for this study all have a design, obviously. The literature on CD evaluation naturally assumes that there will be a design, but what that design should be is not commonly addressed. The often misguided but heated exchanges on rigour and gold standards have not affected the practice of evaluating CD much. Nor has the debate on experimental methods percolated down to the CD evaluation literature. None of the sources among our references propagate experimental designs, nor for that matter any other design. A few, such as the World Bank study on capacity building in Africa, conclude briefly that an RCT approach is in general not feasible for evaluating CD. It is not clear why the study reaches that conclusion. An experimental approach may be difficult, expensive or practically unfeasible, but the overall conduct of CD evaluation would certainly benefit from a more diverse application of approaches, including a more widespread use of experimental approaches and among them RCTs. The problem is rather that the choice of design in many evaluations appears to be made by default. We have not seen any explicit consideration of the strengths or weaknesses associated with a particular design, or even a well-informed discussion about which design to take. The methodology sections of the evaluations we looked at inform the readers about methods for data collection and the problems encountered, but they do not address broader methodological concerns.

The extent of choice: design options

What then are the design options? The evaluation debate has mainly been about the pros and cons of experimental methods, but what are the alternatives?
Stern et al (2012) put together a comparative framework for evaluation designs in a study commissioned by DFID. Four different designs were identified, each distinguished by a different view of how effects are to be analysed and explained:

- Case-based approaches. These designs can be characterised as ethnographic: the evaluator approaches the phenomenon straight on and starts exploring issues, often framed by the questions of relevance, efficiency, effectiveness, impact and sustainability. The assessment is usually a within-the-case analysis, but evaluators can also make use of methods such as qualitative comparative analysis (QCA), configurations, simulations and comparisons to find results and their causes.

- Theory-based approaches. These designs start with and elaborate a theory of change, and look for evidence that supports or disqualifies the assumptions about change. The theory of change can be articulated as a mechanism; the so-called realist approach is a typical theory-based design, articulating a sequence of mechanism (the intervention and how it is supposed to work), context and outcomes (results). Contribution analysis and theories of reach are also examples of designs that fit in this category.

- Statistical approaches. These make use of statistical modelling, longitudinal studies and econometrics to respond to evaluation questions. In the analysis of CD, they would build on correlations between outcome and context variables and the mechanism, i.e. the process of CD.

- Experimental approaches. The essence of the experimental approach is that the evaluation should develop a counterfactual case, i.e. a situation without the intervention. The outcome of that situation is then compared to the situation with the intervention; only if there is a difference between the two can a change be attributed to the intervention. While the so-called gold standard implies that the case with the intervention as well as the case without it are randomly selected from a population of several potential cases, there are also less strict applications, called natural experiments or quasi-experiments.

The study by Stern et al was expressly commissioned to identify and present the alternatives to experimental methods. Do these four approaches capture the spectrum, or are there other research designs in the social sciences and hence in evaluation? The report by Stern has been widely disseminated and discussed in seminars and conferences, but we have not heard anyone suggest that the authors have overlooked an approach. There is a wide range of studies on scientific method, and most Ph.D. programmes contain basic introductions to research design (see for example Ackoff or Kaplan for some of the most commonly used texts, or Patton for an application in the field of evaluation), but evaluators as well as social scientists are left with these four broad families of designs. The boundaries between the designs are permeable, however, and it is common for designs to be combined in a study, and hence also in an evaluation. All four designs can make use of qualitative and quantitative methods. They use models, instruments for data collection and analysis, indicators and benchmarks. Even though theory plays a more explicit role in the causal analysis of the theory-based approaches, the other approaches are of course also grounded in substantive theories. To take some examples: network theory, governance theory or management theories can be used within all four approaches to inform hypotheses, guide data collection and support the analysis.
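To make the logic of the experimental design concrete, here is a minimal sketch with entirely invented data (the scores, the effect size and the data-generating process are illustrative assumptions, not drawn from any evaluation reviewed here): units are randomly assigned to an intervention or a control group, and the effect is estimated as the difference in mean outcomes, the control group serving as the empirically observable counterfactual.

```python
import random
import statistics

def estimate_effect(outcomes, assignments):
    """Difference in mean outcomes between treated (1) and control (0) units."""
    treated = [y for y, a in zip(outcomes, assignments) if a == 1]
    control = [y for y, a in zip(outcomes, assignments) if a == 0]
    return statistics.mean(treated) - statistics.mean(control)

# Invented data-generating process: a capacity score around 50, random
# noise, and a true intervention effect of +10 points for treated units.
random.seed(1)
assignments = [random.randint(0, 1) for _ in range(200)]
outcomes = [50 + random.gauss(0, 5) + 10 * a for a in assignments]

print(round(estimate_effect(outcomes, assignments), 1))  # close to the true effect of 10
```

Because assignment is random, the two groups differ only by chance and by the intervention, so the simple difference in means recovers the effect; without randomization the same arithmetic would confound the intervention with selection effects.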
The question of causal inference

One of the key questions in CD evaluation is whether any increase in capabilities is due to the intervention or to other factors. In a study parallel to this one, Boesen (2014) points out that change is essentially endogenous, and that if an outside intervention takes place, it does so in connection with endogenous processes of change. This has implications for the understanding of causality. The classical definition of causality is that a causal factor must be necessary and sufficient for the effect to take place. In the language of causal analysis, an effect can be attributed to a factor if, and only if, that factor is necessary and sufficient for the effect to appear. Such is the strict meaning of attribution. However, this is a relatively new criterion and it has also been questioned (Losee, 2011). In many fields it is more relevant to talk of contributions to change. This corresponds well to Boesen's analysis of CD results: external projects and programmes can, in the best of circumstances, contribute to change, but it is highly unlikely that such projects or programmes will be the sole causal factors leading to CD. Attribution is thus ruled out as a mode of causal inference. The notion of a contributory cause recognizes that effects are produced by several causes at the same time, none of which might be necessary and sufficient for impact. It is, for example, support for human resource development, combined with structural reforms, suitable governance and leadership, that contributes to CD results, some of which may be funded by development cooperation. Just as it is smoking along with other factors and conditions that results in lung cancer, not smoking on its own, so also it is a development intervention along with other factors that produces an impact.
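The notion of a contributory cause can be made concrete with a toy example (the causal rule below is entirely invented for illustration): capacity develops only when factors combine into a package, so that no single factor is necessary or sufficient on its own.

```python
from itertools import product

def capacity_develops(training, reform, leadership, funding):
    """Invented causal rule: either of two 'packages' is jointly sufficient."""
    return (training and reform) or (leadership and funding)

factors = ["training", "reform", "leadership", "funding"]
cases = list(product([False, True], repeat=4))  # all 16 combinations

for i, name in enumerate(factors):
    # Necessary: the effect never occurs when this factor is absent.
    necessary = all(not capacity_develops(*c) for c in cases if not c[i])
    # Sufficient: the effect always occurs when this factor is present.
    sufficient = all(capacity_develops(*c) for c in cases if c[i])
    print(f"{name}: necessary={necessary}, sufficient={sufficient}")
```

Every factor comes out neither necessary nor sufficient, yet each is an indispensable part of a package that is jointly sufficient, which is exactly the causal-package pattern described above.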
The analysis of combinations of factors that lead to CD results could follow the approach to causal inference put forward by Mackie (1974), which has strong echoes in contemporary causal analysis and evaluation. In the literature this is called an INUS cause: an Insufficient but Necessary part of a condition that is itself Unnecessary but Sufficient for the occurrence of the effect. While this may sound highly theoretical and complex, it is in fact the approach to causality practised in the studies we have analysed. When they discuss different changes in organisations and grapple with understanding why things happened, they implicitly keep many more or less important causal factors in mind at the same time. This is described as multiple-conjunctural causation by Ragin (1987) and as configurational causation by Pawson (2007). Multiple refers to the number of paths, each sufficient but possibly unnecessary, that can bring about change; conjunctural refers to the connection between the outcome and a package of multiple causes which, taken in isolation, are unlikely to be responsible for the outcome, or even for bits of it (often identified as the net effect). The important message to bear in mind is that institutional change has multiple causes, many of which, including the aid-financed projects, are neither necessary nor sufficient, but still part of a causal package.

The role of the counterfactual

The notion of causal packages and multiple change factors that are neither necessary nor sufficient makes it even more difficult to consider what would have happened without the CD intervention. The question of the counterfactual merits some more attention, although none of the evaluations we have looked at discusses a counterfactual, nor does the literature on CD evaluation address the counterfactual challenge. What is it then? Counterfactual logics (Lewis 1973) seek to answer the question "what would have happened without the intervention?" by comparing an observable world with a theoretical one, where the latter is intended to be identical to the former except for the presence of the cause and the effect. The latter is described as counterfactual because it cannot be observed empirically.
This is where the experimental methods have their strongest point: by establishing a control group you have an empirically observable counterfactual. It is beyond the scope of this study to enter fully into the ongoing philosophical debate about the extent to which counterfactuals are a sufficient basis for understanding causation (interested readers can refer to the sources cited above and, in addition, to Brady 2002, Hall 2004, Dowe and Noordhof 2004, Stern et al 2012). The word counterfactual can also be used in a different way, more in line with common-sense reasoning: evaluators and policy makers can use counterfactual thinking as a mental exercise to think through different policy options. This use of counterfactuals is not a rigorous method for causal inference. To conclude the discussion on counterfactuals, both evaluators and their clients now recognise that causation without explanation is insufficient for policy learning. Policy-makers need to understand why as well as how change happens if they are to use findings from research or evaluation in future policy-making. Explanation depends on the use of theory alongside appropriate methods, because without theory there is no basis for interpretation.

Comparing evaluation designs

Different designs may share similar methods and techniques: both experiments and case studies may use interview data, draw on focus groups and analyse statistical data. What holds a design together is its fundamental logic, in particular its assumptions about causal inference, not its methods. The evaluations we have looked at usually have a section that explains methods, but these sections present the methods for data collection. They do not discuss the more fundamental choice of overarching approach, which defines how the evaluation will conclude on outcomes and impact, whether findings can be generalised, and how causality will be addressed.
This is a choice that would benefit from being made explicit in all evaluations. To conclude the discussion, we return to the basic questions of capacity development: What works? For whom? In what context? Why? Is it replicable? Table 3 summarises where the different designs have their strengths and weaknesses.

Table 3. Comparative assessment of evaluation designs

Design                     What works?  For whom?  In what context?  Why?  Can it be replicated?
Case-based approaches      XX           XX         XXX               XX    X
Theory-based approaches    XXX          XXX        XXX               XXX   XXX
Statistical approaches     X            XXX        XXX               X     X
Experimental approaches    XXX          XX         X                 X     X

Codes: X = not so good at responding to the question; XX = can at times respond to the question; XXX = good at responding to the question.

The Case-based approaches are characterized by attention to the particular circumstances of projects and programmes, and hence one of their strengths is the contextual understanding they can provide. They also produce knowledge on what works, for whom and why, but the question of whether projects are replicable is seldom addressed.

The Theory-based approaches allow and expect evaluators to address all the questions in order to explain the interventions, their effects and the processes of change, and to do so within a framework of past evidence and tried and tested practice. This is what makes the explanatory power of the approach relatively stronger than that of the alternatives.

The Statistical approaches can probably be discarded when we discuss CD interventions. These approaches can be very relevant when evaluations look for impact in society, for example in poverty alleviation, HIV/AIDS prevention, climate change mitigation or some other macro-level outcome, but it is hard to see that they would be relevant when analysing a specific CD intervention.

The Experimental approaches would seem to have their strengths in circumstances where it is credible to propose a classical causal connection between an intervention and an effect (necessary and sufficient), and in some cases that can be both feasible and desirable.
Even though we have not seen any examples of such CD evaluations among the studies we looked at, it is not hard to imagine how some components of these evaluations could have been approached with an experimental or quasi-experimental design. This comparison between designs takes many arguments for granted, and this is not the place for an extensive exposition on scientific method. The main point is that the designs have strengths and weaknesses, and an inquiry that takes a holistic grip on CD as an instrument of development cooperation is best advised to seek out the strengths of all four through a mixed-methods approach, within an overall framework in which a Theory-based design provides the conceptual lead.

3.2 The choice of models

Capacity development is an inherently complex task. Even individual capacity development has a tendency to become complex, and when the organisational and systemic levels are considered, there can be no doubt about the complex character of the task. Dealing with complexity has become a preoccupation across many schools and disciplines in the social sciences, including evaluation. The evaluation reports used for this study show that evaluators find different ways of reducing the complexity of the evaluation task to make it more manageable. They are not equally successful, but the best deal with complexity by making it understandable and concrete, and for that they use models. Models are simplifications of reality that evaluators use to show patterns, configurations, causal mechanisms, feedback loops, non-linear relations, multiple causality, boundary conditions, etc. Models create an overview and can be seen as the first stepping stones in managing complexity. Even though, in terms of ontology and epistemology, everything is complex, evaluators can make practical progress by using models (Forss and Marra, 2012). Most of the evaluations we looked at use one or several models.

Models of capacity development

The first type of model depicts capacity development itself, i.e. models of that which is being evaluated. Not all of the evaluations have used models of capacity building, but some models may be implicit. When the evaluations draw up interview guidelines or surveys to assess capacity development at the individual level, they often make use of the well-known Kirkpatrick (1959) model for assessing training, without necessarily describing it as a model or referring to Kirkpatrick. The basics of Kirkpatrick's model are shown in Figure 1.
Figure 1. Kirkpatrick's model of individual capacity development

- Reaction: what participants think about the training (surveys, 'smile sheets')
- Knowledge: what participants actually learn during the training (tests)
- Application: how participants apply what they have learned from the training in practice
- Impact: what difference this makes in the wider world

There are many models of organisational development, i.e. models that sum up what is to be considered organisational development. Figure 2 presents the McKinsey 7-S Model (Waterman et al, 1980). This is an internationally tested and many times replicated organising framework for understanding an organisation. It has not been used by any of the evaluations we have looked at, but most of them understand organisations in the terms used by this model, and most also use one or another alternative model that can be related to this mother of OD models.

Figure 2: McKinsey 7-S Model.

Another commonly used model of organisational capacities takes as its starting point the four basic capacities that an organisation needs: the capacity to exist, the capacity to organise, the capacity to relate, and the capacity to show results. This model was developed by Kruse and used in the organisational review of Save the Children (2008) and several other evaluations. The model is illustrated in Figure 3.

Figure 3. Four essential organisational capacities

Internal dimensions:

- AN ABILITY TO BE: maintain an identity reflecting important purposes, values and strategies, and leadership to direct and manage the organisation (governance, leadership, identity).
- AN ABILITY TO ORGANISE: establish effective managerial systems and procedures, and ensure that human and financial resources are available (human resources, systems and procedures, material and financial resources).

External dimensions:

- AN ABILITY TO RELATE: respond and adapt to new demands among users and changing needs in society, and retain effective partnerships (standing, alliances and connections, responsiveness).
- AN ABILITY TO DO: provide relevant services for users and/or members (relevance, effectiveness, sustainability).

It is surprising that there are not many examples of model building at the next, systemic level of capacity building in our sample. This is partly explained by the focus of the evaluations: as most of them concentrate on organisational capacities, that is also where they build and use models. Nevertheless, the systemic level is important, and it would be useful to construct models to analyse capacities there too. One of the reasons there are so few findings relating to the systemic level may well be that the evaluators did not have models at hand to understand changes at this level. Even though systemic capacities were discussed in some of the evaluations, there was no comprehensive analysis and no presentation of data and evidence.
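In practice, organisational models like the four-capacities framework in Figure 3 are often operationalised as simple rating grids. The sketch below is purely illustrative: the 1-5 scale, the equal weighting and every rating are invented assumptions, not part of the model itself. It shows how sub-dimension ratings can be aggregated to profile an organisation and locate its weakest capacity.

```python
from statistics import mean

# Invented 1-5 ratings for a hypothetical organisation, grouped by the
# four capacities and their sub-dimensions from the model in Figure 3.
ratings = {
    "ability to be": {"governance": 4, "leadership": 3, "identity": 4},
    "ability to organise": {"human resources": 2, "systems and procedures": 3,
                            "material and financial resources": 2},
    "ability to relate": {"standing": 4, "alliances and connections": 3,
                          "responsiveness": 4},
    "ability to do": {"relevance": 4, "effectiveness": 3, "sustainability": 2},
}

# Aggregate each capacity as the unweighted mean of its sub-dimensions.
scores = {cap: round(mean(dims.values()), 2) for cap, dims in ratings.items()}
weakest = min(scores, key=scores.get)

for cap, score in scores.items():
    print(f"{cap}: {score}")
print("weakest capacity:", weakest)
```

Such a profile is only as good as the underlying judgements, but it makes the implicit model explicit and gives repeated assessments a comparable baseline.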
Table 4 summarises the discussion of models of capacity development.

Table 4. Use and availability of models for capacity development

- Individual CD. Use of models: relatively scarce, often implicit. Availability: several models available; the most commonly used is Kirkpatrick (1959).
- Organisational CD. Use of models: common in most evaluations at this level. Availability: several models available, e.g. McKinsey (1980), Statskontoret (2009), Kruse (2008).
- Systemic CD, networks. Use of models: rare. Availability: several models available, e.g. Cooperrider (1993), Granovetter (1973), Lincoln (1982), Forss (2000).
- Systemic CD, laws and regulations. Use of models: rare. Availability: not so common; an area usually analysed without models.
- Systemic CD, norms and values. Use of models: rare. Availability: not so common; there are frameworks for analysing dimensions of norms and values, e.g. Hofstede (1980) and the World Values Survey.

Modelling the evaluation process

Yet another use of models relates to the evaluation process itself. Several of the evaluations refer to the OECD/DAC model, which defines evaluation and explains the basis for value judgement along the five dimensions of relevance, efficiency, effectiveness, impact and sustainability. Models can also explain the process of inquiry, as Statskontoret does in its model for agency assessment, illustrated in the next figure. This model differs from the models of OD above: it does not specify the qualities to be analysed, as those models do. Instead, it outlines a framework for the assessment, in which the evaluation teams are instructed to follow a narrative sequence and a process of explanation. The focus is on understanding change. It is also different in being a framework for the analysis of existing organisations that are part of the regular public service: the agencies have not necessarily been subject to any CD interventions and there is no donor involved. On the other hand, most organisations in a modern public administration are constantly developing, changing and striving to increase capacities.
The quest for improved results and higher efficiency is never-ending.

Figure 4. Steps in the analysis of government agencies.

A model was also used in the World Bank study of capacity building in Africa, which depicts a stylized results chain for public sector capacity building. The results chain links capacity building inputs and outputs to immediate public sector outputs and longer-term outcomes and development results. The report recognized that it is not possible to attribute development impacts directly to capacity building efforts. Hence, the evaluation looked to intermediate outcomes and improved public sector performance to assess the effect of efforts to enhance capacity. It also sought to identify the reasons for success and failure in the chain from inputs to intended outcomes, taking account of both demand- and supply-side factors. The absence of baseline data and the extremely limited evidence from monitoring and evaluation limited the inferences that could be drawn from the activities reviewed. However, this is not a theory of change as such. Most models presented in Annex 5 are, in the strict sense of the term, not theories of change. They present the elements of an intervention logic and the expected pathway from inputs to outputs, outcomes and impact, but they do not articulate the assumptions explaining how the change processes are expected to happen and cause change. Basic results chains are useful for identifying programme outputs, outcomes and impact, but they fall short when it comes to defining or describing the change process(es) targeted by capacity development interventions.

Models of theories of change

The role of a Theory of Change (ToC) is to define the building blocks required to bring about a given long-term goal. This set of connected building blocks is usually depicted on a map known as a pathway of change, change framework or intervention logic, which is a model of the change process. The ToC is at once a planning tool, a management tool and an evaluation tool.
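The difference between a bare results chain and a ToC can be sketched in a few lines of code (the intervention, the steps and every assumption below are hypothetical, invented for illustration): a results chain only lists the steps, while a ToC attaches to each link an explicit assumption that the evaluation can then seek evidence for or against.

```python
# A bare results chain: an ordered list of steps, nothing more.
results_chain = [
    "training delivered",
    "skills acquired",
    "skills applied on the job",
    "organisational performance improves",
]

# A ToC adds an articulated assumption to every link in the chain
# (all assumptions invented for illustration).
assumptions = {
    ("training delivered", "skills acquired"):
        "participants are motivated and the content matches their needs",
    ("skills acquired", "skills applied on the job"):
        "staff have the mandate, incentives and time to use new skills",
    ("skills applied on the job", "organisational performance improves"):
        "management systems and leadership support the changed practice",
}

# The evaluator walks the chain link by link, asking for evidence that
# each assumption held - the explanatory step a results chain lacks.
for src, dst in zip(results_chain, results_chain[1:]):
    print(f"{src} -> {dst}\n  assumption: {assumptions[(src, dst)]}")
```

The design choice is that explanation lives on the links, not the boxes: if capacity did not increase, the evaluator can report which assumption broke, rather than merely that an output failed to produce an outcome.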
The main reason to bring up the subject here is that a ToC, whether explicitly described as such or held as an implicit figure of mind, is a necessary ingredient when evaluating CD if one aims to explain how a CD project may lead to increased capacities. It is through a ToC that one makes explicit how an intervention is meant to lead to increased capacities. In a ToC, an outcome is connected to an intervention, analysing and testing the complex web of activities required to bring about change. A ToC would not be complete without an


More information

Framework and Resources for Early Childhood Education Reviews

Framework and Resources for Early Childhood Education Reviews Framework and Resources for Early Childhood Education Reviews Education Reviews in Early Childhood Education Services Contents 1 Introduction 1 The purpose of this framework ERO s approach to early childhood

More information

New Metrics Briefing 2: Feedback to the GELP Metrics co-design group Simon Breakspear, Cambridge University

New Metrics Briefing 2: Feedback to the GELP Metrics co-design group Simon Breakspear, Cambridge University New Metrics Briefing 2: Feedback to the GELP Metrics co-design group Simon Breakspear, Cambridge University The paper was prepared by Simon for the new metrics co-design group. We share it with the community

More information

Methodological Issues for Interdisciplinary Research

Methodological Issues for Interdisciplinary Research J. T. M. Miller, Department of Philosophy, University of Durham 1 Methodological Issues for Interdisciplinary Research Much of the apparent difficulty of interdisciplinary research stems from the nature

More information

Research into consumer financial services

Research into consumer financial services 1 5/10/11 Call for proposals Research into consumer financial services A call under the Financial Market Research programme. 2 1. The call in brief VINNOVA is Sweden s innovation agency and exists to increase

More information

Framework. Australia s Aid Program to Papua New Guinea

Framework. Australia s Aid Program to Papua New Guinea Framework Australia s Aid Program to Papua New Guinea 21 October 2002 Our Unique Development Partnership our close bilateral ties are reflected in our aid program Enduring ties bind Papua New Guinea with

More information

UNICEF Global Evaluation Report Oversight System (GEROS) Review Template

UNICEF Global Evaluation Report Oversight System (GEROS) Review Template UNICEF Global Evaluation Report Oversight System (GEROS) Review Template Colour Coding CC Dark green Green Amber Red White Questions Outstanding Yes t Applicable Section & Overall Rating Very Confident

More information

Report of a Peer Learning Activity in Limassol, Cyprus 17 21 October 2010. School Leadership for learning

Report of a Peer Learning Activity in Limassol, Cyprus 17 21 October 2010. School Leadership for learning EUROPEAN COMMISSION Directorate-General for Education and Culture Life Long Learning: policy and programmes School Education; Comenius Education and Training 2020 programme Thematic Working Group 'Teacher

More information

Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire. P3M3 Project Management Self-Assessment

Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire. P3M3 Project Management Self-Assessment Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire P3M3 Project Management Self-Assessment Contents Introduction 3 User Guidance 4 P3M3 Self-Assessment Questionnaire

More information

Annex 1. Call for proposals EACEA No 34/2015 Erasmus+ Key Action 3: Support for policy reform - Initiatives for policy innovation

Annex 1. Call for proposals EACEA No 34/2015 Erasmus+ Key Action 3: Support for policy reform - Initiatives for policy innovation Annex 1 Call for proposals EACEA No 34/2015 Erasmus+ Key Action 3: Support for policy reform - Initiatives for policy innovation European policy experimentations in the fields of Education, Training and

More information

School of Advanced Studies Doctor Of Management In Organizational Leadership/information Systems And Technology. DM/IST 004 Requirements

School of Advanced Studies Doctor Of Management In Organizational Leadership/information Systems And Technology. DM/IST 004 Requirements School of Advanced Studies Doctor Of Management In Organizational Leadership/information Systems And Technology The mission of the Information Systems and Technology specialization of the Doctor of Management

More information

Critical Inquiry in Educational Research and Professional Practice

Critical Inquiry in Educational Research and Professional Practice DOCTOR IN EDUCATION COURSE DESCRIPTIONS A. CORE COURSES NEDD 800 Professionalism, Ethics, and the Self This introductory core course will explore and interrogate ideas surrounding professionalism and professionalization.

More information

Guide for the Development of Results-based Management and Accountability Frameworks

Guide for the Development of Results-based Management and Accountability Frameworks Guide for the Development of Results-based Management and Accountability Frameworks August, 2001 Treasury Board Secretariat TABLE OF CONTENTS Section 1. Introduction to the Results-based Management and

More information

Module 1: Using Quantitative Data in Research: Concepts and Definitions

Module 1: Using Quantitative Data in Research: Concepts and Definitions Module 1: Using Quantitative Data in Research: Concepts and Definitions Antony Fielding 1 University of Birmingham & Centre for Multilevel Modelling Contents Rebecca Pillinger Centre for Multilevel Modelling

More information

School of Advanced Studies Doctor Of Education In Educational Leadership With A Specialization In Educational Technology. EDD/ET 003 Requirements

School of Advanced Studies Doctor Of Education In Educational Leadership With A Specialization In Educational Technology. EDD/ET 003 Requirements School of Advanced Studies Doctor Of Education In Educational Leadership With A Specialization In Educational Technology The mission of the Doctor of Education in Educational Leadership degree program

More information

Evaluation of the PEFA Programme 2004 2010 & Development of Recommendations Beyond 2011 (dated July 2011)

Evaluation of the PEFA Programme 2004 2010 & Development of Recommendations Beyond 2011 (dated July 2011) Evaluation of the PEFA Programme 2004 2010 & Development of Recommendations Beyond 2011 (dated July 2011) Management Response by the PEFA Steering Committee September 22 nd, 2011 1. Introduction The importance

More information

Meta-analysis of Reviews and Evaluation of Ecoregional Programs

Meta-analysis of Reviews and Evaluation of Ecoregional Programs Meta-analysis of Reviews and Evaluation of Ecoregional Programs Julio A. Berdegué and Germán Escobar 1. Aim, method and organization of the analysis The aim of this report is to summarize the main lessons

More information

THE STANDARD FOR DOCTORAL DEGREES IN LAW AT THE FACULTY OF LAW, UNIVERSITY OF TROMSØ

THE STANDARD FOR DOCTORAL DEGREES IN LAW AT THE FACULTY OF LAW, UNIVERSITY OF TROMSØ THE FACULTY OF LAW THE STANDARD FOR DOCTORAL DEGREES IN LAW AT THE FACULTY OF LAW, UNIVERSITY OF TROMSØ Guidelines for the Faculty of Law in Tromsø, adopted by the Faculty Board on 31 May 2010. 1 Background

More information

PSYCHOLOGY PROGRAM LEARNING GOALS, LEARNING OUTCOMES AND COURSE ALLIGNMENT MATRIX. 8 Oct. 2010

PSYCHOLOGY PROGRAM LEARNING GOALS, LEARNING OUTCOMES AND COURSE ALLIGNMENT MATRIX. 8 Oct. 2010 PSYCHOLOGY PROGRAM LEARNING GOALS, LEARNING OUTCOMES AND COURSE ALLIGNMENT MATRIX 8 Oct. 2010 Departmental Learning Goals and Outcomes LEARNING GOAL 1: KNOWLEDGE BASE OF PSYCHOLOGY Demonstrate familiarity

More information

CREATING A LEAN BUSINESS SYSTEM

CREATING A LEAN BUSINESS SYSTEM CREATING A LEAN BUSINESS SYSTEM This white paper provides an overview of The Lean Business Model how it was developed and how it can be used by enterprises that have decided to embark on a journey to create

More information

Subject Benchmark Statement Political Science

Subject Benchmark Statement Political Science Subject Benchmark Statement Political Science I CONTENT Page No Foreword II 1 Introduction 1 1.1 Subject Benchmark Statement Scope and Purpose 1 1.2 Nature and Extent of the Subject 1 2 Subject Aims 3

More information

Master s Programme in International Administration and Global Governance

Master s Programme in International Administration and Global Governance Programme syllabus for the Master s Programme in International Administration and Global Governance 120 higher education credits Second Cycle Confirmed by the Faculty Board of Social Sciences 2015-05-11

More information

MSc in Global Supply Chain and Logistics Management

MSc in Global Supply Chain and Logistics Management School of Business, Management and Economics Department of Business and Management MSc in Global Supply Chain and Logistics Management Course Handbook 2013/14 2013 Entry Table of Contents School of Business,

More information

Guidance Note on Developing Terms of Reference (ToR) for Evaluations

Guidance Note on Developing Terms of Reference (ToR) for Evaluations Evaluation Guidance Note Series UNIFEM Evaluation Unit October 2009 Guidance Note on Developing Terms of Reference (ToR) for Evaluations Terms of Reference (ToR) What? Why? And How? These guidelines aim

More information

CDIA Strategy and Action Plan for Pro-poor Urban Infrastructure Development 2011-2012. July 2011 Final Version CDIA

CDIA Strategy and Action Plan for Pro-poor Urban Infrastructure Development 2011-2012. July 2011 Final Version CDIA CDIA Strategy and Action Plan for Pro-poor Urban Infrastructure Development 2011-2012 July 2011 Final Version CDIA Cities Development Initiative for Asia TABLE OF CONTENTS TABLE OF CONTENTS... I ACRONYMS...

More information

P3M3 Portfolio Management Self-Assessment

P3M3 Portfolio Management Self-Assessment Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire P3M3 Portfolio Management Self-Assessment P3M3 is a registered trade mark of AXELOS Limited Contents Introduction

More information

Single Window Interoperability (SWI) Discussion Paper

Single Window Interoperability (SWI) Discussion Paper Single Window Interoperability (SWI) Discussion Paper Abstract This paper forms part of a series of four discussion papers supporting the UN/CEFACT project to create a recommendation on Single Window Interoperability

More information

University of Bath. Welsh Baccalaureate Qualification Internal Evaluation. Themed Report: MARKETING AND PROMOTION

University of Bath. Welsh Baccalaureate Qualification Internal Evaluation. Themed Report: MARKETING AND PROMOTION University of Bath Welsh Baccalaureate Qualification Internal Evaluation Themed Report: MARKETING AND PROMOTION [This is one of eight themed reports which draw on issues relating to particular themes that

More information

2 nd EUA Funding Forum: Strategies for efficient funding of universities

2 nd EUA Funding Forum: Strategies for efficient funding of universities 2 nd EUA Funding Forum: Strategies for efficient funding of universities Bergamo, 9-10 October 2014 Forum report Liviu Matei, General rapporteur Table of contents I. Executive Summary 3 II. What is the

More information

Revision. AS Sociology. Sociological Methods. The relationship between Positivism, Interpretivism and sociological research methods.

Revision. AS Sociology. Sociological Methods. The relationship between Positivism, Interpretivism and sociological research methods. AS Sociology Revision Sociological The relationship between Positivism, Interpretivism and sociological research methods. Chris. Livesey 2006: www.sociology.org.uk Methodology Positivism Positivism means

More information

ANNEX III TO FINAL REPORT EVALUATION OF NPT AND NICHE

ANNEX III TO FINAL REPORT EVALUATION OF NPT AND NICHE Ministry of Foreign Affairs of the Netherlands Final Report May 2012 ANNEX III TO FINAL REPORT EVALUATION OF NPT AND NICHE EVALUATION OF NPT AND NICHE Contact Person: Lennart Raetzell Manager T +49 30

More information

Bachelor Program in Analytical Finance, 180 credits

Bachelor Program in Analytical Finance, 180 credits Program Curriculum Page 1 of 7 Program code: RMV20 Bachelor Program in Analytical Finance, 180 credits This is a translation of the original program study plan in Swedish, which was approved by the Faculty

More information

Composite performance measures in the public sector Rowena Jacobs, Maria Goddard and Peter C. Smith

Composite performance measures in the public sector Rowena Jacobs, Maria Goddard and Peter C. Smith Policy Discussion Briefing January 27 Composite performance measures in the public sector Rowena Jacobs, Maria Goddard and Peter C. Smith Introduction It is rare to open a newspaper or read a government

More information

Profession and Professional Work in Adult Education in Europe

Profession and Professional Work in Adult Education in Europe Profession and Professional Work in Adult Education in Europe Ekkehard Nuissl In the recent decade it became more important to reflect about the work which is done in adult education, who is doing it and

More information

Perspectives on the knowledge-based society

Perspectives on the knowledge-based society Perspectives on the knowledge-based society Interviews about Netherlands as knowledge land Editor: Inge Wichard In association with: Vincent Delemarre and Gerda Sulman (editor) Introduction The knowledge-based

More information

Programme Specification and Curriculum Map for MA Global Governance and Public Policy

Programme Specification and Curriculum Map for MA Global Governance and Public Policy Programme Specification and Curriculum Map for MA Global Governance and Public Policy 1. Programme title MA / PGDip / PG Cert Global Governance and Public Policy: International Development 2. Awarding

More information

School of Advanced Studies Doctor Of Health Administration. DHA 003 Requirements

School of Advanced Studies Doctor Of Health Administration. DHA 003 Requirements School of Advanced Studies Doctor Of Health Administration The mission of the Doctor of Health Administration degree program is to develop healthcare leaders by educating them in the areas of active inquiry,

More information

A SYSTEMS MODEL OF PROJECT MANAGEMENT

A SYSTEMS MODEL OF PROJECT MANAGEMENT A SYSTEMS MODEL OF PROJECT MANAGEMENT Richard Brian Barber Research Student School of Civil Engineering Australian Defence Force Academy Canberra ACT Australia 2600 barberrb@bigpond.net.au INTRODUCTION

More information

LDC Needs Assessment under TRIPS: The ICTSD experience (2007-2011)

LDC Needs Assessment under TRIPS: The ICTSD experience (2007-2011) Information Note Number 19. APRIL 2011 LDC Needs Assessment under TRIPS: The ICTSD experience (2007-2011) Executive Summary The experience of ICTSD with intellectual property (IP) needs assessments has

More information

ENGINEERING COUNCIL. Guidance on Risk for the Engineering Profession. www.engc.org.uk/risk

ENGINEERING COUNCIL. Guidance on Risk for the Engineering Profession. www.engc.org.uk/risk ENGINEERING COUNCIL Guidance on Risk for the Engineering Profession www.engc.org.uk/risk This guidance describes the role of professional engineers and technicians in dealing with risk, and their responsibilities

More information

Competencies of BSc and MSc programmes in Electrical engineering and student portfolios

Competencies of BSc and MSc programmes in Electrical engineering and student portfolios C:\Ton\DELTA00Mouthaan.doc 0 oktober 00 Competencies of BSc and MSc programmes in Electrical engineering and student portfolios Ton J.Mouthaan, R.W. Brink, H.Vos University of Twente, fac. of EE, The Netherlands

More information

PRINCIPLES FOR EVALUATION OF DEVELOPMENT ASSISTANCE

PRINCIPLES FOR EVALUATION OF DEVELOPMENT ASSISTANCE PRINCIPLES FOR EVALUATION OF DEVELOPMENT ASSISTANCE DEVELOPMENT ASSISTANCE COMMITTEE PARIS, 1991 DAC Principles for Evaluation of Development Assistance Development Assistance Committee Abstract: The following

More information

2 Computer Science and Information Systems Research Projects

2 Computer Science and Information Systems Research Projects 2 Computer Science and Information Systems Research Projects This book outlines a general process for carrying out thesis projects, and it embraces the following components as fundamentally important:

More information

Executive summary. Today s researchers require skills beyond their core competencies

Executive summary. Today s researchers require skills beyond their core competencies EXECUTIVE SUMMARY 9 Executive summary Today s researchers require skills beyond their core competencies The formation and careers of researchers are important policy issues and training for transferable

More information

MSc Applied Child Psychology

MSc Applied Child Psychology MSc Applied Child Psychology Module list Modules may include: The Child in Context: Understanding Disability This module aims to challenge understandings of child development that have emerged within the

More information

Programme approval 2006/07 PROGRAMME APPROVAL FORM SECTION 1 THE PROGRAMME SPECIFICATION. ECTS equivalent

Programme approval 2006/07 PROGRAMME APPROVAL FORM SECTION 1 THE PROGRAMME SPECIFICATION. ECTS equivalent PROGRAMME APPROVAL FORM SECTION 1 THE PROGRAMME SPECIFICATION 1. Programme title and designation Public Services Policy and Management 2. Final award Award Title Credit ECTS Any special criteria value

More information

INDICATIVE GUIDELINES ON EVALUATION METHODS: EVALUATION DURING THE PROGRAMMING PERIOD

INDICATIVE GUIDELINES ON EVALUATION METHODS: EVALUATION DURING THE PROGRAMMING PERIOD EUROPEAN COMMISSION DIRECTORATE-GENERAL REGIONAL POLICY Thematic development, impact, evaluation and innovative actions Evaluation and additionality The New Programming Period 2007-2013 INDICATIVE GUIDELINES

More information

Using Case Studies in Research

Using Case Studies in Research Biographical Note Professor Jennifer Rowley can be contacted at the School of Management and Social Sciences, Edge Hill College of Higher Education, Ormskirk, Lancashire, England L39 4QP. by Jennifer Rowley

More information

International Engineering Alliance. Glossary of Terms Ver 2: 15 September 2011

International Engineering Alliance. Glossary of Terms Ver 2: 15 September 2011 International Engineering Alliance Glossary of Terms Ver 2: 15 September 2011 Ability: a bodily or mental power to perform an action. Accreditation of programmes (Programme accreditation): recognition

More information

CHAPTER THREE: METHODOLOGY. 3.1. Introduction. emerging markets can successfully organize activities related to event marketing.

CHAPTER THREE: METHODOLOGY. 3.1. Introduction. emerging markets can successfully organize activities related to event marketing. Event Marketing in IMC 44 CHAPTER THREE: METHODOLOGY 3.1. Introduction The overall purpose of this project was to demonstrate how companies operating in emerging markets can successfully organize activities

More information

Using Value Added Models to Evaluate Teacher Preparation Programs

Using Value Added Models to Evaluate Teacher Preparation Programs Using Value Added Models to Evaluate Teacher Preparation Programs White Paper Prepared by the Value-Added Task Force at the Request of University Dean Gerardo Gonzalez November 2011 Task Force Members:

More information

INTOSAI. Performance Audit Subcommittee - PAS. Designing performance audits: setting the audit questions and criteria

INTOSAI. Performance Audit Subcommittee - PAS. Designing performance audits: setting the audit questions and criteria INTOSAI Performance Audit Subcommittee - PAS Designing performance audits: setting the audit questions and criteria 1 Introduction A difficult phase in performance auditing is planning and designing. In

More information

Programme description for PhD Programme in Educational Sciences for Teacher Education (180 ECTS credits) at Oslo and Akershus University College of

Programme description for PhD Programme in Educational Sciences for Teacher Education (180 ECTS credits) at Oslo and Akershus University College of Programme description for PhD Programme in Educational Sciences for Teacher Education (180 ECTS credits) at Oslo and Akershus University College of Applied Sciences Approved by the Oslo and Akershus University

More information

THE MASTER S DEGREE IN DESIGN PROGRAMME DESCRIPTION Adopted by the Board of KHiB on 27 October 2011

THE MASTER S DEGREE IN DESIGN PROGRAMME DESCRIPTION Adopted by the Board of KHiB on 27 October 2011 THE MASTER S DEGREE IN DESIGN PROGRAMME DESCRIPTION Adopted by the Board of KHiB on 27 October 2011 1. THE PROFILE AND OBJECTIVES OF THE STUDY PROGRAMME The goal of the Department of Design is to educate

More information

2012/2013 Programme Specification Data. Environmental Science

2012/2013 Programme Specification Data. Environmental Science 2012/2013 Programme Specification Data Programme Name Programme Number Programme Award QAA Subject Benchmark Statements Environmental Science P02123 BSc Hons Earth Science, Environmental Science, Environmental

More information

Information Technology Research in Developing Nations: Major Research Methods and Publication Outlets

Information Technology Research in Developing Nations: Major Research Methods and Publication Outlets Information Technology Research in Developing Nations: Major Research Methods and Publication Outlets Franklin Wabwoba, Anselimo Peters Ikoha Masinde Muliro University of Science and Technology, Computer

More information

Case study research design

Case study research design Case study research design Contents 1 Introduction... 1 2 Applications of case study design... 3 3 Outline of the design... 3 4 Strengths and weaknesses of case study designs... 9 5 References... 10 1

More information

The Transport Business Cases

The Transport Business Cases Do not remove this if sending to pagerunnerr Page Title The Transport Business Cases January 2013 1 Contents Introduction... 3 1. The Transport Business Case... 4 Phase One preparing the Strategic Business

More information

General remarks. Page 1 of 6

General remarks. Page 1 of 6 Frankfurt am Main, 14. April 2010 Sophie Ahlswede Deutsche Bank AG/DB Research P.O. Box 60262 Frankfurt, Germany e-mail: sophie.ahlswede@db.com Tel. +49 (0)69 910 31832 Deutsche Bank response to the public

More information

Nottingham Trent University Course Specification MA Criminology

Nottingham Trent University Course Specification MA Criminology Nottingham Trent University Course Specification MA Criminology Basic Course Information 1. Awarding Institution: Nottingham Trent University 2. School/Campus: School of Social Science/City Campus 3. Final

More information

Institute of Leadership & Management. Creating a coaching culture

Institute of Leadership & Management. Creating a coaching culture Institute of Leadership & Management Creating a coaching culture Contents Introduction 01 Executive summary 02 Research findings 03 Conclusion 07 Methodology 08 Introduction The world of work is complex

More information

Edital Faperj n.º 38/2014 RCUK CONFAP RESEARCH PARTNERSHIPS CALL FOR PROJECTS

Edital Faperj n.º 38/2014 RCUK CONFAP RESEARCH PARTNERSHIPS CALL FOR PROJECTS Edital Faperj n.º 38/2014 RCUK CONFAP RESEARCH PARTNERSHIPS CALL FOR PROJECTS Research Councils UK (RCUK) (http://www.rcuk.ac.uk/) and the Brazilian Council of State Funding Agencies (CONFAP) (www.confap.org.br;

More information

Programme Study Plan

Programme Study Plan Faculty of Economic Sciences, Communication and IT Programme Study Plan Master Programme in Global Media Studies Programme Code: Programme Title: Programme Approval SAGMS Master Programme in Global Media

More information

Single and Multiple-Case Study Designs IS493

Single and Multiple-Case Study Designs IS493 1 2 Research Strategies Basic oppositions Survey research versus Case study quantitative versus qualitative The whole gamut Experiment Survey Archival analysis Historical research Case study 3 Basic Conditions

More information

Departmental Goals and Objectives Department of History and Political Science. History Program

Departmental Goals and Objectives Department of History and Political Science. History Program Departmental Goals and Objectives Department of History and Political Science History Program GOALS Majors will possess basic foundation knowledge. Majors will be critical thinkers. Majors will be effective

More information

Consulting projects: What really matters

Consulting projects: What really matters Consulting projects: What really matters The factors that influence the success of management consulting projects Case 138: het 'Zwijsen future proof' project met de inzet van GEA Results PhD 2014, Bart

More information

The Logical Framework Approach An Introduction 1

The Logical Framework Approach An Introduction 1 The Logical Framework Approach An Introduction 1 1. What is the Logical Framework Approach? 1.1. The background The Logical Framework Approach (LFA) was developed in the late 1960 s to assist the US Agency

More information

Errors in Operational Spreadsheets: A Review of the State of the Art

Errors in Operational Spreadsheets: A Review of the State of the Art Errors in Operational Spreadsheets: A Review of the State of the Art Stephen G. Powell Tuck School of Business Dartmouth College sgp@dartmouth.edu Kenneth R. Baker Tuck School of Business Dartmouth College

More information

Foundation Degree Contemporary Textiles. Foundation Degree Contemporary Fashion. Foundation Degree Design for Interiors

Foundation Degree Contemporary Textiles. Foundation Degree Contemporary Fashion. Foundation Degree Design for Interiors Foundation Degree Contemporary Textiles Foundation Degree Contemporary Fashion Foundation Degree Design for Interiors Abbreviated Programme Specification Containing Both Core + Supplementary Information

More information

Master of Science in Management

Master of Science in Management Programme Syllabus for Master of Science in Management 120 higher education credits Second Cycle Established by the Faculty Board of the School of Business, Economics and Law, University of Gothenburg,

More information

THE REASONING ART: or, The Need for an Analytical Theory of Architecture

THE REASONING ART: or, The Need for an Analytical Theory of Architecture P ROCEEDINGS VOLUME I SPACE SYNTAX TODAY THE REASONING ART: or, The Need for an Analytical Theory of Architecture Professor Bill Hillier and Dr Julienne Hanson University College London, London, England

More information