Key Principles and Best Practice for Evaluation of RD&E Programs and Projects

Research outputs need to be monitored and evaluated to determine whether research outcomes are being met. Feedback through this process is essential to the RD&E system. Evaluations are conducted to provide information that assists with overall management and to provide objective evidence about stated achievements. Evaluation is undertaken to:
- help understand and improve performance; and
- identify, calculate and communicate costs and benefits.

The following principles generally apply at program, project or organisational level.

Early design of evaluation strategy
- The evaluation strategy should be worked out at the outset, as part of the design phase for RD&E programs and projects.
- Performance measurement should be factored into the design of the program, and decisions about what information to collect should be made during this phase. It is often difficult and costly to collect information and data later, particularly if systems have not been established to do so.
- Take care deciding what information and data you need to collect: do not impose onerous collection requirements; be selective and use innovative measures.
- Articulating a clear program logic or theory of change will make it much easier to plan an appropriate evaluation strategy and identify supporting data requirements.
- The evaluation process and evaluation results should facilitate learning and program improvement. Evaluation should therefore be seen as an integral and helpful component of a program, rather than something to be added on towards the end.

Regular review and monitoring
- Regular review and monitoring should be part of an integrated approach to program management and is an essential precursor to an effective program evaluation.
- Organisations need to monitor a program's progress and be able to assess where it is at any point in time. This will only be possible if the evaluation has been built into the design of the program.
- Constant questioning is good, for example: Is it working? Is it delivering the expected benefits? Is it still relevant now? Has something changed? Is increased resourcing or a change to program design needed to get it back on track? Should it continue to be funded, or is there some better alternative?
  o Program managers need to be prepared to make the hard decisions and cut something that is not working.

Transparency and independence in evaluations
- The evaluation process should be as transparent as possible.
  o It will not always be possible or practicable to have an external or independent body conduct major evaluations.
  o Participatory evaluation may be more appropriate in some cases.
- Regardless of the options chosen, it is crucial to ensure that governance arrangements are sound. Involve outsiders in the evaluation, or use an independent reference group to oversee the evaluation.

Method for evaluation
- A diversity of approaches may be used to evaluate RD&E programs; no one method is right or suits all programs.
  o The method to be used depends on the questions being asked and the nature of the program being evaluated.
- Whatever method is used, the evaluation needs to be systematic, robust and evidence-based.
- A list of documents and guidelines that can assist with determining the appropriate method is at Appendix A.

Utilisation of findings
- When planning an evaluation it is important to have an end point in mind and to incorporate a complete feedback loop.
- The findings need to be utilised to improve future investments and decision making, as well as to improve the management of a program or project from an organisational perspective. The learning needs to be disseminated.
- This applies to both funders and investors in terms of assessing whether goals and outcomes have been met.
Communication of evaluation processes and results
- The needs of different stakeholders (e.g. program clients, decision makers) should be considered in developing a communication strategy. This is necessary to facilitate engagement with the evaluation process and to ensure results are taken on board.
- It is important for departments and agencies to disseminate learnings from evaluations. In particular, mistakes are more likely to be repeated if poor results are delayed, not released or sanitised.

Performance measures
- Ensure you have both efficiency and effectiveness measures.
- Collect only what is relevant to monitoring and evaluating the program's performance.
- A clear program logic and understanding of the results the program is intended to achieve will help identify appropriate performance measures.

Cost-benefit analysis
- Evaluation helps us understand how to make better investment decisions in the future and how to manage the programs and projects that underpin these investments.
- Cost-benefit analysis will strengthen the evaluation process; where possible and appropriate, both qualitative and quantitative analysis should be undertaken.
- There are some common agreed standards of rigour that cost-benefit analysis, regardless of the methodology used, should address. This will help to facilitate comparison of results derived using different evaluation methodologies. These standards are contained at Appendix B.
Appendix A: Evaluation Reference Documents

The following is a list of evaluation documents that you may find useful in developing your own evaluation processes, procedures and techniques. The Australasian Evaluation Society also provides links to many overseas documents, conferences and regional networks.

Member documents
- The Department of Agriculture and Food WA: The Department of Agriculture and Food Evaluation Policy
- Department of Primary Industries Victoria: DPI Agriculture and Fisheries Investment Performance Measurement: Evaluation, Monitoring and Reporting Guideline (2010)

CRRDC documents
- CRRDC: Guidelines for Evaluation
- CRRDC: Instructions for the Preparation of Standardised CBAs

Australian Government documents
- Australian Government Natural Resource Management: Caring for our Country: monitoring, evaluation, reporting and improvement strategy (2010)
- Public Administration Today (publication): Measuring Contributions to Sustainable Development: A Practical Approach
- Bureau of Rural Sciences: Signposts for Australian Agriculture (2004)
- ANAO: Expenditure Review Principles
- PM&C: Implementation of Programme and Policy Initiatives (2006)
- DIISR: Framework of Principles for Innovation Initiatives
- DIISR: Innovation Metrics Framework
- Not-For-Profit Journal (publication): Techniques of Social Evaluation

Documents from overseas agencies
- BIS (UK): Guidance for using Additionality Benchmarks in Appraisal
- OECD: Enhancing Public Research Performance through Evaluation, Impact Assessment and Priority Setting
- HM Treasury (UK): The Green Book: Appraisal and Evaluation in Central Government (2003)
- Cabinet Office (UK): The Magenta Book: Guidance Notes on Policy Evaluation (2008)
- European Commission: Smart Innovation: A Practical Guide to Evaluating Innovation Programs (2006)
- UNFPA: The Programme Manager's Planning, Monitoring and Evaluation Toolkit
- CDC (US): Framework for Program Evaluation in Public Health (1999)
- NESTA: Evaluating the Enterprising Further Education Pilot (2010)
- Research Councils UK: Evaluation: Practical Guidelines (2005)
- Canadian Evaluation Society: Program Evaluation Standards (2008)

Others
- Federation of Australian Scientific and Technological Societies: Giving Preparedness a Central Role in Science and Innovation Policy
Appendix B: Standards for Cost-benefit Analysis

There are some common agreed standards of rigour that cost-benefit analysis, regardless of the methodology used, should address.

Counterfactual
- Ensure you take the counterfactual situation into account in a robust way. What would have happened in the absence of this particular program or project? For example:
  o Take into account that someone might have done this piece of research over a similar timeframe anyway.
  o Consider whether the work brought forward research and outcomes that would have taken place anyway, but at a later date.
- Using the appropriate counterfactual helps ensure that the true cost of funding the particular program or activity is taken into account in the assessment of its benefits. It also makes it easier to compare the results of different evaluations.

Cost of implementation
- Ensure all costs of implementation, for example by business, industry, governments or consumers, are taken into account when calculating the net benefits arising from the program.

Opportunity cost
- It is essential to take into account the opportunity cost of directing funds to a particular program or project, as the funds could have been used for another purpose by all parties involved.
  o At a minimum, use the long-term Government bond rate as the discount rate. This takes into account the fact that investors are foregoing the opportunity to generate a risk-free return on their investment equal to the long-term Government bond rate. (A worked sketch of discounting at this rate follows the Additionality and crowding out section below.)

Additionality and crowding out
- Don't count benefits resulting from program activities or outputs that are not additional, for example benefits resulting from research that would have been undertaken in any event. See the Counterfactual section above.
- Take into account any crowding out. This occurs where expenditure that would have been undertaken by the private sector is displaced by government expenditure financed by borrowing, which may lead to increases in interest rates that curb private expenditure.
  o Crowding out is more likely to be an issue in relation to individual publicly funded projects, where the benefits are likely to be used by relatively narrow groups.
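To make the counterfactual and opportunity cost points concrete, the minimal sketch below (in Python) discounts a program's incremental benefit stream, that is, benefits net of what the counterfactual would have delivered anyway, at an assumed long-term Government bond rate. The 5 per cent rate and all dollar figures are hypothetical placeholders for illustration, not prescribed values.

    # Minimal sketch: NPV of the *incremental* benefit stream, discounted at an
    # assumed long-term Government bond rate. All figures are hypothetical.

    BOND_RATE = 0.05  # assumed bond rate: the risk-free opportunity cost of funds

    # Year-by-year benefits with the program, and under the counterfactual
    # (e.g. the same research done anyway, but at a later date). Units: $'000.
    benefits_with_program = [0, 300, 500, 500, 400]
    benefits_counterfactual = [0, 0, 100, 300, 400]
    costs = [500, 100, 0, 0, 0]  # all implementation costs, whoever bears them

    def npv(cash_flows, rate):
        """Discount a list of yearly cash flows (year 0 first) to present value."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    # Only the additional benefit relative to the counterfactual is counted.
    incremental = [w - c for w, c in zip(benefits_with_program, benefits_counterfactual)]

    net_benefit = npv(incremental, BOND_RATE) - npv(costs, BOND_RATE)
    print(f"NPV of incremental benefits less implementation costs: {net_benefit:.0f} $'000")

Note how the counterfactual adjustment shrinks the benefit stream substantially: counting the full benefits_with_program series would overstate the program's contribution.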
Benefits from a program, project or activity
- Identify economic, social and environmental benefits wherever possible.
  o Social and environmental benefits can be more difficult to put a value on, but can be very relevant to science and innovation programmes.
- Ensure you have external verification of claimed benefits.
- Look at preparedness as an outcome or benefit of science and innovation:
  o Preparedness is becoming more relevant as economies and societies become more connected in a global sense, particularly with regard to terrorism and pandemic diseases, for example.
  o The concept of preparedness as an outcome class also relates to forewarning markets and the general community of future risks, in so doing bringing forward responses and mitigating future (higher) costs.
  o Preparedness as an outcome class is also a useful feature of investments in capability-building.
- For economic benefits:
  o Changes in consumption are more meaningful than changes in Gross Domestic Product.
  o Take into account time lags and discount future economic benefits to net present value terms.
  o The extent to which benefits can be attributed to the project or programme should be taken into account (see the Attribution section below).
  o When quantifying future economic benefits: involve the expected end-users and beneficiaries; take account of previously observed impacts from similar programs in that particular sector; and take into account historically achieved productivity growth rates in that sector.
  o Factor risks into the impact assessment process. One way to do this is to use expected value analysis, where probability estimates are assigned to alternative outcomes and expected outcomes are generated (see the sketch following the Spillovers section below).

Spillovers
- Identify any spillovers that are likely to occur. These are benefits that cannot be appropriated fully by one party or group (including businesses and individuals).
  o The merits of public funding where there are no or few spillovers are questionable.
  o Programs or projects which produce substantial spillovers are more appropriately funded by government.
  o The existence of positive spillovers is not, on its own, a sufficient reason to justify public expenditure.
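The sketch below illustrates the expected value analysis mentioned above: probability estimates are assigned to alternative outcomes, the outcomes are probability-weighted, and the result is discounted to present value after a time lag. The scenarios, probabilities, lag, horizon and 5 per cent discount rate are all hypothetical assumptions chosen for illustration.

    # Minimal sketch of expected value analysis: probability-weighted outcomes,
    # discounted to net present value. All inputs are hypothetical placeholders.

    DISCOUNT_RATE = 0.05

    # Alternative outcomes for annual benefit ($'000) with assigned probabilities.
    scenarios = [
        {"name": "full adoption",    "annual_benefit": 600, "probability": 0.3},
        {"name": "partial adoption", "annual_benefit": 250, "probability": 0.5},
        {"name": "no adoption",      "annual_benefit": 0,   "probability": 0.2},
    ]

    # Expected annual benefit: sum of (outcome x probability) across scenarios.
    expected_annual = sum(s["annual_benefit"] * s["probability"] for s in scenarios)

    # Discount over an assumed 10-year benefit horizon that starts after a
    # 3-year research lag; the lag reduces the present value of benefits.
    LAG_YEARS, HORIZON = 3, 10
    expected_npv = sum(
        expected_annual / (1 + DISCOUNT_RATE) ** t
        for t in range(LAG_YEARS, LAG_YEARS + HORIZON)
    )
    print(f"Expected annual benefit: {expected_annual:.0f} $'000")
    print(f"Expected NPV of benefits: {expected_npv:.0f} $'000")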
Attribution
- In many cases, the particular program may be only one of a number of important contributors to an outcome.
  o Where this is the case, it can be difficult to decide what proportion of the outcome can reasonably be attributed to the existence of the particular program.
  o Clearly state any assumptions made about attribution rates.
- It is highly unlikely that one intervention will be the only contributor to an outcome, particularly for longer-term programs and for environmental and social change.
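As a final illustration, an attribution assumption can be applied as an explicit, clearly stated multiplier on the evaluated benefit. In the sketch below, the 40 per cent attribution rate and the outcome value are hypothetical assumptions of the kind that should be stated openly in an evaluation report.

    # Minimal sketch: applying a clearly stated attribution assumption.
    # The program is assumed (hypothetically) to account for 40% of the outcome;
    # the remainder is attributed to other contributors (co-investors, prior
    # research, complementary programs).
    total_outcome_npv = 2_000   # $'000, NPV of the whole observed outcome
    ATTRIBUTION_RATE = 0.40     # stated assumption: the program's share

    attributed_benefit = total_outcome_npv * ATTRIBUTION_RATE
    print(f"Benefit attributed to the program: {attributed_benefit:.0f} $'000")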