An introduction to impact measurement
Contents

1 Introduction
2 Some definitions
3 Impact measurement at BIG
4 Setting impact measures for programmes

APPENDICES
A External Resources (separate document)
1 Introduction

What's in this document
This section explains the purpose of this document. Section Two provides some definitions and principles. Section Three outlines existing practice and resources within BIG. Section Four suggests ways to explore options for measuring programme impact and provides some structured questions to help. Appendix A introduces some reference materials on tools and resources.

Purpose
This guide aims to provide an accessible, non-prescriptive introduction to impact measurement at a programme level. Impact measurement is key to helping funders and those they fund demonstrate what they are achieving and learn how to improve. Evidence from grant holders helps funders to identify their own impact, and direct money to where it will make the biggest difference.

Within BIG, there is a well-established outcomes culture that underpins programme design and delivery. A good deal of knowledge and information exists within the organisation, and established internal procedures and external resources are in place.

There is an interest across the funding sector in improving co-ordination and best practice in impact measurement. This is a dynamic and evolving area and there is a daunting amount of information in circulation. There is a proliferation of tools and providers in the field of impact measurement and an acknowledged lack of coordination among providers of impact measurement support. According to New Philanthropy Capital's report Inspiring impact,1 there are over 1,000 different methods available. There also appears to be general consensus amongst funders that there is a shortage of low-cost, 'off the shelf' tools and systems.

This document is not an exhaustive study; rather, it aims to provide a starting point for discussions on impact and ways of thinking about measuring it, to support BIG's mission of bringing real improvements to communities and to the lives of people most in need.

1 ng_impact.aspx
2 Some definitions

What is impact?
The definition of impact we use at BIG is: any effects arising from an intervention. This includes immediate short-term outcomes as well as broader and longer-term effects. These can be positive or negative, planned or unforeseen.

Why is impact important?
Deciding on what impact we want to have is the first and most important question to address when designing any programme, project or intervention. It will enable us to:
- support a mission - e.g. BIG's mission to bring real improvements to communities and to the lives of people most in need, and the priorities identified in England, Scotland, Wales and Northern Ireland
- be accountable to sponsors, stakeholders, customers and beneficiaries
- know what works in the interventions we fund, to adapt and to do more
- improve policy and practice
- make choices and set priorities about the way we design our programmes and about what we fund

What are the relationships between outputs, outcomes and impact?
There is a risk of becoming bogged down in definitions. What matters is that there is a shared understanding of terms amongst all those involved. Impact and outcomes are sometimes used interchangeably. Both terms are about change. Impact tends to describe longer-term, broader change than outcomes.

In the context of BIG's approach to funding, whilst we wish to know about grant holders' and programme outcomes, we also wish to know about a programme's impact. This will involve some kind of aggregation of the outcomes from individual projects, plus perhaps other changes that add up to an overall impact of the programme, such as unintended effects and short- and long-term, positive and negative effects on individuals, organisations or the environment. Outcomes are therefore components of the wider, all-encompassing definition of impact.

In summary:
- Outputs are the products, services or facilities that result from an organisation's or project's activities. For example, in a programme to improve well-being amongst older people, outputs might include the different types of interventions being offered by projects, or the numbers of people overall participating in activities under the programme.
- Outcomes are the changes, benefits, learning or other effects that result from what the project or organisation makes, offers or provides. For example, for the same well-being programme, outcomes might be improvements in clients' physical or emotional health, or projects' improved ability to extend their reach to different client groups.
- Impact is the broader or longer-term effects of a project's or organisation's outputs, outcomes and activities. For example, in addition to an understanding of the extent to which projects funded by the well-being programme have achieved their outcomes, there might be a longer-term change in the way some projects work with their clients, new partnerships may have developed, or policy may have been influenced at a local or wider level.
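To make the outputs / outcomes / impact distinction concrete, here is a minimal sketch of how the well-being example above could be recorded as a simple data structure. It is purely illustrative: the project name, fields and figures are hypothetical and do not come from any BIG system or reporting format.

```python
from dataclasses import dataclass, field

# Illustrative only: the project name and figures below are invented for the
# well-being example and are not drawn from any real programme data.
@dataclass
class ProjectRecord:
    name: str
    outputs: dict = field(default_factory=dict)       # what was delivered
    outcomes: dict = field(default_factory=dict)      # changes for beneficiaries
    impact_notes: list = field(default_factory=list)  # broader, longer-term effects

project = ProjectRecord(
    name="Example befriending project",
    outputs={"weekly sessions delivered": 48, "older people participating": 120},
    outcomes={"clients reporting improved emotional wellbeing (%)": 62},
    impact_notes=[
        "New partnership formed with the local health trust",
        "Approach reflected in the council's older people's strategy",
    ],
)

# Outputs count what was done; outcomes record the change that followed;
# impact notes capture broader or longer-term effects beyond the planned outcomes.
print(project.outputs)
print(project.outcomes)
print(project.impact_notes)
```

Keeping the three levels separate in whatever is recorded, however informally, makes it much easier to talk about each of them later.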
What is impact measurement?
Impact measurement is the process of trying to find out what effect an intervention (such as a funding programme) is having on people, organisations or their external physical, economic, political or social environment. Impact measurement refers to all activities involved in managing and assessing impact - from 'light touch' routine monitoring of outcomes data to 'high level' and resource-intensive evaluation.

Why is it important to measure impact?
So that we can:
- know what has changed and what works about an intervention or funding programme
- know the extent and intensity of the change
- benchmark and make comparisons
- learn and make improvements
- test assumptions
- provide evidence of value for money, which can increase sustainability
- detect any unintended impacts

Don't forget...
It's never too early to start thinking about impact when planning a programme. In seeking to measure impact, the fundamental questions to be answered will remain the same:
o Did we accomplish the change we set out to?
o How do we know?
o What changes happened as a result of our work?
o What unexpected or unplanned effects did our funding have on people, organisations, or the physical, economic, political or social environment?
3 Impact measurement at BIG

What's already in place at BIG?
Measuring impact isn't a brand new way of doing things and the concept of measuring impact is well established in BIG. Outcomes are already embedded in BIG's approach to funding. Each programme has explicit outcomes and applicants must demonstrate how their projects will meet these outcomes. Self-evaluation by grant holders is encouraged as an approach and applicants can bid for funds for this in their applications. A number of internal processes and external resources are in place to support programme-level impact measurement. For example:

Processes
- A Theory of Change exercise is carried out in the development of every BIG programme, to develop programme aims and outcomes.
- The Programme Effectiveness Process is incorporated into every BIG programme, where Measures of Success are set in the areas of impact, learning and programme management, and then annually reviewed.
- Self-evaluation by BIG's grant holders is encouraged as an approach, and evaluation is required of organisations receiving very large grants. Applicants can include funding for monitoring and evaluation in their project budgets, as part of their application.
- BIG sometimes runs support and development contracts alongside funding programmes, which include self-evaluation support for projects.
- BIG commissions programme evaluations when rigorous programme-level evidence is needed to inform policy or practice, or to understand an innovative or high-profile intervention.
- BIG has begun to explore common approaches to measurement by projects in a targeted programme. Common outcomes, indicators or data collection make sense when projects are working to a common goal with the same beneficiary group.

Resources
- An Impact Measurement intranet page pulls together all relevant documents and resources.
- There is detailed guidance on the Programme Effectiveness Process, including support on setting Measures of Success for impact, with examples of different types of measure and how these might be reported annually.
- BIG's resource Getting Funding and Planning Successful Projects helps applicants and grant holders to identify need, develop aims, outcomes, indicators and activities and understand how to learn from their project.
- BIG has funded the development of a number of accessible impact measurement tools: Outcomes Star, Prove It, SOUL Record and Rickter Scale.

Externally, BIG takes a strategic role in discussions on impact. For example, it is currently involved with (and part-funding) the collaborative Inspiring Impact project (alongside VCS organisations, funders and measurement specialists) to accelerate the improvement of high quality impact measurement and encourage and support social organisations to measure their impact better and thereby make an even bigger difference in the lives of the people they support. This group will support change over the next ten years by providing clear guidance along with affordable and accessible training, tools and systems. The 12-month action plan includes objectives to clarify what good impact measurement looks like, campaign to change
the attitudes of leaders in the sector, set up online resources, and make tools more accessible.

Don't forget...
There is a lot of knowledge and expertise and a large pool of resources on impact, outcomes and learning within BIG. The Research and Learning team can provide advice and signpost you to this and a range of other support.
4 Setting impact measures

BIG funds an extremely varied range of organisations with a correspondingly vast range of aims, outcomes and activities. There are consequently many different impact measurement approaches that are applicable, from simple, off the shelf, low cost tools to high-level academic evaluation.

What existing approaches might be helpful?
Tools and methods can measure different things, such as: individuals' progress over time; an organisation's performance against benchmarks or quality standards; impact on specific sectors of the economy, the environment or society. There are a number of tried and tested approaches available for impact measurement, some of which are tailored for specific subsectors. Where appropriate, programmes should align with these, for example:
- Projects aiming to bring about specific changes for vulnerable individuals may choose an Outcomes Star tailored for that particular group, enabling individuals to chart their own progress. Data from Outcomes Stars can be aggregated.
- Projects aiming to bring about change in communities may choose Community Impact Mapping as a way to support organisations that are new to impact measurement to chart the difference they are making.
- For larger, more experienced organisations, tools such as Social Return on Investment, which helps put a monetary value on the project's benefits, or Outcomes Based Accountability may be appropriate ways of measuring impact.
- Large programmes that aim to influence policy or share learning across a wide network may invest in a full-scale programme evaluation over a number of years that develops tailored impact measures specific to that programme.

Well-established practices in measuring impact may already be in place, especially amongst more experienced organisations. When developing impact measures for a funding programme BIG must be aware of this, be sensitive to it, and where possible work with it rather than impose new or different practice on those organisations. Equally, some organisations will be glad of advice or signposting from BIG.

How do we arrive at an appropriate impact measurement approach?
What is appropriate for BIG's programme-level impact measurement will depend on:
- what the programme aims to achieve
- what it is possible to measure
- by what means, how and to whom we (and grant holders) wish to communicate this
- and practical considerations such as resources and capacity of both grant holders and BIG staff.

Impact measurement of grant programmes requires careful thought and planning. There are few blueprints. It can be challenging to work out how best to aggregate data from a range of projects in order to inform progress against programme outcomes and wider impact (a simple illustrative sketch of such aggregation follows the Don't forget note below). It is critical to invest sufficient time for discussion and planning at the programme design stage, in order to arrive at an approach for measuring impact that is appropriate and proportionate. Deciding on the purpose of impact measurement and the key audiences is essential.

A good impact measuring approach should meet certain quality criteria. It should be:
- able to provide evidence and lead to learning that is useful and relevant to all stakeholders
- capable of providing good quality evidence relevant to the programme aims
- tried and tested
- easy to communicate
- proportionate to the skills / capacity of the types of organisations being funded
- adaptable if required, i.e. usable by the different types of project within the programme
- capable of being integrated with or complementing existing reporting systems
- capable of providing information that can be aggregated from project to programme
- within BIG's resources to develop and implement, and offer value for money

There is no cut and dried route to follow to arrive at the right impact measurement approach, particularly given the number of variables in BIG's funding interventions. However, the following steps may help the thinking process:

Step 1 - set up a discussion group or workshop with key stakeholders to discuss programme impact and how it might be achieved, demonstrated and communicated. This will most probably be at the Setting Measures of Success stage of the Programme Effectiveness Process.

Step 2 - go through these six impact measurement questions in the group (further guidance on exploring these questions is provided at the end of this section):
1. What impact is the programme trying to achieve?
2. What level of influence do we expect to have?
3. Who are the likely grant holders and what is their capacity?
4. What will be measured (and how does this align with existing approaches for interventions of this sort)?
5. What kind of evidence will be needed and how will it be collected?
6. What resources are available?
At the end there should be a set of requirements from the impact measurement approach that can help to start exploring possible options.

Step 3 - using the requirements identified, explore some possible approaches using the resources in Appendix A as a starting point.

Step 4 - check the identified options against the impact measurement quality criteria above.

The range of programmes funded by BIG means that there are many variables to be considered in order to arrive at an impact measurement method that is appropriate and realistic to implement. At one end of the spectrum are programmes with broad outcomes, funding locally-based projects and a wide range of beneficiaries. Here BIG may decide to ask projects to report basic accountability data, and encourage them to set up appropriate measurement approaches that help them to improve their services and demonstrate their impact to local stakeholders. At the other end of the spectrum might be innovative programmes with highly specific outcomes, targeted beneficiaries and the potential to inform national policy. These may require more sophisticated bespoke evaluations that enable BIG, the evaluators and the grant holders to use the evidence for significant influence. Most programmes will fall somewhere in between
these two examples. The impact measurement questions attempt to explore these variables in order to stimulate thinking and discussion at the start of the programme.

Don't forget...
- It is critical to invest sufficient thinking and planning time early in the programme's life to identify the intended impact and ways to measure it. There is a lot of good information available (starting with the resources in Appendix A) and Research and Learning colleagues can help.
- This will probably be an evolving process - it may be necessary to go through the questions more than once.
- Measuring impact can be difficult! There is no magic bullet, so aim for a reasonable approach that gets people thinking about evidence for the change achieved. Don't let the measurement approach overtake common sense: not everything that matters can be counted!
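As a rough illustration of the aggregation point above, the sketch below rolls hypothetical project-level returns up into a programme-level summary. The CSV layout, outcome names and figures are assumptions made for this example, not a BIG reporting format; in practice aggregation only works once projects report against agreed common outcomes and definitions.

```python
import csv
import io
from collections import defaultdict

# Hypothetical project returns: each row is one project's report against a
# shared programme outcome. Layout and figures are invented for illustration.
project_returns = io.StringIO(
    "project,outcome,beneficiaries,improved\n"
    "Project A,Improved emotional wellbeing,120,74\n"
    "Project B,Improved emotional wellbeing,80,51\n"
    "Project B,Reduced social isolation,80,60\n"
    "Project C,Reduced social isolation,45,28\n"
)

# Sum project-level figures for each programme outcome.
totals = defaultdict(lambda: {"beneficiaries": 0, "improved": 0})
for row in csv.DictReader(project_returns):
    totals[row["outcome"]]["beneficiaries"] += int(row["beneficiaries"])
    totals[row["outcome"]]["improved"] += int(row["improved"])

# Programme-level view: how many beneficiaries reported each outcome overall.
for outcome, t in totals.items():
    rate = 100 * t["improved"] / t["beneficiaries"]
    print(f"{outcome}: {t['improved']} of {t['beneficiaries']} beneficiaries ({rate:.0f}%)")
```

This is why the quality criteria above include compatibility with existing reporting systems and the ability to aggregate from project to programme: the simplest roll-up is only meaningful if the underlying returns are collected consistently.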
Impact measurement questions

Start your discussion by considering three key questions:
1. What impact is the programme trying to achieve?
2. What level of influence do we expect to have?
3. Who are the likely grant holders and what is their capacity?

Then follow up with some more detailed supplementary questions:
4. What will be measured (and how does this align with existing approaches for interventions of this sort)?
5. What kind of evidence will be needed and how will it be collected?
6. What resources are available?

All the questions below link to each other to help guide you towards identifying some measurement options. Although a broad sequence is suggested, in practice collecting together the information is more like piecing together a jigsaw than applying a linear process and it may be necessary to go through the questions more than once.

Key questions

1. What impact is the programme trying to achieve?
Think about whom or what the impact will primarily be on:
- Individuals, for example improving people's wellbeing
- Communities, for example supporting communities of interest / subsectors or geographical communities
- Organisations, for example changing the way organisations work, deliver services, reach beneficiaries
- The wider sector or society at large, for example broad, intentionally non-specific impact over a wide geographical spread
- The economic, political or physical environment, for example targeting a strategic issue, influencing policy or debate.

2. What level of influence do we expect to have?
Do we seek to have:
- Limited influence beyond beneficiaries and grant holders - impact data will need to be accessible and appropriate to that group and costs of sharing learning are likely to be modest
- Influence on communities of interest - there may be specific approaches that have credibility and will support diffusion of learning, for example tried and tested sector-specific methods. Sharing learning may be achieved via existing networks or infrastructure
- Influence on government, policy makers and other funders - high standards of evidence will be required, requiring potentially higher cost approaches, such as external evaluation.

Consider what has been previously published and whether you can build on existing evidence, and if you should use the same or a complementary approach to measuring impact. There may be learning from other funders who have worked in this area.
3. Who are the likely grant holders and what is their capacity?
In all cases, grant holders' ability to meet the demands of data collection is a key consideration, depending on who they are. Here are some examples:
- Within one specific sector - in which case explore impact measurement approaches that are tried and tested within that sector
- Community groups or small organisations - in which case aim for simple, low cost, off the shelf approaches
- Medium sized organisations - in which case look for simple, low cost, off the shelf approaches and also sector-specific models if relevant
- Large charities or social enterprises / public sector bodies / partnerships and consortia - in which case explore sector-specific approaches if relevant as well as systems and approaches already used or developed by these organisations; these organisations may have experience and capacity to develop and pilot new approaches; it may be appropriate to get them to commission formal evaluations.

Supplementary questions

4. What will be measured (and how does this align with existing approaches for interventions of this sort)?
Consider which aspects of the intended programme impact it makes sense to try and measure, if they can be quantified, if baselines can be set and what these would look like - and what this tells you about the measurement approach that might be most suitable. It is important to be realistic and practical. The other questions will help - i.e. the programme's anticipated level of influence, the grant holders' capacity, and the likely resources available.

5. What kind of evidence is needed and how should it be collected?
Good quality impact evidence should ideally have:
o breadth - covering a representative spread of grants
o depth - digging down to provide information from a number of different perspectives
o objectivity - standing up to scrutiny and capable of being independently verified; even if only case study evidence is available, it should not just comprise good news case studies or anecdotal information
o information about change - robust quantitative and qualitative data consistently collected throughout the relevant period to track the change that is happening

Bearing in mind these quality criteria, think about what is reasonable and practical, taking into account the anticipated types and numbers of beneficiaries and grant holders. In some circumstances, such as a targeted programme with few grant holders, it may be feasible to consult on this. If the programme is highly targeted, it may be appropriate to require all grant holders to adopt a uniform method, particularly if there is a tried and tested tool already in use in that sub-sector. If the programme is large and diverse, it might be more practical to collect information from a sample of grant holders.

Consider if existing systems will support the collection, retrieval and aggregation of data. If additional ways of getting the information are needed, consider what options are appropriate. Setting up simple spreadsheets to collect particular information from the
start of the programme may be a relatively simple and effective option. There may be value in planning a one-off exercise to analyse routinely collected data to explore a particular programme outcome.

Given the nature of the programme and the intensity of engagement with individuals, consider how far it is reasonable to attribute the benefits to the intervention. Does your measurement approach really prove that intervention A caused outcome B? If the funding programme is one part of a multi-agency effort, you might want to go for contribution not attribution, i.e. measuring achievement of the goal with other partners, rather than trying to disentangle the part that BIG's funding played.

6. What resources are available?
If existing tools or methods are appropriate, it may be most cost-effective simply to signpost grant holders to these, depending on the quality and range of impact data the programme demands. If a particular approach to impact measurement is required of grant holders, it may be necessary to include specific costs for impact measurement in grants.

Is funding a development and support contract a viable option? For example, if a relatively technical model such as SROI is identified as the best tool for measuring the programme's impact, training would probably need to be available for grant holders and this may be most efficiently achieved via one provider, depending on the number and geographical spread of likely grants.

Having identified the most appropriate options for measuring impact, it will need to be established whether internal capacity in terms of staffing and time exists to manage it. If the programme warrants an external evaluation, it will be necessary to consult the Research and Learning team.

Bear in mind cost versus rigour: self-evaluation is cheap but not particularly rigorous. Different impact measurement approaches have different costs and levels of rigour.
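For readers unfamiliar with the SROI model mentioned under question 6, the sketch below shows the basic arithmetic behind the headline figure: the present value of the social benefits attributed to an intervention divided by the value of the investment. The figures are invented and the example is deliberately simplified; a full SROI analysis also involves stakeholder engagement, evidencing and valuing outcomes, and adjusting for deadweight, attribution and drop-off.

```python
# Simplified, illustrative SROI-style calculation with invented figures.
# A full SROI analysis values outcomes with stakeholders and adjusts for
# deadweight, attribution, displacement and drop-off; that is omitted here.

investment = 50_000      # total value of inputs (grant plus other resources)
annual_benefit = 30_000  # assumed yearly social value created by the project
years = 3                # how long the benefit is expected to last
discount_rate = 0.035    # rate used to express future value in today's terms

# Discount each year's benefit back to a present value, then compare to inputs.
present_value = sum(
    annual_benefit / (1 + discount_rate) ** year for year in range(1, years + 1)
)
sroi_ratio = present_value / investment

print(f"Present value of benefits: {present_value:,.0f}")
print(f"SROI ratio: {sroi_ratio:.2f} : 1")
```

A ratio above 1:1 suggests the value created exceeds the investment, but the result is only as credible as the assumptions and evidence behind it - which is why training and support for grant holders matters if such a model is required.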
Acknowledgements

In addition to the documents listed in Appendix A, a number of sources were drawn on to compile the information in this document, principally:
- New Philanthropy Capital
- Charities Evaluation Services
- The Guild
- New Economics Foundation