Evaluating Government Communication Activity: Standards and Guidance
Contents
- Introduction
- PROOF: five guiding principles for evaluation
- The big IDIA: the four-stage evaluation process
- Stage 1: Identify
- Stage 2: Develop
- Stage 3: Implement
- Stage 4: Analyse and report
- Conclusions
- Appendix: Recommended metrics
Introduction

As government communicators, we're all aware of the need to make every piece of work that we produce as effective and efficient as possible. To do this, we need to understand what's already working well and where there's room for improvement. This in turn requires us to evaluate our work and apply the learning from this evaluation to future activity.

We're also increasingly required to demonstrate how we're applying evaluation in our day-to-day jobs, through the plans and reports that we submit to the Efficiency and Reform Group (ERG) for activities with a spend of £100,000 or more, and the information that we supply to feed into the annual communication plan. The revised Government Communication Network (GCN) Core Skills for Government Communicators will set out clear evaluation standards that we should all follow, based on our grade and the discipline that we work in.

To help us evaluate our activity effectively, in line with expected standards, we need clarity on what good evaluation practice looks like. This guide sets out an approach to evaluation that should be followed as a minimum for all government communication activity, regardless of size, discipline or budget. The approach is pragmatic and focuses on helping you to produce the best possible evaluation given the scope of your activity and the time, resource and budget that you have available for evaluation. By following the guide, you can be confident that you will produce an evaluation that meets the required standards for your activity and for your role.

If you're new to evaluation, use the guide to help you get started, recognising that it will take time to build up your approach and gather evidence of what's working. Remember, a partial evaluation is almost always better than no evaluation at all. If you're already evaluating your activity effectively, ensure that you're following the standards required of you, making modifications as necessary. Think about how you can share what you've learned with others to help them evaluate more effectively.

By evaluating the activity that we carry out, we will be able to improve the effectiveness and efficiency of our work over time. We will also be able to demonstrate the contribution that well-planned and executed communication activity makes to government overall, and hence justify further investment in our work.
PROOF: five guiding principles for evaluation

Whatever the size or scope of your communication activity, following these guiding principles will help to ensure that your evaluation is as effective as possible.

P: Pragmatic. Best available within budget, not best ever.
A pre-planned, but partial, evaluation is better than no evaluation at all. Be transparent: acknowledge the gaps in your evaluation and the implications of these gaps. Even if you are not able to fully quantify the effect of your communication activity, you will still be able to draw valuable learning from the evidence that you have obtained.

R: Realistic. Prove what you can, acknowledge what you can't.
Don't worry if you can only collect a small amount of data in the short term. By establishing a robust evaluation framework that is linked to a clear set of communication objectives, you will be able to interpret and analyse whatever information you gather. Over time, you will be able to build on this knowledge, increasing the amount of data that you collect from each subsequent activity.

O: Objective. Approach your evaluation with an open mind.
Be honest and constructive about what was achieved, so that we can all learn for the future. Learn from your successes and from things that didn't work as well as you'd hoped. Use this to refine your future strategy.

O: Open. Record and share as much as possible.
Share your learning as widely as possible so that colleagues can also benefit from your experience. Work with GCN to develop a detailed case study.

F: Fully integrated. Integrate evaluation into activity planning and delivery.
Plan ahead. Start thinking about how to evaluate your activity as soon as possible, ideally well before it begins. This will help you to put the right mechanisms in place for subsequent measurement and data collection. Retrospective evaluation is often less effective because the right data may not have been collected or objectives may not be measurable.
The big IDIA: the four-stage evaluation process

You can evaluate all kinds of communication activity, including press and media management, marketing and internal communication. Follow the four-stage process set out in Figure 1 below and you will carry out an effective evaluation in line with the five guiding principles for evaluation.

Figure 1: The four-stage evaluation process
1. Identify the scope of your project
2. Develop your evaluation plan
3. Implement: source data to measure performance
4. Analyse and report performance against the plan
Stage 1: Identify the scope of your project

Checklist
Task: to define what you need to evaluate by asking:
- What activity am I evaluating?
- What do I already know?
- What is my evaluation expected to achieve?
Output: summary of your proposed evaluation approach

What activity are you evaluating?
Begin by establishing exactly what you are going to evaluate. What activities do you need to include? Are these activities part of a wider communication strategy?

Identify the time period over which you will evaluate the activity. For a one-off piece of activity with clear start and end dates, this is generally straightforward. For ongoing activity, you need to identify and agree appropriate time periods for evaluation, i.e. how often you will assess performance against objectives.

Examples of types of activity
- Press: For ongoing reputation management work, it's often most effective to track the effect of your work over time. Identify the key messages that you want to track and monitor coverage on a regular basis, providing monthly or quarterly updates rather than reporting on individual activities.
- Marketing: If the activity that you are evaluating runs across a range of channels (which might include, for example, paid-for activity, partnerships, leaflets, a website or social media), check that you've included all of them in your plan.
- Internal communication: If you are evaluating the effect of a change management programme, ensure that you include all the elements of the programme in your evaluation plan (e.g. briefings to senior staff, communication via email and the intranet, events, training etc).
What do you already know?
Review similar previous activity that your team or others have run to see what you can learn. This will give you a benchmark to measure performance against and may help to refine your objectives and your approach to the activity that you are about to run.

Useful tip: gathering evidence for ERG exemption requests
If your activity is subject to ERG approval, use your previous evaluation results as evidence for why you believe it will meet its objectives in sections 3 and 10 of the exemption request form.

What is your evaluation expected to achieve?
Ask the following questions to help identify what is expected from your evaluation:
- What are the key questions that your evaluation report needs to answer?
- What level of detail is expected from your report and what format does it need to be in?
- Is any budget available for your evaluation?
- Who will do the work? This might include you and others on your team, research, analysis and evaluation specialists from within your hub or the Shared Communications Service, an external agency or a combination of all of these.

Useful tip: ERG evaluation standards
If your activity is subject to ERG approval, there is a standard format that you must use for your evaluation report. A report template can be found here. Ensure that your evaluation plans are designed with this in mind.

Summarising your approach to evaluation
Before beginning work, set out your proposed approach to evaluation, basing this on the information that you have gathered so far. Depending on the scope of the activity that you're evaluating and the person for whom you are carrying it out, this may be a detailed proposal to stakeholders in your department, an email to your manager or a note to yourself. Whatever format you use, you can find a useful template here that sets out the questions you should consider.

Useful tip: managing expectations
Sometimes, there may be a gap between the time, resource and budget that you have available and others' expectations of what your evaluation can achieve. Discuss whether it's better to increase available resource, budget or time, or to lower expectations. Decide this early in the process.
Stage 2: Develop your evaluation plan

Checklist
Task: to define how you'll measure success by:
- Setting out activity objectives
- Defining target audiences
- Mapping out how your activity will work
- Setting performance metrics
- Agreeing KPIs and targets
Output: draft evaluation plan

Setting out activity objectives
Clear and measurable communication objectives are the cornerstone of any evaluation plan. They set out what your activity aims to achieve and the overall goals against which you should judge success. Your objectives should already be identified in the communication strategy for the activity you are evaluating. It may be useful to summarise them as shown in Figure 2, clearly demonstrating how each sub-objective links to the overall communication and departmental objective.

Start by identifying the overall departmental objective and the issue it is designed to address. Your activity will be part of a set of interventions which link back to this objective. Next, identify the communication objective. This is the overall role that communication is expected to play in achieving the policy objective. Different channels or activities may have distinct roles to play in achieving the overall communication objective. Each of these should be set out as a separate communication sub-objective. You may only be evaluating the effectiveness of one communication sub-objective, but it is important to understand how this is expected to contribute towards the overall communication objective.

Ensure that all communication objectives are clear and set out what each activity was put in place to achieve, together with a measure of success. Always consider whether or not you will be able to prove that a communication objective has been met. If not, it will need to be revised. You may find it useful to map out your objectives as shown in Figure 2.
Figure 2: Hierarchy of objectives
- Departmental objective: put in place to address a specific issue. Includes policy development, policy delivery and reputation management.
- Communication objective: the role that communication will play in achieving the departmental objective.
- Communication sub-objectives: the role that individual activities or channels will play in meeting the overall communication objective.

Sample objectives might include:

Press
Departmental objective: To ensure that compliance with a new tax regulation is above 80%.
Communication objective: To ensure that the majority of the general public understands the reasons why complying with the regulation benefits the economy.
Press-specific sub-objective: To ensure that the public is given a fair and balanced view of the policy, via the media.

Marketing
Departmental objective: To get 10,000 more people working as community service volunteers in your area.
Communication objective: To get 40,000 people in the area to register as potential volunteers on your community website.
Sub-objective 1: To increase the proportion of the public who recognise the value of volunteering from 20% to 40%.
Sub-objective 2: To get 80,000 people to visit the website and find out more about how to volunteer.
Sub-objective 3: To secure 40,000 incremental registrations.

Internal communication
Departmental objective: To ensure that unauthorised staff absences fall by 50%.
Communication objective: To ensure that all staff are able to follow the correct processes for reporting absences from work.
Sub-objective 1: To ensure that all staff recognise that there is a policy for reporting absences from work and that they must follow it.
Sub-objective 2: To ensure that all staff understand how to access the guidance on how to report absences.
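To make the link between objectives and measurement concrete, the sample marketing objectives above could be captured in a simple structure like the sketch below. This is a hypothetical illustration, not part of the guide's templates: the field names are invented, and the figures are taken from the sample objectives.

```python
# Minimal sketch (hypothetical): the sample marketing objectives above, captured
# so that each sub-objective carries an explicit metric, baseline and target
# against which the evaluation plan can later be checked.
objectives = {
    "departmental": "10,000 more people working as community service volunteers",
    "communication": {
        "statement": "40,000 people register as potential volunteers on the website",
        "sub_objectives": [
            {"id": 1, "statement": "Raise recognition of the value of volunteering",
             "metric": "% of public recognising value", "baseline": 20, "target": 40},
            {"id": 2, "statement": "Drive visits to the volunteering website",
             "metric": "website visits", "baseline": 0, "target": 80_000},
            {"id": 3, "statement": "Secure incremental registrations",
             "metric": "registrations", "baseline": 0, "target": 40_000},
        ],
    },
}

# A sub-objective is only usable for evaluation if it names a metric and a target.
for sub in objectives["communication"]["sub_objectives"]:
    measurable = bool(sub.get("metric")) and sub.get("target") is not None
    print(f"Sub-objective {sub['id']}: measurable = {measurable}")
```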
Defining target audiences
All communication activity has an end audience: the people at whom it is ultimately targeted. But an activity may also have an intermediary audience: a group of people targeted so that they will deliver the message to the end audience on your behalf. Typical intermediaries include the media, stakeholders (such as non-governmental organisations and charities involved in delivering a policy objective) and commercial partners working with you to deliver a piece of marketing activity.

If your activity includes an intermediary audience, you should evaluate:
- how effectively the intermediary was engaged by the activity
- how effectively the intermediary communicated the message to the end audience.

Audiences might include:

Press
Where you are using a media engagement or PR campaign to try to raise volunteering levels among the public overall:
- Intermediary (the press): How did the media react to the activity targeted at them? Did they feature your messages? What volume and quality of coverage did you get for the story?
- End audience (the general public): How many people volunteered as a result?

Marketing
Where you are trying to engage a partner to run events promoting volunteering on your behalf:
- Intermediary (partner): How effectively did you engage the partner, and how many events did they run as a result?
- End audience (the general public): How many people volunteered as a result?

Internal communication
If you are training staff on how to communicate better with members of the public so that customer satisfaction improves:
- Intermediary (staff): How effectively did you train them? Did they put their skills into practice in their interactions with the public?
- End audience (the general public): Did they notice that they received a better service from trained staff? Were they more satisfied?

Useful tip: intermediary audiences
If your activity includes an intermediary audience, make sure you include performance metrics that enable you to: (1) evaluate the effect of your activity on the intermediary; and (2) evaluate the effect of their activity on your end audience.
Mapping out how your activity will work
Spend some time thinking about how your activity is expected to achieve its objectives. If it is successful, what messages will the target audience(s) see, what will they think or feel, and what will they do? Mapping out the steps to success will help you identify the right performance metrics to evaluate your activity. Draw on any behavioural insight[1] modelling or customer journey work that has already been done.

Setting performance metrics
Having identified the objectives and target audiences for your activity and mapped out how you expect it to work, you need to build a set of performance metrics. These are the measures you will use to assess the activity's performance against its objectives and to identify which elements of the activity were most and least successful. Make sure that your evaluation plan includes a range of performance metrics from the following five categories:

Figure 3: The five types of performance metrics for evaluation
1. Inputs: the activity carried out
2. Outputs: how many people had the opportunity to see or hear your activity?
3. Out-takes: what was its immediate effect on them?
4. Intermediate outcomes: did they do anything as a result of your activity?
5. Outcomes: did you achieve your overall objective?

Using these five categories as a guide will help you to pick the right performance metrics for your activity. The categories can be applied equally to the full range of press, marketing and internal communication activity.

[1] For more on behavioural insight, see MINDSPACE: Influencing behaviour through public policy (Institute for Government/Cabinet Office).
Inputs
Inputs include details of the actual activity that has been undertaken, including the channels that you used to communicate. The channels to include will have been identified at Stage 1.

Examples of input metrics
- Press: number of press releases sent out or engagement work carried out.
- Marketing: paid-for media plan; website or digital space created; number of partners contacted, types of message shared or requests made of partners (intermediary audience).
- Internal communication: number of staff events organised; number of briefings or training sessions organised; web content created and put on to the intranet.

Include the costs of carrying out the activity in your input metrics. Include all external and internal costs and time spent (including staff time). When comparing results for more than one piece of activity, use a consistent methodology to record the costs and time spent against each one. This will be essential for calculating return on marketing investment.[2]

Useful tip: choosing the right performance metrics
At this stage, don't worry about whether you can get data for the performance metrics you choose. Pick those that you would need for an effective evaluation. Stage 3 looks at how to secure data and deal with gaps.

[2] For more information, refer to Evaluating the financial impact of public sector marketing communication: An Introduction to Payback, Return on Marketing Investment (ROMI) and Cost per Result.
Outputs
Output metrics measure the number of people who had the opportunity to see or hear your activity, regardless of whether they recall or recognise it. Try to include:
- Reach: the total number of people or organisations in your target audience who were exposed to your activity.
- Frequency: the number of times they saw or heard the activity.

Examples of output metrics
- Press: number of pieces of coverage achieved; frequency of exposure to coverage by the end audience.
- Marketing: proportion of the target audience reached by media activity; number of impressions (one impression is a single page view) on the web page; number of stakeholders you contacted and number of contacts made.
- Internal communication: number of staff attending events and training sessions; number of impressions on the intranet.

Useful tip: activity mapping
Use the activity map that you created earlier to help you identify the right out-take, intermediate outcome and outcome performance metrics for your activity.

Out-takes
Out-take metrics look at the impact that the activity had on your target audience's awareness, understanding and attitude. Think about what you wanted people to recall, think or feel about your activity and include performance metrics that allow you to measure this.

Useful tip: other metrics
Performance metrics for out-takes, intermediate outcomes and final outcomes cannot be standardised in the same way as those for inputs and outputs. They will need to be tailored to reflect how you expect your activity to work and what it is trying to achieve.
Examples of out-take metrics
- Recall: How many people are aware of your activity or the message(s) that it is promoting?
- Think: How many people understand the key messages that your activity is trying to get across?
- Feel: What effect has your activity had on people's attitudes? Do they intend to behave differently as a result of your communication?

Intermediate outcomes
Intermediate outcome metrics capture any action taken by the target audience as a result of your activity which may lead to the eventual end behaviour. Think about including performance metrics that allow you to measure the following:

Examples of intermediate outcome metrics
- Talk: How many people discussed the activity or its message with peers, friends and family? How many partners or stakeholders discussed it with their work colleagues?
- Direct response: How many people responded to or otherwise interacted directly with you as a result of the activity? This could include visiting a website, attending training, ringing a phone line or having a face-to-face discussion.
- Indirect response: How many people responded to or otherwise interacted with third parties as a result of your activity? This could include people interacting with stakeholders or partners or other local and national services.
- Other actions: How many people took (or claim to have taken) any other action as a result of your activity? For example, in a stop-smoking campaign this might include people buying books, patches or other similar products.

Refer back to your activity map and make sure that you include measures to cover the full range of actions that people could have taken.
Outcomes
Outcome metrics look at the effect that an activity has had on the overall communication and policy objectives that it was put in place to address. Include metrics that enable you to measure whether your communication objective was met and the effect your activity had on the wider policy objective. This could be related to changing behaviour, adopting a service, increasing positive reputation, increasing understanding and awareness or increasing participation.

The appendix to this document provides a list of recommended metrics for each component. You can use this to help identify the appropriate metrics for your evaluation plan. This will enable you to compare your activity with other cross-hub work and will also ensure that your plan meets the standards required by ERG where applicable.

Agreeing KPIs and targets
You may decide to set a small number of key performance indicators (KPIs) based on the performance metrics that you have chosen, or KPIs may already have been set by your team, department or hub. KPIs are measures of success that can help you track how well an activity is progressing towards its end objective or contributing to a broader communication strategy.

KPIs are particularly useful where your activity won't achieve its end objective for some time. They will enable you to track how effectively the activity is progressing towards this end objective and may provide timely information to make changes to the existing communication plan, if necessary. KPIs are easier to set when the activity has run a number of times before, as you will have a better feel for which metrics most accurately predict how it will ultimately perform.

KPIs may be single performance metrics or combine several metrics. You may choose to bring together responses by phone, web or face-to-face to give a total number of responses. Where people's overall satisfaction is driven by four or five different factors, you could bring these together in one composite satisfaction measure (a simple sketch of such a composite is shown at the end of this stage).

How to set KPIs
- You don't need KPIs to evaluate effectively: only set them if they are useful in helping you understand whether an activity is progressing as expected or contributing effectively to overall goals.
- Ensure that your KPIs are measurable.
- Select no more than five KPIs for your end audience (and the same for any intermediary audiences).
- Set targets for each KPI and specify the time-frame in which you expect to achieve them.
- Make targets as meaningful and as realistic as possible, drawing on previous results where applicable. The less historical data you have available, the broader your targets should be.
- As far as possible, benchmark targets against other activity carried out by you and others within your team, department or hub. This will give you a broader context for what success looks like.

Creating the evaluation plan
Your evaluation plan should bring together:
- the objectives and target audiences that you will evaluate performance against
- the performance metrics (and KPIs and targets, if you're using them) that you will use in your evaluation.

Your team or department may already have a standard template for evaluation plans. If not, you may find these templates for activities with and without an intermediary audience useful.

Useful tip: multi-channel activity
For more complex activities, include a separate set of performance metrics for each activity/channel and audience.
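As noted under "Agreeing KPIs and targets", a KPI can combine several performance metrics into a single figure. The sketch below is a hypothetical illustration, not part of the guide's templates: the channel names, weights and figures are invented. It shows a total-response KPI built from phone, web and face-to-face responses, and a composite satisfaction score built from several weighted factors.

```python
# Hypothetical example: combining individual performance metrics into KPIs.

# KPI 1: total responses across channels, compared with a target.
responses = {"phone": 1_250, "web": 4_800, "face_to_face": 320}
total_responses = sum(responses.values())
target_responses = 6_000
print(f"Total responses: {total_responses} (target {target_responses}, "
      f"{total_responses / target_responses:.0%} achieved)")

# KPI 2: composite satisfaction score from several survey factors (0-10 scales),
# weighted by how strongly each factor is assumed to drive overall satisfaction.
factors = {"ease_of_use": 7.8, "speed": 6.9, "staff_helpfulness": 8.4, "clarity": 7.1}
weights = {"ease_of_use": 0.3, "speed": 0.2, "staff_helpfulness": 0.3, "clarity": 0.2}
composite_satisfaction = sum(factors[f] * weights[f] for f in factors)
print(f"Composite satisfaction score: {composite_satisfaction:.1f} / 10")
```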
Stage 3: Implement: source data to measure performance

Checklist
Task: to identify and gather evaluation data by:
- Identifying available data and evidence
- Creating proxies and assumptions
- Using monitoring, market research and feedback
- Reviewing any remaining gaps
- Agreeing who will collect data
- Completing the evaluation plan
Output: completed evaluation plan

At this stage, you need to source data and evidence for the performance metrics you identified in Stage 2. Budget, time and resource may restrict the amount of data you are able to gather, but this should not stop you evaluating your activity. It's better to produce an evaluation report with gaps in it than to produce nothing at all, provided that you are clear about what is missing.

Identifying available data and evidence
Begin by identifying the performance metrics for which data is immediately available. This will come from three main sources:
- data gathered from your activity
- data gathered by stakeholders and partners
- existing data sources (e.g. government data and wider media and lifestyle data).

Data gathered from your activity
Gather as much data as possible directly while your communication activity is running. Identify all the ways in which people can respond to it and ensure that data is being gathered for each one. Data might include web visits, telephone calls or face-to-face interactions. Also, look for ways to gather additional data from these responses, for example by asking for people's personal details (while remaining mindful of the requirements of the Data Protection Act 1998) or asking for permission to contact them again for follow-up research.

Useful tip: plan ahead
Always try to identify the data sources that you plan to use in your evaluation before the activity actually runs. This will give you time to ensure that the right data is being gathered in the right format while the activity is live.
Data gathered by stakeholders and partners
Look at what data or evidence could be collected by an agency, stakeholder, partner or colleague working with you to deliver the activity. This might include the number of responses to an event or helpline run by a partner; any feedback received by stakeholders; data from surveys, competitions or promotions; and website statistics.

Useful tip: getting data from agencies
When you're procuring an agency to work on planning, running or evaluating an activity for you, always ask at the procurement stage what data they can provide to feed into the evaluation. This can then be built into their contract. Ask for data well in advance and, where appropriate, agree to share the results of your evaluation.

Existing data sources
As well as fresh data gathered from current activity, you may find it useful to look at data sources that already exist to help you put your results in context and back them up with supporting evidence. For example:
- Media consumption surveys such as NRS (newspapers), BARB (TV) or comScore (digital media) provide information on the number of people reached by different media channels. These are particularly useful for input and output measures.
- Syndicated consumer lifestyle and media surveys such as TGI, TouchPoints or ACORN can be used to build up lifestyle and behavioural profiles for a range of audiences. These are particularly useful for tracking longer-term outcome measures and optimising future campaign planning.
- Existing government demographic and research data gathered centrally or by individual departments can be used for measuring longer-term outcomes. The Office for National Statistics publishes a number of surveys across all sectors of public interest.

For more information on how you might be able to use and access these and other surveys, talk to colleagues in your department's or hub's research and analytics teams, the Shared Communications Service or external media and research agencies.

Creating proxies and assumptions
If you can't get the exact data that you need for a performance metric, look at whether you can source a similar piece of evidence as an alternative. This is known as a proxy measure. For example, if an activity is asking people to check their smoke alarms more regularly, a good proxy would be the number of nine-volt batteries being sold (a type of battery almost exclusively used in smoke alarms).
Is your activity similar to other activity run in the past for which you've got accurate results? If so, you might create an assumption that this activity will perform in the same way. For example, if your activity encourages people to sign up for a stop-smoking service, you may not have the budget to carry out research to see how many people have given up smoking as a result of using the service. However, all things being equal, you could assume that cessation rates were the same as in previous research studies.

Using monitoring, market research and feedback
If gaps still remain in your evaluation plan once you have exhausted all the available data sources, consider filling them using bespoke monitoring, market research and feedback.

Monitoring
You can use monitoring to track how many people your communication is reaching. The main monitoring techniques and tools include:
- PR and media monitoring: If your activity is designed to generate media coverage, you could use an agency to track how many people it reached, how many times the key messages were mentioned and how favourable the coverage was. If you don't have the budget for this, consider whether you could monitor coverage yourself, either by subscribing to a PR-monitoring service such as Gorkana or by looking at what's being said across a representative selection of media channels.
- Web monitoring: The Government Digital Service (GDS) measures a range of standard digital metrics for GOV.UK and other government and partner sites. Topline data is available in the regular reports provided by GDS, and more detailed analysis will be available on request. If you are responsible for monitoring the performance of a stand-alone website, there is a range of free and paid-for tools that you can use, including Google Analytics.
- Social media monitoring: GDS will also provide guidance on social media monitoring. There are various tools that you can use to monitor performance yourself:
  - Where your activity is hosted on a social media site such as Facebook, LinkedIn or YouTube, you can set up standard user reports to monitor interactions with your content. Look both at how many people interact with you and at the quality of these interactions.
  - If you share information via Twitter, monitor how many people follow you and how many retweet your messages. TweetDeck enables you to analyse interactions more effectively.
  - Buzz-monitoring tools enable you to see whether people are commenting online about your message, and whether the comments are positive and from credible sources. Effective buzz-monitoring relies on you defining the terms that you want to monitor in advance; the more specific you can be, the more accurate the results. There is a wide range of free and paid-for buzz-monitoring tools; Google Alerts is one example of a free tool.
  - There is also a range of free and paid-for tools that can be used to monitor what terms people are searching for online. Google Trends provides a useful free snapshot.
Market research
If you have gaps in the data for out-take, intermediate outcome and outcome metrics, then paid-for market research is generally the most effective way of filling them, provided that budget is available. Market research for evaluation doesn't necessarily have to involve large-scale face-to-face quantitative surveys. Lower-cost research methodologies can be a useful source of insight. These include:
- Omnibus studies: An omnibus study is a quantitative survey of a representative sample of an audience (usually the general public). The questionnaire is made up of groups of questions placed by different clients, which means the overall costs are shared, making it a relatively cheap option. Omnibus surveys may not be appropriate if your activity is localised, your audience is niche, or you want to ask many and/or in-depth questions about a particular topic.
- Commissioned online surveys: Standard online surveys recruit respondents from large panels of people who have agreed to take part in research. They can be very cost-effective if your target audience is digitally engaged. However, check the quality of the panel and how it is managed in advance. Online panels are not always suitable for tracking long-term activities, as you may not wish to survey the same panel members repeatedly.
- Qualitative research: Qualitative research, including discussion groups or interviews, can be a useful alternative to quantitative research in some circumstances. It may be appropriate for small-scale activities, when audiences are hard to reach, or when activity is only running for a short period with no requirement to track its impact over time.

When commissioning paid-for research, you may be able to lower your costs by reducing the size or specification of your sample, shortening the length of your questionnaire or simplifying your reporting requirements. Research and evaluation specialists in your marketing hub or the Shared Communications Service (SCS) can provide advice and help with commissioning paid-for research or conducting your own online research.

Feedback
Feedback is informal comment and opinion that you gather yourself. It can be a valuable alternative to robust research when there are limitations in time, resource and budget. It may take the form of a few informal telephone or face-to-face interviews with the primary target audience, those involved in delivering the activity, or direct engagement with the audience; alternatively, feedback can be obtained through an online survey that you run yourself, via a blog, or by text. If possible, seek input from colleagues in research or insight roles within your department or SCS before gathering feedback yourself. They will be able to give you advice on questionnaire design, data protection and propriety issues. They will also be able to advise you on how to use online survey tools such as SurveyMonkey when seeking feedback.
Examples of feedback include:
- Press: speaking to a few journalists; monitoring the type of enquiries that you get when you run a story.
- Marketing: counting the number of people attending an event; informally asking stakeholders or partners for their observations on how an activity went; running a survey on your website or social media space; adding a question about your service to a call-centre script.
- Internal communication: informal interviews with frontline staff; feedback from training sessions.

Feedback is not scientific. It will give you anecdotal evidence rather than statistically robust measures. Use as many different sources as possible and ensure that your analysis reflects the limitations of using such informal methods (see Stage 4 for more detail).

Useful tip: data protection
If you're asking members of the public to provide you with personal information, this will need to be gathered and stored in line with government and industry standards. Check your department's information risk policy and talk to market research specialists in your marketing hub if you need more advice.

Reviewing any remaining gaps
After gathering the data, review your evaluation plan to identify any performance metrics that have no data source. If gaps still exist, consider how important it is to measure that particular performance metric or KPI. If it is not central to the evaluation, you may choose not to measure it at all, but point out this limitation in the final report. If a performance metric or KPI is crucial to your evaluation, consider asking for additional resource or budget to measure it, and make clear the implications of not obtaining it.

Agreeing who will collect data
Before the activity runs, agree who will gather the data for each source that you have identified, when and in what format. Collecting each piece of data in the same format, using consistent time periods and target audiences, will help with the analysis later on.
Completing the evaluation plan
You should now complete your evaluation plan by setting out the data sources that you will use for each of your performance metrics and by noting any gaps that still exist. Your completed plan will now include:
- objectives and target audiences
- performance metrics and KPIs, and the data sources that you will use to measure each one
- any limitations in your evaluation
- agreed budget and resource needed to gather the data (signed off by the budget holder if needed).

The three Cs: principles for good data collection
Whatever you're measuring, make the data that you collect as continuous, consistent and comparable as possible.
- Continuous: Most communication activity aims to get people to start, stop or continue a particular attitude or behaviour. To quantify its effect, try to measure people's attitudes or behaviour before, during and after the activity runs. Benchmarking before it runs is particularly important to demonstrate the effect of your activity (a simple before-and-after sketch follows this list). For ongoing activity, measure performance regularly enough to show the effect it is having on attitudes or behaviour. The more data points you can capture, the more obvious the trends will be.
- Consistent: Use consistent measures and methodology to assess activity that you repeat or run continuously over time. Using the same wording for questions, tracking against the same audience and collecting data in the same way every time will enable you to measure longer-term trends accurately.
- Comparable: As far as you can, try to use the same measures for every piece of activity that you run. Try to make your measures as similar as possible to the ones used by other communicators in your team and across government. This will make it easier to compare results in the future. The appendix gives guidance on the types of measures that you might want to use for your activity to help with standardisation.
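To illustrate the "continuous" principle, the sketch below compares a tracked attitude measure before, during and after an activity runs and reports the uplift against the pre-activity benchmark. It is a hypothetical example: the survey waves and percentages are invented for illustration.

```python
# Hypothetical example: tracking one attitude measure (% agreeing with a key
# message) before, during and after an activity, against a pre-activity benchmark.
tracking = {
    "pre (benchmark)": 22.0,   # % agreeing before the activity ran
    "during": 29.5,            # % agreeing while the activity was live
    "post": 34.0,              # % agreeing after the activity finished
}

benchmark = tracking["pre (benchmark)"]
for wave, score in tracking.items():
    uplift = score - benchmark
    print(f"{wave:16s}: {score:4.1f}%  (uplift vs benchmark: {uplift:+.1f} points)")
```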
Stage 4: Analyse and report performance against the plan

Checklist
Task: to assess the success of your activity by:
- Analysing effectiveness
- Demonstrating efficiency and value for money
Output: final evaluation report

Analysing effectiveness
Once your activity has run, you will need to measure how effectively it met its objectives. Begin by gathering data and evidence from all your sources and bringing it together in a centralised database or folder. Check that it looks correct before beginning the analysis. Also, check that the activity ran as planned and whether anything unexpected is likely to have affected its performance.

Useful tip: activity diary
You will find it useful to keep a diary while your activity is running, noting down any external or operational factors that might affect performance as they happen (e.g. bad weather affecting event attendance, negative news stories affecting public perceptions, or downtime on a website affecting visitor numbers). This will make subsequent analysis easier.

How to approach analysis

Did your activity work as you expected?
At Stage 2, you mapped out how you expected your activity to work. Use this as the basis for your analysis. Create some key hypotheses: results or outcomes that you might expect to see based on your objectives, activity map and past performance.

Check performance against your KPIs and targets
Have these been met? If so, what is driving success? If not, consider whether your targets were realistic and whether your KPIs are accurate measures of success. For example, if you have set KPIs for a new activity, you may need to consider revising these in the future.
Be objective
Analyse the full range of potential outcomes and results for your activity. Do not ignore results or trends that don't fit with your hypotheses or with the general patterns in the data. Instead, look for the reasons behind these. If something didn't work in the way that you expected or failed to meet its objectives, look at the reasons why. This will help you to amend future activity and so improve future performance.

Try to isolate any operational or external issues
Did any unforeseen operational issues affect your activity's performance? For example:
- Were there enough contact centre staff to respond to the calls which your activity generated?
- Did the website to which you were directing people stop working?
- Did external factors (e.g. negative PR, bad weather, the economic situation) have an impact on your activity's success?
An activity diary can be very useful in helping you to identify these factors.

Analyse data on a bottom-up basis
If you are analysing the effect of more than one activity or channel, analyse data on a bottom-up basis against your evaluation plan. First, analyse the effectiveness of each individual activity against its sub-objective. Then, look at the effect that each activity had on the wider communication objective, considering which one had the greatest effect.

Assess the effect of communication on the policy objective
Consider the effect that communication as a whole has had on the wider policy objective, bearing in mind the effect of other interventions designed to meet that objective. You may not have sufficient data to fully isolate the effect of communication on the policy objective, but try to draw conclusions based on the available evidence.

Review progress
When evaluating activity that will not achieve its communication objectives for some time, look for evidence of progress towards these based on the KPIs and performance metrics that you have identified.

Cross-check your conclusions
Where you have data from a range of different sources, check to see if the results from each one point towards the same conclusions. This is particularly important where you are relying on less robust data sources such as informal feedback. One set of results may not always give a conclusive answer, but several pieces of evidence all pointing in the same direction may allow you to be more confident in your conclusions.
Signpost gaps and limitations
Your evaluation plan will identify the gaps in available data and the implications of those gaps. Be aware of the limitations that this puts on the evaluation. Where possible, try to draw assumptions about missing results from the available data. For example, if you don't know how many people have stopped smoking as a result of your stop-smoking event, can you make an assumption about possible cessation rates based on the number of people who tell you that they're going to quit at the end of the event? If you can't make such assumptions, signpost the missing data within the evaluation report.

Get specialist support where necessary
Some evaluation will require more sophisticated analysis, for example econometric modelling techniques to predict future performance or to isolate the effect of different factors on performance. This type of analysis will generally be carried out by specialists, either within your department or externally. Only seek external support where you are unable to carry out the analysis within your marketing hub.

Demonstrating efficiency and value for money
When you evaluate performance, always ask yourself whether the results that the activity achieved justify the time, resource and money that were spent on it. Where you are able to quantify the actual effect that the activity had on the overall policy objective and put a financial value on this, you should be able to calculate the overall return on marketing investment. Otherwise, calculate the cost per result: the amount of time or money that was invested for each person who carried out a specified action. Compare your results with other activities that you or others have run to see which ones are most cost-effective and time-efficient, and use your learning to optimise future activity.

Examples of return on marketing investment and valid cost per result figures include:
- Return on marketing investment: Where you are able to demonstrate how many lives have been saved as a result of a campaign to reduce speeding on residential streets and calculate the financial value of each life saved, the payback is the total number of lives saved multiplied by the financial value of each life saved, minus the cost of running the campaign; the return on marketing investment expresses this payback relative to the cost of the campaign.
- Cost per result: Where you are not able to demonstrate how many lives have been saved as a result of the campaign, you may instead choose to calculate the cost per result, basing this on an action that people took after seeing the activity. This might include the cost per web visit (total campaign cost divided by total web visits) if the activity sends people to a website for more information, or the cost per event attendee (total campaign cost divided by total event attendees) if the activity involves running events on road safety.

A worked example of these calculations is sketched below. Further guidance on demonstrating financial return is available in a separate GCN publication.[3]

[3] Evaluating the financial impact of public sector marketing communication: An Introduction to Payback, Return on Marketing Investment (ROMI) and Cost Per Result.
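The sketch below works through these value-for-money calculations with hypothetical figures; "value per result" stands in for whatever financial value your department attaches to the outcome, and the ROMI convention shown (payback divided by cost) is one common approach rather than a prescribed formula.

```python
# Hypothetical worked example of the value-for-money calculations described above.
campaign_cost = 250_000.0     # total cost of the activity (media, production, staff time)
results_achieved = 40         # outcomes attributable to the campaign
value_per_result = 10_000.0   # financial value attached to each outcome

total_value = results_achieved * value_per_result
payback = total_value - campaign_cost    # net financial return
romi = payback / campaign_cost           # return per pound spent (one common convention)
print(f"Payback: £{payback:,.0f}   ROMI: {romi:.2f} (£{romi:.2f} returned per £1 spent)")

# Cost per result: used when the outcome cannot be given a financial value.
web_visits = 120_000
event_attendees = 3_500
print(f"Cost per web visit:      £{campaign_cost / web_visits:,.2f}")
print(f"Cost per event attendee: £{campaign_cost / event_attendees:,.2f}")
```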
The evaluation report
You will have agreed the format for your report at Stage 1. Check that this is still correct. Whatever format your report is in, always try to follow these principles:
- be objective and include all results, positive and negative
- always provide clear conclusions and recommendations: what should be done as a result of this report, and by whom?
- separate fact from opinion and recommendations, so that others can review the evidence on which you have based your conclusions
- acknowledge gaps in the data and their implications
- state any assumptions that you have made and what you based these on
- include references for all data sources used in your report
- include appropriate graphs and images to aid interpretation.

A template for your evaluation report is included here. If you are submitting an evaluation report to ERG, you must use the standard ERG template.
Conclusions

Adopting the principles, processes and practical examples set out in this guide will enable you to design and implement evaluation plans for the full range of government communication activity to the standards required by ERG and the Communication Delivery Board. By following the recommended approach, you can also be confident that your work will meet the revised GCN competencies for government communicators.

Adopting good evaluation standards will also enable you both to demonstrate the contribution that your communication activity makes towards achieving overall policy objectives and to make a well-informed business case for investing further resource and budget in your work. Sharing your learning with other government communicators in your team, department or arm's-length body and hub, and across GCN more widely, will enable others to benefit from your knowledge and insight as well.

Finally, remember that it can take time to fully implement the approach set out in this guide and to gather the data needed to understand what is driving success. Begin gradually, recognising that it is always better to produce a partial evaluation than nothing at all, provided that you are clear about the limitations of your end report.

This guide and the examples that it contains reflect knowledge and experience drawn from a wide range of colleagues from across the government communication network. Further comment, feedback and examples are always welcome. Please contact [email protected].

Helpful contacts and resources
If you want more help or support in applying the content in this guide, talk to your hub lead or to the evaluation specialists within your hub. The GCN website includes their contact details. There are also a number of teams within the Government Communication Centre who may be able to help:
- The Campaigns and Strategy Team works with the communication hubs to define good-practice evaluation standards across government and to ensure that these are reflected in all annual communication plans and ERG submissions.
- The Evaluation Team in the Shared Communications Service can provide further advice on how to apply these evaluation standards to specific projects; they can offer practical support in planning and conducting your evaluation and advice on the procurement of external agencies, and should be your first point of contact for general evaluation queries.
- The Government Procurement Service will provide access to specialist external agencies, if required.
- The Government Communication Network runs training courses on evaluation, and the GCN website contains a wealth of information and access to groups and individuals to help you with your evaluation. Go to the GCN website for more information.
Appendix: Recommended metrics

Introduction
This appendix gives examples of the types of performance metric that you should consider including in your evaluation plan. They are split by discipline:
1. Press
2. Marketing
3. Internal communication

Within each discipline, performance metrics are further split by channel and objective type. Choose the set that is most relevant to your activity, adapt it as appropriate and include it in your evaluation plan. If you are using more than one channel, you will need to pick separate sets for each channel that your activity uses.

If your activity reaches the end audience via an intermediary (e.g. the media or a partner), use separate sets of metrics to measure:
- how effectively your activity engaged the intermediary
- how effectively your intermediary engaged the end audience on your behalf.

Templates for activities with and without an intermediary audience are available on the GCN website.

Useful tip: adapting performance metrics to your activity
Ensure that you are clear on your activity's objectives, the channels that you are using and the key messages that you are promoting. This will enable you to adapt the example performance metrics to fit your activity.
1. Press activity

Press activity includes proactive publicity (activity which proactively promotes a policy or your organisation), reactive media handling (activity put in place to respond to a specific issue or event, or to rebut inaccurate coverage of that issue or event) and media briefing and handling sessions that you organise to support ministers and senior policy officials in their work.

All types of press activity have an intermediary audience (e.g. the media or another spokesperson, such as a minister or senior official) and an end audience (generally members of the public). Choose the performance metrics that are most relevant for your activity from the list below.

Useful tip: evaluating intermediary and end audiences
The final outcome for your intermediary audience should always form the input for your end audience. Use the evaluation plan template to help you.

Inputs for intermediary audience
The number and nature of press or media activities that you carry out. This might include:
- Proactive publicity: the number of PR activities, media briefings and packages issued to the media.
- Reactive media handling: the number of corrections, reactive statements and rebuttals issued; the number of interviews arranged.
- Media briefing and handling: the number of media briefings and media handling sessions that you organise for ministers and officials.
- Any costs incurred in running the activity, and the time and internal resources used.

Outputs for intermediary audience
- Proactive publicity: the number of media contacts that you reach with your activity, the number of times that you contact them and the messages that you pass on.
- Reactive media handling: the number of media contacts that you reach with your corrections, reactive statements and rebuttals; the number of interviews that take place.
- Media briefing and handling: the number of briefings and training sessions that you organise, and the information and skills that you pass on.
Out-takes for intermediary audience
- Proactive publicity: the attitude of the media overall (and key media contacts where applicable) towards the message that you are promoting or towards your organisation more generally.
- Reactive media handling: the attitude of the media towards the issue that you are working on and your handling of it. Have you changed their knowledge or attitude?
- Media briefing and handling: what do ministers and senior officials think about the briefing and training that you provide? Do they find it useful? Do they intend to put it into practice?

Intermediate outcomes for intermediary audience
- Proactive publicity: the number of media contacts that you reach with your activity, the number of times that you contact them and the messages that you pass on.
- Reactive media handling: the number of media contacts that you reach with your corrections, reactive statements and rebuttals; the number of interviews that take place.
- Media briefing and handling: the knowledge and skills that ministers and officials gain as a result of your briefing or training sessions.

Final outcomes for intermediary audience / inputs for end audience (these performance metrics are the same)
The volume and quality of media coverage achieved by your activity. This might include:
- Proactive publicity: number of pieces of coverage achieved, accuracy of coverage, favourability of coverage, key message penetration, quotes and interviews used.
- Reactive media handling: number of pieces of coverage or interviews containing your responses, amendments or corrections; overall accuracy of coverage; favourability of coverage; the amount of negative coverage that has been prevented as a result of your work.
- Media briefing and handling: the number of times that your contacts use the knowledge and skills that you have passed on in their contact with the media; the volume of coverage achieved as a result of these contacts; accuracy and favourability of coverage; key message penetration.

Useful tip: measuring a reduction in negative coverage
It is often far harder to measure the negative coverage that has been avoided as a result of the corrections and rebuttals that you issue. By keeping a record of the contacts that you have with the media for each issue that you work on, over time it becomes easier to demonstrate how your interventions are affecting coverage in the longer term.
Outputs for end audience
The number or percentage of your end audience reached by the media activity, and how often they saw it.

Out-takes for end audience
Use the performance metrics included in section 2.1 (purchased marketing activity) to measure what your end audience recall, think or feel about the media activity.

Intermediate outcomes for end audience
Depending on the nature of the messages that you promote, you may expect your end audience to take some form of action as a result of seeing the media coverage. This might include:
- seeking information from you
- seeking information from other sources
- registering for a service, product or information
- starting, stopping or continuing a particular behaviour.
Use the performance metrics included in section 2.1 (purchased marketing activity) to measure each of these actions where relevant.

Final outcomes for end audience
Final outcome measures should assess whether your activity met its overall communication objective, and the effect it has had on the overall policy or reputation objective that you are working towards. Choose performance metrics that enable you to measure this.
2. Marketing activity

Choose the performance metrics that are most relevant from the following list, based on the type of activity that you are evaluating:
2.1 Purchased media
2.2 Websites, social media and other digital spaces
2.3 Public and stakeholder engagement
2.4 Partnership activity (including public advocacy)

2.1 Purchased media (TV, radio, press, outdoor, digital advertising, paid-for search, direct marketing)

Useful tip: evaluating multi-channel activity
If your activity includes more than one paid-for channel, remember to include performance metrics to measure the effectiveness of each one.

Inputs
The number of people you plan to reach with your activity and the planned frequency of exposure. This might include:
- estimated reach, coverage and frequency (TV, press, radio, outdoor)
- the number of impressions served (digital advertising)
- the number of planned clicks (paid-for search)
- the number of inserts produced and the number of leaflets planned to be distributed (direct marketing)
- the costs (media and production) incurred, by channel.

Outputs
The number of people actually reached by your activity and the number of times they were exposed to your message.

Useful tip: sourcing data
If an agency is booking media or organising the distribution of direct marketing materials, it will be able to provide input and output data for you.
Out-takes
Think about the key messages that you want your activity to get across to your end audience: what will they be thinking or feeling if it has been successful? Adapt the following performance metrics to enable you to measure their reaction, choosing those which are most relevant for your activity.

Useful tip: identifying key messages
Use your activity map to help you to identify what your key messages are and how you expect your target audience(s) to react to them.

Have they seen it (RECALL)?
- Are the end audience aware of the key/supporting message that you are promoting?
- Have they seen any advertising, communication or publicity about the key/supporting message?
- Can they describe this advertising, communication or publicity?
- Where have they seen the advertising, communication or publicity?
- Do they recognise your activity when it is shown to them (this might include advertising, leaflets, letters, event invitations etc produced by you, or press coverage and partnership activity delivered via an intermediary)?

Do they understand the activity's key messages (THINK)?
- Did the end audience like the activity? Did they think it was relevant to them?
- Was it clear and engaging? Did it capture their attention?
- Did they understand the key/supporting message that the activity was promoting? Has it impacted on their views?

Do they intend to do anything as a result (FEEL)?
- What would you want the end audience to feel or think if the activity was successful? On this basis, what effect did the activity have on your end/intermediary audience's attitudes?
- Do the end/intermediary audience feel more positive about your department and its work as a result of the activity? Are they willing to support you in your work or to advocate your department more widely?
- Do the end/intermediary audience intend to take the action specified in the key/supporting message in the coming weeks or months (depending on the time-frame for your activity)?
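Out-take questions such as those above are usually answered through survey research. The sketch below is an illustrative tabulation only: the question keys and responses are hypothetical, and in practice the data would come from your tracking or post-campaign survey.

```python
# Tabulating out-take survey answers into headline percentages.
# Hypothetical respondents answering three yes/no out-take questions.
responses = [
    {"recalled_activity": True,  "understood_message": True,  "intends_to_act": True},
    {"recalled_activity": True,  "understood_message": False, "intends_to_act": False},
    {"recalled_activity": False, "understood_message": False, "intends_to_act": False},
    {"recalled_activity": True,  "understood_message": True,  "intends_to_act": False},
]

def percentage(metric: str) -> float:
    """Share of respondents answering 'yes' to a given out-take question."""
    return 100 * sum(r[metric] for r in responses) / len(responses)

for metric in ("recalled_activity", "understood_message", "intends_to_act"):
    print(f"{metric}: {percentage(metric):.0f}%")
```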
Intermediate outcomes
Intermediate outcome metrics measure the actions that people have taken as a result of your activity. These will vary depending on your communication activity, but might include:
a) Seeking information from you
b) Seeking information from other sources
c) Registering for a service, product or information
d) Starting, stopping or continuing a particular behaviour
e) Interacting with you or with others online or offline.
Suggested performance metrics for each of these intermediate outcomes are given below; choose the one(s) most appropriate for your activity and adapt as appropriate.

a) Seeking information from you
Sometimes you will ask people to contact you directly for more information. Always try to include a range of metrics that cover who contacted you, how they contacted you and the depth of contact. Even if you don't directly ask people to contact you, it is always worth checking whether you have had an uplift in contact as a result of your activity. Suggested performance metrics for the different contact channels include:

Call centre:
- The number of people calling
- What drove them to call (your activity, activity carried out by intermediaries such as media or stakeholders, other factors)
- The reason for their call
- How many callers you passed relevant information on to

Digital activity (advertising):
- The number of people clicking on digital advertising or text links (split by creative and site)
- The number visiting your website or social media space as a result of advertising

Website or social media space (NB: data for these metrics will be available from the Government Digital Service (GDS)):
- Source of visits (how people came to the site)
- Average length of time spent on the site (and the number spending more than 30 seconds on the site)
- The number of pages visited
- The number watching any videos or embedded content on your site (starting to watch and watching the whole clip)
- The number seeking relevant information
- Bounce rate (the percentage only visiting one page)
Events/face-to-face:
- The number attending events as a result of activity you ran to promote them
- The number visiting you or your staff in person at other times
- What drove them to visit (your activity, other factors)
- How many you passed relevant information on to

Correspondence:
- Number of emails received (positive/negative)
- Number of letters received (positive/negative).

b) Seeking information from other sources
If your activity is being delivered by stakeholders or partners, ask them whether they can gather data on how people seek information, using the performance metrics listed under (a) above. Other sources that you might want to include are:
- People searching for information on your activity online
- People who claim to have asked friends, family or other organisations for information or advice following your activity.

c) Registering for a service, product or information
Your activity might ask people to register for a service, more information or a product, or to attend an appointment with you or a stakeholder. If so, you might want to include measures that look at:
- The number of people seeking information on the service, product or information featured in your activity (see previous section)
- The number of people requesting a product featured in your activity (and where they heard about this product)
- The number who actually receive this product
- The number who go on to use this product
- The number of people signing up to receive information in the future
- The number of people starting the registration process for a service or product (and where they heard about it)
- The number of people completing the registration process
- The people using the service that they have registered for and how they use it
- The number of people making an appointment with you or a stakeholder
- The number of people attending this appointment.

Useful tip: choosing the right performance metrics for your activity
The performance metrics that you choose here will vary depending on the nature of your activity. For example, if you're running an ongoing programme to support people as they give up smoking, you might record how many people attend an initial meeting or read the information that you have sent to them, and then record how many attend subsequent meetings, request further information and so on. Creating an activity map can be a helpful way of identifying how you would expect people to use the service, and can help you to choose the right performance metrics.

d) Starting, stopping or continuing a particular behaviour
Your activity may ask people to start, stop or continue a particular behaviour. If so, you might want to include performance metrics that look at:
- The number of people undertaking that behaviour before the activity ran
- The number of people who take the relevant steps to change their behaviour as a result of your activity
- The number of people who continue with the changed behaviour over time.

e) Interacting with you or with others online or offline
You may want people to pass your message on or to discuss it with others. If you're using social media, you may want them to join an online community or contribute to content on your website. You may find the following metrics useful, depending on what media you're using:
- The number of people who pass your message on or discuss your activity, or the message that it is promoting, with others
- The number of people who claim to have discussed your message with others (online and offline)
- The number passing on or sharing your content digitally (e.g. tweets, retweets, sharing a link on Facebook, via Digg, Delicious, StumbleUpon, Reddit etc)
- Amount of talk on digital forums, blogs, websites etc (tracked via buzz monitoring)
- Amount of this talk that is positive (and accurate)
- The number of people who interact with you online. Depending on the digital space that you use, this might include:
  - The number of Facebook fans or friends, and the number liking your content
  - The number of Facebook fans or friends who are active (i.e. return to the site regularly)
  - The number of people posting content on your site
  - The number of people commenting on your content (e.g. on your own site, Facebook, YouTube).

Useful tip: passing the message on
If your activity asks someone to pass a message on on your behalf, also include metrics that enable you to measure the effect of this message on those to whom they speak.

Final outcomes
Final outcome measures should assess whether your activity met its overall communication objective and what effect it has had on the overall policy objective that you are working towards. Choose performance metrics that enable you to measure the activity's impact on both.

2.2 Websites, social media and other digital spaces
GDS will produce standard reporting dashboards for gov.uk. These include standard performance metrics that measure outputs, out-takes and intermediate outcomes. Bespoke reports for gov.uk and other government websites and social media spaces will also be available. Further information on how to get this data will be available soon via the GCN website. Always ensure that you integrate these performance metrics with wider measures that look at the effect of your website or social media activity. These might include:

Inputs
- The website, social media or other digital space and related content created to meet communication objectives
- The cost and resource used to create this website, social media space or content.

Outputs (available via GDS reports)
GDS can provide performance metrics that you can use to look at how many people were exposed to your digital content. These will include:
- The number of the end audience visiting the site or digital space at least once during the evaluation period (unique users)
- The source of visitor referrals.
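The subsections that follow list standard digital metrics such as bounce rate and average time on site. As an illustration of how such figures are derived, the sketch below works through a hypothetical page-view log; it is illustrative only and is not the GDS reporting itself, which supplies these metrics ready-made.

```python
# Deriving unique users, bounce rate and average time on site from a
# hypothetical page-view log (visitor, session, seconds on page).
from collections import defaultdict

page_views = [
    ("v1", "s1", 45), ("v1", "s1", 30),   # two pages in one session
    ("v2", "s2", 12),                     # single-page session (a bounce)
    ("v3", "s3", 80), ("v3", "s3", 20), ("v3", "s3", 15),
]

sessions = defaultdict(list)
for visitor, session, seconds in page_views:
    sessions[session].append(seconds)

unique_users = len({visitor for visitor, _, _ in page_views})
bounces = sum(1 for views in sessions.values() if len(views) == 1)
bounce_rate = 100 * bounces / len(sessions)
avg_time_on_site = sum(sum(views) for views in sessions.values()) / len(sessions)

print(f"Unique users: {unique_users}")
print(f"Bounce rate: {bounce_rate:.0f}%")
print(f"Average time on site per session: {avg_time_on_site:.0f} seconds")
```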
Out-takes (available via GDS reports)
What people think, feel or recall about the webspace. Section 2.1 gives more detail on the types of performance metric that you might want to include. You may also be able to infer attitudes and understanding from comments that people leave on your website or social media space.

Intermediate outcomes (available via GDS reports)
GDS can provide a number of standard metrics that measure how people interact with your site or digital space. These will include:
- Average length of time spent on site
- Bounce rate
- The number watching any videos or embedded content on your site (starting to watch and watching the whole clip)
- The number of pages visited.
Also consider including metrics that look at actions carried out on social media websites, such as:
- Number of likes on Facebook
- Number of retweets on Twitter.
Always ensure that you tie these standard metrics back to your website or digital space's objectives and consider what you are trying to get people to do (register, read a particular piece of content, retweet content etc). This will enable you to interpret the data appropriately.

Final outcomes
Final outcome measures should assess whether your website or digital space met its overall communication objective and what effect it has had on the overall policy objective that you are working towards. Choose performance metrics that enable you to measure the activity's impact on both.

2.3 Public and stakeholder engagement
Public or stakeholder engagement activity is put in place to secure feedback on, or support for, a specific policy area. Depending on the activity that you are running, you may use a range of channels including events, digital and purchased channels.

Inputs
The activity that you carry out to promote your initiative. Depending on the nature of your engagement activity, this might include:
- The number of invitations to events that you send out
- The number of requests to provide feedback that you issue
- Activity that you carry out to publicise your engagement activity
- The website that you create to gather feedback
- The number and nature of contacts that you make with specific stakeholders.
Outputs
Depending on the nature of your engagement activity, these might include:
- The number of people that you reach via your publicity, invitations and requests
- The number of people attending the event that you organised
- The number of people who visit your website or digital space (see section 2.2 for more detail on digital metrics)
- The number of conversations that you have with a specific stakeholder as a result of the contacts that you make.

Out-takes
Think about what you want the audience that you are engaging to recall, think or feel as a result of your engagement activity. Often there will be a specific policy that you want them to advocate or support. Include relevant out-take measures to assess levels of support for your message and your department. Use the performance metrics included in section 2.1 (purchased marketing activity) to help to measure the depth of engagement and attitudes towards your policy area.

Intermediate outcomes
Intermediate outcome metrics measure the actions people have taken as a result of your activity. These will vary depending on your communication activity, but typical intermediate outcomes for stakeholder and public engagement include:
a) Providing usable feedback
b) Attending an event
c) Passing your message on to others.
Suggested performance metrics for each of these intermediate outcomes are given below; choose the one(s) most appropriate for your activity and adapt as appropriate.

a) Providing usable feedback
You may be running activity to increase engagement among specific audiences, possibly to seek help in developing policy, to ask for feedback or to consult on a particular policy area. If so, you might include measures that look at:
- The number of people who respond to your engagement activity in any way. This might include comments or feedback received on your website, in writing, by phone or face-to-face
- The quality of feedback and comments received
- The number of people attending public meetings or hearings
- The number of people agreeing to help to develop a specific policy area (generally in relation to stakeholder engagement)
- The amount of usable feedback incorporated into future policy development or engagement activity.
b) Attending an event
You may invite stakeholders or members of the public to attend events so that you can share information or increase engagement. Depending on what your event is designed to do, you might want to think about including performance metrics that look at:
- The number of event attendees who interact with you or your staff during the event
- The number of event attendees who undertake specific actions or behaviours in line with messages promoted at the event
- The number who thought the event was good and useful
- The number who understand the messages that you were promoting.

c) Passing your message on to others
You may want people to pass on specific messages from your engagement activity. You may find the following metrics useful, depending on what media you're using:
- The number of stakeholders or members of the public who pass your message on and discuss your activity, or the message that it is promoting, with others
- The number of stakeholders or members of the public who claim to have discussed your message with others (online and offline)
- The number passing on or sharing your content digitally (e.g. tweets, retweets, sharing a link on Facebook, via Digg, Delicious, StumbleUpon, Reddit etc)
- Amount of digital talk on forums, blogs, websites etc (tracked via buzz monitoring)
- Amount of positive (and accurate) digital talk.

Useful tip: passing the message on
If your activity asks someone to pass a message on on your behalf, also include metrics that enable you to measure the effect of this message on those to whom they speak.
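One way to bring the input, output and intermediate outcome counts for an engagement exercise together is as a simple funnel, so that drop-off at each stage is visible. The sketch below is illustrative only; every figure and stage label is hypothetical.

```python
# A hypothetical engagement funnel for a consultation event, linking inputs,
# outputs and intermediate outcomes and showing conversion between stages.
funnel = [
    ("Invitations sent (input)", 500),
    ("Invitations reaching the audience (output)", 480),
    ("People attending the event (intermediate outcome)", 130),
    ("Attendees providing feedback", 60),
    ("Pieces of feedback usable in policy development", 25),
]

previous = None
for stage, count in funnel:
    if previous:
        rate = 100 * count / previous
        print(f"{stage}: {count} ({rate:.0f}% of previous stage)")
    else:
        print(f"{stage}: {count}")
    previous = count
```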
2.4 Partnership activity (including public advocacy)
In partnership activity, a partner is used as an intermediary to advocate your work, pass on a message or deliver a service to an end audience on your behalf, typically in the areas of policy delivery and reputation management. Partners may include commercial, not-for-profit and government bodies as well as individual stakeholders.

Useful tip: evaluating partnership activity
All partnership activity will have an intermediary audience: the partner that you are engaging. Always think about who your end audience is and what you are asking the partner to do to reach them on your behalf.

Inputs for intermediary audience
Details of your overall partnership engagement strategy and the supporting activity that you carry out to engage stakeholders or partners so that they will pass on your message on your behalf. This might include:
- The number of partners that you contact by phone, email or face-to-face
- The number of partners that you invite to events, training etc, and the number of events and training sessions that are held
- The number of times that you contact each partner (particularly when building relationships, and hence advocacy, over time)
- The number and nature of key messages that you pass on to each partner (particularly when building relationships/advocacy)
- Any costs incurred in contacting partners, and the time and internal resources used.

Outputs for intermediary audience
The number of partners who are exposed to the activities that you run and the messages that you promote. This might include:
- The number of partners successfully reached by your activity (the number with whom you are able to share your messages)
- The number of partners attending events or training that you have organised
- The quality and range of messages shared with each partner over time (particularly when building advocacy).

Out-takes for intermediary audience
Think about what you want partners to think or feel as a result of the contact that you make. This might include:
- The number of partners who support your activity
- The number of partners who are interested in finding out more about your initiative or in working with you.
Intermediate outcomes for intermediary audience
Intermediate outcomes will depend on the nature of contact that you make with partners and the response that you are seeking from them. They might include:
- The number of partners contacting you for more information
- The number of partners discussing the activity with you
- The number of partners expressing an interest in working with you
- The number of partners agreeing to undertake a specific activity or pass on a specific message for you.

Final outcomes for intermediary audience/inputs for end audience
The number and/or nature of messages passed on, or activities carried out to promote messages on your behalf, by partners. This might include:
- The number of partners advocating the work of your department or team, including:
  - The number of partners passing on specific messages about your department and its work (through conversations with others, interviews, media coverage, digital interaction etc)
  - The number of partners offering more general support for your department
  - The number of messages passed on by each individual partner, the content of each message and the favourability of each message
  - The number passing on or sharing your content digitally (e.g. tweets, retweets, sharing a link on Facebook, via Digg, Delicious, StumbleUpon, Reddit etc)
- The number of partners carrying out activities to promote your message, and the nature of these activities. This might include sponsorship or activities carried out by commercial or non-commercial partners.

Outputs for end audience
The number of the end audience reached by partner activity, promotion or advocacy, and the number and type of messages that they hear.

Useful tip: gathering output data
Always ask partners to provide output data wherever possible. Depending on the nature of the partner and the activity that they are running, this might include visits to their website, footfall data, media reach data etc.

Out-takes for end audience
Think about what you want the end audience whom the partner is engaging on your behalf to recall, think or feel as a result of the partner's activity. Use the performance metrics included in section 2.1 to help to measure their awareness, understanding and attitudes towards the partner activity.
Intermediate outcomes for end audience
Depending on the nature of the activity that you are carrying out, you may expect your end audience to take some form of action as a result of seeing the partnership activity. This might include:
- Seeking information from you
- Seeking information from other sources
- Registering for a service, product or information
- Starting, stopping or continuing a particular behaviour.
Use the performance metrics included in section 2.1 to measure each of these actions where relevant.

Final outcomes for end audience
Final outcome measures should assess whether your activity met its overall communication objective and what effect it has had on the overall policy or reputation objective that you are working towards. Choose performance metrics that enable you to measure this.
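Pulling the partner metrics above into a single summary need not be complicated. The sketch below is an illustrative example only: the partner names, message counts, favourability ratings and reach figures are hypothetical, and in practice would come from the records and reach data that partners share with you.

```python
# A hypothetical summary of partner advocacy activity: messages passed on,
# how favourable they were, and the end audience reached via partners.
partner_messages = [
    {"partner": "Charity A", "messages_passed_on": 6, "favourable": 5, "audience_reached": 40_000},
    {"partner": "Retailer B", "messages_passed_on": 3, "favourable": 3, "audience_reached": 120_000},
    {"partner": "Local authority C", "messages_passed_on": 2, "favourable": 1, "audience_reached": 8_000},
]

total_messages = sum(p["messages_passed_on"] for p in partner_messages)
favourable_share = 100 * sum(p["favourable"] for p in partner_messages) / total_messages
total_reach = sum(p["audience_reached"] for p in partner_messages)

print(f"Active partners: {len(partner_messages)}")
print(f"Messages passed on: {total_messages} ({favourable_share:.0f}% favourable)")
print(f"End audience reached via partners: {total_reach:,}")
```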
3. Internal communications
Performance metrics for internal communication are split into two categories:
3.1 Direct engagement (where you are asking your employees to take a specific action)
3.2 Indirect engagement (where you are asking employees or senior managers to pass a message on to others).
Choose the set that is most relevant for your activity.

3.1 Direct engagement
In direct engagement campaigns, you communicate with your organisation's employees (the end audience) to change their attitudes or behaviour. Typically, you will be aiming to increase employee engagement or to implement a change management programme.

Inputs
The number of internal communication activities that you plan and carry out. These might include:
- Invitations to events or training sessions
- Emails or other written communication
- Content posted on the intranet
- Informal events (without prior invitation)
- Costs and time spent on organising activities.

Outputs
The number of employees who receive invitations to events or training sessions, emails or other written communication, the number of employees seeing content on the intranet, and the number attending events and training.

Out-takes
Think about what you want employees to recall, think or feel as a result of your activity. Typically, you might want to raise their understanding of and engagement with a particular initiative, increase their support for organisational change or increase their overall engagement with the organisation. Use the performance metrics included in section 2.1 to help to measure their awareness, understanding and attitudes towards your activity.

Useful tip: measuring employee engagement
The Civil Service People Survey provides a useful annual benchmark of general attitudes and engagement within departments and across the Civil Service.
Intermediate outcomes
The actions that you want your employees to carry out will vary depending on your end objective. Think about including measures that enable you to assess how many employees:
a) Seek information from you
b) Start, stop or continue a particular behaviour
c) Attend an event or training
d) Engage with the department's work or a specific initiative.

Use the performance metrics included in section 2.1 (purchased marketing activity) to measure (a) and (b): those seeking further information or starting, stopping or continuing a particular behaviour.

For (c), attending an event or training, consider including the following metrics:
- The number of event attendees who interact with you or your staff during the event
- The number of event attendees who undertake specific actions or behaviours in line with messages promoted at the event
- The number of attendees putting the skills or information that you shared or trained them in into practice
- The number of attendees saying that they have put the skills or information into practice.
You should also add out-take measures to see whether people thought the event or training was good and useful, and to ensure that they understood the messages that you were promoting (see out-takes above).

For (d), engaging with the department's work or a specific initiative, consider including the following metrics:
- The number of people who respond to your engagement activity in any way. This might include comments or feedback received on your website, in writing, by phone or face-to-face
- The quality of feedback and comments received
- Overall staff or stakeholder engagement levels (including indicators such as retention, absence due to illness, performance appraisals and claimed motivation).

Final outcomes
Final outcome measures should assess whether your activity met its overall communication objective and what effect it has had on the overall policy or reputation objective that you are working towards. Choose performance metrics that enable you to measure this.
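For intermediate outcome (b), the simplest comparison is between a baseline and a follow-up measure of the behaviour. The sketch below illustrates the arithmetic with hypothetical survey figures; on its own it does not prove that the change was caused by your activity, so read it alongside the other factors noted in section 2.1.

```python
# Comparing hypothetical baseline and follow-up survey results to express a
# change in behaviour as a percentage-point uplift.
baseline = {"respondents": 400, "doing_behaviour": 96}    # before the activity
follow_up = {"respondents": 380, "doing_behaviour": 133}  # after the activity

before = 100 * baseline["doing_behaviour"] / baseline["respondents"]
after = 100 * follow_up["doing_behaviour"] / follow_up["respondents"]

print(f"Before: {before:.0f}% of respondents")
print(f"After: {after:.0f}% of respondents")
print(f"Uplift: {after - before:.0f} percentage points")
```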
3.2 Indirect engagement
Indirect engagement includes providing support for senior managers so that they are better able to communicate effectively within the organisation, or training and engaging employees so that they are better able to advocate the department and its values to the wider public. Here you might want to include performance metrics that enable you to measure how effectively you engaged these groups as an intermediary, and then measures to see what effect your work had on the wider organisation or public (the end audience).

Inputs for intermediary audience
The number of internal communication activities that you plan and carry out to increase employee advocacy or support senior managers. These might include:
- Training, support and advice that you offer to senior managers
- Events, training and publications that you produce to increase employee advocacy
- Costs and time spent on organising activities.

Outputs for intermediary audience
The number of managers or employees who receive training, support or advice, and the number and range of messages and skills shared. This might include:
- The amount of support and training received by senior managers, and the specific skills and knowledge shared
- The number of employees attending events or training, or seeing publications and content shared via these media.

Out-takes for intermediary audience
Include measures that enable you to look at what managers and employees think about the events and training that you organised and the support that you offered, and their understanding of and attitude towards the content and key messages that you promoted. These might include:
- Those rating the event or training as good or useful
- Those who are aware of specific actions that you want them to take or specific messages that you want them to understand and pass on
- Those who support these messages
- Those who are more confident or more able to carry out their role as a result of your support or training.
Section 2.1 gives more general guidance on setting out-take performance metrics.
Intermediate outcomes for intermediary audience
The actions that you want senior managers or employees to carry out will vary depending on your end objective. Think about including measures that enable you to assess how many managers or employees:
- Use skills learned in training, or claim to have done so
- Pass on a specific message
- Advocate the work of the department.

Final outcomes for intermediary audience/inputs for end audience
The amount and nature of contact that managers or employees have with the end audience as a result of your activity. This might include:
- Events, publications and initiatives where senior managers interact with the organisation with your support
- The advice and messages passed on via these interactions
- The number of employees advocating the department and its work, or passing specific messages on to the general public, as a result of your engagement work.

Outputs for end audience
- The number of employees reached by manager or employee interactions, the number of messages that they hear, and the nature and content of those messages
- The number of members of the public reached by employee interaction, the number of messages that they hear, and the nature and content of those messages.

Out-takes for end audience
Think about what you want employees or members of the public to recall, think or feel as a result of manager or employee interaction:
- Where a manager is interacting with the organisation, you might want their action to raise employee understanding of and engagement with a particular initiative, or to increase support for organisational change
- Where an employee is interacting with a member of the public, you might want their interaction to increase that person's understanding of the message that the employee is passing on, or to improve their overall attitude towards the department and its work.
Use the performance metrics included in section 2.1 to help to measure their awareness, understanding and attitudes towards your activity.
Intermediate outcomes for end audience
Think about what you want employees or members of the public to do as a result of the activity. This might include:
a) Seeking information from you
b) Starting, stopping or continuing a particular behaviour
c) Engaging with the department's work or a specific initiative.
For more specific performance metrics for (a) and (b), go to section 2.1. For more specific performance metrics for (c), consider what intermediate outcomes greater engagement would lead to. These might include:
- Overall staff engagement levels (including indicators such as retention, absence due to illness, performance appraisals and claimed motivation)
- Positive feedback received from staff or members of the public.

Final outcomes
Final outcome measures should assess whether your activity met its overall communication objective and what effect it has had on the overall policy or reputation objective that you are working towards. Choose performance metrics that enable you to measure this.
Acknowledgements
This guide was put together by Catherine Hunt from the GCC Campaigns and Strategy Team, with input and advice from colleagues across GCN. Particular thanks go to Tina Trythall, Ian Theo and Jesse Henry, and to Richard Slade, Tracy Logan, Stef Hrycyszyn, Matthew Taylor, Samantha Redmond, Darren Belnikoff and Tamsin Berry for their help with earlier drafts and guidance.
Cabinet Office
22 Whitehall
London SW1A 2WH

Publication date: December 2012
© Crown copyright 2012