
White paper

Best practice for customer satisfaction surveying in contact centre operations
The customer has moved on, so must we

Bright UK Ltd, April 2013
Richmond Bridge House, 419 Richmond Road, Richmond TW1 2EX
Tel: +44 (0) 7986 767 388, Fax: +44 (0)208 296 1943
www.brightindex.co.uk | info@brightindex.co.uk

CONTENTS

1. Introduction
2. Sample sizes
3. Survey fatigue
4. Segmentation and questions
5. Take-up rates and cherry picking
6. Return on investment
7. Driving change in the customer service operation and across the organisation

About Bright

Bright was founded in 2000. Since then over 1,000 contact centre reviews have been carried out and millions of clients' customers have been surveyed. Based on this we know what it takes to become a best in class customer service operation, and we have perfected our tools to help clients achieve this. The main areas covered are:

- Performance
- Customer satisfaction
- Employee engagement

The results form a platform for clients' target setting and improvement plans. The end goal is to significantly improve clients' competitive edge, while at the same time improving overall customer and staff satisfaction in the industry.

1. Introduction

The backbone of a solid customer experience strategy depends strongly on the survey tools in place. The best way of finding out what our customers think is still to ask them. However, industry insight into what constitutes a robust methodology has changed over the last five years, and so have customers' attitudes to participating. This paper is aimed at anyone in customer services who needs reliable customer insight and wants to drive internal improvements with that data. The biggest shift over recent years is towards actionable customer insight, as opposed to high-level, non-accountable data.

Customer service is more important than ever. Consumer views about companies and products are widely accessible, with social media acting as word of mouth on steroids. If companies fail, the risk is not only that the disgruntled customer becomes a detractor, but that he or she influences hundreds of other prospective customers negatively. It is therefore crucial to find out what your customers think, fast. In addition to being accountable and real-time, the data of course needs to be reliable.

Below we share some of the basics of survey methodology, based on 12 years of experience and millions of customer satisfaction surveys for clients. Among the questions covered are:

- Sample size: how much is enough, and how much is too much?
- Survey fatigue: how do we ensure we are not too intrusive?
- Survey scripts: what should we ask our customers?
- How can we prove a return on investment and get the capex approved?
- How can driving change significantly change how we use the data?

2. Sample sizes

Most companies are shaking off their previously relaxed attitude to robust sample sizes and now strive for reliable data. This is further driven by factors such as:

- New channels that customers are using to contact us (is there a difference in satisfaction between channels?).
- Legal compliance.
- The desire to break the results down to a staff, product or customer type level.

These factors are countered by the ever-increasing threat of survey fatigue amongst customers. All of them need to be considered when deciding on sample sizes.

It is crucial to differentiate between two requirements of customer satisfaction (C-Sat) surveys:

- C-Sat for business intelligence.
- C-Sat as an operational tool.

Most modern solutions for measuring post-contact satisfaction will tick both boxes. By gathering feedback on an individual level, you can drive individual behaviour and coach on results, at the same time as you accumulate thousands of surveys each month for the business needs.

This dual purpose creates misconceptions around sampling: what is the sample worth if companies are using customer surveying to develop staff (and thus drive C-Sat, NPS or other metrics on a higher level)? Bright recommends a minimum of 20 surveys per agent per month, which is a robust enough amount of data to continuously deliver external 360 feedback. At 20 surveys per agent per month, a small contact centre with 50 seats will generate 1,000 surveys every month, which is a very healthy number for sampling high-level feedback.

Graph 2 illustrates how many surveys companies should produce to obtain a statistically significant* representation of the average customer's impression. The required number of surveys hardly changes with the size of the operation.

* Source: http://www.research-advisors.com/tools/samplesize.htm, based on a 95% confidence level and 3% margin of error (common).

To put it simply: in order to get a sample overview of the average impression across your entire operation, you only need to interview around 1,500 customers. However, such information would scarcely be enough to understand it, and definitely not enough to drive change. This is why setting a sample size target on an individual level is needed. You will benefit from great support for team leaders and agents alike, at the same time as generating enough data to investigate the finer details of the results. (One thing that grows very well with larger samples is qualitative information such as verbatim comments.)

From time to time we see overkill in this approach, where organisations strive to gather as many expensive surveys as possible without knowing what to do with them. If you understand what you are doing, and can utilise your findings and tools to drive change, you should not need to generate more than 20 surveys per agent per month. Most contact centres either neglect the need for robust sample sizes on an individual level, or completely overdo it and still fail to get value for money. If you are actively collecting feedback from customers today, ask yourself what you are doing (and what you should be doing) with all this valuable information.
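As a rough illustration of the two sampling targets discussed in this section, the sketch below (Python, illustrative only) compares a standard statistical sample-size calculation with the operational 20-surveys-per-agent rule. The use of Cochran's formula and the parameter values are our assumptions, so the statistical figure will move with the confidence level and margin of error chosen and will not exactly reproduce the table cited above.

# Minimal sketch: how many completed surveys does an operation need?
# Assumptions (not from the white paper): Cochran's formula with a finite
# population correction and p = 0.5 (the most conservative case).

import math

def required_sample(population: int, confidence_z: float = 1.96,
                    margin_of_error: float = 0.03, p: float = 0.5) -> int:
    """Statistically required sample size for one overall satisfaction estimate."""
    n0 = (confidence_z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    n = n0 / (1 + (n0 - 1) / population)          # finite population correction
    return math.ceil(n)

def operational_sample(seats: int, surveys_per_agent_per_month: int = 20) -> int:
    """The operational recommendation above: 20 surveys per agent per month."""
    return seats * surveys_per_agent_per_month

if __name__ == "__main__":
    monthly_contacts = 100_000                     # hypothetical contact volume
    print(required_sample(monthly_contacts))       # about 1,050 with these settings;
                                                   # stricter margins push it towards ~1,500
    print(operational_sample(seats=50))            # 50 seats x 20 = 1,000 surveys/month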

3. Survey fatigue

Customer feedback is so valuable to an organisation that it is easy to focus on the opportunities and forget about the possible drawbacks. Surveys generate a range of adjectives in verbatim feedback: intrusive, overwhelming, tiring, annoying and so on. Bright recommends taking the following actions, depending on channel.

Web
- Intrusive: a huge web survey pop-up while browsing a website, requiring attention (you either participate or opt out).
- Non-intrusive alternatives: a floating tab or widget near the edge of the screen requiring no attention at all (opt in), or a standard link on the website.

Email
- Intrusive: multiple email reminders following your contact, or even worse, being called up about your email experience.
- Non-intrusive alternatives: first of all, skip the reminders; a maximum of one follow-up email for the survey should be sent, to avoid frustration. Even better, include an opt-in link near the email signature for a seamless transition.

Phone
- Intrusive: receiving random automated survey calls too often is the biggest pitfall here. We recommend a bare minimum of 30 days before calling the same telephone number again. NB: if somebody hangs up or opts out, that usually means they don't want to participate for one reason or another, which is a good reason for not queuing the number again the next hour or day.
- Non-intrusive alternatives: automated post-call IVR surveys should take place as soon after the relevant contact as possible. Turn off any repeat or reminder calls. Text/SMS surveying is also a good non-intrusive alternative following a phone call, as such messages are easy to ignore.

When surveying customers for feedback you are asking them for information and usually giving something in return. This something can take many shapes and forms, for example:

- A promise to improve customer service for them and others in the future.
- An opportunity to vent frustration in the case of a negative experience.
- A monetary incentive, direct (money) or indirect (a raffle).
- An assumption of altruism, that everybody loves helping out.

No matter how you return the favour to the customer, it is important to recognise this step. To illustrate how this can go awry, some survey designers work on the assumption that the customer owes the company something in return for good service, as if that service were not already taken for granted, or they rely on altruism because people like to help out (the fourth point above).

Using money as an incentive is an easy but possibly expensive shortcut to keeping your surveys non-intrusive as well as driving the response rate up. If you are going down this route, it is more effective to set up a raffle than to pay every participant £1, based on the simple observation that many more than 1,000 customers would participate in a raffle for £1,000.

The most logical and fair exchange is swapping information for information. As a customer leaves feedback, let them know how it will be used and respond to their comments. If you can make the customer feel that he or she is actually making an impact, you can easily transform a disgruntled customer into a very satisfied ambassador. Another way to improve the perception of your feedback solution is to publish your customer care strategy online. Dedicate a page to describing the journey and impact of a survey in a transparent manner, and your customers will feel appreciated and involved.

Not only will this strengthen the relationship between your business and the customer, it will also be a powerful asset in marketing.

4. Segmentation and questions

As mentioned earlier, sampling 1,500 of a company's customers would only be enough for high-level findings. By collecting more specific data and better understanding where improvements can be made, your survey approach becomes far more meaningful.

The larger your operation, the more likely it is that you have different teams specialising in different queries. There might, for example, be a sales, a service and a support department, all handling very different types of queries. It makes sense to reflect these differences in your survey strategy, especially when scripting the questionnaires.

Example of an IVR survey tailor-made for a generic service department:

- How satisfied were you with our speed to answer?
- How knowledgeable did you find the advisor?
- How helpful did you find the advisor?
- Overall, how satisfied were you with the call?
- How likely are you to recommend us to a friend or family member? (NPS)
- Was your query resolved in the call? (Yes/No)
- Plus the ability to record a comment providing more information, including factors beyond the contact centre's control.

The reason for choosing these questions is that we mainly want to find out three things:

1. What drives C-Sat and NPS: speed of answer, agent engagement, FCR? And to what extent?
2. Which individuals are having a negative or positive effect on customer perception, and why?
3. If the problem does not lie with the advisor, which other area is the customer unhappy with?

The key to building a successful post-contact survey is to keep it short and simple. Two minutes is a good benchmark for IVR surveys, and no more than 6-7 questions regardless of channel.

5. Take-up rates and cherry picking

In addition to the criteria covered above, you will also have to factor in the response rate you can expect from different channels. Looking at around one million surveys carried out in 2012, these are the average figures we found (see the sketch after this list for turning them into invitation volumes):

- Email: 11-14%
- IVR, manual invite: 60-80%
- IVR, automated: 30-40% (10 seconds after the call)
- IVR, automated: 15-20% (within 24 hours after the call)
- Customer opt-in: 1-5% (transferred to IVR directly after the call)
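To turn these response rates into practical invitation targets, here is a minimal sketch (Python, illustrative only). The channel rates are simply the midpoints of the ranges above, the 20-surveys-per-agent target comes from section 2, and the contact-centre size is a made-up example.

# Minimal sketch: how many survey invitations are needed per channel to hit
# the operational target of 20 completed surveys per agent per month?

RESPONSE_RATES = {                 # expected completion rate per invitation
    "email": 0.125,                # 11-14%
    "ivr_manual_invite": 0.70,     # 60-80%
    "ivr_automated_10s": 0.35,     # 30-40%, 10 seconds after the call
    "ivr_automated_24h": 0.175,    # 15-20%, within 24 hours
    "customer_opt_in": 0.03,       # 1-5%
}

def invitations_needed(seats: int, target_per_agent: int = 20) -> dict:
    """Invitations required per month, by channel, to reach the survey target."""
    target_total = seats * target_per_agent
    return {channel: round(target_total / rate)
            for channel, rate in RESPONSE_RATES.items()}

if __name__ == "__main__":
    for channel, invites in invitations_needed(seats=50).items():
        print(f"{channel}: ~{invites} invitations for 1,000 completed surveys")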

A problem many companies suffer from is using a manual approach to surveying, letting contact centre agents choose which customers to survey and not monitoring whether they cherry-pick only happy customers. There are, however, ways around this, and we have gathered our top three below.

Manual methodology: you pick a day when all customers must be asked, no exceptions. Our tool Bright Navigator has a built-in monitoring function showing surveys offered and take-up rate per agent, making absolutely sure all customers are invited.

Semi-automatic methodology: Bright is sent a bulk file with all numbers contacted by the client during the day, and our servers then carry out the surveys unbeknownst to the advisors. This results in a slightly lower take-up rate due to the delay before surveying, but it requires no advisor involvement and thus no cherry picking is possible.

Fully automatic methodology: ten seconds after the advisor has finished the call, the number is sent to us automatically by the client's system. This is the ideal scenario: clients can turn the volume up or down as they please, customers already surveyed are automatically excluded, and so on. Previously only around half of UK centres could be set up in this way; the others' systems were too old. Bright can now, however, offer a solution that enables automatic surveying for most companies, using a simple application programming interface.

6. Return on investment

High-level return on investment is easy to prove. If you measure key metrics in parallel with customer retention and spend, you will normally see an overall correlation. Colleagues from other departments may question whether this really has to do with customer service rather than any of the other of Kotler's Ps, i.e. price reductions, products, promotions or place. Once you start measuring customer satisfaction on an individual level, however, you will see exactly what impact customer service has on customers who have been exposed to the same offers and marketing messages.

For some sectors there are very few interactions per year to base your sample on, so it may take a long time before you have robust proof, but keep at it. You will find your budget requests approved a lot quicker when they are accompanied by an irrefutable benefits case. Read more about this in Bright's white paper "Proving and positioning contact centres as a corporate competitive advantage", which you will find in the Research section on brightindex.co.uk.

First call resolution, or FCR, can give you a more instant benefits case. Consider an FCR of 70% (70% of all calls resolved at first contact) for a contact centre handling 1,000,000 calls every year. If we assume the unresolved cases each require another contact, that is 300,000 extra calls. Even a one-percentage-point increase in FCR would reduce those extra calls by 10,000, saving your business £40,000.
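The arithmetic behind this benefits case is simple enough to script. The sketch below (Python, illustrative only) reproduces it; the cost per call is our assumption, since the £40,000 figure above implies roughly £4 per avoided repeat call.

# Minimal sketch of the FCR benefits case described above.
# COST_PER_CALL is an assumption; adjust it to your own figures.

ANNUAL_CALLS = 1_000_000
COST_PER_CALL = 4.0   # GBP, assumed

def repeat_calls(fcr: float, annual_calls: int = ANNUAL_CALLS) -> int:
    """Number of extra calls generated by contacts not resolved first time."""
    return round(annual_calls * (1.0 - fcr))

def saving_from_fcr_gain(current_fcr: float, new_fcr: float,
                         cost_per_call: float = COST_PER_CALL) -> float:
    """Annual saving from reducing repeat calls via an FCR improvement."""
    avoided = repeat_calls(current_fcr) - repeat_calls(new_fcr)
    return avoided * cost_per_call

if __name__ == "__main__":
    print(repeat_calls(0.70))                # 300,000 extra calls at 70% FCR
    print(saving_from_fcr_gain(0.70, 0.71))  # 10,000 fewer calls -> 40,000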

Customer surveying plays a key part here, in defining what goes wrong and thus focusing your root cause analysis. Depending on the implementation, you may also see a number of other benefits, such as lower attrition, time saved for team leaders, less need for other QA, and more. And of course, your complaints department will receive fewer calls.

7. Driving change in the customer service operation and across the organisation

All too often we see companies overlooking the huge impact that using C-Sat as an operational tool can have. CEOs in the UK often feel that anything to do with customer satisfaction should sit centrally with Marketing. We anticipate a shift here once it becomes clear that C-Sat measurement, following the five steps below, can have a significant impact on the company's market position, with 30% more very satisfied customers as just one of the outputs.

Step 1: Make C-Sat surveying accountable and actionable

Asking customers what they feel when they hear your brand name will not help drive change in the contact centre. Instead you need to be able to link the feedback to a recent call to your centre and break it down to agent level, or at minimum team level, making it accountable. Only when you can see who is affecting C-Sat or NPS positively or negatively can you do something about it.

Step 2: Deliver 360 feedback to staff

When we run instant post-call surveys and feed the results back to advisors individually, we see metrics like FCR and customer satisfaction improve significantly. There is an element of competitiveness behind this, but mainly it comes from staff members actually seeing, often for the first time, that they are making a difference. They can also experiment with the way they handle customers and see what effect that has on their scores. For instance, if they make an effort to sound happier on calls and this results in better customer satisfaction scores, they will repeat that behaviour. Sharing the results at agent level thus creates a self-developing and self-learning organisation.

Step 3: Empower your managers

Equally powerful is putting structured C-Sat feedback in the hands of your team leaders. Instead of spending time analysing and scoring agent contacts, they can use feedback taken directly from your customers. Combined with insight into what actually drives NPS and C-Sat, they can improve their effectiveness further by focusing on underperforming staff (see graph below). For those managers who aren't very comfortable with confrontation (and you will invariably have a few), an additional bonus is that in one-to-ones it isn't them telling the advisor they need to improve, it's the customer.
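As a minimal illustration of what "accountable" feedback in Step 1 can look like in practice, the sketch below (Python; the field names and data layout are our assumptions, not a description of Bright Navigator or any particular tool) links each survey response to the advisor who handled the call and aggregates scores per advisor, which is the kind of breakdown Steps 2 and 3 rely on.

# Minimal sketch: link post-call survey responses to advisors and aggregate.

from collections import defaultdict
from statistics import mean

surveys = [
    {"agent_id": "A17", "csat": 5, "nps": 9, "fcr": True},
    {"agent_id": "A17", "csat": 4, "nps": 8, "fcr": True},
    {"agent_id": "B02", "csat": 2, "nps": 3, "fcr": False},
]

def scores_by_agent(responses):
    """Average C-Sat, average NPS and FCR rate per advisor."""
    grouped = defaultdict(list)
    for r in responses:
        grouped[r["agent_id"]].append(r)
    return {
        agent: {
            "avg_csat": mean(r["csat"] for r in rows),
            "avg_nps": mean(r["nps"] for r in rows),
            "fcr_rate": sum(r["fcr"] for r in rows) / len(rows),
            "surveys": len(rows),    # compare against the 20-per-month target
        }
        for agent, rows in grouped.items()
    }

if __name__ == "__main__":
    for agent, stats in scores_by_agent(surveys).items():
        print(agent, stats)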

Step 4: Use verbatim comments to create a real-time information hub for your organisation

Last but certainly not least, this is the benefit that will put the centre on the wider organisational map. For as long as we have been trying to prove the value of customer satisfaction, we have been advocating the power of using the contact centre as a real-time information hub for the entire organisation. Millions have been invested in CRM systems and data warehousing solutions, but contact centres are now showing a much simpler way of doing it. The enabler is the arrival of low-cost customer survey methodologies that allow us to gather large volumes of verbatim comments. Even with IVR surveys, sound files can now be accurately converted into text at very low cost, enabling further data mining of a large volume of responses. When these comments are systematically sorted and passed on to other parts of the organisation, the following happens:

- A department receiving hundreds of comments every month from customers (passed on by the contact centre) complaining about a broken process will do something about it. This reduces avoidable calls into customer service, and thus costs; a recent CCA report cited savings of 5-10% from initial projects.
- The status of the contact centre changes from a necessary evil or cost centre to a business intelligence hub, helping the company keep its finger on the pulse, at no extra cost.
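A very simple version of the "systematically sorted and passed on" step could look like the sketch below (Python). The keyword lists and department names are invented for illustration; a real deployment would more likely use the text-analytics tooling mentioned above rather than naive keyword matching.

# Minimal sketch: route verbatim survey comments to the department most
# likely to own the issue, using naive keyword matching.

ROUTING_RULES = {
    "billing":   ["invoice", "bill", "charge", "refund"],
    "logistics": ["delivery", "courier", "late", "parcel"],
    "digital":   ["website", "app", "login", "password"],
}

def route_comment(comment: str) -> str:
    """Return the department a comment should be forwarded to."""
    text = comment.lower()
    for department, keywords in ROUTING_RULES.items():
        if any(keyword in text for keyword in keywords):
            return department
    return "contact_centre"   # default owner when no rule matches

if __name__ == "__main__":
    comments = [
        "I was charged twice on my last invoice",
        "The advisor was lovely but the parcel arrived a week late",
    ]
    for c in comments:
        print(route_comment(c), "<-", c)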

Step 5: Find out the real drivers of C-Sat and NPS, and close the loop

We have touched on how important it is to establish the facts before you decide on your KPIs. How do you do it? A simple approach is to look at correlations. Just as the team leaders above look at the drivers at advisor level, you as a manager can find the exact drivers at a higher level. On its own, this will only tell you what customers prioritise and in what order. However, if you measure it in parallel with your internal performance metrics (such as speed of answer, advisor engagement and FCR) over a six-month period, you will also find your own breaking points (a minimal sketch of this correlation approach appears at the end of this paper). This opens up a completely new world: you can focus on the right things and get more out of your budget by exceeding customer expectations at the lowest possible cost. Do this regularly, and adjust your KPIs and targets accordingly, because your customers' expectations, as well as the industry's and your peers' performance, will change.

Don't just measure customer satisfaction, drive it.
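As promised above, here is a minimal sketch of the correlation approach in Step 5 (Python with pandas, illustrative only). The metric names, the monthly granularity and the use of simple Pearson correlation are our assumptions rather than a description of Bright's methodology, and correlation is of course not causation; treat the output as a prompt for further investigation.

# Minimal sketch: correlate internal performance metrics with customer
# outcomes over a six-month period to see which operational levers appear
# to drive C-Sat and NPS. Column names and data are invented examples.

import pandas as pd

# One row per month (or per team per month in a larger dataset).
data = pd.DataFrame({
    "speed_to_answer_sec": [45, 60, 30, 90, 40, 55],
    "advisor_engagement":  [7.8, 7.2, 8.1, 6.9, 8.0, 7.5],
    "fcr_rate":            [0.71, 0.69, 0.74, 0.66, 0.73, 0.70],
    "csat":                [4.1, 3.9, 4.3, 3.7, 4.2, 4.0],
    "nps":                 [32, 28, 38, 22, 36, 30],
})

drivers = ["speed_to_answer_sec", "advisor_engagement", "fcr_rate"]
outcomes = ["csat", "nps"]

# Pearson correlation of each internal metric against each customer outcome.
print(data[drivers + outcomes].corr().loc[drivers, outcomes])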