Development of a Balanced Scorecard for Service Desk KPIs
Presented by: Robert Higgins and Jason Reid
Positions: IT Service Desk and Telecommunications Manager; Team Leader
Balanced IT Scorecard
History
Before Situation
KPI - Categories
KPI Quiz
1. Cost per Contact
2. Customer Satisfaction
3. Agent Utilisation
4. First Contact Resolution
5. Agent Satisfaction
6. Average Speed of Answer
7. ????
7. Aggregate Service Desk Performance
PGW
5th PEA?
Quantity Quality Timeliness Compliance
Step 1: Understand customer expectations and service requirements
Step 2: Define the Critical Success Factors of the Service Desk 5Ps you are measuring
Step 3: Choose the KPI category you are measuring (Quantity, Quality, Timeliness, Compliance)
Step 4: Define the KPI
Step 5: Define the KPI target or success criteria
Step 6: Determine the data source and instrumentation to obtain and collect KPI measurements
KPI Measure | CSF | CSF Category | Value Category | KPI Category | Data Elements | Data Source
First Level Resolution Rate | Total Cost of Ownership is reduced by resolution at Level 1 compared to higher costs at Levels 2 and 3 | Performance | Efficiency | Quantity | Group Closed By | LSD Request Form
Customer Satisfaction | Customers are satisfied with the quality of the service provided by the Service Desk | Quality | Satisfaction | Quality | Overall Satisfaction with IT | LSD Survey
First Contact Resolution Rate | The majority of incidents should be resolved by the Service Desk on first contact | Quality | Productivity, Efficiency | Quality, Quantity | FCR Field (*new) | LSD Request Form
Requests Closed in SLA | Incidents are resolved within timeframes agreed with the customer | Performance | Productivity, Efficiency | Timeliness, Compliance | Time Closed | LSD Request Form
Average Speed to Answer (ASA) (sec.) | Customers can reach a Service Desk Analyst when needed | Performance | Productivity, Efficiency | Compliance | ASA | Cisco Call Manager
SDA Job Satisfaction | Agents are satisfied with their job and are engaged with PGW (Stocktake) | Quality | Satisfaction | Quality | Happiness Factor (*new) | 1 on 1
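Steps 1 to 6 effectively define a record per KPI, and the mapping table above fills that record in. As a minimal illustrative sketch (not part of the original presentation), the structure could be captured in Python roughly as follows; the class and field names are assumptions that simply mirror the table columns and steps.

```python
from dataclasses import dataclass

@dataclass
class KPIDefinition:
    """One row of the KPI mapping table (fields follow Steps 1-6)."""
    name: str              # Step 4: the KPI itself
    csf: str               # Step 2: Critical Success Factor it supports
    kpi_categories: list   # Step 3: Quantity / Quality / Timeliness / Compliance
    target: str            # Step 5: target or success criteria (set later on the scorecard)
    data_source: str       # Step 6: where the measurement comes from

# Example row, taken from the mapping table above.
first_level_resolution = KPIDefinition(
    name="First Level Resolution Rate",
    csf="TCO is reduced by resolution at Level 1 rather than Levels 2 and 3",
    kpi_categories=["Quantity"],
    target="Annual worst/best case range on the scorecard",
    data_source="LSD Request Form (Group Closed By)",
)
```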
# | Performance Metric | Metric Weight
1 | First Level Resolution Rate | 25%
2 | Customer Satisfaction | 25%
3 | First Contact Resolution Rate | 20%
4 | Requests Closed in SLA | 15%
5 | Average Speed to Answer (ASA) (sec.) | 10%
6 | SDA Job Satisfaction | 5%
  | Total | 100%
# | Performance Metric | Metric Weight | Annual Worst Case | Annual Best Case
1 | First Level Resolution Rate | 25% | 55% | 63%
2 | Customer Satisfaction | 25% | 63% | 75%
3 | First Contact Resolution Rate | 20% | 40% | 52%
4 | Requests Closed in SLA | 15% | 91% | 96%
5 | Average Speed to Answer (ASA) (sec.) | 10% | 67 | 12
6 | SDA Job Satisfaction | 5% | 55% | 88%
  | Total | 100% | |
# | Performance Metric | Metric Weight | Annual Worst Case | Annual Best Case | Month's Performance
1 | First Level Resolution Rate | 25% | 55% | 63% | 55%
2 | Customer Satisfaction | 25% | 63% | 75% | 68%
3 | First Contact Resolution Rate | 20% | 40% | 52% | 51%
4 | Requests Closed in SLA | 15% | 91% | 96% | 96%
5 | Average Speed to Answer (ASA) (sec.) | 10% | 67 | 12 | 15
6 | SDA Job Satisfaction | 5% | 55% | 88% | 55%
  | Total | 100% | | |
Metric Score = [(Worst Case - Actual Performance) / (Worst Case - Best Case)] x 100
# | Performance Metric | Metric Weight | Annual Worst Case | Annual Best Case | Month's Performance | Metric Score
1 | First Level Resolution Rate | 25% | 55% | 63% | 55% | 0%
2 | Customer Satisfaction | 25% | 63% | 75% | 68% | 41%
3 | First Contact Resolution Rate | 20% | 40% | 52% | 51% | 92%
4 | Requests Closed in SLA | 15% | 91% | 96% | 96% | 100%
5 | Average Speed to Answer (ASA) (sec.) | 10% | 67 | 12 | 15 | 95%
6 | SDA Job Satisfaction | 5% | 55% | 88% | 55% | 0%
  | Total | 100% | | | |
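The Metric Score column above is just the normalisation formula applied to each row. The short Python sketch below is an illustrative reconstruction, not the spreadsheet actually used; the function name and the clamping of out-of-range results to 0-100% are assumptions. Note that the same formula handles "lower is better" metrics such as ASA, because its Worst Case is simply larger than its Best Case.

```python
def metric_score(worst, best, actual):
    """Normalise a month's performance onto a 0-100% scale.

    Score = (Worst Case - Actual) / (Worst Case - Best Case) * 100.
    Works for both 'higher is better' metrics (worst < best) and
    'lower is better' metrics such as ASA in seconds (worst > best).
    Results outside the annual range are clamped to 0-100%
    (an assumption; the slides only show in-range values).
    """
    score = (worst - actual) / (worst - best) * 100
    return max(0.0, min(100.0, score))

# Worked examples using the figures from the table above:
print(metric_score(55, 63, 55))   # First Level Resolution -> 0.0%
print(metric_score(63, 75, 68))   # Customer Satisfaction -> ~41.7%
print(metric_score(40, 52, 51))   # First Contact Resolution -> ~91.7%
print(metric_score(67, 12, 15))   # ASA (sec., lower is better) -> ~94.5%
```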
Metric Score x Metric Weight
# | Performance Metric | Metric Weight | Annual Worst Case | Annual Best Case | Month's Performance | Metric Score | Balanced Scorecard
1 | First Level Resolution Rate | 25% | 55% | 63% | 55% | 0% | 0.00%
2 | Customer Satisfaction | 25% | 63% | 75% | 68% | 41% | 10.29%
3 | First Contact Resolution Rate | 20% | 40% | 52% | 51% | 92% | 18.33%
4 | Requests Closed in SLA | 15% | 91% | 96% | 96% | 100% | 15.00%
5 | Average Speed to Answer (ASA) (sec.) | 10% | 67 | 12 | 15 | 95% | 9%
6 | SDA Job Satisfaction | 5% | 55% | 88% | 55% | 0% | 0.00%
  | Total | 100% | | | | | 53.08%
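Each metric's contribution is Metric Score x Metric Weight, and the overall Balanced Scorecard is the sum of those contributions. The sketch below is a minimal illustration using the rounded scores from the table above; its total (~53.15%) differs slightly from the 53.08% on the slide only because the displayed scores are rounded.

```python
# (Metric Weight, Metric Score) pairs taken from the table above.
metrics = {
    "First Level Resolution Rate":   (0.25, 0.00),
    "Customer Satisfaction":         (0.25, 0.41),
    "First Contact Resolution Rate": (0.20, 0.92),
    "Requests Closed in SLA":        (0.15, 1.00),
    "Average Speed to Answer (ASA)": (0.10, 0.95),
    "SDA Job Satisfaction":          (0.05, 0.00),
}

# Balanced Scorecard = sum over metrics of (Metric Score x Metric Weight).
balanced_scorecard = sum(weight * score for weight, score in metrics.values())
print(f"{balanced_scorecard:.2%}")   # ~53.15%, vs 53.08% on the slide (rounding)
```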
Service Desk Scorecard, May 2014 to Apr 2015
# | Performance Metric | Metric Weight | Annual Worst Case | Annual Best Case | Performance | Metric Score | Balanced Scorecard
1 | First Level Resolution Rate | 25% | 69% | 77% | 69% | 0% | 0.00%
2 | Customer Satisfaction | 25% | 50% | 94% | 93% | 97% | 24.29%
3 | First Contact Resolution Rate | 20% | 39% | 57% | 57% | 100% | 20.00%
4 | Requests Closed in SLA | 15% | 81% | 90% | 85% | 48% | 7.21%
5 | Average Speed to Answer (ASA) (sec.) | 10% | 40 | 18 | 19 | 95% | 9.55%
6 | SDA Job Satisfaction | 5% | 58% | 73% | 67% | 59% | 2.97%
  | Total | 100% | | | | | 64.01%
Performance Metric 1: First Level Resolution (FLR) Rate
Definition: FLR is the percentage of requests resolved by the IT Service Desk (Helpdesk plus SD Level 2). Any request resolved by another IT team (Infrastructure, Applications or Web) is, by definition, not resolved at our first level. Typically this is considered a cost metric since it has a strong impact on Total Cost of Ownership for end-user support; however, we are also using it to measure our ability to free up Level 3 resources so they can focus on value-added project work.
Why it's important: FLR is a measure of the overall competency of the Service Desk and is a proxy for Total Cost of Ownership (TCO). A high FLR rate helps to minimize TCO because each contact resolved at Level 1 avoids a higher cost of resolution and frees up resources at Level n (Infrastructure, Applications or Web). Service Desks can improve their Level 1 resolution rates through training and investment in technologies such as remote diagnostic tools and knowledge management systems.
Expected Outcomes / CSFs: Total Cost of Ownership is reduced by resolution at First Level compared to higher costs at Level 3; Level 3 resources are freed up for value-added project work.
Expected SDA Behaviours: Continuous learning, knowledge sharing, ownership, not escalating, growth.
Performance Metric 1: First Level Resolution (FLR) Rate
Definition: FLR is the percentage of requests resolved by the IT Service Desk (Helpdesk plus SD Level 2). Any request resolved by another IT team (Infrastructure, Applications or Web) is, by definition, not resolved at our first level. Typically this is considered a cost metric since it has a strong impact on Total Cost of Ownership for end-user support; however, we are also using it to measure our ability to free up Level 3 resources so they can focus on value-added project work.
Why it's important: FLR is a measure of the overall competency of the Service Desk and is a proxy for Total Cost of Ownership (TCO). A high FLR rate helps to minimize TCO because each contact resolved at Level 1 avoids a higher cost of resolution and frees up resources at Level n (Infrastructure, Applications or Web). Service Desks can improve their Level 1 resolution rates through training and investment in technologies such as remote diagnostic tools and knowledge management systems.
Expected Outcomes / CSFs: Total Cost of Ownership is reduced by resolution at First Level compared to higher costs at Level 3; Level 3 resources are freed up for value-added project work.
Expected SDA Behaviours: Continuous learning, knowledge sharing, ownership, not escalating, growth.

Performance Metric 2: Customer Satisfaction
Definition: Customer Satisfaction is the rating of the quality of the overall experience of the service delivered by IT. This metric is measured via a customer satisfaction survey within our Service Portal, triggered for one in every ten requests logged.
Why it's important: Customer Satisfaction is the single most important measure of Service Desk quality. Any successful Service Desk will have consistently high Customer Satisfaction ratings.
Expected Outcomes / CSFs: Customers are satisfied with the quality of the service provided by the Service Desk; increased employee productivity.
Expected SDA Behaviours: Professionalism, active listening, politeness, teaching, prevention.

Performance Metric 3: First Contact Resolution (FCR) Rate
Definition: FCR applies to all forms of contact, be it telephone call, email, self-service or walk-in. It is the percentage of requests resolved on the first interaction (contact) with the customer, divided by all resolved requests. Calls that require a customer call-back, or are otherwise unresolved on the first contact for any reason, do not qualify for net First Contact Resolution.
Why it's important: FCR is the single biggest driver of Customer Satisfaction. A high FCR rate is almost always associated with high levels of Customer Satisfaction. Service Desks that emphasise training (i.e. high training hours for new and veteran Agents) and have good technology tools, such as remote diagnostic capability and knowledge management, generally enjoy a higher than average FCR rate.
Expected Outcomes / CSFs: The majority of incidents should be resolved by the Service Desk on first contact.
Expected SDA Behaviours: Continuous learning, knowledge sharing, ownership, not escalating/calling back/log and flog.

Performance Metric 4: Requests Closed in SLA (Priority)
Definition: The percentage of requests resolved within SLAs (success rate). Requests are assigned priorities based on business impact in order to measure whether resolution times meet agreed timeframes.
Why it's important: Expectations are set by the business through SLAs, which IT needs to deliver against. Depending on the demand for requests, this can inform remedial action such as resource allocation planning or training.
Expected Outcomes / CSFs: Incidents are resolved within timeframes agreed with the customer.
Expected SDA Behaviours: Efficiency, research, communication, updating, time.

Performance Metric 5: Average Speed to Answer (ASA) (sec.)
Definition: ASA is the total wait time callers spend in the queue, divided by the number of calls handled. This includes both IVR-handled calls and calls handled by a live Agent.
Why it's important: ASA is an indication of how responsive a Service Desk is to incoming calls.
Expected Outcomes / CSFs: Customers can reach a Service Desk Analyst when needed.
Expected SDA Behaviours: Readiness, log and flog, brevity, call backs.

Performance Metric 6: SDA Job Satisfaction
Definition: Agent Job Satisfaction is a measure of how engaged the service desk staff member is, also known as the happiness factor. Typically above 70% is engaged, 40%-70% ambivalent, and less than 40% disengaged.
Why it's important: It gives an indication of how we are tracking, or rather a stocktake of, employee engagement. It is a proxy for morale which, while difficult to measure, is a bellwether metric that affects almost every other metric in the Service Desk. High-performance Service Desks almost always have high levels of Agent Job Satisfaction.
Expected Outcomes / CSFs: Agents are satisfied with their job and are engaged with PGW.
Expected SDA Behaviours: Engagement, presence, longevity, loyalty, growth.
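The rate-style metrics defined above are simple ratios. The sketch below shows how they might be computed from raw ticket and call counts; the counts are made up for illustration and are not PGW data, and the variable names are assumptions.

```python
# Hypothetical monthly counts, purely for illustration.
resolved_total = 1200          # all resolved requests
resolved_first_level = 780     # closed by Helpdesk + SD Level 2
resolved_first_contact = 560   # closed on the first interaction with the customer
closed_within_sla = 1100       # closed inside the agreed priority timeframe
total_queue_wait_sec = 21_600  # total time callers spent in queue
calls_handled = 1440           # IVR-handled plus live-agent calls

flr = resolved_first_level / resolved_total * 100    # First Level Resolution %
fcr = resolved_first_contact / resolved_total * 100  # First Contact Resolution %
sla = closed_within_sla / resolved_total * 100       # Requests Closed in SLA %
asa = total_queue_wait_sec / calls_handled           # Average Speed to Answer (sec.)

print(f"FLR {flr:.1f}%  FCR {fcr:.1f}%  SLA {sla:.1f}%  ASA {asa:.1f}s")
```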
After Situation
Log non-urgent requests via the IT Service Portal (*new), your primary channel for logging non-urgent IT issues and requests:
DIY, #1 channel, feedback survey, monitor and update requests, see latest alerts, reduced call hold times.
http://pggwapp04/laytonservicedesk/euserauto.aspx
% of Requests by Source, 01/12/2014 to 30/04/2015
Source | Dec-2014 | Jan-2015 | Feb-2015 | Mar-2015 | Apr-2015
Bushwire Forms | 0.00% | 1.108% | 2.287% | 4.879% | 5.447%
Email | 47.913% | 50.29% | 51.659% | 50.385% | 49.463%
Phone | 36.754% | 38.313% | 39.507% | 38.194% | 37.691%
Self Service | 3.337% | 1.956% | 2.451% | 4.381% | 4.419%
Unknown | 11.729% | 7.342% | 3.424% | 1.052% | 0.00%
Walk In | 1.091% | 2.178% | 1.617% | 1.765% | 3.57%
Rob Higgins
IT Service Desk and Telecommunications Manager
Email: rob.higgins@pggwrightson.co.nz
LinkedIn: https://nz.linkedin.com/in/robertghiggins