SESSION 207
Wednesday, March 25, 11:30 AM - 12:30 PM
Track: Desktop Support

Performance Metrics for Desktop Support
Mike Hanson, Senior IT Manager, Optum, Inc.
mike@middlemgr.com

Session Description
There are many challenges to measuring the performance of desktop support, both at the team and individual levels. While the approach may vary based on unique factors like industry or scope, there are some basic measures that can be applied to any desktop support team. This session will look at common measurements within second-level support and present an effective approach to capturing relevant effort, efficiency, and quality measures for your desktop support analysts and the team as a whole. (Fundamental)

Speaker Background
Over the past twenty-five years, Mike Hanson has been involved with many aspects of IT, ranging from application development to desktop support. Today, Mike is a senior IT manager at Optum, Inc., where he proactively seeks out ways to improve service delivery for more than 125,000 clients. He is a certified HDI Support Center Manager, is ITIL Foundation and Practitioner certified, and is a former chair of the HDI Desktop Support Advisory Board.
Performance Metrics for Desktop Support
Michael Hanson
Session 207

Bio
Over the past twenty-five years, Mike Hanson has been involved with many aspects of IT, ranging from application development to desktop support. Today, Mike is a senior IT manager at Optum, Inc., where he proactively seeks out ways to improve service delivery for more than 125,000 clients. He is a certified HDI Support Center Manager, is ITIL Foundation and Practitioner certified, and is a former chair of the HDI Desktop Support Advisory Board.
What We'll Cover
The most common desktop support metrics
Customer-centric measures
Three critical measures and how to use them

Why Performance Metrics?
Tells us how we're performing against business goals
Gives near-real-time feedback on team and individual performance
Helps us refine and improve how well we provide services to our clients/customers
Video
From Roxanne, Columbia Pictures, 1987

Times Have Changed
Just a few years ago, desktop support metrics had no standards
Hard to categorize
Not as consistently measurable as the help desk
That is no longer the case
Now there are companies that specialize in metrics
What is Desktop Support?
For this discussion, desktop support refers to the IT organization that is responsible for responding to incidents, questions, and service requests involving desktop hardware, software, and operating systems.
Incidents (break/fix)
Service requests (scheduled work)
48% of organizations track these types of requests separately

How Do We Receive Work?
[chart]
Common Metrics: Volume
Number of incidents and/or service requests over a given period
Shows how much work is coming into DTS and allows appropriate staffing
Trending volumes can identify peak periods

Common Metrics: Volume
Individual volumes are performance indicators
Helps managers understand what analysts can accomplish
60% of organizations measure individual volumes
Common Metrics: Responsiveness
Tells management how fast the DTS team is getting to a normal incident or service request
Also illuminates how well the team meets customer expectations

Common Metrics: Responsiveness
53.6% respond to an incident in an hour or less
Service requests are not as clear-cut:
27.8% in less than 60 minutes
18.5% in 8-24 hours
21.7% in more than 24 hours
Variation is probably due to how a service request is defined
Common Metrics: Efficiency
Directly impacts the customer
How many hand-offs? 16-20%
Should an incident be resolved at the service desk?
Resolving it there gets the customer back to work faster and is less costly to the business
34% track incidents that went to DTS but should have been resolved at the service desk
Common Metrics: Efficiency
Once an incident reaches DTS, how quickly is it resolved?
Half of all organizations measure average time to resolve
59.8% state incidents are usually resolved within 8 hours
Service requests take longer:
14.9% take 8-24 hours
18.5% take 1-2 days
20.9% take 3-5 days

Common Metrics: Efficiency
FCR (first contact resolution): median 70-80%
Escalations (requires assistance outside of DTS): 39% track escalations; the median is 6-10%
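These efficiency figures are simple to compute once each ticket record carries a resolution timestamp and a couple of flags. The sketch below is one minimal way to do it in Python; the field names (opened, resolved, resolved_on_first_contact, escalated) are hypothetical and would map to whatever your incident management tool exports, not to any specific product.

```python
from datetime import datetime

# Hypothetical ticket export; field names are illustrative only.
tickets = [
    {"opened": datetime(2015, 3, 2, 9, 0),  "resolved": datetime(2015, 3, 2, 11, 30),
     "resolved_on_first_contact": True,  "escalated": False},
    {"opened": datetime(2015, 3, 2, 10, 0), "resolved": datetime(2015, 3, 3, 16, 0),
     "resolved_on_first_contact": False, "escalated": True},
    {"opened": datetime(2015, 3, 3, 8, 0),  "resolved": datetime(2015, 3, 3, 9, 0),
     "resolved_on_first_contact": True,  "escalated": False},
]

# Average time to resolve, in hours.
hours = [(t["resolved"] - t["opened"]).total_seconds() / 3600 for t in tickets]
mean_time_to_resolve = sum(hours) / len(hours)

# FCR and escalation rates as percentages of all tickets.
fcr_rate = 100 * sum(t["resolved_on_first_contact"] for t in tickets) / len(tickets)
escalation_rate = 100 * sum(t["escalated"] for t in tickets) / len(tickets)

print(f"Average time to resolve: {mean_time_to_resolve:.1f} hours")
print(f"FCR rate: {fcr_rate:.0f}%   Escalation rate: {escalation_rate:.0f}%")
```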
Common Metrics: Effort
Actual time the analyst puts into solving the problem
Surprisingly, only 25% measure this
61.2% spend 2 hours or less per incident
59.3% of service requests take 1 hour or less
Why this is important:
Gauges how much time is spent on core work
Indicates how much time specific types of incidents require
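If analysts log their hands-on time, the "how much time do specific types of incidents require" question becomes a simple grouping exercise. A small sketch, again with hypothetical field names (category, effort_minutes) standing in for whatever your tool records:

```python
from collections import defaultdict

# Hypothetical worklog entries: hands-on analyst time per incident, in minutes.
worklog = [
    {"category": "printer",    "effort_minutes": 25},
    {"category": "printer",    "effort_minutes": 40},
    {"category": "os rebuild", "effort_minutes": 180},
    {"category": "password",   "effort_minutes": 10},
]

# Group logged effort by incident category.
by_category = defaultdict(list)
for entry in worklog:
    by_category[entry["category"]].append(entry["effort_minutes"])

for category, minutes in sorted(by_category.items()):
    print(f"{category}: {sum(minutes) / len(minutes):.0f} min average effort "
          f"across {len(minutes)} incidents")
```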
Common Metrics: Satisfaction
Customer's perception of DTS performance
35% capture customer satisfaction in some way
91% Satisfied/Very Satisfied

IS IT TIME TO THINK DIFFERENT?
What Are These Metrics Telling Us?
How much work happened?
How long did it take?
How much did it cost?
How happy are our customers?
These are quantitative metrics

Becoming Business-Centric
How do we measure business impact?
CSAT (Customer Satisfaction)
Net Promoter Score (NPS)
Customer Effort Score (CES)
These measure what is important to the customer.
CSAT
Measures the degree to which a service meets the customer's expectations
Gathered by surveying customers
Responses are averaged for a composite CSAT score
Generally expressed as a percentage (0-100%)
Can correlate with changes in training or procedures

Net Promoter Score
Measures customer loyalty
"How likely is it that you'd recommend this service to a friend?"
0-10 scale
Usually allows for comment ("Why do you feel this way?")
Motivates and focuses an organization on improvement
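As a rough sketch of the arithmetic: CSAT here follows the "responses are averaged and expressed as a percentage" convention above, and NPS follows the usual 0-10 scale, where the percentage of detractors (0-6) is subtracted from the percentage of promoters (9-10). The thresholds and the 1-5 survey scale below are common conventions, not something the deck prescribes, so adjust them to match your own survey tool.

```python
def csat(scores, max_score=5):
    """Average the survey responses and express the result as a percentage
    of the maximum possible score (the convention described in the slide)."""
    return 100 * (sum(scores) / len(scores)) / max_score

def nps(scores):
    """Net Promoter Score on the usual 0-10 scale:
    % promoters (9-10) minus % detractors (0-6); passives (7-8) are ignored."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

csat_responses = [5, 4, 4, 5, 3, 5]        # 1-5 satisfaction survey
nps_responses = [10, 9, 8, 6, 10, 7, 9]    # 0-10 "would you recommend" survey

print(f"CSAT: {csat(csat_responses):.0f}%")
print(f"NPS:  {nps(nps_responses):+.0f}")
```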
Customer Effort Score
How hard is it to be our customer?
How much work do we ask them to do to solve a problem?
High effort equals low customer loyalty
"How much effort did you have to personally put into this request?"

Customer Effort Score
"Stop Trying to Delight Your Customers" (Harvard Business Review)
Customers almost always punish poor service
They rarely reward excellent service
Loyalty is more often tied to how well we provide basic services
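Arithmetically, CES works much like CSAT; the value comes from trending the average and watching the share of high-effort responses. A minimal sketch, assuming a hypothetical 1-5 scale where 5 means "a great deal of effort" (your survey may use a different scale or direction):

```python
# Hypothetical 1-5 responses to "How much effort did you have to personally
# put into this request?" (5 = a great deal of effort).
responses = [1, 2, 1, 4, 2, 5, 1, 2]

average_effort = sum(responses) / len(responses)
high_effort_share = 100 * sum(1 for r in responses if r >= 4) / len(responses)

print(f"Average effort: {average_effort:.1f} / 5")
print(f"High-effort responses (4-5): {high_effort_share:.0f}%")
```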
Video

MEASURING INDIVIDUAL PERFORMANCE
Evaluating Staff
Can be a real challenge for technical managers
Managers often fall into the trap of managing by spreadsheet
Do you know how your employees feel about the way their performance is evaluated?

Environment
How often do you talk to your employees?
If you manage supervisors or other managers, how often do you have skip-level discussions?
Keep these discussions private, confidential, and safe
Diverse: the business has a national or international presence
Carrot & Stick Method
Good performance = rewarded with a carrot
Bad performance = whacked with a stick
REACTIVE

A Simple Method: Three Key Performance Indicators
Effort: the amount of time the employee takes to actually perform the core responsibilities of their job
Efficiency: how much work the employee is able to produce in a measured time frame
Quality: whether the employee performed the task correctly the first time and the customer is satisfied
Effort
Need a means to capture effort, preferably in the incident management (IM) system
This is a cultural change
Should account for non-core activity

Efficiency
Similar to a volume measure
The degree of time and effort per analyst can vary: an analyst closing many quick incidents and one closing fewer complex incidents can both be equally efficient
Measuring volume alone allows for cherry-picking of incidents
Quality
Many ways to determine quality
Surveys are the most common
Rework: how often does the customer reopen the same issue because it was not fixed correctly the first time?
FCR: first contact resolution

Calculating Effort, Efficiency, and Quality
Quality is based on the percentage of success (one possible calculation is sketched below)
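Rework (reopened tickets), FCR, and survey results can be folded into a single "percentage of success" figure. The deck does not prescribe a formula, so the equal weighting below is purely an illustrative assumption, one of many reasonable ways to combine the inputs.

```python
def quality_score(resolved, reopened, fcr, satisfied_surveys, total_surveys):
    """One possible quality measure: how often the analyst's work 'stuck'.

    rework_ok  - share of resolutions that were NOT reopened
    fcr_rate   - share of incidents resolved on first contact
    csat_rate  - share of surveys marked satisfied or better
    The equal weighting is an illustrative assumption, not a standard.
    """
    rework_ok = (resolved - reopened) / resolved
    fcr_rate = fcr / resolved
    csat_rate = satisfied_surveys / total_surveys if total_surveys else 1.0
    return 100 * (rework_ok + fcr_rate + csat_rate) / 3

# An analyst who resolved 80 incidents, had 4 reopened, fixed 60 on first
# contact, and had 18 of 20 returned surveys marked satisfied:
print(f"Quality: {quality_score(80, 4, 60, 18, 20):.0f}%")
```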
Using the Three KPIs
Put together a utilization report
Regularly scheduled, likely monthly
The objective is to provide an at-a-glance score for individual analysts
Compares individual scores with historical data
Baselines for expected productivity
Comparison with overall team performance

Using the Three KPIs
Baseline: a percentage based on the number of hours available for core work
Team: based on an average of the overall team scores
(One possible layout of such a report is sketched below.)
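A utilization report along these lines boils down to a few ratios per analyst: logged effort against the baseline of hours available for core work, output against the team average, and a quality percentage. The deck does not give exact formulas, so the field names and calculations below are assumptions meant only to show the shape of the report.

```python
# Hypothetical monthly figures per analyst. "available_hours" is the time
# available for core work after meetings, training, and other non-core activity.
analysts = {
    "Ana":  {"effort_hours": 120, "available_hours": 140, "resolved": 95,  "quality_pct": 92},
    "Ben":  {"effort_hours": 100, "available_hours": 140, "resolved": 110, "quality_pct": 88},
    "Cleo": {"effort_hours": 130, "available_hours": 140, "resolved": 80,  "quality_pct": 95},
}

# Team average used as the comparison point for individual output.
team_resolved = sum(a["resolved"] for a in analysts.values()) / len(analysts)

print(f"{'Analyst':8} {'Utilization':>11} {'Efficiency':>11} {'Quality':>8}")
for name, a in analysts.items():
    utilization = 100 * a["effort_hours"] / a["available_hours"]   # effort vs. baseline hours
    efficiency = 100 * a["resolved"] / team_resolved               # volume vs. team average
    print(f"{name:8} {utilization:>10.0f}% {efficiency:>10.0f}% {a['quality_pct']:>7}%")
```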
Final Words
We've looked at a lot of numbers
LEAD, don't simply manage
Communicate regularly
One-on-one meetings should be mandatory: be prepared, and document them
Translate business goals into goals IT can understand
True performance management is not only about numbers; it's about relationships

Sources
2013 HDI Desktop Support Salary & Practices Report
2014 HDI Desktop Support Salary & Practices Report
Roy Atkinson, "Metrics for the New World of Support," HDI research paper
Mike Hanson, "Desktop Support Metrics," HDI research paper
Matthew Dixon, Karen Freeman, and Nicholas Toman, "Stop Trying to Delight Your Customers," Harvard Business Review, July 2010
Michael Hanson, "It's Not Just the Numbers: Performance Management for Support Teams," HDI Focus on Performance, 2011
Thank you for attending this session.
Session 207: Performance Metrics for Desktop Support
Michael Hanson
Don't forget to complete an evaluation form!