SEPTEMBER 2013

Desktop Support Metrics
Written by Mike Hanson | Data analysis by Jenny Rains

At the beginning of my career as a desktop support manager, I searched everywhere for examples of industry-standard measurements for second-level support organizations. At that time, my search was in vain because desktop support was in its infancy. There were plenty of metrics for help desks, but nothing concrete for desktop support teams or individuals. Today, I'm happy to report that's no longer the case. The support profession has matured and is now recognized as an integral part of IT and the business. There are even organizations that specialize in measuring support teams, and there are now a number of common metrics used across a variety of industries.

In this brief, we will focus on the metrics commonly used by desktop support. This data is based on the 2013 HDI Desktop Support Practices & Salary Report. From November 2012 through January 2013, HDI surveyed a cross-section of more than thirty industries, with the 978 respondents representing multinational organizations that are either based in or provide support to end users around the world.

THE FOUNDATION

Depending on the size of an organization or the type of business, desktop support can mean different things, have different scopes, or even have different names (e.g., second-level support, field services). For the purposes of this paper, desktop support refers to the IT organization that's responsible for responding to incidents, questions, and service requests that involve desktop hardware, software, and operating systems. Desktop support teams are also usually responsible for fulfilling service requests related to desktop hardware and deploying or updating software on a client's local workstation. Tickets are categorized as desktop support based on the type of issue (64%), the individual assigned to handle the issue (45%), and the manner of resolution (17%).
An incident refers to a problem the customer is having with hardware or software (i.e., there's something broken that needs to be fixed). Conversely, service requests are scheduled events. If a customer needs a new computer or peripheral, or new or upgraded software, then the service desk would open a service request and desktop support would schedule the work. Forty-one percent of organizations track these two ticket types separately. In these organizations, on average, 56 percent of tickets are incidents and 42 percent are service requests. Thirty-five percent of organizations don't differentiate, while the remaining 25 percent distinguish between the two types but don't track them separately.

Desktop support organizations operate in almost every industry, serving a variety of business types and supporting a range of client bases, from very small to very large. Thus, desktop support presents a unique challenge. Over the years, certain metrics have bubbled to the top as standards, but the interpretation of those metrics may not be as consistent as you might find in a more structured environment, like the service desk. Nonetheless, a good baseline of organizational metrics can help us manage our operations properly and respond more effectively to the business's needs.

THE METRICS

Desktop support gets its work from a variety of sources. Depending on how large the organization is or how mature the support teams are, there may be multiple avenues for work to make its way into the support queue. The support center is the primary channel, with 46 percent of tickets assigned to desktop support from there. In some organizations, customers are able to contact desktop support directly via phone (22% of tickets) or email (14% of tickets). Web requests and walk-ups/drive-bys constitute the remaining 17 percent of tickets.
Regardless of intake channel, there are some metrics that are common across industries.

HDI Research Brief, September 2013

Percentage of tickets assigned to desktop support from each of the following channels:

Assigned by the support center: 46.3%
End user calls desktop support directly: 22.3%
End user emails desktop support directly: 14.4%
End user submits a web request (i.e., does not involve the support center): 10.2%
Walk-ups/drive-bys: 6.4%
Other: 0.4%

Volume

For forecasting purposes, the most common metric is volume: the number of incidents and service requests (or both) assigned to the desktop support team. This is an important measure because it gives management an idea of how much work is coming to the support team, and it allows them to staff the team appropriately to deal with that incoming work. Tracking this data over time can provide historical perspective, enabling managers to identify certain times of the week, month, or year that may require more resources. In 2013, 56 percent of organizations saw an increase in the number of tickets received by desktop support.

More than half of the organizations surveyed measure the percentage of tickets handled by desktop support. Of the organizations that separate incidents and service requests, 25.5 percent report that more than half of all incidents are assigned to desktop support; the median is 25-30 percent. The numbers are similar for service requests, with 30.5 percent reporting that more than half are being handled by desktop support; the median here is 41-45 percent. As we might expect, desktop support handles more service requests than incidents. This is even true in organizations that don't distinguish between the two ticket types, with 36.5 percent receiving more than half of all tickets (median = 41-45%).

Volume is also important at the individual level, as it helps managers understand what individual desktop support analysts can accomplish.
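Volume figures like these can be tallied from any ticket log. The sketch below is a minimal illustration, not the HDI methodology; the field names and sample data are hypothetical.

```python
from collections import Counter
from datetime import date

# Hypothetical ticket log: (ticket_id, technician, date_opened)
tickets = [
    (1, "alice", date(2013, 1, 7)),
    (2, "alice", date(2013, 1, 9)),
    (3, "bob",   date(2013, 1, 9)),
    (4, "bob",   date(2013, 2, 3)),
    (5, "alice", date(2013, 2, 4)),
]

# Team volume per month, the basis for staffing forecasts
volume_by_month = Counter((d.year, d.month) for _, _, d in tickets)

# Tickets handled per technician per month, the individual workload measure
per_tech_month = Counter((tech, d.year, d.month) for _, tech, d in tickets)

print(volume_by_month[(2013, 1)])          # tickets opened in January
print(per_tech_month[("alice", 2013, 1)])  # Alice's January workload
```

Keeping the raw (technician, month) counts, rather than just the totals, is what lets a manager spot the weekly or seasonal peaks the brief mentions.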
Almost 60 percent of organizations surveyed measure the average number of tickets resolved by a single technician in a month. In 21 percent of organizations, analysts handle more than 200 tickets each month, but the median is 101-125 tickets per month.

Responsiveness

Responsiveness is just as important as volume. More than half (53%) of organizations measure responsiveness, which tells managers just how fast the support team can get to a typical incident or request (not including high-priority or urgent issues). This metric also helps management understand how well the support team is meeting customer expectations. The survey revealed that 48 percent of organizations that measure responsiveness are able to respond to an incident in an hour or less, with the median response time being 1-2 hours. Service requests, conversely, don't follow a clear pattern, with those response rates showing a high degree of variability. For example, 15.9 percent of organizations respond in 8-24 hours, while 14.2 percent get back to the client in 30-60 minutes (median = 2-4 hours). In organizations that don't distinguish between incidents and service requests, 42.7 percent respond in an hour or less, 40.4 percent in 1-8 hours, and the remaining 13.8 percent in 8 hours or more (median = 1-2 hours). This variation is likely the result of the fact that the definition of a service request is broader than that of an incident. The type of activity a service request requires depends very much on the size and type of organization, whereas incidents are almost universally considered to have priority.

Efficiency

There are a number of metrics that measure the efficiency of the IT organization. Some relate directly to desktop support, while others focus on the flow of incidents in and out of the support center. Depending on the complexity of an issue, a support ticket can move between several support groups before being resolved to the client's satisfaction. In most organizations, the service desk owns the issue and determines who should be engaged to resolve the problem. Ideally, the incident should be resolved by the service desk, because that gets the customer back to work without delay and keeps costs down. Some organizations (34%) actually track the incidents that reach desktop support that should have been resolved by the service desk (level 1). For those that measure ticket types separately, 32 percent of respondents reported that more than 20 percent of their incidents could and should have been handled by the service desk. Likewise, 32 percent of respondents indicated that more than 20 percent of their service requests could and should have been level 1 issues.
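Average time to respond, as used above, is typically derived from two timestamps per ticket, with urgent or high-priority work excluded. This is only a sketch under that assumption; the field layout and data are made up.

```python
from datetime import datetime, timedelta

# Hypothetical fields: (opened, first_response, priority)
tickets = [
    (datetime(2013, 3, 1, 9, 0),  datetime(2013, 3, 1, 9, 40),  "normal"),
    (datetime(2013, 3, 1, 10, 0), datetime(2013, 3, 1, 12, 0),  "normal"),
    (datetime(2013, 3, 1, 11, 0), datetime(2013, 3, 1, 11, 5),  "urgent"),  # excluded
]

# The survey metric covers only "typical" tickets, so urgent ones are
# filtered out before averaging the wait between open and first response.
waits = [resp - opened for opened, resp, prio in tickets if prio == "normal"]
avg_response = sum(waits, timedelta()) / len(waits)
print(avg_response)  # average wait as a timedelta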
In organizations that don't distinguish between incidents and service requests, the number is higher: 37 percent report that more than 20 percent of their tickets should have been resolved before they reached desktop support. However, the median for all three groups is the same: 11-15 percent.

Average time to resolve desktop support tickets (from the time a ticket is received to the time it is resolved):

Time to resolve       Incidents    Service requests
Less than 1 hour      8.9%         6.8%
1-4 hours             20.7%        10.7%
4-8 hours             18.1%        11.5%
8-24 hours            19.8%        21.4%
1-2 days              15.2%        16.7%
3-5 days              8.4%         18.4%
More than 5 days      1.3%         8.1%
I don't know          7.6%         6.4%
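A distribution like the one above comes from bucketing each ticket's resolution duration into the survey's time ranges. A minimal sketch, with invented durations and the bucket boundaries taken from the range labels:

```python
from datetime import timedelta

# Hypothetical resolution durations for a handful of tickets
durations = [
    timedelta(minutes=45),
    timedelta(hours=3),
    timedelta(hours=12),
    timedelta(days=4),
]

# Buckets matching the survey's labels (upper bounds, ascending).
# Note the survey's own labels jump from "1-2 days" to "3-5 days";
# a 2.5-day ticket falls into "3-5 days" here.
buckets = [
    ("Less than 1 hour", timedelta(hours=1)),
    ("1-4 hours", timedelta(hours=4)),
    ("4-8 hours", timedelta(hours=8)),
    ("8-24 hours", timedelta(hours=24)),
    ("1-2 days", timedelta(days=2)),
    ("3-5 days", timedelta(days=5)),
    ("More than 5 days", timedelta.max),
]

def bucket(d):
    """Return the first bucket whose upper bound the duration fits under."""
    for label, upper in buckets:
        if d <= upper:
            return label

counts = {}
for d in durations:
    counts[bucket(d)] = counts.get(bucket(d), 0) + 1
print(counts)
```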

Once an issue reaches desktop support, it becomes a question of how quickly and efficiently the desktop support team can resolve the problem. Of the organizations surveyed, 49 percent measure the average time to resolve desktop support tickets. Predictably, incidents are handled faster than service requests, since resolving a problem is not the same as fulfilling a request. Almost half of the respondents (48%) indicate that incidents are generally resolved within one business day (8 hours). An additional 19.8 percent resolve incidents within 24 hours. The remaining respondents take 1-5 days or more. Service requests nearly always take longer, with 21.4 percent taking 8-24 hours to resolve, 16.7 percent taking 1-2 days, and 18.4 percent taking 3-5 days. Only 29 percent are resolved in less than 8 hours. Service requests also impact those teams that combine their tickets: 17.5 percent of such requests are resolved in 8-24 hours, while 18.3 percent take 1-2 days.

Like the service desk, the desktop support team may also track how often a ticket is resolved on the first attempt. On the service desk, this is usually called first call resolution (FCR). For second-level support, we swap "call" for "contact": first contact resolution. Regardless, the median for tickets resolved by the technician on the first attempt is 70-80 percent. There are always circumstances that require desktop support to engage other support teams, which obviously adds time to the duration of an incident. For this reason, some teams also track ticket escalation. Of the survey respondents, 39 percent say they measure this metric, and the median for escalations from desktop support to other levels is 11-15 percent.

However, there's a clear difference between the amount of time it takes to resolve an incident and the amount of time or effort an analyst or technician puts into the resolution of the incident.
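First contact resolution and escalation rate are both simple ratios over flagged tickets. A hypothetical sketch (the flag names are assumptions, not survey fields):

```python
# Hypothetical per-ticket flags recorded by the ticketing system
tickets = [
    {"resolved_first_attempt": True,  "escalated": False},
    {"resolved_first_attempt": True,  "escalated": False},
    {"resolved_first_attempt": False, "escalated": True},
    {"resolved_first_attempt": True,  "escalated": False},
]

n = len(tickets)
# Share resolved on the first attempt (first contact resolution)
fcr_rate = sum(t["resolved_first_attempt"] for t in tickets) / n
# Share handed off to another team or level
escalation_rate = sum(t["escalated"] for t in tickets) / n
print(f"FCR: {fcr_rate:.0%}, escalated: {escalation_rate:.0%}")
```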
Time to resolve is the duration of the ticket: how long it takes to reach a resolution from the time the ticket is opened to the time it's closed. Effort is the actual time the analyst/technician spends working on the problem. For example, if an issue requires a desktop computer to be reimaged, the entire reimage job may take several hours, but the actual effort put in by the analyst/technician may be far less (i.e., however long it takes to set up and kick off the job). While that job is running, the analyst/technician can work on other tasks. The effort measure represents only the time spent directly working on the issue.

What's surprising is how few organizations attempt to measure effort. There's significant value in this metric, because it shows how much time the analyst/technician is spending on core work. It also allows managers to gauge how much time and money specific types of tickets require. Of the organizations surveyed, only 25 percent reported measuring effort. For incidents, 69.5 percent of organizations are spending 2 hours or less on each ticket, compared to the duration metric of 8 hours. For service requests, 62 percent of organizations report that less than an hour of effort is typical. In mixed environments, a little more than 50 percent show effort of an hour or less.

Satisfaction

Support organizations exist to resolve the customers' technical issues. Most of the metrics noted above objectively show how well desktop support is able to handle its volume of work. What they don't speak to is the customer's perception of desktop support. That's where customer satisfaction metrics are valuable. Many organizations distribute customer surveys to get feedback on individual and process performance. Thirty-five percent of the organizations surveyed have some mechanism for capturing this information. The survey shows that 15.8 percent solicit feedback from a random sampling of customers, while almost 10 percent send out surveys for all tickets.
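The duration-versus-effort distinction can be made concrete with the reimage example: the ticket is open for hours, but only the logged work sessions count as effort. The timestamps and session data below are illustrative, not from the report.

```python
from datetime import datetime, timedelta

# Reimage example: ticket open 9:00-13:00, but hands-on time is only
# the work sessions the technician logs against it.
opened = datetime(2013, 3, 4, 9, 0)
closed = datetime(2013, 3, 4, 13, 0)
work_sessions = [timedelta(minutes=20), timedelta(minutes=10)]  # setup, then final check

duration = closed - opened                # time to resolve: the full 4 hours
effort = sum(work_sessions, timedelta())  # effort: 30 minutes of direct work
print(duration, effort)
```

Tracking both numbers is what lets a manager cost out a ticket type: the reimage ties up the machine for hours but the technician for only half an hour.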
The remaining respondents collect data on a regular schedule.

Average desktop support customer satisfaction rating:

Very satisfied: 65%
Satisfied: 31%
Neutral: 2%
Dissatisfied: 1.2%
Very dissatisfied: 2%
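A rating breakdown like the one above is a tally over raw survey responses. As a minimal sketch with made-up responses:

```python
from collections import Counter

# Hypothetical survey responses on the five-point scale
responses = [
    "very satisfied", "very satisfied", "satisfied", "very satisfied",
    "neutral", "satisfied", "dissatisfied", "very satisfied",
]

counts = Counter(responses)
total = len(responses)
pct = {rating: count / total for rating, count in counts.items()}

# The headline number: share of customers satisfied or better
satisfied_share = pct.get("very satisfied", 0) + pct.get("satisfied", 0)
print(f"satisfied or better: {satisfied_share:.0%}")
```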

By far, email is the most common method for distributing surveys (26.6%). Customer-facing websites are the next most common (13.1%), followed by phone, interoffice mail, postal mail, and other methods. The good news is that, across all industries, 96 percent of desktop support customers are either satisfied or very satisfied; just 3.2 percent of organizations have customers who are dissatisfied or worse.

Summary of desktop support metrics (medians):

Percentage of total tickets handled by desktop support:
  incidents* 25-30%; service requests** 41-45%
Average number of tickets resolved by one desktop support technician in a month:
  101-125
Average time to respond (includes only typical tickets, not urgent or high-priority tickets):
  incidents 1-2 hours; service requests 2-4 hours
Percentage of tickets sent to desktop support that could have been resolved by the support center (level 1):
  incidents 11-15%; service requests 11-15%
Average time to resolve a desktop support ticket (from the time a ticket is received to the time it is resolved):
  incidents 8-24 hours; service requests 8-24 hours
Percentage of tickets resolved by the technician on the first attempt:
  incidents 70-80%; service requests 70-80%
Percentage of tickets escalated to another department or level:
  11-15%
Average amount of dedicated work time (effort) a desktop support technician spends on a ticket:
  incidents 1-2 hours; service requests 1-2 hours

* Incidents: tickets for unplanned work required to fix something.
** Service requests: tickets where nothing is broken but a service is needed.

Conclusion

There's an old saying that you can't manage what you can't (or don't) measure. That's very true in the world of desktop support. Effective metrics help management focus its attention on what's important. They tell us where we're going, where we've been, and how to prepare for the future. They allow us to set realistic goals, and they tell us when we achieve those goals. A meaningful program of metrics and measures is well worth your time and energy!
If you're a desktop support professional, I encourage you to pick up a copy of the 2013 HDI Desktop Support Practices & Salary Report. It expands on the information in this brief and includes much, much more.

For all available HDI Research Briefs, visit www.thinkhdi.com/bepartofthecorner.

Copyright 2013 UBM LLC. All rights reserved.