SESSION 207 Wednesday, March 25, 11:30 AM - 12:30 PM Track: Desktop Support


Performance Metrics for Desktop Support
Mike Hanson, Senior IT Manager, Optum, Inc. (mike@middlemgr.com)

Session Description
There are many challenges to measuring the performance of desktop support, both at the team and individual levels. While the approach may vary based on unique factors like industry or scope, there are some basic measures that can be applied to any desktop support team. This session will look at common measurements within second-level support and present an effective approach to capturing relevant effort, efficiency, and quality measures for your desktop support analysts and the team as a whole. (Fundamental)

Speaker Background
Over the past twenty-five years, Mike Hanson has been involved with many aspects of IT, ranging from application development to desktop support. Today, Mike is a senior IT manager at Optum, Inc., where he proactively seeks out ways to improve service delivery for more than 125,000 clients. He is a certified HDI Support Center Manager, ITIL Foundation and Practitioner certified, and former chair of the HDI Desktop Support Advisory Board.

Performance Metrics for Desktop Support
Michael Hanson
Session 207

What We'll Cover
- Most common desktop support metrics
- Customer-centric measures
- Three critical measures and how to use them

Why Performance Metrics?
- Tells us how we're performing against business goals
- Gives near-real-time feedback on team and individual performance
- Helps us refine and improve how well we provide services to our clients/customers

Video: from Roxanne, Columbia Pictures, 1987

Times Have Changed
- Just a few years ago, desktop support metrics had no standards
- Hard to categorize
- Not consistently measurable like the help desk
- That is no longer the case: there are now companies that specialize in metrics

What Is Desktop Support?
For this discussion, desktop support refers to the IT organization that is responsible for responding to incidents, questions, and service requests involving desktop hardware, software, and operating systems.
- Incidents (break/fix)
- Service requests (scheduled work)
- 48% of organizations track these types of requests separately

How Do We Receive Work?

Common Metrics: Volume
- Number of incidents and/or service requests over a given period
- Shows how much work is coming into DTS and allows appropriate staffing
- Trending volumes can identify peak periods (see the sketch after this list)
- Individual volumes are performance indicators
- Helps managers understand what analysts can accomplish
- 60% of organizations measure individual volumes
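To make "trending" concrete, here is a minimal Python sketch (not from the session) that rolls ticket volume up by week; the file name and the opened_at column are assumptions about how a ticketing tool might export data.

```python
# Illustrative only: trend incident/request volume by week to spot peak periods.
# Assumes a hypothetical CSV export "tickets.csv" with an "opened_at" timestamp column.
import pandas as pd

tickets = pd.read_csv("tickets.csv", parse_dates=["opened_at"])

# Count tickets per week; a rising or spiking count flags a peak staffing period.
weekly_volume = (
    tickets.set_index("opened_at")
           .resample("W")
           .size()
           .rename("tickets")
)

print(weekly_volume.tail(12))  # recent trend
print("Peak week:", weekly_volume.idxmax().date(), "with", weekly_volume.max(), "tickets")
```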

Common Metrics: Responsiveness
- Tells management how fast the DTS team is getting to a normal incident or service request
- Also illuminates how well the team meets customer expectations
- 53.6% respond to an incident in an hour or less
- Service requests are not as clear: 27.8% in less than 60 minutes, 18.5% in 8-24 hours, 21.7% in more than 24 hours
- The variation is probably due to how a service request is defined

Common Metrics: Efficiency
- Directly impacts the customer
- How many hand-offs? 16-20%
- Should an incident be resolved at the service desk? Doing so gets the customer back to work faster and is less costly to the business
- 34% track incidents that went to DTS but should have been resolved at the service desk

Common Metrics: Efficiency (continued)
- Once an incident reaches DTS, how quickly is it resolved?
- Half of all organizations measure average time to resolve
- 59.8% state incidents are usually resolved within 8 hours
- Service requests take longer: 14.9% take 8-24 hours, 18.5% take 1-2 days, 20.9% take 3-5 days
- FCR (first contact resolution): median 70-80%
- Escalations (incidents requiring assistance outside of DTS): 39% track escalations, with a median of 6-10% (see the sketch after this list)
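The session quotes medians rather than formulas, so the sketch below is only one reasonable way to compute FCR and escalation rates; the field names resolved_on_first_contact and escalated_outside_dts are hypothetical.

```python
# Illustrative only: FCR and escalation rates for a set of DTS incidents.
# Field names are hypothetical; map them to whatever your ticketing tool records.

def fcr_rate(tickets):
    """Percentage of incidents resolved on the first contact."""
    resolved_first = sum(1 for t in tickets if t["resolved_on_first_contact"])
    return 100.0 * resolved_first / len(tickets)

def escalation_rate(tickets):
    """Percentage of incidents that required assistance from outside DTS."""
    escalated = sum(1 for t in tickets if t["escalated_outside_dts"])
    return 100.0 * escalated / len(tickets)

incidents = [
    {"resolved_on_first_contact": True,  "escalated_outside_dts": False},
    {"resolved_on_first_contact": True,  "escalated_outside_dts": False},
    {"resolved_on_first_contact": False, "escalated_outside_dts": True},
    {"resolved_on_first_contact": True,  "escalated_outside_dts": False},
]
print(f"FCR: {fcr_rate(incidents):.1f}%, Escalations: {escalation_rate(incidents):.1f}%")
```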

Common Metrics: Effort
- Actual time the analyst puts into solving the problem
- Surprisingly, only 25% measure this
- 61.2% spend 2 hours or less per incident
- 59.3% of service requests take 1 hour or less
- Why this is important: it gauges how much time is spent on core work and indicates how much time specific types of incidents require

Common Metrics: Satisfaction
- Customer's perception of DTS performance
- 35% capture customer satisfaction in some way
- 91% Satisfied/Very Satisfied

IS IT TIME TO THINK DIFFERENT?

What Are These Metrics Telling Us?
- How much work happened?
- How long did it take?
- How much did it cost?
- How happy are our customers?
- These are quantitative metrics

Becoming Business-Centric
- How do we measure business impact?
- CSAT (Customer Satisfaction)
- Net Promoter Score (NPS)
- Customer Effort Score (CES)
- These measure what is important to the customer

CSAT
- Measures the degree to which a service meets the customer's expectations
- Gathered by surveying customers
- Responses are averaged for a composite CSAT score
- Generally expressed as a percentage (0-100%)
- Can correlate with changes in training or procedures

Net Promoter Score (NPS)
- Measures customer loyalty
- "How likely is it that you'd recommend this service to a friend?" on a 0-10 scale
- Usually allows for comment ("Why do you feel this way?")
- Motivates and focuses an organization on improvement
(A short scoring sketch for both follows below.)
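A hedged sketch of how these two scores might be computed: CSAT follows the averaging approach described above (average rating expressed on a 0-100% scale), while NPS uses the standard promoter-minus-detractor calculation, which the slide does not spell out. The 1-5 rating scale is an assumption.

```python
# Illustrative only: scoring CSAT and NPS survey responses.

def csat_percent(ratings, scale_max=5):
    """Average of survey ratings (assumed 1-5 scale) expressed as a 0-100% score."""
    return 100.0 * (sum(ratings) / len(ratings)) / scale_max

def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6), range -100 to +100."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

print(f"CSAT: {csat_percent([5, 4, 5, 3, 4]):.0f}%")   # -> 84%
print(f"NPS:  {nps([10, 9, 8, 6, 10, 7]):+.0f}")       # -> +33
```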

Customer Effort Score (CES)
- How hard is it to be our customer?
- How much work do we ask them to do to solve a problem?
- High effort equals low customer loyalty
- "How much effort did you have to personally put into this request?"

Customer Effort Score: "Stop Trying to Delight Your Customers"
- Customers almost always punish poor service
- They rarely reward excellent service
- Loyalty is more often tied to how well we provide basic services

Video

MEASURING INDIVIDUAL PERFORMANCE

Evaluating Staff
- Can be a real challenge for technical managers
- Managers often fall into the trap of managing by spreadsheet
- Do you know how your employees feel about how their performance is evaluated?

Environment
- How often do you talk to your employees?
- If you manage supervisors or other managers, how often do you have skip-level discussions?
- Keep those conversations private, confidential, and safe
- Diverse: the business has a national or international presence

Carrot & Stick Method
- Good performance = rewarded with a carrot
- Bad performance = whacked with a stick
- REACTIVE

Simple Method: Three Key Performance Indicators
- Effort: the amount of time the employee takes to actually perform the core responsibilities of their job
- Efficiency: how much work the employee is able to produce in a measured time frame
- Quality: whether the employee performed the task correctly the first time and the customer is satisfied

Effort
- Need a means to capture effort, preferably in the incident management system
- Requires a cultural change
- Should account for non-core activity

Efficiency
- Similar to a volume measure, but the degree of time and effort per analyst can vary: two analysts with very different ticket counts can both be equally efficient
- A raw volume measure allows for cherry-picking of incidents

Quality
- Many ways to determine quality
- Surveys are most common
- Rework: how often does the customer reopen a ticket for the same issue because it was not fixed correctly the first time? (see the sketch after this list)
- FCR: first contact resolution

Calculating the KPIs
- Effort
- Efficiency
- Quality: based on the percentage of success
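The rework question above translates directly into a rate. A minimal sketch, assuming a per-ticket "reopened" flag (a hypothetical field):

```python
# Illustrative only: rework rate as a quality signal.
# "reopened" is assumed to mean the customer reopened the ticket for the same issue.

def rework_rate(resolved_tickets):
    """Percentage of resolved tickets that were reopened because the fix did not hold."""
    reopened = sum(1 for t in resolved_tickets if t["reopened"])
    return 100.0 * reopened / len(resolved_tickets)

closed = [{"reopened": False}, {"reopened": True}, {"reopened": False}, {"reopened": False}]
print(f"Rework rate: {rework_rate(closed):.1f}%")  # -> 25.0%
```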

Using the Three KPIs
- Put together a utilization report
- Regularly scheduled, likely monthly
- The objective is to provide an at-a-glance score for individual analysts
- Compares individual scores with historical data
- Baselines for expected productivity
- Comparison with overall team performance
- Baseline: a percentage based on the number of hours available for core work
- Team: based on an average of the overall team scores
(A minimal report sketch follows below.)
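The session does not give exact formulas for this report, so the sketch below is one possible interpretation: utilization as core-work hours logged over hours available, quality as the percentage of tickets that needed no rework, and the team figure as the average of the individual scores. All names and numbers are illustrative.

```python
# Illustrative only: a monthly at-a-glance utilization report for DTS analysts.
from dataclasses import dataclass

@dataclass
class AnalystMonth:
    name: str
    core_hours_logged: float   # effort captured in the incident management system
    hours_available: float     # scheduled hours minus meetings, training, etc.
    tickets_closed: int        # efficiency input
    tickets_reworked: int      # quality input (reopened for the same issue)

def utilization(a: AnalystMonth) -> float:
    return 100.0 * a.core_hours_logged / a.hours_available

def quality(a: AnalystMonth) -> float:
    return 100.0 * (a.tickets_closed - a.tickets_reworked) / a.tickets_closed

team = [
    AnalystMonth("Analyst A", 112, 140, 160, 6),
    AnalystMonth("Analyst B",  98, 140, 120, 2),
]

team_avg_util = sum(utilization(a) for a in team) / len(team)
for a in team:
    print(f"{a.name}: utilization {utilization(a):.0f}% "
          f"(team avg {team_avg_util:.0f}%), quality {quality(a):.0f}%")
```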

Final Words
- We've looked at a lot of numbers
- LEAD, don't simply manage
- Communicate regularly; one-on-one meetings should be mandatory
- Be prepared! Document!
- Translate business goals into goals IT can understand
- True performance management is not only about numbers, it's about relationships

Sources
- 2013 HDI Desktop Support Salary & Practices Report
- 2014 HDI Desktop Support Salary & Practices Report
- HDI research paper: Metrics for the New World of Support. Roy Atkinson.
- HDI research paper: Desktop Support Metrics. Mike Hanson.
- Harvard Business Review: Stop Trying to Delight Your Customers. Matthew Dixon, Karen Freeman, Nicholas Toman. July 2010.
- HDI Focus on Performance: It's Not Just the Numbers: Performance Management for Support Teams. Michael Hanson. 2011.

Thank you for attending this session.
Session 207: Performance Metrics for Desktop Support
Michael Hanson
Don't forget to complete an evaluation form!