Stanford / MIT Benchmarking IT Help Desk




Stanford / MIT Benchmarking IT Help Desk Final Presentation November 13, 2002

Agenda
Project Goals
Help Desk Benchmarking
» Goals & benchmark metrics
» Initial data comparisons
» Findings & hypotheses
» Quick wins and lessons
» Tools and timelines to get there
Benchmarking as a Methodology

Benchmarking Project Goals
Help Desk Specific
» Enable comparisons between institutions
» Develop ongoing management tool: determine metrics; develop initial comparisons & findings; identify tools needed and future plans
Benchmarking in Higher Education
» Enable comparisons among schools and industry
» Develop methodology
» Provide a test case
» Develop strategy to expand
See additional context in Appendix 1, Project History & Goals.

Benchmarks must tie to management goals
1. Support client needs with quality service
2. Be responsive
3. Be cost effective
4. Provide appropriate level of investment
5. Develop effective, mature processes
6. Maintain high-performing, competent team
7. Support rollout of new systems

Goals must tie to specific metrics
Invest Appropriately / Be Cost Effective
» % of budget
» Cost per case by topic
» Clients served per FTE
» Total costs by topic
» Cases by media, including self-help
Support Customer Needs with High Quality Service
» Annual customer survey
» Spot surveys on selected transactions
Be Responsive
» Elapsed time per case
» Call abandonment
» Hold time
» Time to answer
Support Rollout of New Systems
» Case volume by topic, 3 months before and after launch
» Minutes per case
Develop High Performing, Competent Teams
» Employee satisfaction survey
» Individual performance metrics
» Team performance metrics
» Training $ per FTE
» % Help Desk certification
» Case volume compared to staff skills mix
Develop Effective, Mature Processes
» Number of contacts vs. number of days to resolve
» Origin of Help Desk cases
See Appendix 6 for discussion of the specific data elements used to calculate benchmarks.
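As a rough illustration of how several of these metrics could be computed from a ticket-system export, here is a minimal Python sketch. The file name and column names (topic, media, cost, opened, closed) are assumptions for illustration, not fields from either institution's actual tracking system.

```python
# Illustrative only: computes a few of the per-goal metrics above from a
# hypothetical ticket export. Column names are assumptions, not either
# school's real schema.
import csv
from collections import defaultdict
from datetime import datetime

def load_tickets(path="tickets.csv"):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def benchmark_metrics(tickets, annual_budget, clients_served, ftes):
    cost_by_topic = defaultdict(lambda: [0.0, 0])    # topic -> [total cost, case count]
    cases_by_media = defaultdict(int)
    total_elapsed_days = 0.0

    for t in tickets:
        cost_by_topic[t["topic"]][0] += float(t["cost"])
        cost_by_topic[t["topic"]][1] += 1
        cases_by_media[t["media"]] += 1              # e.g. email, phone, web, self-help
        opened = datetime.fromisoformat(t["opened"])
        closed = datetime.fromisoformat(t["closed"])
        total_elapsed_days += (closed - opened).days

    return {
        "cost_per_case_by_topic": {k: total / n for k, (total, n) in cost_by_topic.items()},
        "cases_by_media": dict(cases_by_media),
        "avg_elapsed_days_per_case": total_elapsed_days / len(tickets),
        "clients_served_per_fte": clients_served / ftes,
        "budget_per_ticket": annual_budget / len(tickets),
    }
```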

Caveat: Best benchmarking is yet to come
Problems with historical data
» Assumptions
» Extrapolations
» Simply missing some data
» Similar but very different operations
Common metrics going forward
Don't focus too heavily on current data; look to the future!

Context for Comparison: Help Desk Operations
Organization
» MIT: Consolidated, plus unquantified support in academic departments
» Stanford: Distributed support model across ITSS
Structure
» MIT: Single unit; junior/senior mixed; informal 15-minute limit
» Stanford: Tier 1 & 2 (10-minute limit at Tier 1)
Location
» MIT: Single, separate location
» Stanford: Multiple, distributed
Offices
» MIT: Cubes plus call center
» Stanford: Individual offices
Staffing
» MIT: 2-4 hour blocks; many students
» Stanford: Full-time assignments
ACD
» MIT: 4 phone numbers
» Stanford: 1 published number, then phone tree
Tool
» MIT: Home-grown Java
» Stanford: Customized Remedy
Media
» MIT: Heavy email (50%+)
» Stanford: Heavy Web form
See more details in Appendix 3, How Each Help Desk Works.

Context for Comparison: FY02 Sizing

Demographics                         MIT           Stanford        Variance
Faculty & Staff                      9,230         10,792          17%
Students                             10,204        14,173          39%
Total Population                     19,434        24,965          28%
University Consolidated Budget       $1,535,949    $1,937,900      26%

IT Department Information
Annual Base Budget                   $x            $x (1)          83%
Full Time Staff (FTE)                270           430 (1)         59%

Help Desk Information                With Students No Students
Annual Base Budget                   $x            $x (2)          5%
Full Time Staff (FTE)                27.3          18.6 (2)        -32%
Tickets Processed                    43,553        56,125 (2)      29%

(1) Includes providing telecommunications for Stanford's hospital.
(2) Does not include the Student Help Desk, which has no tracking/ticketing system. The approximate increase with students would be +$275K and +5 or 6 FTEs.

Gauging investment and effectiveness

                                     MIT           Stanford        Variance
IT Dept Budget / University Budget   2.9%          4.2%
Help Desk Budget / IT Budget         4.2%          2.3%
Tickets / School Population          2.24          2.25 (1)        0%
Population per HD Employee           712           1,342 (1)       89%
Tickets / Help Desk FTE              1,595         3,017           89%
Help Desk Budget / Ticket            $41.83        $33.92          -19%

(1) This ratio's meaningfulness is affected because it does not include Student Help Desk numbers, due to no tracking/ticketing system.
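The ratios in this table are straightforward divisions of the sizing figures from the previous slide. A short sketch, using the published FY02 numbers (budget figures shown as $x in the deck are omitted), reproduces the ticket- and population-based ratios:

```python
# Recomputes the investment/effectiveness ratios above from the FY02 sizing data.
# Budgets marked $x in the deck are omitted, so only ticket- and population-based
# ratios are shown here.
sizing = {
    "MIT":      {"population": 19_434, "hd_ftes": 27.3, "tickets": 43_553},
    "Stanford": {"population": 24_965, "hd_ftes": 18.6, "tickets": 56_125},
}

for school, s in sizing.items():
    tickets_per_capita = s["tickets"] / s["population"]
    population_per_hd_fte = s["population"] / s["hd_ftes"]
    tickets_per_fte = s["tickets"] / s["hd_ftes"]
    print(f"{school}: {tickets_per_capita:.2f} tickets/person, "
          f"{population_per_hd_fte:,.0f} people per HD FTE, "
          f"{tickets_per_fte:,.0f} tickets per HD FTE")
```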

Goal: Be cost effective
[Chart: Cost per Ticket by Help Desk topic, MIT vs. Stanford. Topics: Accounts, Backup, Business Apps, Business Functions, Cluster, Connectivity, Courseware, Email, Hardware, OS Software, Other, Printing, Desktop Software, Security/Virus, Web.]
See supporting data in Appendix 9.

Goal: Be cost effective
[Chart: Total Annual Cost by Help Desk Topic, MIT vs. Stanford, across the same topics as the previous chart.]
See supporting data in Appendix 9.

Goal: Be cost effective
[Chart: MIT First Contact Help Desks: cases by topic (Accounts/IDs/Authorization, Backup, Business App Support, Business Functions, Cluster, Connectivity, Courseware, Email, Hardware, OS Software, Other, Printing, Productivity Software, Security/Virus, Web), stacked by estimated complexity of case (0-1, 1-4, 4-10).]
See supporting data in Appendix 9.

Goal: Be cost effective
[Chart: Stanford Help Desks: cases by topic (same categories as above), stacked by tier where resolved (Level 1, Level 2, Other).]
See supporting data in Appendix 9.

Goal: Support rollout of new systems
[Chart: "Impact: New Business Applications": total help tickets per month over the 12 months of FY02 (August through July), split into ITSS Help Desk Level 1, ITSS Help Desk Level 2, legacy applications, and rollout of new applications. Rollout events annotated: Kronos/HR PeopleSoft (January), Axess problems (March), HR salary setting (May).]
Some rollout application tickets are included in Help Desk Levels 1 & 2. See supporting data in Appendix 9 (Stanford data only).

Preliminary data offers initial observations
Implementation choices affect Help Desk costs
» MIT: email; connectivity to desktop; security, Kerberos
» Stanford: accounts, authentications; business apps
Time is money. Topics are expensive when they are complex, must escalate, or relate to unique applications.
Specialists are required more frequently for unique, proprietary issues.
System rollouts create overall spikes, and some dips in specific areas.

Initial observations
Student employees
» The MIT Help Desk employs more students at a lower overall budget
» More FTEs, but it is difficult now to gauge the overall effectiveness of using students
Structured tiers
» Using structured tiers may support a greater number of cases
» Resolving at Tier 1 significantly reduces costs
» You can either tier the work through process choices, or tier the staff to handle only certain types of work
Media of case submission may affect costs
» Web submission may support a greater number of cases per FTE

Goal: Be responsive

Responsiveness to Phone Calls            Stanford (Tier 1 only, Jan-Aug 02)   MIT (all desks, Jul 01-Jun 02)   HDI industry comparison
Average Speed to Answer (seconds)        51                                   45                               59.5
Abandon Rate                             17%                                  13%                              4%
Time before Caller Abandons (seconds)    68                                   99
Average Call Length                      252 sec (4.2 min)                    415 sec (6.9 min)                5 min
Average Number of Calls Monthly          4,032                                2,039
Time between Calls (for a staffer): 4.9 min.

Definitions:
» Speed to Answer: time a call waited before being answered by a person.
» Abandon Rate: percentage of calls where the customer hung up, including callers that hung up after reaching voice mail.
» Call Length: time from when a call is answered to when it is released.
» Time between Calls: time between calls that an ACD agent handles; the agent closes out the ticket before getting back in the queue to receive calls.
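For reference, here is a minimal sketch of how figures like these could be derived from raw ACD call records. The simplified record layout (queue wait, talk time, abandoned flag) is an assumption; real ACD exports vary by vendor, and the sample values are made up.

```python
# Sketch of deriving phone responsiveness metrics from ACD call records.
# Record layout is a simplifying assumption, not either school's ACD format.
from dataclasses import dataclass
from statistics import mean

@dataclass
class CallRecord:
    wait_s: float      # seconds in queue before answer or hang-up
    talk_s: float      # seconds from answer to release (0 if abandoned)
    abandoned: bool    # caller hung up (including after reaching voice mail)

def responsiveness(calls: list[CallRecord]) -> dict:
    answered = [c for c in calls if not c.abandoned]
    abandoned = [c for c in calls if c.abandoned]
    return {
        "avg_speed_to_answer_s": mean(c.wait_s for c in answered),
        "abandon_rate": len(abandoned) / len(calls),
        "avg_time_before_abandon_s": mean(c.wait_s for c in abandoned) if abandoned else 0.0,
        "avg_call_length_s": mean(c.talk_s for c in answered),
    }

# Example: two answered calls and one abandoned call (toy data).
sample = [CallRecord(45, 300, False), CallRecord(60, 500, False), CallRecord(90, 0, True)]
print(responsiveness(sample))
```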

Goal: Support customers with high quality service
Customer satisfaction appears comparable across institutions.
[Chart: MIT Computing Help Desk (all levels) and Stanford Help Desk (Level 1) "snapshot" surveys after recently closed cases, annualized averages on a 5-point Likert scale. Dimensions: Timeliness of Response, Quality of Resolution, Courtesy and Professionalism, Technical Competence, Overall Satisfaction. Series: MIT 2001, SU 2002.]
Improvement efforts do bear fruit, as shown in MIT's two annual survey results.
[Chart: MIT Computing Help Desk (all levels), IS Annual Customer Satisfaction Surveys, 2000 vs. 2002, 5-point Likert scale. Dimensions: Timeliness of Response, Quality of Resolution, Courtesy and Professionalism, Technical Competence, Overall Satisfaction, Ability to Get Through, Turnaround Time.]
See supporting data in Appendix 5.

Goal: Develop effective, mature processes
The vast majority of cases are resolved quickly.
[Chart: Help Desk overall time to close cases. Share of cases by days to close: 1 day, 61.53%; 3 days, 8.46%; 7 days, 9.23%; 14 days, 10.95%; 21 days, 4.09%; 31 days, 3.40%; more, 2.35%; with cumulative percentage overlaid.]
[Chart: days to close vs. number of interactions per case; long-running outliers flagged as "case neglected or customer unresponsive"; fewer days is the desirable direction.]
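A small sketch of how a distribution like the one above could be produced from per-case days-to-close values, using the same buckets (1, 3, 7, 14, 21, 31, more); the input list here is made-up example data.

```python
# Bucket each case by days to close and report the share and cumulative share
# per bucket, matching the histogram buckets on the slide.
from itertools import accumulate

BUCKETS = [1, 3, 7, 14, 21, 31]

def close_time_distribution(days_to_close: list[int]) -> list[tuple[str, float, float]]:
    counts = [0] * (len(BUCKETS) + 1)          # one extra slot for "More"
    for d in days_to_close:
        for i, upper in enumerate(BUCKETS):
            if d <= upper:
                counts[i] += 1
                break
        else:
            counts[-1] += 1
    total = len(days_to_close)
    shares = [c / total for c in counts]
    labels = [str(b) for b in BUCKETS] + ["More"]
    return list(zip(labels, shares, accumulate(shares)))

# Example: mostly same-day closes, a few long-running cases (toy data).
print(close_time_distribution([0, 0, 1, 2, 5, 10, 20, 45]))
```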

Goal: Maintain high-performing, competent team
Individual performance can vary greatly; must consider the Hawthorne effect.
[Chart: Stanford IT Help Desk (Level 2), hours logged in FY02 by individual (managers and Staff Members A through I, with part-time percentages noted for some), ranging from roughly 130 to 1,086 hours.]

Goal: Maintain high-performing, competent team
More team- or employee-related metrics are desirable but need more high-level discussion:
» Employee satisfaction survey
» Customer satisfaction tied to individual service providers
» Individual performance metrics (MIT already tracks by team; Stanford tracks by individual)
» Help Desk certifications
» Behavioral and/or technical competencies

Initial performance data also yields some observations
Customer satisfaction
» Appears comparable across both institutions
» Improvement efforts did increase satisfaction over two years (MIT)
Process effectiveness
» Better categorization and identification of the work may help with faster escalation: get to the right place faster
» The vast majority of cases are resolved quickly
Employee performance
» Can vary significantly among individuals
» Metrics do affect behavior (Stanford)

The data also raise good questions
Processes
» Which cases should be escalated more quickly?
» Should you tier the work or the organization?
» How does web submission affect cost?
Staffing
» Is student employment effective?
» What additional training should Tier 1 receive?
» How should each institution use employee performance data?
Support-intensive systems
» Should support-intensive systems be replaced?
» How can we help with new system design to minimize Help Desk requirements?
Investments to improve efficiencies
» Which tools should we acquire to improve performance?

Quick Wins for Implementation

Tracking                                                                MIT         Stanford
Track tickets at student-staffed Unix desk                              In Place    Quick Win
Track internal hand-offs or tiers/escalations explicitly                Quick Win   In Place
Standardize work reporting categories                                   Quick Win   Quick Win
Track type of media for each case                                       In Place    Quick Win
Consolidate reporting functions into one ticket system                  Quick Win   In Place
Examine excess ticket counts in specific categories                     Quick Win   Quick Win

Customer feedback
Initiate or increase transaction-based spot surveys                     Quick Win   Quick Win

Proactively use data
Generate & review weekly metric reports                                 Quick Win   Quick Win
Generate weekly standards for tickets processed or time spent and use
as part of individual performance management                            Not quick   Quick Win

Reconfigure space
Reconfigure space and move staff to allow for more efficiencies
and collaboration                                                       In Place    Quick Win
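For the "generate weekly standards" item, a minimal sketch of a weekly per-employee report is shown below; the ticket fields and the target of 60 tickets per week are illustrative assumptions, not figures from either help desk.

```python
# Illustrative weekly report: count tickets closed per staff member in a week
# and flag the deviation from an assumed target. Fields and target are
# assumptions for illustration only.
from collections import Counter
from datetime import date

WEEKLY_TICKET_TARGET = 60   # assumed standard, not a number from the deck

def weekly_report(tickets, week_start: date, week_end: date) -> str:
    closed = [t for t in tickets if week_start <= t["closed_on"] <= week_end]
    per_staff = Counter(t["owner"] for t in closed)
    lines = []
    for owner, n in per_staff.most_common():
        delta = n - WEEKLY_TICKET_TARGET
        lines.append(f"{owner}: {n} tickets ({delta:+d} vs. target)")
    return "\n".join(lines)

# Example usage with two hand-made tickets.
sample = [
    {"owner": "staff_a", "closed_on": date(2002, 11, 5)},
    {"owner": "staff_b", "closed_on": date(2002, 11, 6)},
]
print(weekly_report(sample, date(2002, 11, 4), date(2002, 11, 10)))
```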

of Help Desk Benchmarking
Software or Hardware Investments
» Scope Remedy changes needed for benchmarking; engage consultant
» Casetracker, to allow a consultant to track touch minutes per case and escalations (tiers) both within and outside the Help Desk
» Knowledge Management system: pilot use to help Help Desk staff retrieve standard answers; evaluate usefulness of client use (self-help)
» ACD call board to display calls waiting in queue
» Create dashboard reports and a process for regular reporting
» Self-serve password reset tools
Customer feedback
» Collaborate on annual customer survey
» Define process for using customer survey responses
Management
» Create cross-functional ITSS team for Delphi rollout
» Institute regular review of metrics with finance
» Create Help Desk Standard Operating Procedures & Handbook
High Performing Team
» Solicit employee feedback for process and job improvement
» Track % of Help Desk certifications and training $ per employee
[Each item was marked for MIT and Stanford as In Place or Long Term.]

Cost to Implement Metrics (next 6 months, MIT and Stanford)
Software or Hardware Investments
» Remedy consultant to program changes
» Casetracker consultant
» Knowledge Management system: pilot use to help Help Desk staff; evaluate use of client self-serve
» ACD call board to display queued calls
» Self-serve password reset tools
» Joint customer satisfaction survey
» Creation of Standard Operating Procedures (self)
» Creation of dashboard
Estimated costs as listed across the two institutions: $75K, $60K, $300K, $300K, $15K, $60K, $10K, $10K, $32K, $15K.

Stanford: Help Desk Dashboard (monthly)
[Sample dashboard panels: Cases by Media (email, calls, walk-in, self-help) by month; Customer Satisfaction from spot ticket surveys, to be developed, covering the IT Help Desk, Tech Support, and Student desks (courtesy, technical knowledge, overall satisfaction); Level 1 Help Desk number of tickets created by employee, 2002; call length (Tier 2); phone-in statistics (abandon rate, time to answer, hold time), actual vs. goal; problems by cause at Tier 1 for the month, current vs. previous; percent problem resolution by tier (Tier 1, Tier 2, Tier 3) across categories (Account/ID/Authority, Business Apps, Cluster, Security/Virus, Connectivity, Data Backup, Email, Other, Hardware, Print, OS Software, Personal Software, Web, Telecom).]

MIT: Help Desk Dashboard (monthly)
[Sample dashboard panels: MIT First Contact Help Desks cases by category and complexity (Simple/Tier 1, Specialist/Tier 2, Referral/Tier 3); Cases by Method, by month (Messages: email, web; Interactive: voice, chat; In-Person: walk-in, site visit; Internal: transfers, referrals); Customer Satisfaction from spot ticket surveys, to be developed (courtesy, technical knowledge, overall satisfaction) for the IT Help Desk, Tech Support, and Student desks; Interactive Communications (phone, chat): call length for specialist and simple calls, phone time to answer, hold time, answered %, actual vs. goal; Message Communications (email, web requests): time to first response, time between exchanges and number of exchanges for specialist and simple cases, answered %, actual vs. goal; Time to Close Cases distribution (as on the earlier process slide).]
Notes on the phone panel: 1) the MIT ACD currently does not collect hold time; 2) times are in seconds; 3) MIT currently cannot distinguish specialist from simple calls, so the actual shown is an average.
Notes: This is a sample dashboard showing metrics to be delivered on a monthly basis, pending some tool improvements. 1. The cases-by-complexity graph shows real ticket counts and categorizations; complexity is currently estimated based on time to resolve. 2. We do not currently have complete Help Desk interactive communications data, such as time to first response; this graph is a placeholder. 3. Cases by Method shows additional data; the graph breaks out email and web requests. Ultimately we want to consolidate this data into the categories shown in the graph's legend.
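As a complement to the two dashboard mock-ups, here is a minimal sketch of how a monthly dashboard summary could be assembled programmatically; the field names, goal values, and toy data are assumptions for illustration, not either institution's reporting pipeline.

```python
# Minimal sketch of assembling a monthly dashboard summary: cases by media and
# phone statistics compared against goals. All names and goals are placeholders.
from collections import Counter

PHONE_GOALS = {"avg_speed_to_answer_s": 60, "abandon_rate": 0.05}  # assumed goals

def monthly_dashboard(tickets, phone_stats):
    cases_by_media = Counter(t["media"] for t in tickets)
    phone_vs_goal = {
        metric: {"actual": phone_stats[metric], "goal": goal}
        for metric, goal in PHONE_GOALS.items()
    }
    return {
        "total_cases": len(tickets),
        "cases_by_media": dict(cases_by_media),
        "phone_vs_goal": phone_vs_goal,
    }

# Example usage with toy data.
tickets = [{"media": "email"}, {"media": "phone"}, {"media": "web"}, {"media": "email"}]
phone = {"avg_speed_to_answer_s": 51, "abandon_rate": 0.17}
print(monthly_dashboard(tickets, phone))
```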

Next Steps for MIT/Stanford Help Desk Benchmarking
[Timeline spanning December through September:]
» Implement selected metrics (dashboard) and quick wins
» Track common metrics; re-evaluate
» Begin software modifications
» Implement Knowledge Management system and ACD call board
» Implement customer spot surveys; then annual customer survey (?)
» Implement management and operations changes
» Consider inviting others to benchmark
Some items are marked Ongoing or Annual; on-site visits are indicated on the timeline.