Company XYZ Peer Group Desktop Support Benchmark Company XYZ

Report Contents
Project Overview and Objectives: Page 2
Industry Background: Page 37
Performance Benchmarking Summary: Page 51
Best Practices Process Assessment: Page 78
Interview Themes and Quotes: Page 92
Conclusions and Recommendations: Page 102
Detailed Benchmarking Comparisons: Page 134
Price Metrics: Page 135
Productivity Metrics: Page 142
Service Level Metrics: Page 155
Quality Metrics: Page 164
Technician Metrics: Page 171
Ticket Handling Metrics: Page 184
Workload Metrics: Page 191
About MetricNet: Page 200
1

Project Overview and Objectives Company XYZ 2

Project Objectives Review and assess the performance of Company XYZ's desktop support function Benchmark the performance of Company XYZ against a peer group of comparable desktop support organizations Recommend strategies to optimize performance Achieve world-class levels of performance Maximize customer satisfaction 3

Project Approach Module 1: Company XYZ Baselining / Data Collection Module 2: Benchmarking and Gap Analysis Module 3: Balanced Scorecard Module 4: Best Practices Process Assessment Module 5: Strategies for Optimized Performance Module 6: Report Development and Presentation of Results 4

Module 1: Company XYZ Baselining/Data Collection Core Topics Project Kickoff Data Collection Interviews 5

Project Kickoff Meeting Company XYZ Key Objectives: Project Kickoff Meeting Introduce the MetricNet and Company XYZ project teams Discuss the project schedule Distribute the data collection document Answer questions about the project 6

Data Collection 7

Interviews Company XYZ Interviews Technicians, team leads, supervisors QA/QC, Workforce schedulers, trainers 8

Module 2: Benchmarking and Gap Analysis Core Topics Peer Group Selection Benchmarking Comparison Gap Analysis 9

Benchmarking Peer Group Selection Scope Scale Complexity IDEAL PEER GROUP Geography Read MetricNet's whitepaper on Benchmarking Peer Group Selection. Go to www.metricnet.com to get your copy! 10

Dynamic Peer Group Selection
Scope: Scope refers to the services offered by Desktop Support. The broader the scope of services, the broader the skill set required by the technicians. As scope increases, so too does the cost of providing support. Desktop Support groups selected for benchmarking comparison must be comparable in the scope of services offered.
Scale: Scale refers to the number of tickets handled by Desktop Support. Virtually everything in Desktop Support is subject to scale economies. This is particularly true when it comes to the volume of tickets handled. The approximate scale effect for volume is 5%. What this means is that every time the number of transactions doubles, you should expect to see the cost per ticket decline by 5%. For this reason, it is important to select benchmarking peer groups that are similar in scale.
Complexity: The complexity of transactions handled will influence the handle time, and hence the cost per ticket. For example, a system reboot is a simple transaction that takes very little time, and costs very little to resolve. By contrast, a complete system reimaging takes much longer and costs much more to resolve. MetricNet uses a proprietary algorithm to determine a weighted complexity index based upon the mix of tickets handled by Desktop Support. The companies chosen for a benchmarking peer group will have similar complexity factors.
Geography: The main factor that is affected by geography is cost, specifically labor cost. Since labor accounts for 65% of desktop support operating expense, it is important to benchmark desktop support groups that have a common geography. Even within a particular geography, wage rates can differ significantly, so MetricNet makes adjustments to ensure that each Desktop Support organization in a benchmarking peer group is normalized to the same wage rate. 11
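
The scale effect and the wage-rate normalization described above reduce to simple arithmetic. Below is a minimal Python sketch of both adjustments; the function names, reference volume, and example figures are illustrative assumptions rather than MetricNet's proprietary method. Only the 5% scale effect and the 65% labor share come from the text.

```python
import math

def scale_adjusted_cost(base_cost: float, base_volume: float,
                        new_volume: float, scale_effect: float = 0.05) -> float:
    """Expected cost per ticket after scale economies: each doubling of
    ticket volume lowers cost per ticket by `scale_effect` (5% above)."""
    doublings = math.log2(new_volume / base_volume)
    return base_cost * (1.0 - scale_effect) ** doublings

def wage_normalized_cost(cost: float, local_wage: float,
                         reference_wage: float, labor_share: float = 0.65) -> float:
    """Restate cost per ticket at a common wage rate. Labor is ~65% of
    desktop support operating expense, so only that share is rescaled."""
    labor = cost * labor_share
    return labor * (reference_wage / local_wage) + (cost - labor)

# Illustrative figures (assumptions, not benchmark data):
print(scale_adjusted_cost(100.0, 1_000, 4_000))  # two doublings -> 90.25
print(wage_normalized_cost(100.0, 30.0, 25.0))   # -> ~89.17
```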

Desktop Support Benchmark: Key Questions Answered Key Questions How is your Desktop Support group performing? How does your Desktop Support group compare to other comparable support groups? What are the strengths and weaknesses of your desktop support group? What are the areas of improvement for your Desktop Support? How can you enhance Desktop Support performance and achieve world-class status? MetricNet's Benchmarking Database Desktop Support Benchmark Gap Analysis Improvement Recommendations Realized Performance Gains Company XYZ Peer Group Desktop Support Data 12

The Benchmarking Methodology Company XYZ Peer Group Desktop Support Performance COMPARE Performance of Benchmarking Peer Group Determine How Best in Class Achieve Superiority Adopt Selected Practices of Best in Class Build a Sustainable Competitive Advantage The ultimate objective of benchmarking Read MetricNet's whitepaper on Desktop Support Benchmarking. Go to www.metricnet.com to download your copy! 13

Summary of Included Desktop Support Metrics
Price: Price per Ticket, Price per Incident, Price per Service Request
Productivity: Tickets per Technician per Month, Incidents per Technician per Month, Service Requests per Technician per Month, Technicians as a Percent of Total FTEs, Technician Utilization
Service Level: Mean Time to Resolve Incidents (hrs), % of Incidents Resolved in 24 Hours, Mean Time to Fulfill Service Requests (Days), % of Service Requests Fulfilled in 72 Hours
Quality: Customer Satisfaction, Incident First Visit Resolution Rate, % Resolved Level 1 Capable
Technician: Annual Technician Turnover, Daily Technician Absenteeism, New Technician Training Hours, Annual Technician Training Hours, Technician Tenure (months), Technician Job Satisfaction
Ticket Handling: Average Incident Work Time (minutes), Average Service Request Work Time (minutes), Estimated Travel Time per Ticket (minutes)
Workload: Tickets per Seat per Month, Incidents per Seat per Month, Service Requests per Seat per Month, Incidents as a % of Total Ticket Volume
14

Benchmarking KPI Performance Summary

Key Performance Indicator (KPI) | Company XYZ | Peer Average | Min | Median | Max
Price:
Price per Ticket | $79.98 | $124.74 | $50.49 | $99.69 | $325.06
Price per Incident | $62.13 | $109.24 | $39.38 | $96.35 | $301.55
Price per Service Request | $82.84 | $161.32 | $51.71 | $134.55 | $446.50
Productivity:
Tickets per Technician per Month | 138.4 | 99.1 | 32.3 | 93.5 | 183.9
Incidents per Technician per Month | 19.2 | 69.8 | 19.2 | 56.9 | 144.8
Service Requests per Technician per Month | 119.3 | 29.2 | 7.1 | 17.8 | 119.3
Technicians as a Percent of Total FTEs | 87.5% | 88.9% | 79.6% | 88.1% | 97.1%
Technician Utilization | 82.9% | 58.3% | 47.1% | 55.5% | 82.9%
Service Level:
Mean Time to Resolve Incidents (hrs) | 77.29 | 11.79 | 3.00 | 6.30 | 77.29
% of Incidents Resolved in 24 Hours | 30.1% | 69.9% | 30.1% | 75.3% | 86.7%
Mean Time to Fulfill Service Requests (Days) | 6.40 | 3.04 | 1.20 | 3.10 | 6.40
% of Service Requests Fulfilled in 72 Hours | 42.0% | 56.7% | 33.5% | 51.8% | 80.7%
Quality:
Customer Satisfaction | 92.0% | 84.9% | 66.5% | 86.6% | 98.5%
Incident First Visit Resolution Rate | N/A | 84.5% | 67.0% | 85.3% | 95.1%
% Resolved Level 1 Capable | N/A | 18.8% | 6.9% | 19.1% | 29.0%
Technician:
Annual Technician Turnover | 62.3% | 30.6% | 9.6% | 28.5% | 62.3%
Daily Technician Absenteeism | 11.0% | 4.6% | 0.6% | 3.9% | 11.0%
New Technician Training Hours | 80 | 59 | 0 | 61 | 146
Annual Technician Training Hours | N/A | 10 | 0 | 3 | 56
Technician Tenure (months) | 14.6 | 43.9 | 14.6 | 37.4 | 88.0
Technician Job Satisfaction | 78.6% | 78.3% | 69.9% | 78.1% | 85.4%
Ticket Handling:
Average Incident Work Time (minutes) | 35.0 | 21.1 | 11.1 | 19.2 | 40.6
Average Service Request Work Time (minutes) | 50.0 | 48.7 | 21.0 | 45.0 | 108.0
Estimated Travel Time per Ticket (minutes) | 10.0 | 34.5 | 10.0 | 30.0 | 99.0
Workload:
Tickets per Seat per Month | 0.33 | 0.52 | 0.19 | 0.46 | 1.01
Incidents per Seat per Month | 0.05 | 0.39 | 0.05 | 0.27 | 0.85
Service Requests per Seat per Month | 0.28 | 0.12 | 0.04 | 0.11 | 0.28
Incidents as a % of Total Ticket Volume | 13.8% | 70.7% | 13.8% | 75.4% | 93.4%
15

Price vs. Quality for Company XYZ Peer Group Desktop Support (Sample Report Only. Data is not accurate.)
Chart: Quality (Effectiveness) on the vertical axis vs. Price (Efficiency) on the horizontal axis. Quadrants: Top Quartile (Efficient and Effective), Middle Quartiles (Effective but not Efficient, or Efficient but not Effective), and Lower Quartile. Company XYZ Desktop Support is plotted against the Global Database. 16

Module 3: Balanced Scorecard Core Topics Metrics Selection Metric Weightings Scorecard Construction 17

Desktop Support Scorecard: An Overview The Desktop Support scorecard employs a methodology that provides you with a single, all-inclusive measure of your Desktop Support performance It combines price, service level, productivity, and quality metrics into an overall performance indicator for your Desktop Support Your Desktop Support score will range between 0 and 100%, and can be compared directly to the scores of other Desktop Support Groups in the benchmark By computing your overall score on a monthly or quarterly basis, you can track and trend your performance over time Charting and tracking your Desktop Support score is an ideal way to ensure continuous improvement in Desktop Support! 18

Company XYZ Peer Group Desktop Support Scorecard *

Performance Metric | Weighting | Worst Case | Best Case | Your Performance | Metric Score | Balanced Score
Price per Incident | 15.0% | $301.55 | $39.38 | $62.13 | 91.3% | 13.7%
Price per Service Request | 15.0% | $446.50 | $51.71 | $82.84 | 92.1% | 13.8%
Customer Satisfaction | 25.0% | 66.5% | 98.5% | 92.0% | 79.7% | 19.9%
Incident First Visit Resolution Rate | 10.0% | 67.0% | 95.1% | 84.5% | 62.4% | 6.2%
Technician Utilization | 15.0% | 47.1% | 82.9% | 82.9% | 100.0% | 15.0%
% of Incidents Resolved in 24 hours | 5.0% | 30.1% | 86.7% | 30.1% | 0.0% | 0.0%
% of Service Requests Fulfilled in 72 hours | 5.0% | 33.5% | 80.7% | 42.0% | 18.0% | 0.9%
Technician Job Satisfaction | 10.0% | 69.9% | 85.4% | 78.6% | 56.3% | 5.6%
Total | 100.0% | N/A | N/A | N/A | N/A | 75.2%

* Peer group averages were used for Incident First Visit Resolution Rate since Company XYZ does not track this metric
Step 1: Eight critical performance metrics have been selected for the scorecard
Step 2: Each metric has been weighted according to its relative importance
Step 3: For each performance metric, the highest and lowest performance levels in the benchmark are recorded
Step 4: Your actual performance for each metric is recorded in this column
Step 5: Your score for each metric is then calculated: (worst case - actual performance) / (worst case - best case) X 100
Step 6: Your balanced score for each metric is calculated: metric score X weighting
19
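
Steps 5 and 6 above are straightforward to reproduce. Here is a minimal Python sketch of the scorecard arithmetic, populated with the figures from the table; the single formula handles both cost metrics (where lower is better) and quality metrics (where higher is better), because the worst-to-best direction flips the signs consistently.

```python
def metric_score(actual: float, worst: float, best: float) -> float:
    """Step 5: position actual performance between the worst (0.0) and
    best (1.0) levels observed in the benchmark."""
    return (worst - actual) / (worst - best)

def balanced_score(rows) -> float:
    """Step 6: multiply each metric score by its weighting and sum.
    `rows` holds (weighting, worst, best, actual) tuples."""
    return sum(w * metric_score(actual, worst, best)
               for w, worst, best, actual in rows)

scorecard = [
    (0.15, 301.55, 39.38, 62.13),  # Price per Incident           -> 91.3%
    (0.15, 446.50, 51.71, 82.84),  # Price per Service Request    -> 92.1%
    (0.25, 0.665, 0.985, 0.920),   # Customer Satisfaction        -> 79.7%
    (0.10, 0.670, 0.951, 0.845),   # First Visit Res. (peer avg)  -> 62.4%
    (0.15, 0.471, 0.829, 0.829),   # Technician Utilization       -> 100.0%
    (0.05, 0.301, 0.867, 0.301),   # % Incidents in 24 hrs        -> 0.0%
    (0.05, 0.335, 0.807, 0.420),   # % Requests in 72 hrs         -> 18.0%
    (0.10, 0.699, 0.854, 0.786),   # Technician Job Satisfaction  -> 56.3%
]
print(f"{balanced_score(scorecard):.1%}")  # -> 75.2%
```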

Balanced Scorecard Summary (Sample Report Only. Data is not accurate.)
Chart: balanced scores by Desktop Support group. Key Statistics, Desktop Support Scores: High 90.5%, Average 58.6%, Median 58.0%, Low 11.5%, Company XYZ 75.2%. 20

Scorecard Performance Rankings

Overall Ranking | Desktop Support Number | Price per Incident | Price per Service Request | Customer Satisfaction | Incident First Visit Resolution Rate | Technician Utilization | % of Incidents Resolved in 24 hours | % of Service Requests Fulfilled in 72 hours | Technician Job Satisfaction | Total Balanced Score
1 | 18 | $61.57 | $51.71 | 98.5% | 92.7% | 73.6% | 83.5% | 50.9% | 85.4% | 90.5%
2 | 8 | $39.38 | $110.15 | 96.5% | 94.8% | 68.1% | 74.8% | 56.7% | 75.2% | 79.7%
3 | 17 | $85.66 | $106.80 | 96.3% | 95.1% | 65.1% | 82.4% | 48.8% | 78.1% | 77.6%
4 | Company XYZ | $62.13 | $82.84 | 92.0% | N/A | 82.9% | 30.1% | 42.0% | 78.6% | 75.2%
5 | 15 | $99.37 | $102.16 | 93.0% | 89.1% | 57.1% | 80.0% | 64.2% | 84.4% | 74.4%
6 | 7 | $128.02 | $169.89 | 96.9% | 94.7% | 50.5% | 75.1% | 49.6% | 83.5% | 69.9%
7 | 4 | $96.35 | $134.55 | 90.2% | 93.5% | 55.5% | 80.0% | 59.7% | 76.1% | 66.2%
8 | 16 | $133.95 | $179.05 | 91.6% | 81.0% | 60.5% | 75.3% | 80.7% | 78.7% | 64.6%
9 | 10 | $111.08 | $126.17 | 86.6% | 86.3% | 55.4% | 83.1% | 48.5% | 75.2% | 58.8%
10 | 1 | $99.48 | $236.48 | 83.7% | 86.1% | 48.9% | 67.8% | 77.0% | 84.7% | 58.0%
11 | 9 | $87.04 | $162.61 | 86.0% | 82.8% | 58.2% | 61.9% | 51.8% | 76.3% | 57.4%
12 | 2 | $41.29 | $79.98 | 75.0% | 80.6% | 63.6% | 78.9% | 59.0% | 70.5% | 54.6%
13 | 5 | $138.93 | $206.74 | 92.4% | 86.3% | 47.1% | 63.3% | 42.8% | 77.4% | 54.3%
14 | 6 | $114.93 | $161.24 | 79.5% | 84.5% | 47.3% | 86.7% | 42.0% | 83.7% | 52.8%
15 | 3 | $52.04 | $132.24 | 69.4% | 73.9% | 72.1% | 42.3% | 75.8% | 77.9% | 52.1%
16 | 13 | $84.97 | $120.15 | 66.5% | 76.2% | 48.6% | 80.4% | 46.2% | 78.1% | 39.8%
17 | 12 | $301.55 | $315.61 | 80.6% | 81.5% | 49.9% | 85.0% | 68.8% | 80.6% | 37.8%
18 | 11 | $77.02 | $140.15 | 70.4% | 75.6% | 52.7% | 53.9% | 33.5% | 73.7% | 37.5%
19 | 14 | $260.84 | $446.50 | 68.8% | 67.0% | 50.5% | 43.2% | 78.7% | 69.9% | 11.5%
Key Statistics:
Average | $109.24 | $161.32 | 84.9% | 84.5% | 58.3% | 69.9% | 56.7% | 78.3% | 58.6%
Max | $301.55 | $446.50 | 98.5% | 95.1% | 82.9% | 86.7% | 80.7% | 85.4% | 90.5%
Min | $39.38 | $51.71 | 66.5% | 67.0% | 47.1% | 30.1% | 33.5% | 69.9% | 11.5%
Median | $96.35 | $134.55 | 86.6% | 85.3% | 55.5% | 75.3% | 51.8% | 78.1% | 58.0%
21

Module 4: Best Practices Process Assessment Core Components Company XYZ Self Assessment MetricNet Maturity Ranking Process Assessment Rollup 22

Six-Part Model for Desktop Support Best Practices
Model Component: Definition
Strategy: Defining Your Charter and Mission
Human Resources: Proactive, Life-cycle Management of Personnel
Process: Expeditious Delivery of Customer Service
Technology: Leveraging People and Processes
Performance Measurement: A Holistic Approach to Performance Measurement
Stakeholder Communication: Proactively Managing Stakeholder Expectations
(In the model diagram, the six components surround a central hub: Customer Enthusiasm.)
23

Best Practices Evaluation Criteria
Ranking 1: No Knowledge of the Best Practice.
Ranking 2: Aware of the Best Practice, but not applying it.
Ranking 3: Aware of the Best Practice, and applying at a rudimentary level.
Ranking 4: Best Practice is being effectively applied.
Ranking 5: Best Practice is being applied in a world-class fashion.
24

Company XYZ Peer Group Desktop Support Self Assessment
Strategy Best Practices Defined (scores: Company XYZ / Peer Group Average)
1. Desktop Support has a well-defined mission, vision, and strategy. The vision and strategy are well-documented, and communicated to key stakeholders in the organization. (2.5 / 2.94)
2. Desktop Support has a published Service Catalog, including a Supported Products List, that is distributed and communicated to key stakeholders including end users. The Service Catalog is available on-line. (2.0 / 2.32)
3. Desktop Support has an action plan for continuous improvement. The plan is documented and distributed to key stakeholders in the organization, and specific individuals are held accountable for implementing the action plan. (1.0 / 2.75)
4. Desktop Support is well integrated into the information technology function. Desktop Support acts as the "voice of the user" in IT, and is involved in major IT decisions and deliberations that affect end users. Desktop Support is alerted ahead of time so that they can prepare for major rollouts, or other changes in the IT environment. (2.0 / 3.03)
5. Desktop Support has SLAs that define the level of service to be delivered to users. The SLAs are documented, published, and communicated to key stakeholders in the organization. (3.0 / 3.16)
6. Desktop Support has OLAs (Operating Level Agreements) with other support groups in the organization (e.g., Level 1 support, field support, etc.). The OLAs clearly define the roles and responsibilities of each support group, and the different support groups abide by the terms of the OLAs. (3.0 / 2.66)
7. Desktop Support actively seeks to improve Level 1 Resolution Rates, Incident First Contact Resolution Rate, and key Service Levels by implementing processes, technologies, and training that facilitate these objectives. (2.0 / 2.05)
Summary Statistics: Total Score 15.5 / 18.9; Average Score 2.21 / 2.70
25

Best Practices Process Assessment Summary (Sample Report Only. Data is not accurate.)
Chart: Average Score by model component (Company XYZ / Peer Group): Strategy 2.21 / 2.70; Human Resources 3.77 / 2.43; Process 1.94 / 2.55; Technology 2.50 / 2.53; Performance Measurement 2.54 / 2.61; Stakeholder Communication 1.64 / 2.25. 26

Best Practices Process Assessment Summary (Sample Report Only. Data is not accurate.)
Chart: Balanced Score vs. Process Assessment Score for the Global Database. Company XYZ Performance: Process Assessment Score 160.8, Balanced Score 75.2%. Reference lines: World-Class Process Assessment Score = 288.0, Average = 180.2; Average Balanced Score = 58.6%. 27

Module 5: Strategies for Optimized Performance Core Components Conclusions and Recommendations Roadmap for World-Class Performance 28

Conclusions and Recommendations
Conclusions and Recommendations fall into six categories:
1. Strategy
2. Human Resource Management
3. Call Handling Processes and Procedures
4. Technology
5. Performance Measurement and Management
6. Stakeholder Communication
29

KPI Correlations Drive Conclusions
Diagram: a cause-and-effect hierarchy of KPIs. Price per Ticket and Customer Satisfaction sit at the top. They are driven by Technician Utilization, FCR (Incidents), and Service Levels (MTTR). These, in turn, are influenced by Techs/Total FTEs, Absenteeism/Turnover, % Resolved Level 1 Capable, Work/Travel Time, Scheduling Efficiency, and Technician Satisfaction. Technician Satisfaction is driven by Coaching, Career Path, and Training Hours. 30

Price vs. Quality for Global Desktop Support (Sample Report Only. Data is not accurate.)
Chart: Quality (Effectiveness) on the vertical axis vs. Price (Efficiency) on the horizontal axis. Quadrants: Top Quartile (Efficient and Effective), Middle Quartiles (Effective but not Efficient, or Efficient but not Effective), and Lower Quartile. World-Class Desktop Support organizations cluster in the top quartile; the Peer Group is plotted for comparison. 31

Performance Targets will be Established

Performance Metric | Current Performance | Target Performance
Incident First Visit Resolution Rate | N/A | 85.0%
% Resolved Level 1 Capable | N/A | 15.0%
Technician Job Satisfaction | 78.6% | 80.0%
Mean Time to Resolve Incidents (hrs) | 77.3 | 12.0
% of Incidents Resolved in 24 hours | 30.1% | 70.0%
Mean Time to Fulfill Service Requests (Days) | 6.4 | 3.0
% of Service Requests Fulfilled in 72 hours | 42.0% | 60.0%
Annual Technician Training Hours | N/A | 20
Balanced Score | 75.2% | 81.7%

Achieving the performance targets recommended above will result in a desktop support balanced score of 81.7%, and elevate Company XYZ to second place in the benchmark 32

Module 6: Report Development and Presentation of Results Core Topics Conclusions and Recommendations Report Development Presentation of Benchmarking Results 33

Write Benchmarking Report 34

Presentation of Results The results of the benchmark will be presented in a live webcast Company XYZ 35

Summary of Deliverables
Deliverables include:
Project Participation Kit: Project Schedule, Data collection questionnaires, Project Kickoff Meeting, Telephone Interviews
Comprehensive Assessment and Benchmarking Report: Project Overview and Objectives, Industry Background, Benchmarking Performance Summary, Balanced Scorecard, Best Practices Process Assessment, Conclusions and Recommendations
Onsite Presentation of Results
36

Industry Background Company XYZ 37

25 Years of Desktop Support Benchmarking Data
Global Database: More than 1,100 Desktop Support Benchmarks; 30 Key Performance Indicators; Nearly 60 Industry Best Practices
38

Then and Now: The Evolution of Desktop Support

Desktop Support KPIs (North American Averages) | 1988 | Last Year
Monthly Desktop Tickets per Seat | 0.53 | 0.78
Price per Ticket | $29 | $62
Average Incident Work Time (min:sec) | 17:40 | 32:15
Incidents Resolved on First Contact | 74% | 68%
% Resolved Level 1 Capable | 54% | 22%
Starting Technician Salaries (current dollars) | $37,050 | $43,627
Desktop Cost per Seat per Year | $184 | $580
39

So What's Going on Here? Industry MegaTrends: The Drivers
Increasing awareness and understanding of Support TCO (Total Cost of Ownership)
End-User Support evolving from a support role to a strategic role in the enterprise
The growing importance of Desktop Support in shaping end-user opinions on IT
40

And What are the Implications? Industry MegaTrends: The Result
Increased Emphasis on First Contact Resolution (FCR), and Mean Time to Resolve (MTTR)
Strategic Application of Key Performance Indicators (KPIs)
Investments in Technician Development
New Models for Measuring Desktop Support Value
Renewed Emphasis on Internal Marketing
Increased Starting Salaries for Technicians
41

Tickets, Incidents, and Service Requests
Incidents: unplanned work that requires a physical touch to a device. Examples: hardware break/fix, device failure, connectivity failure.
Service Requests: planned work that requires a physical touch to one or more devices. Examples: move/add/change, hardware or software upgrade, device refresh, device set-up.
Incident Volume + Service Request Volume = Ticket Volume
42

Characteristics of World-Class Desktop Support
Desktop Support consistently exceeds customer expectations: the result is high levels of customer satisfaction (> 93%)
MTTR is below average: < 0.7 days for Incidents, < 3.8 days for Service Requests
Costs are managed at or below industry average levels: Price per Ticket, per Incident, and per Service Request is below average, minimizing Total Cost of Ownership (TCO)
Desktop Support follows industry best practices: industry best practices are defined, documented, and followed
Every transaction adds value: a positive customer experience drives a positive view of IT overall
43

World-Class Desktop Support Defined (Sample Report Only. Data is not accurate.)
Chart: Customer Satisfaction vs. Price per Ticket, showing a Best-in-Class Performance Curve above an Average Performance Curve. World-Class Desktop Support sits on the higher curve, delivering greater customer satisfaction than Average Desktop Support at a comparable price per ticket. 44

World-Class Desktop Support: Three Sources of Leverage World-Class Desktop Support organizations recognize and exploit three unique sources of leverage: 1. Minimizing Total Cost of Ownership (TCO) 2. Improving End-User productivity 3. Driving High Levels of Customer Satisfaction for Corporate IT 45

Cost of Resolution: North American Averages

Support Level | Price per Ticket
Vendor | $471
Field Support | $196
Level 3 IT (apps, networking, NOC, etc.) | $85
Level 2: Desktop Support | $62
Level 1: Service Desk | $22
46
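
The price ladder above is what makes "shift-left" economics easy to quantify: every ticket resolved at a lower support level avoids the price difference between levels. A minimal Python sketch follows; the monthly volume and shift fraction are hypothetical, and only the per-level prices come from this slide.

```python
# North American average price per ticket, from the slide above.
PRICE_PER_TICKET = {
    "Vendor": 471,
    "Field Support": 196,
    "Level 3 IT": 85,
    "Level 2: Desktop Support": 62,
    "Level 1: Service Desk": 22,
}

def shift_left_savings(monthly_tickets: int, from_level: str,
                       to_level: str, fraction: float) -> float:
    """Monthly savings from resolving `fraction` of tickets at a
    cheaper support level instead of `from_level`."""
    delta = PRICE_PER_TICKET[from_level] - PRICE_PER_TICKET[to_level]
    return monthly_tickets * fraction * delta

# Hypothetical example: shifting 20% of 1,000 monthly desktop tickets
# down to the service desk saves 1,000 * 0.20 * (62 - 22) = $8,000/month.
print(shift_left_savings(1_000, "Level 2: Desktop Support",
                         "Level 1: Service Desk", 0.20))
```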

The Tao of SPOC (Single Point of Contact)
Diagram: the User Community contacts the Level 1 Service Desk, which coordinates support across Level 2: Desktop Support, Level 3: IT Support, Field Support, and Vendor Support. 47

The Tao of SPOC (Continued)
Key SPOC Principles:
The enterprise takes an end-to-end view of user support
The user/customer has a single point of contact for all IT-related incidents, questions, problems, and work requests
The Level 1 Service Desk is the SPOC
Level 1 is responsible for: ticket triage; resolution at Level 1 if possible; effective handoffs to n-level support; resolution coordination and facilitation; ticket closure
Desktop Drive-bys, Fly-bys, and Snags are strongly discouraged
48

Quality of Support Drives End-User Productivity (n = 60)
Chart: Average Productive Hours Lost per Employee per Year by Performance Quartile.

Support Function | Key Performance Indicator | Quartile 1 (top) | 2 | 3 | 4 (bottom)
Service Desk | Customer Satisfaction | 93.5% | 84.5% | 76.1% | 69.3%
Service Desk | First Contact Resolution Rate | 90.1% | 83.0% | 72.7% | 66.4%
Service Desk | Mean Time to Resolve (hours) | 0.8 | 1.2 | 3.6 | 5.0
Desktop Support | Customer Satisfaction | 94.4% | 89.2% | 79.0% | 71.7%
Desktop Support | First Contact Resolution Rate | 89.3% | 85.6% | 80.9% | 74.5%
Desktop Support | Mean Time to Resolve (hours) | 2.9 | 4.8 | 9.4 | 12.3
Average Productive Hours Lost per Employee per Year | 17.1 | 25.9 | 37.4 | 46.9
49

Support Drives Customer Satisfaction for All of IT (n = 1,044 global large cap companies; survey type: multiple choice, 3 responses allowed per survey)
Chart: % Saying Very Important among Factors Contributing to IT Customer Satisfaction: Service Desk 84%, Desktop Support 47%, Network Outages 31%, VPN 29%, Training 22%, Enterprise Applications 19%, Desktop Software 8%
84% cited the service desk as a very important factor in their overall satisfaction with corporate IT
47% cited desktop support as a very important factor in their overall satisfaction with corporate IT
50

Performance Benchmarking Summary Company XYZ 51

Company XYZ Peer Group Desktop Support Overview

Desktop Support Location: Washington, DC
Hours of Operation: 7:00 am - 7:30 pm ET, Monday through Friday
Annual Price Paid: $929,953
Monthly Contact Volume: Tickets 969 (Incidents 134, Service Requests 835)

FTE Headcount: Operations Manager 1.0; Desktop Support Tech Level 1: 2.0; Desktop Support Tech Level 2: 4.0; Desktop Support Tech Level 3: 1.0; Total 8.0

Technology Profile:
Trouble Ticket System: BMC Remedy ITSM V7.5.0
Customer Information System: BMC Remedy ITSM V7.5.0
Automatic Call Distributor (ACD): Cisco 8.6.2
Workforce Management/Scheduling System/Software: CCModeler
Interactive Voice Response (IVR): Cisco 8.6.2
Knowledge Management: BMC Remedy ITSM V7.5.0
Labor Reporting Systems: Cisco 8.5
Remote Control Software: LogMeIn Remotely Anywhere
52

Summary of Included Benchmarking Metrics
Price: Price per Ticket, Price per Incident, Price per Service Request
Productivity: Tickets per Technician per Month, Incidents per Technician per Month, Service Requests per Technician per Month, Technicians as a Percent of Total FTEs, Technician Utilization
Service Level: Mean Time to Resolve Incidents (hrs), % of Incidents Resolved in 24 Hours, Mean Time to Fulfill Service Requests (Days), % of Service Requests Fulfilled in 72 Hours
Quality: Customer Satisfaction, Incident First Visit Resolution Rate, % Resolved Level 1 Capable
Technician: Annual Technician Turnover, Daily Technician Absenteeism, New Technician Training Hours, Annual Technician Training Hours, Technician Tenure (months), Technician Job Satisfaction
Ticket Handling: Average Incident Work Time (minutes), Average Service Request Work Time (minutes), Estimated Travel Time per Ticket (minutes)
Workload: Tickets per Seat per Month, Incidents per Seat per Month, Service Requests per Seat per Month, Incidents as a % of Total Ticket Volume
53

Benchmarking KPI Performance Summary

Key Performance Indicator (KPI) | Company XYZ | Peer Average | Min | Median | Max
Price:
Price per Ticket | $79.98 | $124.74 | $50.49 | $99.69 | $325.06
Price per Incident | $62.13 | $109.24 | $39.38 | $96.35 | $301.55
Price per Service Request | $82.84 | $161.32 | $51.71 | $134.55 | $446.50
Productivity:
Tickets per Technician per Month | 138.4 | 99.1 | 32.3 | 93.5 | 183.9
Incidents per Technician per Month | 19.2 | 69.8 | 19.2 | 56.9 | 144.8
Service Requests per Technician per Month | 119.3 | 29.2 | 7.1 | 17.8 | 119.3
Technicians as a Percent of Total FTEs | 87.5% | 88.9% | 79.6% | 88.1% | 97.1%
Technician Utilization | 82.9% | 58.3% | 47.1% | 55.5% | 82.9%
Service Level:
Mean Time to Resolve Incidents (hrs) | 77.29 | 11.79 | 3.00 | 6.30 | 77.29
% of Incidents Resolved in 24 Hours | 30.1% | 69.9% | 30.1% | 75.3% | 86.7%
Mean Time to Fulfill Service Requests (Days) | 6.40 | 3.04 | 1.20 | 3.10 | 6.40
% of Service Requests Fulfilled in 72 Hours | 42.0% | 56.7% | 33.5% | 51.8% | 80.7%
Quality:
Customer Satisfaction | 92.0% | 84.9% | 66.5% | 86.6% | 98.5%
Incident First Visit Resolution Rate | N/A | 84.5% | 67.0% | 85.3% | 95.1%
% Resolved Level 1 Capable | N/A | 18.8% | 6.9% | 19.1% | 29.0%
Technician:
Annual Technician Turnover | 62.3% | 30.6% | 9.6% | 28.5% | 62.3%
Daily Technician Absenteeism | 11.0% | 4.6% | 0.6% | 3.9% | 11.0%
New Technician Training Hours | 80 | 59 | 0 | 61 | 146
Annual Technician Training Hours | N/A | 10 | 0 | 3 | 56
Technician Tenure (months) | 14.6 | 43.9 | 14.6 | 37.4 | 88.0
Technician Job Satisfaction | 78.6% | 78.3% | 69.9% | 78.1% | 85.4%
Ticket Handling:
Average Incident Work Time (minutes) | 35.0 | 21.1 | 11.1 | 19.2 | 40.6
Average Service Request Work Time (minutes) | 50.0 | 48.7 | 21.0 | 45.0 | 108.0
Estimated Travel Time per Ticket (minutes) | 10.0 | 34.5 | 10.0 | 30.0 | 99.0
Workload:
Tickets per Seat per Month | 0.33 | 0.52 | 0.19 | 0.46 | 1.01
Incidents per Seat per Month | 0.05 | 0.39 | 0.05 | 0.27 | 0.85
Service Requests per Seat per Month | 0.28 | 0.12 | 0.04 | 0.11 | 0.28
Incidents as a % of Total Ticket Volume | 13.8% | 70.7% | 13.8% | 75.4% | 93.4%
54

KPI Gap Summary

Metric Type | Key Performance Indicator (KPI) | Company XYZ | Peer Average | Performance Gap
Price | Price per Ticket | $79.98 | $124.74 | 35.9%
Price | Price per Incident | $62.13 | $109.24 | 43.1%
Price | Price per Service Request | $82.84 | $161.32 | 48.6%
Productivity | Tickets per Technician per Month | 138.4 | 99.1 | 39.7%
Productivity | Incidents per Technician per Month | 19.2 | 69.8 | -72.6%
Productivity | Service Requests per Technician per Month | 119.3 | 29.2 | 308.6%
Productivity | Technicians as a Percent of Total FTEs | 87.5% | 88.9% | -1.6%
Productivity | Technician Utilization | 82.9% | 58.3% | 42.2%
Service Level | Mean Time to Resolve Incidents (hrs) | 77.29 | 11.79 | -555.6%
Service Level | % of Incidents Resolved in 24 Hours | 30.1% | 69.9% | -57.0%
Service Level | Mean Time to Fulfill Service Requests (Days) | 6.40 | 3.04 | -110.4%
Service Level | % of Service Requests Fulfilled in 72 Hours | 42.0% | 56.7% | -25.9%
Quality | Customer Satisfaction | 92.0% | 84.9% | 8.3%
Quality | Incident First Visit Resolution Rate | N/A | 84.5% | N/A
Quality | % Resolved Level 1 Capable | N/A | 18.8% | N/A
Technician | Annual Technician Turnover | 62.3% | 30.6% | -103.7%
Technician | Daily Technician Absenteeism | 11.0% | 4.6% | -139.7%
Technician | New Technician Training Hours | 80 | 59 | 36.1%
Technician | Annual Technician Training Hours | N/A | 10 | N/A
Technician | Technician Tenure (months) | 14.6 | 43.9 | -66.8%
Technician | Technician Job Satisfaction | 78.6% | 78.3% | 0.4%
Ticket Handling | Average Incident Work Time (minutes) | 35.0 | 21.1 | -65.8%
Ticket Handling | Average Service Request Work Time (minutes) | 50.0 | 48.7 | -2.6%
Ticket Handling | Estimated Travel Time per Ticket (minutes) | 10.0 | 34.5 | 71.0%
Workload | Tickets per Seat per Month | 0.33 | 0.52 | 36.9%
Workload | Incidents per Seat per Month | 0.05 | 0.39 | 88.4%
Workload | Service Requests per Seat per Month | 0.28 | 0.12 | -129.7%
Workload | Incidents as a % of Total Ticket Volume | 13.8% | 70.7% | -80.4%
55
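
A note on the gap arithmetic, with a small Python sketch below: the gaps appear to be (Company XYZ - peer average) / peer average for KPIs where higher is better, with the sign convention flipped for KPIs where lower is better (prices, MTTR, turnover, absenteeism, work and travel times), so that a positive gap always means Company XYZ outperforms the peer group. The higher-is-better classification is inferred from the table, not stated in the source.

```python
def performance_gap(xyz: float, peer_avg: float, higher_is_better: bool) -> float:
    """Fractional gap vs. the peer average; positive favors Company XYZ."""
    if higher_is_better:
        return (xyz - peer_avg) / peer_avg
    return (peer_avg - xyz) / peer_avg  # flipped for lower-is-better KPIs

# Spot checks against the table above:
print(f"{performance_gap(138.4, 99.1, True):.1%}")     # Tickets/Tech/Month -> 39.7%
print(f"{performance_gap(79.98, 124.74, False):.1%}")  # Price per Ticket   -> 35.9%
print(f"{performance_gap(77.29, 11.79, False):.1%}")   # MTTR (hrs)         -> -555.6%
```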

KPI Gap Ranking

Key Performance Indicator (KPI) | Company XYZ | Peer Average | Performance Gap
Service Requests per Technician per Month | 119.3 | 29.2 | 308.6%
Incidents per Seat per Month | 0.05 | 0.39 | 88.4%
Estimated Travel Time per Ticket (minutes) | 10.0 | 34.5 | 71.0%
Price per Service Request | $82.84 | $161.32 | 48.6%
Price per Incident | $62.13 | $109.24 | 43.1%
Technician Utilization | 82.9% | 58.3% | 42.2%
Tickets per Technician per Month | 138.4 | 99.1 | 39.7%
Tickets per Seat per Month | 0.33 | 0.52 | 36.9%
New Technician Training Hours | 80 | 59 | 36.1%
Price per Ticket | $79.98 | $124.74 | 35.9%
Customer Satisfaction | 92.0% | 84.9% | 8.3%
Technician Job Satisfaction | 78.6% | 78.3% | 0.4%
Incident First Visit Resolution Rate | N/A | 84.5% | N/A
% Resolved Level 1 Capable | N/A | 18.8% | N/A
Annual Technician Training Hours | N/A | 10 | N/A
Technicians as a Percent of Total FTEs | 87.5% | 88.9% | -1.6%
Average Service Request Work Time (minutes) | 50.0 | 48.7 | -2.6%
% of Service Requests Fulfilled in 72 Hours | 42.0% | 56.7% | -25.9%
% of Incidents Resolved in 24 Hours | 30.1% | 69.9% | -57.0%
Average Incident Work Time (minutes) | 35.0 | 21.1 | -65.8%
Technician Tenure (months) | 14.6 | 43.9 | -66.8%
Incidents per Technician per Month | 19.2 | 69.8 | -72.6%
Incidents as a % of Total Ticket Volume | 13.8% | 70.7% | -80.4%
Annual Technician Turnover | 62.3% | 30.6% | -103.7%
Mean Time to Fulfill Service Requests (Days) | 6.40 | 3.04 | -110.4%
Service Requests per Seat per Month | 0.28 | 0.12 | -129.7%
Daily Technician Absenteeism | 11.0% | 4.6% | -139.7%
Mean Time to Resolve Incidents (hrs) | 77.29 | 11.79 | -555.6%
56

Quartile Rankings: Price and Quality Metrics

Price Metrics | Quartile 1 (Top) | Quartile 2 | Quartile 3 | Quartile 4 (Bottom) | Your Desktop Support Performance
Price per Ticket | $50.49 to $85.33 | $85.33 to $99.69 | $99.69 to $134.39 | $134.39 to $325.06 | $79.98
Price per Incident | $39.38 to $69.58 | $69.58 to $96.35 | $96.35 to $121.48 | $121.48 to $301.55 | $62.13
Price per Service Request | $51.71 to $108.48 | $108.48 to $134.55 | $134.55 to $174.47 | $174.47 to $446.50 | $82.84

Quality Metrics | Quartile 1 (Top) | Quartile 2 | Quartile 3 | Quartile 4 (Bottom) | Your Desktop Support Performance
Customer Satisfaction | 98.5% to 92.7% | 92.7% to 86.6% | 86.6% to 77.3% | 77.3% to 66.5% | 92.0%
Incident First Visit Resolution Rate | 95.1% to 91.8% | 91.8% to 85.3% | 85.3% to 80.7% | 80.7% to 67.0% | N/A
% Resolved Level 1 Capable | 6.9% to 14.2% | 14.2% to 19.1% | 19.1% to 22.7% | 22.7% to 29.0% | N/A
57
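
Reading the quartile tables programmatically is simple: a value lands in the quartile whose boundary pair contains it, with the five boundaries listed from best to worst (ascending for price metrics, descending for quality metrics such as Customer Satisfaction). A minimal Python sketch; a value sitting exactly on a shared boundary resolves to the better quartile here.

```python
def quartile(value: float, bounds: list[float]) -> int:
    """Assign a metric value to a quartile. `bounds` holds five boundary
    values [best, q1/q2, q2/q3, q3/q4, worst] in best-to-worst order,
    exactly as the quartile tables list them."""
    for k in range(1, 5):
        lo, hi = sorted((bounds[k - 1], bounds[k]))
        if lo <= value <= hi:
            return k  # quartile k spans bounds[k-1] .. bounds[k]
    raise ValueError("value lies outside the benchmark range")

# Price per Ticket boundaries above (low is best, so they ascend):
print(quartile(79.98, [50.49, 85.33, 99.69, 134.39, 325.06]))  # -> 1

# Customer Satisfaction boundaries descend because high is best:
print(quartile(92.0, [98.5, 92.7, 86.6, 77.3, 66.5]))          # -> 2
```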

Quartile Rankings: Productivity Metrics

Productivity Metrics | Quartile 1 (Top) | Quartile 2 | Quartile 3 | Quartile 4 (Bottom) | Your Desktop Support Performance
Tickets per Technician per Month | 183.9 to 120.0 | 120.0 to 93.5 | 93.5 to 73.1 | 73.1 to 32.3 | 138.4
Incidents per Technician per Month | 144.8 to 82.1 | 82.1 to 56.9 | 56.9 to 48.7 | 48.7 to 19.2 | 19.2
Service Requests per Technician per Month | 119.3 to 38.6 | 38.6 to 17.8 | 17.8 to 13.3 | 13.3 to 7.1 | 119.3
Technicians as a Percent of Total FTEs | 97.1% to 91.7% | 91.7% to 88.1% | 88.1% to 86.5% | 86.5% to 79.6% | 87.5%
Technician Utilization | 82.9% to 64.4% | 64.4% to 55.5% | 55.5% to 50.2% | 50.2% to 47.1% | 82.9%
58

Quartile Rankings: Service Level Metrics

Service Level Metrics | Quartile 1 (Top) | Quartile 2 | Quartile 3 | Quartile 4 (Bottom) | Your Desktop Support Performance
Mean Time to Resolve Incidents (hrs) | 3.00 to 4.90 | 4.90 to 6.30 | 6.30 to 12.00 | 12.00 to 77.29 | 77.29
% of Incidents Resolved in 24 Hours | 86.7% to 81.4% | 81.4% to 75.3% | 75.3% to 62.6% | 62.6% to 30.1% | 30.1%
Mean Time to Fulfill Service Requests (Days) | 1.20 to 2.00 | 2.00 to 3.10 | 3.10 to 3.70 | 3.70 to 6.40 | 6.40
% of Service Requests Fulfilled in 72 Hours | 80.7% to 66.5% | 66.5% to 51.8% | 51.8% to 47.4% | 47.4% to 33.5% | 42.0%
59

Quartile Rankings: Technician Metrics

Technician Performance Metrics | Quartile 1 (Top) | Quartile 2 | Quartile 3 | Quartile 4 (Bottom) | Your Desktop Support Performance
Annual Technician Turnover | 9.6% to 25.2% | 25.2% to 28.5% | 28.5% to 35.3% | 35.3% to 62.3% | 62.3%
Daily Technician Absenteeism | 0.6% to 2.4% | 2.4% to 3.9% | 3.9% to 5.7% | 5.7% to 11.0% | 11.0%
New Technician Training Hours | 146 to 86 | 86 to 61 | 61 to 22 | 22 to 0 | 80
Annual Technician Training Hours | 56 to 15 | 15 to 3 | 3 to 0 | 0 to 0 | N/A
Technician Tenure (months) | 88.0 to 48.8 | 48.8 to 37.4 | 37.4 to 33.2 | 33.2 to 14.6 | 14.6
Technician Job Satisfaction | 85.4% to 82.1% | 82.1% to 78.1% | 78.1% to 75.7% | 75.7% to 69.9% | 78.6%
60

Quartile Rankings: Ticket Handling Metrics

Ticket Handling Metrics | Quartile 1 (Top) | Quartile 2 | Quartile 3 | Quartile 4 (Bottom) | Your Desktop Support Performance
Average Incident Work Time (minutes) | 11.1 to 14.9 | 14.9 to 19.2 | 19.2 to 25.6 | 25.6 to 40.6 | 35.0
Average Service Request Work Time (minutes) | 21.0 to 32.0 | 32.0 to 45.0 | 45.0 to 63.0 | 63.0 to 108.0 | 50.0
Estimated Travel Time per Ticket (minutes) | 10.0 to 16.0 | 16.0 to 30.0 | 30.0 to 45.0 | 45.0 to 99.0 | 10.0
61

Quartile Rankings: Workload Metrics

Workload Metrics | Quartile 1 (Top) | Quartile 2 | Quartile 3 | Quartile 4 (Bottom) | Your Desktop Support Performance
Tickets per Seat per Month | 0.19 to 0.32 | 0.32 to 0.46 | 0.46 to 0.73 | 0.73 to 1.01 | 0.33
Incidents per Seat per Month | 0.05 to 0.20 | 0.20 to 0.27 | 0.27 to 0.57 | 0.57 to 0.85 | 0.05
Service Requests per Seat per Month | 0.04 to 0.08 | 0.08 to 0.11 | 0.11 to 0.14 | 0.14 to 0.28 | 0.28
Incidents as a % of Total Ticket Volume | 93.4% to 84.3% | 84.3% to 75.4% | 75.4% to 63.0% | 63.0% to 13.8% | 13.8%
62

Desktop Support Scorecard: An Overview The Desktop Support scorecard employs a methodology that provides you with a single, all-inclusive measure of your Desktop Support performance It combines cost, service level, productivity, and quality metrics into an overall performance indicator for your Desktop Support Your Desktop Support score will range between 0 and 100%, and can be compared directly to the scores of other Desktop Support Groups in the benchmark By computing your overall score on a monthly or quarterly basis, you can track and trend your performance over time Charting and tracking your Desktop Support score is an ideal way to ensure continuous improvement in Desktop Support! 63

Company XYZ Peer Group Desktop Support Scorecard *

Performance Metric | Weighting | Worst Case | Best Case | Your Performance | Metric Score | Balanced Score
Price per Incident | 15.0% | $301.55 | $39.38 | $62.13 | 91.3% | 13.7%
Price per Service Request | 15.0% | $446.50 | $51.71 | $82.84 | 92.1% | 13.8%
Customer Satisfaction | 25.0% | 66.5% | 98.5% | 92.0% | 79.7% | 19.9%
Incident First Visit Resolution Rate | 10.0% | 67.0% | 95.1% | 84.5% | 62.4% | 6.2%
Technician Utilization | 15.0% | 47.1% | 82.9% | 82.9% | 100.0% | 15.0%
% of Incidents Resolved in 24 hours | 5.0% | 30.1% | 86.7% | 30.1% | 0.0% | 0.0%
% of Service Requests Fulfilled in 72 hours | 5.0% | 33.5% | 80.7% | 42.0% | 18.0% | 0.9%
Technician Job Satisfaction | 10.0% | 69.9% | 85.4% | 78.6% | 56.3% | 5.6%
Total | 100.0% | N/A | N/A | N/A | N/A | 75.2%

* Peer group averages were used for Incident First Visit Resolution Rate since Company XYZ does not track this metric
Step 1: Eight critical performance metrics have been selected for the scorecard
Step 2: Each metric has been weighted according to its relative importance
Step 3: For each performance metric, the highest and lowest performance levels in the benchmark are recorded
Step 4: Your actual performance for each metric is recorded in this column
Step 5: Your score for each metric is then calculated: (worst case - actual performance) / (worst case - best case) X 100
Step 6: Your balanced score for each metric is calculated: metric score X weighting
64

Balanced Scorecard Summary (Sample Report Only. Data is not accurate.)
Chart: balanced scores by Desktop Support group. Key Statistics, Desktop Support Scores: High 90.5%, Average 58.6%, Median 58.0%, Low 11.5%, Company XYZ 75.2%. 65

Peer Group Scorecard Summary Data The next two pages illustrate the benchmarking peer group performance for each KPI in the scorecard Page 67 ranks each Desktop Support group from best performer (#18) to worst performer (#14) based upon their balanced scores Page 68 ranks each KPI in the scorecard from best (top row) to worst (bottom row) 66

Scorecard Performance Rankings

Overall Ranking | Desktop Support Number | Price per Incident | Price per Service Request | Customer Satisfaction | Incident First Visit Resolution Rate | Technician Utilization | % of Incidents Resolved in 24 hours | % of Service Requests Fulfilled in 72 hours | Technician Job Satisfaction | Total Balanced Score
1 | 18 | $61.57 | $51.71 | 98.5% | 92.7% | 73.6% | 83.5% | 50.9% | 85.4% | 90.5%
2 | 8 | $39.38 | $110.15 | 96.5% | 94.8% | 68.1% | 74.8% | 56.7% | 75.2% | 79.7%
3 | 17 | $85.66 | $106.80 | 96.3% | 95.1% | 65.1% | 82.4% | 48.8% | 78.1% | 77.6%
4 | Company XYZ | $62.13 | $82.84 | 92.0% | N/A | 82.9% | 30.1% | 42.0% | 78.6% | 75.2%
5 | 15 | $99.37 | $102.16 | 93.0% | 89.1% | 57.1% | 80.0% | 64.2% | 84.4% | 74.4%
6 | 7 | $128.02 | $169.89 | 96.9% | 94.7% | 50.5% | 75.1% | 49.6% | 83.5% | 69.9%
7 | 4 | $96.35 | $134.55 | 90.2% | 93.5% | 55.5% | 80.0% | 59.7% | 76.1% | 66.2%
8 | 16 | $133.95 | $179.05 | 91.6% | 81.0% | 60.5% | 75.3% | 80.7% | 78.7% | 64.6%
9 | 10 | $111.08 | $126.17 | 86.6% | 86.3% | 55.4% | 83.1% | 48.5% | 75.2% | 58.8%
10 | 1 | $99.48 | $236.48 | 83.7% | 86.1% | 48.9% | 67.8% | 77.0% | 84.7% | 58.0%
11 | 9 | $87.04 | $162.61 | 86.0% | 82.8% | 58.2% | 61.9% | 51.8% | 76.3% | 57.4%
12 | 2 | $41.29 | $79.98 | 75.0% | 80.6% | 63.6% | 78.9% | 59.0% | 70.5% | 54.6%
13 | 5 | $138.93 | $206.74 | 92.4% | 86.3% | 47.1% | 63.3% | 42.8% | 77.4% | 54.3%
14 | 6 | $114.93 | $161.24 | 79.5% | 84.5% | 47.3% | 86.7% | 42.0% | 83.7% | 52.8%
15 | 3 | $52.04 | $132.24 | 69.4% | 73.9% | 72.1% | 42.3% | 75.8% | 77.9% | 52.1%
16 | 13 | $84.97 | $120.15 | 66.5% | 76.2% | 48.6% | 80.4% | 46.2% | 78.1% | 39.8%
17 | 12 | $301.55 | $315.61 | 80.6% | 81.5% | 49.9% | 85.0% | 68.8% | 80.6% | 37.8%
18 | 11 | $77.02 | $140.15 | 70.4% | 75.6% | 52.7% | 53.9% | 33.5% | 73.7% | 37.5%
19 | 14 | $260.84 | $446.50 | 68.8% | 67.0% | 50.5% | 43.2% | 78.7% | 69.9% | 11.5%
Key Statistics:
Average | $109.24 | $161.32 | 84.9% | 84.5% | 58.3% | 69.9% | 56.7% | 78.3% | 58.6%
Max | $301.55 | $446.50 | 98.5% | 95.1% | 82.9% | 86.7% | 80.7% | 85.4% | 90.5%
Min | $39.38 | $51.71 | 66.5% | 67.0% | 47.1% | 30.1% | 33.5% | 69.9% | 11.5%
Median | $96.35 | $134.55 | 86.6% | 85.3% | 55.5% | 75.3% | 51.8% | 78.1% | 58.0%
67

KPI Scorecard Data in Rank Order (each column is sorted independently, from best at the top to worst at the bottom)

Scorecard Metrics: Price per Incident | Price per Service Request | Customer Satisfaction | Incident First Visit Resolution Rate | Technician Utilization | % of Incidents Resolved in 24 hours | % of Service Requests Fulfilled in 72 hours | Technician Job Satisfaction | Total Balanced Score
Company XYZ | $62.13 | $82.84 | 92.0% | N/A | 82.9% | 30.1% | 42.0% | 78.6% | 75.2%
Ranking | 5 | 3 | 7 | N/A | 1 | 19 | 17 | 8 | 4
Quartile | 1 | 1 | 2 | N/A | 1 | 4 | 4 | 2 | 1
1 | $39.38 | $51.71 | 98.5% | 95.1% | 82.9% | 86.7% | 80.7% | 85.4% | 90.5%
2 | $41.29 | $79.98 | 96.9% | 94.8% | 73.6% | 85.0% | 78.7% | 84.7% | 79.7%
3 | $52.04 | $82.84 | 96.5% | 94.7% | 72.1% | 83.5% | 77.0% | 84.4% | 77.6%
4 | $61.57 | $102.16 | 96.3% | 93.5% | 68.1% | 83.1% | 75.8% | 83.7% | 75.2%
5 | $62.13 | $106.80 | 93.0% | 92.7% | 65.1% | 82.4% | 68.8% | 83.5% | 74.4%
6 | $77.02 | $110.15 | 92.4% | 89.1% | 63.6% | 80.4% | 64.2% | 80.6% | 69.9%
7 | $84.97 | $120.15 | 92.0% | 86.3% | 60.5% | 80.0% | 59.7% | 78.7% | 66.2%
8 | $85.66 | $126.17 | 91.6% | 86.3% | 58.2% | 80.0% | 59.0% | 78.6% | 64.6%
9 | $87.04 | $132.24 | 90.2% | 86.1% | 57.1% | 78.9% | 56.7% | 78.1% | 58.8%
10 | $96.35 | $134.55 | 86.6% | 84.5% | 55.5% | 75.3% | 51.8% | 78.1% | 58.0%
11 | $99.37 | $140.15 | 86.0% | 84.5% | 55.4% | 75.1% | 50.9% | 77.9% | 57.4%
12 | $99.48 | $161.24 | 83.7% | 82.8% | 52.7% | 74.8% | 49.6% | 77.4% | 54.6%
13 | $111.08 | $162.61 | 80.6% | 81.5% | 50.5% | 67.8% | 48.8% | 76.3% | 54.3%
14 | $114.93 | $169.89 | 79.5% | 81.0% | 50.5% | 63.3% | 48.5% | 76.1% | 52.8%
15 | $128.02 | $179.05 | 75.0% | 80.6% | 49.9% | 61.9% | 46.2% | 75.2% | 52.1%
16 | $133.95 | $206.74 | 70.4% | 76.2% | 48.9% | 53.9% | 42.8% | 75.2% | 39.8%
17 | $138.93 | $236.48 | 69.4% | 75.6% | 48.6% | 43.2% | 42.0% | 73.7% | 37.8%
18 | $260.84 | $315.61 | 68.8% | 73.9% | 47.3% | 42.3% | 42.0% | 70.5% | 37.5%
19 | $301.55 | $446.50 | 66.5% | 67.0% | 47.1% | 30.1% | 33.5% | 69.9% | 11.5%
Average | $109.24 | $161.32 | 84.9% | 84.5% | 58.3% | 69.9% | 56.7% | 78.3% | 58.6%
Max | $301.55 | $446.50 | 98.5% | 95.1% | 82.9% | 86.7% | 80.7% | 85.4% | 90.5%
Min | $39.38 | $51.71 | 66.5% | 67.0% | 47.1% | 30.1% | 33.5% | 69.9% | 11.5%
Median | $96.35 | $134.55 | 86.6% | 84.5% | 55.5% | 75.3% | 51.8% | 78.1% | 58.0%
68

Scorecard Metrics: Price per Incident (Sample Report Only. Data is not accurate.)
Chart: Price per Incident by Desktop Support group. Key Statistics: High $301.55, Average $109.24, Median $96.35, Low $39.38, Company XYZ $62.13. 69

Scorecard Metrics: Price per Service Request (Sample Report Only. Data is not accurate.)
Chart: Price per Service Request by Desktop Support group. Key Statistics: High $446.50, Average $161.32, Median $134.55, Low $51.71, Company XYZ $82.84. 70

Scorecard Metrics: Customer Satisfaction (Sample Report Only. Data is not accurate.)
Chart: Customer Satisfaction by Desktop Support group. Key Statistics: High 98.5%, Average 84.9%, Median 86.6%, Low 66.5%, Company XYZ 92.0%. 71

Scorecard Metrics: Incident First Visit Resolution Rate (Sample Report Only. Data is not accurate.)
Chart: Incident First Visit Resolution Rate by Desktop Support group. Key Statistics: High 95.1%, Average 84.5%, Median 85.3%, Low 67.0%, Company XYZ N/A. 72

Scorecard Metrics: Technician Utilization (Sample Report Only. Data is not accurate.)
Chart: Technician Utilization by Desktop Support group. Key Statistics: High 82.9%, Average 58.3%, Median 55.5%, Low 47.1%, Company XYZ 82.9%. 73

Scorecard Metrics: % of Incidents Resolved in 24 Hours (Sample Report Only. Data is not accurate.)
Chart: % of Incidents Resolved in 24 Hours by Desktop Support group. Key Statistics: High 86.7%, Average 69.9%, Median 75.3%, Low 30.1%, Company XYZ 30.1%. 74

Scorecard Metrics: % of Service Requests Fulfilled in 72 Hours (Sample Report Only. Data is not accurate.)
Chart: % of Service Requests Fulfilled in 72 Hours by Desktop Support group. Key Statistics: High 80.7%, Average 56.7%, Median 51.8%, Low 33.5%, Company XYZ 42.0%. 75

Scorecard Metrics: Technician Job Satisfaction (Sample Report Only. Data is not accurate.)
Chart: Technician Job Satisfaction by Desktop Support group. Key Statistics: High 85.4%, Average 78.3%, Median 78.1%, Low 69.9%, Company XYZ 78.6%. 76

Price vs. Quality for Company XYZ Peer Group Desktop Support (Sample Report Only. Data is not accurate.)
Chart: Quality (Effectiveness) on the vertical axis vs. Price (Efficiency) on the horizontal axis. Quadrants: Top Quartile (Efficient and Effective), Middle Quartiles (Effective but not Efficient, or Efficient but not Effective), and Lower Quartile. Company XYZ Desktop Support is plotted against the Global Database. 77

Best Practices Process Assessment Company XYZ 78

Six-Part Model for Desktop Support Best Practices
Model Component: Definition
Strategy: Defining Your Charter and Mission
Human Resources: Proactive, Life-cycle Management of Personnel
Process: Expeditious Delivery of Customer Service
Technology: Leveraging People and Processes
Performance Measurement: A Holistic Approach to Performance Measurement
Stakeholder Communication: Proactively Managing Stakeholder Expectations
(In the model diagram, the six components surround a central hub: Customer Enthusiasm.)
79

Best Practices Evaluation Criteria
Ranking 1: No Knowledge of the Best Practice.
Ranking 2: Aware of the Best Practice, but not applying it.
Ranking 3: Aware of the Best Practice, and applying at a rudimentary level.
Ranking 4: Best Practice is being effectively applied.
Ranking 5: Best Practice is being applied in a world-class fashion.
80

MetricNet Has Defined 72 Desktop Support Best Practices
Strategy: 7 Best Practices
Human Resources: 13 Best Practices
Process: 16 Best Practices
Technology: 10 Best Practices
Performance Measurement: 14 Best Practices
Communication: 12 Best Practices
Total Score ranges from 72 to 360
The lowest score possible on the Best Practices Process Assessment is 72: Maturity Level 1 X 72 Best Practices = 72
The highest score possible on the Best Practices Process Assessment is 360: Maturity Level 5 X 72 Best Practices = 360
81
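
The rollup is a straight sum of 1-to-5 maturity ratings across the 72 practices; practices not assessed (N/A) are excluded from the average, which matches the Company XYZ category averages in this report (e.g., Strategy: a total of 15.5 over 7 rated practices gives 2.21). A minimal Python sketch using the Strategy ratings from the self-assessment slide above:

```python
from typing import Optional

def rollup(ratings: list[Optional[float]]) -> tuple[float, float]:
    """Total and average maturity for one category; None marks an
    unassessed (N/A) practice and is excluded from the average."""
    rated = [r for r in ratings if r is not None]
    total = sum(rated)
    return total, total / len(rated)

# Company XYZ's seven Strategy ratings from the self-assessment slide:
strategy = [2.5, 2.0, 1.0, 2.0, 3.0, 3.0, 2.0]
print(rollup(strategy))  # -> (15.5, 2.214...), i.e. total 15.5, average 2.21
```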

Strategy: 7 Best Practices (scores: Company XYZ / Peer Group Average)
1. Desktop Support has a well-defined mission, vision, and strategy. The vision and strategy are well-documented, and communicated to key stakeholders in the organization. (2.5 / 2.94)
2. Desktop Support has a published Service Catalog, including a Supported Products List, that is distributed and communicated to key stakeholders including end users. The Service Catalog is available on-line. (2.0 / 2.32)
3. Desktop Support has an action plan for continuous improvement. The plan is documented and distributed to key stakeholders in the organization, and specific individuals are held accountable for implementing the action plan. (1.0 / 2.75)
4. Desktop Support is well integrated into the information technology function. Desktop Support acts as the "voice of the user" in IT, and is involved in major IT decisions and deliberations that affect end users. Desktop Support is alerted ahead of time so that they can prepare for major rollouts, or other changes in the IT environment. (2.0 / 3.03)
5. Desktop Support has SLAs that define the level of service to be delivered to users. The SLAs are documented, published, and communicated to key stakeholders in the organization. (3.0 / 3.16)
6. Desktop Support has OLAs (Operating Level Agreements) with other support groups in the organization (e.g., Level 1 support, field support, etc.). The OLAs clearly define the roles and responsibilities of each support group, and the different support groups abide by the terms of the OLAs. (3.0 / 2.66)
7. Desktop Support actively seeks to improve Level 1 Resolution Rates, Incident First Contact Resolution Rate, and key Service Levels by implementing processes, technologies, and training that facilitate these objectives. (2.0 / 2.05)
Summary Statistics: Total Score 15.5 / 18.9; Average Score 2.21 / 2.70
82

Human Resources: 13 Best Practices (scores: Company XYZ / Peer Group Average)
1. Desktop Support has a formalized and documented recruiting process for filling vacancies. Job requirements are well defined, and candidates are tested for both technical skills and customer service soft skills. (4.0 / 2.86)
2. New hires go through a formal training curriculum, including technical and customer service skills, and are required to pass a proficiency exam before independently handling customer incidents and service requests. (4.0 / 2.31)
3. Veteran technicians (more than 6 months of experience) have access to training opportunities to improve their skill set, job performance, and the overall performance of the Desktop Support organization. Veteran technicians are required to complete a minimum number of refresher training hours each year. (3.0 / 1.77)
4. Technician training classes and curricula are specifically designed to maximize customer satisfaction and the number of user incidents resolved on First Contact, and to minimize the Mean Time to Resolve. (3.0 / 2.58)
5. Individual Technician training plans are clearly defined, documented and regularly updated. (3.0 / 1.34)
6. Desktop Support has a formalized, documented Technician career path. Technicians are made aware of their career advancement opportunities, and are encouraged to proactively manage their careers. Technicians are coached at least once yearly on their career path and career advancement options. (4.0 / 1.88)
7. Technicians have the opportunity to advance their careers in at least two ways: by improving their technical and customer service skills, and by improving their management and supervisory skills. (4.0 / 2.00)
8. Technicians are coached by their supervisor in one-on-one sessions on a monthly basis. Logged tickets are reviewed, and the supervisor provides specific suggestions to each Technician on how to improve performance. (4.0 / 1.89)
9. Technicians have quantifiable performance goals (e.g., for First Contact Resolution, Customer Satisfaction, Number of Tickets Handled per Month, etc.), and are held accountable for achieving their goals on a monthly basis. (4.0 / 2.89)
10. Technicians are eligible for incentives and rewards based upon performance. These could include monetary incentives such as annual bonuses, or other incentives such as time off work, gift certificates, etc. (4.0 / 2.43)
11. Technician performance goals are linked to and aligned with the overall Desktop Support goals and performance. (4.0 / 3.05)
12. Technician Satisfaction surveys are conducted at least once per year, and the results of the survey are used to manage and improve Technician morale. (4.0 / 1.97)
13. Formal Performance reviews are scheduled and completed for all Desktop personnel at least once annually. (4.0 / 4.58)
Summary Statistics: Total Score 49.0 / 31.55; Average Score 3.77 / 2.43
83

Process: 16 Best Practices (scores: Company XYZ / Peer Group Average)
1. Desktop Support is part of an end-to-end support process, where Level 1 Support acts as the Single Point of Contact (SPOC) for user support. (N/A / 3.67)
2. Customers are offered a range of access options to Desktop Support, including live voice, voice mail, email, web chat, self-service, fax, and walk-in. (N/A / 3.80)
3. Ticket handling processes are standardized, documented, and available online. With few exceptions, the standards are followed by Desktop Support technicians. (1.5 / 2.42)
4. Escalation points are well defined and documented. These include other support groups (e.g., Level 3 support, Field Support, etc.), and individuals to whom tickets may be escalated. (1.5 / 2.99)
5. Rules for ticket escalation and transfer are well defined and documented. Technicians know when and where to transfer or route a ticket if they are unable to assist the user. (1.5 / 2.58)
6. Indirect contact channels, including Email, Voice Mail, and Faxes, are treated with the same priority as live phone calls and chat sessions. The work queues from these channels are integrated, or worked in parallel. (N/A / 2.72)
7. Incoming tickets are assigned a severity code based upon the number of users impacted, and the urgency of the incident. (5.0 / 3.25)
8. System alarms notify Desktop Support when a service level has been breached, whether at Desktop Support, or at another support level within the organization. (5.0 / 1.43)
9. Desktop Support has a formal, rapid notification and correction process that is activated when a service level has been breached, whether at Desktop Support, or at some other support level. (1.0 / 1.80)
10. Desktop Support has contingency plans to handle sudden, unexpected spikes in contact volume. These could include having supervisors and other indirect personnel handle incoming calls during a call spike. (2.8 / 2.39)
11. Desktop Support has contingency plans to handle both short and long term interruptions in service delivery. (1.0 / 2.67)
12. Desktop Support has a well defined service planning and readiness process that works closely with both internal engineering groups and vendors, and continues through product field testing and pre-release. This process enables Desktop Support to train for and prepare for supporting new products and services in the IT environment. (2.0 / 2.92)
13. Desktop Support has a formal Knowledge Management Process that facilitates the acquisition, qualification, review, approval, and distribution of knowledge into a Knowledgebase. (1.0 / 1.83)
14. Desktop Support has a mature workforce scheduling process that achieves high technician utilization, while maintaining reasonable service levels. (1.0 / 2.05)
15. Desktop Support has an effective, ongoing process for projecting future workload and staffing requirements. (1.0 / 1.99)
16. Desktop Support conducts periodic Root Cause Analysis (RCA) on the user contact profile to eliminate problems at their source. (1.0 / 2.27)
Summary Statistics: Total Score 25.3 / 40.78; Average Score 1.94 / 2.55
84

Technology: 10 Best Practices

Best Practices Defined (Company XYZ's Score / Peer Group Average):
1. Desktop Support has a full-featured ticket management system that facilitates effective ticket tracking, service level compliance, reporting, and root cause analysis. (5.0 / 4.37)
2. Desktop Support has a comprehensive knowledge management tool that facilitates effective knowledge capture and re-use. Desktop technicians are able to quickly find solutions to user problems by searching the knowledge base. Solutions for the vast majority of user problems and questions can be found in the knowledge base. (2.5 / 2.76)
3. The Desktop Support knowledgebase is used frequently by all Desktop Support technicians, and results in higher First Contact Resolution Rates and lower resolution times (MTTR). (2.0 / 2.31)
4. Desktop Support has an effective tool that allows technicians to proxy into a user's computer, take control of the computer, and remotely perform diagnostics and problem solving (e.g., Tivoli, Bomgar, GoTo Assist, etc.). The tool increases remote resolution rates, and reduces contact handle times. (5.0 / 3.78)
5. Desktop Support has an Automated Password Reset (APR) capability that dramatically reduces the number of password resets that must be performed manually by Desktop Support technicians. (1.0 / 2.48)
6. Desktop Support has an effective, integrated self-service portal that is available to all users. The self-service portal provides information, FAQ's, and solutions to problems that are more complex than simple password resets. The tool includes a direct link to Desktop Support technicians. Users are aware of the self-service portal, and usage rates are continuously increasing. (2.0 / 1.40)
7. The ticket management system can track and monitor the skill levels of Desktop Support technicians based on closed tickets by product and/or service code. (N/A / 3.23)
8. Desktop Support uses technology alerts/alarms to notify Desktop Support or perform self-healing scripts when a customer or system issue is proactively identified. (2.0 / 1.08)
9. Desktop Support has a multi-year plan for an integrated technology strategy. (2.0 / 1.37)
10. Desktop Support utilizes a capital investment justification process based on ROI, and reports on post-installation ROI as part of this process. (1.0 / 2.56)

Summary Statistics: Total Score 22.5 / 25.3; Average Score 2.50 / 2.53

Performance Measurement: 14 Best Practices

Best Practices Defined (Company XYZ's Score / Peer Group Average):
1. Cost per Ticket is measured, recorded, and tracked on an ongoing basis. (N/A / 2.51)
2. Customer Satisfaction is measured, recorded, and tracked on an ongoing basis. (5.0 / 4.30)
3. First Contact Resolution Rate for Incidents is measured, recorded, and tracked on an ongoing basis. (5.0 / 2.99)
4. First Level Resolution is measured, recorded, and tracked on an ongoing basis. (N/A / 1.19)
5. Technician Utilization is measured, recorded, and tracked on an ongoing basis. (4.5 / 1.54)
6. Technician Satisfaction is measured, recorded, and tracked. (1.0 / 2.63)
7. Desktop Support maintains a balanced scorecard that provides a single, all-inclusive measure of Desktop Support performance. (1.0 / 2.85)
8. Desktop Support tracks the number of tickets that could have been resolved by the Desktop Support organization at Level 1. (1.0 / 1.92)
9. Desktop Support conducts event-driven customer surveys whereby the results of customer satisfaction surveys can be linked back to a specific ticket, and to a specific technician handling the contact at Desktop Support. (1.0 / 3.16)
10. Desktop Support measures are used holistically and diagnostically to identify performance gaps in Desktop Support performance, and to prescribe actions that will improve performance. (2.0 / 2.83)
11. Desktop Support conducts benchmarking at least once per year. (1.0 / 2.59)
12. Desktop Support KPI's are used to establish "stretch" goals. (4.0 / 3.04)
13. Desktop Support understands key correlations and cause/effect relationships between the various KPI's. This enables Desktop Support to achieve desired performance goals by leveraging and driving the underlying "causal" metrics. (4.0 / 2.25)
14. Desktop Support tracks the Mean Time to Resolve (MTTR), and the percentage of tickets resolved within 24 hours. (1.0 / 2.78)

Summary Statistics: Total Score 30.5 / 36.6; Average Score 2.54 / 2.61

Communication: 12 Best Practices

Best Practices Defined (Company XYZ's Score / Peer Group Average):
1. Desktop Support maintains active communication with all stakeholder groups, including Desktop Support organization employees, IT managers, company managers outside of IT, and customers. (3.0 / 3.17)
2. Desktop Support has a formal communications schedule, and provides customized content for each stakeholder group. (1.0 / 2.51)
3. Desktop Support has established User Group Liaisons who represent different groups within the user community. Desktop Support meets periodically with the liaisons to learn about user concerns and questions, and to communicate Desktop Support services, plans, and initiatives. (1.0 / 1.23)
4. Desktop Support meets frequently with user groups, and holds "informational briefings" to educate users on supported products and services, hours of operation, training opportunities, tips for getting the most benefit from Desktop Support, etc. (1.0 / 1.86)
5. Desktop Support meets frequently with other IT managers, and is an integral part of key decisions made within IT. Desktop Support plays the role of "voice of the user" within IT. (1.0 / 2.32)
6. IT is required to deliver a "turnover package" to Desktop Support for all changes that will impact the user environment. This could include application updates, new desktop software, etc. The turnover package is designed to prepare Desktop Support to provide support to users in the affected areas. (2.5 / 2.18)
7. Customers are told what to expect on resolution time when their ticket is escalated or if a call-back is required. (1.0 / 2.93)
8. Desktop Support monitors all tickets, including those that are escalated, until ticket closure. (2.0 / 2.50)
9. The value added by Desktop Support is communicated to key managers in IT, and expectations are formally established regarding Desktop Support roles and responsibilities. (2.0 / 2.01)
10. Desktop Support tracks the number of training-related contacts it receives, and provides feedback to user groups within the organization on training areas that could help to reduce Desktop Support contact volumes. (2.5 / 1.73)
11. Desktop Support provides training aids to users that enable them to use Desktop Support more effectively. These could include log-in screens with the Desktop Support phone number, chat windows that can be clicked to initiate a real-time chat session, mouse pads imprinted with the Desktop Support IVR menu, etc. (1.0 / 1.88)
12. Desktop Support transmits outbound messages to users announcing major system and network outages, thereby alerting users about potential problems in the IT environment. These proactive messages help to reduce contact volumes during incidents that impact a large number of users. (N/A / 2.71)

Summary Statistics: Total Score 18.0 / 27.0; Average Score 1.64 / 2.25

Best Practices Process Assessment Summary

Best Practices Component (Number of Success Factors): Average Company XYZ Score / Average Peer Group Score
Strategy (7): 2.21 / 2.70
Human Resources (13): 3.77 / 2.43
Process (16): 1.94 / 2.55
Technology (10): 2.50 / 2.53
Performance Measurement (14): 2.54 / 2.61
Communication (12): 1.64 / 2.25
Total Score: 160.8 / 180.2

* An average score of 4.0 or above is required in each component of the Best Practices Model to achieve Best Practices Certification

Best Practices Process Assessment Summary

[Chart: Average Score by Best Practices component, Company XYZ vs. Peer Group; values as shown in the summary table above. Sample Report Only. Data is not accurate.]

Total Process Assessment Scores

[Chart: Overall Process Assessment Scores for all organizations in the benchmark. Sample Report Only. Data is not accurate.]

Key Statistics (Total Process Assessment Score): High 286.4; Average 180.2; Median 175.7; Low 83.1; Your Score 160.8; World-Class 288.0

Process Maturity vs. Scorecard Performance

[Chart: Balanced Score vs. Process Assessment Score for Company XYZ and the Global Database. Sample Report Only. Data is not accurate.]

Company XYZ Performance: Process Assessment Score 160.8; Balanced Score 75.2%
Benchmark reference points: World-Class = 288.0; Process Assessment Score Average = 180.2; Balanced Score Average = 58.6%

Interview Themes and Quotes
Company XYZ

MetricNet Conducted Seven Desktop Support Interviews

Desktop Support Interviewees:
Interviewee 1: Account Manager
Interviewee 2: Tower Lead
Interviewee 3: Service Desk Program Manager
Interviewee 4: Desktop Support Manager
Interviewee 5: Desktop Support Technician
Interviewee 6: Desktop Support Technician
Interviewee 7: IT Specialist

Key Themes from Desktop Support Interviews

Internal communication within the team is viewed as a key strength of desktop support.
The consensus among interviewees is that Desktop Support would benefit by having a knowledge base.
Most interviewees felt that Desktop Support's scope of work needs to be better defined.
Additionally, most interviewees felt that the support processes need to be better defined and documented.
Interviewees would like to see the training program improved, and most would like to receive more training.
The morale of Desktop Support is viewed as average.
Some interviewees indicated that it would help if they were able to use the same devices that they are being asked to troubleshoot (e.g., iPhones and iPads).

Representative Comments from Desktop Support Interviews

"Desktop Support does not have any processes or documentation. This makes it difficult, especially if the technician does not know how to fix a particular problem. The workload is not balanced, so the work is sometimes not completed on time."
"The training needs to be improved. There needs to be continuous training and constant measurement of the strengths and performance of the team members."
"The scope of support is not well defined. This needs to be documented."
"Having a knowledge base will improve the service that Desktop Support can provide. It will not only shorten the resolution time, but it will also improve the quality of work."
"Desktop Support needs leadership who can better manage and motivate the team. The morale of the team is average, at best. The team can be motivated by giving out incentives such as free lunches, or by recognizing technicians in front of the team for a job well done."
"Categorizing the tickets will make the work easier for the technicians."
"Desktop Support is developing a process where the tickets are resolved on the day that they are received. It is no longer acceptable for tickets to be resolved the following day."
"Desktop Support needs more training on iPads, iPhones, Windows 7, and Company XYZ docs. The technicians also need to be trained on the scope of work for new technologies that will be implemented."
"It is the responsibility of Desktop Support to change the way the users see the team. Not all users are happy with the service that is being provided."

Representative Comments from Desktop Support Interviews

"The technicians know a lot about the work but need stronger leadership."
"The technicians need to have the same level of knowledge so the team can provide better service. A knowledgebase needs to be created to get everyone working at the same level."
"It is a good idea to create specialists who can focus on particular issues. This will minimize the time spent in resolving issues and will improve the quality of the resolution."
"Desktop Support needs more technicians to provide better services, and to ensure that someone will be available in case an urgent issue comes up."
"The technicians need to be trained on how to treat VIPs. Learning how to deal with VIPs at Company XYZ will allow the technicians to work more effectively, and be more relaxed."
"Company XYZ is migrating to Apple technology, and Desktop Support needs more Apple devices to be able to address the issues better. Having these devices will allow the technicians to recreate the problem and solve it."
"Internal communication is a big strength. The technicians transfer knowledge within the team. If a technician has a question, whoever knows the answer will assist."
"Oftentimes Desktop Support is dealing with angry clients because the request has to go through Mexico first. By the time the issue gets passed on to the team, a couple of hours or a couple of days have already passed, and the client is already expecting a resolution."

Representative Comments from Desktop Support Interviews

"When a technician leaves the team, it reduces the team's output because the new technicians have to go through a learning curve."
"The headcount is now adequate, and has improved the personnel/work order incident ratio. The technicians are more relaxed and can focus more on the user."
"There is no training program. The new hire training is provided on the job by the senior technicians."
"Desktop Support has personal interaction with the users. The users can explain the issue a bit better in person, and the technicians can provide better troubleshooting because the technicians have seen the problem."
"Configuring devices, installing computers, and troubleshooting hardware issues are some of the strengths of Desktop Support."
"Desktop Support's vision is to avoid situations where users are unable to do their work. The main challenge is to resolve issues in a timely manner."
"The technicians help and support each other. Knowledge is shared with everyone."
"Desktop Support needs to identify the timeframe for resolving particular types of issues so that users have realistic expectations about how much time is needed to resolve an issue."
"Desktop Support would like to have a knowledge base so that technicians can easily find information that's needed. The training can be improved by creating manuals."

Representative Comments from Desktop Support Interviews

"Processes need to be created not just for the technicians to follow, but also for the users to know how their issue will be handled. Processes need to be followed to ensure that users can keep working."
"It is difficult not to be stressed when there is so much work. To reduce the stress, the technicians try to help each other and talk about non-work related topics."
"Some users think that Desktop Support is not providing good service."

Key Themes from Company XYZ Interviews

Company XYZ management is very disappointed with the quality of service provided by Company ABC; the service desk in particular is viewed as a failure.
Company XYZ believes that turnover for both the service desk and desktop support is excessive, and has been a huge problem.
Lack of training for the techs and agents at Company ABC contributes to turnover, and drives the perception that the service desk, in particular, is not very effective.
Company XYZ believes that the service desk agents are more interested in opening and closing tickets than in resolving the customer's problem.
Company ABC processes and procedures are not very mature, and are not well documented; knowledge management, for example, has not been developed at all by Company ABC.
Company XYZ interviewees believe that many bank employees have given up on the service desk, and call desktop support directly when they need help.

Representative Comments from Company XYZ Interviews

"In the beginning, Company ABC brought in some good people. But they all left, and we had to start all over again."
"Company ABC is not taking responsibility. They are not training their people properly, and they are not pushing ahead with improving their processes."
"Company ABC is not good at documentation. They came in telling us that they would implement ITIL processes, and that they had ITIL Centers of Excellence that they could rely upon, but none of that has happened."
"We had many good processes and procedures in place before Company ABC came in. But they have documented none of those. When we push Company ABC to produce documentation, they do so very reluctantly, and what they create is not very good."
"Every ticket closed by desktop support should be tied to a knowledge article. But they are not doing that. There is really no knowledge management at all."
"The service desk has a habit of putting tickets into pending status. They leave them there indefinitely, because if they open them up again they will breach the service level. The result is that over time hundreds of tickets are left in pending status."
"Desktop support is doing better than the service desk, but that's not saying much. I think it's fair to say that after three years our outsourcing experiment has failed."

Representative Comments from Company XYZ Interviews

"The service desk agents are very nice, and polite. They just don't know how to solve the customers' problems because they don't have the training or experience."
"I hear all the time that people from Company XYZ call the service desk and are told 'Please be patient, I am just learning.' Well, they have been at the bank for three years, so why are they just learning?"
"The calls just take too long, and always end in frustration. The agents in Mexico obviously don't have the skills to resolve the tickets."
"The agents don't have the knowledge to provide the support we need at the bank."
"It seems like the incentive at the service desk is to open and close tickets. But they don't seem to care about whether or not they solve the problem."
"Everyone at Company XYZ is dissatisfied with the service desk. I don't know how the customer satisfaction scores can be so high. I just don't believe it."
"My biggest concern is that Company ABC doesn't take responsibility. If anything improves, it's only because we push them to improve it. They should be taking ownership, but they don't."
"I don't know why the turnover at Company ABC is so high. Maybe they are not paid enough, or maybe they are not treated well, but the high turnover leads to all the other problems we are having with Company ABC."

Conclusions and Recommendations
Company XYZ

Notable Strengths

Desktop Support has a number of notable strengths:
Price metrics are excellent: top quartile performance for all price metrics.
Customer Satisfaction is good: second quartile performance.
Productivity metrics are good: Technician Utilization is the highest in the peer group. Good technician utilization is one of the key drivers behind Company XYZ's low Price per Ticket.
Company XYZ's overall performance on the benchmark was above average: Company XYZ placed 4th out of 19 desktop support groups in the benchmark, top quartile performance overall.

But Opportunities for Improvement Remain

Service levels were among the worst in the benchmark: all service levels are in the 4th quartile. This is a primary source of dissatisfaction for Company XYZ.
Several technician metrics are weak: Technician Turnover, Absenteeism, and Tenure are all in the 4th quartile.
Most interviewees expressed concerns that processes and procedures in desktop support have not been well defined or documented.
Several important desktop support metrics are not being tracked by Company XYZ: Incident First Visit Resolution Rate, and % Resolved Level 1 Capable.
There is a significant difference in perception between Company XYZ and Company ABC, as evidenced by the results of the process assessment: Company ABC believes they are doing an outstanding job, while Company XYZ gives Company ABC significantly lower marks for their performance.

Summary of Benchmarking Recommendations

1. Begin tracking and trending Incident First Visit Resolution Rate, and % Resolved Level 1 Capable.
2. Work on improving service levels, which were the lowest of all companies in the benchmark.
3. Provide access to the knowledge base, and require that desktop support work in conjunction with the service desk to keep the knowledge base current.
4. Develop improved documentation of processes and procedures for desktop support.
5. Provide additional training opportunities for desktop support technicians.
6. Consider adopting the MetricNet Desktop Support Balanced Scorecard, and update the scorecard monthly.
7. Develop an internal communication program in conjunction with the service desk to improve the visibility and reputation of Company XYZ's end-user support functions.
8. Discuss major points of difference between Company XYZ and Company ABC on the results of the process maturity assessment.

Some Suggested Performance Targets

Performance Metric: Current Performance / Target Performance
Incident First Visit Resolution Rate: N/A / 85.0%
% Resolved Level 1 Capable: N/A / 15.0%
Technician Job Satisfaction: 78.6% / 80.0%
Mean Time to Resolve Incidents (hrs): 77.3 / 12.0
% of Incidents Resolved in 24 hours: 30.1% / 70.0%
Mean Time to Fulfill Service Requests (Days): 6.4 / 3.0
% of Service Requests Fulfilled in 72 hours: 42.0% / 60.0%
Annual Technician Training Hours: N/A / 20
Balanced Score: 75.2% / 81.7%

Achieving the performance targets recommended above will result in a desktop support balanced score of 81.7%, and elevate Company XYZ to second place in the benchmark.

Incident First Visit Resolution vs. Customer Satisfaction

[Chart: Customer Satisfaction vs. First Visit Resolution Rate (Incidents). Sample Report Only. Data is not accurate.]

Technician Satisfaction vs. Customer Satisfaction

[Chart: Customer Satisfaction vs. Technician Satisfaction.]

Measuring Ticket Defects: % Resolved Level 1 Capable

[Diagram: KPI linkage map relating Price per Ticket, Customer Satisfaction, Technician Utilization, FCR (Incidents), and Service Levels (MTTR) to underlying drivers, including Techs/Total FTE's, Absenteeism/Turnover, % Resolved Level 1 Capable, Work/Travel Time, Scheduling Efficiency, Technician Satisfaction, Coaching, Career Path, and Training Hours.]

Technician Experience vs. Incident FCR

[Chart: Incident FCR vs. Technician Time on Job (months).]

Annual Training Hours vs. Technician Job Satisfaction

[Chart: Technician Job Satisfaction vs. Annual Technician Training Hours. Sample Report Only. Data is not accurate.]

Incident MTTR vs. Customer Satisfaction

[Chart: Customer Satisfaction vs. Incident MTTR (days).]

Consider Adopting the Desktop Support Balanced Scorecard *

Performance Metric: Metric Weighting; Worst Case; Best Case; Your Actual Performance; Metric Score; Balanced Score
Price per Incident: 15.0%; $301.55; $39.38; $62.13; 91.3%; 13.7%
Price per Service Request: 15.0%; $446.50; $51.71; $82.84; 92.1%; 13.8%
Customer Satisfaction: 25.0%; 66.5%; 98.5%; 92.0%; 79.7%; 19.9%
Incident First Visit Resolution Rate: 10.0%; 67.0%; 95.1%; 84.5%; 62.4%; 6.2%
Technician Utilization: 15.0%; 47.1%; 82.9%; 82.9%; 100.0%; 15.0%
% of Incidents Resolved in 24 hours: 5.0%; 30.1%; 86.7%; 30.1%; 0.0%; 0.0%
% of Service Requests Fulfilled in 72 hours: 5.0%; 33.5%; 80.7%; 42.0%; 18.0%; 0.9%
Technician Job Satisfaction: 10.0%; 69.9%; 85.4%; 78.6%; 56.3%; 5.6%
Total: 100.0%; Balanced Score 75.2%

* Peer group averages were used for Incident First Visit Resolution Rate, since Company XYZ does not track this metric.

Step 1: Eight critical performance metrics have been selected for the scorecard.
Step 2: Each metric has been weighted according to its relative importance.
Step 3: For each performance metric, the highest and lowest performance levels in the benchmark are recorded.
Step 4: Your actual performance for each metric is recorded.
Step 5: Your score for each metric is then calculated: (worst case - actual performance) / (worst case - best case) X 100.
Step 6: Your balanced score for each metric is calculated: metric score X weighting.
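The scorecard arithmetic in Steps 5 and 6 can be reproduced directly from the table. The Python sketch below recomputes Company XYZ's balanced score; the weights, ranges, and actual values are copied from the table above, while the variable names and structure are illustrative only.

    # Recomputes the balanced score from the table above (Steps 5 and 6).
    # Tuples: (metric, weight, worst case, best case, Company XYZ actual).
    metrics = [
        ("Price per Incident",                        0.15, 301.55, 39.38, 62.13),
        ("Price per Service Request",                 0.15, 446.50, 51.71, 82.84),
        ("Customer Satisfaction",                     0.25, 0.665,  0.985, 0.920),
        ("Incident First Visit Resolution Rate",      0.10, 0.670,  0.951, 0.845),
        ("Technician Utilization",                    0.15, 0.471,  0.829, 0.829),
        ("% of Incidents Resolved in 24 hours",       0.05, 0.301,  0.867, 0.301),
        ("% of Service Requests Fulfilled in 72 hrs", 0.05, 0.335,  0.807, 0.420),
        ("Technician Job Satisfaction",               0.10, 0.699,  0.854, 0.786),
    ]

    balanced_score = 0.0
    for name, weight, worst, best, actual in metrics:
        metric_score = (worst - actual) / (worst - best)   # Step 5
        balanced_score += weight * metric_score            # Step 6
        print(f"{name}: {metric_score:.1%}")

    print(f"Balanced score: {balanced_score:.1%}")   # 75.2% for Company XYZ

Note that the same (worst case - actual) / (worst case - best case) formula works for both cost-type metrics, where the worst case is the larger number, and quality-type metrics, where it is the smaller number.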

And Update the Scorecard Monthly

[Chart: Desktop Support Balanced Score by month, January through December, showing the monthly score and the 12-month average. Sample Report Only. Data is not accurate.]

Quality of Support Drives End-User Productivity

[Chart: Productive Hours Lost per Employee per Year by Performance Quartile (n = 60).]

Key Performance Indicators by Performance Quartile (1 = top, 4 = bottom):
Service Desk: Customer Satisfaction 93.5% / 84.5% / 76.1% / 69.3%; First Contact Resolution Rate 90.1% / 83.0% / 72.7% / 66.4%; Mean Time to Resolve (hours) 0.8 / 1.2 / 3.6 / 5.0
Desktop Support: Customer Satisfaction 94.4% / 89.2% / 79.0% / 71.7%; First Contact Resolution Rate 89.3% / 85.6% / 80.9% / 74.5%; Mean Time to Resolve (hours) 2.9 / 4.8 / 9.4 / 12.3
Average Productive Hours Lost per Employee per Year: 17.1 / 25.9 / 37.4 / 46.9

Internal Communication: Positioning Company XYZ for Future Success

[Diagram: quadrant chart of Perceived Value (vertical axis, lower to higher) vs. Actual Value (horizontal axis, lower to higher). Above the diagonal, Perceived Value > Actual Value; below it, Perceived Value < Actual Value.]

Where Does Desktop Support Operate?

[Diagram: Perceived Value vs. Actual Value quadrant. Operating where Perceived Value < Actual Value is a common (but dangerous) operating position.]

Operational Effectiveness First!

[Diagram: Perceived Value vs. Actual Value quadrant. Step #1 is Operational Effectiveness (building actual value); Step #2 is Brand Management (building perceived value). Sample Report Only. Data is not accurate.]

Closing the Perception vs. Reality Gap

[Diagram: Perceived Value vs. Actual Value quadrant contrasting "Where you Are" (Perceived Value < Actual Value) with "Where you Should Be", with the goal of closing the perception gap. Sample Report Only. Data is not accurate.]

Image Management: The Five W's

1. Who: Who are the Key Stakeholder Groups?
2. What: What are the Key Messages?
3. When: When are You Going to Communicate Them?
4. Where/How: Where/How do You Reach the Stakeholders?
5. Why: Why are We Doing This?

Key Success Factors in Desktop Support Image Management

Channels (Use All Available): Log-in messages, Newsletters, Reference Guides, Asset tags, Surveys, User liaisons
Timing (Frequent Contact): New employee orientation, At session log-in, During training, During the incident, At scheduled sessions
Messages (Multiple Messages): Services, Major initiatives, Performance Levels, FAQ's, Success Stories

Managing Expectations for Desktop Support

We've all heard the expression "Expectations Not Set are Expectations Not Met!" So, let's get serious about proactively managing expectations!


Support Drives Customer Satisfaction for All of IT

[Chart: % Saying Very Important, by factor contributing to IT customer satisfaction (n = 1,044; global large cap companies; survey type: multiple choice, 3 responses allowed per survey). Service Desk 84%, Desktop Support 47%, Network Outages 31%, VPN 29%, Training 22%, Enterprise Applications 19%, Desktop Software 8%.]

84% cited the service desk as a very important factor in their overall satisfaction with corporate IT.
47% cited desktop support as a very important factor in their overall satisfaction with corporate IT.

Company XYZ Internal Communication Summary

Managing the gap between perception and reality is fairly straightforward. It doesn't take a lot of time, or cost a lot of money. But it is critically important: the success of your support organization depends as much on your image as it does on your actual performance!

The benefits of effective Image Management include:
Customer loyalty and positive word-of-mouth referrals
Credibility, which leverages your ability to Get Things Done!
A positive image for IT overall
High levels of Customer Satisfaction

Company XYZ Must Improve Process Maturity Over Time

[Chart: Balanced Score vs. Process Assessment Score for Company XYZ and the Global Database. Sample Report Only. Data is not accurate.]

Company XYZ Performance: Process Assessment Score 160.8; Balanced Score 75.2%
Benchmark reference points: World-Class = 288.0; Process Assessment Score Average = 180.2; Balanced Score Average = 58.6%

Best Practice Focus Areas: Strategy

Strategy Best Practices Defined (Company XYZ's Score):
Desktop Support has OLA's (Operating Level Agreements) with other support groups in the organization (e.g., Level 1 support, field support, etc.). The OLA's clearly define the roles and responsibilities of each support group, and the different support groups abide by the terms of the OLA's. (3.0)
Desktop Support has SLA's that define the level of service to be delivered to users. The SLA's are documented, published, and communicated to key stakeholders in the organization. (3.0)
Desktop Support has a well-defined mission, vision, and strategy. The vision and strategy are well-documented, and communicated to key stakeholders in the organization. (2.5)
Desktop Support actively seeks to improve Level 1 Resolution Rates, Incident First Contact Resolution Rate, and key Service Levels by implementing processes, technologies, and training that facilitate these objectives. (2.0)
Desktop Support is well integrated into the information technology function. Desktop Support acts as the "voice of the user" in IT, and is involved in major IT decisions and deliberations that affect end users. Desktop Support is alerted ahead of time so that they can prepare for major rollouts, or other changes in the IT environment. (2.0)
Desktop Support has a published Service Catalog, including a Supported Products List, that is distributed and communicated to key stakeholders including end users. The Service Catalog is available on-line. (2.0)
Desktop Support has an action plan for continuous improvement. The plan is documented and distributed to key stakeholders in the organization, and specific individuals are held accountable for implementing the action plan. (1.0)

Best Practice Focus Areas: Human Resources

Human Resources Best Practices Defined (Company XYZ's Score):
Technicians have quantifiable performance goals (e.g., for First Contact Resolution, Customer Satisfaction, Number of Tickets Handled per Month, etc.), and are held accountable for achieving their goals on a monthly basis. (4.0)
Technician Satisfaction surveys are conducted at least once per year, and the results of the survey are used to manage and improve Technician morale. (4.0)
Technician performance goals are linked to and aligned with the overall Desktop Support goals and performance. (4.0)
Formal Performance reviews are scheduled and completed for all Desktop personnel at least once annually. (4.0)
New hires go through a formal training curriculum, including technical and customer service skills, and are required to pass a proficiency exam before independently handling customer incidents and service requests. (4.0)
Technicians have the opportunity to advance their careers in at least two ways: by improving their technical and customer service skills, and by improving their management and supervisory skills. (4.0)
Desktop Support has a formalized and documented recruiting process for filling vacancies. Job requirements are well defined, and candidates are tested for both technical skills and customer service soft skills. (4.0)
Technicians are coached by their supervisor in one-on-one sessions on a monthly basis. Logged tickets are reviewed, and the supervisor provides specific suggestions to each Technician on how to improve performance. (4.0)
Desktop Support has a formalized, documented Technician career path. Technicians are made aware of their career advancement opportunities, and are encouraged to proactively manage their careers. Technicians are coached at least once yearly on their career path and career advancement options. (4.0)
Technicians are eligible for incentives and rewards based upon performance. These could include monetary incentives such as annual bonuses, or other incentives such as time off work, gift certificates, etc. (4.0)
Individual Technician training plans are clearly defined, documented, and regularly updated. (3.0)
Technician training classes and curricula are specifically designed to maximize customer satisfaction and the number of user incidents resolved on First Contact, and to minimize the Mean Time to Resolve. (3.0)
Veteran technicians (more than 6 months of experience) have access to training opportunities to improve their skill set, job performance, and the overall performance of the Desktop Support organization. Veteran technicians are required to complete a minimum number of refresher training hours each year. (3.0)

Best Practice Focus Areas: Process

Process Best Practices Defined (Company XYZ's Score):
Incoming tickets are assigned a severity code based upon the number of users impacted and the urgency of the incident. (5.0)
System alarms notify Desktop Support when a service level has been breached, whether at Desktop Support or at another support level within the organization. (5.0)
Desktop Support has contingency plans to handle sudden, unexpected spikes in contact volume. These could include having supervisors and other indirect personnel handle incoming calls during a call spike. (2.8)
Desktop Support has a well-defined service planning and readiness process that works closely with both internal engineering groups and vendors, and continues through product field testing and pre-release. This process enables Desktop Support to train for and prepare for supporting new products and services in the IT environment. (2.0)
Ticket handling processes are standardized, documented, and available online. With few exceptions, the standards are followed by Desktop Support technicians. (1.5)
Escalation points are well defined and documented. These include other support groups (e.g., Level 3 support, Field Support, etc.), and individuals to whom tickets may be escalated. (1.5)
Rules for ticket escalation and transfer are well defined and documented. Technicians know when and where to transfer or route a ticket if they are unable to assist the user. (1.5)
Desktop Support has a mature workforce scheduling process that achieves high technician utilization while maintaining reasonable service levels. (1.0)
Desktop Support has a formal Knowledge Management Process that facilitates the acquisition, qualification, review, approval, and distribution of knowledge into a knowledgebase. (1.0)
Desktop Support has a formal, rapid notification and correction process that is activated when a service level has been breached, whether at Desktop Support or at some other support level. (1.0)
Desktop Support conducts periodic Root Cause Analysis (RCA) on the user contact profile to eliminate problems at their source. (1.0)
Desktop Support has contingency plans to handle both short- and long-term interruptions in service delivery. (1.0)
Desktop Support has an effective, ongoing process for projecting future workload and staffing requirements. (1.0)
Desktop Support is part of an end-to-end support process, where Level 1 Support acts as the Single Point of Contact (SPOC) for user support. (N/A)
Customers are offered a range of access options to Desktop Support, including live voice, voice mail, email, web chat, self-service, fax, and walk-in. (N/A)
Indirect contact channels, including email, voice mail, and faxes, are treated with the same priority as live phone calls and chat sessions. The work queues from these channels are integrated, or worked in parallel. (N/A)

Best Practice Focus Areas: Technology

Technology Best Practices Defined (Company XYZ's Score):
Desktop Support has an effective tool that allows technicians to proxy into a user's computer, take control of the computer, and remotely perform diagnostics and problem solving (e.g., Tivoli, Bomgar, GoTo Assist, etc.). The tool increases remote resolution rates, and reduces contact handle times. (5.0)
Desktop Support has a full-featured ticket management system that facilitates effective ticket tracking, service level compliance, reporting, and root cause analysis. (5.0)
Desktop Support has a comprehensive knowledge management tool that facilitates effective knowledge capture and re-use. Desktop technicians are able to quickly find solutions to user problems by searching the knowledge base. Solutions for the vast majority of user problems and questions can be found in the knowledge base. (2.5)
Desktop Support uses technology alerts/alarms to notify Desktop Support or perform self-healing scripts when a customer or system issue is proactively identified. (2.0)
The Desktop Support knowledgebase is used frequently by all Desktop Support technicians, and results in higher First Contact Resolution Rates and lower resolution times (MTTR). (2.0)
Desktop Support has an effective, integrated self-service portal that is available to all users. The self-service portal provides information, FAQ's, and solutions to problems that are more complex than simple password resets. The tool includes a direct link to Desktop Support technicians. Users are aware of the self-service portal, and usage rates are continuously increasing. (2.0)
Desktop Support has a multi-year plan for an integrated technology strategy. (2.0)
Desktop Support utilizes a capital investment justification process based on ROI, and reports on post-installation ROI as part of this process. (1.0)
Desktop Support has an Automated Password Reset (APR) capability that dramatically reduces the number of password resets that must be performed manually by Desktop Support technicians. (1.0)
The ticket management system can track and monitor the skill levels of Desktop Support technicians based on closed tickets by product and/or service code. (N/A)

Best Practice Focus Areas: Performance Measurement

Performance Measurement Best Practices Defined (Company XYZ's Score):
First Contact Resolution Rate for Incidents is measured, recorded, and tracked on an ongoing basis. (5.0)
Customer Satisfaction is measured, recorded, and tracked on an ongoing basis. (5.0)
Technician Utilization is measured, recorded, and tracked on an ongoing basis. (4.5)
Desktop Support understands key correlations and cause/effect relationships between the various KPI's. This enables Desktop Support to achieve desired performance goals by leveraging and driving the underlying "causal" metrics. (4.0)
Desktop Support KPI's are used to establish "stretch" goals. (4.0)
Desktop Support measures are used holistically and diagnostically to identify performance gaps in Desktop Support performance, and to prescribe actions that will improve performance. (2.0)
Desktop Support conducts event-driven customer surveys whereby the results of customer satisfaction surveys can be linked back to a specific ticket, and to a specific technician handling the contact at Desktop Support. (1.0)
Technician Satisfaction is measured, recorded, and tracked. (1.0)
Desktop Support tracks the Mean Time to Resolve (MTTR), and the percentage of tickets resolved within 24 hours. (1.0)
Desktop Support tracks the number of tickets that could have been resolved by the Desktop Support organization at Level 1. (1.0)
Desktop Support conducts benchmarking at least once per year. (1.0)
Desktop Support maintains a balanced scorecard that provides a single, all-inclusive measure of Desktop Support performance. (1.0)
Cost per Ticket is measured, recorded, and tracked on an ongoing basis. (N/A)
First Level Resolution is measured, recorded, and tracked on an ongoing basis. (N/A)

Best Practice Focus Areas: Communication

Communication Best Practices Defined (Company XYZ's Score):
Desktop Support maintains active communication with all stakeholder groups, including Desktop Support organization employees, IT managers, company managers outside of IT, and customers. (3.0)
Desktop Support tracks the number of training-related contacts it receives, and provides feedback to user groups within the organization on training areas that could help to reduce Desktop Support contact volumes. (2.5)
IT is required to deliver a "turnover package" to Desktop Support for all changes that will impact the user environment. This could include application updates, new desktop software, etc. The turnover package is designed to prepare Desktop Support to provide support to users in the affected areas. (2.5)
Desktop Support monitors all tickets, including those that are escalated, until ticket closure. (2.0)
The value added by Desktop Support is communicated to key managers in IT, and expectations are formally established regarding Desktop Support roles and responsibilities. (2.0)
Desktop Support has a formal communications schedule, and provides customized content for each stakeholder group. (1.0)
Customers are told what to expect on resolution time when their ticket is escalated or if a call-back is required. (1.0)
Desktop Support meets frequently with other IT managers, and is an integral part of key decisions made within IT. Desktop Support plays the role of "voice of the user" within IT. (1.0)
Desktop Support has established User Group Liaisons who represent different groups within the user community. Desktop Support meets periodically with the liaisons to learn about user concerns and questions, and to communicate Desktop Support services, plans, and initiatives. (1.0)
Desktop Support meets frequently with user groups, and holds "informational briefings" to educate users on supported products and services, hours of operation, training opportunities, tips for getting the most benefit from Desktop Support, etc. (1.0)
Desktop Support provides training aids to users that enable them to use Desktop Support more effectively. These could include log-in screens with the Desktop Support phone number, chat windows that can be clicked to initiate a real-time chat session, mouse pads imprinted with the Desktop Support IVR menu, etc. (1.0)
Desktop Support transmits outbound messages to users announcing major system and network outages, thereby alerting users about potential problems in the IT environment. These proactive messages help to reduce contact volumes during incidents that impact a large number of users. (N/A)

Some Suggested Performance Targets

Performance Metric: Current Performance / Target Performance
Incident First Visit Resolution Rate: N/A / 85.0%
% Resolved Level 1 Capable: N/A / 15.0%
Technician Job Satisfaction: 78.6% / 80.0%
Mean Time to Resolve Incidents (hrs): 77.3 / 12.0
% of Incidents Resolved in 24 hours: 30.1% / 70.0%
Mean Time to Fulfill Service Requests (Days): 6.4 / 3.0
% of Service Requests Fulfilled in 72 hours: 42.0% / 60.0%
Annual Technician Training Hours: N/A / 20
Balanced Score: 75.2% / 81.7%

Achieving the performance targets recommended above will result in a desktop support balanced score of 81.7%, and elevate Company XYZ to second place in the benchmark.

Detailed Benchmarking Comparisons
Company XYZ

Price Metrics
Company XYZ

Price Metrics: Price per Ticket

Definition
Price per Ticket is the amount paid to the outsourcer for each Desktop Support ticket handled. It is typically calculated by dividing the annual fee paid to the outsourcer by the annual Desktop Support ticket volume. Ticket volume includes both incidents and service requests.

Why it's Important
Price per Ticket is one of the most important Desktop Support metrics. It is a measure of contract efficiency and effectiveness with your outsourcer. A higher than average Price per Ticket is not necessarily a bad thing, particularly if accompanied by higher than average quality levels. Conversely, a low Price per Ticket is not necessarily good, particularly if the low price is achieved by sacrificing Customer Satisfaction or service levels. Every outsourced Desktop Support organization should track and trend Price per Ticket on an ongoing basis.

Key Correlations
Price per Ticket is strongly correlated with the following metrics: Price per Incident, Price per Service Request, Incident First Visit Resolution Rate, Average Incident Work Time, Average Service Request Work Time, Average Travel Time per Ticket.
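Since the calculation is a single division, a brief worked sketch may be useful. The fee and volume figures below are hypothetical, chosen only to illustrate the arithmetic; they are not Company XYZ data.

    # Hypothetical inputs for illustration only (not Company XYZ data).
    annual_outsourcer_fee   = 1_200_000.00  # total annual fee paid to the outsourcer
    annual_incidents        = 9_000
    annual_service_requests = 6_000

    # Ticket volume includes both incidents and service requests.
    annual_tickets   = annual_incidents + annual_service_requests
    price_per_ticket = annual_outsourcer_fee / annual_tickets
    print(f"Price per Ticket: ${price_per_ticket:.2f}")  # $80.00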

Price Metrics: Price per Ticket

[Chart: Price per Ticket for each desktop support organization in the benchmark. Sample Report Only. Data is not accurate.]

Key Statistics (Price per Ticket): High $325.06; Average $124.74; Median $99.69; Low $50.49; Company XYZ $79.98

Price Metrics: Price per Incident

Definition
Price per Incident is the amount paid to the outsourcer for each Desktop Support incident handled. It is typically calculated by dividing the annual fee paid to the outsourcer (for incidents only) by the annual Desktop Support incident volume.

Why it's Important
Price per Incident is one of the most important Desktop Support metrics. It is one of the key components of Price per Ticket, the other being Price per Service Request. A higher than average Price per Incident is not necessarily a bad thing, particularly if accompanied by higher than average quality levels. Conversely, a low Price per Incident is not necessarily good, particularly if the low price is achieved by sacrificing quality of service. Every Desktop Support organization should track and trend Price per Incident on a monthly basis.

Key Correlations
Price per Incident is strongly correlated with the following metrics: Price per Ticket, Incident First Visit Resolution Rate, Average Incident Work Time, Average Travel Time per Ticket.

Price Metrics: Price per Incident

[Chart: Price per Incident for each desktop support organization in the benchmark. Sample Report Only. Data is not accurate.]

Key Statistics (Price per Incident): High $301.55; Average $109.24; Median $96.35; Low $39.38; Company XYZ $62.13

Price Metrics: Price per Service Request

Definition
Price per Service Request is the amount paid to the outsourcer for each Desktop Support service request handled. It is typically calculated by dividing the annual fee paid to the outsourcer (for service requests only) by the annual Desktop Support service request volume.

Why it's Important
Price per Service Request is one of the most important Desktop Support metrics. It is one of the key components of Price per Ticket, the other being Price per Incident. A higher than average Price per Service Request is not necessarily a bad thing, particularly if accompanied by higher than average quality levels. Conversely, a low Price per Service Request is not necessarily good, particularly if the low price is achieved by sacrificing quality of service. Every Desktop Support organization should track and trend Price per Service Request on a monthly basis.

Key Correlations
Price per Service Request is strongly correlated with the following metrics: Price per Ticket, Average Service Request Work Time, Average Travel Time per Ticket.

Price Metrics: Price per Service Request

[Chart: Price per Service Request for each desktop support organization in the benchmark. Sample Report Only. Data is not accurate.]

Key Statistics (Price per Service Request): High $446.50; Average $161.32; Median $134.55; Low $51.71; Company XYZ $82.84

Productivity Metrics
Company XYZ

Productivity Metrics: Tickets per Technician per Month

Definition
Tickets per Technician per Month is the average monthly ticket volume divided by the average Full Time Equivalent (FTE) technician headcount. Ticket volume includes both incidents and service requests. Technician headcount is the average FTE number of employees and contractors handling Desktop Support tickets.

Why it's Important
Tickets per Technician per Month is an important indicator of technician productivity. A low number could indicate low Technician Utilization, poor scheduling efficiency or schedule adherence, or a higher than average Ticket Handle Time. Conversely, a high number of technician-handled tickets may indicate high Technician Utilization, good scheduling efficiency and schedule adherence, or a lower than average Ticket Handle Time. Every Desktop Support group should track and trend this metric on a monthly basis.

Key Correlations
Tickets per Technician per Month is strongly correlated with the following metrics: Technician Utilization, Average Incident Work Time, Average Service Request Work Time, Average Travel Time per Ticket.
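A worked sketch of the calculation follows. The monthly volumes and headcount are hypothetical assumptions, chosen so that the result matches the Company XYZ figure reported on the next page; they are not actual benchmark inputs.

    # Hypothetical inputs, chosen so the result matches the Company XYZ
    # figure on the next page; they are not actual benchmark data.
    monthly_incidents        = 1_050
    monthly_service_requests = 680
    avg_fte_technicians      = 12.5   # employees + contractors, as FTEs

    tickets_per_tech_per_month = (
        (monthly_incidents + monthly_service_requests) / avg_fte_technicians
    )
    print(f"Tickets per Technician per Month: {tickets_per_tech_per_month:.1f}")  # 138.4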

Productivity Metrics: Tickets per Technician per Month

[Chart: Tickets per Technician per Month for each desktop support organization in the benchmark. Sample Report Only. Data is not accurate.]

Key Statistics (Tickets per Technician per Month): High 183.9; Average 99.1; Median 93.5; Low 32.3; Company XYZ 138.4

Productivity Metrics: Incidents per Technician per Month

Definition
Incidents per Technician per Month is the average monthly incident volume divided by the average Full Time Equivalent (FTE) technician headcount. Technician headcount is the average FTE number of employees and contractors handling Desktop Support tickets.

Why it's Important
Incidents per Technician per Month is an important indicator of technician productivity. A low number could indicate low Technician Utilization, poor scheduling efficiency or schedule adherence, or a higher than average Incident Handle Time. Conversely, a high number of technician-handled incidents may indicate high Technician Utilization, good scheduling efficiency and schedule adherence, or a lower than average Incident Handle Time. Every Desktop Support group should track and trend this metric on a monthly basis.

Key Correlations
Incidents per Technician per Month is strongly correlated with the following metrics: Technician Utilization, Average Incident Work Time, Average Travel Time per Ticket, Incidents as a % of Total Ticket Volume.

Productivity Metrics: Incidents per Technician per Month

[Chart: Incidents per Technician per Month for each desktop support organization in the benchmark. Sample Report Only. Data is not accurate.]

Key Statistics (Incidents per Technician per Month): High 144.8; Average 69.8; Median 56.9; Low 19.2; Company XYZ 19.2

Productivity Metrics: Service Requests per Tech per Month

Definition
Service Requests per Technician per Month is the average monthly service request volume divided by the average Full Time Equivalent (FTE) technician headcount. Technician headcount is the average FTE number of employees and contractors handling Desktop Support tickets.

Why it's Important
Service Requests per Technician per Month is an important indicator of technician productivity. A low number could indicate low Technician Utilization, poor scheduling efficiency or schedule adherence, or a higher than average Service Request Handle Time. Conversely, a high number of technician-handled service requests may indicate high Technician Utilization, good scheduling efficiency and schedule adherence, or a lower than average Service Request Handle Time. Every Desktop Support group should track and trend this metric on a monthly basis.

Key Correlations
Service Requests per Technician per Month is strongly correlated with the following metrics: Technician Utilization, Average Service Request Work Time, Average Travel Time per Ticket, Incidents as a % of Total Ticket Volume.

Productivity Metrics: Service Requests per Tech per Month

[Chart: Service Requests per Technician per Month for each desktop support organization in the benchmark. Sample Report Only. Data is not accurate.]

Key Statistics (Service Requests per Technician per Month): High 119.3; Average 29.2; Median 17.8; Low 7.1; Company XYZ 119.3

Productivity Metrics: Technicians as a Percent of Total FTE's

Definition
This metric is the Full Time Equivalent (FTE) Technician headcount divided by the total Desktop Support headcount. It is expressed as a percentage, and represents the percentage of total Desktop Support personnel who are engaged in direct customer service activities.

Why it's Important
Technician headcount as a percent of total Desktop Support headcount is an important measure of management and overhead efficiency. Since non-technicians include both management and non-management personnel (e.g., supervisors and team leads, QA/QC, trainers, etc.), this metric is not a pure measure of management span of control. It is, however, a more useful metric than management span of control, because the denominator of this ratio takes into account all personnel that are not directly engaged in customer service activities.

Key Correlations
Technicians as a % of Total Headcount is strongly correlated with the following metrics: Cost per Inbound Contact, Cost per Minute of Inbound Handle Time.

Productivity Metrics: Technicians as a Percent of Total FTE's

[Chart: Technicians as a Percent of Total FTE's for each desktop support organization in the benchmark. Sample Report Only. Data is not accurate.]

Key Statistics (Technicians as a Percent of Total FTE's): High 97.1%; Average 88.9%; Median 88.1%; Low 79.6%; Company XYZ 87.5%

Productivity Metrics: Technician Utilization

Definition
Technician Utilization is the average time that a Technician spends handling both inbound and outbound contacts per month, divided by the number of work hours in a given month. The calculation for Technician Utilization is shown on the next page.

Why it's Important
Technician Utilization is the single most important indicator of Technician productivity. It measures the percentage of time that the average Technician is in work mode, and is independent of Contact Handle Time or call complexity.

Key Correlations
Technician Utilization is strongly correlated with the following metrics: Inbound Contacts per Technician per Month, Cost per Inbound Contact, Cost per Minute of Inbound Handle Time, Technician Occupancy, Average Speed of Answer.

Technician Utilization Defined

Technician Utilization = ((Average number of inbound calls handled by a tech in a month) X (Average inbound handle time in minutes) + (Average number of outbound calls handled by a tech in a month) X (Average outbound handle time in minutes)) / ((Average number of days worked in a month) X (Number of work hours in a day) X (60 minutes/hr))

Tech Utilization is a measure of actual time worked by techs in a month, divided by total time at work during the month.
It takes into account both inbound and outbound contacts handled by the techs.
But it does not make adjustments for sick days, holidays, training time, project time, or idle time.

Example: Desktop Support Technician Utilization

Inbound Contacts per Technician per Month = 375
Outbound Contacts per Technician per Month = 225
Average Inbound Contact Handle Time = 10 minutes
Average Outbound Contact Handle Time = 5 minutes

Technician Utilization = ((375 Inbound Contacts per Month) X (10 minutes) + (225 Outbound Contacts per Month) X (5 minutes)) / ((21.5 working days per month) X (7.5 work hours per day) X (60 minutes/hr)) = 50.4%
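The same calculation can be packaged as a small Python function. This is a sketch of the formula as defined above; the 21.5 working days per month and 7.5-hour workday defaults are the assumptions used in the worked example.

    def technician_utilization(inbound_contacts: float, inbound_aht_min: float,
                               outbound_contacts: float, outbound_aht_min: float,
                               days_per_month: float = 21.5,
                               hours_per_day: float = 7.5) -> float:
        """Monthly handle time divided by monthly time at work, per the
        formula defined above."""
        handled_minutes = (inbound_contacts * inbound_aht_min
                           + outbound_contacts * outbound_aht_min)
        available_minutes = days_per_month * hours_per_day * 60
        return handled_minutes / available_minutes

    # Reproduces the worked example: (375 x 10 + 225 x 5) / (21.5 x 7.5 x 60)
    print(f"{technician_utilization(375, 10, 225, 5):.1%}")  # 50.4%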

Productivity Metrics: Technician Utilization

[Chart: Technician Utilization for each desktop support organization in the benchmark. Sample Report Only. Data is not accurate.]

Key Statistics (Technician Utilization): High 82.9%; Average 58.3%; Median 55.5%; Low 47.1%; Company XYZ 82.9%

Service Level Metrics
Company XYZ

Service Level Metrics: Mean Time to Resolve Incidents

Definition
The Mean Time to Resolve Incidents is the average number of working hours that elapse from the time an incident is reported until the time the incident is closed. Non-working hours are not included in the calculation. If, for example, an incident is reported at 3:00 pm on a Tuesday, and the ticket is closed at 3:00 pm on Wednesday, the MTTR will be 8 hours, not 24 hours.

Why it's Important
Service levels, including the MTTR for incidents and service requests, are a key driver of customer satisfaction with Desktop Support.

Key Correlations
Mean Time to Resolve Incidents is strongly correlated with the following metrics: Customer Satisfaction, Average Incident Work Time, Average Travel Time per Ticket, % of Incidents Resolved in 8 Hours.
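Computing elapsed working hours is the only subtle part of this metric. Below is a sketch of one way to implement it in Python; the Monday-to-Friday week and the 9:00 am to 5:00 pm workday window (eight working hours per day, consistent with the example above) are assumptions, since the definition does not specify the workday boundaries.

    from datetime import datetime, timedelta

    WORK_START, WORK_END = 9, 17   # assumed 9:00 am - 5:00 pm workday (8 hours)

    def working_hours(reported: datetime, closed: datetime) -> float:
        """Working hours elapsed between two timestamps, skipping weekends
        and hours outside the workday window."""
        total = 0.0
        day = reported.replace(hour=0, minute=0, second=0, microsecond=0)
        while day.date() <= closed.date():
            if day.weekday() < 5:   # Monday=0 ... Friday=4
                window_start = day + timedelta(hours=WORK_START)
                window_end = day + timedelta(hours=WORK_END)
                overlap = min(closed, window_end) - max(reported, window_start)
                total += max(overlap.total_seconds(), 0) / 3600
            day += timedelta(days=1)
        return total

    # The example from the definition: reported Tue 3:00 pm, closed Wed 3:00 pm
    # -> 8.0 working hours, not 24 (2024-01-02 is a Tuesday).
    print(working_hours(datetime(2024, 1, 2, 15), datetime(2024, 1, 3, 15)))  # 8.0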

Service Level Metrics: Mean Time to Resolve Incidents

[Chart: Mean Time to Resolve Incidents (hrs) for each desktop support organization in the benchmark. Sample Report Only. Data is not accurate.]

Key Statistics (Mean Time to Resolve Incidents, hrs): High 77.29; Average 11.79; Median 6.30; Low 3.00; Company XYZ 77.29

Service Level Metrics: % of Incidents Resolved in 24 Hours

Definition
The % of Incidents Resolved in 24 Hours is fairly self-explanatory. Non-working days are not included in the calculation. So, for example, an incident that is reported at 1:00 pm on a Friday will be resolved in 24 hours if the ticket is closed by 1:00 pm on the following Monday.

Why it's Important
Service levels, including the % of Incidents Resolved in 24 Hours, are a key driver of customer satisfaction with Desktop Support.

Key Correlations
% of Incidents Resolved in 24 Hours is strongly correlated with the following metrics: Customer Satisfaction, Average Incident Work Time, Average Travel Time per Ticket, Mean Time to Resolve Incidents.
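One way to implement this rule is to set the deadline 24 clock hours after the report time, and then push it out by one day for each intervening non-working day. The sketch below assumes a Monday-to-Friday workweek with no holiday calendar, and that incidents are reported during working days.

    from datetime import datetime, timedelta

    def resolved_in_24_hours(reported: datetime, closed: datetime) -> bool:
        """True if 'closed' falls within 24 clock hours of 'reported',
        after pushing the deadline past any intervening weekend days."""
        deadline = reported + timedelta(hours=24)
        day = reported
        while day.date() < deadline.date():
            day += timedelta(days=1)
            if day.weekday() >= 5:          # Saturday=5, Sunday=6
                deadline += timedelta(days=1)
        return closed <= deadline

    # The example from the definition: reported Fri 1:00 pm, closed Mon 1:00 pm
    # counts as resolved in 24 hours (2024-01-05 is a Friday).
    print(resolved_in_24_hours(datetime(2024, 1, 5, 13), datetime(2024, 1, 8, 13)))  # True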

Service Level Metrics: % of Incidents Resolved in 24 Hours (Sample Report Only. Data is not accurate.)
[Chart: % of Incidents Resolved in 24 Hours across the Desktop Support peer group]
Key Statistics: High 86.7%, Average 69.9%, Median 75.3%, Low 30.1%, Company XYZ 30.1%

Service Level Metrics: Mean Time to Fulfill Service Requests
Definition: The Mean Time to Fulfill Service Requests (MTTF) is the average number of working days that elapse from the time a service request is logged until the time the service request is completed. Non-working days are not included in the calculation. If, for example, a service request is logged at 3:00 pm on a Friday and the ticket is closed at 3:00 pm on the following Tuesday, the MTTF will be 2 working days, not 4 calendar days.
Why it's Important: Service levels, including the MTTF for service requests and incidents, are a key driver of customer satisfaction with Desktop Support.
Key Correlations: Mean Time to Fulfill Service Requests is strongly correlated with the following metrics:
- Customer Satisfaction
- Average Service Request Work Time
- Average Travel Time per Ticket
- % of Service Requests Fulfilled in 24 Hours
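Because MTTF is measured in working days rather than hours, the calculation is simpler; NumPy's business-day counter captures the convention directly. A minimal sketch, assuming a Monday-Friday workweek and that requests are logged and closed at roughly the same time of day:

    import numpy as np

    # Working days elapsed, Monday-Friday (NumPy's default business calendar).
    def working_days_between(logged_date, closed_date):
        return np.busday_count(logged_date, closed_date)

    # Example from the definition: logged on a Friday, closed the
    # following Tuesday -> 2 working days, not 4 calendar days.
    print(working_days_between("2024-01-05", "2024-01-09"))  # 2

Averaging this over all service requests completed in the period yields the MTTF.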

Service Level Metrics: Mean Time to Fulfill Service Requests (Sample Report Only. Data is not accurate.)
[Chart: Mean Time to Fulfill Service Requests (Days) across the Desktop Support peer group]
Key Statistics: High 6.40, Average 3.04, Median 3.10, Low 1.20, Company XYZ 6.40

Service Level Metrics: % of Service Requests Fulfilled in 72 Hours
Definition: The % of Service Requests Fulfilled in 72 Hours is fairly self-explanatory. Non-working days are not included in the calculation. So, for example, a service request that is logged at 1:00 pm on a Friday will count as fulfilled in 72 hours if the request is fulfilled by 1:00 pm on the following Wednesday.
Why it's Important: Service levels, including the % of Service Requests Fulfilled in 72 Hours, are a key driver of customer satisfaction with Desktop Support.
Key Correlations: % of Service Requests Fulfilled in 72 Hours is strongly correlated with the following metrics:
- Customer Satisfaction
- Average Service Request Work Time
- Average Travel Time per Ticket
- Mean Time to Fulfill Service Requests

Service Level Metrics: % of Service Requests Fulfilled in 72 Hours (Sample Report Only. Data is not accurate.)
[Chart: % of Service Requests Fulfilled in 72 Hours across the Desktop Support peer group]
Key Statistics: High 80.7%, Average 56.7%, Median 51.8%, Low 33.5%, Company XYZ 42.0%

Quality Metrics

Quality Metrics: Customer Satisfaction
Definition: Customer Satisfaction is the percentage of customers who are either satisfied or very satisfied with their Desktop Support experience. This metric can be captured in a number of ways, including automated after-call IVR surveys, follow-up outbound (live technician) calls, email surveys, postal surveys, etc.
Why it's Important: Customer Satisfaction is the single most important measure of Desktop Support quality. Any successful Desktop Support group will have consistently high Customer Satisfaction ratings. Some Desktop Support managers are under the impression that a low Cost per Inbound Contact may justify a lower level of Customer Satisfaction. But this is not true: MetricNet's research shows that even Desktop Support groups with a very low Cost per Inbound Contact can achieve consistently high Customer Satisfaction ratings.
Key Correlations: Customer Satisfaction is strongly correlated with the following metrics:
- First Contact Resolution Rate
- Call Quality
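Whatever the survey channel, the metric itself reduces to a top-two-box percentage. A minimal sketch, assuming a conventional 5-point scale (an assumption; the benchmark does not prescribe a scale) where 4 = satisfied and 5 = very satisfied:

    # Hypothetical survey responses on a 5-point scale.
    responses = [5, 4, 3, 5, 4, 4, 2, 5, 4, 5]

    # Customer Satisfaction = share of respondents answering 4 or 5.
    csat = sum(1 for r in responses if r >= 4) / len(responses)
    print(f"Customer Satisfaction: {csat:.1%}")  # 80.0%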

Quality Metrics: Customer Satisfaction (Sample Report Only. Data is not accurate.)
[Chart: Customer Satisfaction across the Desktop Support peer group]
Key Statistics: High 98.5%, Average 84.9%, Median 86.6%, Low 66.5%, Company XYZ 92.0%

Quality Metrics: Incident First Visit Resolution Rate
Definition: Incident First Visit Resolution Rate is the percentage of incidents that are resolved on the first visit to the customer. Incidents that require a second visit, or are otherwise unresolved on the first visit for any reason, do not qualify for Incident First Visit Resolution.
Why it's Important: Incident First Visit Resolution Rate is one of the biggest drivers of Customer Satisfaction. A high Incident First Visit Resolution Rate is almost always associated with high levels of Customer Satisfaction. Desktop Support groups that emphasize training and have good technology tools generally enjoy a higher-than-average Incident First Visit Resolution Rate.
Key Correlations: Incident First Visit Resolution Rate is strongly correlated with the following metrics:
- Customer Satisfaction
- New Technician Training Hours
- Annual Technician Training Hours
- Average Incident Work Time

Quality Metrics: Incident First Visit Resolution Rate (Sample Report Only. Data is not accurate.)
[Chart: Incident First Visit Resolution Rate across the Desktop Support peer group]
Key Statistics: High 95.1%, Average 84.5%, Median 85.3%, Low 67.0%, Company XYZ N/A

Quality Metrics: % Resolved Level 1 Capable
Definition: % Resolved Level 1 Capable is the percentage of tickets resolved by Desktop Support that could have been resolved by the Level 1 Service Desk. This metric is generally tracked either by sampling desktop tickets after the fact to determine the percentage that could have been resolved at Level 1, or by having the Desktop Support technician check a box on the trouble ticket at closure to indicate that the ticket could have been resolved at Level 1.
Why it's Important: Tickets resolved by Desktop Support that could have been resolved by the Level 1 Service Desk represent defects. Since the cost of resolution is typically much higher at Desktop Support than it is for Level 1 support, every ticket that is unnecessarily escalated from Level 1 to Desktop Support incurs unnecessary cost (see the sketch below). To minimize the TCO for end-user support, the % Resolved Level 1 Capable should be as low as possible.
Key Correlations: % Resolved Level 1 Capable is strongly correlated with the following metrics:
- Average Incident Work Time
- Tickets per Seat per Month
- Incidents per Seat per Month
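The sampling approach described above, and the cost logic behind it, can be sketched in a few lines of Python. The ticket counts and unit costs below are hypothetical placeholders, not MetricNet benchmarks.

    # Estimate % Resolved Level 1 Capable from an after-the-fact sample.
    sampled_tickets = 200    # desktop tickets reviewed (hypothetical)
    level1_capable = 36      # judged resolvable by the Level 1 Service Desk
    pct_l1_capable = level1_capable / sampled_tickets
    print(f"% Resolved Level 1 Capable: {pct_l1_capable:.1%}")  # 18.0%

    # Each unnecessary escalation costs the difference between the two
    # resolution channels (unit costs are assumed for illustration).
    monthly_desktop_tickets = 1500
    cost_desktop, cost_level1 = 60.0, 20.0  # $ per ticket, hypothetical
    excess = monthly_desktop_tickets * pct_l1_capable * (cost_desktop - cost_level1)
    print(f"Avoidable monthly cost: ${excess:,.0f}")  # $10,800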

Quality Metrics: % Resolved Level 1 Capable (Sample Report Only. Data is not accurate.)
[Chart: % Resolved Level 1 Capable across the Desktop Support peer group]
Key Statistics: High 29.0%, Average 18.8%, Median 19.1%, Low 6.9%, Company XYZ N/A

Technician Metrics

Technician Metrics: Annual Technician Turnover
Definition: Annual Technician Turnover is the percentage of Technicians who leave Desktop Support, for any reason (voluntarily or involuntarily), on an annual basis.
Why it's Important: Technician turnover is costly. Each time a Technician leaves Desktop Support, a new Technician needs to be hired to replace the outgoing Technician, which results in costly recruiting, hiring, and training expenses. Additionally, it is typically several weeks or even months before a new Technician is fully productive, so there is lost productivity associated with Technician turnover as well. High Technician turnover is generally associated with low Technician morale in a Desktop Support group.
Key Correlations: Annual Technician Turnover is strongly correlated with the following metrics:
- Daily Technician Absenteeism
- Annual Technician Training Hours
- Customer Satisfaction
- Net First Contact Resolution Rate
- Cost per Inbound Contact
- Technician Job Satisfaction
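The cost argument can be made concrete with a back-of-the-envelope model. Dividing annual leavers by average headcount is a common turnover convention assumed here (the benchmark does not spell out a formula), and every figure below is a hypothetical placeholder.

    # Back-of-the-envelope turnover cost model (all inputs hypothetical).
    leavers_per_year = 6
    average_headcount = 20
    turnover = leavers_per_year / average_headcount
    print(f"Annual Technician Turnover: {turnover:.0%}")  # 30%

    # Recruiting, hiring, training, and lost productivity per replacement.
    cost_per_replacement = 12_000.0  # $, assumed
    print(f"Annual cost of turnover: ${leavers_per_year * cost_per_replacement:,.0f}")
    # Annual cost of turnover: $72,000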

Technician Metrics: Annual Technician Turnover (Sample Report Only. Data is not accurate.)
[Chart: Annual Technician Turnover across the Desktop Support peer group]
Key Statistics: High 62.3%, Average 30.6%, Median 28.5%, Low 9.6%, Company XYZ 62.3%

Technician Metrics: Daily Technician Absenteeism
Definition: Daily Technician Absenteeism is the average percentage of Technicians with an unexcused absence on any given day. It is calculated by dividing the number of absent Technicians by the total number of Technicians scheduled to be at work.
Why it's Important: High Technician absenteeism is problematic because it makes it difficult for a Desktop Support group to schedule resources efficiently. High absenteeism can severely impact a Desktop Support group's operating performance and increase the likelihood that service-level targets will be missed. A Desktop Support group's ASA (Average Speed of Answer) and Call Abandonment Rate typically suffer when absenteeism is high. Also, chronically high absenteeism is often a sign of low Technician morale.
Key Correlations: Daily Technician Absenteeism is strongly correlated with the following metrics:
- Annual Technician Turnover
- Technician Job Satisfaction
- Technician Utilization
- Cost per Inbound Contact
- Contacts per Technician per Month

Technician Metrics: Daily Technician Absenteeism (Sample Report Only. Data is not accurate.)
[Chart: Daily Technician Absenteeism across the Desktop Support peer group]
Key Statistics: High 11.0%, Average 4.6%, Median 3.9%, Low 0.6%, Company XYZ 11.0%

Technician Metrics: New Technician Training Hours
Definition: The name of this metric is somewhat self-explanatory. New Technician Training Hours is the number of training hours (including classroom, CBT, self-study, shadowing, being coached, and OJT) that a new Technician receives before he or she is allowed to handle customer contacts independently.
Why it's Important: New Technician Training Hours are strongly correlated with Call Quality and Net First Contact Resolution Rate, particularly during a Technician's first few months on the job. The more training a new Technician receives, the higher the Call Quality and Net FCR will typically be. This, in turn, has a positive effect on many other performance metrics, including Customer Satisfaction. Perhaps most importantly, training levels have a strong impact on Technician morale: Technicians who receive more training typically have higher levels of job satisfaction.
Key Correlations: New Technician Training Hours are strongly correlated with the following metrics:
- Call Quality
- Net First Contact Resolution Rate
- Customer Satisfaction
- Inbound Contact Handle Time
- Technician Job Satisfaction

Technician Metrics: New Technician Training Hours (Sample Report Only. Data is not accurate.)
[Chart: New Technician Training Hours across the Desktop Support peer group]
Key Statistics: High 146, Average 59, Median 61, Low 0, Company XYZ 80

Technician Metrics: Annual Technician Training Hours
Definition: Annual Technician Training Hours is the average number of training hours (including classroom, CBT, self-study, shadowing, etc.) that a Technician receives on an annual basis. This number includes any training hours a Technician receives that are not part of the Technician's initial (new Technician) training, but it does not include routine team meetings, shift handoffs, or other activities that do not involve formal training.
Why it's Important: Annual Technician Training Hours are strongly correlated with Call Quality, Customer Satisfaction, and Net First Contact Resolution Rate. Perhaps most importantly, training levels have a strong impact on Technician morale: Technicians who train more typically have higher levels of job satisfaction.
Key Correlations: Annual Technician Training Hours are strongly correlated with the following metrics:
- Call Quality
- Net First Contact Resolution Rate
- Customer Satisfaction
- Inbound Contact Handle Time
- Technician Job Satisfaction

Technician Metrics: Annual Technician Training Hours (Sample Report Only. Data is not accurate.)
[Chart: Annual Technician Training Hours across the Desktop Support peer group]
Key Statistics: High 56, Average 10, Median 3, Low 0, Company XYZ N/A

Technician Metrics: Technician Tenure (months)
Definition: Technician Tenure is the average number of months that Technicians have worked in a particular Desktop Support group.
Why it's Important: Technician Tenure is a measure of Technician experience. Virtually every metric related to Desktop Support cost and quality is impacted by the level of experience the Technicians have.
Key Correlations: Technician Tenure is strongly correlated with the following metrics:
- Cost per Inbound Contact
- Call Quality
- Customer Satisfaction
- Annual Technician Turnover
- Training Hours
- Coaching Hours
- Inbound Contact Handle Time
- Net First Contact Resolution Rate
- Technician Job Satisfaction

Technician Metrics: Technician Tenure (Sample Report Only. Data is not accurate.)
[Chart: Technician Tenure (months) across the Desktop Support peer group]
Key Statistics: High 88.0, Average 43.9, Median 37.4, Low 14.6, Company XYZ 14.6

Technician Metrics: Technician Job Satisfaction
Definition: Technician Job Satisfaction is the percentage of Technicians in a Desktop Support group who are either satisfied or very satisfied with their jobs.
Why it's Important: Technician Job Satisfaction is a proxy for Technician morale. And morale, while difficult to measure, is a bellwether metric that affects almost every other metric in Desktop Support. High-performing Desktop Support groups almost always have high levels of Technician Job Satisfaction. Perhaps more importantly, this metric can be controlled and improved through training, coaching, and career pathing.
Key Correlations: Technician Job Satisfaction is strongly correlated with the following metrics:
- Annual Technician Turnover
- Customer Satisfaction
- Daily Technician Absenteeism
- Net First Contact Resolution Rate
- Technician Training Hours
- Inbound Contact Handle Time
- Technician Coaching Hours
- Cost per Inbound Contact

Technician Metrics: Technician Job Satisfaction (Sample Report Only. Data is not accurate.)
[Chart: Technician Job Satisfaction across the Desktop Support peer group]
Key Statistics: High 85.4%, Average 78.3%, Median 78.1%, Low 69.9%, Company XYZ 78.6%

Ticket Handling Metrics

Ticket Handling Metrics: Average Incident Work Time
Definition: Average Incident Work Time is the average time a technician spends to resolve an incident. This does not include travel time to and from the customer, or time between visits if multiple visits to the user's desktop are required to resolve an incident. It includes only the time that a technician spends actually working on an incident.
Why it's Important: Incident work time is one of the basic units of work in Desktop Support. Average Incident Work Time, therefore, represents the amount of labor required to resolve one incident.
Key Correlations: Average Incident Work Time is strongly correlated with the following metrics:
- Cost per Incident
- Incidents per Technician per Month
- Incident First Visit Resolution Rate
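The exclusions in this definition matter when computing the metric from ticket data. A minimal sketch, using hypothetical per-visit records, that counts hands-on work only:

    # Per-visit records for one incident (hypothetical data).
    visits = [
        {"travel_min": 15, "work_min": 20},  # first visit
        {"travel_min": 15, "work_min": 10},  # follow-up visit, next day
    ]

    # Incident Work Time counts hands-on work only; travel time and the
    # gap between visits are excluded by definition.
    work_time = sum(v["work_min"] for v in visits)
    print(f"Incident Work Time: {work_time} minutes")  # 30, not 60+

Averaging this figure across all incidents yields Average Incident Work Time.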

Ticket Handling Metrics: Average Incident Work Time (Sample Report Only. Data is not accurate.)
[Chart: Average Incident Work Time (minutes) across the Desktop Support peer group]
Key Statistics: High 40.6, Average 21.1, Median 19.2, Low 11.1, Company XYZ 35.0

Ticket Handling Metrics: Average Service Request Work Time
Definition: Average Service Request Work Time is the average time a technician spends to fulfill a service request. This does not include travel time to and from the customer, or time between visits if multiple visits are required to fulfill a service request. It includes only the time that a technician spends actually fulfilling a service request.
Why it's Important: Service request work time is one of the basic units of work in Desktop Support. Average Service Request Work Time, therefore, represents the amount of labor required to fulfill one service request.
Key Correlations: Average Service Request Work Time is strongly correlated with the following metrics:
- Cost per Service Request
- Service Requests per Technician per Month

Ticket Handling Metrics: Average Service Request Work Time (Sample Report Only. Data is not accurate.)
[Chart: Average Service Request Work Time (minutes) across the Desktop Support peer group]
Key Statistics: High 108.0, Average 48.7, Median 45.0, Low 21.0, Company XYZ 50.0

Ticket Handling Metrics: Estimated Travel Time per Ticket
Definition: Average Travel Time per Ticket is the average round-trip travel time to get to and from the site of a user or device being serviced. In a high-density user environment (e.g., a high-rise office building), the Travel Time per Ticket will typically be less than 20 minutes. By contrast, in a more distributed user environment (e.g., field or campus locations), the Travel Time per Ticket will be correspondingly longer.
Why it's Important: Unlike the Level 1 Service Desk, where support is provided remotely, Desktop Support by definition requires onsite support. Getting to and from the site of a ticket can be very time-consuming, and this influences the number of tickets a technician can handle in a day or a month. This, in turn, influences the level of staffing required in the Desktop Support organization.
Key Correlations: Average Travel Time per Ticket is strongly correlated with the following metrics:
- Cost per Ticket
- Number of Tickets per Technician per Month
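The staffing consequence follows from simple capacity arithmetic: every minute of travel is a minute not available for ticket work. A sketch with hypothetical inputs (and ignoring non-ticket time such as meetings and training):

    # How travel time caps daily ticket capacity per technician.
    available_min_per_day = 7.5 * 60   # 450 scheduled minutes, assumed
    avg_work_min = 25                  # hands-on work per ticket, assumed
    for travel_min in (10, 30, 60):    # dense site vs. campus vs. field
        capacity = available_min_per_day / (avg_work_min + travel_min)
        print(f"Travel {travel_min} min -> ~{capacity:.1f} tickets/tech/day")
    # Travel 10 min -> ~12.9 tickets/tech/day
    # Travel 30 min -> ~8.2 tickets/tech/day
    # Travel 60 min -> ~5.3 tickets/tech/day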

Ticket Handling Metrics: Estimated Travel Time per Ticket (Sample Report Only. Data is not accurate.)
[Chart: Estimated Travel Time per Ticket (minutes) across the Desktop Support peer group]
Key Statistics: High 99.0, Average 34.5, Median 30.0, Low 10.0, Company XYZ 10.0

Workload Metrics

Workload Metrics: Tickets per Seat per Month
Definition: Tickets per Seat per Month is a measure of the volume of Desktop Support work generated by a given user population. The number of Tickets per Seat per Month can vary dramatically from one organization to another, driven by factors such as the age of the devices being supported, the number of mobile devices, the location of users (office, home, field), the number of laptop computers, and myriad other factors.
Why it's Important: The number of Tickets per Seat per Month drives the workload, and hence the staffing, for a Desktop Support group. Desktop Support staffing decisions should be based on this metric rather than on the number of users being supported, as the sketch below illustrates.
Key Correlations: Tickets per Seat per Month is strongly correlated with the following metrics:
- Incidents per Seat per Month
- Service Requests per Seat per Month
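Because staffing should follow ticket volume rather than seat count, this metric feeds directly into a sizing calculation. A sketch with hypothetical inputs:

    # Sizing sketch: seats -> monthly tickets -> technicians required.
    seats = 5000                        # supported users, assumed
    tickets_per_seat_per_month = 0.5    # workload rate, assumed
    tickets_per_tech_per_month = 60     # technician throughput, assumed

    monthly_tickets = seats * tickets_per_seat_per_month   # 2,500 tickets
    techs_needed = monthly_tickets / tickets_per_tech_per_month
    print(f"Technicians required: ~{techs_needed:.0f}")    # ~42

Two organizations with the same seat count can therefore need very different headcounts if their Tickets per Seat per Month differ.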

Workload Metrics: Tickets per Seat per Month (Sample Report Only. Data is not accurate.)
[Chart: Tickets per Seat per Month across the Desktop Support peer group]
Key Statistics: High 1.01, Average 0.52, Median 0.46, Low 0.19, Company XYZ 0.33

Workload Metrics: Incidents per Seat per Month
Definition: Incidents per Seat per Month is a key measure of the volume of Desktop Support work generated by a given user population. The number of Incidents per Seat per Month can vary dramatically from one organization to another, driven by factors such as the age of the devices being supported, the number of mobile devices, the location of users (office, home, field), the number of laptop computers, and myriad other factors.
Why it's Important: The number of Incidents per Seat per Month is a major workload driver, and will therefore have a strong impact on staffing decisions for Desktop Support.
Key Correlations: Incidents per Seat per Month is strongly correlated with the following metric:
- Tickets per Seat per Month

Workload Metrics: Incidents per Seat per Month (Sample Report Only. Data is not accurate.)
[Chart: Incidents per Seat per Month across the Desktop Support peer group]
Key Statistics: High 0.85, Average 0.39, Median 0.27, Low 0.05, Company XYZ 0.05

Workload Metrics: Service Requests per Seat per Month
Definition: Service Requests per Seat per Month is a key measure of the volume of Desktop Support work generated by a given user population. The number of Service Requests per Seat per Month can vary dramatically from one organization to another, driven by factors such as the number of move/add/change requests, the age of the devices being supported, the location of users (office, home, field), the frequency of device refreshes, and myriad other factors.
Why it's Important: The number of Service Requests per Seat per Month is a major workload driver, and will therefore have a strong impact on staffing decisions for Desktop Support.
Key Correlations: Service Requests per Seat per Month is strongly correlated with the following metric:
- Tickets per Seat per Month

Workload Metrics: Service Requests per Seat per Month (Sample Report Only. Data is not accurate.)
[Chart: Service Requests per Seat per Month across the Desktop Support peer group]
Key Statistics: High 0.28, Average 0.12, Median 0.11, Low 0.04, Company XYZ 0.28

Workload Metrics: Incidents as a % of Total Ticket Volume
Definition: Incidents as a % of Total Ticket Volume is a fairly self-explanatory metric. It is an indicator of the mix of work (incidents vs. service requests) handled by a Desktop Support group. Most Desktop Support organizations receive more incidents than service requests. Since incidents are generally less costly to resolve than service requests, the higher the Incidents as a % of Total Ticket Volume, the lower the Cost per Ticket will be (see the sketch below).
Why it's Important: Incidents are generally unplanned work (e.g., device break/fix), while the majority of service requests are planned work (e.g., move/add/change). Incidents as a % of Total Ticket Volume is therefore a measure of the percentage of Desktop Support work that is unplanned.
Key Correlations: Incidents as a % of Total Ticket Volume is strongly correlated with the following metrics:
- Cost per Ticket
- Tickets per Technician per Month
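The cost effect of the ticket mix is easy to see with a blended-cost calculation. The unit costs below are hypothetical; the only grounded assumption is that incidents are generally cheaper to resolve than service requests.

    # Blended Cost per Ticket as a function of the incident mix
    # (unit costs are hypothetical placeholders).
    cost_per_incident, cost_per_request = 40.0, 90.0
    for incident_share in (0.5, 0.7, 0.9):
        blended = (incident_share * cost_per_incident
                   + (1 - incident_share) * cost_per_request)
        print(f"Incidents at {incident_share:.0%} of volume -> ${blended:.0f} per ticket")
    # Incidents at 50% of volume -> $65 per ticket
    # Incidents at 70% of volume -> $55 per ticket
    # Incidents at 90% of volume -> $45 per ticket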

Workload Metrics: Incidents as a % of Total Ticket Volume (Sample Report Only. Data is not accurate.)
[Chart: Incidents as a % of Total Ticket Volume across the Desktop Support peer group]
Key Statistics: High 93.4%, Average 70.7%, Median 75.4%, Low 13.8%, Company XYZ 13.8%

About MetricNet: Your Benchmarking Partner

Your Project Manager: Jeff Rumburg
Jeff Rumburg is a co-founder and Managing Partner at MetricNet, LLC. Jeff is responsible for global strategy, product development, and financial operations for the company. As a leading expert in benchmarking and re-engineering, Mr. Rumburg authored a best-selling book on benchmarking and has been retained as a benchmarking expert by such well-known companies as American Express, Hewlett-Packard, and GM. Prior to co-founding MetricNet, Mr. Rumburg was president and founder of The Verity Group, an international management consulting firm specializing in IT benchmarking. While at Verity, Mr. Rumburg launched a number of syndicated benchmarking services that provided low-cost benchmarks to more than 1,000 corporations worldwide. Mr. Rumburg has also held a number of executive positions at META Group and Gartner, Inc. As a vice president at Gartner, Mr. Rumburg led a project team that re-engineered Gartner's global benchmarking product suite. And as a vice president at META Group, Mr. Rumburg focused on business and product development for IT benchmarking. Mr. Rumburg's education includes an M.B.A. from the Harvard Business School, an M.S. magna cum laude in Operations Research from Stanford University, and a B.S. magna cum laude in Mechanical Engineering. He is the author of A Hands-On Guide to Competitive Benchmarking: The Path to Continuous Quality and Productivity Improvement, and has taught graduate-level engineering and business courses. Mr. Rumburg serves on the Strategic Advisory Board for HDI, formerly the Help Desk Institute.

Benchmarking is MetricNet's Core Business
- Information Technology: Service Desk, Desktop Support, Field Services
- Call Centers: Technical Support, Customer Service, Telemarketing/Telesales, Collections
- Telecom: Cost Benchmarking
- Satisfaction: Customer Satisfaction, Employee Satisfaction

25 Years of Desktop Support Benchmarking Data
- More than 1,100 Desktop Support Benchmarks
- Global Database
- 30 Key Performance Indicators
- Nearly 60 Industry Best Practices

Meet a Sampling of Our Clients
MetricNet conducts benchmarking for Desktop Support groups worldwide and across virtually every industry sector.

Thank You! We look forward to serving you!