Company XYZ. Peer Group Desktop Support Benchmark. Company XYZ


1 Company XYZ Peer Group Desktop Support Benchmark Company XYZ

2 Report Contents
Project Overview and Objectives - Page 2
Industry Background - Page 37
Performance Benchmarking Summary - Page 51
Best Practices Process Assessment - Page 78
Interview Themes and Quotes - Page 92
Conclusions and Recommendations - Page 102
Detailed Benchmarking Comparisons - Page 134
Price Metrics - Page 135
Productivity Metrics - Page 142
Service Level Metrics - Page 155
Quality Metrics - Page 164
Technician Metrics - Page 171
Ticket Handling Metrics - Page 184
Workload Metrics - Page 191
About MetricNet - Page 200 1

3 Project Overview and Objectives Company XYZ 2

4 Project Objectives
Review and assess the performance of Company XYZ's desktop support function
Benchmark the performance of Company XYZ against a peer group of comparable desktop support organizations
Recommend strategies to optimize performance
Achieve world-class levels of performance
Maximize customer satisfaction 3

5 Project Approach Module 1: Company XYZ Baselining / Data Collection Module 2: Benchmarking and Gap Analysis Module 3: Balanced Scorecard Module 4: Best Practices Process Assessment Module 5: Strategies for Optimized Performance Module 6: Report Development and Presentation of Results 4

6 Module 1: Company XYZ Baselining/Data Collection Core Topics Project Kickoff Data Collection Interviews 5

7 Project Kickoff Meeting Key Objectives:
Introduce the MetricNet and Company XYZ project teams
Discuss the project schedule
Distribute the data collection document
Answer questions about the project 6

8 Data Collection 7

9 Company XYZ Interviews
Technicians, team leads, supervisors
QA/QC, workforce schedulers, trainers 8

10 Module 2: Benchmarking and Gap Analysis Core Topics Peer Group Selection Benchmarking Comparison Gap Analysis 9

11 Benchmarking Peer Group Selection
[Diagram: Scope, Scale, Complexity, and Geography define the IDEAL PEER GROUP.]
Read MetricNet's whitepaper on Benchmarking Peer Group Selection. Go to to get your copy! 10

12 Dynamic Peer Group Selection
Scope refers to the services offered by Desktop Support. The broader the scope of services, the broader the skill set required of the technicians. As scope increases, so too does the cost of providing support. Desktop Support groups selected for benchmarking comparison must be comparable in the scope of services offered.
Scale refers to the number of tickets handled by Desktop Support. Virtually everything in Desktop Support is subject to scale economies, particularly the volume of tickets handled. The approximate scale effect for volume is 5%: every time the number of transactions doubles, you should expect the cost per ticket to decline by 5%. For this reason, it is important to select benchmarking peer groups that are similar in scale.
Complexity: The complexity of transactions handled will influence the handle time, and hence the cost per ticket. For example, a system reboot is a simple transaction that takes very little time and costs very little to resolve. By contrast, a complete system reimaging takes much longer and costs much more to resolve. MetricNet uses a proprietary algorithm to determine a weighted complexity index based upon the mix of tickets handled by Desktop Support. The companies chosen for a benchmarking peer group will have similar complexity factors.
Geography: The main factor affected by geography is cost, specifically labor cost. Since labor accounts for 65% of desktop support operating expense, it is important to benchmark desktop support groups that share a common geography. Even within a particular geography, wage rates can differ significantly, so MetricNet makes adjustments to ensure that each Desktop Support organization in a benchmarking peer group is normalized to the same wage rate. 11
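The 5% scale effect described above implies that cost per ticket shrinks by a factor of 0.95 for every doubling of ticket volume. A minimal sketch of that relationship (illustrative only; function and parameter names are ours, not MetricNet's model):

```python
import math

def cost_per_ticket(base_cost, base_volume, volume, scale_effect=0.05):
    """Apply the volume scale economy: each doubling of ticket volume
    reduces cost per ticket by scale_effect (5% by default)."""
    doublings = math.log2(volume / base_volume)
    return base_cost * (1 - scale_effect) ** doublings

# Doubling volume from 1,000 to 2,000 tickets cuts an $80.00 cost per ticket by 5%:
cost_per_ticket(80.00, 1000, 2000)  # → 76.00
```

This is why peer groups must be matched on scale: a larger organization can show a lower cost per ticket purely from volume, not from better practices.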

13 Desktop Support Benchmark: Key Questions Answered
Key Questions:
How is your Desktop Support group performing?
How does your Desktop Support group compare to other comparable support groups?
What are the strengths and weaknesses of your desktop support group?
What are the areas of improvement for your Desktop Support group?
How can you enhance Desktop Support performance and achieve world-class status?
[Diagram: Company XYZ and Peer Group Desktop Support Data feed MetricNet's Benchmarking Database, which drives the Desktop Support Benchmark, Gap Analysis, Improvement Recommendations, and Realized Performance Gains.] 12

14 The Benchmarking Methodology
[Diagram: COMPARE Company XYZ Peer Group Desktop Support Performance against the Performance of the Benchmarking Peer Group; Determine How Best in Class Achieve Superiority; Adopt Selected Practices of Best in Class; Build a Sustainable Competitive Advantage (the ultimate objective of benchmarking).]
Read MetricNet's whitepaper on Desktop Support Benchmarking. Go to to download your copy! 13

15 Summary of Included Desktop Support Metrics
Price: Price per Ticket, Price per Incident, Price per Service Request
Quality: Customer Satisfaction, Incident First Visit Resolution Rate, % Resolved Level 1 Capable
Productivity: Tickets per Technician per Month, Incidents per Technician per Month, Service Requests per Technician per Month, Technicians as a Percent of Total FTE's, Technician Utilization
Service Level: Mean Time to Resolve Incidents (hrs), % of Incidents Resolved in 24 Hours, Mean Time to Fulfill Service Requests (Days), % of Service Requests Fulfilled in 72 Hours
Technician: Annual Technician Turnover, Daily Technician Absenteeism, New Technician Training Hours, Annual Technician Training Hours, Technician Tenure (months), Technician Job Satisfaction
Ticket Handling: Average Incident Work Time (minutes), Average Service Request Work Time (minutes), Estimated Travel Time per Ticket (minutes)
Workload: Tickets per Seat per Month, Incidents per Seat per Month, Service Requests per Seat per Month, Incidents as a % of Total Ticket Volume 14

16 Benchmarking KPI Performance Summary Metric Type Price Productivity Service Level Quality Technician Ticket Handling Workload Key Performance Indicator (KPI) Company XYZ Peer Group Statistics Average Min Median Max Price per Ticket $79.98 $ $50.49 $99.69 $ Price per Incident $62.13 $ $39.38 $96.35 $ Price per Service Request $82.84 $ $51.71 $ $ Tickets per Technician per Month Incidents per Technician per Month Service Requests per Technician per Month Technicians as a Percent of Total FTE's 87.5% 88.9% 79.6% 88.1% 97.1% Technician Utilization 82.9% 58.3% 47.1% 55.5% 82.9% Mean Time to Resolve Incidents (hrs) % of Incidents Resolved in 24 Hours 30.1% 69.9% 30.1% 75.3% 86.7% Mean Time to Fulfill Service Requests (Days) % of Service Requests Fulfilled in 72 Hours 42.0% 56.7% 33.5% 51.8% 80.7% Customer Satisfaction 92.0% 84.9% 66.5% 86.6% 98.5% Incident First Visit Resolution Rate N/A 84.5% 67.0% 85.3% 95.1% % Resolved Level 1 Capable N/A 18.8% 6.9% 19.1% 29.0% Annual Technician Turnover 62.3% 30.6% 9.6% 28.5% 62.3% Daily Technician Absenteeism 11.0% 4.6% 0.6% 3.9% 11.0% New Technician Training Hours Annual Technician Training Hours N/A Technician Tenure (months) Technician Job Satisfaction 78.6% 78.3% 69.9% 78.1% 85.4% Average Incident Work Time (minutes) Average Service Request Work Time (minutes) Estimated Travel Time per Ticket (minutes) Tickets per Seat per Month Incidents per Seat per Month Service Requests per Seat per Month Incidents as a % of Total Ticket Volume 13.8% 70.7% 13.8% 75.4% 93.4% 15

17 Price vs. Quality for Company XYZ Peer Group Desktop Support (Sample Report Only. Data is not accurate.)
[Quadrant chart: Quality (Effectiveness) on the vertical axis, Cost (Efficiency) on the horizontal axis, from Higher Price to Lower Price. Top quartile: Efficient and Effective. Middle quartiles: Effective but not Efficient, or Efficient but not Effective. Lower quartile. Company XYZ Desktop Support is plotted against the Global Database.] 16

18 Module 3: Balanced Scorecard Core Topics Metrics Selection Metric Weightings Scorecard Construction 17

19 Desktop Support Scorecard: An Overview
The Desktop Support scorecard employs a methodology that provides you with a single, all-inclusive measure of your Desktop Support performance
It combines price, service level, productivity, and quality metrics into an overall performance indicator for your Desktop Support
Your Desktop Support score will range between 0 and 100%, and can be compared directly to the scores of other Desktop Support Groups in the benchmark
By computing your overall score on a monthly or quarterly basis, you can track and trend your performance over time
Charting and tracking your Desktop Support score is an ideal way to ensure continuous improvement in Desktop Support! 18

20 Company XYZ Peer Group Desktop Support Scorecard *
Columns: Performance Metric, Metric Weighting, Performance Range (Worst Case, Best Case), Your Actual Performance, Metric Score, Balanced Score
Price per Incident 15.0% $ $39.38 $ % 13.7%
Price per Service Request 15.0% $ $51.71 $ % 13.8%
Customer Satisfaction 25.0% 66.5% 98.5% 92.0% 79.7% 19.9%
Incident First Visit Resolution Rate 10.0% 67.0% 95.1% 84.5% 62.4% 6.2%
Technician Utilization 15.0% 47.1% 82.9% 82.9% 100.0% 15.0%
% of Incidents Resolved in 24 hours 5.0% 30.1% 86.7% 30.1% 0.0% 0.0%
% of Service Requests Fulfilled in 72 hours 5.0% 33.5% 80.7% 42.0% 18.0% 0.9%
Technician Job Satisfaction 10.0% 69.9% 85.4% 78.6% 56.3% 5.6%
Total 100.0% N/A N/A N/A N/A 75.2%
Step 1: Eight critical performance metrics have been selected for the scorecard
Step 2: Each metric has been weighted according to its relative importance
Step 3: For each performance metric, the highest and lowest performance levels in the benchmark are recorded
Step 4: Your actual performance for each metric is recorded in this column
Step 5: Your score for each metric is then calculated: (worst case - actual performance) / (worst case - best case) X 100
Step 6: Your balanced score for each metric is calculated: metric score X weighting
* Peer group averages were used for Incident First Visit Resolution Rate since Company XYZ does not track this metric 19
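Steps 5 and 6 of the scorecard can be reproduced directly from the table values. A minimal sketch (illustrative, not MetricNet's implementation), using the Customer Satisfaction row:

```python
def metric_score(actual, worst_case, best_case):
    """Step 5: normalize a metric to its worst-to-best range (0.0 to 1.0)."""
    return (worst_case - actual) / (worst_case - best_case)

def balanced_score(actual, worst_case, best_case, weighting):
    """Step 6: weight the metric score by its relative importance."""
    return metric_score(actual, worst_case, best_case) * weighting

# Customer Satisfaction: actual 92.0%, range 66.5% (worst) to 98.5% (best),
# weighting 25.0% -> metric score ~79.7%, balanced contribution ~19.9%
score = metric_score(0.920, 0.665, 0.985)                  # ~0.797
contribution = balanced_score(0.920, 0.665, 0.985, 0.25)   # ~0.199
```

Summing the eight weighted contributions yields the total balanced score (75.2% for Company XYZ).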

21 Balanced Scorecard Summary (Sample Report Only. Data is not accurate.)
Key Statistics, Desktop Support Scores: High 90.5%, Average 58.6%, Median 58.0%, Low 11.5%, Company XYZ 75.2%
[Bar chart: balanced score by Desktop Support group, y-axis 0.0% to 100.0%.] 20

22 Scorecard Performance Rankings Overall Ranking Desktop Support Number Price per Incident Price per Service Request Customer Satisfaction Incident First Visit Resolution Rate Technician Utilization % of Incidents Resolved in 24 hours % of Service Requests Fulfilled in 72 hours Technician Job Satisfaction Total Balanced Score 1 18 $61.57 $ % 92.7% 73.6% 83.5% 50.9% 85.4% 90.5% 2 8 $39.38 $ % 94.8% 68.1% 74.8% 56.7% 75.2% 79.7% 3 17 $85.66 $ % 95.1% 65.1% 82.4% 48.8% 78.1% 77.6% 4 Company XYZ $62.13 $ % N/A 82.9% 30.1% 42.0% 78.6% 75.2% 5 15 $99.37 $ % 89.1% 57.1% 80.0% 64.2% 84.4% 74.4% 6 7 $ $ % 94.7% 50.5% 75.1% 49.6% 83.5% 69.9% 7 4 $96.35 $ % 93.5% 55.5% 80.0% 59.7% 76.1% 66.2% 8 16 $ $ % 81.0% 60.5% 75.3% 80.7% 78.7% 64.6% 9 10 $ $ % 86.3% 55.4% 83.1% 48.5% 75.2% 58.8% 10 1 $99.48 $ % 86.1% 48.9% 67.8% 77.0% 84.7% 58.0% 11 9 $87.04 $ % 82.8% 58.2% 61.9% 51.8% 76.3% 57.4% 12 2 $41.29 $ % 80.6% 63.6% 78.9% 59.0% 70.5% 54.6% 13 5 $ $ % 86.3% 47.1% 63.3% 42.8% 77.4% 54.3% 14 6 $ $ % 84.5% 47.3% 86.7% 42.0% 83.7% 52.8% 15 3 $52.04 $ % 73.9% 72.1% 42.3% 75.8% 77.9% 52.1% $84.97 $ % 76.2% 48.6% 80.4% 46.2% 78.1% 39.8% $ $ % 81.5% 49.9% 85.0% 68.8% 80.6% 37.8% $77.02 $ % 75.6% 52.7% 53.9% 33.5% 73.7% 37.5% $ $ % 67.0% 50.5% 43.2% 78.7% 69.9% 11.5% Key Statistics Scorecard Metrics Average $ $ % 84.5% 58.3% 69.9% 56.7% 78.3% 58.6% Max $ $ % 95.1% 82.9% 86.7% 80.7% 85.4% 90.5% Min $39.38 $ % 67.0% 47.1% 30.1% 33.5% 69.9% 11.5% Median $96.35 $ % 85.3% 55.5% 75.3% 51.8% 78.1% 58.0% 21

23 Module 4: Best Practices Process Assessment Core Components Company XYZ Self Assessment MetricNet Maturity Ranking Process Assessment Rollup 22

24 Six-Part Model for Desktop Support Best Practices
Model Component: Definition
Strategy: Defining Your Charter and Mission
Human Resources: Proactive, Life-cycle Management of Personnel
Process: Expeditious Delivery of Customer Service
Technology: Leveraging People and Processes
Performance Measurement: A Holistic Approach to Performance Measurement
Stakeholder Communication: Proactively Managing Stakeholder Expectations
[Diagram: the six components surround Customer Enthusiasm.] 23

25 Best Practices Evaluation Criteria
Ranking 1: No Knowledge of the Best Practice.
Ranking 2: Aware of the Best Practice, but not applying it.
Ranking 3: Aware of the Best Practice, and applying it at a rudimentary level.
Ranking 4: Best Practice is being effectively applied.
Ranking 5: Best Practice is being applied in a world-class fashion. 24

26 Company XYZ Peer Group Desktop Support Self Assessment
Strategy Best Practices (each scored as Company XYZ's Score vs. Peer Group Average):
1. Desktop Support has a well-defined mission, vision, and strategy. The vision and strategy are well-documented, and communicated to key stakeholders in the organization
2. Desktop Support has a published Service Catalog, including a Supported Products List, that is distributed and communicated to key stakeholders including end users. The Service Catalog is available on-line
3. Desktop Support has an action plan for continuous improvement. The plan is documented and distributed to key stakeholders in the organization, and specific individuals are held accountable for implementing the action plan
4. Desktop Support is well integrated into the information technology function. Desktop Support acts as the "voice of the user" in IT, and is involved in major IT decisions and deliberations that affect end users. Desktop Support is alerted ahead of time so that they can prepare for major rollouts, or other changes in the IT environment
5. Desktop Support has SLA's that define the level of service to be delivered to users. The SLA's are documented, published, and communicated to key stakeholders in the organization
6. Desktop Support has OLA's (Operating Level Agreements) with other support groups in the organization (e.g., Level 1 support, field support, etc.). The OLA's clearly define the roles and responsibilities of each support group, and the different support groups abide by the terms of the OLA's
7. Desktop Support actively seeks to improve Level 1 Resolution Rates, Incident First Contact Resolution Rate, and key Service Levels by implementing processes, technologies, and training that facilitate these objectives
Summary Statistics: Total Score, Average Score 25

27 Best Practices Process Assessment Summary (Sample Report Only. Data is not accurate.)
[Chart: Average Score (scale to 4.5), Company XYZ vs. Peer Group.] 26

28 Best Practices Process Assessment Summary (Sample Report Only. Data is not accurate.)
[Scatter chart: Balanced Score (0% to 100%) vs. Process Assessment Score for Company XYZ (Balanced Score 75.2%) and the Global Database; reference lines mark the World-Class balanced score and the Average = 58.6%.] 27

29 Module 5: Strategies for Optimized Performance Core Components Conclusions and Recommendations Roadmap for World-Class Performance 28

30 Conclusions and Recommendations Conclusions and Recommendations fall into six categories 1. Strategy 2. Human Resource Management 3. Call Handling Processes and Procedures 4. Technology 5. Performance Measurement and Management 6. Stakeholder Communication 29

31 KPI Correlations Drive Conclusions
[Diagram: correlations among Price per Ticket, Customer Satisfaction, Technician Utilization, FCR (Incidents), Service Levels (MTTR), Techs/Total FTE's, Absenteeism/Turnover, % Resolved Level 1 Capable, Work/Travel Time, Scheduling Efficiency, Technician Satisfaction, Coaching, Career Path, and Training Hours.] 30

32 Price vs. Quality for Global Desktop Support (Sample Report Only. Data is not accurate.)
[Quadrant chart: Quality (Effectiveness) vs. Price (Efficiency), from Higher Price to Lower Price. Top quartile: Efficient and Effective, where World-Class Desktop Support sits. Middle quartiles: Effective but not Efficient, or Efficient but not Effective. Lower quartile. The Peer Group is plotted on the chart.] 31

33 Performance Targets will be Established Performance Metric Current Performance Target Performance Incident First Visit Resolution Rate N/A 85.0% % Resolved Level 1 Capable N/A 15.0% Technician Job Satisfaction 78.6% 80.0% Mean Time to Resolve Incidents (hrs) % % of Incidents Resolved in 24 hours 30.1% 70.0% Mean Time to Fulfill Service Requests (Days) % % of Service Requests Fulfilled in 72 hours 42.0% 60.0% Annual Technician Training Hours N/A 20 Balanced Score 75.2% 81.7% Achieving the performance targets recommended above will result in a desktop support balanced score of 81.7%, and elevate Company XYZ to second place in the benchmark 32

34 Module 6: Report Development and Presentation of Results Core Topics Conclusions and Recommendations Report Development Presentation of Benchmarking Results 33

35 Write Benchmarking Report 34

36 Presentation of Results The results of the benchmark will be presented in a live webcast Company XYZ 35

37 Summary of Deliverables
Deliverables include:
Project Participation Kit: Project Schedule, Data collection questionnaires, Project Kickoff Meeting, Telephone Interviews
Comprehensive Assessment and Benchmarking Report: Project Overview and Objectives, Industry Background, Benchmarking Performance Summary, Balanced Scorecard, Best Practices Process Assessment, Conclusions and Recommendations
Onsite Presentation of Results 36

38 Industry Background Company XYZ 37

39 25 Years of Desktop Support Benchmarking Data More than 1,100 Desktop Support Benchmarks Global Database 30 Key Performance Indicators Nearly 60 Industry Best Practices 38

40 Then and Now: The Evolution of Desktop Support
Desktop Support KPI's (North American Averages): 1988 / Last Year
Monthly Desktop Tickets per Seat:
Price per Ticket: $29 / $62
Average Incident Work Time (min:sec): 17:40 / 32:15
Incidents Resolved on First Contact: 74% / 68%
% Resolved Level 1 Capable: 54% / 22%
Starting Technician Salaries (current dollars): $37,050 / $43,627
Desktop Cost per Seat per Year: $184 / $580 39

41 So What's Going on Here? Industry MegaTrends: The Drivers
Increasing awareness and understanding of Support TCO (Total Cost of Ownership)
End-User Support evolving from a support role to a strategic role in the enterprise
The growing importance of Desktop Support in shaping end-user opinions of IT 40

42 And What are the Implications? Industry MegaTrends: The Result
Increased Emphasis on First Contact Resolution (FCR) and Mean Time to Resolve (MTTR)
Strategic Application of Key Performance Indicators (KPIs)
Investments in Technician Development
New Models for Measuring Desktop Support Value
Renewed Emphasis on Internal Marketing
Increased Starting Salaries for Technicians 41

43 Tickets, Incidents, and Service Requests
Incidents: Unplanned work that requires a physical touch to a device. Examples: hardware break/fix, device failure, connectivity failure.
Service Requests: Planned work that requires a physical touch to one or more devices. Examples: Move/Add/Change, hardware or software upgrade, device refresh, device set-up.
Incident Volume + Service Request Volume = Ticket Volume 42

44 Characteristics of World-Class Desktop Support
Desktop Support consistently exceeds customer expectations: the result is high levels of customer satisfaction (> 93%)
MTTR is below average for Incidents and Service Requests: < 0.7 days for Incidents, < 3.8 days for Service Requests
Costs are managed at or below industry average levels: Price per Ticket, per Incident, and per Service Request is below average, minimizing Total Cost of Ownership (TCO)
Desktop Support follows industry best practices: best practices are defined, documented, and followed
Every transaction adds value: a positive customer experience drives a positive view of IT overall 43

45 World-Class Desktop Support Defined (Sample Report Only. Data is not accurate.)
[Chart: Customer Satisfaction (Lower to Higher) vs. Price per Ticket (Lower to Higher). World-Class Desktop Support sits on the BEST-IN-CLASS PERFORMANCE CURVE; Average Desktop Support sits on the AVERAGE PERFORMANCE CURVE.] 44

46 World-Class Desktop Support: Three Sources of Leverage World-Class Desktop Support organizations recognize and exploit three unique sources of leverage: 1. Minimizing Total Cost of Ownership (TCO) 2. Improving End-User productivity 3. Driving High Levels of Customer Satisfaction for Corporate IT 45

47 Cost of Resolution: North American Averages
Support Level: Price per Ticket
Vendor: $471
Field Support: $196
Level 3 IT (apps, networking, NOC, etc.): $85
Level 2: Desktop Support: $62
Level 1: Service Desk: $22 46
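These averages imply a simple "shift-left" calculation: every ticket resolved at a lower support level avoids the cost difference between levels. A hypothetical sketch (the function and level names are ours, using the average prices from the table above):

```python
# North American average price per ticket by support level (from the table above)
PRICE = {
    "vendor": 471.00,
    "field_support": 196.00,
    "level_3_it": 85.00,
    "level_2_desktop": 62.00,
    "level_1_service_desk": 22.00,
}

def shift_left_savings(monthly_volume, from_level, to_level):
    """Monthly savings from resolving tickets at a lower support level."""
    return monthly_volume * (PRICE[from_level] - PRICE[to_level])

# Moving 100 tickets/month from Desktop Support to the Service Desk:
shift_left_savings(100, "level_2_desktop", "level_1_service_desk")  # → 4000.0
```

This is the economic logic behind the "% Resolved Level 1 Capable" metric tracked throughout the benchmark.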

48 The Tao of SPOC (Single Point of Contact)
[Diagram: the User Community contacts the Level 1 Service Desk, which coordinates with Level 2 Desktop Support, Level 3 IT Support, Field Support, and Vendor Support.] 47

49 The Tao of SPOC (Continued)
Key SPOC Principles:
The enterprise takes an end-to-end view of user support
The user/customer has a single point of contact for all IT-related incidents, questions, problems, and work requests
The Level 1 Service Desk is the SPOC
Level 1 is responsible for: ticket triage, resolution at Level 1 if possible, effective handoffs to level n support, resolution coordination and facilitation, ticket closure
Desktop Drive-bys, Fly-bys, and Snags are strongly discouraged 48

50 Quality of Support Drives End-User Productivity (n = 60)
Key Performance Indicators by Performance Quartile, 1 (top) to 4 (bottom):
Service Desk: Customer Satisfaction 93.5% / 84.5% / 76.1% / 69.3%; First Contact Resolution Rate 90.1% / 83.0% / 72.7% / 66.4%; Mean Time to Resolve (hours)
Desktop Support: Customer Satisfaction 94.4% / 89.2% / 79.0% / 71.7%; First Contact Resolution Rate 89.3% / 85.6% / 80.9% / 74.5%; Mean Time to Resolve (hours)
Average Productive Hours Lost per Employee per Year 49

51 Support Drives Customer Satisfaction for All of IT
(n = 1,044, global large cap companies; survey type: multiple choice, 3 responses allowed per survey)
% Saying Very Important, by factor contributing to IT customer satisfaction: Service Desk 84%, Desktop Support 47%, Network Outages 31%, VPN 29%, Training 22%, Enterprise Applications 19%, Desktop Software 8%
84% cited the service desk as a very important factor in their overall satisfaction with corporate IT
47% cited desktop support as a very important factor in their overall satisfaction with corporate IT 50

52 Performance Benchmarking Summary Company XYZ 51

53 Company XYZ Peer Group Desktop Support Overview
Desktop Support Location: Washington, DC
Hours of Operation: 7:00 am - 7:30 pm ET, Monday through Friday
Annual Price Paid: $929,
Monthly Contact Volume: Tickets, Incidents, Service Requests
FTE Headcount: Desktop Support Tech Level 1, Desktop Support Tech Level 2, Desktop Support Tech Level 3, Operations Manager 1.0, 2.0, Total 8.0
Technology Profile:
Trouble Ticket System: BMC, Remedy ITSM V
Customer Information System: BMC, Remedy ITSM V
Automatic Call Distributor (ACD): CISCO
Workforce Management/Scheduling System/Software: CCModeler, CCModeler
Interactive Voice Response (IVR): CISCO
Knowledge Management: BMC, Remedy ITSM V7.5.0
Labor Reporting Systems: CISCO, 8.5
Remote Control Software: LogMeIn, Remotely Anywhere 52

54 Summary of Included Benchmarking Metrics
Price: Price per Ticket, Price per Incident, Price per Service Request
Quality: Customer Satisfaction, Incident First Visit Resolution Rate, % Resolved Level 1 Capable
Productivity: Tickets per Technician per Month, Incidents per Technician per Month, Service Requests per Technician per Month, Technicians as a Percent of Total FTE's, Technician Utilization
Service Level: Mean Time to Resolve Incidents (hrs), % of Incidents Resolved in 24 Hours, Mean Time to Fulfill Service Requests (Days), % of Service Requests Fulfilled in 72 Hours
Technician: Annual Technician Turnover, Daily Technician Absenteeism, New Technician Training Hours, Annual Technician Training Hours, Technician Tenure (months), Technician Job Satisfaction
Ticket Handling: Average Incident Work Time (minutes), Average Service Request Work Time (minutes), Estimated Travel Time per Ticket (minutes)
Workload: Tickets per Seat per Month, Incidents per Seat per Month, Service Requests per Seat per Month, Incidents as a % of Total Ticket Volume 53

55 Benchmarking KPI Performance Summary Metric Type Price Productivity Service Level Quality Technician Ticket Handling Workload Key Performance Indicator (KPI) Company XYZ Peer Group Statistics Average Min Median Max Price per Ticket $79.98 $ $50.49 $99.69 $ Price per Incident $62.13 $ $39.38 $96.35 $ Price per Service Request $82.84 $ $51.71 $ $ Tickets per Technician per Month Incidents per Technician per Month Service Requests per Technician per Month Technicians as a Percent of Total FTE's 87.5% 88.9% 79.6% 88.1% 97.1% Technician Utilization 82.9% 58.3% 47.1% 55.5% 82.9% Mean Time to Resolve Incidents (hrs) % of Incidents Resolved in 24 Hours 30.1% 69.9% 30.1% 75.3% 86.7% Mean Time to Fulfill Service Requests (Days) % of Service Requests Fulfilled in 72 Hours 42.0% 56.7% 33.5% 51.8% 80.7% Customer Satisfaction 92.0% 84.9% 66.5% 86.6% 98.5% Incident First Visit Resolution Rate N/A 84.5% 67.0% 85.3% 95.1% % Resolved Level 1 Capable N/A 18.8% 6.9% 19.1% 29.0% Annual Technician Turnover 62.3% 30.6% 9.6% 28.5% 62.3% Daily Technician Absenteeism 11.0% 4.6% 0.6% 3.9% 11.0% New Technician Training Hours Annual Technician Training Hours N/A Technician Tenure (months) Technician Job Satisfaction 78.6% 78.3% 69.9% 78.1% 85.4% Average Incident Work Time (minutes) Average Service Request Work Time (minutes) Estimated Travel Time per Ticket (minutes) Tickets per Seat per Month Incidents per Seat per Month Service Requests per Seat per Month Incidents as a % of Total Ticket Volume 13.8% 70.7% 13.8% 75.4% 93.4% 54

56 KPI Gap Summary Metric Type Key Performance Indicator (KPI) Company XYZ Peer Average Performance Gap Price Productivity Service Level Quality Technician Ticket Handling Workload Price per Ticket $79.98 $ % Price per Incident $62.13 $ % Price per Service Request $82.84 $ % Tickets per Technician per Month % Incidents per Technician per Month % Service Requests per Technician per Month % Technicians as a Percent of Total FTE's 87.5% 88.9% -1.6% Technician Utilization 82.9% 58.3% 42.2% Mean Time to Resolve Incidents (hrs) % % of Incidents Resolved in 24 Hours 30.1% 69.9% -57.0% Mean Time to Fulfill Service Requests (Days) % % of Service Requests Fulfilled in 72 Hours 42.0% 56.7% -25.9% Customer Satisfaction 92.0% 84.9% 8.3% Incident First Visit Resolution Rate N/A 84.5% N/A % Resolved Level 1 Capable N/A 18.8% N/A Annual Technician Turnover 62.3% 30.6% % Daily Technician Absenteeism 11.0% 4.6% % New Technician Training Hours % Annual Technician Training Hours N/A 10 N/A Technician Tenure (months) % Technician Job Satisfaction 78.6% 78.3% 0.4% Average Incident Work Time (minutes) % Average Service Request Work Time (minutes) % Estimated Travel Time per Ticket (minutes) % Tickets per Seat per Month % Incidents per Seat per Month % Service Requests per Seat per Month % Incidents as a % of Total Ticket Volume 13.8% 70.7% -80.4% 55
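The gap figures in the table are consistent with expressing Company XYZ's result relative to the peer average, as a percentage of that average. A minimal sketch under that assumption (the function name is ours):

```python
def performance_gap(company, peer_average):
    """Gap vs. the peer-group average, as a fraction of that average.
    The sign convention is simply (company - peer) / peer; whether a positive
    gap is favorable depends on the metric (higher utilization is good,
    higher turnover is not)."""
    return (company - peer_average) / peer_average

performance_gap(0.829, 0.583)  # Technician Utilization: ~ +0.422 (+42.2%)
performance_gap(0.301, 0.699)  # % Incidents Resolved in 24 Hours: ~ -0.57
```

Because the sign alone does not say whether a gap is good or bad, the KPI Gap Ranking on the next page orders the gaps from most favorable to least favorable rather than by raw value.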

57 KPI Gap Ranking Key Performance Indicator (KPI) Company XYZ Peer Average Performance Gap Service Requests per Technician per Month % Incidents per Seat per Month % Estimated Travel Time per Ticket (minutes) % Price per Service Request $82.84 $ % Price per Incident $62.13 $ % Technician Utilization 82.9% 58.3% 42.2% Tickets per Technician per Month % Tickets per Seat per Month % New Technician Training Hours % Price per Ticket $79.98 $ % Customer Satisfaction 92.0% 84.9% 8.3% Technician Job Satisfaction 78.6% 78.3% 0.4% Incident First Visit Resolution Rate N/A 84.5% N/A % Resolved Level 1 Capable N/A 18.8% N/A Annual Technician Training Hours N/A 10 N/A Technicians as a Percent of Total FTE's 87.5% 88.9% -1.6% Average Service Request Work Time (minutes) % % of Service Requests Fulfilled in 72 Hours 42.0% 56.7% -25.9% % of Incidents Resolved in 24 Hours 30.1% 69.9% -57.0% Average Incident Work Time (minutes) % Technician Tenure (months) % Incidents per Technician per Month % Incidents as a % of Total Ticket Volume 13.8% 70.7% -80.4% Annual Technician Turnover 62.3% 30.6% % Mean Time to Fulfill Service Requests (Days) % Service Requests per Seat per Month % Daily Technician Absenteeism 11.0% 4.6% % Mean Time to Resolve Incidents (hrs) % 56

58 Quartile Rankings: Price and Quality Metrics Price Metrics 1 (Top) Quartile (Bottom) Your Desktop Support Performance Price per Ticket Price per Incident $50.49 $85.33 $99.69 $ $85.33 $99.69 $ $ $39.38 $69.58 $96.35 $ $69.58 $96.35 $ $ $79.98 $62.13 Price per Service Request Quality Metrics $51.71 $ $ (Top) Quartile 2 3 $ $ $ $ $ (Bottom) $82.84 Your Desktop Support Performance Customer Satisfaction Incident First Visit Resolution Rate % Resolved Level 1 Capable 98.5% 92.7% 86.6% 77.3% 92.7% 86.6% 77.3% 66.5% 95.1% 91.8% 85.3% 80.7% 91.8% 85.3% 80.7% 67.0% 6.9% 14.2% 19.1% 22.7% 14.2% 19.1% 22.7% 29.0% 92.0% N/A N/A 57
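Quartile placement in these tables follows from the breakpoint values shown: a result is compared against the three cuts between quartiles, ordered best-to-worst. A small illustrative sketch (function and variable names are ours), using the Customer Satisfaction breakpoints above:

```python
def quartile(value, cuts, higher_is_better=True):
    """Assign a quartile from 1 (top) to 4 (bottom).
    cuts: the three breakpoints between quartiles, ordered best-to-worst."""
    for q, cut in enumerate(cuts, start=1):
        in_or_above = (value >= cut) if higher_is_better else (value <= cut)
        if in_or_above:
            return q
    return 4

# Customer Satisfaction breakpoints from the table: 92.7% / 86.6% / 77.3%
cs_cuts = [0.927, 0.866, 0.773]
quartile(0.920, cs_cuts)  # Company XYZ at 92.0% → quartile 2
quartile(0.985, cs_cuts)  # the peer-group best at 98.5% → quartile 1
```

For price metrics, `higher_is_better=False` flips the comparison so that the cheapest quartile ranks first.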

59 Quartile Rankings: Productivity Metrics Productivity Metrics 1 (Top) Quartile (Bottom) Your Desktop Support Performance Tickets per Technician per Month Incidents per Technician per Month Service Requests per Technician per Month Technicians as a Percent of Total FTE's Technician Utilization % 91.7% 88.1% 86.5% 91.7% 88.1% 86.5% 79.6% 82.9% 64.4% 55.5% 50.2% 64.4% 55.5% 50.2% 47.1% % 82.9% 58

60 Quartile Rankings: Service Level Metrics Service Level Metrics 1 (Top) Quartile (Bottom) Your Desktop Support Performance Mean Time to Resolve Incidents (hrs) % of Incidents Resolved in 24 Hours Mean Time to Fulfill Service Requests (Days) % of Service Requests Fulfilled in 72 Hours % 81.4% 75.3% 62.6% 81.4% 75.3% 62.6% 30.1% % 66.5% 51.8% 47.4% 66.5% 51.8% 47.4% 33.5% % % 59

61 Quartile Rankings: Technician Metrics Technician Performance Metrics 1 (Top) Quartile (Bottom) Your Desktop Support Performance Annual Technician Turnover 9.6% 25.2% 28.5% 35.3% 25.2% 28.5% 35.3% 62.3% 62.3% Daily Technician Absenteeism 0.6% 2.4% 3.9% 2.4% 3.9% 5.7% 5.7% 11.0% 11.0% New Technician Training Hours Annual Technician Training Hours Technician Tenure (months) Technician Job Satisfaction % 82.1% 78.1% 75.7% 82.1% 78.1% 75.7% 69.9% N/A % 60

62 Quartile Rankings: Ticket Handling Metrics Ticket Handling Metrics 1 (Top) Quartile (Bottom) Your Desktop Support Performance Average Incident Work Time (minutes) Average Service Request Work Time (minutes) Estimated Travel Time per Ticket (minutes)

63 Quartile Rankings: Workload Metrics Workload Metrics 1 (Top) Quartile (Bottom) Your Desktop Support Performance Tickets per Seat per Month Incidents per Seat per Month Service Requests per Seat per Month Incidents as a % of Total Ticket Volume % 84.3% 75.4% 63.0% 84.3% 75.4% 63.0% 13.8% % 62

64 Desktop Support Scorecard: An Overview
The Desktop Support scorecard employs a methodology that provides you with a single, all-inclusive measure of your Desktop Support performance
It combines cost, service level, productivity, and quality metrics into an overall performance indicator for your Desktop Support
Your Desktop Support score will range between 0 and 100%, and can be compared directly to the scores of other Desktop Support Groups in the benchmark
By computing your overall score on a monthly or quarterly basis, you can track and trend your performance over time
Charting and tracking your Desktop Support score is an ideal way to ensure continuous improvement in Desktop Support! 63

65 Company XYZ Peer Group Desktop Support Scorecard *
Columns: Performance Metric, Metric Weighting, Performance Range (Worst Case, Best Case), Your Actual Performance, Metric Score, Balanced Score
Price per Incident 15.0% $ $39.38 $ % 13.7%
Price per Service Request 15.0% $ $51.71 $ % 13.8%
Customer Satisfaction 25.0% 66.5% 98.5% 92.0% 79.7% 19.9%
Incident First Visit Resolution Rate 10.0% 67.0% 95.1% 84.5% 62.4% 6.2%
Technician Utilization 15.0% 47.1% 82.9% 82.9% 100.0% 15.0%
% of Incidents Resolved in 24 hours 5.0% 30.1% 86.7% 30.1% 0.0% 0.0%
% of Service Requests Fulfilled in 72 hours 5.0% 33.5% 80.7% 42.0% 18.0% 0.9%
Technician Job Satisfaction 10.0% 69.9% 85.4% 78.6% 56.3% 5.6%
Total 100.0% N/A N/A N/A N/A 75.2%
Step 1: Eight critical performance metrics have been selected for the scorecard
Step 2: Each metric has been weighted according to its relative importance
Step 3: For each performance metric, the highest and lowest performance levels in the benchmark are recorded
Step 4: Your actual performance for each metric is recorded in this column
Step 5: Your score for each metric is then calculated: (worst case - actual performance) / (worst case - best case) X 100
Step 6: Your balanced score for each metric is calculated: metric score X weighting
* Peer group averages were used for Incident First Visit Resolution Rate since Company XYZ does not track this metric 64

66 Balanced Scores (Chart: Balanced Scorecard Summary by Desktop Support group. Sample Report Only. Data is not accurate.) Key Statistics, Desktop Support Scores: High 90.5%, Average 58.6%, Median 58.0%, Low 11.5%, Company XYZ 75.2%. 65

67 Peer Group Scorecard Summary Data The next two pages illustrate the benchmarking peer group performance for each KPI in the scorecard. Page 67 ranks each Desktop Support group from the best performer (#18) to the worst performer (#14) based upon their balanced scores. Page 68 ranks each KPI in the scorecard from best (top row) to worst (bottom row). 66

68 Scorecard Performance Rankings
Overall Ranking | Desktop Support Number | Price per Incident | Price per Service Request | Customer Satisfaction | Incident First Visit Resolution Rate | Technician Utilization | % of Incidents Resolved in 24 hours | % of Service Requests Fulfilled in 72 hours | Technician Job Satisfaction | Total Balanced Score
1 | 18 | $61.57 | $ | % | 92.7% | 73.6% | 83.5% | 50.9% | 85.4% | 90.5%
2 | 8 | $39.38 | $ | % | 94.8% | 68.1% | 74.8% | 56.7% | 75.2% | 79.7%
3 | 17 | $85.66 | $ | % | 95.1% | 65.1% | 82.4% | 48.8% | 78.1% | 77.6%
4 | Company XYZ | $62.13 | $ | % | N/A | 82.9% | 30.1% | 42.0% | 78.6% | 75.2%
5 | 15 | $99.37 | $ | % | 89.1% | 57.1% | 80.0% | 64.2% | 84.4% | 74.4%
6 | 7 | $ | $ | % | 94.7% | 50.5% | 75.1% | 49.6% | 83.5% | 69.9%
7 | 4 | $96.35 | $ | % | 93.5% | 55.5% | 80.0% | 59.7% | 76.1% | 66.2%
8 | 16 | $ | $ | % | 81.0% | 60.5% | 75.3% | 80.7% | 78.7% | 64.6%
9 | 10 | $ | $ | % | 86.3% | 55.4% | 83.1% | 48.5% | 75.2% | 58.8%
10 | 1 | $99.48 | $ | % | 86.1% | 48.9% | 67.8% | 77.0% | 84.7% | 58.0%
11 | 9 | $87.04 | $ | % | 82.8% | 58.2% | 61.9% | 51.8% | 76.3% | 57.4%
12 | 2 | $41.29 | $ | % | 80.6% | 63.6% | 78.9% | 59.0% | 70.5% | 54.6%
13 | 5 | $ | $ | % | 86.3% | 47.1% | 63.3% | 42.8% | 77.4% | 54.3%
14 | 6 | $ | $ | % | 84.5% | 47.3% | 86.7% | 42.0% | 83.7% | 52.8%
15 | 3 | $52.04 | $ | % | 73.9% | 72.1% | 42.3% | 75.8% | 77.9% | 52.1%
16 | | $84.97 | $ | % | 76.2% | 48.6% | 80.4% | 46.2% | 78.1% | 39.8%
17 | | $ | $ | % | 81.5% | 49.9% | 85.0% | 68.8% | 80.6% | 37.8%
18 | | $77.02 | $ | % | 75.6% | 52.7% | 53.9% | 33.5% | 73.7% | 37.5%
19 | | $ | $ | % | 67.0% | 50.5% | 43.2% | 78.7% | 69.9% | 11.5%
Key Statistics, Scorecard Metrics:
Average | $ | $ | % | 84.5% | 58.3% | 69.9% | 56.7% | 78.3% | 58.6%
Max | $ | $ | % | 95.1% | 82.9% | 86.7% | 80.7% | 85.4% | 90.5%
Min | $39.38 | $ | % | 67.0% | 47.1% | 30.1% | 33.5% | 69.9% | 11.5%
Median | $96.35 | $ | % | 85.3% | 55.5% | 75.3% | 51.8% | 78.1% | 58.0%
67

69 KPI Scorecard Data in Rank Order (each column is sorted independently, best value first)
Desktop Support Number | Price per Incident | Price per Service Request | Customer Satisfaction | Incident First Visit Resolution Rate | Technician Utilization | % of Incidents Resolved in 24 hours | % of Service Requests Fulfilled in 72 hours | Technician Job Satisfaction | Total Balanced Score
Company XYZ | $62.13 | $ | % | N/A | 82.9% | 30.1% | 42.0% | 78.6% | 75.2%
Ranking | | | | N/A | | | | |
Quartile | | | | N/A | | | | |
1 | $39.38 | $ | % | 95.1% | 82.9% | 86.7% | 80.7% | 85.4% | 90.5%
2 | $41.29 | $ | % | 94.8% | 73.6% | 85.0% | 78.7% | 84.7% | 79.7%
3 | $52.04 | $ | % | 94.7% | 72.1% | 83.5% | 77.0% | 84.4% | 77.6%
4 | $61.57 | $ | % | 93.5% | 68.1% | 83.1% | 75.8% | 83.7% | 75.2%
5 | $62.13 | $ | % | 92.7% | 65.1% | 82.4% | 68.8% | 83.5% | 74.4%
6 | $77.02 | $ | % | 89.1% | 63.6% | 80.4% | 64.2% | 80.6% | 69.9%
7 | $84.97 | $ | % | 86.3% | 60.5% | 80.0% | 59.7% | 78.7% | 66.2%
8 | $85.66 | $ | % | 86.3% | 58.2% | 80.0% | 59.0% | 78.6% | 64.6%
9 | $87.04 | $ | % | 86.1% | 57.1% | 78.9% | 56.7% | 78.1% | 58.8%
10 | $96.35 | $ | % | 84.5% | 55.5% | 75.3% | 51.8% | 78.1% | 58.0%
11 | $99.37 | $ | % | 84.5% | 55.4% | 75.1% | 50.9% | 77.9% | 57.4%
12 | $99.48 | $ | % | 82.8% | 52.7% | 74.8% | 49.6% | 77.4% | 54.6%
13 | $ | $ | % | 81.5% | 50.5% | 67.8% | 48.8% | 76.3% | 54.3%
14 | $ | $ | % | 81.0% | 50.5% | 63.3% | 48.5% | 76.1% | 52.8%
15 | $ | $ | % | 80.6% | 49.9% | 61.9% | 46.2% | 75.2% | 52.1%
16 | $ | $ | % | 76.2% | 48.9% | 53.9% | 42.8% | 75.2% | 39.8%
17 | $ | $ | % | 75.6% | 48.6% | 43.2% | 42.0% | 73.7% | 37.8%
18 | $ | $ | % | 73.9% | 47.3% | 42.3% | 42.0% | 70.5% | 37.5%
19 | $ | $ | % | 67.0% | 47.1% | 30.1% | 33.5% | 69.9% | 11.5%
Average | $ | $ | % | 84.5% | 58.3% | 69.9% | 56.7% | 78.3% | 58.6%
Max | $ | $ | % | 95.1% | 82.9% | 86.7% | 80.7% | 85.4% | 90.5%
Min | $39.38 | $ | % | 67.0% | 47.1% | 30.1% | 33.5% | 69.9% | 11.5%
Median | $96.35 | $ | % | 84.5% | 55.5% | 75.3% | 51.8% | 78.1% | 58.0%
68
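Both ranking views can be reproduced with a direction-aware sort: price metrics rank best-to-worst in ascending order, while satisfaction and resolution metrics rank in descending order. A sketch of that logic (the group numbers and prices are taken from the tables above; the function names and the quartile rule are illustrative assumptions, not MetricNet's published method):

```python
# Rank a peer group on one KPI and assign benchmark quartiles.
# Pass higher_is_better=False for cost/price metrics, True for quality metrics.

def rank_kpi(values_by_group, higher_is_better=True):
    """Return (group, value) pairs ordered best performer first."""
    return sorted(values_by_group.items(),
                  key=lambda kv: kv[1], reverse=higher_is_better)

def quartile(rank, n_groups):
    """Quartile 1 holds the top quarter of performers, quartile 4 the bottom."""
    return min(4, 1 + (rank - 1) * 4 // n_groups)

# Illustrative subset: Price per Incident, where lower is better
price_per_incident = {"18": 61.57, "8": 39.38, "2": 41.29, "3": 52.04}
ranking = rank_kpi(price_per_incident, higher_is_better=False)
for i, (group, value) in enumerate(ranking, start=1):
    print(i, group, value, "Q%d" % quartile(i, len(ranking)))
```

With a 19-group benchmark, the same `quartile` helper places ranks 1 through 5 in the top quartile and rank 19 in the fourth quartile, matching the quartile language used elsewhere in this report.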

70 Price per Incident (Chart: Scorecard Metrics: Price per Incident, by Desktop Support group. Sample Report Only. Data is not accurate.) Key Statistics, Price per Incident: Median $96.35, Low $39.38, Company XYZ $62.13. 69

71 Price per Service Request (Chart: Scorecard Metrics: Price per Service Request, by Desktop Support group. Sample Report Only. Data is not accurate.) Key Statistics, Price per Service Request: Low $51.71, Company XYZ $82.84. 70

72 Customer Satisfaction (Chart: Scorecard Metrics: Customer Satisfaction, by Desktop Support group. Sample Report Only. Data is not accurate.) Key Statistics, Customer Satisfaction: High 98.5%, Median 86.6%, Low 66.5%, Company XYZ 92.0%. 71

73 Incident First Visit Resolution Rate (Chart: Scorecard Metrics: Incident First Visit Resolution Rate, by Desktop Support group. Sample Report Only. Data is not accurate.) Key Statistics, Incident First Visit Resolution Rate: High 95.1%, Average 84.5%, Median 85.3%, Low 67.0%, Company XYZ N/A. 72

74 Technician Utilization (Chart: Scorecard Metrics: Technician Utilization, by Desktop Support group. Sample Report Only. Data is not accurate.) Key Statistics, Technician Utilization: High 82.9%, Average 58.3%, Median 55.5%, Low 47.1%, Company XYZ 82.9%. 73

75 % of Incidents Resolved in 24 Hours (Chart: Scorecard Metrics: % of Incidents Resolved in 24 hours, by Desktop Support group. Sample Report Only. Data is not accurate.) Key Statistics, % of Incidents Resolved in 24 Hours: High 86.7%, Average 69.9%, Median 75.3%, Low 30.1%, Company XYZ 30.1%. 74

76 % of Service Requests Fulfilled in 72 Hours (Chart: Scorecard Metrics: % of Service Requests Fulfilled in 72 hrs, by Desktop Support group. Sample Report Only. Data is not accurate.) Key Statistics, % of Service Requests Fulfilled in 72 Hours: High 80.7%, Average 56.7%, Median 51.8%, Low 33.5%, Company XYZ 42.0%. 75

77 Technician Job Satisfaction (Chart: Scorecard Metrics: Technician Job Satisfaction, by Desktop Support group. Sample Report Only. Data is not accurate.) Key Statistics, Technician Job Satisfaction: High 85.4%, Average 78.3%, Median 78.1%, Low 69.9%, Company XYZ 78.6%. 76

78 Quality (Effectiveness) (Quadrant chart: Price vs. Quality for the Company XYZ Peer Group Desktop Support. Sample Report Only. Data is not accurate. Axes: Price (Efficiency), from higher to lower price, and Quality (Effectiveness), from lower to higher quality. Quadrants: Top Quartile, Efficient and Effective; Middle Quartiles, Effective but not Efficient and Efficient but not Effective; Lower Quartile. Company XYZ Desktop Support is plotted against the Global Database.) 77

79 Best Practices Process Assessment Company XYZ 78

80 Six-Part Model for Desktop Support Best Practices
Model Component / Definition:
Strategy: Defining Your Charter and Mission
Human Resources: Proactive, Life-cycle Management of Personnel
Process: Expeditious Delivery of Customer Service
Technology: Leveraging People and Processes
Performance Measurement: A Holistic Approach to Performance Measurement
Stakeholder Communication: Proactively Managing Stakeholder Expectations
(In the model diagram, the six components surround the central goal of Customer Enthusiasm.) 79

81 Best Practices Evaluation Criteria
Ranking / Explanation:
1: No knowledge of the Best Practice.
2: Aware of the Best Practice, but not applying it.
3: Aware of the Best Practice, and applying it at a rudimentary level.
4: Best Practice is being effectively applied.
5: Best Practice is being applied in a world-class fashion.
80

82 MetricNet Has Defined 72 Desktop Support Best Practices
Strategy: 7 Best Practices
Human Resources: 13 Best Practices
Process: 16 Best Practices
Technology: 10 Best Practices
Performance Measurement: 14 Best Practices
Communication: 12 Best Practices
Total Score: from 72 to 360
The lowest score possible on the Best Practices Process Assessment is 72: Maturity Level 1 X 72 Best Practices = 72. The highest score possible on the Best Practices Process Assessment is 360: Maturity Level 5 X 72 Best Practices = 360.
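The scoring arithmetic above can be sketched as follows; the component names and best-practice counts come from this slide, while the function name is an illustrative choice:

```python
# Best Practices Process Assessment scoring: each of the 72 practices is
# rated on the 1-5 maturity scale, so totals range from 72 to 360.

COMPONENTS = {
    "Strategy": 7, "Human Resources": 13, "Process": 16,
    "Technology": 10, "Performance Measurement": 14, "Communication": 12,
}

def total_score(ratings):
    """Sum of per-practice maturity ratings, each on the 1-5 scale."""
    assert all(1 <= r <= 5 for r in ratings)
    return sum(ratings)

n_practices = sum(COMPONENTS.values())
print(n_practices)                      # total number of defined practices
print(total_score([1] * n_practices))   # minimum possible assessment score
print(total_score([5] * n_practices))   # maximum possible assessment score
```

The component counts sum to 72, which is why the minimum score (all practices at maturity level 1) is 72 and the maximum (all at level 5) is 360.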

83 Strategy: 7 Best Practices (Best Practice / Company XYZ's Score / Peer Group Average)
1. Desktop Support has a well-defined mission, vision, and strategy. The vision and strategy are well-documented, and communicated to key stakeholders in the organization.
2. Desktop Support has a published Service Catalog, including a Supported Products List, that is distributed and communicated to key stakeholders including end users. The Service Catalog is available on-line.
3. Desktop Support has an action plan for continuous improvement. The plan is documented and distributed to key stakeholders in the organization, and specific individuals are held accountable for implementing the action plan.
4. Desktop Support is well integrated into the information technology function. Desktop Support acts as the "voice of the user" in IT, and is involved in major IT decisions and deliberations that affect end users. Desktop Support is alerted ahead of time so that they can prepare for major rollouts, or other changes in the IT environment.
5. Desktop Support has SLA's that define the level of service to be delivered to users. The SLA's are documented, published, and communicated to key stakeholders in the organization.
6. Desktop Support has OLA's (Operating Level Agreements) with other support groups in the organization (e.g., Level 1 support, field support, etc.). The OLA's clearly define the roles and responsibilities of each support group, and the different support groups abide by the terms of the OLA's.
7. Desktop Support actively seeks to improve Level 1 Resolution Rates, Incident First Contact Resolution Rate, and key Service Levels by implementing processes, technologies, and training that facilitate these objectives.
Summary Statistics: Total Score / Average Score
82

84 Human Resources: 13 Best Practices (Best Practice / Company XYZ's Score / Peer Group Average)
1. Desktop Support has a formalized and documented recruiting process for filling vacancies. Job requirements are well defined, and candidates are tested for both technical skills, and customer service soft skills.
2. New hires go through a formal training curriculum, including technical and customer service skills, and are required to pass a proficiency exam before independently handling customer incidents and service requests.
3. Veteran technicians (more than 6 months of experience) have access to training opportunities to improve their skill set, job performance, and the overall performance of the Desktop Support organization. Veteran technicians are required to complete a minimum number of refresher training hours each year.
4. Technician training classes and curricula are specifically designed to maximize customer satisfaction and the number of user incidents resolved on First Contact, and to minimize the Mean Time to Resolve.
5. Individual Technician training plans are clearly defined, documented and regularly updated.
6. Desktop Support has a formalized, documented Technician career path. Technicians are made aware of their career advancement opportunities, and are encouraged to proactively manage their careers. Technicians are coached at least once yearly on their career path and career advancement options.
7. Technicians have the opportunity to advance their careers in at least two ways: by improving their technical and customer service skills, and by improving their management and supervisory skills.
8. Technicians are coached by their supervisor in one-on-one sessions on a monthly basis. Logged tickets are reviewed, and the supervisor provides specific suggestions to each Technician on how to improve performance.
9. Technicians have quantifiable performance goals (e.g., for First Contact Resolution, Customer Satisfaction, Number of Tickets Handled per Month, etc.), and are held accountable for achieving their goals on a monthly basis.
10. Technicians are eligible for incentives and rewards based upon performance. These could include monetary incentives such as annual bonuses, or other incentives such as time off work, gift certificates, etc.
11. Technician performance goals are linked to and aligned with the overall Desktop Support goals and performance.
12. Technician Satisfaction surveys are conducted at least once per year, and the results of the survey are used to manage and improve Technician morale.
13. Formal Performance reviews are scheduled and completed for all Desktop Support personnel at least once annually.
Summary Statistics: Total Score / Average Score
83

85 Process: 16 Best Practices (Best Practice / Company XYZ's Score / Peer Group Average)
1. Desktop Support is part of an end-to-end support process, where Level 1 Support acts as the Single Point of Contact (SPOC) for user support.
2. Customers are offered a range of access options to Desktop Support, including live voice, voice mail, email, web chat, self-service, fax, and walk-in.
3. Ticket handling processes are standardized, documented, and available online. With few exceptions, the standards are followed by Desktop Support technicians.
4. Escalation points are well defined and documented. These include other support groups (e.g., Level 3 support, Field Support, etc.), and individuals to whom tickets may be escalated.
5. Rules for ticket escalation and transfer are well defined and documented. Technicians know when and where to transfer or route a ticket if they are unable to assist the user.
6. Indirect contact channels, including email, voice mail, and faxes, are treated with the same priority as live phone calls and chat sessions. The work queues from these channels are integrated, or worked in parallel.
7. Incoming tickets are assigned a severity code based upon the number of users impacted, and the urgency of the incident.
8. System alarms notify Desktop Support when a service level has been breached, whether at Desktop Support, or at another support level within the organization.
9. Desktop Support has a formal, rapid notification and correction process that is activated when a service level has been breached, whether at Desktop Support, or at some other support level.
10. Desktop Support has contingency plans to handle sudden, unexpected spikes in contact volume. These could include having supervisors and other indirect personnel handle incoming calls during a call spike.
11. Desktop Support has contingency plans to handle both short and long term interruptions in service delivery.
12. Desktop Support has a well defined service planning and readiness process that works closely with both internal engineering groups and vendors, and continues through product field testing and pre-release. This process enables Desktop Support to train for and prepare for supporting new products and services in the IT environment.
13. Desktop Support has a formal Knowledge Management Process that facilitates the acquisition, qualification, review, approval, and distribution of knowledge into a Knowledgebase.
14. Desktop Support has a mature workforce scheduling process that achieves high technician utilization, while maintaining reasonable service levels.
15. Desktop Support has an effective, ongoing process for projecting future workload and staffing requirements.
16. Desktop Support conducts periodic Root Cause Analysis (RCA) on the user contact profile to eliminate problems at their source.
Summary Statistics: Total Score / Average Score
84

86 Technology: 10 Best Practices (Best Practice / Company XYZ's Score / Peer Group Average)
1. Desktop Support has a full-featured ticket management system that facilitates effective ticket tracking, service level compliance, reporting, and root cause analysis.
2. Desktop Support has a comprehensive knowledge management tool that facilitates effective knowledge capture and re-use. Desktop technicians are able to quickly find solutions to user problems by searching the knowledge base. Solutions for the vast majority of user problems and questions can be found in the knowledge base.
3. The Desktop Support knowledgebase is used frequently by all Desktop Support technicians, and results in higher First Contact Resolution Rates, and lower resolution times (MTTR).
4. Desktop Support has an effective tool that allows technicians to proxy into a user's computer, take control of the computer, and remotely perform diagnostics and problem solving (e.g., Tivoli, Bomgar, GoTo Assist, etc.). The tool increases remote resolution rates, and reduces contact handle times.
5. Desktop Support has an Automated Password Reset (APR) capability that dramatically reduces the number of password resets that must be performed manually by Desktop Support technicians.
6. Desktop Support has an effective, integrated self-service portal that is available to all users. The self-service portal provides information, FAQ's, and solutions to problems that are more complex than simple password resets. The tool includes a direct link to Desktop Support technicians. Users are aware of the self-service portal, and usage rates are continuously increasing.
7. The ticket management system can track and monitor the skill levels of Desktop Support technicians based on closed tickets by product and/or service code.
8. Desktop Support uses technology alerts/alarms to notify Desktop Support or perform self-healing scripts when a customer or system issue is proactively identified.
9. Desktop Support has a multi-year plan for an integrated technology strategy.
10. Desktop Support utilizes a capital investment justification process based on ROI, and reports on post-installation ROI as part of this process.
Summary Statistics: Total Score / Average Score
85

87 Performance Measurement: 14 Best Practices (Best Practice / Company XYZ's Score / Peer Group Average)
1. Cost per Ticket is measured, recorded, and tracked on an ongoing basis. (Company XYZ: N/A)
2. Customer Satisfaction is measured, recorded, and tracked on an ongoing basis.
3. First Contact Resolution Rate for Incidents is measured, recorded, and tracked on an ongoing basis.
4. First Level Resolution is measured, recorded, and tracked on an ongoing basis. (Company XYZ: N/A)
5. Technician Utilization is measured, recorded, and tracked on an ongoing basis.
6. Technician Satisfaction is measured, recorded, and tracked.
7. Desktop Support maintains a balanced scorecard that provides a single, all-inclusive measure of Desktop Support performance.
8. Desktop Support tracks the number of tickets it handles that could have been resolved at Level 1.
9. Desktop Support conducts event-driven customer surveys whereby the results of customer satisfaction surveys can be linked back to a specific ticket, and to the specific technician handling the contact at Desktop Support.
10. Desktop Support conducts benchmarking at least once per year.
11. Desktop Support KPI's are used to establish "stretch" goals.
12. Desktop Support measures are used holistically and diagnostically to identify performance gaps in Desktop Support performance, and to prescribe actions that will improve performance.
13. Desktop Support understands key correlations and cause/effect relationships between the various KPI's. This enables Desktop Support to achieve desired performance goals by leveraging and driving the underlying "causal" metrics.
14. Desktop Support tracks the Mean Time to Resolve (MTTR), and the percentage of tickets resolved within 24 hours.
Summary Statistics: Total Score / Average Score
86

88 Communication: 12 Best Practices (Best Practice / Company XYZ's Score / Peer Group Average)
1. Desktop Support maintains active communication with all stakeholder groups, including Desktop Support organization employees, IT managers, company managers outside of IT, and customers.
2. Desktop Support has a formal communications schedule, and provides customized content for each stakeholder group.
3. Desktop Support has established User Group Liaisons who represent different groups within the user community. Desktop Support meets periodically with the liaisons to learn about user concerns and questions, and to communicate Desktop Support services, plans, and initiatives.
4. Desktop Support meets frequently with user groups, and holds "informational briefings" to educate users on supported products and services, hours of operation, training opportunities, tips for getting the most benefit from Desktop Support, etc.
5. Desktop Support meets frequently with other IT managers, and is an integral part of key decisions made within IT. Desktop Support plays the role of "voice of the user" within IT.
6. IT is required to deliver a "turnover package" to Desktop Support for all changes that will impact the user environment. This could include application updates, new desktop software, etc. The turnover package is designed to prepare Desktop Support to provide support to users in the affected areas.
7. Customers are told what to expect on resolution time when their ticket is escalated or if a call-back is required.
8. Desktop Support monitors all tickets, including those that are escalated, until ticket closure.
9. The value added by Desktop Support is communicated to key managers in IT, and expectations are formally established regarding Desktop Support roles and responsibilities.
10. Desktop Support tracks the number of training-related contacts it receives, and provides feedback to user groups within the organization on training areas that could help to reduce Desktop Support contact volumes.
11. Desktop Support provides training aids to users that enable them to use Desktop Support more effectively. These could include log-in screens with the Desktop Support phone number, chat windows that can be clicked to initiate a real-time chat session, mouse pads imprinted with the Desktop Support IVR menu, etc.
12. Desktop Support transmits outbound messages to users announcing major system and network outages, thereby alerting users about potential problems in the IT environment. These proactive messages help to reduce contact volumes during incidents that impact a large number of users. (Company XYZ: N/A; Peer Group Average: 2.71)
Summary Statistics: Total Score / Average Score
87

89 Best Practices Process Assessment Summary
Best Practices Component | Number of Success Factors | Average Company XYZ Score | Average Peer Group Score
Strategy | 7 | |
Human Resources | 13 | |
Process | 16 | |
Technology | 10 | |
Performance Measurement | 14 | |
Communication | 12 | |
Total Score | 72 | |
* An average score of 4.0 or above is required in each component of the Best Practices Model to achieve Best Practices Certification. 88

90 Best Practices Process Assessment Summary (Bar chart: average score by component, Company XYZ vs. Peer Group; vertical axis Average Score, scaled to 4.5. Sample Report Only. Data is not accurate.) 89

91 Total Process Assessment Scores (Chart: Overall Process Assessment Scores by Desktop Support group. Sample Report Only. Data is not accurate.) Key Statistics, Total Process Assessment Score: Low 83.1; the chart also marks the High, Average, Median, Your Score, and World-Class levels. 90

92 Process Maturity vs. Scorecard Performance (Scatter chart: Balanced Score, 0% to 100%, vs. Process Assessment Score, with Company XYZ plotted against the Global Database. Sample Report Only. Data is not accurate. Company XYZ Balanced Score: 75.2%; benchmark average Balanced Score: 58.6%. World-Class and Balanced Score Average reference lines are shown.) 91
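The scatter plot pairs each support group's Process Assessment Score with its Balanced Score, suggesting that process maturity and scorecard performance move together. The strength of that relationship can be quantified with a Pearson correlation; the sketch below uses made-up data points, not the benchmark's actual values:

```python
# Pearson correlation between process maturity and balanced scorecard
# performance, computed by hand so only the standard math library is needed.
from math import sqrt

def pearson_r(xs, ys):
    """Standard Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

process_scores = [120, 180, 240, 300, 340]        # illustrative, range 72-360
balanced_scores = [0.21, 0.40, 0.55, 0.74, 0.88]  # illustrative, 0 to 1

r = pearson_r(process_scores, balanced_scores)
print(round(r, 3))
```

An r close to 1 would indicate the strong positive relationship the chart implies: groups with more mature processes tend to post higher balanced scores.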

93 Interview Themes and Quotes Company XYZ 92

94 MetricNet Conducted Seven Desktop Support Interviews
Desktop Support Interviewees (Interviewee / Title):
Interviewee 1: Account Manager
Interviewee 2: Tower Lead
Interviewee 3: Service Desk Program Manager
Interviewee 4: Desktop Support Manager
Interviewee 5: Desktop Support Technician
Interviewee 6: Desktop Support Technician
Interviewee 7: IT Specialist
93

95 Key Themes from Desktop Support Interviews
Internal communication within the team is viewed as a key strength of desktop support.
The consensus among interviewees is that Desktop Support would benefit from having a knowledge base.
Most interviewees felt that Desktop Support's scope of work needs to be better defined.
Additionally, most interviewees felt that the support processes need to be better defined and documented.
Interviewees would like to see the training program improved, and most would like to receive more training.
The morale of Desktop Support is viewed as average.
Some interviewees indicated that it would help if they were able to use the same devices that they are asked to troubleshoot, e.g., iPhones and iPads.
94

96 Representative Comments from Desktop Support Interviews
Desktop Support does not have any processes or documentation. This makes it difficult, especially if the technician does not know how to fix a particular problem. The workload is not balanced, so the work is sometimes not completed on time.
The training needs to be improved. There needs to be continuous training and constant measurement of the strengths and performance of the team members.
The scope of support is not well defined. This needs to be documented.
Having a knowledge base will improve the service that Desktop Support can provide. It will not only shorten the resolution time, but it will also improve the quality of work.
Desktop Support needs leadership who can better manage and motivate the team. The morale of the team is average, at best. The team can be motivated by giving out incentives such as free lunches, or by recognizing technicians in front of the team for a job well done.
Categorizing the tickets will make the work easier for the technicians. Desktop Support is developing a process where the tickets are resolved on the day that they are received. It is no longer acceptable for tickets to be resolved the following day.
Desktop Support needs more training on iPads, iPhones, Windows 7 and Company XYZ docs. The technicians also need to be trained on the scope of work for new technologies that will be implemented.
It is the responsibility of Desktop Support to change the way the users see the team. Not all users are happy with the service that is being provided.
95

97 Representative Comments from Desktop Support Interviews
The technicians know a lot about the work but need stronger leadership.
The technicians need to have the same level of knowledge so the team can provide better service. A knowledgebase needs to be created to get everyone working at the same level.
It is a good idea to create specialists who can focus on particular issues. This will minimize the time spent in resolving issues and will improve the quality of the resolution.
Desktop Support needs more technicians to provide better services, and to ensure that someone will be available in case an urgent issue comes up.
The technicians need to be trained on how to treat VIPs. Learning how to deal with VIPs at Company XYZ will allow the technicians to work more effectively, and be more relaxed.
Company XYZ is migrating to Apple technology and Desktop Support needs more Apple devices to be able to address the issues better. Having these devices will allow the technicians to recreate the problem and solve it.
Internal communication is a big strength. The technicians transfer knowledge within the team. If a technician has a question, whoever knows the answer will assist.
Oftentimes Desktop Support is dealing with angry clients because the request has to go through Mexico first. By the time the issue gets passed on to the team, a couple of hours or a couple of days have already passed, and the client is already expecting a resolution.
96

98 Representative Comments from Desktop Support Interviews
When a technician leaves the team, it reduces the team's output because the new technicians have to go through a learning curve.
The headcount is now adequate, and has improved the personnel/work order incident ratio. The technicians are more relaxed and can focus more on the user.
There is no training program. The new hire training is provided on the job by the senior technicians.
Desktop Support has personal interaction with the users. The users can explain the issue a bit better in person, and the technicians can provide better troubleshooting because the technicians have seen the problem.
Configuring devices, installing computers and troubleshooting hardware issues are some of the strengths of Desktop Support.
Desktop Support's vision is to avoid situations where users are unable to do their work. The main challenge is to resolve issues in a timely manner.
The technicians help and support each other. Knowledge is shared with everyone.
Desktop Support needs to identify the timeframe for resolving particular types of issues so that users have realistic expectations about how much time is needed to resolve an issue.
Desktop Support would like to have a knowledge base so that technicians can easily find the information that's needed. The training can be improved by creating manuals.
97

99 Representative Comments from Desktop Support Interviews
Processes need to be created not just for the technicians to follow, but also for the users to know how their issue will be handled. Processes need to be followed to ensure that users can keep working.
It is difficult not to be stressed when there is so much work. To reduce the stress, the technicians try to help each other and talk about non-work related topics.
Some users think that Desktop Support is not providing good service.
98

100 Key Themes from Company XYZ Interviews
Company XYZ management is very disappointed with the quality of service provided by Company ABC; the service desk in particular is viewed as a failure.
Company XYZ believes that turnover for both the service desk and desktop support is excessive, and has been a huge problem.
Lack of training for the techs and agents at Company ABC contributes to turnover, and drives the perception that the service desk, in particular, is not very effective.
Company XYZ believes that the service desk agents are more interested in opening and closing tickets than in resolving the customer's problem.
Company ABC processes and procedures are not very mature, and are not well documented; knowledge management, for example, has not been developed at all by Company ABC.
Company XYZ interviewees believe that many bank employees have given up on the service desk, and call desktop support directly when they need help.
99

101 Representative Comments from Company XYZ Interviews
In the beginning, Company ABC brought in some good people. But they all left, and we had to start all over again.
Company ABC is not taking responsibility. They are not training their people properly, and they are not pushing ahead with improving their processes.
Company ABC is not good at documentation. They came in telling us that they would implement ITIL processes, and that they had ITIL Centers of Excellence that they could rely upon, but none of that has happened.
We had many good processes and procedures in place before Company ABC came in. But they have documented none of those. When we push Company ABC to produce documentation, they do so very reluctantly, and what they create is not very good.
Every ticket closed by desktop support should be tied to a knowledge article. But they are not doing that. There is really no knowledge management at all.
The service desk has a habit of putting tickets into pending status. They leave them there indefinitely because if they open them up again they will breach the service level. The result is that over time hundreds of tickets are left in pending status.
Desktop support is doing better than the service desk, but that's not saying much. I think it's fair to say that after three years our outsourcing experiment has failed.
100

102 Representative Comments from Company XYZ Interviews
The service desk agents are very nice, and polite. They just don't know how to solve the customers' problems because they don't have the training or experience.
I hear all the time that people from Company XYZ call the service desk and are told "Please be patient, I am just learning." Well, they have been at the bank for three years, so why are they just learning?
The calls just take too long, and always end in frustration. The agents in Mexico obviously don't have the skills to resolve the tickets. The agents don't have the knowledge to provide the support we need at the bank.
It seems like the incentive at the service desk is to open and close tickets. But they don't seem to care about whether or not they solve the problem.
Everyone at Company XYZ is dissatisfied with the service desk. I don't know how the customer satisfaction scores can be so high. I just don't believe it.
My biggest concern is that Company ABC doesn't take responsibility. If anything improves it's only because we push them to improve it. They should be taking ownership, but they don't.
I don't know why the turnover at Company ABC is so high. Maybe they are not paid enough, or maybe they are not treated well, but the high turnover leads to all the other problems we are having with Company ABC.
101

103 Conclusions and Recommendations Company XYZ 102

104 Notable Strengths
Company XYZ Peer Group Desktop Support has a number of notable strengths:
- Price metrics are excellent: top quartile performance for all price metrics
- Customer Satisfaction is good: second quartile performance
- Productivity metrics are good: Technician Utilization is the highest in the peer group, and good technician utilization is one of the key drivers behind Company XYZ's low Price per Ticket
- Company XYZ's overall performance on the benchmark was above average: Company XYZ placed 4th out of 19 desktop support groups in the benchmark, for top quartile performance overall
103

105 But Opportunities for Improvement Remain
- Service levels were among the worst in the benchmark: all service levels are in the 4th quartile. This is a primary source of dissatisfaction for Company XYZ
- Several technician metrics are weak: Technician Turnover, Absenteeism, and Tenure are all in the 4th quartile
- Most interviewees expressed concerns that processes and procedures in desktop support have not been well defined or documented
- Several important desktop support metrics are not being tracked by Company XYZ: Incident First Visit Resolution Rate, and % Resolved Level 1 Capable
- There is a significant difference in perception between Company XYZ and Company ABC, as evidenced by the results of the process assessment: Company ABC believes they are doing an outstanding job, while Company XYZ gives Company ABC significantly lower marks for their performance
104

106 Summary of Benchmarking Recommendations
1. Begin tracking and trending Incident First Visit Resolution Rate and % Resolved Level 1 Capable
2. Work on improving service levels, which were the lowest of all companies in the benchmark
3. Provide access to the knowledge base, and require that desktop support work in conjunction with the service desk to keep the knowledge base current
4. Develop improved documentation of processes and procedures for desktop support
5. Provide additional training opportunities for desktop support technicians
6. Consider adopting the MetricNet Desktop Support Balanced Scorecard, and update the scorecard monthly
7. Develop an internal communication program in conjunction with the service desk to improve the visibility and reputation of Company XYZ's end-user support functions
8. Discuss major points of difference between Company XYZ and Company ABC on the results of the process maturity assessment
105

107 Some Suggested Performance Targets
Performance Metric: Current Performance -> Target Performance
Incident First Visit Resolution Rate: N/A -> 85.0%
% Resolved Level 1 Capable: N/A -> 15.0%
Technician Job Satisfaction: 78.6% -> 80.0%
Mean Time to Resolve Incidents (hrs):
% of Incidents Resolved in 24 hours: 30.1% -> 70.0%
Mean Time to Fulfill Service Requests (Days):
% of Service Requests Fulfilled in 72 hours: 42.0% -> 60.0%
Annual Technician Training Hours: N/A -> 20
Balanced Score: 75.2% -> 81.7%
Achieving the performance targets recommended above will result in a desktop support balanced score of 81.7%, and elevate Company XYZ to second place in the benchmark.
106

108 Incident First Visit Resolution vs. Customer Satisfaction (Sample Report Only. Data is not accurate.)
[Scatter chart. X-axis: First Visit Resolution Rate (Incidents), 30% to 90%; Y-axis: Customer Satisfaction, 0% to 100%.]
107

109 Technician Satisfaction vs. Customer Satisfaction
[Scatter chart. X-axis: Technician Satisfaction, 30% to 100%; Y-axis: Customer Satisfaction, 0% to 100%.]
108

110 Measuring Ticket Defects: % Resolved Level 1 Capable
[Cause-and-effect diagram linking the metrics: Price per Ticket; Customer Satisfaction; Technician Utilization; FCR (Incidents); Service Levels / MTTR; Techs as a % of Total FTE's; Absenteeism / Turnover; % Resolved Level 1 Capable; Work / Travel Time; Scheduling Efficiency; Technician Satisfaction; Coaching; Career Path; Training Hours.]
109

111 Technician Experience vs. Incident FCR
[Scatter chart. X-axis: Technician Time on Job (months); Y-axis: Incident FCR, 20% to 90%.]
110

112 Annual Training Hours vs. Technician Job Satisfaction (Sample Report Only. Data is not accurate.)
[Scatter chart. X-axis: Annual Technician Training Hours; Y-axis: Technician Job Satisfaction, 40% to 100%.]
111

113 Incident MTTR vs. Customer Satisfaction
[Scatter chart. X-axis: Incident MTTR (days); Y-axis: Customer Satisfaction, 0% to 100%.]
112

114 Consider Adopting the Desktop Support Balanced Scorecard*
Performance Metric | Metric Weighting | Worst Case | Best Case | Your Performance | Metric Score | Balanced Score
Price per Incident | 15.0% | $ | $39.38 | $ | % | 13.7%
Price per Service Request | 15.0% | $ | $51.71 | $ | % | 13.8%
Customer Satisfaction | 25.0% | 66.5% | 98.5% | 92.0% | 79.7% | 19.9%
Incident First Visit Resolution Rate | 10.0% | 67.0% | 95.1% | 84.5% | 62.4% | 6.2%
Technician Utilization | 15.0% | 47.1% | 82.9% | 82.9% | 100.0% | 15.0%
% of Incidents Resolved in 24 hours | 5.0% | 30.1% | 86.7% | 30.1% | 0.0% | 0.0%
% of Service Requests Fulfilled in 72 hours | 5.0% | 33.5% | 80.7% | 42.0% | 18.0% | 0.9%
Technician Job Satisfaction | 10.0% | 69.9% | 85.4% | 78.6% | 56.3% | 5.6%
Total | 100.0% | N/A | N/A | N/A | N/A | 75.2%
Step 1: Eight critical performance metrics have been selected for the scorecard
Step 2: Each metric has been weighted according to its relative importance
Step 3: For each performance metric, the highest and lowest performance levels in the benchmark are recorded
Step 4: Your actual performance for each metric is recorded in this column
Step 5: Your score for each metric is then calculated: (worst case - actual performance) / (worst case - best case) x 100
Step 6: Your balanced score for each metric is calculated: metric score x weighting
* Peer group averages were used for Incident First Visit Resolution Rate since Company XYZ does not track this metric
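The Step 5 and Step 6 arithmetic can be sketched in a few lines of code. This is an illustrative sketch, not MetricNet's actual tool: it uses only the scorecard rows whose worst-case, best-case, and actual values appear in full above (the two price rows are omitted), so the sum it prints is a partial balanced score.

```python
# Sketch of the balanced scorecard arithmetic from Steps 3-6.
# Illustrative only; price metric rows are omitted because their
# worst-case and actual values are not shown in the scorecard above.

def metric_score(worst, best, actual):
    """Step 5: normalize actual performance to 0-1 between worst and best."""
    return (worst - actual) / (worst - best)

# (weighting, worst case, best case, actual performance)
scorecard = {
    "Customer Satisfaction":                  (0.25, 0.665, 0.985, 0.920),
    "Incident First Visit Resolution Rate":   (0.10, 0.670, 0.951, 0.845),
    "Technician Utilization":                 (0.15, 0.471, 0.829, 0.829),
    "% of Incidents Resolved in 24 hours":    (0.05, 0.301, 0.867, 0.301),
    "% of Service Requests Fulfilled in 72h": (0.05, 0.335, 0.807, 0.420),
    "Technician Job Satisfaction":            (0.10, 0.699, 0.854, 0.786),
}

# Step 6: weight each metric score, then sum the contributions.
balanced = sum(w * metric_score(worst, best, actual)
               for w, worst, best, actual in scorecard.values())
print(f"Partial balanced score: {balanced:.1%}")
```

Note how the formula handles both directions automatically: for a "higher is better" metric the worst case is the low number, for a cost-type metric it would be the high number, and the same expression yields 100% at best case and 0% at worst case either way.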

115 And Update the Scorecard Monthly (Sample Report Only. Data is not accurate.)
[Trend chart: Desktop Support Balanced Score by month, Jan through Dec, plotted from 40% to 85%, showing the monthly score against the 12-month average.]
114

116 Quality of Support Drives End-User Productivity
Productive Hours Lost per Employee per Year (n = 60), by performance quartile (1 = top, 4 = bottom):
Service Desk: Customer Satisfaction: 93.5% | 84.5% | 76.1% | 69.3%; First Contact Resolution Rate: 90.1% | 83.0% | 72.7% | 66.4%; Mean Time to Resolve (hours):
Desktop Support: Customer Satisfaction: 94.4% | 89.2% | 79.0% | 71.7%; First Contact Resolution Rate: 89.3% | 85.6% | 80.9% | 74.5%; Mean Time to Resolve (hours):
Average Productive Hours Lost per Employee per Year:

117 Internal Communication: Positioning Company XYZ for Future Success
[2x2 quadrant chart: Perceived Value (vertical axis, lower to higher) vs. Actual Value (horizontal axis, lower to higher). Upper region: Perceived Value > Actual Value; lower region: Perceived Value < Actual Value.]
116

118 Where Does Company XYZ Peer Group Desktop Support Operate?
[Perceived Value vs. Actual Value quadrant chart, marking Company XYZ in the "Perceived Value < Actual Value" region: a common (but dangerous) operating position.]
117

119 Operational Effectiveness First! (Sample Report Only. Data is not accurate.)
[Perceived Value vs. Actual Value quadrant chart showing the recommended sequence: #1 Operational Effectiveness, then #2 Brand Management.]
118

120 Closing the Perception vs. Reality Gap (Sample Report Only. Data is not accurate.)
[Perceived Value vs. Actual Value quadrant chart contrasting "Where you Are" (Perceived Value < Actual Value) with "Where you Should Be", and the path for closing the perception gap.]
119

121 Image Management: The Five W's
1. Who: Who are the Key Stakeholder Groups?
2. What: What are the Key Messages?
3. When: When are You Going to Communicate Them?
4. Where/How: Where/How do You Reach the Stakeholders?
5. Why: Why are We Doing This?
120

122 Key Success Factors in Desktop Support Image Management
Channels (Use All Available): Log-in messages; Newsletters; Reference Guides; Asset tags; Surveys; User liaisons
Timing (Frequent Contact): New employee orientation; At session log-in; During training; During the incident; At scheduled sessions
Messages (Multiple Messages): Services; Major initiatives; Performance Levels; FAQ's; Success Stories
121

123 Managing Expectations for Desktop Support
We've all heard the expression "Expectations Not Set are Expectations Not Met!" So, let's get serious about proactively managing expectations!
122


125 Support Drives Customer Satisfaction for All of IT
[Bar chart: Factors Contributing to IT Customer Satisfaction, % saying "very important" (n = 1,044; global large cap companies; survey type: multiple choice, 3 responses allowed per survey): Service Desk 84%; Desktop Support 47%; Network Outages 31%; VPN 29%; Training 22%; Enterprise Applications 19%; Desktop Software 8%.]
84% cited the service desk as a very important factor in their overall satisfaction with corporate IT. 47% cited desktop support as a very important factor in their overall satisfaction with corporate IT.
124

126 Company XYZ Internal Communication Summary
Managing the gap between perception and reality is fairly straightforward. It doesn't take a lot of time, or cost a lot of money. But it is critically important: the success of your support organization depends as much on your image as it does on your actual performance!
The benefits of effective Image Management include:
- Customer loyalty and positive word-of-mouth referrals
- Credibility, which leverages your ability to Get Things Done!
- A positive image for IT overall
- High levels of Customer Satisfaction
125

127 Company XYZ Must Improve Process Maturity Over Time (Sample Report Only. Data is not accurate.)
[Chart: Process Assessment Score vs. Balanced Score, plotting Company XYZ against MetricNet's global database. Company XYZ's Balanced Score is 75.2%; the global database average Process Assessment Score is 58.6%; a world-class region is marked where both scores are high.]
126

128 Best Practice Focus Areas: Strategy
Strategy Best Practices Defined:
- Desktop Support has OLA's (Operating Level Agreements) with other support groups in the organization (e.g., Level 1 support, field support, etc.). The OLA's clearly define the roles and responsibilities of each support group, and the different support groups abide by the terms of the OLA's.
- Desktop Support has SLA's that define the level of service to be delivered to users. The SLA's are documented, published, and communicated to key stakeholders in the organization.
- Desktop Support has a well-defined mission, vision, and strategy. The vision and strategy are well-documented, and communicated to key stakeholders in the organization.
- Desktop Support actively seeks to improve Level 1 Resolution Rates, Incident First Contact Resolution Rate, and key Service Levels by implementing processes, technologies, and training that facilitate these objectives. (Company XYZ's Score: 2.0)
- Desktop Support is well integrated into the information technology function. Desktop Support acts as the "voice of the user" in IT, and is involved in major IT decisions and deliberations that affect end users. Desktop Support is alerted ahead of time so that they can prepare for major rollouts, or other changes in the IT environment.
- Desktop Support has a published Service Catalog, including a Supported Products List, that is distributed and communicated to key stakeholders including end users. The Service Catalog is available on-line.
- Desktop Support has an action plan for continuous improvement. The plan is documented and distributed to key stakeholders in the organization, and specific individuals are held accountable for implementing the action plan.

129 Best Practice Focus Areas: Human Resources
Human Resources Best Practices Defined:
- Technicians have quantifiable performance goals (e.g., for First Contact Resolution, Customer Satisfaction, Number of Tickets Handled per Month, etc.), and are held accountable for achieving their goals on a monthly basis.
- Technician Satisfaction surveys are conducted at least once per year, and the results of the survey are used to manage and improve Technician morale.
- Technician performance goals are linked to and aligned with the overall Desktop Support goals and performance.
- Formal performance reviews are scheduled and completed for all Desktop personnel at least once annually.
- New hires go through a formal training curriculum, including technical and customer service skills, and are required to pass a proficiency exam before independently handling customer incidents and service requests.
- Technicians have the opportunity to advance their careers in at least two ways: by improving their technical and customer service skills, and by improving their management and supervisory skills.
- Desktop Support has a formalized and documented recruiting process for filling vacancies. Job requirements are well defined, and candidates are tested for both technical skills and customer service soft skills.
- Technicians are coached by their supervisor in one-on-one sessions on a monthly basis. Logged tickets are reviewed, and the supervisor provides specific suggestions to each Technician on how to improve performance.
- Desktop Support has a formalized, documented Technician career path. Technicians are made aware of their career advancement opportunities, and are encouraged to proactively manage their careers. Technicians are coached at least once yearly on their career path and career advancement options.
- Technicians are eligible for incentives and rewards based upon performance. These could include monetary incentives such as annual bonuses, or other incentives such as time off work, gift certificates, etc.
- Individual Technician training plans are clearly defined, documented, and regularly updated. Technician training classes and curricula are specifically designed to maximize customer satisfaction and the number of user incidents resolved on First Contact, and to minimize the Mean Time to Resolve.
- Veteran technicians (more than 6 months of experience) have access to training opportunities to improve their skill set, job performance, and the overall performance of the Desktop Support organization. Veteran technicians are required to complete a minimum number of refresher training hours each year.

130 Best Practice Focus Areas: Process
Process Best Practices Defined:
- Incoming tickets are assigned a severity code based upon the number of users impacted, and the urgency of the incident.
- System alarms notify Desktop Support when a service level has been breached, whether at Desktop Support, or at another support level within the organization.
- Desktop Support has contingency plans to handle sudden, unexpected spikes in contact volume. These could include having supervisors and other indirect personnel handle incoming calls during a call spike.
- Desktop Support has a well-defined service planning and readiness process that works closely with both internal engineering groups and vendors, and continues through product field testing and pre-release. This process enables Desktop Support to train for and prepare for supporting new products and services in the IT environment.
- Ticket handling processes are standardized, documented, and available online. With few exceptions, the standards are followed by Desktop Support technicians.
- Escalation points are well defined and documented. These include other support groups (e.g., Level 3 support, Field Support, etc.), and individuals to whom tickets may be escalated.
- Rules for ticket escalation and transfer are well defined and documented. Technicians know when and where to transfer or route a ticket if they are unable to assist the user.
- Desktop Support has a mature workforce scheduling process that achieves high technician utilization, while maintaining reasonable service levels.
- Desktop Support has a formal Knowledge Management Process that facilitates the acquisition, qualification, review, approval, and distribution of knowledge into a Knowledgebase.
- Desktop Support has a formal, rapid notification and correction process that is activated when a service level has been breached, whether at Desktop Support, or at some other support level.
- Desktop Support conducts periodic Root Cause Analysis (RCA) on the user contact profile to eliminate problems at their source.
- Desktop Support has contingency plans to handle both short and long term interruptions in service delivery.
- Desktop Support has an effective, ongoing process for projecting future workload and staffing requirements.
- Desktop Support is part of an end-to-end support process, where Level 1 Support acts as the Single Point of Contact (SPOC) for user support.
- Customers are offered a range of access options to Desktop Support, including live voice, voice mail, e-mail, web chat, self-service, fax, and walk-in.
- Indirect contact channels, including e-mail, voice mail, and faxes, are treated with the same priority as live phone calls and chat sessions. The work queues from these channels are integrated, or worked in parallel.
129

131 Best Practice Focus Areas: Technology
Technology Best Practices Defined:
- Desktop Support has an effective tool that allows technicians to proxy into a user's computer, take control of the computer, and remotely perform diagnostics and problem solving (e.g., Tivoli, Bomgar, GoTo Assist, etc.). The tool increases remote resolution rates, and reduces contact handle times.
- Desktop Support has a full-featured ticket management system that facilitates effective ticket tracking, service level compliance, reporting, and root cause analysis.
- Desktop Support has a comprehensive knowledge management tool that facilitates effective knowledge capture and reuse. Desktop technicians are able to quickly find solutions to user problems by searching the knowledge base. Solutions for the vast majority of user problems and questions can be found in the knowledge base.
- Desktop Support uses technology alerts/alarms to notify Desktop Support or perform self-healing scripts when a customer or system issue is proactively identified.
- The Desktop Support knowledgebase is used frequently by all Desktop Support technicians, and results in higher First Contact Resolution Rates, and lower resolution times (MTTR).
- Desktop Support has an effective, integrated self-service portal that is available to all users. The self-service portal provides information, FAQ's, and solutions to problems that are more complex than simple password resets. The tool includes a direct link to Desktop Support technicians. Users are aware of the self-service portal, and usage rates are continuously increasing.
- Desktop Support has a multi-year plan for an integrated technology strategy.
- Desktop Support utilizes a capital investment justification process based on ROI, and reports on post-installation ROI as part of this process.
- Desktop Support has an Automated Password Reset (APR) capability that dramatically reduces the number of password resets that must be performed manually by Desktop Support technicians.
- The ticket management system can track and monitor the skill levels of Desktop Support technicians based on closed tickets by product and/or service code.
130

132 Best Practice Focus Areas: Performance Measurement
Performance Measurement Best Practices Defined:
- First Contact Resolution Rate for Incidents is measured, recorded, and tracked on an ongoing basis.
- Customer Satisfaction is measured, recorded, and tracked on an ongoing basis.
- Technician Utilization is measured, recorded, and tracked on an ongoing basis.
- Desktop Support understands key correlations and cause/effect relationships between the various KPI's. This enables Desktop Support to achieve desired performance goals by leveraging and driving the underlying "causal" metrics.
- Desktop Support KPI's are used to establish "stretch" goals.
- Desktop Support measures are used holistically and diagnostically to identify performance gaps in Desktop Support performance, and to prescribe actions that will improve performance.
- Desktop Support conducts event-driven customer surveys whereby the results of customer satisfaction surveys can be linked back to a specific ticket, and to a specific technician handling the contact at Desktop Support.
- Technician Satisfaction is measured, recorded, and tracked.
- Desktop Support tracks the Mean Time to Resolve (MTTR), and the percentage of tickets resolved within 24 hours. (Company XYZ's Score: 1.0)
- Desktop Support tracks the number of tickets that could have been resolved by the Desktop Support organization at Level 1.
- Desktop Support conducts benchmarking at least once per year.
- Desktop Support maintains a balanced scorecard that provides a single, all-inclusive measure of Desktop Support performance.
- Cost per Ticket is measured, recorded, and tracked on an ongoing basis.
- First Level Resolution is measured, recorded, and tracked on an ongoing basis.
131

133 Best Practice Focus Areas: Communication
Communication Best Practices Defined:
- Desktop Support maintains active communication with all stakeholder groups, including Desktop Support organization employees, IT managers, company managers outside of IT, and customers.
- Desktop Support tracks the number of training-related contacts it receives, and provides feedback to user groups within the organization on training areas that could help to reduce Desktop Support contact volumes.
- IT is required to deliver a "turnover package" to Desktop Support for all changes that will impact the user environment. This could include application updates, new desktop software, etc. The turnover package is designed to prepare Desktop Support to provide support to users in the affected areas.
- Desktop Support monitors all tickets, including those that are escalated, until ticket closure.
- The value added by Desktop Support is communicated to key managers in IT, and expectations are formally established regarding Desktop Support roles and responsibilities.
- Desktop Support has a formal communications schedule, and provides customized content for each stakeholder group.
- Customers are told what to expect on resolution time when their ticket is escalated or if a call-back is required.
- Desktop Support meets frequently with other IT managers, and is an integral part of key decisions made within IT. Desktop Support plays the role of "voice of the user" within IT.
- Desktop Support has established User Group Liaisons who represent different groups within the user community. Desktop Support meets periodically with the liaisons to learn about user concerns and questions, and to communicate Desktop Support services, plans, and initiatives.
- Desktop Support meets frequently with user groups, and holds "informational briefings" to educate users on supported products and services, hours of operation, training opportunities, and tips for getting the most benefit from Desktop Support.
- Desktop Support provides training aids to users that enable them to use Desktop Support more effectively. These could include log-in screens with the Desktop Support phone number, chat windows that can be clicked to initiate a real-time chat session, mouse pads imprinted with the Desktop Support IVR menu, etc.
- Desktop Support transmits outbound messages to users announcing major system and network outages, thereby alerting users about potential problems in the IT environment. These proactive messages help to reduce contact volumes during incidents that impact a large number of users.
132

134 Some Suggested Performance Targets
Performance Metric: Current Performance -> Target Performance
Incident First Visit Resolution Rate: N/A -> 85.0%
% Resolved Level 1 Capable: N/A -> 15.0%
Technician Job Satisfaction: 78.6% -> 80.0%
Mean Time to Resolve Incidents (hrs):
% of Incidents Resolved in 24 hours: 30.1% -> 70.0%
Mean Time to Fulfill Service Requests (Days):
% of Service Requests Fulfilled in 72 hours: 42.0% -> 60.0%
Annual Technician Training Hours: N/A -> 20
Balanced Score: 75.2% -> 81.7%
Achieving the performance targets recommended above will result in a desktop support balanced score of 81.7%, and elevate Company XYZ to second place in the benchmark.
133

135 Detailed Benchmarking Comparisons Company XYZ 134

136 Price Metrics Company XYZ 135

137 Price Metrics: Price per Ticket
Definition: Price per Ticket is the amount paid to the outsourcer for each Desktop Support ticket handled. It is typically calculated by dividing the annual fee paid to the outsourcer by the annual Desktop Support ticket volume. Ticket volume includes both incidents and service requests.
Why it's Important: Price per Ticket is one of the most important Desktop Support metrics. It is a measure of contract efficiency and effectiveness with your outsourcer. A higher than average Price per Ticket is not necessarily a bad thing, particularly if accompanied by higher than average quality levels. Conversely, a low Price per Ticket is not necessarily good, particularly if the low price is achieved by sacrificing Customer Satisfaction or service levels. Every outsourced Desktop Support organization should track and trend Price per Ticket on an ongoing basis.
Key Correlations: Price per Ticket is strongly correlated with the following metrics: Price per Incident, Price per Service Request, Incident First Visit Resolution Rate, Average Incident Work Time, Average Service Request Work Time, Average Travel Time per Ticket
136
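The calculation in the definition above is a single division over annual totals. A minimal sketch, using hypothetical figures (not Company XYZ's actual fee or volumes):

```python
# Price per Ticket = annual outsourcer fee / annual ticket volume,
# where ticket volume counts both incidents and service requests.
# All figures below are hypothetical, for illustration only.

annual_fee = 2_400_000.00          # total paid to the outsourcer per year
annual_incidents = 18_000
annual_service_requests = 12_000

annual_tickets = annual_incidents + annual_service_requests
price_per_ticket = annual_fee / annual_tickets
print(f"Price per Ticket: ${price_per_ticket:.2f}")  # prints: Price per Ticket: $80.00
```

Price per Incident and Price per Service Request, defined on the following pages, follow the same pattern with the fee and volume restricted to incidents or service requests respectively.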

138 Price per Ticket (Sample Report Only. Data is not accurate.)
[Bar chart: Price per Ticket by desktop support organization in the peer group. Key Statistics: High $ ; Average $ ; Median $99.69; Low $50.49; Company XYZ $79.98.]
137

139 Price Metrics: Price per Incident
Definition: Price per Incident is the amount paid to the outsourcer for each Desktop Support incident handled. It is typically calculated by dividing the annual fee paid to the outsourcer (for incidents only) by the annual Desktop Support incident volume.
Why it's Important: Price per Incident is one of the most important Desktop Support metrics. It is one of the key components of Price per Ticket, the other being Price per Service Request. A higher than average Price per Incident is not necessarily a bad thing, particularly if accompanied by higher than average quality levels. Conversely, a low Price per Incident is not necessarily good, particularly if the low price is achieved by sacrificing quality of service. Every Desktop Support organization should track and trend Price per Incident on a monthly basis.
Key Correlations: Price per Incident is strongly correlated with the following metrics: Price per Ticket, Incident First Visit Resolution Rate, Average Incident Work Time, Average Travel Time per Ticket
138

140 Price per Incident (Sample Report Only. Data is not accurate.)
[Bar chart: Price per Incident by desktop support organization in the peer group. Key Statistics: High $ ; Average $ ; Median $96.35; Low $39.38; Company XYZ $62.13.]
139

141 Price Metrics: Price per Service Request
Definition: Price per Service Request is the amount paid to the outsourcer for each Desktop Support service request handled. It is typically calculated by dividing the annual fee paid to the outsourcer (for service requests only) by the annual Desktop Support service request volume.
Why it's Important: Price per Service Request is one of the most important Desktop Support metrics. It is one of the key components of Price per Ticket, the other being Price per Incident. A higher than average Price per Service Request is not necessarily a bad thing, particularly if accompanied by higher than average quality levels. Conversely, a low Price per Service Request is not necessarily good, particularly if the low price is achieved by sacrificing quality of service. Every Desktop Support organization should track and trend Price per Service Request on a monthly basis.
Key Correlations: Price per Service Request is strongly correlated with the following metrics: Price per Ticket, Average Service Request Work Time, Average Travel Time per Ticket
140

142 Price per Service Request (Sample Report Only. Data is not accurate.)
[Bar chart: Price per Service Request by desktop support organization in the peer group. Key Statistics: High $ ; Average $ ; Median $ ; Low $51.71; Company XYZ $82.84.]
141

143 Productivity Metrics Company XYZ 142

144 Productivity Metrics: Tickets per Technician per Month
Definition: Tickets per Technician per Month is the average monthly ticket volume divided by the average Full Time Equivalent (FTE) technician headcount. Ticket volume includes both incidents and service requests. Technician headcount is the average FTE number of employees and contractors handling Desktop Support tickets.
Why it's Important: Tickets per Technician per Month is an important indicator of technician productivity. A low number could indicate low Technician Utilization, poor scheduling efficiency or schedule adherence, or a higher than average Ticket Handle Time. Conversely, a high number of technician-handled tickets may indicate high Technician Utilization, good scheduling efficiency and schedule adherence, or a lower than average Ticket Handle Time. Every Desktop Support group should track and trend this metric on a monthly basis.
Key Correlations: Tickets per Technician per Month is strongly correlated with the following metrics: Technician Utilization, Average Incident Work Time, Average Service Request Work Time, Average Travel Time per Ticket
143
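The FTE denominator in the definition above is what lets part-time staff count fractionally. A brief sketch with hypothetical headcounts and volumes (not peer group data):

```python
# Tickets per Technician per Month = avg monthly ticket volume / avg FTE count.
# FTE headcount counts part-time technicians fractionally.
# All figures below are hypothetical, for illustration only.

monthly_incidents = 1_500
monthly_service_requests = 900
full_time_technicians = 20
part_time_technicians = 4          # assume each works half time

fte_technicians = full_time_technicians + 0.5 * part_time_technicians
tickets_per_tech = (monthly_incidents + monthly_service_requests) / fte_technicians
print(f"Tickets per Technician per Month: {tickets_per_tech:.1f}")
```

Incidents per Technician per Month and Service Requests per Technician per Month, defined on the following pages, are the same ratio computed over a single ticket type.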

145 Tickets per Technician per Month (Sample Report Only. Data is not accurate.)
[Bar chart: Tickets per Technician per Month by desktop support organization in the peer group. Key Statistics: High ; Average ; Median 93.5; Low 32.3; Company XYZ .]
144

146 Productivity Metrics: Incidents per Technician per Month
Definition: Incidents per Technician per Month is the average monthly incident volume divided by the average Full Time Equivalent (FTE) technician headcount. Technician headcount is the average FTE number of employees and contractors handling Desktop Support tickets.
Why it's Important: Incidents per Technician per Month is an important indicator of technician productivity. A low number could indicate low Technician Utilization, poor scheduling efficiency or schedule adherence, or a higher than average Incident Handle Time. Conversely, a high number of technician-handled incidents may indicate high Technician Utilization, good scheduling efficiency and schedule adherence, or a lower than average Incident Handle Time. Every Desktop Support group should track and trend this metric on a monthly basis.
Key Correlations: Incidents per Technician per Month is strongly correlated with the following metrics: Technician Utilization, Average Incident Work Time, Average Travel Time per Ticket, Incidents as a % of Total Ticket Volume
145

147 Incidents per Technician per Month (Sample Report Only. Data is not accurate.)
[Bar chart: Incidents per Technician per Month by desktop support organization in the peer group. Key Statistics: High ; Average ; Median 56.9; Low 19.2; Company XYZ .]
146

148 Productivity Metrics: Service Requests per Tech per Month
Definition: Service Requests per Technician per Month is the average monthly service request volume divided by the average Full Time Equivalent (FTE) technician headcount. Technician headcount is the average FTE number of employees and contractors handling Desktop Support tickets.
Why it's Important: Service Requests per Technician per Month is an important indicator of technician productivity. A low number could indicate low Technician Utilization, poor scheduling efficiency or schedule adherence, or a higher than average Service Request Handle Time. Conversely, a high number of technician-handled service requests may indicate high Technician Utilization, good scheduling efficiency and schedule adherence, or a lower than average Service Request Handle Time. Every Desktop Support group should track and trend this metric on a monthly basis.
Key Correlations: Service Requests per Technician per Month is strongly correlated with the following metrics: Technician Utilization, Average Service Request Work Time, Average Travel Time per Ticket, Incidents as a % of Total Ticket Volume
147

149 Productivity Metrics: Service Requests per Tech per Month (Sample Report Only. Data is not accurate.) Key Statistics: High Average Median 17.8 Low 7.1 Company XYZ

150 Productivity Metrics: Technicians as a Percent of Total FTE's Definition This metric is the Full Time Equivalent (FTE) Technician headcount divided by the total Desktop Support headcount. It is expressed as a percentage, and represents the percentage of total Desktop Support personnel who are engaged in direct customer service activities. Why it's Important Technician headcount as a percent of total Desktop Support headcount is an important measure of management and overhead efficiency. Since non-technicians include both management and non-management personnel (e.g., supervisors and team leads, QA/QC, trainers, etc.), this metric is not a pure measure of management span of control. It is, however, a more useful metric than management span of control because the denominator of this ratio takes into account all personnel who are not directly engaged in customer service activities. Key Correlations Technicians as a % of Total Headcount is strongly correlated with the following metrics: Cost per Inbound Contact, Cost per Minute of Inbound Handle Time

151 Productivity Metrics: Technicians as a Percent of Total FTE's (Sample Report Only. Data is not accurate.) Key Statistics: High 97.1% Average % Median 88.1% Low 79.6% Company XYZ 87.5%

152 Productivity Metrics: Technician Utilization Definition Technician Utilization is the average time that a Technician spends handling both inbound and outbound contacts per month, divided by the number of work hours in a given month. The calculation for Technician Utilization is shown on the next page. Why it's Important Technician Utilization is the single most important indicator of Technician productivity. It measures the percentage of time that the average Technician is in work mode, and is independent of Contact Handle Time or call complexity. Key Correlations Technician Utilization is strongly correlated with the following metrics: Inbound Contacts per Technician per Month, Cost per Inbound Contact, Cost per Minute of Inbound Handle Time, Technician Occupancy, Average Speed of Answer

153 Technician Utilization Defined. Technician Utilization = [(Average number of inbound calls handled by a tech in a month) X (Average inbound handle time in minutes) + (Average number of outbound calls handled by a tech in a month) X (Average outbound handle time in minutes)] / [(Average number of days worked in a month) X (Number of work hours in a day) X (60 minutes/hr)]. Tech Utilization is a measure of actual time worked by techs in a month, divided by total time at work during the month. It takes into account both inbound and outbound contacts handled by the techs, but it does not make adjustments for sick days, holidays, training time, project time, or idle time.

154 Example: Desktop Support Technician Utilization. Inbound Contacts per Technician per Month = 375; Outbound Contacts per Technician per Month = 225; Average Inbound Contact Handle Time = 10 minutes; Average Outbound Contact Handle Time = 5 minutes. Technician Utilization = [(375 Inbound Contacts per Month) X (10 minutes) + (225 Outbound Contacts per Month) X (5 minutes)] / [(21.5 working days per month) X (7.5 work hours per day) X (60 minutes/hr)] = 50.4%
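The worked example above can be reproduced in a few lines of code. Note that 21.5 workdays per month and 7.5 work hours per day are the values used in the example, not universal constants; substitute your own schedule:

```python
def technician_utilization(inbound_contacts: float, outbound_contacts: float,
                           inbound_handle_min: float, outbound_handle_min: float,
                           workdays_per_month: float = 21.5,
                           work_hours_per_day: float = 7.5) -> float:
    """Minutes of handle time worked per month divided by minutes at work."""
    minutes_worked = (inbound_contacts * inbound_handle_min
                      + outbound_contacts * outbound_handle_min)
    minutes_at_work = workdays_per_month * work_hours_per_day * 60
    return minutes_worked / minutes_at_work

# The example above: 375 inbound @ 10 min, 225 outbound @ 5 min
print(f"{technician_utilization(375, 225, 10, 5):.1%}")  # 50.4%
```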

155 Productivity Metrics: Technician Utilization (Sample Report Only. Data is not accurate.) Key Statistics: High 82.9% Average % Median 55.5% Low 47.1% Company XYZ 82.9%

156 Service Level Metrics Company XYZ

157 Service Level Metrics: Mean Time to Resolve Incidents Definition Mean Time to Resolve (MTTR) Incidents is the average number of working hours that elapse from the time an incident is reported until the time the incident is closed. Non-working hours are not included in the calculation. If, for example, an incident is reported at 3:00 pm on a Tuesday, and the ticket is closed at 3:00 pm on Wednesday, the MTTR will be 8 hours, not 24 hours. Why it's Important Service levels, including the MTTR for incidents and service requests, are a key driver of customer satisfaction with Desktop Support. Key Correlations Mean Time to Resolve Incidents is strongly correlated with the following metrics: Customer Satisfaction, Average Incident Work Time, Average Travel Time per Ticket, % of Incidents Resolved in 8 Hours
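Computing elapsed working hours means clipping each calendar day to the support window. A minimal sketch, assuming an 8-hour weekday window of 9:00 to 17:00 (an assumption for illustration; substitute your own support hours and a holiday calendar):

```python
from datetime import datetime, timedelta

def working_hours_between(start: datetime, end: datetime,
                          day_open: int = 9, day_close: int = 17) -> float:
    """Working hours elapsed between two timestamps, counting only
    weekday time inside the day_open-day_close window (holidays ignored)."""
    total = 0.0
    cursor = start
    while cursor.date() <= end.date():
        if cursor.weekday() < 5:  # Monday-Friday only
            window_open = cursor.replace(hour=day_open, minute=0,
                                         second=0, microsecond=0)
            window_close = cursor.replace(hour=day_close, minute=0,
                                          second=0, microsecond=0)
            lo = max(cursor, window_open)
            hi = min(end, window_close) if cursor.date() == end.date() else window_close
            if hi > lo:
                total += (hi - lo).total_seconds() / 3600
        # jump to midnight of the next calendar day
        cursor = (cursor + timedelta(days=1)).replace(hour=0, minute=0,
                                                      second=0, microsecond=0)
    return total

# The example above: reported Tue 3:00 pm, closed Wed 3:00 pm -> 8 working hours
print(working_hours_between(datetime(2024, 1, 2, 15, 0),
                            datetime(2024, 1, 3, 15, 0)))  # 8.0
```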

158 Service Level Metrics: Mean Time to Resolve Incidents (hrs) (Sample Report Only. Data is not accurate.) Key Statistics: High Average Median 6.30 Low 3.00 Company XYZ

159 Service Level Metrics: % of Incidents Resolved in 24 Hours Definition The % of Incidents Resolved in 24 Hours is fairly self-explanatory. Non-working days are not included in the calculation. So, for example, an incident that is reported at 1:00 pm on a Friday will be resolved in 24 hours if the ticket is closed by 1:00 pm on the following Monday. Why it's Important Service levels, including the % of Incidents Resolved in 24 Hours, are a key driver of customer satisfaction with Desktop Support. Key Correlations % of Incidents Resolved in 24 Hours is strongly correlated with the following metrics: Customer Satisfaction, Average Incident Work Time, Average Travel Time per Ticket, Mean Time to Resolve Incidents

160 Service Level Metrics: % of Incidents Resolved in 24 Hours (Sample Report Only. Data is not accurate.) Key Statistics: High 86.7% Average % Median 75.3% Low 30.1% Company XYZ 30.1%

161 Service Level Metrics: Mean Time to Fulfill Service Requests Definition Mean Time to Fulfill (MTTF) Service Requests is the average number of working days that elapse from the time a service request is logged until the time the service request is completed. Non-working days are not included in the calculation. If, for example, a service request is logged at 3:00 pm on a Friday, and the ticket is closed at 3:00 pm on Tuesday, the MTTF will be 2 working days, not 4 calendar days. Why it's Important Service levels, including the MTTF for service requests and incidents, are a key driver of customer satisfaction with Desktop Support. Key Correlations Mean Time to Fulfill Service Requests is strongly correlated with the following metrics: Customer Satisfaction, Average Service Request Work Time, Average Travel Time per Ticket, % of Service Requests Fulfilled in 72 Hours
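The working-day count in the example above can be sketched the same way. This simplified version counts weekdays elapsed after the logging date and ignores holidays and time of day (both assumptions for illustration):

```python
from datetime import date, timedelta

def working_days_elapsed(logged: date, closed: date) -> int:
    """Weekdays elapsed from the logging date to the close date
    (holidays and time of day ignored)."""
    days = 0
    d = logged
    while d < closed:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday-Friday only
            days += 1
    return days

# The example above: logged Fri 2024-01-05, closed Tue 2024-01-09
# -> Monday and Tuesday elapse, so 2 working days
print(working_days_elapsed(date(2024, 1, 5), date(2024, 1, 9)))  # 2
```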

162 Service Level Metrics: Mean Time to Fulfill Service Requests (Days) (Sample Report Only. Data is not accurate.) Key Statistics: High 6.40 Average Median 3.10 Low 1.20 Company XYZ

163 Service Level Metrics: % of Service Requests Fulfilled in 72 Hours Definition The % of Service Requests Fulfilled in 72 Hours is fairly self-explanatory. Non-working days are not included in the calculation. So, for example, a Service Request that is logged at 1:00 pm on a Friday will be fulfilled in 72 hours if the request is fulfilled by 1:00 pm the following Wednesday. Why it's Important Service levels, including the % of Service Requests Fulfilled in 72 Hours, are a key driver of customer satisfaction with Desktop Support. Key Correlations % of Service Requests Fulfilled in 72 Hours is strongly correlated with the following metrics: Customer Satisfaction, Average Service Request Work Time, Average Travel Time per Ticket, Mean Time to Fulfill Service Requests

164 Service Level Metrics: % of Service Requests Fulfilled in 72 Hours (Sample Report Only. Data is not accurate.) Key Statistics: High 80.7% Average % Median 51.8% Low 33.5% Company XYZ 42.0%

165 Quality Metrics Company XYZ

166 Quality Metrics: Customer Satisfaction Definition Customer Satisfaction is the percentage of customers who are either satisfied or very satisfied with their Desktop Support experience. This metric can be captured in a number of ways, including automated after-call IVR surveys, follow-up outbound (live Technician) calls, surveys, postal surveys, etc. Why it's Important Customer Satisfaction is the single most important measure of Desktop Support quality. Any successful Desktop Support organization will have consistently high Customer Satisfaction ratings. Some Desktop Support managers are under the impression that a low Cost per Inbound Contact may justify a lower level of Customer Satisfaction. But this is not true. MetricNet's research shows that even Desktop Support groups with a very low Cost per Inbound Contact can achieve consistently high Customer Satisfaction ratings. Key Correlations Customer Satisfaction is strongly correlated with the following metrics: First Contact Resolution Rate, Call Quality

167 Quality Metrics: Customer Satisfaction (Sample Report Only. Data is not accurate.) Key Statistics: High 98.5% Average % Median 86.6% Low 66.5% Company XYZ 92.0%

168 Quality Metrics: Incident First Visit Resolution Rate Definition Incident First Visit Resolution Rate is the percentage of incidents that are resolved on the first visit to the customer. Incidents that require a second visit, or are otherwise unresolved on the first visit for any reason, do not qualify for Incident First Visit Resolution. Why it's Important Incident First Visit Resolution Rate is one of the biggest drivers of Customer Satisfaction. A high Incident First Visit Resolution Rate is almost always associated with high levels of Customer Satisfaction. Desktop Support groups that emphasize training and have good technology tools generally enjoy a higher-than-average Incident First Visit Resolution Rate. Key Correlations Incident First Visit Resolution Rate is strongly correlated with the following metrics: Customer Satisfaction, New Technician Training Hours, Annual Technician Training Hours, Average Incident Work Time

169 Quality Metrics: Incident First Visit Resolution Rate (Sample Report Only. Data is not accurate.) Key Statistics: High 95.1% Average % Median 85.3% Low 67.0% Company XYZ N/A

170 Quality Metrics: % Resolved Level 1 Capable Definition % Resolved Level 1 Capable is the percentage of tickets resolved by Desktop Support that could have been resolved by the Level 1 Service Desk. This metric is generally tracked by sampling desktop tickets after the fact to determine the percentage that could have been resolved at Level 1, or by having the Desktop Support technician check a box when closing a ticket to indicate that it could have been resolved at Level 1. Why it's Important Tickets resolved by Desktop Support that could have been resolved by the Level 1 Service Desk represent defects. Since the cost of resolution is typically much higher at Desktop Support than it is for Level 1 support, every ticket that is unnecessarily escalated from Level 1 to Desktop Support incurs unnecessary costs. To minimize TCO (Total Cost of Ownership) for end-user support, the % Resolved Level 1 Capable should be as low as possible. Key Correlations % Resolved Level 1 Capable is strongly correlated with the following metrics: Average Incident Work Time, Tickets per Seat per Month, Incidents per Seat per Month
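When this metric is tracked by sampling closed tickets, it is worth attaching a margin of error to the estimate. A rough sketch using the normal approximation for a sample proportion (the sample counts below are hypothetical):

```python
import math

def level1_capable_estimate(flagged: int, sampled: int, z: float = 1.96):
    """Point estimate and approximate 95% margin of error for
    % Resolved Level 1 Capable, from a random sample of closed tickets."""
    p = flagged / sampled
    margin = z * math.sqrt(p * (1 - p) / sampled)
    return p, margin

# Hypothetical sample: 38 of 200 desktop tickets judged Level 1 capable
p, margin = level1_capable_estimate(38, 200)
print(f"{p:.1%} +/- {margin:.1%}")  # 19.0% +/- 5.4%
```

A larger sample tightens the margin, so the sample size should be chosen with the desired precision in mind.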

171 Quality Metrics: % Resolved Level 1 Capable (Sample Report Only. Data is not accurate.) Key Statistics: High 29.0% Average % Median 19.1% Low 6.9% Company XYZ N/A

172 Technician Metrics Company XYZ

173 Technician Metrics: Annual Technician Turnover Definition Annual Technician Turnover is the percentage of Technicians who leave Desktop Support, for any reason (voluntarily or involuntarily), on an annual basis. Why it's Important Technician turnover is costly. Each time a Technician leaves Desktop Support, a new Technician needs to be hired to replace the outgoing Technician, which results in costly recruiting, hiring, and training expenses. Additionally, it typically takes several weeks or even months before a Technician is fully productive, so there is lost productivity associated with Technician turnover as well. High Technician turnover is generally associated with low Technician morale in a Desktop Support group. Key Correlations Annual Technician Turnover is strongly correlated with the following metrics: Daily Technician Absenteeism, Annual Technician Training Hours, Customer Satisfaction, Net First Contact Resolution Rate, Cost per Inbound Contact, Technician Job Satisfaction

174 Technician Metrics: Annual Technician Turnover (Sample Report Only. Data is not accurate.) Key Statistics: High 62.3% Average % Median 28.5% Low 9.6% Company XYZ 62.3%

175 Technician Metrics: Daily Technician Absenteeism Definition Daily Technician Absenteeism is the average percentage of Technicians with an unexcused absence on any given day. It is calculated by dividing the number of absent Technicians by the total number of Technicians scheduled to be at work. Why it's Important High Technician Absenteeism is problematic because it makes it difficult for a Desktop Support group to schedule resources efficiently. High absenteeism can severely impact a Desktop Support group's operating performance, and increase the likelihood that service level targets will be missed. A Desktop Support group's ASA and Call Abandonment Rate typically suffer when absenteeism is high. Also, chronically high absenteeism is often a sign of low Technician morale. Key Correlations Daily Technician Absenteeism is strongly correlated with the following metrics: Annual Technician Turnover, Technician Job Satisfaction, Technician Utilization, Cost per Inbound Contact, Contacts per Technician per Month

176 Technician Metrics: Daily Technician Absenteeism (Sample Report Only. Data is not accurate.) Key Statistics: High 11.0% Average % Median 3.9% Low 0.6% Company XYZ 11.0%

177 Technician Metrics: New Technician Training Hours Definition The name of this metric is somewhat self-explanatory. New Technician Training Hours is the number of training hours (including classroom, CBT, self-study, shadowing, being coached, and OJT) that a new Technician receives before he or she is allowed to handle customer contacts independently. Why it's Important New Technician Training Hours are strongly correlated with Call Quality and Net First Contact Resolution Rate, particularly during a Technician's first few months on the job. The more training a new Technician receives, the higher the Call Quality and Net FCR will typically be. This, in turn, has a positive effect on many other performance metrics, including Customer Satisfaction. Perhaps most importantly, training levels have a strong impact on Technician morale: Technicians who receive more training typically have higher levels of job satisfaction. Key Correlations New Technician Training Hours are strongly correlated with the following metrics: Call Quality, Net First Contact Resolution Rate, Customer Satisfaction, Inbound Contact Handle Time, Technician Job Satisfaction

178 Technician Metrics: New Technician Training Hours (Sample Report Only. Data is not accurate.) Key Statistics: High 146 Average Median 61 Low 0 Company XYZ

179 Technician Metrics: Annual Technician Training Hours Definition Annual Technician Training Hours is the average number of training hours (including classroom, CBT, self-study, shadowing, etc.) that a Technician receives on an annual basis. This number includes any training hours that a Technician receives beyond the Technician's initial (new Technician) training, but it does not include routine team meetings, shift handoffs, or other activities that do not involve formal training. Why it's Important Annual Technician Training Hours are strongly correlated with Call Quality, Customer Satisfaction, and Net First Contact Resolution Rate. Perhaps most importantly, training levels have a strong impact on Technician morale: Technicians who train more typically have higher levels of job satisfaction. Key Correlations Annual Technician Training Hours are strongly correlated with the following metrics: Call Quality, Net First Contact Resolution Rate, Customer Satisfaction, Inbound Contact Handle Time, Technician Job Satisfaction

180 Technician Metrics: Annual Technician Training Hours (Sample Report Only. Data is not accurate.) Key Statistics: High 56 Average Median 3 Low 0 Company XYZ N/A

181 Technician Metrics: Technician Tenure (months) Definition Technician Tenure is the average number of months that Technicians have worked in a particular Desktop Support group. Why it's Important Technician Tenure is a measure of Technician experience. Virtually every metric related to Desktop Support cost and quality is impacted by the level of experience the Technicians have. Key Correlations Technician Tenure is strongly correlated with the following metrics: Cost per Inbound Contact, Call Quality, Customer Satisfaction, Annual Technician Turnover, Training Hours, Coaching Hours, Inbound Contact Handle Time, Net First Contact Resolution Rate, Technician Job Satisfaction

182 Technician Metrics: Technician Tenure (months) (Sample Report Only. Data is not accurate.) Key Statistics: High 88.0 Average Median 37.4 Low 14.6 Company XYZ

183 Technician Metrics: Technician Job Satisfaction Definition Technician Job Satisfaction is the percentage of Technicians in a Desktop Support group who are either satisfied or very satisfied with their jobs. Why it's Important Technician Job Satisfaction is a proxy for Technician morale. And morale, while difficult to measure, is a bellwether metric that affects almost every other metric in Desktop Support. High-performing Desktop Support groups almost always have high levels of Technician Job Satisfaction. Perhaps more importantly, this metric can be controlled and improved through training, coaching, and career pathing. Key Correlations Technician Job Satisfaction is strongly correlated with the following metrics: Annual Technician Turnover, Customer Satisfaction, Daily Technician Absenteeism, Net First Contact Resolution Rate, Technician Training Hours, Inbound Contact Handle Time, Technician Coaching Hours, Cost per Inbound Contact

184 Technician Metrics: Technician Job Satisfaction (Sample Report Only. Data is not accurate.) Key Statistics: High 85.4% Average % Median 78.1% Low 69.9% Company XYZ 78.6%

185 Ticket Handling Metrics Company XYZ

186 Ticket Handling Metrics: Average Incident Work Time Definition Average Incident Work Time is the average time that a technician spends to resolve an incident. This does not include travel time to and from the customer, or time between visits if multiple visits to the user's desktop are required to resolve an incident. It includes only the time that a technician spends actually working on an incident. Why it's Important Incident Work Time is one of the basic units of work in Desktop Support. Average Incident Work Time, therefore, represents the amount of labor required to resolve one incident. Key Correlations Average Incident Work Time is strongly correlated with the following metrics: Cost per Incident, Incidents per Technician per Month, Incident First Visit Resolution Rate

187 Ticket Handling Metrics: Average Incident Work Time (minutes) (Sample Report Only. Data is not accurate.) Key Statistics: High 40.6 Average Median 19.2 Low 11.1 Company XYZ

188 Ticket Handling Metrics: Average Service Request Work Time Definition Average Service Request Work Time is the average time that a technician spends to fulfill a service request. This does not include travel time to and from the customer, or time between visits if multiple visits are required to fulfill a service request. It includes only the time that a technician spends actually fulfilling a service request. Why it's Important Service Request Work Time is one of the basic units of work in Desktop Support. Average Service Request Work Time, therefore, represents the amount of labor required to fulfill one service request. Key Correlations Average Service Request Work Time is strongly correlated with the following metrics: Cost per Service Request, Service Requests per Technician per Month

189 Ticket Handling Metrics: Average Service Request Work Time (minutes) (Sample Report Only. Data is not accurate.) Key Statistics: High Average Median 45.0 Low 21.0 Company XYZ

190 Ticket Handling Metrics: Estimated Travel Time per Ticket Definition Average Travel Time per Ticket is the average round-trip travel time to get to and from the site of a user or device being serviced. In a high-density user environment (e.g., a high-rise office building), the Travel Time per Ticket will typically be less than 20 minutes. By contrast, in a more distributed user environment (e.g., field or campus locations), the Travel Time per Ticket will be correspondingly longer. Why it's Important Unlike the Level 1 Service Desk, where support is provided remotely, Desktop Support, by definition, requires onsite support. Getting to and from the site of a ticket can be very time consuming, and will influence the number of tickets a technician can handle in a day or a month. This, in turn, influences the level of staffing required in the Desktop Support organization. Key Correlations Average Travel Time per Ticket is strongly correlated with the following metrics: Cost per Ticket, Number of Tickets per Technician per Month

191 Ticket Handling Metrics: Estimated Travel Time per Ticket (minutes) (Sample Report Only. Data is not accurate.) Key Statistics: High 99.0 Average Median 30.0 Low 10.0 Company XYZ

192 Workload Metrics Company XYZ

193 Workload Metrics: Tickets per Seat per Month Definition Tickets per Seat per Month is a measure of the volume of Desktop Support work generated by a given user population. The number of Tickets per Seat per Month can vary dramatically from one organization to another, driven by factors such as the age of the devices being supported, the number of mobile devices, the location of users (office, home, field), the number of laptop computers, and myriad other factors. Why it's Important The number of Tickets per Seat per Month drives the workload, and hence the staffing, for a Desktop Support group. Desktop Support staffing decisions should be based on this metric, rather than on the number of users being supported. Key Correlations Tickets per Seat per Month is strongly correlated with the following metrics: Incidents per Seat per Month, Service Requests per Seat per Month
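Because staffing should follow ticket volume rather than raw user counts, this metric translates directly into an FTE estimate. A minimal capacity sketch (all figures below are illustrative, not benchmark values):

```python
def required_technician_fte(seats: int, tickets_per_seat_per_month: float,
                            tickets_per_tech_per_month: float) -> float:
    """FTE technicians implied by per-seat ticket volume (illustrative)."""
    monthly_tickets = seats * tickets_per_seat_per_month
    return monthly_tickets / tickets_per_tech_per_month

# Illustrative: 5,000 seats generating 0.46 tickets/seat/month,
# with each technician handling 75 tickets/month
print(round(required_technician_fte(5000, 0.46, 75), 1))  # 30.7
```

The same arithmetic, run with an organization's own per-seat and per-technician rates, shows why two groups supporting the same number of users can need very different headcounts.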

194 Workload Metrics: Tickets per Seat per Month (Sample Report Only. Data is not accurate.) Key Statistics: High 1.01 Average Median 0.46 Low 0.19 Company XYZ

195 Workload Metrics: Incidents per Seat per Month Definition Incidents per Seat per Month is a key measure of the volume of Desktop Support work generated by a given user population. The number of Incidents per Seat per Month can vary dramatically from one organization to another, driven by factors such as the age of the devices being supported, the number of mobile devices, the location of users (office, home, field), the number of laptop computers, and myriad other factors. Why it's Important The number of Incidents per Seat per Month is a major workload driver, and will therefore have a strong impact on staffing decisions for Desktop Support. Key Correlations Incidents per Seat per Month is strongly correlated with the following metric: Tickets per Seat per Month

196 Workload Metrics: Incidents per Seat per Month (Sample Report Only. Data is not accurate.) Key Statistics: High 0.85 Average Median 0.27 Low 0.05 Company XYZ

197 Workload Metrics: Service Requests per Seat per Month Definition Service Requests per Seat per Month is a key measure of the volume of Desktop Support work generated by a given user population. The number of Service Requests per Seat per Month can vary dramatically from one organization to another, driven by factors such as the number of move/add/change requests, the age of the devices being supported, the location of users (office, home, field), the frequency of device refreshes, and myriad other factors. Why it's Important The number of Service Requests per Seat per Month is a major workload driver, and will therefore have a strong impact on staffing decisions for Desktop Support. Key Correlations Service Requests per Seat per Month is strongly correlated with the following metric: Tickets per Seat per Month

198 Workload Metrics: Service Requests per Seat per Month (Sample Report Only. Data is not accurate.) Key Statistics: High 0.28 Average Median 0.11 Low 0.04 Company XYZ

199 Workload Metrics: Incidents as a % of Total Ticket Volume Definition Incidents as a % of Total Ticket Volume is a fairly self-explanatory metric. It is an indicator of the mix of work (Incidents vs. Service Requests) handled by a Desktop Support group. Most Desktop Support organizations receive more incidents than service requests. Since incidents are generally less costly to resolve than service requests, the higher the Incidents as a % of Total Ticket Volume, the lower the Cost per Ticket will be. Why it's Important Incidents are generally unplanned work (e.g., device break/fix), while the majority of service requests are planned work (e.g., move/add/change). Incidents as a % of Total Ticket Volume is therefore a measure of the percentage of Desktop Support work that is unplanned. Key Correlations Incidents as a % of Total Ticket Volume is strongly correlated with the following metrics: Cost per Ticket, Tickets per Technician per Month
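The cost effect of the ticket mix described above can be illustrated with a mix-weighted average (the dollar figures below are hypothetical, not benchmark values):

```python
def blended_cost_per_ticket(incident_share: float,
                            cost_per_incident: float,
                            cost_per_service_request: float) -> float:
    """Mix-weighted average cost per ticket, given the incident share
    of total ticket volume (costs are hypothetical inputs)."""
    return (incident_share * cost_per_incident
            + (1 - incident_share) * cost_per_service_request)

# Hypothetical: 75% incidents at $62 each, 25% service requests at $104 each
print(blended_cost_per_ticket(0.75, 62.0, 104.0))  # 72.5
```

Shifting the mix toward incidents pulls the blended Cost per Ticket down, which is the relationship the definition above describes.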

200 Workload Metrics: Incidents as a % of Total Ticket Volume (Sample Report Only. Data is not accurate.) Key Statistics: High 93.4% Average % Median 75.4% Low 13.8% Company XYZ 13.8%

201 About MetricNet: Your Benchmarking Partner

202 Your Project Manager: Jeff Rumburg Jeff Rumburg is a co-founder and Managing Partner at MetricNet, LLC, where he is responsible for global strategy, product development, and financial operations. A leading expert in benchmarking and re-engineering, Mr. Rumburg authored a best-selling book on benchmarking, and has been retained as a benchmarking expert by such well-known companies as American Express, Hewlett-Packard, and GM. Prior to co-founding MetricNet, Mr. Rumburg was president and founder of The Verity Group, an international management consulting firm specializing in IT benchmarking. While at Verity, Mr. Rumburg launched a number of syndicated benchmarking services that provided low-cost benchmarks to more than 1,000 corporations worldwide. Mr. Rumburg has also held a number of executive positions at META Group and Gartner, Inc. As a vice president at Gartner, Mr. Rumburg led a project team that re-engineered Gartner's global benchmarking product suite. And as a vice president at META Group, Mr. Rumburg focused on business and product development for IT benchmarking. Mr. Rumburg's education includes an M.B.A. from the Harvard Business School, an M.S. magna cum laude in Operations Research from Stanford University, and a B.S. magna cum laude in Mechanical Engineering. He is the author of A Hands-On Guide to Competitive Benchmarking: The Path to Continuous Quality and Productivity Improvement, and has taught graduate-level engineering and business courses. Mr. Rumburg serves on the Strategic Advisory Board for HDI, formerly the Help Desk Institute.

203 Benchmarking is MetricNet's Core Business: Information Technology (Desktop Support, Service Desk, Field Services, Technical Support); Call Centers (Customer Service, Telemarketing/Telesales, Collections); Telecom (Cost Benchmarking); Satisfaction (Customer Satisfaction, Employee Satisfaction)

204 25 Years of Desktop Support Benchmarking Data: More than 1,100 Desktop Support Benchmarks; Global Database; 30 Key Performance Indicators; Nearly 60 Industry Best Practices

205 Meet a Sampling of Our Clients MetricNet conducts benchmarking for Desktop Support groups worldwide, and across virtually every industry sector.


The True Cost of Desktop Support: Understanding the Critical Cost Drivers The True Cost of Desktop Support: Understanding the Critical Cost Drivers by Jeff Rumburg Most companies believe that the cost of desktop support consists entirely of the personnel, technology, and facilities

More information

The Service Desk Survival Guide 2005 Peter McGarahan

The Service Desk Survival Guide 2005 Peter McGarahan The Service Desk Survival Guide 2005 Peter McGarahan The Outlook For 2005 Go After Project Funding Globalization/Consolidation Outsourcing/Offshoring Career Development in/outside of IT Cost Containment

More information

ITSM Process Description

ITSM Process Description ITSM Process Description Office of Information Technology Incident Management 1 Table of Contents Table of Contents 1. Introduction 2. Incident Management Goals, Objectives, CSFs and KPIs 3. Incident Management

More information

Metrics That Matter. Presented by: Pete McGarahan

Metrics That Matter. Presented by: Pete McGarahan Metrics That Matter Presented by: Pete McGarahan WHY? Consistent measurement/tracking provides an important feedback loop for continuous improvement Defining/reporting consistent metrics forces the Help

More information

Metric of the Month: First Contact Resolution

Metric of the Month: First Contact Resolution Metric of the Month: First Contact Resolution By Jeff Rumburg Every month, the Industry Insider will highlight one key performance indicator (KPI) for the service desk or desktop support. We will define

More information

HOW TO OPTIMIZE INFRASTRUCTURE SUPPORT SERVICES. BETSOL The Right Solution,Right Now

HOW TO OPTIMIZE INFRASTRUCTURE SUPPORT SERVICES. BETSOL The Right Solution,Right Now BETSOL The Right Solution,Right Now HOW TO OPTIMIZE INFRASTRUCTURE SUPPORT SERVICES A look at the people and processes of best-in-class, cost-effective organizations IN THIS WHITE PAPER, YOU LL FIND LEADING

More information

Ongoing Help Desk Management Plan

Ongoing Help Desk Management Plan Ongoing Help Desk Management Plan HELP DESK IMPLEMENTATION /MANAGEMENT The Vendor shall provide in its Response to DIR a Help Desk Implementation Plan which shall include, but not be limited to: a. Customer

More information

IT Service Desk Unit Opportunities for Improving Service and Cost-Effectiveness

IT Service Desk Unit Opportunities for Improving Service and Cost-Effectiveness AUDITOR GENERAL S REPORT ACTION REQUIRED IT Service Desk Unit Opportunities for Improving Service and Cost-Effectiveness Date: September 18, 2013 To: From: Wards: Audit Committee Auditor General All Reference

More information

Improving. Summary. gathered from. research, and. Burnout of. Whitepaper

Improving. Summary. gathered from. research, and. Burnout of. Whitepaper Whitepaper Improving Productivity and Uptime with a Tier 1 NOC Summary This paper s in depth analysis of IT support activities shows the value of segmenting and delegatingg activities based on skill level

More information

ENTERPRISE SERVICE DESK (ESD) SERVICE DELIVERY GUIDE

ENTERPRISE SERVICE DESK (ESD) SERVICE DELIVERY GUIDE National Aeronautics and Space Administration NASA Shared Services Center Stennis Space Center, MS 39529-6000 www.nssc.nasa.gov Enterprise Service Desk Service Delivery Guide NSSDG-2410-0001 Basic Version

More information

Remote Support: Key Metrics to drive Improvement in your Center

Remote Support: Key Metrics to drive Improvement in your Center Remote Support: Key Metrics to drive Improvement in your Center Jeremy Curley, Director of Business Solutions, Bomgar Mike Sell, Director of Strategic Alliances, Bomgar 0 Why Are Metrics Not Improving?

More information

Session 410 Improving the Customer Experience through Desktop Support

Session 410 Improving the Customer Experience through Desktop Support Session 410 Improving the Customer Experience through Desktop Support Rae Ann Bruno rbruno@businesssolutionstraining.com Desktop Support Team Desktop support is the literal face of I.T. Customer expectations

More information

Helping Midsize Businesses Grow Through HR Technology

Helping Midsize Businesses Grow Through HR Technology Helping Midsize Businesses Grow Through HR Technology As a business grows, the goal of streamlining operations is increasingly important. By maximizing efficiencies across the board, employee by employee,

More information

Optimizing Service Delivery Through Analytics and ITSM. Rich Jaso VP, Unisys Global Managed Services

Optimizing Service Delivery Through Analytics and ITSM. Rich Jaso VP, Unisys Global Managed Services Optimizing Service Delivery Through Analytics and ITSM Rich Jaso VP, Unisys Global Managed Services Agenda Challenges Facing the Market in Optimizing IT Support ITSM Analytics Driving Operational Change

More information

Cisco Unified Communications and Collaboration technology is changing the way we go about the business of the University.

Cisco Unified Communications and Collaboration technology is changing the way we go about the business of the University. Data Sheet Cisco Optimization s Optimize Your Solution using Cisco Expertise and Leading Practices Optimizing Your Business Architecture Today, enabling business innovation and agility is about being able

More information

Metric of the Month: The Service Desk Balanced Scorecard

Metric of the Month: The Service Desk Balanced Scorecard INDUSTRY INSIDER 1 Metric of the Month: The Service Desk Balanced Scorecard By Jeff Rumburg Every month, in the Industry Insider, I highlight one key performance indicator (KPI) for the service desk or

More information

Process Description Incident/Request. HUIT Process Description v6.docx February 12, 2013 Version 6

Process Description Incident/Request. HUIT Process Description v6.docx February 12, 2013 Version 6 Process Description Incident/Request HUIT Process Description v6.docx February 12, 2013 Version 6 Document Change Control Version # Date of Issue Author(s) Brief Description 1.0 1/21/2013 J.Worthington

More information

The Seven Most Important Performance Indicators for the Service Desk

The Seven Most Important Performance Indicators for the Service Desk How Does YOUR Service Desk Stack Up? The Seven Most Important Performance Indicators for the Service Desk By Jeff Rumburg and Eric Zbikowski Managing Partners at: Introduction Today s Service Desk technologies

More information

ITSM Maturity Model. 1- Ad Hoc 2 - Repeatable 3 - Defined 4 - Managed 5 - Optimizing No standardized incident management process exists

ITSM Maturity Model. 1- Ad Hoc 2 - Repeatable 3 - Defined 4 - Managed 5 - Optimizing No standardized incident management process exists Incident ITSM Maturity Model 1- Ad Hoc 2 - Repeatable 3 - Defined 4 - Managed 5 - Optimizing No standardized incident process exists Incident policies governing incident Incident urgency, impact and priority

More information

IT Support Supervisor #02982 City of Virginia Beach Job Description Date of Last Revision: 12-14-2015

IT Support Supervisor #02982 City of Virginia Beach Job Description Date of Last Revision: 12-14-2015 City of Virginia Beach Job Description Date of Last Revision: 12-14-2015 FLSA Status: Exempt Pay Plan: Administrative Grade: 12 City of Virginia Beach Organizational Mission & Values The City of Virginia

More information

COUNTY OF ORANGE, CA Schedule 2D Service Desk Services SOW SCHEDULE 2D SERVICE DESK SERVICES SOW. for. Date TBD

COUNTY OF ORANGE, CA Schedule 2D Service Desk Services SOW SCHEDULE 2D SERVICE DESK SERVICES SOW. for. Date TBD SCHEDULE 2D SERVICE DESK SERVICES SOW for COUNTY OF ORANGE, CA Date TBD Schedule 2D Service Desk Services SOW Table of Contents 1.0 Service Desk Services Overview and Service Objectives... 1 1.1 Service

More information

Small Business. Leveraging SBA IT resources to support America s small businesses

Small Business. Leveraging SBA IT resources to support America s small businesses Small Business Administration Information Technology Strategic Plan ( ITSP) 2012-2016 Leveraging SBA IT resources to support America s small businesses Message from the Chief Information Officer The Small

More information

Benchmark Against Best Practice Service Delivery Metrics

Benchmark Against Best Practice Service Delivery Metrics Benchmark Against Best Practice Service Delivery Metrics Featuring: Julie Giera, Forrester Pierre Champigneulle, BearingPoint Host: Jason Schroedl, newscale Internal Service Delivery The average company

More information

Managing a 24x7x365 Support Center and Network Engineering for a Government Agency QUICK FACTS

Managing a 24x7x365 Support Center and Network Engineering for a Government Agency QUICK FACTS [ Government, Managed Services Offering, Network Infrastructure Services Support Services ] TEKSYSTEMS GLOBAL SERVICES CUSTOMER SUCCESS STORIES Client Profile Industry: Government Employees: More than

More information

TECHNOLOGY IT ROADMAP SERVICE CORE

TECHNOLOGY IT ROADMAP SERVICE CORE BE FREE BE FREE OF TECHNOLOGY IT ROADMAP SERVICE CORE TALK TO OUR EXPERTS 1.877.222.8615 www.bestit.com WHY GET AN IT ROADMAP? Enterprise competitive performance is a critical differentiator in today s

More information

APPENDIX 3 TO SCHEDULE 8.1

APPENDIX 3 TO SCHEDULE 8.1 APPENDIX 3 TO SCHEDULE 8.1 TO THE COMPREHENSIVE INFRASTRUCTURE AGREEMENT 1.0 Transition Services and Affected Employees The highest priority in the design of Northrop Grumman s transition plan is to transfer

More information

The Evolving Role of Process Automation and the Customer Service Experience

The Evolving Role of Process Automation and the Customer Service Experience The Evolving Role of Process Automation and the Customer Service Experience Kyle Lyons Managing Director Ponvia Technology Gina Clarkin Product Manager Interactive Intelligence Table of Contents Executive

More information

APPENDIX 4 TO SCHEDULE 3.3

APPENDIX 4 TO SCHEDULE 3.3 EHIBIT J to Amendment No. 60 - APPENDI 4 TO SCHEDULE 3.3 TO THE COMPREHENSIVE INFRASTRUCTURE AGREEMENT APPENDI 4 TO SCHEDULE 3.3 TO THE COMPREHENSIVE INFRASTRUCTURE AGREEMENT EHIBIT J to Amendment No.

More information

Introduction to ITIL: A Framework for IT Service Management

Introduction to ITIL: A Framework for IT Service Management Introduction to ITIL: A Framework for IT Service Management D O N N A J A C O B S, M B A I T S E N I O R D I R E C T O R C O M P U T E R O P E R A T I O N S I N F O R M A T I O N S Y S T E M S A N D C

More information

Global Service Desk. Superior end-user support for the Adaptive Enterprise. HP Services

Global Service Desk. Superior end-user support for the Adaptive Enterprise. HP Services Global Service Desk Superior end-user support for the Adaptive Enterprise HP Services The HP Global Service Desk is a cost-effective way to reduce the complexity of your IT organization while delivering

More information

The Importance of First Contact Resolution. 24/7 Service Desk Immediate Support. Long-term Value. DSScorp.com

The Importance of First Contact Resolution. 24/7 Service Desk Immediate Support. Long-term Value. DSScorp.com 24/7 Service Desk Immediate Support. Long-term Value. DSScorp.com Contents 3 5 7 9 11 13 The Power of Partnership Defining FCR Improving Productivity Decreasing Cost Getting Started About the Author /

More information

ATTACHMENT V2. Transnet

ATTACHMENT V2. Transnet ATTACHMENT V2 HELP AND SERVICE DESK SERVICES TOWER For Transnet Help and Service Desk Service Tower Table of Contents 1.0 Help and Service Desk Services Overview and Objectives... 1 1.1 Services Overview...

More information

Enhancing Business Performance Through Innovative Technology Solutions

Enhancing Business Performance Through Innovative Technology Solutions Enhancing Business Performance Through Innovative Technology Solutions Contact Center = Customer Experience FIELD SERVICE Customer Service BACK OFFICE CONTACT CENTER BRANCH OFFICE Help Desk HR Finance

More information

Customer Contact Center Benchmarking Results Executive Summary

Customer Contact Center Benchmarking Results Executive Summary Customer Contact Center Benchmarking Results Executive Summary XYC Company SAP Value Engineering Agenda. Executive Summary. Company Baseline, Metrics and Performance Drivers. Best Practices 4. Participant

More information

ITS HELP DESK SUPPORT. Pace University and ITS Help Desk

ITS HELP DESK SUPPORT. Pace University and ITS Help Desk ITS HELP DESK SUPPORT Pace University and ITS Help Desk Summary This document covers the help desk support services offered from the ITS Help Desk for Pace University members. User Services ITS Help Desk

More information

Transform Your Service Desk by Using Award-winning Strategies

Transform Your Service Desk by Using Award-winning Strategies Transform Your Service Desk by Using Award-winning Strategies Charlotte HDI Chapter Meeting March16, 2012 Who is Technisource? National Technology Talent and Services Provider Leading service desk provider

More information

Development of a Balanced Scorecard for Service Desk KPIs

Development of a Balanced Scorecard for Service Desk KPIs Development of a Balanced Scorecard for Service Desk KPIs Presented by: Robert Higgins and Jason Reid Position: IT Service Desk and Telecommunications Manager and Team Leader Balanced IT Scorecard History

More information

Information Technology Engineers Examination. Information Technology Service Manager Examination. (Level 4) Syllabus

Information Technology Engineers Examination. Information Technology Service Manager Examination. (Level 4) Syllabus Information Technology Engineers Examination Information Technology Service Manager Examination (Level 4) Syllabus Details of Knowledge and Skills Required for the Information Technology Engineers Examination

More information

Service Level Agreement and Management By: Harris Kern s Enterprise Computing Institute

Service Level Agreement and Management By: Harris Kern s Enterprise Computing Institute Service Level Agreement and Management By: Harris Kern s Enterprise Computing Institute Service Level Management Service Level Management deals with how user service requirements are understood and managed.

More information

ADDENDUM 5 TO APPENDIX 4 TO SCHEDULE 3.3

ADDENDUM 5 TO APPENDIX 4 TO SCHEDULE 3.3 ADDENDUM 5 TO APPENDIX 4 TO SCHEDULE 3.3 TO THE Statement of Technical Approach for Help Desk Services Northrop Grumman s help desk solution will put in place the people, processes and tools to deliver

More information

Drive Down IT Operations Cost with Multi-Level Automation

Drive Down IT Operations Cost with Multi-Level Automation White White Paper Paper Drive Down IT Operations Cost with Multi-Level Automation Overview Reducing IT infrastructure and operations (I+O) budgets is as much on the mind of CIOs today as it s ever been.

More information

Scomis Service Report Spring 2015 (1 January 2015 to 12 April 2015)

Scomis Service Report Spring 2015 (1 January 2015 to 12 April 2015) Scomis Service Report (1 January to 12 April ) Service Desk The Scomis Service Desk provides front-line telephone support for over 600 establishments. During the Service Desk received a total of 9,320

More information

Contact Center Technology Monitoring

Contact Center Technology Monitoring tech line / oct 2012 Contact Center Technology Monitoring Monitoring allows companies to detect outages and issues for quick resolution, and enables effective planning for prevention and optimization going

More information

Cisco Network Optimization Service

Cisco Network Optimization Service Service Data Sheet Cisco Network Optimization Service Optimize your network for borderless business evolution and innovation using Cisco expertise and leading practices. New Expanded Smart Analytics Offerings

More information

Yale University Incident Management Process Guide

Yale University Incident Management Process Guide Yale University Management Process Guide Yale University Management Process 1 of 17 Introduction Purpose This document will serve as the official process of Management for Yale University. This document

More information

Let Your Call Center Customer Service Representatives be a Judge!

Let Your Call Center Customer Service Representatives be a Judge! Let Your Call Center Customer Service Representatives be a Judge! Written and Researched By Mike Desmarais, President of SQM Group Page 1 INTRODUCTION One of the best places to start improving both customer

More information

Creating Service Desk Metrics

Creating Service Desk Metrics Creating Service Desk Metrics Table of Contents 1 ITIL, PINK ELEPHANT AND ZENDESK... 3 2 IMPORTANCE OF MONITORING, MEASURING AND REPORTING... 3 2.1 BALANCED MANAGEMENT INFORMATION CATEGORIES... 3 2.2 CSFS,

More information

SESSION 709 Wednesday, November 4, 9:00am - 10:00am Track: Strategic View

SESSION 709 Wednesday, November 4, 9:00am - 10:00am Track: Strategic View SESSION 709 Wednesday, November 4, 9:00am - 10:00am Track: Strategic View The Business of IT Provisioning Bill Irvine Transformation Strategist, Accelerate Innovation, VMware billirvine@comcast.net Session

More information

City of Hapeville, GA VC3Advantage Work Order

City of Hapeville, GA VC3Advantage Work Order City of Hapeville, GA VC3Advantage Work Order ServiceAdvantage Work Order No. [ VC3INC-1097-62019 ] under the Master Services Agreement, dated. July 1, 2015 Atlanta Columbia Raleigh 1301 Gervais Street,

More information

Enabling Chat -- Key Success Factors in Chat Implementation

Enabling Chat -- Key Success Factors in Chat Implementation Enabling Chat -- Key Success Factors in Chat Implementation 0 WHY SWITCH TO CHAT SUPPORT? Benefits of Chat Support Additional method of support for customers Concurrent sessions improve productivity Reduced

More information

HP Service Manager software

HP Service Manager software HP Service Manager software The HP next generation IT Service Management solution is the industry leading consolidated IT service desk. Brochure HP Service Manager: Setting the standard for IT Service

More information

The Journey to World-Class

The Journey to World-Class The Journey to World-Class Achieving World-Class Performance Presented to: Presented by: The Hackett Group 13 July 2010 Contents Project Background Network Rail HR Baseline Executive Summary Process Level

More information

Department of Information Technology

Department of Information Technology Lines of Business LOB #132: END USER SERVICES Department of Information Technology Purpose The End User Services LOB in the Department of Information Technology is responsible for providing direct technical

More information

A Screen Share is Worth a Thousand Chats Donald Hasson Director of ITSM Product Management

A Screen Share is Worth a Thousand Chats Donald Hasson Director of ITSM Product Management A Screen Share is Worth a Thousand Chats Donald Hasson Director of ITSM Product Management 2016 BOMGAR CORPORATION ALL RIGHTS RESERVED WORLDWIDE 1 I need help. The little button disappeared. The red one

More information

The Importance of Information Delivery in IT Operations

The Importance of Information Delivery in IT Operations The Importance of Information Delivery in IT Operations David Williams Notes accompany this presentation. Please select Notes Page view. These materials can be reproduced only with written approval from

More information

HELP DESK MANAGEMENT PLAN

HELP DESK MANAGEMENT PLAN AT&T Help Desk Overview: DIR Platinum Customer Status AT&T is committed to continuous process improvement to meet DIR requirements and expectations, and to improve the end user experience. Today s process

More information

IT-P-001 IT Helpdesk IT HELPDESK POLICY & PROCEDURE IT-P-001

IT-P-001 IT Helpdesk IT HELPDESK POLICY & PROCEDURE IT-P-001 IT HELPDESK POLICY & PROCEDURE IT-P-001 Date:25 September, 2013 Amemberof, LAUREATE I'.;TlRNAT'Oi'lAl. UWII[RSITIB Stamford International University Policy Policy Statement This Policy has been written

More information

ITIL by Test-king. Exam code: ITIL-F. Exam name: ITIL Foundation. Version 15.0

ITIL by Test-king. Exam code: ITIL-F. Exam name: ITIL Foundation. Version 15.0 ITIL by Test-king Number: ITIL-F Passing Score: 800 Time Limit: 120 min File Version: 15.0 Sections 1. Service Management as a practice 2. The Service Lifecycle 3. Generic concepts and definitions 4. Key

More information

1.1 Please indicate below if any aspect of the service is legally mandated by any of the following and provide the relevant reference.

1.1 Please indicate below if any aspect of the service is legally mandated by any of the following and provide the relevant reference. Response ID:172; 100888516 Data 1. Support Services Report Template Report Info Name of the person completing this report : Mercedes Alvarez-Arancedo Title of the person completing this report : CIS Director

More information

Administration A. Superintendent. Technology Services Proposal. Board of Education Dr. Bruce Law Superintendent of Schools DATE: July 30 2015

Administration A. Superintendent. Technology Services Proposal. Board of Education Dr. Bruce Law Superintendent of Schools DATE: July 30 2015 Dr. Bruce Law Pamela Bylsma Tammy Prentiss Domenico Maniscalco Bill Eagan Superintendent Assistant Superintendent Assistant Superintendent Chief Human Chief Financial of Schools for Academics for Student

More information

EXIN.Passguide.EX0-001.v2014-10-25.by.SAM.424q. Exam Code: EX0-001. Exam Name: ITIL Foundation (syllabus 2011) Exam

EXIN.Passguide.EX0-001.v2014-10-25.by.SAM.424q. Exam Code: EX0-001. Exam Name: ITIL Foundation (syllabus 2011) Exam EXIN.Passguide.EX0-001.v2014-10-25.by.SAM.424q Number: EX0-001 Passing Score: 800 Time Limit: 120 min File Version: 24.5 http://www.gratisexam.com/ Exam Code: EX0-001 Exam Name: ITIL Foundation (syllabus

More information

IT Service Management by SAP Africa (ITSM) Dirk Smit ALM Engagement Manager

IT Service Management by SAP Africa (ITSM) Dirk Smit ALM Engagement Manager IT Service Management by SAP Africa (ITSM) Dirk Smit ALM Engagement Manager Optimize IT Operations Process Support Business Goals CIO CEO/CFO Reliable Business Support Changes to improve IT services are

More information

Investing in the Help Desk

Investing in the Help Desk W H I T E P A P E R Investing in the Help Desk By Karen Schoemehl Selling the Help Desk as a Strategic Investment The Help Desk of the new millennium is one that will play an integral role in any business

More information

INCIDENT MANAGEMENT & REQUEST FULFILLMENT PROCESSES. Process Owner: Service Desk Manager. Version: v2.0. November 2014 Page 0

INCIDENT MANAGEMENT & REQUEST FULFILLMENT PROCESSES. Process Owner: Service Desk Manager. Version: v2.0. November 2014 Page 0 INCIDENT MANAGEMENT & REQUEST FULFILLMENT PROCESSES Process Owner: Service Desk Manager Version: v2.0 November 2014 Page 0 Document Revision History Revision Description Date Approved by Number V1.0 Initial

More information

Alcatel-Lucent Managed Services Overview

Alcatel-Lucent Managed Services Overview Alcatel-Lucent Managed Services Overview Operators have to continuously evolve their networks and be savvy about the use of technology to meet the exploding bandwidth demand being created by today s end

More information

Service Management Foundation

Service Management Foundation Management Foundation From Best Practice to Implementation 2008 IBM Corporation Agenda Management Foundation: - Fundamental building blocks for successful Management - ITIL v3: What s new in Operations

More information

The Modern Service Desk: How Advanced Integration, Process Automation, and ITIL Support Enable ITSM Solutions That Deliver Business Confidence

The Modern Service Desk: How Advanced Integration, Process Automation, and ITIL Support Enable ITSM Solutions That Deliver Business Confidence How Advanced Integration, Process Automation, and ITIL Support Enable ITSM Solutions That Deliver White Paper: BEST PRACTICES The Modern Service Desk: Contents Introduction............................................................................................

More information

Helpdesk Incident & Request Management Procedure For

Helpdesk Incident & Request Management Procedure For IT HELPDESK PROCEDURE Helpdesk Incident & Request Management Procedure For Author: Helpdesk Owner: IT Head Organisation: Karvy Stock Broking Ltd. Document No: CIT ITHP 01 Version No: 1.1 Release Date:

More information

SESSION 509 Tuesday, November 3, 11:15am - 12:15pm Track: Strategic View

SESSION 509 Tuesday, November 3, 11:15am - 12:15pm Track: Strategic View SESSION 509 Tuesday, November 3, 11:15am - 12:15pm Track: Strategic View Using the Building Blocks of ITIL to Connect Development and Operations Elizabeth Fortunato Deputy Chief, US Federal Courts elizabeth_fortunato@ao.uscourts.gov

More information

The Keys to Successful Service Level Agreements Effectively Meeting Enterprise Demands

The Keys to Successful Service Level Agreements Effectively Meeting Enterprise Demands A P P L I C A T I O N S A WHITE PAPER SERIES SYNTEL, A U.S.-BASED IT SERVICE PROVIDER WITH AN EXTENSIVE GLOBAL DELIVERY SERVICE, SUGGESTS SPECIFIC BEST PRACTICES FOR REDUCING COSTS AND IMPROVING BUSINESS

More information

HP Service Manager software. The HP next-generation IT Service Management solution is the industry-leading consolidated IT service desk.

HP Service Manager software. The HP next-generation IT Service Management solution is the industry-leading consolidated IT service desk. software The HP next-generation IT Service solution is the industry-leading consolidated IT service desk. : setting the standard for IT service management solutions with a robust lifecycle approach to

More information

GoldMine Datasheet Title. Subtitle: Reinvent your Sales, Marketing and Support Proceses. IT Must Innovate to Meet Rising Business Expectations

GoldMine Datasheet Title. Subtitle: Reinvent your Sales, Marketing and Support Proceses. IT Must Innovate to Meet Rising Business Expectations GoldMine Datasheet Title Subtitle: Reinvent your Sales, Marketing and Support Proceses IT Must Innovate to Meet Rising Business Expectations IT Must Innovate to Meet Rising Business Expectations Business

More information

At the beginning of my career as a desktop support manager, I searched everywhere

At the beginning of my career as a desktop support manager, I searched everywhere SEPTEMBER 2013 Desktop Support Metrics Written by Mike Hanson Data analysis by Jenny Rains At the beginning of my career as a desktop support manager, I searched everywhere for examples of industry-standard

More information

GSN Cloud Contact Centre Partnership Datasheet

GSN Cloud Contact Centre Partnership Datasheet GSN Cloud Contact Centre Partnership Datasheet Commercial in Reference: GSN Partnership Datasheet Version: 1.1 Global Speech Networks Pty Ltd Level 8, 636 St Kilda Road Melbourne, Victoria 3004 +61 3 9015

More information

MITEL MICONTACT CENTER AND MIVOICE CALL ACCOUNTING TRAINING OPTIONS WHITEPAPER

MITEL MICONTACT CENTER AND MIVOICE CALL ACCOUNTING TRAINING OPTIONS WHITEPAPER MITEL MICONTACT CENTER AND MIVOICE CALL ACCOUNTING TRAINING OPTIONS WHITEPAPER SEPTEMBER 2013 ABOUT THIS DOCUMENT This document is intended to help Mitel Dealers and Sales representatives understand the

More information

IAOP: Creating Sustainable value in Outsourcing Klaus Koefoed

IAOP: Creating Sustainable value in Outsourcing Klaus Koefoed IAOP: Creating Sustainable value in Outsourcing Klaus Koefoed April 28 th 2010 Copyright 2009 Accenture All Rights Reserved. Accenture, its logo, and High Performance Delivered are trademarks of Accenture.

More information

ITIL V3 Foundation Certification - Sample Exam 1

ITIL V3 Foundation Certification - Sample Exam 1 ITIL V3 Foundation Certification - Sample Exam 1 The new version of ITIL (Information Technology Infrastructure Library) was launched in June 2007. ITIL V3 primarily describes the Service Lifecycle of

More information
