Stanford / MIT Benchmarking IT Help Desk
1 Stanford / MIT Benchmarking IT Help Desk Final Presentation November 13, 2002
2 Agenda
- Project goals
- Help Desk benchmarking: goals & benchmark metrics; initial data comparisons; findings & hypotheses; quick wins and lessons; tools and timelines to get there
- Benchmarking as a methodology
3 Benchmarking Project Goals
- Help Desk specific: enable comparisons between institutions; develop an ongoing management tool (determine metrics, develop initial comparisons & findings, identify tools needed and future plans)
- Benchmarking in higher education: enable comparisons among schools and industry; develop a methodology; provide a test case; develop a strategy to expand
See additional context in Appendix 1, Project History & Goals.
4 Benchmarks must tie to management goals
1. Support client needs with quality service
2. Be responsive
3. Be cost effective
4. Provide appropriate level of investment
5. Develop effective, mature processes
6. Maintain high-performing, competent team
7. Support rollout of new systems
5 Goals must tie to specific metrics
- Be cost effective / invest appropriately: % of budget; cost per case by topic; clients served per FTE; total costs by topic; cases by media, including self-help
- Support customer needs with high-quality service: annual customer survey; spot surveys on selected transactions
- Be responsive: elapsed time per case; call abandonment; hold time; time to answer
- Support rollout of new systems: case volume by topic 3 months before and after launch; minutes per case
- Develop high-performing, competent teams: employee satisfaction survey; individual performance metrics; team performance metrics; training $ per FTE; % Help Desk certification; case volume compared to staff skills mix
- Develop effective, mature processes: number of contacts vs. number of days to resolve; origin of Help Desk cases
See Appendix 6 for discussion of the specific data elements used to calculate benchmarks.
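To make the mapping concrete, here is a minimal sketch of the slide's goal-to-metric mapping held as a plain data structure that a reporting tool could iterate over; the dictionary shape is an assumption, and only a few of the goals are shown.

```python
# Hypothetical data structure for the slide's goal-to-metric mapping;
# the shape is an assumption, the metric names come from the slide.
GOAL_METRICS = {
    "Be cost effective / invest appropriately": [
        "% of budget", "Cost per case by topic", "Clients served per FTE",
        "Total costs by topic", "Cases by media, including self-help",
    ],
    "Be responsive": [
        "Elapsed time per case", "Call abandonment", "Hold time", "Time to answer",
    ],
    "Support rollout of new systems": [
        "Case volume by topic 3 months before and after launch", "Minutes per case",
    ],
    "Develop effective, mature processes": [
        "# of contacts vs. # of days to resolve", "Origin of Help Desk cases",
    ],
}

for goal, metrics in GOAL_METRICS.items():
    print(goal)
    for metric in metrics:
        print(f"  - {metric}")
```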
6 Caveat: the best benchmarking is yet to come
- Problems with historical data: assumptions; extrapolations; some data simply missing; similar but very different operations
- Common metrics going forward
Don't focus too heavily on current data; look to the future!
7 Context for Comparison: Help Desk Operations
- Organization: MIT, consolidated plus unquantified support in academic departments; Stanford, distributed support model across ITSS
- Structure: MIT, single unit with junior/senior staff mixed and an informal 15-minute limit; Stanford, Tier 1 & 2 (10-minute limit at Tier 1)
- Location: MIT, single separate location; Stanford, multiple distributed locations
- Offices: MIT, cubes plus call center; Stanford, individual offices
- Staffing: MIT, 2-4 hour blocks with many students; Stanford, full-time assignments
- ACD: MIT, 4 phone numbers; Stanford, 1 published number, then a phone tree
- Tool: MIT, home-grown Java; Stanford, customized Remedy
- Media: MIT, heavy email (50%+); Stanford, heavy web form
See more details in Appendix 3, How Each Help Desk Works.
8 Context for Comparison: FY02 Sizing
Demographics (MIT / Stanford / Variance):
- Faculty & Staff: 9,230 / 10,792 / 17%
- Students: 10,204 / 14,173 / 39%
- Total Population: 19,434 / 24,965 / 28%
- University Consolidated Budget ($K): $1,535,949 / $1,937,900 / 26%
IT department information (MIT / Stanford / Variance):
- Annual Base Budget: $x / $x (1) / 83%
- Full Time Staff (FTE): ... / ... / ...%
Help Desk information, with and without students (MIT / Stanford / Variance):
- Annual Base Budget: $x / $x (2) / 5%
- Full Time Staff (FTE): ... / ... / ...%
- Tickets Processed: 43,553 / 56,... / ...%
(1) Includes providing telecommunications for Stanford's hospital.
(2) Does not include the Student Help Desk, which has no tracking/ticketing system. The approximate increase with students would be +$275K and +5 or 6 FTEs.
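The Variance column is consistent with computing Stanford's figure relative to MIT's; a minimal sketch, checked against the four fully legible rows above:

```python
# Variance = Stanford's figure relative to MIT's, verified against the
# published rows of this table.
def variance(mit, stanford):
    return stanford / mit - 1

rows = {
    "Faculty & Staff": (9_230, 10_792),                 # slide shows 17%
    "Students": (10_204, 14_173),                       # slide shows 39%
    "Total Population": (19_434, 24_965),               # slide shows 28%
    "University Budget ($K)": (1_535_949, 1_937_900),   # slide shows 26%
}
for label, (mit, su) in rows.items():
    print(f"{label}: {variance(mit, su):.0%}")
```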
9 Gauging investment and effectiveness (MIT / Stanford / Variance)
- IT Dept Budget / University Budget: 2.9% / 4.2% / ...
- Help Desk Budget / IT Budget: 4.2% / 2.3% / ...
- Tickets / School Population: ... / ... / ...%
- Population per HD Employee: 712 / 1,... / ...%
- Tickets / Help Desk FTE: 1,595 / 3,017 / 89%
- Help Desk Budget / Ticket: $41.83 / $... / ...%
(1) This ratio's meaningfulness is limited because it does not include Student Help Desk numbers, which have no tracking/ticketing system.
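A minimal sketch of how these derived ratios fall out of the slide-8 inputs. MIT's published counts are used directly; the dollar inputs were redacted ("$x") in the source, so the budget values below are back-calculated from the published ratios and are illustrative only:

```python
# Derived help desk benchmark ratios. Published MIT counts are used where
# the deck gives them; budget inputs are back-calculated and illustrative.
def benchmark_ratios(population, tickets, hd_ftes, hd_budget, it_budget, univ_budget):
    return {
        "IT Dept Budget / University Budget": f"{it_budget / univ_budget:.1%}",
        "Help Desk Budget / IT Budget": f"{hd_budget / it_budget:.1%}",
        "Tickets / School Population": f"{tickets / population:.2f}",
        "Population per HD Employee": f"{population / hd_ftes:,.0f}",
        "Tickets / Help Desk FTE": f"{tickets / hd_ftes:,.0f}",
        "Help Desk Budget / Ticket": f"${hd_budget / tickets:,.2f}",
    }

mit = benchmark_ratios(
    population=19_434,          # slide 8
    tickets=43_553,             # slide 8
    hd_ftes=27.3,               # implied: 43,553 tickets / 1,595 per FTE
    hd_budget=1_821_800,        # implied: $41.83 per ticket * 43,553 tickets
    it_budget=43_400_000,       # implied: Help Desk budget / 4.2%
    univ_budget=1_535_949_000,  # slide 8 figure, read as $000s
)
for name, value in mit.items():
    print(f"{name}: {value}")
```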
10 Goal: Be cost effective
[Chart: Cost per Ticket, MIT vs. Stanford, $0-$200 scale, across the 14 topic categories used throughout (Accounts through Web).]
See supporting data in Appendix 9.
11 Goal: Be cost effective
[Chart: Total Annual Cost by Help Desk Topic, MIT vs. Stanford, $0-$480,000 scale, across the same 14 topic categories.]
See supporting data in Appendix 9.
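Per slide 5, the cost-per-case-by-topic metric behind both charts divides the total annual cost attributed to a topic by that topic's case count. A minimal sketch with hypothetical figures (the real values are in Appendix 9):

```python
# Hypothetical topic costs and case counts; the real figures are in the
# deck's Appendix 9. Cost per ticket = total annual cost / cases, per topic.
topic_cost = {"Connectivity": 320_000, "Security/Virus": 160_000, "Printing": 40_000}
topic_cases = {"Connectivity": 6_200, "Security/Virus": 1_100, "Printing": 950}

cost_per_ticket = {topic: topic_cost[topic] / topic_cases[topic] for topic in topic_cost}
for topic, cost in sorted(cost_per_ticket.items(), key=lambda kv: -kv[1]):
    print(f"{topic}: ${cost:,.2f} per ticket")
```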
12 Goal: Be cost effective
[Chart: MIT First Contact Help Desks, cases and estimated complexity of case by topic: Accounts/IDs/Auth, Backup, Business App Support, Business Functions, Cluster, Connectivity, Courseware, Hardware, OS Software, Other, Printing, Productivity SW, Security/Virus, Web.]
See supporting data in Appendix 9.
13 Goal: Be cost effective
[Chart: Stanford Help Desks, cases and complexity by topic, split by tier where resolved (Level 1, Level 2, Other), across the same topic categories.]
See supporting data in Appendix 9.
14 Goal: Support rollout of new systems
[Chart: Impact of new business applications on trouble-ticket volume, 12 months of FY02 (Aug-Jul), 0-7,000 tickets, split into ITSS Help Desk Level 1, ITSS Help Desk Level 2, legacy applications, and new (rollout) applications. Annotated spikes: Kronos/HR PeopleSoft in January, Axess problems in March, HR salary setting in May.]
Some rollout application tickets are included in Help Desk Levels 1 & 2. See supporting data in Appendix 9 (Stanford data only).
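Slide 5 defines the rollout metric as case volume by topic three months before and after launch. A minimal sketch with hypothetical monthly counts for one rolled-out application:

```python
# Hypothetical monthly ticket counts tagged to one rolled-out application.
monthly_tickets = {
    "2001-10": 120, "2001-11": 150, "2001-12": 140,
    "2002-01": 900, "2002-02": 640, "2002-03": 480,  # launch in January
}

launch = "2002-01"
months = sorted(monthly_tickets)          # ISO month strings sort chronologically
i = months.index(launch)
before = sum(monthly_tickets[m] for m in months[max(0, i - 3):i])
after = sum(monthly_tickets[m] for m in months[i:i + 3])
print(f"3 months before: {before}, 3 months after: {after}, ratio: {after / before:.1f}x")
```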
15 Preliminary data offers initial observations
- Implementation choices affect Help Desk costs. MIT: connectivity to the desktop; security, Kerberos. Stanford: accounts and authentication; business apps.
- Time is money: topics are expensive when they are complex, must escalate, or relate to unique applications.
- Specialists are required more frequently for unique, proprietary issues.
- System rollouts create overall spikes, and some dips in specific areas.
16 Initial observations
- Student employees: the MIT Help Desk employs more students at a lower overall budget, yielding more FTEs, but it is difficult at present to gauge the overall effectiveness of using students.
- Structured tiers: using structured tiers may support a greater number of cases; resolving at Tier 1 significantly reduces costs. You can either tier the work through process choices, or tier the staff to handle only certain types of work.
- Media of case submission may affect costs: web submission may support a greater number of cases per FTE.
17 Goal: Be responsive
Responsiveness to phone calls (Stanford Tier 1 only, Jan-Aug 02 / MIT all desks, Jul 01-Jun 02):
- Average Speed to Answer (seconds): ... / ...
- Abandon Rate: 17% / 13%
- Time before Caller Abandons (seconds): ... / ...
- Average Call Length: 252 s (4.2 min) / 415 s (6.9 min)
- Time between Calls (for staffer): 4.9 min / ...
- Average # of Calls Monthly: ... / ...
- HDI industry comparisons: ...% / 5 min.
Definitions: Speed to Answer = time a call waited before being answered by a person. Abandon Rate = % of calls where the customer hung up, including callers who hung up after reaching voice mail. Call Length = time from when a call is answered to when it is released. Time between Calls = time between calls that an ACD agent handles; the agent closes out a ticket before getting back into the queue to receive calls.
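A minimal sketch of these phone metrics computed from per-call ACD records; the record fields are assumptions, since neither help desk's actual ACD export is described:

```python
# Per-call ACD record with assumed fields; neither help desk's actual
# export format is described in the source.
from dataclasses import dataclass

@dataclass
class CallRecord:
    wait_seconds: float      # time before the call was answered or abandoned
    answered: bool           # False if the caller hung up (or hit voice mail)
    talk_seconds: float = 0  # answer-to-release time; 0 if abandoned

def responsiveness(calls):
    answered = [c for c in calls if c.answered]
    abandoned = [c for c in calls if not c.answered]
    return {
        # Speed to Answer: time a call waited before being answered by a person
        "avg_speed_to_answer_s": sum(c.wait_seconds for c in answered) / len(answered),
        # Abandon Rate: share of calls where the customer hung up
        "abandon_rate": len(abandoned) / len(calls),
        "avg_time_before_abandon_s": sum(c.wait_seconds for c in abandoned) / len(abandoned),
        # Call Length: time from answer to release
        "avg_call_length_s": sum(c.talk_seconds for c in answered) / len(answered),
    }

calls = [CallRecord(20, True, 250), CallRecord(95, False), CallRecord(35, True, 400)]
print(responsiveness(calls))
```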
18 Goal: Support customers with high-quality service
Customer satisfaction appears comparable across institutions.
[Chart: MIT Computing Help Desk (all levels) and Stanford Help Desk (Level 1), "snapshot" surveys after recently closed cases, annualized averages on a 5-point Likert scale: timeliness of response, quality of resolution, courtesy and professionalism, technical competence, overall satisfaction; MIT 2001 vs. SU 2002.]
Improvement efforts do bear fruit, as shown in MIT's two annual survey results.
[Chart: MIT Computing Help Desk (all levels), IS annual customer satisfaction surveys, Likert-scale scores for timeliness of response, quality of resolution, courtesy and professionalism, technical competence, overall satisfaction, ability to get through, and turnaround time.]
See supporting data in Appendix 5.
19 Goal: Develop effective, mature processes
The vast majority of cases are resolved quickly.
[Chart: Help Desk overall time to close cases; % of interactions by days to close, with surviving tail buckets at 9.23%, 10.95%, 4.09%, 3.40%, and 2.35%. Fewer days is desirable; the longest-running cases reflect a neglected case or an unresponsive customer.]
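A minimal sketch of the distribution behind this histogram: bucket each closed case by days open and report the share per bucket. The bucket edges are assumptions; the slide's exact bins did not survive transcription:

```python
# Bucket days-to-close into a histogram; the edges here are assumed, since
# the slide's exact bins did not survive transcription.
from collections import Counter

def close_time_distribution(days_to_close, edges=(1, 2, 3, 5, 10, 30)):
    def bucket(days):
        for edge in edges:
            if days <= edge:
                return f"<= {edge} days"
        return "More"
    counts = Counter(bucket(d) for d in days_to_close)
    total = len(days_to_close)
    labels = [f"<= {e} days" for e in edges] + ["More"]
    return {label: counts.get(label, 0) / total for label in labels}

sample = [0.2, 0.5, 1.5, 1.0, 4, 2, 8, 45, 0.1, 3]  # hypothetical cases
for label, share in close_time_distribution(sample).items():
    print(f"{label}: {share:.1%}")
```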
20 Goal: Maintain high-performing, competent team
Individual performance can vary greatly; the Hawthorne effect must be considered.
[Chart: Stanford IT Help Desk (Level 2), hours logged in FY02 by managers and staff members A through I, some part-time (55%, 75%, 80%); surviving data points include Staff Member I at 461 hours and Staff Member E (55%) at 532 hours.]
21 Goal: Maintain high-performing, competent team
More team- or employee-related metrics are desirable but need more high-level discussion:
- Employee satisfaction survey
- Customer satisfaction tied to individual service providers
- Individual performance metrics (MIT already tracks by team; Stanford tracks by individual)
- Help Desk certifications
- Behavioral and/or technical competencies
22 Initial performance data also yields some observations
- Customer satisfaction: appears comparable across both institutions; improvement efforts did increase satisfaction over two years (MIT).
- Process effectiveness: better categorization and identification of the work may enable faster escalation (get to the right place faster); the vast majority of cases are resolved quickly.
- Employee performance: can vary significantly among individuals; metrics do affect behavior (Stanford).
23 The data also raise good questions
- Processes: Which cases should be escalated more quickly? Should you tier the work or the organization? How does web submission affect cost?
- Staffing: Is student employment effective? What additional training should Tier 1 receive? How should each institution use employee performance data?
- Support-intensive systems: Should support-intensive systems be replaced? How can we help with new system design to minimize Help Desk requirements?
- Investments to improve efficiencies: Which tools should we acquire to improve performance?
24 Quick Wins for Implementation (status: MIT / Stanford)
Tracking:
- Track tickets at the student-staffed Unix desk: In Place / Quick Win
- Track internal hand-offs or tiers/escalations explicitly: Quick Win / In Place
- Standardize work reporting categories: Quick Win / Quick Win
- Track type of media for each case: Quick Win / In Place
- Consolidate reporting functions into one ticket system: Quick Win / Quick Win
- Examine excess ticket counts in specific categories: In Place / Quick Win
Customer feedback:
- Initiate or increase transaction-based spot surveys: Quick Win / Quick Win
Proactively use data:
- Generate & review weekly metric reports: Quick Win / Not quick
- Generate weekly standards for tickets processed or time spent, and use them as part of individual performance management: Quick Win / Quick Win
Reconfigure space:
- Reconfigure space and move staff to allow for more efficiency and collaboration: In Place / Quick Win
25 ...of Help Desk Benchmarking
Software or hardware investments:
- Scope Remedy changes needed for benchmarking; engage consultant
- Casetracker, to allow consultant to track touch minutes per case and escalations (tiers) both within and outside the Help Desk
- Knowledge Management system: pilot use to help HD staff retrieve standard answers; evaluate usefulness of client use (self-help)
- ACD call-board to display calls waiting in queue
- Create dashboard reports and a process for regular reporting
- Self-serve password reset tools
Customer feedback:
- Collaborate on annual customer survey
- Define process for using customer survey responses
Management:
- Create cross-functional ITSS team for Delphi rollout
- Institute regular review of metrics with finance
- Create Help Desk Standard Operating Procedures & Handbook
High-performing team:
- Solicit employee feedback for process and job improvement
- Track % of HD certifications and training $ per employee
(MIT and Stanford status markers (In Place, Long Term) appear in the source, but their alignment to rows was lost in transcription.)
26 Cost to Implement Metrics (next 6 months, MIT / Stanford)
Software or hardware investments:
- Remedy consultant to program changes
- Casetracker consultant
- Knowledge Management system: pilot use to help HD staff; evaluate use of client self-serve
- ACD call-board to display queued calls
- Self-serve password reset tools
- Joint customer satisfaction survey
- Creation of Standard Operating Procedures: self
- Creation of dashboard
Cost figures appearing on the slide: $75K, $60K, $300K, $300K, $15K, $60K, $10K, $10K, $32K, $15K; their alignment to specific items and to the MIT/Stanford columns was lost in transcription.
27 Stanford: Help Desk Dashboard (monthly)
[Sample dashboard panels: cases by media (calls, walk-in, self-help); customer satisfaction spot ticket surveys, marked "to be developed" (IT Help Desk, Tech Support, Student, scored on courtesy, tech knowledge, overall satisfaction); Level 1 Help Desk tickets created by employee; average call length (Tier 2); phone-in statistics (abandon, time to answer, hold time), actual vs. goal; problems by cause, Tier 1, current vs. previous month; % problem resolution by tier (Tier 1/2/3) across topics (Account ID Authority, Bus App, Cluster, Security/Virus, Connectivity, Data Backup, Other, HW, Print, SW-OS, SW-Personal, Web, Telecom).]
28 MIT: Help Desk Dashboard (monthly)
[Sample dashboard panels: MIT First Contact Help Desks cases by method (messages: email, web; interactive: voice, chat; in-person: walk-in, site visit; internal: transfers, referrals); cases by complexity (simple/Tier 1, specialist/Tier 2, referral/Tier 3) across the 14 topic categories; customer satisfaction spot ticket surveys, marked "to be developed"; interactive communications (phone, chat): answered %, call length for specialist and simple calls (0-600 s), time to answer (0-60 s), hold time (0-60 s), actual vs. goal; message communications (email, web requests): time to first response, time between exchanges, number of exchanges, actual vs. goal; time to close cases.]
Notes: 1) The MIT ACD currently does not collect hold time. 2) Times are in seconds. 3) MIT currently cannot distinguish specialist from simple calls; the actual figure is an average.
Notes: This is a sample dashboard showing metrics to be delivered on a monthly basis, pending some tool improvements. 1. The Cases by Complexity graph shows real ticket counts and categorizations; complexity is currently estimated based on time to resolve. 2. We do not currently have complete Help Desk interactive communications data, such as time to first response; this graph is a placeholder. 3. Cases by Method shows additional data: the graph breaks out email and web requests; ultimately we want to consolidate this data into the categories shown in the graph's legend.
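A minimal sketch of how a monthly feed for dashboard panels like these might be produced from a ticket extract, assuming a simple tabular schema; neither Remedy's nor MIT's home-grown tool's actual fields are given in the source:

```python
# Hypothetical ticket extract; the column names are assumptions, not the
# schema of either help desk's actual tool.
import pandas as pd

tickets = pd.DataFrame({
    "month": ["2002-07", "2002-07", "2002-08", "2002-08", "2002-08"],
    "media": ["phone", "web", "phone", "email", "walk-in"],
    "tier":  [1, 1, 2, 1, 1],
    "agent": ["A", "B", "A", "C", "B"],
})

# Cases by media, per month (one dashboard panel)
print(tickets.groupby(["month", "media"]).size().unstack(fill_value=0))

# Tickets created per employee, per month (another panel)
print(tickets.groupby(["month", "agent"]).size().unstack(fill_value=0))

# Share of cases resolved at each tier
print(tickets["tier"].value_counts(normalize=True))
```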
29 Next Steps for MIT/Stanford Help Desk Benchmarking
Timeline (months December through September):
- Implement selected metrics (dashboard) and quick wins
- Track common metrics; re-evaluate (ongoing)
- Begin software modifications
- Implement Knowledge Management system and ACD call board
- Implement customer spot surveys (ongoing); then annual customer survey (annual?)
- Implement management and operations changes
- Consider inviting others to benchmark
(A legend symbol in the original marks on-site visits.)
