Measuring the Service Performance of Information Technology Departments: An Internal Service Management Approach




Abstract

Measuring the Service Performance of Information Technology Departments: An Internal Service Management Approach

Helen Kang and Dr Graham Bradley
School of Accounting
University of New South Wales
Sydney 2052, Australia
Email: helen.kang@unsw.edu.au

In recent years, internal services such as Information Technology (IT) have become critical to organisational control because of the enormous size of their expenditure. As a result, managers have faced growing pressure to measure the performance of IT departments. However, the applicability of traditional performance measures in an IT setting is at best questionable, and there is therefore an urgent need for alternative Performance Measurement (PM) systems. This study develops an alternative PM system based on the concept of service quality and SERVQUAL, and then uses it to measure the service performance of an IT department in one of the leading universities in Australia.

Keywords: Internal Services, IT Performance Measurement, Service Quality, SERVQUAL
BA0208, DB02, EI0204, EI0206.03, EI0207

INTRODUCTION

As many organisations search for ways to compete more effectively in today's ever-growing markets, managers are paying more attention to internal services, that is, any services provided and received within an organisation. One internal service function that has gained particular attention is Information Technology (IT), owing to the enormous size of its expenditure. In the U.S., for example, IT expenditure has been estimated at 2.2% of all revenue: "a hefty chunk of most companies' after-tax profit margins" (Axson 1996, p. 16). As a result, measuring the performance of IT departments has become one of the most critical aspects of organisational control. Unfortunately, the applicability of traditional financial performance measures in the IT setting is at best questionable.
Indeed, organisations have struggled to make traditional measures reflect the service performance of IT departments and their largely intangible benefits, and there have been urgent calls for new types of performance indicators. One alternative measure that has recently become quite popular in the IT setting is service quality (Watson et al. 1993); it is proposed that the concept of service quality can be applied to measure the performance of IT departments. In this study, a conceptual model of IT service quality was developed based on the service quality gaps model first introduced by Parasuraman et al. (1985). The service performance of an IT department was then measured using a modified version of SERVQUAL, the service quality measurement instrument first introduced by Parasuraman et al. (1988).

Proc. 10th Australasian Conference on Information Systems, 1999

MEASURING THE PERFORMANCE OF IT DEPARTMENTS

In the past, IT departments dealt primarily with providing secondary support to other departments such as sales, finance and customer services. That concept of secondary support has now been superseded: an IT department is no longer merely a function integrated into a discernible workflow, but a free-standing department that provides legitimate and important internal services to other divisions or work units (Watson et al. 1993). With the growing importance of IT departments and their services, managers began to realise that there was a real need to measure their service performance, given the large investments and expenditures made in the name of IT service. Traditionally, performance measures that are mostly financial were used to assess the performance of IT departments. Unfortunately, these traditional measures were found to have serious shortcomings when used to measure the service performance of IT departments (McKeen et al. 1993), and it became apparent that alternative measures of IT performance were needed. One such alternative measure that has recently become quite popular is service quality.

The concept of service quality is not new. It originated in the field of marketing, which proposes that organisations need to understand and measure customer expectations (Parasuraman et al. 1985). That is, organisations must listen continuously to their customers in order to improve the level of service quality they provide and, consequently, enhance their overall organisational performance. Using service quality to measure the performance of IT departments reflects the acknowledgment that their service performance can be determined by how customers of IT services within an organisation perceive them.
The quality of the IT services being provided then becomes essential to the management control of IT departments. That is, there is a need to develop a conceptual model of IT service quality and accompanying measurement systems. Indeed, existing IT studies have concluded that specific IT performance measures, such as heavier systems usage, improved decision making via greater information timeliness and quality, direct cost savings, increased revenue, greater user proficiency, and increased user satisfaction, do not necessarily justify the expenditures of IT investments (Galletta et al. 1989). Thus, with the continuing problems organisations face when measuring the performance of IT departments, a growing number of managers and researchers have turned to service quality as an alternative (Watson et al. 1993, Kettinger et al. 1994, Pitt et al. 1995, Kettinger et al. 1997).

Service Quality: The Measurement of IT Performance?

Service quality is an abstract and elusive construct because of three features unique to service delivery: intangibility, heterogeneity, and inseparability of production and consumption. None of these features can be captured by traditional performance measures, since those are based on manufactured products that are tangible, homogeneous, and separable from their production and consumption. According to Parasuraman et al. (1994), most previous service quality studies concentrated on the general nature of service quality and its components; while the importance of quality was becoming more widely recognised, its conceptualisation and measurement typically remained understudied. To fill this research void, Parasuraman, Berry and Zeithaml began a systematic, multi-phased research programme in the mid-1980s focusing on the concept and measurement of service quality (Parasuraman et al. 1988, 1991a, 1991b, 1991c, 1993).

After the initial conception of their service quality gaps model in 1985, they began developing an instrument for quantifying customers' assessment of service quality performance. The result was SERVQUAL, a concise multiple-item scale designed to measure service quality along five dimensions of service (Parasuraman et al. 1988). SERVQUAL contains 22 pairs of Likert-type items, where each item is recast into two statements: one statement measures customers' expectations about organisations in general within the service category being investigated, and the matching statement measures their perceptions of the particular organisation whose service quality is being assessed. The items are presented in a 7-point response format anchored by "strongly agree" and "strongly disagree". Service quality is then measured by calculating the difference scores between corresponding items, that is, the difference between customers' perceptions and expectations of service, along the five dimensions. The five distinct but correlated dimensions of service are:

Tangibility - appearance of physical facilities, equipment, personnel and communication materials;
Reliability - ability to perform the promised service dependably and accurately;
Responsiveness - willingness to help customers and provide prompt service;
Assurance - knowledge and courtesy of employees and their ability to inspire trust and confidence; and
Empathy - caring, individualised attention the firm provides its customers.

To examine the applicability of the SERVQUAL instrument in the IT setting, several empirical IT studies adapted the concept of service quality and the SERVQUAL instrument to measure the performance of IT service. These studies also considered whether the use of difference scores and the above dimensionality of service would be applicable in the IT setting. Kettinger et al.
(1994) applied the SERVQUAL instrument to provide more specific information about how users of an IT department perceive the quality of the IT services being provided. The study posited that existing measures of IT performance, such as user satisfaction, may not be comprehensive enough to capture the more detailed dimensions of service quality covered in SERVQUAL, and concluded that SERVQUAL can provide additional focus in measuring the functional dimensions of IT service. The study also advocated the use of the difference-score measure, since it provides a superior indicator of customer satisfaction through its mechanism for gauging the magnitude of the difference between a user's expectations and perceptions. This is also supported by Pitt et al. (1995), which proposed another advantage of SERVQUAL: because it is a general measure of service quality, it is well suited to benchmarking. That is, IT managers can potentially use SERVQUAL to benchmark their performance against other departments and organisations in the same industry. Pitt et al. (1997) reaffirmed their 1995 study and restated that the five dimensions of service quality seem to be as applicable to the IT setting as to any other organisational setting. It also agreed that while perceptions-only measurement of service quality has marginally better predictive and convergent validity, this comes at a considerable expense to managerial diagnostics. In addition, the study contended that, at least in theory, the three-column-format SERVQUAL, the newest version of the instrument, has the most potential as a measurement of IT service performance. Kettinger et al. (1997) also supported this.
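The scoring mechanics described above (a perception-minus-expectation difference score per item, averaged within each of the five dimensions) can be sketched in a few lines. The item-to-dimension mapping and the sample ratings below are illustrative assumptions, not data from the paper:

```python
# Sketch of classic SERVQUAL difference scoring. Each of the 22 items is rated
# twice on a 1-7 scale (expectation and perception); dimension scores are the
# mean of (perception - expectation) over that dimension's items. The index
# allocation below follows the usual 4/5/4/4/5 split, but treat it as an
# assumption rather than the instrument's authoritative layout.
DIMENSIONS = {
    "tangibility":    range(0, 4),
    "reliability":    range(4, 9),
    "responsiveness": range(9, 13),
    "assurance":      range(13, 17),
    "empathy":        range(17, 22),
}

def servqual_scores(expectations, perceptions):
    """Mean (perception - expectation) per dimension; negative means falling short."""
    diffs = [p - e for p, e in zip(perceptions, expectations)]
    return {dim: sum(diffs[i] for i in idx) / len(idx)
            for dim, idx in DIMENSIONS.items()}

# Illustrative respondent: expects 6 on every item, perceives 5 on every item.
scores = servqual_scores([6] * 22, [5] * 22)
```

A uniform one-point shortfall produces a score of -1 on every dimension, which is the kind of per-dimension diagnostic the difference-score approach is valued for.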

CONCEPTUAL MODEL OF IT SERVICE QUALITY

In a further development, Parasuraman et al. (1994) proposed that there are three distinct levels of service: desirable, adequate and actual. For the purposes of this study, these three service levels are applied to the IT setting and re-defined as:

Ideal level of IT service - the level of service IT customers (suppliers) would like to receive (provide) in order to meet the customers' requirements, based on their needs and past experience;
Acceptable level of IT service - the minimum (feasible) level of service IT customers (suppliers) are willing to receive (provide) given the constraints of personnel, technology and organisational limitations; and
Perceived level of IT service - the actual level of service perceived by IT customers (suppliers).

By integrating these three levels of IT service into the original gaps model of Parasuraman et al. (1985), a new conceptual model of IT service quality was developed in this study; it is shown in Figure 1. The major difference between the two models is the acknowledgment in the new model that IT customers are aware of the limitations imposed on IT suppliers by personnel, technology and other organisational factors. In addition, the gaps between customers' and suppliers' perceptions of the three levels of IT service are also acknowledged. The new conceptual model identifies 7 gaps between suppliers and customers of IT service.
These are defined as:

Between suppliers and customers of IT service:
Gap 1: the difference between IT suppliers' and customers' perceptions of the Ideal level of IT service;
Gap 2: the difference between IT suppliers' and customers' perceptions of the Acceptable level of IT service;
Gap 3: the difference between IT suppliers' and customers' perceptions of the Actual level of IT service;

For customers of IT service:
Gap 4: the difference between the IT service level customers would like to receive and what they would accept, given the limitations due to personnel, technology and other organisational factors;
Gap 5: the difference between the IT service level acceptable to customers and the actual level of IT service perceived by customers;

For suppliers of IT service:
Gap 6: the difference between IT suppliers' perception of what customers require and the level of IT service they can provide given the constraints due to personnel, technology and other organisational factors; and
Gap 7: the difference between the IT service level suppliers can provide and the actual level of IT service being provided.

It is proposed that understanding these 7 gaps is essential to the successful management of IT service quality. Furthermore, measuring these gaps using the SERVQUAL instrument will provide IT managers and customers with an improved understanding of the requirements, expectations and performance being achieved by the IT department.
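Reading the seven definitions as arithmetic on mean ratings at the three service levels, the gap scores reduce to simple subtractions. The sketch below is a minimal illustration; the mean ratings are invented, and the sign conventions for Gaps 1-3 (supplier minus customer) are an assumption, since the paper reports magnitudes:

```python
def seven_gaps(customer, supplier):
    """Compute the paper's 7 gaps from mean ratings at the three service levels.
    `customer` and `supplier` are dicts keyed "ideal", "acceptable", "actual"
    (mean scores on the 7-point scale)."""
    return {
        1: supplier["ideal"] - customer["ideal"],            # ideal-level disagreement
        2: supplier["acceptable"] - customer["acceptable"],  # acceptable-level disagreement
        3: supplier["actual"] - customer["actual"],          # actual-level disagreement
        4: customer["ideal"] - customer["acceptable"],       # customer: desired vs tolerated
        5: customer["acceptable"] - customer["actual"],      # customer: tolerated vs perceived
        6: supplier["ideal"] - supplier["acceptable"],       # supplier: required vs deliverable
        7: supplier["acceptable"] - supplier["actual"],      # supplier: deliverable vs delivered
    }

# Hypothetical mean ratings, purely for illustration:
gaps = seven_gaps(
    customer={"ideal": 6.5, "acceptable": 5.4, "actual": 5.0},
    supplier={"ideal": 6.6, "acceptable": 5.7, "actual": 4.9},
)
```

With these made-up means, Gap 4 is 1.1 and Gap 7 is 0.8, mirroring the pattern the paper later reports: the largest gaps sit between the ideal and acceptable levels.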

[Figure 1: The Conceptual Model of IT Service Quality. The figure links IT customer needs and past experience to the ideal, acceptable and perceived actual levels of IT service on the customer side, and the translation of perceived customer requirements into deliverable and actually provided service levels on the supplier side, with Gaps 1-7 marked between them.]

RESEARCH METHODOLOGY

The research design consisted of a SERVQUAL survey questionnaire, distributed to staff members who are customers and suppliers of IT service in the Faculty of Commerce and Economics at one of the leading universities in Australia. A university setting was selected because of the researcher's access to its facilities and staff members. The Faculty of Commerce and Economics was selected mainly because of the existence of an internal IT department within the Faculty, the Technology Support Group (TSG), which provides a variety of IT services to Faculty staff. For the purpose of this study, the components of the conceptual model of IT service quality developed in Figure 1 are defined as follows:

Suppliers - any member of the Technology Support Group (TSG), the staff-only IT service department exclusive to the Faculty of Commerce and Economics;
Customers - any staff member of the Faculty of Commerce and Economics, including academics, administrative assistants and research fellows, who has access to the TSG's IT services and has used them at least once; and
IT Service - the variety of IT services offered by the TSG, including the management of staff inquiries and follow-up requests, computer technical support for both hardware and software, virus prevention and removal, installation of operating systems, applications and peripherals, technical support for administration, teaching and research, and general customer assistance.

The Faculty of Commerce and Economics consists of 9 Schools, the Dean's Unit, and the Faculty office. It has over 300 academic and support staff, all of whom have access to IT services from the TSG if they require them. In total, 106 survey questionnaires were distributed and 98 were returned, a response rate of 92%. All 9 members of the TSG returned their questionnaires. Of the 98 questionnaires returned, 75 participants indicated that they had used the TSG's IT services at least once, and these were selected for this study.

Modification of SERVQUAL

Kettinger et al. (1997) previously developed a modified version of SERVQUAL to measure IT performance. In their study, the 22 original SERVQUAL questions were condensed into 13 by omitting the tangibility dimension of service discussed above. The rationale for the omission was that most IT services are provided in the customers' own settings due to the nature of the services being requested.
That is, it is rare for customers to visit the IT department with problems, since they often don't know what the problem is, and rarer still for customers to consider the visual appeal of the IT department important. This rationale is also adopted in the current study. For the purposes of the current study, the 13 questions in the 2-column format (perception and expectation) used by Kettinger et al. (1997) were further modified into 16 statement-like questions in a 3-column, 7-point response format to measure the three levels of IT service discussed above. The additional questions were added following several discussions with the TSG manager and two academics with wide knowledge of IT research; see Appendix 1. Although the 3-column-format SERVQUAL is uncommon, it is also considered to have the most potential in IT research (Kettinger et al. 1997). Furthermore, a pilot test of the questionnaire carried out on 9 randomly selected members of the Faculty validated its consistency. This format enabled the three levels of IT service to be listed in one row, directing participants to complete the questionnaire by question, not by service level. The rationale was that, in order to measure the gaps between the three IT service levels, participants must consider each service level relative to the other two. That is, participants should consider each of the 16 statements individually and rate it for the three different levels of IT service. Furthermore, existing studies have not considered the impact of administering the SERVQUAL questionnaire to BOTH suppliers and customers of a service. This posed an interesting dilemma for the survey instrument: the 16 statements in the 3-column format had to be rephrased to suit two different groups of participants. However, the changes were limited to wording only; the nature of the questions remained the same for both groups.
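The three-column format described above amounts to one row per statement with three ratings per row. A minimal sketch of that data shape follows; the field names and example statement are illustrative assumptions, not the instrument's actual wording:

```python
from dataclasses import dataclass

@dataclass
class ItemResponse:
    """One row of the 3-column questionnaire: a single statement rated on a
    7-point scale at each of the three IT service levels."""
    statement: str
    ideal: int       # level the respondent would like to receive (or provide)
    acceptable: int  # minimum level given personnel/technology constraints
    actual: int      # level actually perceived

# A respondent's answer sheet would be a list of 16 such rows, e.g.:
row = ItemResponse("Requests are resolved within a reasonable timeframe", 7, 5, 4)
```

Keeping the three ratings in one record mirrors the instrument's design intent: each statement is judged across the three levels at once, which is what makes the within-respondent gap comparisons (Gaps 4 to 7) possible.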

RESULTS

The Measurement System: Construct Validity

The 16 questions were factor analysed using Varimax-rotated factor-loading matrices based on the actual level and on the difference scores (ideal - actual), in keeping with previous factor analyses carried out by Parasuraman et al. (1994), to examine their dimensionality and distinctiveness; see Appendix 2. According to the analysis, only 2 distinct service dimensions were found to exist in the IT service setting used in this study. The dimensions were consistent for both the actual level of service and the difference scores, and the Cronbach reliability alphas for both dimensions were in excess of 0.85. Further analysis revealed that each dimension was indeed very distinct and specific. The 10 items loaded under Factor 1 were concerned with the TSG personnel's personal attributes, that is, with the quality of the suppliers. The 5 items loaded under Factor 2 dealt with the quality of the IT service itself, that is, with IT service attributes: are services delivered within a reasonable timeframe? Are they right the first time? Are they prompt? See Appendix 1.

The factor analysis raised several interesting points. It seems that in the IT setting, SERVQUAL items do not fit neatly into the 5 traditional service dimensions introduced by Parasuraman et al. (1988). Instead, there are only two distinct components, divided according to whether an item deals with the TSG personnel's personal attributes or with the delivery process of the IT service.

The Service Performance of TSG

As identified in the conceptual model (Figure 1), there are 7 gaps between suppliers and customers of IT service. For this analysis, the 7 gaps, that is, the difference scores, were calculated. The significance of each gap was then determined by two non-parametric tests at the 5% significance level.
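The Cronbach reliability check mentioned above follows a standard formula and can be reproduced in a few lines. This is a from-scratch sketch of the textbook alpha computation, not the authors' code, and the item scores in the example are invented:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.
    `items` is a list of item-score lists, one inner list per item,
    aligned across the same respondents.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(items)            # number of items in the scale
    n = len(items[0])         # number of respondents

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Two perfectly aligned items give the maximum alpha of 1.0:
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]])
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, so the paper's alphas in excess of 0.85 indicate that each of the two factors holds together well as a scale.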
The Mann-Whitney test, which tests for the equivalence or difference of two independent samples, was used for Gaps 1 to 3 (differences between customers' and suppliers' IT service levels), and the Wilcoxon test, which tests for differences between two related variables, was used for Gaps 4 to 7. Table 1 summarises the gap scores and their significance overall and by the two service-dimension factors identified for the study (significant gaps marked with an asterisk).

         Overall          Factor 1:                 Factor 2:
                          TSG Personal Attributes   IT Service Attributes
         Score   Sig      Score   Sig               Score   Sig
Gap 1    0.15    0.461    0.29    0.165             0.15    0.427
Gap 2    0.36    0.191    0.68    0.025*            0.05    0.907
Gap 3    0.12    0.908    0.41    0.394             0.15    0.772
Gap 4    1.08    0.000*   0.97    0.000*            1.28    0.000*
Gap 5    0.39    0.061    0.14    0.476             1.25    0.000*
Gap 6    0.87    0.011*   0.58    0.021*            1.38    0.011*
Gap 7    0.63    0.008*   0.41    0.018*            1.15    0.008*

Table 1: Gap Scores and Significance Overall and by Factors

From Table 1, it can be seen that there was no significant difference between the suppliers' and customers' overall ideal, acceptable and actual levels of IT service at the 5% significance level, nor for the Factor 2 dimension (Gaps 1, 2 and 3). That is, the suppliers' and customers' perceptions of the overall IT service performance were found to be no different from each

other. This can also be seen in Figures 2 and 4. Figure 2 shows the overall mean rankings (out of 7) for both customers and suppliers at the ideal, acceptable and actual IT service levels, and depicts the 7 gaps. Figure 3 depicts the Factor 1 dimension of IT service, and Figure 4 the Factor 2 dimension.

[Figure 2: Overall Comparison of Customer/Supplier Service Levels. Mean ratings (out of 7) for customers and the TSG at the ideal, acceptable and actual service levels, with Gaps 1-7 marked.]

However, Table 1 does show that the Factor 1 (TSG Personal Attributes) dimension of IT service had a significant Gap 2. This can be seen in Figure 3, where Gap 2 is the difference between customers and the TSG at the acceptable level of IT service. It indicates that the TSG personnel believed customers require a much higher acceptable level of service than the customers themselves do. That is, the standard the TSG personnel apply to themselves was found to be higher than the standard customers expect of the TSG.

[Figure 3: Personal Attributes Comparison of Customer/Supplier Service Levels.]

Table 1 also shows a significant difference between the ideal and acceptable levels of IT service for both suppliers and customers (Gaps 4 and 6), overall and for both factors. This indicates that both suppliers and customers were aware of the limitations imposed on the suppliers of IT service by technology, personnel and other organisational factors, and furthermore, that customers were willing to accept a level of service significantly below their ideal level. This can be seen in Figures 2 to 4, where Gaps 4 and 6 are represented by the slope of the lines between the ideal and acceptable levels of IT service.
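The two test statistics behind Table 1 can be computed from scratch. The sketch below derives only the Mann-Whitney U (independent customer vs supplier samples) and the Wilcoxon signed-rank W (paired service levels for one group); p-values would normally come from a statistics package or tables, and the sample ratings in the tests are invented:

```python
def _ranks(values):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # mean of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def mann_whitney_u(x, y):
    """U statistic for two independent samples (e.g. customers vs suppliers)."""
    ranks = _ranks(list(x) + list(y))
    r1 = sum(ranks[: len(x)])                     # rank sum of the first sample
    u1 = r1 - len(x) * (len(x) + 1) / 2
    return min(u1, len(x) * len(y) - u1)

def wilcoxon_w(a, b):
    """Signed-rank W statistic for paired samples (e.g. ideal vs acceptable)."""
    d = [ai - bi for ai, bi in zip(a, b) if ai != bi]   # drop zero differences
    ranks = _ranks([abs(v) for v in d])
    w_pos = sum(r for r, v in zip(ranks, d) if v > 0)
    return min(w_pos, sum(ranks) - w_pos)
```

Small U or W values indicate strongly separated (or consistently shifted) samples; completely non-overlapping groups drive U to 0, which is the extreme case of the significant gaps shown in Table 1.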

Gap 5, the difference between the acceptable and actual levels of IT service from the customers' perspective, was significant only for the Factor 2 dimension of IT service. This is depicted in Figure 4 by a sharp decline in the slope of the line. It indicates that although customers found the actual level of IT service not significantly different from what is acceptable overall and for the Factor 1 dimension, they did find the actual level of IT service for Factor 2 to be significantly lower than what is acceptable.

[Figure 4: IT Service Attributes Comparison of Customer/Supplier Service Levels.]

Gap 7 is shown in Table 1 to be significant overall and for both the Personal and IT Service Attributes factors. Again, this can be seen in Figures 2 to 4, where the slope of the line for the TSG between the acceptable and actual levels is quite steep. It indicates the TSG personnel's belief that their level of performance is significantly lower than the acceptable level, even though the corresponding Gap 5 from the customers' perspective was significant only for the Factor 2 dimension of IT service.

In summary, the results for Factor 1 (TSG Personal Attributes) show that:
Customers and suppliers have very similar views on the ideal levels of service required and the actual levels of service being delivered (Gaps 1 and 3 were insignificant);
They have very different views on the acceptable level of service required: customers are prepared to accept a much lower level of service than the TSG personnel (Gap 2 was significant); and
Customers perceive very little difference between the service they are prepared to accept and the service provided, whereas the TSG perceive their service as low and unacceptable (Gap 5 was insignificant, Gap 7 significant).
The results for Factor 2 IT Service Attributes show that:
- Both customers and suppliers have very similar views about the ideal and acceptable service levels required and the actual service levels being achieved (Gaps 1 to 3 were insignificant);
- Both customers and suppliers perceive a significant difference between the ideal and acceptable levels (Gaps 4 and 6 were significant); and
- The actual level of service is significantly lower than what customers and suppliers are prepared to accept (Gaps 5 and 7 were significant).
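The seven-gap comparisons summarised above can be sketched in code. The following is a minimal illustration with synthetic 7-point ratings (the study's raw responses are not reproduced here, and the group sizes and rating ranges are hypothetical): Gaps 1 to 3 compare customers with the TSG at each service level, while Gaps 4 to 7 compare pairs of levels within one group, as depicted in Figures 2 to 4.

```python
# Sketch of the seven-gap significance tests on synthetic ratings data.
# Gaps 1-3: customer vs supplier at each level (independent samples).
# Gaps 4/6: ideal vs acceptable within suppliers/customers (paired).
# Gaps 5/7: acceptable vs actual within customers/suppliers (paired).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical 7-point ratings: one array per service level.
customers = {"ideal": rng.uniform(6, 7, 50),
             "acceptable": rng.uniform(5, 6, 50),
             "actual": rng.uniform(4.5, 6, 50)}
tsg = {"ideal": rng.uniform(6, 7, 9),        # only 9 supplier responses
       "acceptable": rng.uniform(5.5, 6.5, 9),
       "actual": rng.uniform(4, 5, 9)}

# Gaps 1-3: between-group comparisons at each level.
for gap, level in zip((1, 2, 3), ("ideal", "acceptable", "actual")):
    t, p = stats.ttest_ind(customers[level], tsg[level], equal_var=False)
    print(f"Gap {gap} ({level}): t={t:.2f}, p={p:.3f}")

# Gaps 4-7: within-group comparisons between levels.
within = {4: (tsg, "ideal", "acceptable"),
          5: (customers, "acceptable", "actual"),
          6: (customers, "ideal", "acceptable"),
          7: (tsg, "acceptable", "actual")}
for gap, (group, hi, lo) in within.items():
    t, p = stats.ttest_rel(group[hi], group[lo])
    print(f"Gap {gap} ({hi} vs {lo}): t={t:.2f}, p={p:.3f}")
```

With only 9 supplier responses, as in the study, the between-group tests have little power, which is one reason the statistical analysis was described as rather weak.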

CONCLUSION

In this paper, a conceptual model of IT service quality was developed, and based on this conceptual model, a modified SERVQUAL measurement system was introduced. The new measurement system was then applied to an IT department to measure its service performance. The measurement system clearly identified two IT service dimension factors, compared to the four factors in Kettinger et al. (1997) and in the original SERVQUAL model (minus the tangibility dimension). These results may indicate that SERVQUAL, when applied to the IT service setting, may yield a smaller number, and different types, of service dimensions.

The performance of an IT department was measured using the 7 gaps identified in the conceptual model. This measurement system allowed the comparison of the customers' and suppliers' perceptions at the three levels of IT service quality.

The study, however, had several limitations. Firstly, only 9 IT suppliers participated in the study, making the statistical analysis rather weak, although it must be pointed out that it is not atypical for an IT department to have fewer than a dozen staff. Secondly, the dimensionality of IT service must be considered in more detail to validate the 2 dimensions identified in the current study. Indeed, further studies need to be undertaken in other IT service settings before the usefulness of the conceptual model and its measurement system can be confirmed.

REFERENCES

Axson, D.A.J. (1996), It Doesn't Have To Be Spend and Hope, Financial Executive, Sep/Oct, pp. 18-24.
Brown, T.J., Churchill, Jr., G.A. & Peter, J.P. (1993), Improving the Measurement of Service Quality, Journal of Retailing, Vol. 69, No. 1, Spring, pp. 127-139.
Cronin, Jr., J.J. & Taylor, S.A. (1994), SERVPERF Versus SERVQUAL: Reconciling Performance-Based and Perceptions-Minus-Expectations Measurement of Service Quality, Journal of Marketing, Vol. 58, January, pp. 125-131.
Davis, T.R.V.
(1992), Satisfying Internal Customers: The Link to External Customer Satisfaction, Planning Review, Vol. 20, No. 1, pp. 34-37.
Eccles, R.G. (1991), The Performance Measurement Manifesto, Harvard Business Review, Jan-Feb, pp. 131-137.
Fisher, J. (1992), Use of Non-financial Performance Measures, Journal of Cost Management, Spring, pp. 31-38.
Galletta, D.F. & Lederer, A.L. (1989), Some Cautions on the Measurement of User Information Satisfaction, Decision Sciences, Vol. 20, No. 3, pp. 419-438.
Kettinger, W.J. & Lee, C.C. (1994), Perceived Service Quality and User Satisfaction with the Information Services Function, Decision Sciences, (25, 5/6), pp. 737-766.
---------- (1995), Global Measures of Information Service Quality: A Cross-National Study, Decision Sciences, (26, 5), pp. 569-587.
---------- (1997), Pragmatic Perspectives on the Measurement of Information Systems Service Quality, MIS Quarterly, June, (21, 2), pp. 223-240.
McKeen, J.D. & Smith, H.A. (1993), The Relationship between Information Technology Use and Organisational Performance, ed. Banker, R.D., Kauffman, R.J. & Mahmood,

M.A., Strategic Information Technology Management: Perspectives on Organisational Growth and Competitive Advantage, pp. 405-443.
Parasuraman, A., Zeithaml, V.A. & Berry, L.L. (1985), A Conceptual Model of Service Quality and Its Implications for Future Research, Journal of Marketing, Vol. 49, Fall, pp. 41-50.
--------- (1988), SERVQUAL: A Multiple-Item Scale for Measuring Consumer Perceptions of Service Quality, Journal of Retailing, Vol. 64, Spring, pp. 12-40.
Parasuraman, A., Berry, L.L. & Zeithaml, V.A. (1991a), Understanding Customer Expectations of Service, Sloan Management Review, 39, Spring, pp.
--------- (1991b), Perceived Service Quality as a Customer-Based Performance Measure: An Empirical Examination of Organisational Barriers Using an Extended Service Quality Model, Human Resource Management, Vol. 30, No. 3, Fall, pp. 335-364.
--------- (1991c), Refinement and Reassessment of the SERVQUAL Scale, Journal of Retailing, Vol. 67, Winter, pp. 420-450.
--------- (1993), More on Improving Service Quality Measurement, Journal of Retailing, Vol. 69, No. 1, Spring, pp. 140-147.
--------- (1994), Alternative Scales for Measuring Service Quality: A Comparative Assessment Based on Psychometric and Diagnostic Criteria, Journal of Retailing, Vol. 70, No. 3, pp. 201-230.
Pitt, L.F., Watson, R.T. & Kavan, C.B. (1995), Service Quality: A Measure of Information Systems Effectiveness, MIS Quarterly, June, (19, 2), pp. 173-185.
--------- (1997), Measuring Information Systems Service Quality: Concerns for a Complete Canvas, MIS Quarterly, June, (21, 2), pp. 209-221.
Strassmann, P.A. (1997), The Squandered Computer, The Information Economics Press, Connecticut.
Van Dyke, T.P., Kappelman, L.A. & Prybutok, V.R. (1997), Measuring Information Systems Service Quality: Concerns on the Use of the SERVQUAL Questionnaire, MIS Quarterly, June, (21, 2), pp. 195-201.
Watson, R.T., Pitt, L.F., Cunningham, C. & Nel, D.
(1993), User Satisfaction and Service Quality of the IS Department, Journal of Information Technology, Vol. 8, pp. 257-265.

ACKNOWLEDGEMENTS

We would like to acknowledge and express our deepest gratitude to the personnel of the TSG for their cooperation, and to Professor Marcus O'Connor for his invaluable comments and assistance with statistics.

APPENDIX 1

Service Dimension Factor 1: Personal Attributes

When it comes to...
Q3 TSG personnel showing a sincere interest in solving your problems
Q4 TSG personnel keeping their appointments
Q8 TSG personnel's willingness to help you
Q9 The trustworthiness of TSG personnel

Q10 The courtesy of TSG personnel
Q11 The level of expertise of TSG personnel
Q12 The availability of services during business hours
Q13 The availability of services after business hours (5:00-9:00 pm weekdays)
Q14 Receiving person-to-person individual attention from TSG personnel
Q15 TSG personnel having your best interests at heart
Q16 TSG personnel understanding your specific requests

Service Dimension Factor 2: IT Service Attributes

When it comes to...
Q1 Receiving requested services within a reasonable timeframe
Q2 Receiving requested services right the first time
Q5 Being informed about exactly when the request can be completed
Q6 Being informed regularly about the status of your requests
Q7 Receiving prompt services without delays

APPENDIX 2

Factor Loading Matrices (loadings of .3 or greater)*

SERVQUAL QUESTIONS USED IN THE STUDY

          ACTUAL LEVEL OF IT SERVICE    DIFFERENCE SCORES (IDEAL relative to ACTUAL)
          Factor 1      Factor 2        Factor 1      Factor 2
Q1          .36           .75             .40           .71
Q2          .37           .64             .36           .66
Q3          .66           .39             .61           .39
Q4          .58           .47             .54           .46
Q5          --            .86             --            .88
Q6          --            .84             --            .88
Q7          .35           .83             .46           .67
Q8          .85           --              .81           --
Q9          .81           --              .76           --
Q10         .67           .35             .70           --
Q11         .66           --              .46           --
Q12         .63           .39             .70           --
Q14         .68           --              .74           --
Q15         .82           --              .85           --
Q16         .53           --              .51           --

* Question 13 was omitted from the analysis since less than half of the responses were applicable.

COPYRIGHT

Helen Kang © 1999. The authors assign to ACIS and educational and non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction, provided that the article is used in full and this copyright statement is reproduced. The authors also grant a non-exclusive licence to ACIS to publish this document in full in the Conference Papers and Proceedings. Those documents may be published on the World Wide Web, on CD-ROM, in printed form, and on mirror sites on the World Wide Web. Any other usage is prohibited without the express permission of the authors.
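Loading matrices such as the one in Appendix 2 are typically obtained by extracting principal components and applying a varimax rotation. The following is a minimal sketch using synthetic two-factor data; the item structure, sample size and noise level are hypothetical and do not reproduce the study's responses.

```python
# Sketch: principal components extraction plus varimax rotation,
# the standard route to a rotated factor loading matrix.
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Varimax rotation of a loading matrix (Kaiser's criterion)."""
    L = loadings.copy()
    n, k = L.shape
    R = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        basis = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (basis ** 3
                   - basis @ np.diag((basis ** 2).sum(axis=0)) / n))
        R = u @ vt
        new_var = s.sum()
        if new_var < var * (1 + tol):   # converged
            break
        var = new_var
    return L @ R

# Synthetic responses: two latent dimensions driving 16 items
# (10 items on factor 1, 6 on factor 2, plus noise).
rng = np.random.default_rng(1)
f = rng.normal(size=(200, 2))
W = np.vstack([np.repeat([[1, 0]], 10, axis=0),
               np.repeat([[0, 1]], 6, axis=0)])
X = f @ W.T + 0.5 * rng.normal(size=(200, 16))

# Loadings of the first two principal components of the correlation matrix.
corr = np.corrcoef(X, rowvar=False)
eigval, eigvec = np.linalg.eigh(corr)
order = np.argsort(eigval)[::-1][:2]
loadings = eigvec[:, order] * np.sqrt(eigval[order])
rotated = varimax(loadings)
print(np.round(rotated, 2))
```

When reporting such a matrix, loadings below .3 are commonly suppressed, which is the convention followed in Appendix 2.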