EVALUATING LIBRARY SERVICE QUALITY: USE OF LibQUAL+

Julia C. Blixrud
Association of Research Libraries



Academic and research libraries are currently engaged in efforts to define new metrics that better describe their service activities. Increased pressure from funding authorities and accreditation agencies, and greater demands from the users of their services, have encouraged academic and research institutions, and thus their libraries, to move toward outcome-based assessment instead of relying merely on input, output, or resource metrics. Outcome measures show how well an organization serves its users; they demonstrate an institution's efficiency and effectiveness.

One promising approach being tested at various libraries in the United States and Canada is LibQUAL+, an emerging standardized measure of library service quality across institutional library contexts. It is adapted from an instrument called SERVQUAL (for SERVice QUALity), which is grounded in the "Gap Theory of Service Quality" developed by the marketing research team of A. Parasuraman, V.A. Zeithaml, and L.L. Berry.[i] The tool allows a web-based method of administration and analysis that eases the burden of local administration, creating a scalable and replicable protocol. It also makes readily available a large body of normative data on user perceptions and expectations of library service quality.

LibQUAL+ was initially developed as a self-financed pilot project by interested members of the Association of Research Libraries (ARL) in collaboration with the Texas A&M University Libraries (TAMU) and subsequently received substantial funding from the U.S. Department of Education's Fund for the Improvement of Postsecondary Education (FIPSE). The goals of the three-year research and development project include: a) establishing a library service quality assessment program at ARL; b) developing web-based tools for assessing library service quality; c) developing mechanisms and protocols for evaluating libraries; and d) identifying best practices in providing library service.

Project Development

To address ARL member interest in defining metrics that better describe libraries' contributions to their institutions, two ARL committees jointly launched a New Measures Initiative in 1999 to develop alternatives to expenditure metrics as measures of library performance. Members of the Statistics and Measurement Committee and the Research Library Leadership and Management Committee had identified several areas in which new measures would be particularly helpful. While ARL's descriptive statistics had served useful purposes for many years, the input- or expenditure-based statistics provided no information about service quality; the data only recorded the resource allocations among member libraries. A focus on expenditures did not necessarily meet the new demands for accountability and evaluation, and members were encouraged to come forward with suggestions for projects that would address this new measures agenda.

To begin to address the interest in service quality as one area in which new measures were needed, Texas A&M University Libraries offered their experience with the SERVQUAL instrument to the ARL community. They had a six-year history of regrounding the instrument for library purposes. The professors who had originally developed SERVQUAL were from Texas A&M University, and other current TAMU faculty had the interest and expertise in qualitative methods necessary to ensure a reliable and valid instrument. TAMU also had a telecommunications infrastructure that could support the administration of a national web-based survey instrument.

At the October 1999 ARL meeting, institutions were asked to volunteer to participate in a pilot project that would test a regrounded SERVQUAL instrument. Thirty institutions expressed interest, and a diverse group of twelve was selected. Costs for the project were borne primarily by Texas A&M University, with each of the pilot libraries underwriting $2,000 of the costs for deliverables. An ambitious timeline was set: the institutions were selected in Fall 1999, a planning meeting with participants was held in January 2000 at the ALA Midwinter meeting, the regrounding of the instrument was completed in the winter, the surveys were conducted in April 2000, and the results were made available to the participants in July 2000 at the ALA Annual conference.

The regrounding of the instrument was conducted as a qualitative process through a series of interviews with library user representatives (e.g., faculty, graduate students, undergraduates) at the participating pilot institutions. The Cognition and Information Technologies Laboratory (CITL) at Texas A&M University assisted with survey design and worked with campus liaisons to develop a customized front-end web page. In addition, the hardware and software required for survey administration, data capture, and analysis were acquired. Responsibilities of the 12 pilot institutions included drawing random samples of email addresses from faculty, graduate student, and undergraduate user groups; seeking approval of the administration of the survey instrument by human subjects review boards; and preparing their own user communities for the survey through public relations notices. The survey was administered during the spring, and each campus chose a time that worked best with its campus calendar. In early June, all the data had been captured and automatically loaded into SPSS for analysis. About 5,000 responses were received from the twelve campuses.

2001 Survey Administration

The experience gained from the first year's pilot enabled ARL and TAMU to prepare a funding proposal to the U.S. Department of Education's Fund for the Improvement of Postsecondary Education (FIPSE) to extend the project to a larger and more diverse set of institutions. ARL was awarded $498,368 by FIPSE in September 2000 for the project, "Service Effectiveness in Academic Research Libraries." The project, now named LibQUAL+, would redefine survey questions, dimensions, and data gathering processes to develop a service that ARL and other academic libraries could use to determine their own service effectiveness. Members of the Big 12 Plus Libraries Consortium (BTP) unanimously endorsed the project in October, and most of its 30 members decided to participate in the Spring 2001 survey. Also in October, ARL hosted a symposium, New Culture of Assessment in Academic Libraries: Measuring Service Quality, where a description of the project and results from the Spring 2000 survey were presented. A call for participation from other libraries was issued at that time; forty-three institutions were interested.

Representatives from the libraries participating in the spring 2001 LibQUAL+ project activities met in January 2001 in Washington, D.C., during ALA Midwinter. The meeting provided the LibQUAL+ team with an opportunity to update them on the timeline and procedures for the coming months. Logistical and technical issues were discussed, and the meeting gave participants an opportunity for in-person discussion with one another and with the LibQUAL+ team.[ii] Some of the participants had been part of the 2000 survey and were able to share their local experiences. As with the first year, participant costs included a $2,000 fee, plus any internal costs to obtain the necessary email addresses, conduct promotional activities to encourage responses, and cover liaison staff time.

Over 20,000 individuals from 43 universities in the United States and Canada completed the Spring 2001 survey. Individual and aggregate analyses were conducted on the data, and results were given to the participating institutions at the summer ALA conference. Suggestions for improving the administration of the survey were made to the LibQUAL+ team, including requests for more detail on campus procedures, suggestions for changes to the instrument design, and comments on question construction.

2002 Survey Administration

A call for participation in the 2002 survey was issued in Summer 2001. In addition to individual institutional responses (including several institutions that had participated in previous years), two consortia decided to participate. Over fifty members of OhioLINK, a consortium of academic libraries from 78 Ohio universities, colleges, and community colleges and the State Library of Ohio, and more than 40 members of the Association of Academic Health Sciences Libraries (AAHSL) decided to participate as distinct groups. Both consortia added questions exclusive to their communities to provide an opportunity to gather data to benchmark common services.

The process for administering the survey followed the same schedule as in previous years. Campus liaisons first met as a group at ALA Midwinter, where additional training was provided at a two-day workshop; presentations by Parasuraman and members of the LibQUAL+ team focused on the gap theory behind the instrument. Participants were introduced to the qualitative theory, methods, and results that had informed the development of the LibQUAL+ instrument. The LibQUAL+ team described the project deliverables, and participants heard from institutions that had previously conducted a survey and learned what they had done with the results. In addition, a participants' manual had been prepared to provide detailed information on the project timeline and the steps to be taken by campus liaisons. The manual included information on obtaining campus human subjects review approval, defining and gathering email addresses for the sample population, the survey instrument, technical assistance, project deliverables, and dissemination. It also included sample forms, messages, and public relations communications from previous participants.

The survey instrument was opened for use in early March and closed at the end of May, with responses from 78,000 individuals at 164 institutions. Individual and aggregate analyses of data for many of the institutions will be distributed at the ALA meeting in June. The OhioLINK and AAHSL analyses will be conducted separately over the summer.

Survey Instrument and Results

The SERVQUAL survey instrument was selected as the basis for development because of its long history and the experience with it in academic research libraries.[iii] As developed by the marketing research group of Parasuraman, Zeithaml, and Berry for the for-profit sector, the SERVQUAL instrument measures service quality across five dimensions:

- Reliability: the ability to perform the promised service dependably and accurately;
- Assurance: the knowledge and courtesy of employees and their ability to convey trust and confidence;
- Empathy: the caring, individualized attention the firm provides to its customers;
- Responsiveness: the willingness to help customers and provide prompt service;
- Tangibles: the appearance of physical facilities, equipment, personnel, and communications materials.

The original instrument asks twenty-two questions across the five dimensions. For each question, customers are asked for their impressions of service quality in terms of the minimum service level they will accept, the service level they desire, and the performance they perceive. Gap scores are calculated for each question between the perceived and minimum levels and between the perceived and desired levels. The zone of tolerance is the difference between the minimum and desired scores. Optimally, perceived performance should fall within that zone; scores that fall outside the zone (particularly below it) should raise warnings for managers.
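To make the gap arithmetic concrete, the short sketch below computes the two gap scores and the zone-of-tolerance check for a single question. It is a minimal illustration of the calculation as described above, not code from the project; the function and variable names are invented, and the sample ratings use the instrument's 1-to-9 scale.

```python
# Minimal sketch of the SERVQUAL/LibQUAL+ gap-score arithmetic for one
# question. A respondent rates minimum, desired, and perceived service
# on the 1-9 scale; all names and sample values here are illustrative.

def gap_scores(minimum: float, desired: float, perceived: float) -> dict:
    """Return the two gap scores and the zone-of-tolerance check."""
    adequacy_gap = perceived - minimum      # negative: below the minimum level
    superiority_gap = perceived - desired   # positive: exceeds the desired level
    within_zone = minimum <= perceived <= desired
    return {
        "adequacy_gap": adequacy_gap,
        "superiority_gap": superiority_gap,
        "zone_of_tolerance": (minimum, desired),
        "within_zone": within_zone,
    }

# Example: minimum 4, desired 8, perceived 3. Perceived performance falls
# below the zone of tolerance, the case that should warn managers.
print(gap_scores(minimum=4, desired=8, perceived=3))
```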

Building on the SERVQUAL model, Texas A&M had found through its assessments in 1995, 1997, and 1999 that SERVQUAL isolated three library service dimensions:

- tangibles: the appearance of physical facilities, equipment, personnel, and communication materials;
- reliability: the ability to perform the promised service dependably and accurately;
- affect of library service, which combines the more subjective aspects of library service, such as responsiveness, assurance, and empathy.[iv]

The qualitative work to reground the instrument resulted in the addition of questions to the survey, so that the Spring 2000 project tested five dimensions through 41 items:

- Affect of service
- Reliability
- Library as place
- Provision of physical collections
- Access to information

The survey was administered by sending a message to a random set of email addresses that included an invitation to participate in the survey and links to a survey URL. Each institution's survey was custom designed so that the user sees an institutional logo. Questions asked respondents to indicate their minimum and desired levels of library service and their perceptions of their library's service on a scale of 1 to 9. [see Figure 1] As respondents answered the questions, data were collected on a server at TAMU. Data analysis was conducted using a hierarchical model of factor analysis. Results indicated that issues of personal control and navigation warranted further investigation. Additional questions were developed, and the Spring 2001 survey was expanded to address five dimensions through 56 items:

- Affect of service
- Library as place
- Reliability
- Self-reliance
- Access to information

The survey was conducted through a similar web-based process, and the results and analysis of 2001 led to the hypothesis (being tested in 2002 through a 25-item survey) that the dimensions of service that make up a user's perception of service quality include:

- Service affect: responsiveness, assurance, empathy, and reliability, the human dimensions of library service;
- Library as place: the campus center of intellectual life, though this may not be a concern if the physical facilities are adequate;
- Personal control: the ability to navigate both the information universe in general and the web in particular;
- Information access: ubiquity of access, meaning information delivered in the format, location, and time of choice, and comprehensive collections.

Both the OhioLINK and AAHSL library groups will have their additional questions analyzed in the context of service quality within their peer groups.
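The analysis step is described above only at a high level. As a rough illustration of how item-level ratings can be reduced to a handful of service-quality dimensions, the sketch below fits an ordinary (non-hierarchical) factor model to simulated 1-to-9 responses. The actual LibQUAL+ analysis used a hierarchical factor model; everything here except the 25-item, four-dimension counts from the 2002 survey is invented.

```python
# Simplified, non-hierarchical illustration of reducing survey items to
# latent service-quality dimensions. All response data are simulated.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items, n_dimensions = 500, 25, 4  # 2002: 25 items, 4 dimensions

# Simulate ratings: a few latent dimensions drive the items, plus noise,
# squeezed onto the instrument's 1-9 scale.
latent = rng.normal(size=(n_respondents, n_dimensions))
loadings = rng.normal(size=(n_dimensions, n_items))
noise = rng.normal(size=(n_respondents, n_items))
responses = np.clip(np.round(5 + latent @ loadings + noise), 1, 9)

# Fit the factor model and inspect the loadings; items that load heavily
# on the same factor would be interpreted together, e.g. as "service
# affect" or "personal control".
fa = FactorAnalysis(n_components=n_dimensions, random_state=0)
fa.fit(responses)
print(np.round(fa.components_, 2))
```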

Institutions participating in the LibQUAL+ project receive custom radar graphs representing each major constituency group, along with aggregate information [see Figure 2] against which they can compare their results. In addition, a binder with customized summaries is provided, including statistics for all variables that compare summary institutional data to peer-group averages and medians. The reports provide the library with information on gap scores. Furthermore, because there were enough responses from the 2001 survey, it was possible to create score norms tables. Norms tables allow conversion of observed scores into derived scores and are used to generate both generic and specialized tables.
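As a rough illustration of what a norms table does, the sketch below converts an observed institutional score into one possible derived score, a percentile rank within a norm group. This is an invented example of the general idea, not the project's actual norming procedure; all names and values are hypothetical.

```python
# Hedged sketch of the idea behind a norms table: converting an observed
# score into a derived score (here, a percentile rank) relative to the
# scores of other participating institutions. All values are invented.
from bisect import bisect_right

def percentile_rank(observed: float, norm_scores: list[float]) -> float:
    """Percent of norm-group scores at or below the observed score."""
    ranked = sorted(norm_scores)
    return 100.0 * bisect_right(ranked, observed) / len(ranked)

# Invented peer-group mean scores on the 1-9 scale for one dimension.
peer_means = [5.9, 6.1, 6.3, 6.4, 6.6, 6.7, 6.9, 7.0, 7.2, 7.4]

# An institution scoring 6.8 would learn it sits at the 60th percentile
# of this (hypothetical) norm group.
print(percentile_rank(6.8, peer_means))
```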

From project data, institutions participating in LibQUAL+ can identify which dimensions, and which specific services, need improvement according to their users. They can also compare their service quality with that of peer institutions in an effort to develop benchmarks and an understanding of best practices.

A substantial body of literature is being developed from the LibQUAL+ project, and a regularly updated bibliography is available at <http://www.coe.tamu.edu/%7ebthompson/servqbib.htm>. This literature discusses such things as the quantitative and qualitative analyses for the project, administering a web-based survey, representativeness vs. responsiveness, score reliability, and response rates. Many more documents are expected as the spring 2002 and 2003 data are analyzed. Of particular interest will be reports from institutions that have been participating in the project since its inception; they will provide examples of the use of longitudinal data.

Future Plans

The LibQUAL+ project will continue one more year with FIPSE funding and is then expected to become a self-supporting ARL program. As part of the 2003 survey administration, the instrument will be translated into French for two French-speaking schools in Canada. Institutions interested in participating in the 2003 survey are welcome to contact ARL. The instrument to be used in 2003 will likely be the same one used in 2002, and the procedures and timeline will follow the same patterns as in previous years. Campus liaisons will be encouraged to participate actively in training sessions and to share their experience in using the results to benchmark their performance against peer institutions. With over 164 participating institutions in 2002, there will be many opportunities for institutions to learn best practices from each other.

In addition to the funding from FIPSE, ARL and TAMU have also received funding from the National Science Foundation to adapt the LibQUAL+ instrument for use in the Science, Math, Engineering and Technology Education Digital Library community. Goals for this three-year grant include: a) defining the dimensions of digital library service quality from the users' perspectives; b) developing a tool for measuring user perceptions and expectations of digital library service quality across NSDL digital library contexts; and c) identifying digital library best practices that permit generalizations across operations and development platforms. This project will begin in late 2002 with its own qualitative development effort.

Further information about the LibQUAL+ project can be found at <http://www.libqual.org>.

Julia C. Blixrud, Association of Research Libraries

[i] A. Parasuraman, V.A. Zeithaml, and L.L. Berry, "A conceptual model of service quality and its implications for future research," Journal of Marketing 49 (1985): 41-50.

[ii] Members of the LibQUAL+ team from ARL include: Duane Webster, Executive Director; Martha Kyrillidou, Senior Program Officer for Statistics and Measurement; Julia Blixrud, Director of Information Services; Dru Mogge, Program Officer for Internet Services; Jonathan Sousa, Technical Applications Development Manager for New Measures; Consuella Askew Waller, LibQUAL+ Program Specialist; Amy Hoseth, New Measures Projects Assistant. From Texas A&M University: Fred Heath, Director, Library Services; Colleen Cook, Executive Associate Dean, Texas A&M University Libraries; Bruce Thompson, Professor and Distinguished Research Scholar, Department of Educational Psychology; Yvonna Lincoln, Professor and Program Director of Higher Education, Educational Administration Department.

[iii] See, for example, Vicki Coleman, Yi (Daniel) Xiao, Linda Bair, and Bill Chollett, "Toward a TQM Paradigm: Using SERVQUAL to Measure Library Service Quality," College & Research Libraries 58 (May 1997): 237-251; Susan Edwards and Mairead Browne, "Quality in Information Services: Do Users and Librarians Differ in Their Expectations?" Library & Information Science Research 17 (Spring 1995): 163-182; Danuta A. Nitecki, "An Assessment of the Applicability of SERVQUAL Dimensions as a Customer-based Criteria for Evaluating Quality of Services in an Academic Library" (Ph.D. dissertation, University of Maryland, 1995).

[iv] C. Cook, V. Coleman, and F. Heath, "SERVQUAL: A client-based approach to developing performance indicators," in Proceedings of the 3rd Northumbria International Conference on Performance Measurement in Libraries and Information Services, 27-31 August 1999 (Newcastle upon Tyne: Information North, 2000): 211-218.