Assessing informatics students' satisfaction with a web-based courseware system



International Journal of Medical Informatics (2004) 73, 181-187
doi:10.1016/j.ijmedinf.2003.12.006
© 2003 Elsevier Ireland Ltd. All rights reserved.

Assessing informatics students' satisfaction with a web-based courseware system

Todd R. Johnson*, Jiajie Zhang, Zhihua Tang, Constance Johnson, James P. Turley
School of Health Information Sciences, University of Texas Health Science Center at Houston, 7000 Fannin, Suite 600, Houston, TX 77030, USA
*Corresponding author. Tel.: +1-713-500-3921; fax: +1-713-500-3929. E-mail address: todd.r.johnson@uth.tmc.edu (T.R. Johnson).

KEYWORDS: Medical informatics; User-computer interface; World wide web; Educational technology

Summary. This study assessed health informatics students' satisfaction with two successive versions of Prometheus, a web-based courseware system. Versions 4 and 5 were assessed to gauge the effect of modifications made to improve the usability of the system. The Questionnaire for User Interaction Satisfaction (QUIS, version 7.0) was administered at the end of the fall 2001 semester (in which Prometheus version 4 was used) and again at the end of spring 2002 (in which version 5 was used). QUIS measures user satisfaction with the overall system and with 11 specific dimensions, including screen, terminology and system information, learning, system capabilities, manuals and online help, multimedia, and teleconferencing. In general, students had favorable judgments of Prometheus, and their satisfaction level remained relatively stable across the two versions: the usability enhancements incorporated into version 5 produced no significant differences in student satisfaction ratings. The results of this study provide a benchmark for comparing the relative usability of alternative courseware systems and demonstrate the utility of user satisfaction surveys for assessing and improving courseware systems. With the growing need for compliance education and for delivering just-in-time knowledge to clinicians, courseware systems will become an integral part of clinical information systems, increasing the importance of usability studies such as this one.

1. Introduction

With the growth in popularity of the world wide web, educators have been quick to offer their courses over the web, and various companies and other groups have begun to offer environments (often called courseware or learning management systems) for developing and delivering web-based courses. Many research studies, including those done in the context of healthcare and health informatics education, have focused on the effectiveness of web-based education and on the general satisfaction of students [1-5]. These studies generally support the conclusion that distance-learning effectiveness and student satisfaction are equivalent to those of traditional face-to-face learning environments. However, very little research has systematically analyzed the usability needs of the instructors and students who use web-based courseware systems. As a result, many, if not all, of the existing courseware environments are difficult or clumsy to use [6,7]. This creates a problem for educators and learners, who must struggle with the technology instead of focusing on the course content and objectives.

Some research even suggests that poor usability negatively affects learning [8]. As educational courseware is used to bring new knowledge to practicing clinicians, the usability of these systems will matter even more in a stressful clinical environment than it does in a purely educational one.

To improve the usability of courseware systems, we must begin to apply formal methods from the field of human factors engineering [9,10]. Such methods provide a basis for comparing alternative systems, for gauging the effects of changes to a courseware system, and for helping developers decide where to focus their efforts to improve usability.

Health informatics education is a particularly challenging environment for courseware systems, especially in the highly interdisciplinary, group-based, student-driven program at the School of Health Information Sciences (SHIS) [11]. First, many informatics students are technically sophisticated and expect a great deal of power and flexibility in a courseware system. Second, interdisciplinary team projects are often the primary means of assessing student performance in SHIS classes, placing extra demands on a courseware system to support collaborative work. Third, because students often provide significant amounts of course material (for example, for student presentations in class), they must often supply session content, a function typically performed by the instructor. Finally, the diverse professional backgrounds of SHIS students (many already hold advanced biomedical degrees) result in unique educational needs that sometimes necessitate changes to a course as it is being taught in order to meet its educational objectives. For example, it is often necessary to change the class schedule to allow more sessions for a certain topic, or to add or delete background topics to meet student needs. Such changes place extra demands on courseware systems in terms of functionality for supporting change and for communicating changes to students.

The study reported in this paper has two objectives. The first is to assess the usability of one courseware system, Prometheus, through a detailed user satisfaction survey of health informatics students. The survey examines student satisfaction with the major components of the system's interface. The second objective is to determine whether changes made to improve the usability of Prometheus had a measurable effect on students' assessments of its usability. The results reported here also provide a benchmark against which courseware systems in use at other informatics programs can be compared.

Prometheus is an open-source, web-based course development and teaching environment originally developed at George Washington University and now owned by Blackboard Inc. It is used by over 50 universities throughout the USA. The environment supports online syllabi, course outlines, multimedia lecture material, online quizzes and exams, a grade book, electronic discussion lists, whiteboards, chat, and file sharing. Prometheus supports online course development using a form-driven interface, a web-based HTML editor, and file uploading and linking for more complex content, such as PowerPoint slides and Word files. Convery, Nuttall, and Bodenheimer recently compared Prometheus to Theo (their experimental courseware interface) in a small-scale user test in which twelve subjects were asked to accomplish three tasks on both systems [12].
Subjects generally found Theo easier to use; however, that study did not assess Prometheus in actual use among a wide range of students, nor did it systematically assess the interface components.

2. Design

We administered a user interface satisfaction survey as part of a comprehensive usability evaluation of the Prometheus courseware system's user interface. In December 2000, three of the authors first conducted a heuristic evaluation [13] of the student interface to the system (this evaluation was funded through a contract from George Washington University to the University of Texas Health Science Center at Houston). In heuristic evaluation, three or more usability experts independently review an interface for violations of a small set of well-known usability heuristics. The evaluators then independently rate the severity of the violations, resulting in an ordered list of potential usability problems. In July 2001, we performed a small-scale user test [9] of the courseware. Although both approaches revealed problems with the Prometheus (version 4) interface design, they focused on a limited number of system functions and drew on the experience of a relatively small number of users and experts. To evaluate Prometheus usability in a more ecologically valid context, in fall 2001 we administered the Questionnaire for User Interaction Satisfaction (QUIS) to all students then enrolled in health informatics classes at SHIS, after they had been using the system for an extended period of time. The Prometheus developers at George Washington University used the results from the heuristic evaluation and the user tests when they revised Prometheus version 4.
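The prioritization step of such a heuristic evaluation can be summarized in a few lines of code. The sketch below is only an illustration of the general procedure described above, not tooling from this study; the problem descriptions and severity scores are hypothetical.

```python
from statistics import mean

# Hypothetical severity ratings (0 = not a problem ... 4 = usability catastrophe),
# given independently by three evaluators for each heuristic violation found.
ratings = {
    "No feedback after a file upload":          [3, 4, 3],
    "Inconsistent terminology across sections": [2, 3, 2],
    "Course outline difficult to navigate":     [3, 2, 4],
}

# Average the independent ratings and sort in descending order to obtain the
# ordered list of potential usability problems.
prioritized = sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True)

for problem, scores in prioritized:
    print(f"{mean(scores):.1f}  {problem}")
```

Sorting by mean severity is one common convention; aggregating by median or by per-evaluator rank would serve the same purpose.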

In early spring 2002, the newly improved Prometheus system (version 5) was installed and used throughout the semester. At the end of the semester, we again surveyed the students with the same questionnaire on their satisfaction with the system. In addition, we conducted a heuristic evaluation and user testing of the new student interface. The results from the second round of usability assessment were compared with those from the first to determine the effect of the revision. In this paper, we focus on the two surveys that measured users' subjective satisfaction with the two versions of the Prometheus system.

Students used Prometheus to augment the instructor-led courses in which they were enrolled. Each SHIS course had a section in Prometheus, accessible by the instructor and all enrolled students. Each Prometheus course included, at a minimum, a syllabus, a course outline, a file-sharing section, and discussion boards. The course outline contained a list of sessions (one for each day on which the course met) along with the topics for that date. Each topic contained related information, such as readings, assignments, presentations, objectives, files, and messages. The files section allowed students and the instructor to post and share files and included a group feature that allowed groups of students to share files among themselves. The messages section allowed users to send e-mail to one or more of the course participants. In addition to these sections, some courses made use of Prometheus chat and the gradebook. Although Prometheus provides online quizzes and tests, few SHIS courses used this feature, because most were based on group projects and homework assignments. Student training in the use of Prometheus was done informally through the class instructors, other students, and online help documents.

3. Methodology

3.1. Participants

Participant demographics for both surveys are summarized in Table 1.

Table 1. Respondent demographics for surveys 1 and 2

                                   Survey 1   Survey 2
N                                  45         36
Males                              23         14
Females                            22         22
Median age (years)                 30.5       32
Internet usage >3 years (%)        87         89
Prometheus usage 1-6 months (%)    56         22
Prometheus usage >1 year (%)       29         49

Participants in the first survey were recruited from the pool of students who were officially enrolled in SHIS classes in the fall semester of 2001 and had used Prometheus (version 4) at least once during that semester. E-mail messages were sent to 106 students at the e-mail addresses they had supplied within Prometheus. Four of these messages were undeliverable. One individual replied that he or she had not actually used Prometheus, reducing the subject population to 105. Forty-five respondents (23 males and 22 females) provided valid responses, a 42.8% response rate. These participants' ages ranged from 26 to 57, with a median of 30.5. Participants differed considerably in their general computer experience: 55% of them had used one or two operating systems, while the other 45% had worked with at least three different operating systems. However, the majority (87%) of participants were experienced Internet users and had been using the web for at least 3 years. Participants' experience with the Prometheus system also varied: 56% had used the system for between 1 and 6 months, and 29% had more than 1 year of experience.
Participants in the second survey were recruited in the same way. Prometheus version 4 was replaced with version 5 just before the start of the spring semester of 2002. At the end of that semester, an e-mail message announcing the survey was sent to the 82 students who were officially enrolled in SHIS classes in the spring semester and had used Prometheus (by then upgraded to version 5) at least once during the semester. Three of these messages were undeliverable. Two individuals replied that they had not used the new version of Prometheus, reducing the subject population to 80. Valid responses were obtained from 36 students (14 males and 22 females), a 45% response rate. Their ages ranged from 22 to 63, with a median of 32. Participants in the second survey also differed in their general computer experience in terms of familiarity with different operating systems, but 89% reported that they had been using the Internet for at least 3 years. Compared with the first survey, participants in the second were more familiar with the Prometheus system: those with less than 6 months' experience had dropped to 22%, while the percentage with more than 1 year of experience had increased to 49%. Among the 36 participants, 20 had also taken part in the first survey.

3.2. Materials

Both surveys were administered with the same instrument, the Questionnaire for User Interaction Satisfaction (QUIS) version 7.0, developed and licensed by the Human-Computer Interaction Lab at the University of Maryland at College Park. Past research has shown that the QUIS provides acceptable reliability and validity in assessing users' subjective satisfaction with various types of computer interfaces [14-16]. The original questionnaire contains measures of user satisfaction with the overall system as well as nine specific dimensions. In the current study, the questionnaire was configured to include the overall satisfaction measure and seven interface factors pertinent to the Prometheus system: screen, terminology and system information, interface learnability, system capabilities, manuals and online help, multimedia, and teleconferencing. The questions in the QUIS are organized hierarchically, such that the measurement of each interface factor consists of questions that tap into different facets of that factor. Each facet, in turn, is measured with a set of sub-questions that allow for finer-grained analysis. Fig. 1 shows a sample QUIS question. Each question consists of a pair of semantic differentials; participants were asked to rate the system on a 9-point scale along this dimension, with 1 being the least favorable judgment and 9 the most favorable. Each question also offered a "not applicable" option in case the participant considered it irrelevant to the user interface or to his or her experience. Overall, the survey questionnaire consisted of 102 questions.

Fig. 1. Sample QUIS question.

Both surveys were conducted online. While a survey was under way, a webpage was set up and participants filled out the questionnaire from that site on their own schedule.

4. Results

Data analyses were based on those questions for which participants provided a valid rating score. Questions answered "not applicable" or simply skipped by an individual were excluded. Participants' responses in each survey were first averaged within each interface category. All participants' ratings in a single category were then averaged to reveal their consensus on that dimension. Finally, each participant's ratings across all eight interface categories were averaged to determine his or her overall rating of the Prometheus user interface.

In general, participants had favorable judgments of the user interface, and their satisfaction level remained relatively stable across the two versions. On a 9-point scale, the average rating for version 4 was 6.23 (S.D. = 1.43), and that for version 5 was 6.17 (S.D. = 1.36). The difference between the two (with a total of 81 respondents) was not significant, t(79) = 0.167, P = 0.87.

To compare the two versions of Prometheus, the profiles of participants' responses in the two surveys were plotted against each other, as shown in Fig. 2. For each interface factor, the average rating across all participants is displayed. Error bars surrounding the mean responses represent the standard error of the respective means. The solid line running across each survey's profile is the overall mean across all sections of that survey.
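To make the scoring procedure described at the start of this section concrete, the following sketch (in Python, with invented data; it is not the authors' analysis code) excludes N/A answers, averages valid ratings within each interface category, averages the category means into a per-respondent overall score, and compares two survey cohorts with an unpaired t-test, analogous to the version 4 versus version 5 comparison reported above.

```python
import numpy as np
from scipy import stats

CATEGORIES = ["overall", "screen", "terminology", "learning",
              "capabilities", "help", "multimedia", "teleconferencing"]

def overall_rating(responses):
    """responses: dict mapping category -> list of 1-9 ratings or None (N/A).
    Returns the mean of the per-category means, skipping N/A answers."""
    category_means = []
    for cat in CATEGORIES:
        valid = [r for r in responses.get(cat, []) if r is not None]
        if valid:                      # categories with no valid answers are skipped
            category_means.append(np.mean(valid))
    return np.mean(category_means)

# Hypothetical respondents for the two surveys (version 4 and version 5).
survey_v4 = [{"screen": [6, 7, None], "learning": [7, 6]},
             {"screen": [5, 5, 6],    "learning": [6, None]}]
survey_v5 = [{"screen": [6, 6, 6],    "learning": [7, 7]},
             {"screen": [4, 5, None], "learning": [6, 6]}]

v4_scores = [overall_rating(r) for r in survey_v4]
v5_scores = [overall_rating(r) for r in survey_v5]

# Unpaired t-test between the two cohorts of overall ratings.
t, p = stats.ttest_ind(v4_scores, v5_scores)
print(f"t = {t:.3f}, p = {p:.3f}")
```

With only two hypothetical respondents per cohort the test itself is meaningless; the structure of the pipeline, not the numbers, is the point.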
Note that the number of valid participant responses varied across the interface factors in both surveys. In Fig. 2, the numbers underneath the X-axis show the number of valid responses for each interface factor.

The profiles of the two surveys reveal the strengths and weaknesses of each version of the Prometheus user interface. With version 4, users were most satisfied with their ability to learn complex system tasks (average rating = 6.51) and least satisfied with the manuals and online help (average rating = 5.37). With version 5, participants gave the system's multimedia features the most favorable judgment (average rating = 6.87) but still rated manuals and online help least favorably (average rating = 5.48). Although the rating scores on individual dimensions fluctuated across the two versions, none of the differences reached statistical significance.

Because 20 students participated in both surveys, their responses were treated as matched pairs and submitted to further analysis. To control for the overall response trend in each survey, and thus to observe the relative strength of each interface factor across the two system versions, deviation scores were calculated by subtracting the overall mean from each participant's rating on each dimension.
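A minimal sketch of this deviation-score step, assuming "the overall mean" refers to the survey-wide mean rating (the data and that interpretation are ours, not the authors'):

```python
import numpy as np

# Hypothetical 1-9 ratings: rows are matched participants, columns are the
# eight interface categories of one survey.
ratings = np.array([
    [6.0, 6.5, 5.5, 7.0, 6.0, 5.0, 6.5, 6.0],
    [7.0, 6.0, 6.5, 6.5, 7.5, 5.5, 7.0, 6.0],
])

# Deviation score = rating minus the survey-wide overall mean, which removes
# the overall response trend of that survey from every dimension.
overall_mean = ratings.mean()
deviation_scores = ratings - overall_mean

print(deviation_scores.round(2))
```

If "the overall mean" instead denotes each participant's own mean, replacing `ratings.mean()` with `ratings.mean(axis=1, keepdims=True)` centers each participant's profile individually; either reading yields profiles that can be compared across the two surveys.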

Fig. 2. Comparison of user responses.

Fig. 3. Response profiles based on ratings from those who participated in both surveys.

Fig. 3 shows the response profiles based on the data from these 20 participants; the N= line underneath the X-axis again displays the number of valid responses for each interface factor. As can be seen in Fig. 3, while these participants' ratings on the first five dimensions remained relatively stable, their responses regarding manuals and online help, multimedia, and teleconferencing fluctuated considerably. However, none of these differences was significant.

5. Discussion

These results provide information about the relative strengths and weaknesses of different aspects of the Prometheus user interface. Dimensions that fall below the overall mean (the solid line in the figures) are good candidates for future improvement. Each of these dimensions represents the average responses to a number of sub-questions that assess various facets of the dimension, and analysis of the average responses to these questions provides more detailed information about the areas of the interface that are particularly weak or strong. For example, the terminology and system information dimension consists of two major subgroups of questions: one assessing the terminology used to communicate with the user, and the other assessing the quality of the instructions and feedback given to users. Prometheus scored above the overall mean on questions relating to terminology, meaning that users were happy with the terms used by Prometheus (e.g., file, class, lecture, upload); however, questions dealing with the quality of feedback scored below the mean, indicating that students were not satisfied with the quality of instructions and feedback, such as the onscreen instructions for uploading a file.

Version 5 of Prometheus included a variety of modifications intended to improve the user experience over that of version 4. Despite this, our results show no significant difference between the two versions. We believe this is because major changes to the Prometheus user interface, while eliminating some problems, introduced new ones. For example, the Files section of Prometheus (where users can upload, download, organize, and share files) was completely rewritten, with important new functionality and a completely new interface. While this interface solved some of the major problems found in version 4, it introduced new ones: the designers added important visual feedback about file status and permissible operations (an improvement over version 4), but the icons used to convey this information were difficult to understand and remember. In addition, some of the major conceptual problems with Prometheus were not addressed between versions. For instance, assignments are simply files attached to the course outline, and assignment solutions uploaded by students are just like any other file in the Files area. Neither version of Prometheus recognizes that some files are assignments and others are student solutions to specific assignments. When an instructor marks up an assignment and returns it to the student, it is treated just like any other file in the student's files section; Prometheus does not maintain any relationship between assignments, solutions, and graded assignments.

A number of limitations of this study may have affected the results.
The small sample size (a total of 81 participants across both surveys) limited our ability to detect small changes in satisfaction among the interface components. This sample size was a result of our small student population and of the moderate response rates (42.8 and 45%). Several factors may have affected the response rates. Studies have shown that e-mail surveys typically have lower response rates than direct mail surveys [17,18], although we believe this was probably a minor factor in the present study, because the population was familiar with computers and electronic communication. A more likely factor was the length of the survey: with 102 questions, the QUIS requires several minutes to complete, and many participants commented informally that the survey was too long.

6. Conclusion

As the importance and adoption of online courses increase, so does the importance of usability. Learners and educators need to focus on course content and objectives, not on the technology. Online courses offer certain advantages over traditional face-to-face courses, especially anytime, anywhere access and collaborative tools that span time and space. However, to maximize these benefits, courseware user interfaces must meet the needs and capabilities of their users. User interface satisfaction surveys, such as the QUIS, provide important measures of courseware usability for comparing and improving systems. This technique and other user-centered design methods will enable us to build courseware systems that support, rather than hinder, learning. As informatics educators and researchers, we must strive to apply the standards that we use for developing health information systems to the systems that we use to educate our students.

Studies such as this one will need to be expanded to understand how clinicians can acquire new knowledge and new techniques in the clinical arena. With just-in-time education a possibility, the role of usability becomes more critical. The merging of the educational and clinical environments will create a new environment for learning, and with it a need to integrate learning processes into traditional clinical informatics.

Acknowledgements

QUIS 7.0 was used under license from the University of Maryland. Blackboard, Prometheus, QUIS, Word and PowerPoint are trademarks of their respective owners.

References

[1] M. Allen, J. Bourhis, N. Burrell, E. Mabry, Comparing student satisfaction with distance education to traditional classrooms in higher education: a meta-analysis, Am. J. Dist. Educ. 16 (2) (2002) 83-97.
[2] R. Phipps, J. Merisotis, What's the Difference? A Review of Contemporary Research on the Effectiveness of Distance Learning in Higher Education, The Institute for Higher Education Policy. Available at: http://www.ihep.com/pubs/PDF/Difference.pdf. Accessed Sep 24, 2003.
[3] W.R. Hersh, K. Junium, M. Mailhot, P. Tidmarsh, Implementation and evaluation of a medical informatics distance education program, J. Am. Med. Inform. Assoc. 8 (6) (2001) 570-584.
[4] L.K. Goodwin, Web-based informatics education: lessons learned from 5 years in the trenches, Proc. AMIA Symp. (2002) 300-304.
[5] A.R. Leasure, L. Davis, S.L. Thievon, Comparison of student outcomes and preferences in a traditional vs. world wide web-based baccalaureate nursing research course, J. Nurs. Educ. 39 (4) (2000) 149-154.
[6] C. Ruman, J. Gillette, Distance learning software usefulness and usability: user-centered issues in practical deployment, Ed at a Distance 15 (3) (2001).
[7] W.A. Nelson, K.A. Bueno, S. Huffstutler, If you build it, they will come. But how will they use it? J. Res. Comput. Educ. 32 (2) (1999) 270-286.
[8] O. Parlangeli, E. Marchigiani, S. Bagnara, Multimedia systems in distance education: effects of usability on learning, Interact. Comput. 12 (1) (1999) 37-49.
[9] J. Nielsen, Usability Engineering, Morgan Kaufmann, San Diego, 1993.
[10] D.A. Norman, S.W. Draper (Eds.), User Centered System Design, Lawrence Erlbaum Associates, Hillsdale, NJ, 1986.
[11] J.P. Turley, C. Johnson, T. Johnson, J. Zhang, A clean slate: initiating a graduate program in health informatics, MD Comput. 18 (1) (2001) 47-48.
[12] T. Convery, B. Nuttall, B. Bodenheimer, Web-based courseware application usability, in: Proceedings of the ACM Southeast Conference (ACMSE03), Savannah, Georgia, 2003, pp. 399-404.
[13] J. Nielsen, R.L. Mack (Eds.), Usability Inspection Methods, Wiley, New York, 1994.
[14] J.P. Chin, V.A. Diehl, K.L. Norman, Development of an instrument measuring user satisfaction of the human-computer interface, in: CHI '88 Conference Proceedings: Human Factors in Computing Systems, Washington, DC, 1988, pp. 213-218.
[15] B.D. Harper, K.L. Norman, Improving user satisfaction: the Questionnaire for User Interaction Satisfaction, version 5.5, in: Proceedings of the 1st Annual Mid-Atlantic Human Factors Conference, Virginia Beach, VA, 1993, pp. 224-228.
[16] H.X. Lin, Y.Y. Choong, G. Salvendy, A proposed index of usability: a method for comparing the relative usability of different software systems, Behav. Inf. Technol. 16 (1997) 267-278.
[17] R. Jones, N. Pitt, Health surveys in the workplace: comparison of postal, email and world wide web methods, Occup. Med. 49 (8) (1999) 556-558.
[18] D. Dillman, Mail and Internet Surveys: The Tailored Design Method, 2nd edition, Wiley, New York, 2000.