Assessing Educational Web-site Usability using Heuristic Evaluation Rules

Nektarios Kostaras, Mixalis Xenos
Hellenic Open University, School of Sciences & Technology, Patras, Greece
nkostaras@eap.gr, xenos@eap.gr

Abstract

Usability evaluation is a very important procedure for the quality assessment of websites. In this paper usability evaluation methods are discussed. The application of one of these methods, Heuristic evaluation, is further examined, and the findings of its employment in the usability assessment of the new website of the Hellenic Open University are described.

1. Introduction

Quality assessment, and in particular usability evaluation, is an important phase in the development of a website, yet one that is often overlooked by modern web application developers. Assessment has become necessary nowadays, as the web gradually turns into a platform for complex applications with increased interactivity and a front end to databases and corporate information systems. This new use of the medium increases the importance of usability, as the web is used for the accomplishment of complex tasks, like learning, retrieving information, and interacting and collaborating with peers [Shum (1996)].

Today's highly interactive web applications tend to adopt interaction styles borrowed from traditional software. This is not always acceptable, however, since the web poses special requirements that need to be taken into consideration [Bevan (1998)]. For instance, the characteristics of web users are not always well known in advance and can vary considerably. According to Nielsen [Nielsen (1993)], the highly quoted user-centred design methodology is considered applicable in this new context. The three principles of user-centred design [Rubin (1994)] are the following.

1. An early focus on users and tasks. This means not simply identifying and categorizing users, but advocating direct contact between users and the design team throughout the development life cycle. On the other hand, caution is needed, because direct contact itself can be hazardous if it is not structured. What is required is a systematic, structured approach to the collection of information from and about the users.

2. Empirical measurement of system usage. Here emphasis is placed on behavioural measurements of ease of learning and ease of use early in the design process, through the development and testing of prototypes with actual users.

3. Iterative design, whereby a system is designed, modified, and tested repeatedly. Iterative design allows for the complete overhaul and rethinking of a design, through early testing of conceptual models and design ideas. It is a process of design, test, redesign and retest that brings the system to the desired state. This approach places iterative usability evaluation at the centre of the design process.

In this paper we present the usability assessment of the new website of the Hellenic Open University (www.eap.gr), which replaced the previous site that had been in operation for six years. Two representative pages of the new website are shown in figure 1. The website offers useful information to the public and to the academic community and serves as a portal leading to other useful sites and services of the university. The usability assessment of the website, which was based on adapted methods proposed in the literature, was conducted by the Software Quality Research Group of the Hellenic Open University. The method used in the experiment was Heuristic evaluation by usability experts.

Figure 1. Web pages of the HOU website

The remaining text is structured in four sections. In section 2 a summary of the most important and most appropriate usability evaluation methods is presented. In section 3 an outline of the Heuristic evaluation method, the method used in the assessment, is presented together with the assessment procedure. In section 4 the actual assessment and its results are presented. Finally, the conclusions of the assessment are discussed in section 5.

2. Usability evaluation

The term usability is described in ISO 9241-11 [ISO (2003)] as "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use". Effectiveness is defined as the accuracy and completeness with which users achieve specified goals. Efficiency measures the resources expended in relation to the accuracy and completeness with which users achieve goals. Finally, satisfaction is the freedom from discomfort and positive attitudes towards the use of the product.
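The paper does not quantify these three attributes. Purely as an illustration (a common convention in usability measurement, not taken from the paper), effectiveness and efficiency are often operationalized as simple ratios:

\[
\mathrm{Effectiveness} = \frac{\text{tasks completed successfully}}{\text{tasks attempted}},
\qquad
\mathrm{Efficiency} = \frac{\mathrm{Effectiveness}}{\text{mean time per task}}.
\]

For example, 9 of 10 tasks completed gives an effectiveness of 0.9; at a mean time of 3 minutes per task, the efficiency is 0.3 per minute. Satisfaction, being subjective, is usually measured with rating questionnaires instead.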

Nielsen [Nielsen (1993)] further described usability in terms of the following five basic parameters:

- easiness and speed of learning of system use
- efficiency of use
- easiness of remembering system use after a certain period of time
- reduced number of user errors and easy recovery from them
- subjective satisfaction of users

Many attempts have been made to measure usability according to these dimensions and to evaluate interactive software systems. Some of the most established usability evaluation methods relevant to our context of application are briefly discussed in this section. These evaluation methods can be distinguished into two main categories: Analytic and Empirical methods. The analytic methods are based either on theoretical models that simulate the behaviour of a user or on standards and rules. These methods are often used in the laboratory, during the specification phase, often before the development of prototypes and without the participation of users. The empirical methods are based on the development and evaluation of the behaviour or the characteristics of a prototype or completed system. The empirical methods can be performed either in a laboratory or in the place where the system is in full operation. In the evaluation process the participants can be representative users as well as usability specialists [Avouris (2003); Crosby (1996); Lindgaard (1994)].

Attempting a different grouping of the methodologies used for the evaluation of usability, three basic categories of methods can be distinguished: Inspection methods, Experimental methods and Inquiry methods. These three categories include various methods that can be generally used in the evaluation of usability. For the reported experiment, an outline of the methods that are suitable for the evaluation of a completed and fully operational system, like the website of HOU, is presented in the following paragraphs.

Usability Inspection methods are based on having evaluators inspect or examine usability-related aspects of a user interface. Usability inspectors can be usability specialists, but they can also be software development consultants with special expertise, end users with content or task knowledge, or other types of professionals. In most cases, usability inspection methods aim at finding usability problems in an existing user interface design and making recommendations for fixing them, thus improving the usability of the system [Nielsen & Mack (1994)]. In this particular case the most appropriate of the inspection methods are Heuristic Evaluation, Cognitive Walkthrough, and Design Guidelines and Standards.

Heuristic evaluation is a method that is mainly based on rules of thumb and the general skill, knowledge and experience of the evaluators. It involves the judgment of usability specialists on whether a user interface complies with established usability principles, which are called Heuristics. Usability specialists judge either from their own point of view or from observations of ordinary users of the interface. Cognitive Walkthrough is an inspection method that focuses on evaluating a user interface in terms of ease of learning. This method evaluates each step necessary to perform a task and in the process reveals design errors that would interfere with learning by exploration. Design Guidelines and Standards are inspections in which an interface is checked for conformance with a comprehensive list of usability guidelines and international standards. However, this is a complicated procedure, because of the number of guidelines and standards that exist, and it requires a high degree of expertise.

The Experimental methods involve the observation of individual users performing specific tasks with the system under evaluation. In these methods the evaluators are appropriate end users who perform representative tasks under the discreet attendance of usability experts. The observation takes place in a properly designed and organized usability laboratory [Rubin (1994)], which is essential for this method. Experimental methods include Performance Measurement, the Thinking Aloud Protocol and User Logging. In Performance Measurement the system performance is evaluated against pre-defined criteria, such as the time to complete a task or the number of errors made. The Thinking Aloud Protocol requires the evaluators to express their thoughts, feelings and opinions aloud while interacting with the system. Finally, User Logging involves recording the evaluator's activities with the use of special equipment such as cameras, specialized software, etc.
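Of the experimental methods above, Performance Measurement is the most mechanical to set up. Purely as an illustration (none of this tooling is described in the paper; the task names and thresholds below are invented), a minimal timing harness in Python might look like this:

```python
import time

# Pre-defined acceptance criteria per task: (max seconds, max errors).
# Task names and thresholds are hypothetical, for illustration only.
CRITERIA = {
    "find_course_page": (60.0, 1),
    "locate_contact_info": (45.0, 0),
}

def run_task(name, perform_task):
    """Time one task and check it against the pre-defined criteria."""
    start = time.monotonic()
    errors = perform_task()  # the observer's error count for this run
    elapsed = time.monotonic() - start
    max_time, max_errors = CRITERIA[name]
    passed = elapsed <= max_time and errors <= max_errors
    print(f"{name}: {elapsed:.1f}s, {errors} error(s), "
          f"{'PASS' if passed else 'FAIL'}")
    return passed

# Stand-in usage; a real session would wrap an observed user interaction.
run_task("find_course_page", lambda: 0)
```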

Finally, two of the most important Inquiry methods are User Interviews and Questionnaires. In these methods usability experts put direct questions to the users about the system. The answers to the questions can help the evaluators, who in this case are the usability experts, to draw conclusions about the parts of the system interface that pose difficulties to the users. Online questionnaires are particularly suitable for web applications [Feinberg & Johnson (1998)].

Of all the above-mentioned methods, one of the most suitable for the usability evaluation of the website of HOU is Heuristic Evaluation by usability experts. This method was employed during the experiment discussed in the following section.

3. The Experiment

Heuristic Evaluation [Nielsen & Mack (1994)] is a method that is easy to use (it can be taught in a half-day seminar), fast (about a day for most evaluations) and relatively cheap. It can also be employed in systems that are completed and fully operational.

The process of the evaluation starts with a presentation of the interface that will be evaluated. During the process the evaluators work alone and do not communicate with each other. The evaluation is separated into two phases. In the first phase the evaluator goes through the entire interface once, to get a feel for the flow of the interaction and the general scope of the system. In the second phase the evaluator goes through the interface several times, inspects the various dialogue elements and compares them with a list of recognised usability principles called Heuristics. These principles are not strictly defined, and each Heuristic's value depends on the case and on the usability specialist working with it [Avouris (2003)]. A set of usability Heuristics derived from a factor analysis of 249 usability problems [Nielsen & Mack (1994)] is the following:

1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation

This set of usability Heuristics is considered an integrated set and covers all the characteristics of the HOU website that were under evaluation. Hence it was the set of Heuristics used in the usability evaluation we conducted. The results of the method can be recorded as written reports from each evaluator, in the case of usability experts, or can be derived by having evaluators verbalize their comments to an observer (a usability expert) as they go through the interface. Typically the method lasts between one and two hours for each individual evaluator. If longer sessions are needed, it is better to split the evaluation into several smaller sessions, each concentrating on a part of the interface. The recommended number of evaluators participating in a heuristic evaluation is five, but certainly at least three, according to Nielsen [Nielsen & Mack (1994)]. In the presented case five evaluators were involved. Two of the evaluators were usability specialists and the other three were experienced in Heuristic evaluation.
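The paper does not describe any tooling for combining the evaluators' written reports. Purely as an illustration, with a made-up report format, a few lines of Python could tally unique violations per heuristic rule (numbered as in the list above) and flag problems reported by more than one evaluator:

```python
from collections import Counter

# Each evaluator's report reduced to (heuristic number, problem) pairs.
# These three reports are hypothetical stand-ins, not the study data.
reports = [
    [(1, "left-menu link not highlighted"), (4, "blank first pages")],
    [(1, "left-menu link not highlighted"), (8, "flag without language choice")],
    [(4, "blank first pages"), (4, "font variation between pages")],
]

# A problem reported by several evaluators counts once per heuristic rule.
unique_problems = {pair for report in reports for pair in report}
violations_per_rule = Counter(rule for rule, _ in unique_problems)

# Problems confirmed by more than one evaluator.
times_reported = Counter(problem for report in reports for _, problem in report)
confirmed = [p for p, n in times_reported.items() if n > 1]

for rule in range(1, 11):
    print(f"Heuristic {rule}: {violations_per_rule[rule]} violation(s)")
print("Reported by more than one evaluator:", confirmed)
```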

4. Method of the study

During the study the evaluators were presented with the HOU website interface and were encouraged to navigate through the application and carefully validate the implementation of each Heuristic rule. When a rule violation was detected, the evaluator identified where the violation occurred. At the end, each evaluator filled in a report describing his findings. In the following table (table 1), the number of detected errors for each Heuristic rule is presented.

Table 1. Number of errors found for each heuristic rule

Heuristic rule              1   2   3   4   5   6   7   8   9  10
Number of detected errors   3   0   6  12   1   1   3  11   0   1

During the study thirty-eight (38) usability flaws were revealed. Many of the usability problems were reported by more than one evaluator, confirming Nielsen's finding that four to five evaluators typically unveil 80% of the overall usability problems.
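This observation agrees with Nielsen's well-known problem-discovery model. As a hedged aside (the model and the roughly 31% single-evaluator detection rate come from Nielsen's published work, not from measurements in this study), the expected fraction of the $N$ existing problems found by $i$ independent evaluators is

\[
\frac{\mathrm{Found}(i)}{N} = 1 - (1 - \lambda)^{i},
\]

where $\lambda$ is the average fraction found by a single evaluator. With $\lambda \approx 0.31$ and $i = 5$ this gives $1 - 0.69^{5} \approx 0.84$, consistent with the 80% figure quoted above.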

In the following paragraphs the detected violations for each heuristic rule are presented and discussed.

1. Visibility of system status. The evaluators found that in the left menu, when a link is selected it is not highlighted, as happens with other hyperlinks throughout the website; this confuses users, because they cannot easily see which hyperlink they have selected. Another flaw discovered, which also confuses users, is that when the right menu has many choices it becomes long, and the page must be scrolled down in order to see the rest of the menu choices. This is not made clear, because the bottom frame is static and does not move.

2. Match between system and the real world. The terminology used in the website was found to be precise, clear and appropriate for typical users. Hence, in this category there was nothing worth mentioning in terms of usability problems.

3. User control and freedom. In this case the evaluators found a lack of navigational links in the web pages. For instance, if a selected page gets too long because of the amount of information, there is no link leading back to the top. There are also no links leading to the main page of each category. Another problem detected was that, once the user navigates from the introduction page of the site to its first page, it is not possible to return to the introduction page using the Back button of Internet Explorer.

4. Consistency and standards. General guidelines and standards are followed across the site. The evaluators found some consistency problems, though. The first pages of some categories are blank, which confuses users. There is a variation of font sizes and fonts in some pages compared with the majority of the pages. Another confusing element is the bottom menu, which is not obvious at first sight; it should have been in button format instead, so as to be consistent with the top menu and easily recognizable. Finally, in the left menu it is in some cases not clear at first sight which item is a link and which is the header of a group of links (and not itself a link).

5. Error prevention. The website is carefully designed; therefore no errors were detected by the evaluators, except for one found in the search engine.

6. Recognition rather than recall. In general, all options and possible actions are visible. In some cases, though, menus become too deep (up to six levels), a factor that confuses users because they have to remember where they must go in order to find something.

7. Flexibility and efficiency of use. In general the site does not pose flexibility and efficiency problems. The problem here concerns the section called "Επικοινωνία" ("Contact"). That section contains an alphabetic list of people, which makes the user's search difficult. It would be better for the user if people were categorized by their characteristics, such as the department they work in. It would also be very useful if a person could be found by typing his or her name in the search engine.

8. Aesthetic and minimalist design. The evaluators found that there are news and announcements on the introductory page of the site, which is not good practice in website design; there is another link called "news", which is not very obvious; and finally there is a link for announcements. It would be better to position all of these in one place, so as not to confuse the user. Although the site is written in only one language (Greek), there is a Greek flag on the introductory page, which is not necessary; normally flags are employed on the first page of a site to point out that there is a choice of language. In some cases the user has to take two steps to access a web page, such as Portal, instead of navigating straight to the target. Finally, in the education section too much colour has been used; furthermore, the four coloured bars that represent the schools of the university are not obviously links.

9. Help users recognize, diagnose, and recover from errors. In the case of the HOU website there is not much need for error messages.

10. Help and documentation. For the HOU website there is no need for help and documentation. The only thing that is missing, and which would be very useful for the users, is a site map.

5. Discussion and conclusions

In this paper different usability evaluation methods for websites were presented. One particular evaluation method was discussed and used for the assessment of the HOU website. This approach provided useful insight into the application and revealed various usability problems, most of which had not previously been detected. Throughout this assessment, Heuristic evaluation was conducted by experts. This method is suitable for formative evaluation, as it can be applied during design to prototypes of the application, but it can also be used very effectively on a system that is already in use, like the HOU website. The effectiveness of the method is reflected in the results of the assessment, where 38 usability flaws were detected. Furthermore, the method turns out to be time and cost effective.

Acknowledgements

This work was funded by the European Union - European Social Fund (75%), the Greek Government - Ministry of Development - General Secretariat of Research and Technology (25%) and the Private Sector, in the frame of the European Competitiveness Programme (Third Community Support Framework - Measure 8.3 - programme PENED - contract no. 03ΕΔ832).

References

ISO 9241-11 (2003), Ergonomic requirements for office work with visual display terminals - Guidance on usability.
Avouris N. (2003), Human Computer Interaction (in Greek), Hellenic Open University Publications.
Bevan N. (1998), Usability Issues in Web Site Design, Proceedings of the UPA 98 Conference, Washington DC.
Crosby P. (1996), Quality is Still Free, McGraw Hill, New York.
Feinberg S., Johnson P. Y. (1998), Designing and Developing Surveys on WWW Sites: Getting Feedback on your Web Site, ACM 16th International Conference on Systems Documentation, pp. 38-42.
Lindgaard G. (1994), Usability Testing and System Evaluation: A Guide for Designing Useful Computer Systems, Chapman and Hall, London, UK.
Nielsen J. (1993), Usability Engineering, Academic Press, London.
Nielsen J., Mack R. L. (1994), Usability Inspection Methods, John Wiley & Sons, Inc., New York.
Rubin J. (1994), Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests, John Wiley & Sons, Inc.
Shum S. B. (1996), The Missing Link: Hypermedia Usability Research & the Web, ACM SIGCHI Bulletin, Vol. 28, No. 4, pp. 68-75.