Sharing designer and user perspectives of web site evaluation: a cross-campus collaborative learning experience




British Journal of Educational Technology Vol 33 No 3 2002, 267-278

Penny Collings and Jon Pearce

Penny Collings is a Senior Lecturer in Information Systems in the Division of Management and Technology, University of Canberra. She is researching computer-supported collaborative work and learning. Jon Pearce is a Senior Lecturer in the Department of Information Systems, The University of Melbourne, where he has research interests in interactivity and online learning. Email addresses: Jon Pearce jonmp@unimelb.edu.au; Penny Collings pennyc@ise.canberra.edu.au

Abstract
In this paper we present an online, collaborative process that facilitates usability evaluation of web sites. The online workspace consists of simple and effective proformas and a computer-mediated discussion space to support usability evaluation. The system was designed and used by staff and students at two universities. Students at each university, working in small teams, developed web sites and then evaluated the usability of the web sites developed at the other university, using the results to improve their own sites. Our project evaluations show that the process provides valuable feedback on web site usability and gives students experience of usability evaluation from two important perspectives: that of a user and that of a developer. Further, students develop important generic skills: the ability to participate in and critique computer-supported cooperative work environments.

Introduction
Students at the University of Melbourne and the University of Canberra are involved in courses that use information technology to supplement face-to-face activities. For example, students usually attend campus-based classes and prepare for these by doing exercises or reading material provided electronically. If they miss a class they can catch up by checking online resources. They often communicate with each other and with staff using email. Many students involved in computing courses have created and used online workspaces or multimedia web sites to support their group project work. A significant component of their assessable work is developed and presented online (Collings et al, 1997, 1998, 2001).

Building on these work and learning practices, we set up a collaborative project between the two universities which enabled students to:

- extend their understanding of computer-mediated communication to include cooperative work with students they had not met;
- take advantage of the communication medium by participating in a task that is pedagogically significant in relation to their discipline;
- be active participants in their own learning, to learn both in a realistic context and by collaboration with others.

These are elements of a constructivist approach to learning (Jonassen, 1999; Ewing et al, 1998; Fowell, 1998) using computer-mediated communication.

The staff involved in this project (the authors of this paper) are engaged in teaching subjects addressing issues in human-computer interaction. At the University of Melbourne, the subject involves learning multimedia design and developing web sites for university research groups. At the University of Canberra, students participate in a behavioural simulation in which they create and work in a fictitious organisation called the Cultural Heritage Authority. The students are required to develop information systems to support the work of the organisation. They have considerable discretion about the information systems that they develop, but these typically include an intranet and one or more web sites. The project reported in this paper strengthened all students' understanding of the information technology design and development process by having students at each university evaluate the other university's web sites and provide feedback to the design teams. In this paper we present the collaborative usability evaluation process we developed and consider its pedagogical value.

Defining web site usability evaluation: process and criteria

Process
Web site usability evaluation means different things to different people, but its general purpose is to provide input to an iterative and participative design process (Nielsen, 1993; Monk et al, 1993; Lindgaard, 1994; Spool et al, 1999). Some usability evaluation processes involve heuristic evaluation, in which several usability experts evaluate the interface and judge its compliance with usability heuristics; some involve users, who provide feedback by undertaking typical tasks which the system has been designed to support.

There are other approaches to evaluating the usability of web sites that specifically involve remote users. These include the use of agents to collect usability data from remote users (Hilbert and Redmiles, 1999), the use of a bulletin board to collect feedback from remote users (Millen, 1999) and the use of computer-to-computer video conferencing supported by an online whiteboard to observe, listen to and collect ideas from remote evaluation participants (Hammontree et al, 1994; Hartson et al, 1996).

These systems involve remote participants as users and make it possible to draw on a variety of users from potentially all over the world to give feedback on systems that are designed for a multinational audience. They do not involve remote users designing and then reporting the results of a usability evaluation process, and they do not offer the two perspectives that participants gained in our project (that of user and developer). Our project offers some of the potential advantages of involving users around the world but has a focus on teaching and learning the process and value of usability evaluation.

Whatever the process, it is important to consider why the one chosen is appropriate and how it will yield results that will assist in informing the design process. Developing a process was a learning activity for the students involved. They had some lectures, accessed the literature and held group discussions about the evaluation process. Our students decided on an informal approach that they believed would lead to results that were valuable to the designers and efficient for the users. They typically worked in small teams, undertook individual heuristic evaluations using the criteria shown below, then developed a team report based on discussions between the team members in which they came to an agreed set of evaluation comments. They provided feedback that informally combined the perspectives of a user and a (beginning) expert reviewer.

Criteria
In the year prior to our work, University of Canberra interface design students discussed and selected some usability heuristics based on Nielsen (1994), and these were put forward for consideration in this project. University of Melbourne students were not comfortable with these and debated the matter in a seminar. As a result, heuristics were proposed and accepted between the two universities (see Table 1), though the process was a little rushed. The debate indicated that students who are just starting in the area of usability evaluation may find these heuristics difficult to understand. It may be important to find criteria more readily understood by everyday users rather than those with more formal professional skills in user interface design, psychology and ergonomics (Tweddle et al, 1998). The heuristics used in our project need further refinement as part of the continuing development of the process.

Table 1: Usability heuristics proposed by students
- Users should be kept informed of where they are and where they should go.
- There should be appropriate use of language and media.
- User-friendly navigation control.
- Consistent design throughout the site; following web conventions.
- Error prevention: debug, debug, debug!
- Recognition rather than recall.
- Allow users to customise the site to their own needs.
- Aesthetic and minimalist design.
- Error messages should be expressed intelligibly.
- Help and documentation.
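To make the individual-then-team step concrete, the sketch below shows one way the per-heuristic comments of team members could be pooled before the team discussion. It is purely illustrative: the project used web proformas and group discussion rather than scripts, and the list labels and function name here are our own abbreviations of the Table 1 heuristics.

```python
from collections import defaultdict

# Abbreviated labels for the ten heuristics agreed between the two universities
# (Table 1); the wording here is a paraphrase, not the proforma text.
HEURISTICS = [
    "keep users informed of where they are and where to go",
    "appropriate use of language and media",
    "user-friendly navigation control",
    "consistent design; follow web conventions",
    "error prevention",
    "recognition rather than recall",
    "allow users to customise the site",
    "aesthetic and minimalist design",
    "intelligible error messages",
    "help and documentation",
]

def pool_individual_evaluations(individual_evaluations):
    """Group each member's comments by heuristic to seed the team discussion.

    individual_evaluations: one dict per team member, mapping a heuristic
    label to that member's comment. Returns heuristic -> list of comments,
    which the team then reconciles into an agreed set of evaluation comments.
    """
    pooled = defaultdict(list)
    for member_comments in individual_evaluations:
        for heuristic, comment in member_comments.items():
            if heuristic in HEURISTICS and comment:
                pooled[heuristic].append(comment)
    return dict(pooled)
```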

Supporting remote web site usability evaluation: the design of the online co-operative work environment
Although they had considerable experience with online workspaces, neither group of students had experience of working in a solely electronic mode for setting up and completing their collaborative work. In this case we assumed that structure, by way of discrete and well-defined tasks, would facilitate communication and workflow between them, and that this would be complemented by the taking of defined roles (as designers and as evaluators). This strategy fitted well with our experience with student project groups undertaking short, explicit, collaborative tasks (Walker, 1999).

In this section we present one site (Statistical Playground) to illustrate the structure of the online workspace that facilitated computer-mediated communication, usability evaluation and the sharing of information. For full details of each proforma and of the usability heuristics see http://simnotes.canberra.edu.au/muceval.nsf?opendatabase. The structure was provided by breaking the process into three tasks, each of which was supported by a web proforma (see Figure 1):

- registration, as the design team, of their site for evaluation;
- sign-up, as an evaluating team, to evaluate a site;
- posting of evaluation comments.

Figure 1: Web site registration, sign-up and usability evaluation process, with room for extra comments
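As a rough sketch of how the three proforma-supported tasks relate, the function below summarises where each registered site sits in the process. It assumes only that all three forms share a site name; the "site_name" field and the function itself are our own invention, not part of the Lotus Notes workspace.

```python
def workspace_status(registrations, sign_ups, evaluations):
    """Summarise where each registered site sits in the three-task process.

    Each argument is a list of form submissions (dicts) that share a
    "site_name" field, mirroring the registration, sign-up and evaluation
    proformas described above.
    """
    claimed = {form["site_name"] for form in sign_ups}
    evaluated = {form["site_name"] for form in evaluations}
    status = {}
    for registration in registrations:
        name = registration["site_name"]
        if name in evaluated:
            status[name] = "evaluation posted"
        elif name in claimed:
            status[name] = "signed up, awaiting evaluation"
        else:
            status[name] = "registered, no evaluating team yet"
    return status
```

A listing of this kind also makes visible the issue reported later, namely that the proformas assisted, but did not ensure, that every site was signed up for and evaluated.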

The registration proforma was the first item completed by each student development team, and it required them to provide (see Figure 2):

- the name and URL of their site;
- a site specification (this typically included the aim of the site and some background to assist users to understand the business/work domain);
- the design features of the site (so that users could consider the intentions of the design group and any specific standards the group had used);
- scenarios for use (common tasks that users could be expected to want to undertake).

Figure 2: Site registration (completed by designers)

The sign-up process was next and involved completing a comment form, eg, "UC day group N will evaluate this site" (see Figure 1). The sign-up proforma allowed an evaluation team to choose a particular site to evaluate. Following sign-up, student teams evaluated the web site they had selected.
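Read as a record, the registration proforma amounts to the fields below. The dataclass is only a paraphrase of the form described above, with field names of our own choosing.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SiteRegistration:
    """Illustrative model of the registration proforma completed by designers."""
    site_name: str
    url: str
    site_specification: str        # aim of the site and business/work background
    design_features: List[str]     # design intentions and any standards followed
    scenarios_for_use: List[str]   # common tasks users would be expected to undertake
```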

The final step, the posting of results, required the completion of an evaluation proforma. The form required the evaluating team to provide:

- the name and membership of the usability evaluation team;
- any general comments that the evaluation team wished to make;
- results organised under ten usability criteria (generally as in Table 1); these heuristics were described in some detail on the evaluation proforma.

An example, with general comments and the response to three of the criteria, is shown in Figure 3.

Figure 3: Site evaluation showing three of the criteria used

The usability evaluation process was supported by:

- a comment form through which students raised and discussed any issues of immediate importance to the evaluation process; this was used to keep groups up-to-date, for example, about the status of the development or evaluation of a site (see Figure 1 for an example of the use of comments);
- a link to the Nielsen usability heuristics (Nielsen, 1997) and to a site by Instone, who adapted Nielsen's heuristics for use in evaluating web sites (Instone, 1997);
- a link to a web site where usability evaluation is discussed (Bickford, 1998);
- a synchronous discussion area (AussieMOO) where students could raise issues as they arose (this was not used to any significant degree).

The starting point (home page) for the project is shown in Figure 4.
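For symmetry with the registration sketch, the evaluation proforma can be pictured as the record below. Again the class and field names are ours, and the keys of the results dictionary would be the ten Table 1 heuristics.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class EvaluationPosting:
    """Illustrative model of the evaluation proforma completed by an evaluating team."""
    site_name: str
    team_name: str
    team_members: List[str]
    general_comments: str
    # criterion label (one of the ten Table 1 heuristics) -> agreed team comments
    results_by_criterion: Dict[str, str] = field(default_factory=dict)
```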

Figure 4: Home page for the collaborative usability evaluation of web sites

One week of web site evaluations
For this exercise to be successful, students had to feel committed to completing the process in a short period. To facilitate this we created an online asynchronous discussion area where students met a few weeks before running their usability tests, learning a little about Lotus Notes (in the case of the University of Melbourne students) and developing their shared agreement to undertake the work. This was done in a friendly, rather jocular manner. Then, at the agreed time in the last two weeks of semester, when students felt they had developed their sites online to a testable stage, they first registered their sites for evaluation and then signed up to perform evaluations of other sites. The intention was that every site would be evaluated, and the students and staff tried to achieve this by having an equal number of teams and sites to evaluate. The proforma assisted this process but did not ensure it. Students then proceeded to test and provide feedback on their nominated site(s).

Some sites were, in fact, evaluated by several groups; some (eg, where the site seemed incomplete or the site name was not informative) were not evaluated at all. In some cases University of Canberra students evaluated their peers' sites. These matters were negotiated by comments at the evaluation site, occasional email exchanges informally set up through the site registration process, and occasional phone calls and emails between staff.

The system proved robust enough to cope with these variations and pointed to other potential uses of the system (eg, that any user could evaluate any site and that a site could be evaluated by multiple users or user groups, to the advantage of all concerned). Despite some logistical problems, students considered that the process generated valuable feedback about each site. By the University of Canberra deadline, their four core sites had received evaluations, though more came in later and could not be used. Fourteen of seventeen registered University of Melbourne sites had been tested; the other three were not ready in time.

Project evaluation
The project was evaluated by both students and staff. The three staff and one research consultant at the University of Canberra and the two staff at the University of Melbourne who were responsible for the subjects discussed pedagogical and logistical issues during and at the completion of the project. 85 students at the University of Melbourne and 58 students at the University of Canberra provided the following data, which formed the basis of the evaluation process:

- The web sites, the site registration data, the agreed usability criteria and the completed site usability evaluation feedback proformas were available as a formal record of the work done.
- The asynchronous discussion area in which the students introduced themselves and in which the University of Melbourne students learnt a little about how to use Lotus Notes was available for perusal.
- Students at the University of Canberra completed a questionnaire about their perceptions of and satisfaction with the usability evaluation process a day after the process was due to be completed. 34 students (of the 58 enrolled) completed the questionnaire.
- Students at the University of Canberra wrote a short reflective paper reporting on and evaluating the activity. Seventeen papers, some by groups, others by individuals, were available from the 58 participants.
- Staff at the University of Melbourne and the University of Canberra collected some informal data by discussing usability evaluation issues with students both during and after the evaluation process.

Some of the data are problematic because, when the University of Canberra survey was run, many of the University of Melbourne usability tests had not yet been completed. Further, of the University of Canberra students completing the questionnaire, some were more active in the web site development than others. All were involved in the evaluation of University of Melbourne sites. This meant that University of Canberra students felt their feedback was of considerable value to University of Melbourne students; however, only those actively involved in the University of Canberra web site development process found the University of Melbourne feedback to be useful.

Overall, the evaluation data were categorised and interpreted by the expert staff participants. This interpretive work led to the following outcomes and suggestions for how to improve the process (the outcomes given here are based on a report in Walker, 1999, and confirmed by University of Melbourne staff).

Positive features of the collaboration noted by the students were:

- It provided insights into the systems analysis, design and development process by placing participants in the role of users of another site as well as developers of their own.
- It provided information about how other people approach similar tasks.
- It provided an exercise in analysing a system from a purely design perspective, independent of content.
- Students thought that external users were likely to form a less subjective evaluation; this exercise provided such an opportunity for feedback.
- The knowledge that there were to be external users evaluating the site made developers more focussed on developing a good product.

However, students also reported a number of problems that reduced the effectiveness of the collaboration:

- The usability tests took place right at the end of semester and were very rushed; many reports were not received in time to be used to make changes to the sites.
- University of Canberra students were given 10% of their marks for this exercise; University of Melbourne students were given a smaller mark. The co-ordination of assessment value is important, as students regard this as a reflection of the value staff place on the work and hence of the effort and commitment required of them.
- A simpler way of listing sites to be tested is needed. Site names need to be informative.
- The proforma could be improved by having a specifically named sign-up form where testers indicate that they will test a site.
- The feedback form could contain a section where the usability evaluation method used is described.
- Some sites changed radically while the usability tests were being performed, and this caused some frustrations and delays.
- A number of different sets of design guidelines were being used, which made evaluation for conformance to standards difficult (though this was not, typically, the basis of evaluations).
- The low level of contact between the University of Melbourne and University of Canberra students created a number of concerns. The student testers did not have a convenient way to discuss the business purpose of a site with domain experts. This meant that they were not always clear about the purpose of the site, particularly as the developers were not able to see users using their site or to navigate through it with them.

Staff also indicated that they would like to further discuss and develop the usability criteria used to categorise and report the outcomes of the usability evaluations.

Learning outcomes and further development
Overall, the outcomes were seen to be surprisingly valuable by participants:

"Usability testing is a very important aspect of software development. This exercise was extremely helpful in bringing focus to the issues involved in carrying out a usability test and using the results to improve the quality of the developed software. Even though we do not know how many of the issues raised in our evaluation report were addressed by the developers of the site we evaluated, it was an extremely valuable experience for our group." (written reflection by one UC student group)

Further, students found the process to be a valuable way of focussing on usability issues both as designers and as users:

"We believe that the whole exercise has been very useful to all of us. The idea of bilateral testing of our product by Melbourne University and University of Canberra was very good. This gave us an opportunity to view a system from both [a] user's as well as designer's point of view." (written reflection by one UC student group)

"The bottom line is that the students are given the chance to feel the real world in systems analysis and design." (written reflection by one UC student group)

Several groups said that each member undertook an heuristic evaluation and that the group then discussed the outcomes to reach a consensus position. This provided further learning opportunities, for example:

"It was a good idea that we were encouraged to test the site individually and then prepare a combined report at the group level; this allows a greater understanding of the concepts, procedures and usefulness of usability testing." (written reflection by one UC student group)

However, this process could be improved by having each group explicitly describe their evaluation method (on the evaluation proforma), for two reasons:

1. knowledge of a range of methods may be shared between groups;
2. the method used may influence the value ascribed by the development team to the test results.

As far as the design of co-operative work is concerned, little feedback was generated about the online support side of things. This type of experiential learning of generic skills often goes unnoticed by students at the time. Also, their previous experience, and that of the designers, led to a simple and effective means of online collaboration. Students found the online environment easy to use but were able to critique the system and suggest small changes to the site. However, students did raise a broader, important issue about this type of collaborative work: whether it can all be undertaken remotely and electronically. Students said that local users who knew the business domain would give one view of the system and remote users would give another; combined, these would give valuable feedback:

"A mix of in-house and external people should be asked to test the system. This would give a balance and larger sample size so that the feedback can be analysed to see if there was any material difference between the two groups." (written reflection by one UC student group)

Further, the local users may have background knowledge about the site or could more readily acquire it. The local users could also be observed by the design team. The online system is flexible and can easily support this: local and/or remote users can place their evaluations on the web site.

Conclusion
It is important to note the limitations imposed by engaging in remote usability evaluation, but also to consider its strengths. Overall, the outcome was positive. Students gained insights into the usability evaluation process, the value of independent usability tests of their web site designs, and the process of also providing such feedback to others. They also extended their understanding of how to use computer-mediated communication to engage in collaborative work and learning. The project provides a prototype of a system for sharing designer and user perspectives of web site evaluation.

Acknowledgement
The authors thank Associate Professor Michael Nott, University of Melbourne, and Dr David Walker, University of Canberra, for their contribution to the design, implementation and evaluation of the project. Funding for some design and data analysis for this project came from a 1998 CUTSD grant, TLC: Building a Teaching and Learning Community in the Information Systems Discipline, and from a 1998 University of Canberra infrastructure grant for flexible learning.

References
Bickford P (1998, date of viewing) Human interface: usability testing. http://www.devworld.apple.com/mkt/informed/appledirections/may97/humaninterface.html
Collings P A, Kleeman D, Richards-Smith A and Walker D W (1997) Developing new group work practices: an evaluation of the design and use of groupware-based work systems for a graduate student course in Information Systems, in Gable G G and Weber R A G (eds) The Confluence of Theory and Practice: 3rd Pacific Asia Conference on Information Systems Proceedings, QUT, Brisbane, 185-192.
Collings P A, Pitt C and Walker D W (1998) Enriching information technology courses through flexible delivery, in Goodell J E (ed) Proceedings of the Australasian Joint Regional Conference of GASAT and IOSTE, Curtin University, 283-289.
Collings P, Trevitt C and Absalom M (2001) Developing a shared understanding of IT-supported learning environments, Proceedings of OZCHI2001.
Ewing J M, Dowling J D and Coutts N (1998) Learning using the world wide web: a collaborative learning event, Journal of Educational Multimedia and Hypermedia 8, 1, 3-22, Association for the Advancement of Computing in Education.
Fowell S (1998) Applying constructivist principles to the information curriculum, Proceedings of SITIS98, UNSW (unpublished).
Hammontree M, Weiler P and Nayak N (1994) Remote usability testing, Interactions, July, 21-25.
Hartson H R, Castillo J K, Kelso J and Neale W (1996) Remote evaluation: the network as an extension of the usability laboratory, CHI 96 Proceedings, ACM Press, 228-235.
Hilbert D M and Redmiles D F (1999) Separating the wheat from the chaff in internet-mediated user feedback expectation-driven event monitoring, SIGGROUP Bulletin 20, 1, 35-40.
Instone K (1997, date of viewing) Usability Heuristics for the Web. http://webreview.com/wr/pub/97/10/10/usability/sidebar.html
Jonassen D (1999, date of viewing) Thinking Technology: Toward a Constructivist Design Model. ftp://ithaca.icbl.hw.ac.uk/pub/nato_asi/dhj.txt.gz
Lindgaard G (1994) Usability Testing and Systems Evaluation: A Guide for Designing Useful Computer Systems, Chapman & Hall, London.
Millen D R (1999) Remote usability evaluation: user participation in the design of a web-based email service, SIGGROUP Bulletin 20, 1, 40-44.
Monk A, Wright P, Haber J and Davenport L (1993) Improving Your Human-Computer Interface: A Practical Technique, Prentice Hall.

Nielsen J (1993) Usability Engineering, Academic Press, Boston.
Nielsen J (1994) Enhancing the explanatory power of usability heuristics, CHI 94 Proceedings, ACM, 152-158.
Nielsen J (1997, date of viewing) Ten Usability Heuristics. http://www.useit.com/papers/heuristic/heuristic_list.html
Spool J M, Scanlon T, Schroeder W, Snyder C and DeAngelo T (1999) Web Site Usability: A Designer's Guide, Morgan Kaufmann Publishers, California.
Tweddle S, Avis P, Wright J and Waller T (1998) Towards criteria for evaluating web sites, British Journal of Educational Technology 29, 3, 267-270.
Walker D W (1999) Analysis of Groupware Usage 1997-1998, University of Canberra Division of Management and Technology Internal Report R99/113.