WebTracer: Evaluating Web Usability with Browsing History and Eye Movement


Noboru Nakamichi, Makoto Sakai, Jian Hu, Kazuyuki Shima, Masahide Nakamura, Ken'ichi Matsumoto

Graduate School of Information Science, Nara Institute of Science and Technology
8916-5, Takayama, Ikoma, Nara, 630-0101, JAPAN
{noboru-n, jian-hu, shima, masa-n, matumoto}@is.aist-nara.ac.jp

SRA Key Technology Laboratory, Inc.
Marusho Bldg. 5F, 3-12, Yotsuya, Shinjuku-ku, Tokyo, 160-0004, JAPAN
sakai@sra.co.jp

Abstract

This paper describes a WWW site evaluation tool, WebTracer, which can record a user's gazing points, operational data, and the screen images of browsed pages, and can replay the user's browsing operations. In an evaluation experiment, we recorded users' browsing operations with WebTracer without interrupting them. The average execution time per task was 2 minutes and 48 seconds. We then interviewed the users using WebTracer's usability-testing-support functions based on replay and summary. The average interview time was 19 minutes, and an average of 16 comments was obtained per task. The experimental results show that the various summarized user-operational data helped elicit more comments about web usability from the subjects.

1 Introduction

Designing attractive Web sites is a crucial problem in business, since Web sites directly reflect the image and sales of a company (Goto & Cotler, 2002). Usability evaluation of web pages is therefore an important concern for finding usability flaws and shortcomings in those pages (Nielsen, 1993). Web usability testing is a popular way to conduct such evaluation: subjects (users) browse a target web site, and evaluators then obtain feedback from them through an interview. Usability testing has been widely studied and various methods have been proposed to date. However, most conventional methods must occasionally (or periodically) interrupt the user's browsing to collect opinions on the pages browsed within a certain period of time. This discontinuous browsing makes it difficult to evaluate the usability of a whole Web site, which consists of a number of pages. To achieve effective continuous testing, we first need to record how users browse the entire site, then conduct the interview by replaying the recorded data, and finally justify the feedback through analysis of the recorded data.

Figure 1: Example of data collected by WebTracer. Each record carries a time stamp, a sequence number, and a date, followed by an event: a gazing point position (Eye), a mouse event (MouseMove, ButtonDown) with its screen position, or a captured screen image (CaptureImage) with its JPEG file name.

However, no tool currently exists that lets these tasks cooperate effectively, although each task requires sophisticated methodologies. As a result, conventional methods had to rely on much less data than the users actually produced. In this research, we have developed an integrated usability evaluation tool, called WebTracer, which records user operational data without interrupting the user and can replay it. In addition, we demonstrated WebTracer's effectiveness through an experimental evaluation.

2 WebTracer

WebTracer is an integrated environment for web usability testing. It can record a user's browsing operations, replay the recorded browsing history, and provide analysis tools that draw graphs and compute statistics. WebTracer is optimized especially for the following two features.

2.1 Recording web operation

WebTracer records the various user operational data needed for replay and analysis. Specifically, it records the user's gazing points via an eye camera, mouse movements and clicks, keyboard inputs, and the screen images of the browsed pages. An example of data collected by WebTracer is shown in Figure 1. Unless the appearance of the browsed page changes, WebTracer does not record the screen image; an image is captured only when a transition of the browsed page is triggered by a user event (e.g., a mouse click that follows a link). Thus, the size of the recorded images can be reduced significantly, to 1/10 to 1/20 of the size of data recorded in MPEG-2/4 format.
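As an illustration of this recording scheme, the following Python sketch logs time-stamped gaze and mouse samples and captures a screen image only when the browsed page changes, mirroring the record layout of Figure 1. It is a minimal sketch under stated assumptions, not WebTracer's actual implementation: the names EventLogger, on_gaze_sample, and the WT*.jpg naming pattern are illustrative, and the eye-tracker and screen-capture back ends are assumed to be supplied externally.

import time
from dataclasses import dataclass, field
from typing import Callable, List, Optional

# Illustrative sketch only: class, method, and file-name conventions here are
# hypothetical and are not taken from WebTracer's actual implementation.

@dataclass
class LogRecord:
    timestamp: float   # seconds since the recording started
    seq: int           # sequence number within the session
    kind: str          # "Eye", "MouseMove", "ButtonDown", "CaptureImage", ...
    payload: dict      # e.g. {"position": (x, y)} or {"image": "WT00000001.jpg"}

@dataclass
class EventLogger:
    # Screen-capture hook supplied by the environment (assumed external).
    capture_screen: Callable[[str], None]
    records: List[LogRecord] = field(default_factory=list)
    _seq: int = 0
    _t0: float = field(default_factory=time.monotonic)
    _current_url: Optional[str] = None

    def _append(self, kind: str, payload: dict) -> None:
        self._seq += 1
        self.records.append(LogRecord(time.monotonic() - self._t0, self._seq, kind, payload))

    def on_gaze_sample(self, x: int, y: int) -> None:
        # Gazing point from the eye camera: logged on every sample.
        self._append("Eye", {"position": (x, y)})

    def on_mouse_event(self, kind: str, x: int, y: int) -> None:
        # Mouse movements and clicks: logged on every event.
        self._append(kind, {"position": (x, y)})

    def on_page_transition(self, url: str) -> None:
        # A screen image is captured only when the browsed page changes,
        # which keeps the log far smaller than continuous video recording.
        if url != self._current_url:
            self._current_url = url
            filename = f"WT{self._seq + 1:08d}.jpg"   # hypothetical naming scheme
            self.capture_screen(filename)
            self._append("CaptureImage", {"image": filename, "url": url})

Capturing an image only at page transitions is what yields the 1/10 to 1/20 size reduction reported above compared with continuous MPEG recording.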

2.2 Usability testing support based on replay and summary

WebTracer supports usability testing through a replay of the user's operations, summarized data, and graphs derived from the recorded data. The summarized data capture the characteristics and statistics of each page, which helps with the analysis of a Web site. The recorded data are summarized in a table for every page, as shown in Figure 2, and can also be shown in graph form; an example of an eye movement statistics graph is shown in Figure 3. In addition, an example of the replay screen with the eyemark of the user (the user's gazing point) is shown in Figure 4. The replay feature reproduces the operations performed while the page was being browsed, such as the eyemark and mouse cursor movements. In another window, WebTracer can display other events, such as keystrokes. Moreover, annotations can be inserted at any time during recording and replayed later.

Figure 2: Example of a summary (summarized browsing history)

Figure 3: Example of an eye movement statistics graph (eye movement distance and eye movement speed plotted against the order of browsed pages)

Figure 4: Example of the replay screen with the eyemark and mouse cursor
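To make the summarized statistics concrete, the following Python sketch computes the per-page eye movement distance and average eye movement speed from time-stamped gaze samples such as those in Figure 1. It is an illustrative sketch, not WebTracer's analysis code: the function names and the assumption that samples are already grouped per browsed page are ours.

import math
from typing import Dict, List, Tuple

# Illustrative sketch only: the grouping of gaze samples by page and the
# statistic names are assumptions, not WebTracer's actual analysis code.

def summarize_page(samples: List[Tuple[float, float, float]]) -> Dict[str, float]:
    """samples: chronologically ordered (timestamp_sec, x_px, y_px) gazing points for one page."""
    distance = 0.0
    for (_, x0, y0), (_, x1, y1) in zip(samples, samples[1:]):
        distance += math.hypot(x1 - x0, y1 - y0)   # gaze path length in pixels
    duration = samples[-1][0] - samples[0][0] if len(samples) > 1 else 0.0
    return {
        "time_on_page_sec": duration,
        "eye_movement_distance_px": distance,
        "eye_movement_speed_px_per_sec": distance / duration if duration > 0 else 0.0,
    }

def summarize_session(pages: Dict[str, List[Tuple[float, float, float]]]) -> Dict[str, Dict[str, float]]:
    """pages maps each browsed page (e.g. its URL) to the gaze samples recorded while it was shown."""
    return {page: summarize_page(samples) for page, samples in pages.items()}

Pages whose gaze path is long relative to the time spent on them show a high eye movement speed; such pages are the ones the summary table and graph draw the evaluator's attention to.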

3 Experimental Evaluations

We have conducted an experiment to evaluate the effectiveness of WebTracer in a Web usability evaluation. In the experiment, we asked three subjects to find specified information on a company Website. Then, with the support of WebTracer, the subjects were interviewed and their comments were recorded.

Table 1: The average number of comments by three subjects

                                Task 1  Task 2  Task 3  Task 4  Task 5  Total   (%)
Phase 1: Summary and graph        0.7     2.3     2.3     1.7     2.3     9.3   11.6
Phase 2: Replay                   7.7    13.7    12.0     8.0    18.7    60.0   74.4
Phase 3: Subject's memory         2.3     0.0     0.0     1.3     0.3     4.0    4.9
Phase 4: Fast-forward replay      0.7     0.7     1.7     1.0     3.3     7.3    9.1

3.1 Experiment Design

First, we asked three subjects who often use the WWW to perform the tasks, and we recorded their operational data using WebTracer. The tasks required gathering the following five pieces of information from a company Website.

Task 1: finding the way to a certain place in the company
Task 2: finding out the number of employees
Task 3: finding information about the company's compensation benefits
Task 4: finding specific news about the company
Task 5: finding a technical method of construction

Secondly, after each task we interviewed each subject about the ease of use of the target Website. The interview was divided into four phases, outlined as follows.

Phase 1: An interview based on statistics obtained from the recorded data. WebTracer shows the summarized browsing history, and we interviewed each subject based on the summary and the eye movement graph shown in Figures 2 and 3. We had the subject point out difficulties in use, focusing on web pages where both the moving distance and the moving speed of the subject's gazing point were large.

Phase 2: An interview based on the replay screen with the eyemark. We had the subject point out difficulties in use on the replay screen shown in Figure 4, for the web pages about which comments were obtained in Phase 1.

Phase 3: An interview based on the subject's memory. We conducted an interview based on the subject's memory of the whole task. Comments that overlapped with those obtained in Phase 2 are not counted in this phase.

Phase 4: An interview based on the fast-forward replay screen. We had the subject point out difficulties in use on the fast-forward replay screen. Comments that overlapped with those obtained in Phase 3 are not counted in this phase.

3.2 Results and Discussion

The average execution time per task in the experiment was 2 minutes and 48 seconds, and the average interview time was 19 minutes, so the total time spent on the entire process was 21 minutes and 48 seconds. The average number of comments given by the three subjects is summarized in Table 1. On average, 16 comments were obtained per task. The comments include usability problems and thoughts during operation; we take the unit of a comment to be each sentence uttered by a subject.

The number of comments provides material for the evaluator to judge usability problems. As shown in Table 1, about 95% of all comments were obtained through the graphs and replay of WebTracer (Phases 1, 2 and 4), and about 74% were obtained in Phase 2 by replaying with WebTracer. From these results we can see that WebTracer's replay of the user's operations with the eyemark overlay is effective: by replaying with WebTracer, subjects can recall their operations and point out the parts that are hard to use, and the evaluator can conduct the interview more easily.

We believe that WebTracer made the phases of this evaluation experiment possible. Consider, for example, the case where a conventional usability laboratory carries out the interview phases. A usability laboratory can support Phases 2 and 4 if the evaluator records the subjects' operations with a video camera; however, replaying the video still does not reveal the user's gazing point, so we expect that the number of comments obtained from subjects would decrease compared with WebTracer. Phase 3 is also possible, by performing a questionnaire-based evaluation at the end of the task. However, a usability laboratory cannot support Phase 1, since no quantitative results are available: it cannot record detailed user operational data or show summarized data. Consequently, the 9.3 comments in Table 1 would not be available, and the usability laboratory would miss 11.6% of the comments shown in Table 1.

4 Conclusions

In this research, we have presented an integrated usability evaluation tool, WebTracer, and have conducted an experimental evaluation. WebTracer allows evaluators to perform an efficient Web usability evaluation with optimized features, i.e., recording of operations, replay, summaries, and graphs. WebTracer records the user's gazing point as well as the user's operations, and replays the user's operations on the same screen. From the operations performed on each page, we were able to capture the characteristics of the page. As a result of the evaluation experiment, we obtained many useful comments with respect to usability. Our future work includes refining the evaluation procedure and comparing the proposed method with other evaluation methods. If WebTracer spreads widely, research on Web usability will become less difficult, usability evaluation in actual development will become easier, and we expect that more software that is easy to use will be developed.

References

Etgen, M., & Cantor, J. (1999). What does getting WET (Web Event-logging Tool) mean for web usability? HFWEB'99.

Goto, K., & Cotler, E. (2002). Web ReDesign. Pearson Education.

Nielsen, J. (1993). Usability Engineering. Academic Press.

Okada, H., & Asahi, T. (1999). GUI TESTER: Log-based usability testing tool for graphical user interfaces. IEICE Transactions on Information and Systems, Vol. E82-D, No. 6, pp. 1030-1041.

Paganelli, L. (2002). Intelligent analysis of user interactions with web applications. IUI'02.

Torii, K., Matsumoto, K., Nakakoji, K., Takada, Y., Takada, S., & Shima, K. (1999). Ginger2: An environment for computer-aided empirical software engineering. IEEE Transactions on Software Engineering, Vol. 25, No. 4, pp. 472-492.