GAZETRACKERrM: SOFTWARE DESIGNED TO FACILITATE EYE MOVEMENT ANALYSIS




Chris Lankford
Dept. of Systems Engineering
Olsson Hall, University of Virginia
Charlottesville, VA 22903
804-296-3846
cpl2b@virginia.edu

ABSTRACT
The Eye-gaze Response Interface Computer Aid (ERICA) is a computer system developed at the University of Virginia that tracks eye movement. Originally developed as a means to allow individuals with disabilities to communicate, ERICA was later expanded to provide methods for experimenters to analyze eye movements. This paper describes an application called GazeTracker(TM) that facilitates the analysis of a test subject's eye movements and pupil response to visual stimuli, such as still images or dynamic software applications that the test subject interacts with (for example, Internet Explorer).

Keywords
Eye-tracking, usability, Internet, windows.

1. INTRODUCTION
The Eye-gaze Response Interface Computer Aid (ERICA) is a computer system that tracks eye movement. ERICA was originally developed to assist the disabled by providing them with a means to communicate. As ERICA developed, another potential market for the system became apparent: the system could serve to facilitate analysis of a user's eye movements and fluctuations in pupil dilation. Software needed to be developed to allow experimenters to construct experiments and to analyze the raw data stream of gaze positions and pupil dilations that ERICA emits sixty times a second. GazeTracker(TM) is the application that resulted from this development. It currently provides two analysis modes: image analysis and application analysis. When the GazeTracker application opens, the user must either open an image analysis session or an application analysis session.

2. IMAGE ANALYSIS
Image analysis is the section of GazeTracker designed to assist experimenters in obtaining and analyzing eye-tracking data from test subjects.
GazeTracker stores a test subject's gaze positions and characteristic dimensions of pupil dilation to a database. It also presents different analysis techniques to the experimenter to facilitate study of the data. When image analysis is opened, the experimenter may either open a previously created and saved session from the database or create a new session.

2.1 Image Analysis Setup Wizard
When the user chooses to create a new session, GazeTracker presents a wizard to guide the experimenter through the variety of choices that it presents. The first page in the wizard allows the experimenter to define which images will appear in the slide show. A slide show is simply an ordered collection of static images that are presented to the test subject, and GazeTracker records the eye response of the test subject to these images. The images may be bitmaps, JPEGs, or GIFs. After choosing the images to present, the experimenter defines the general recording attributes for the session. These attributes allow the user to specify how the images are to be shown to the test subjects. Figure 1 shows the page with the options available to set up the slide show.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Eye Tracking Research & Applications Symposium 2000, Palm Beach Gardens, FL, USA. © 2000 ACM ISBN 1-58113-280-8/00/0011 $5.00

Figure 1. Slide Show Wizard Setup. This page allows experimenters to customize the presentation of the slide show images.

The experimenter has three sections of options to set: buffer region options, slide advancement options, and slide presentation options. Buffer regions are blank images that the experimenter can specify to appear at the beginning of the slide show and/or between each image. These buffer regions are black, white, or gray. Slide advancement options control how new slides appear on the screen. The experimenter may have the next slide appear after a certain time interval has elapsed or after a key is pressed on the keyboard. Also, the experimenter may allow test subjects to return to previous images in the slide show by having them press another key on the keyboard. Slide presentation options control how the image is displayed on the computer screen. The image may optionally be stretched to fit the display. In addition, the aspect ratio of the image can be preserved. When the aspect ratio is preserved, or the image is not being stretched, GazeTracker fills in the area of the monitor not covered by the picture with a solid color. The experimenter may specify what this color is. Lastly, the experimenter can choose whether or not to randomize the presentation order of the images to the test subjects.

The experimenter may apply these settings to all images or advance through the wizard and specify unique settings for each image. When the experimenter is specifying settings for each image, they are presented with the same options described previously. In addition, when the randomizing option is enabled, the experimenter may group particular images together. For example, the user may wish to always have image C follow image B. The order of presentation for all other images will be randomized; however, whenever image B appears, image C will always appear next. After setting up the options for the session, the experimenter may save these options to a database for easy retrieval for future testing of subjects.
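The grouped randomization described above (image C always following image B) can be sketched as follows. This is a minimal illustration in Python, not the actual GazeTracker implementation; the function name and data layout are hypothetical.

```python
import random

def randomize_slides(images, groups=()):
    """Shuffle slide order while keeping grouped images together.

    `groups` is a sequence of tuples such as [("B", "C")]: whenever the
    first image of a tuple appears, the rest follow it immediately.
    """
    followers = {name for g in groups for name in g[1:]}
    leaders = {g[0] for g in groups}
    # Treat each group as a single unit; every other image is a singleton.
    units = [list(g) for g in groups]
    units += [[img] for img in images
              if img not in followers and img not in leaders]
    random.shuffle(units)
    # Flatten the shuffled units back into one presentation order.
    return [img for unit in units for img in unit]
```

Because each group is shuffled as a single unit, group members stay adjacent and in order while the overall slide sequence is randomized.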
GazeTracker is now ready to record subject data, or the experimenter may choose to set up a questionnaire to be administered to the test subject either before or after recording subject data.

2.2 Questionnaires
Questionnaires provide a means for experimenters to gather information about their subjects and to store this information to the database along with the session attributes and recorded eye-tracking data. GazeTracker provides a sequence of wizards and dialog boxes that allow experimenters to define questions and allowable answers. Experimenters may create a question with a number of selectable pre-set answers or have questions that simply allow a free response on the part of the test subject. The questionnaire appears as a wizard to the test subject, with the potential answers appearing as radio buttons and the free response area appearing as an edit box. As the user answers a question, they press the next button in the questionnaire to advance to the next question, or they may press the back button to return to a previous question. Figure 2 shows an example question from a questionnaire.

Figure 2. Example Question. Questionnaires allow experimenters to gather additional information from their test subjects.

Different questionnaires may appear before and after a test subject views the slide show. After the test subject answers all of the questions in the pre-questionnaire, GazeTracker is ready to record that subject's eye-tracking data. The experimenter presses the start button on the last page of the questionnaire, and then the images appear before the user in accordance with the settings the experimenter previously entered into the setup wizard. During the recording of gaze data, if multiple monitors are attached to the computer, then a secondary monitor may be set up to replicate the monitor on which the user is viewing the slide show. On this secondary monitor, the user's gaze position is shown by a red X superimposed on the current image in the slide show.
At the bottom left of the monitor, a graph appears which shows the recording of the pupil diameter or area response of the test subject. Experimenters use this feature to see a real-time picture of how the experiment is progressing and to verify the quality of the gaze data being collected while the subject views the presented image. After the experimenter finishes recording data, the experimenter may save the data to the database along with general subject information, such as the subject's name and address, and then analyze the data using the tools described in the remainder of this section.

2.3 GazeTrails
After data is recorded, GazeTracker superimposes GazeTrails over the images as the experimenter views each image. A GazeTrail shows the subject's ocular scan path by drawing a connected path of the recorded gaze position data. These trails display the pattern of the user's gaze, as shown in Figure 3. The trails are multicolored, cycling from green to yellow to blue to red. The cycle can be specified to repeat after a particular duration of time. For example, a value of one second would cause the trail to reset from red to green after one second of recorded data. Therefore, a quickly moving gaze will have a long transition of color along the gaze path, whereas a slow-moving gaze will show tight transitions between colors along the gaze path. This allows the experimenter to easily see how quickly the user was looking from one position to another. On the GazeTrail, labels appear to show the position of each individual sample. Also, the GazeTrail identifies fixations (a certain number of gazepoints correlating to a certain dwell time within a certain region) and shows them to the experimenter. The experimenter may also perform a real-time playback of this data with a red X appearing on the image at each gazepoint and the GazeTrail appearing behind the X.

Figure 3. GazeTrail. The GazeTrail cycles from green to red to show the speed at which the user's gaze is changing. The trail labels gazepoints and fixations as well.

Additionally, researchers may specify how GazeTracker calculates fixations by adjusting a number of different parameters. Experimenters may specify the minimum number of gazepoints to include in a fixation, the minimum duration of a fixation, the diameter of a fixation (in screen pixels), and the maximum gap time between points in a fixation. This last parameter allows the clustering algorithm to cope with missing data due to tracking failure caused by eye blinks. The GazeTrail and playback are of interest to researchers; however, to be able to meaningfully analyze the eye-tracking data, researchers need to be able to define regions of interest, which GazeTracker calls LookZones.

2.4 LookZones
LookZones are a means to filter the recorded gaze data and provide a variety of measures for particular areas of the image defined by the researcher. The researcher may delineate these areas at any time, either before or after recording subject data. When the experimenter wishes to create a LookZone, they press the LookZone button and draw a rectangular region over the image by pressing and holding the left mouse button down and dragging the mouse. The experimenter may then define a name for the LookZone and associate a LookZone with a particular LookZone collection, which is a grouping of particular LookZones that the researcher can define.

LookZones provide a variety of metrics. They give the total time spent in that region, the percentage of time spent in that region, the time until the first fixation in the zone, and the number of times the region is fixated upon. GazeTracker's 3D analysis mode provides a method to view the LookZone time data. When this mode is invoked, the slide show image is shown in three dimensions with bars positioned over a LookZone's area to show the total time spent in each LookZone, as shown in Figure 4. The two LookZones shown in this figure could have been labeled Sunset 1 and Sunset 2. This 3D view gives immediate qualitative feedback on which zones were viewed and which zones were ignored while simultaneously overlaying quantitative numbers onto the bars.

Figure 4. 3D Analysis. This window combines a bar chart with a three-dimensional view of the image. The bars represent the time spent in particular areas.

LookZone data is also available in GazeTracker's graphing modes.

2.5 Graphing
GazeTracker also provides a means to graph data in Microsoft Excel. Essentially, GazeTracker will open Excel and output the amount of time particular slides are shown on the screen and the dwell times in the various LookZones into a workbook in Microsoft Excel. GazeTracker will then automatically generate different graphs of the data. Pie graphs and bar graphs are generated at the user's request. GazeTracker also provides a fully customizable pupil diameter/area graphing feature to analyze the pupil data. The graph is presented as a type of line graph. By clicking on a particular data point displayed in the graph, a small area near the graph shows exactly what the user was looking at when their pupil diameter was that data point's value.
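The fixation parameters described in Section 2.3 (minimum gazepoints, minimum duration, maximum diameter in pixels, and maximum gap time) suggest a temporal-dispersion style of clustering over the gaze stream. The following Python sketch illustrates one way such a clustering could work; it is an assumed reconstruction for illustration only, not GazeTracker's actual algorithm, and the function name and default values are hypothetical.

```python
import math

def find_fixations(samples, min_points=5, min_duration=0.1,
                   max_diameter=30.0, max_gap=0.1):
    """Cluster gaze samples into fixations.

    samples: list of (t, x, y) tuples in time order (seconds, pixels).
    A sample joins the current cluster if it lies within the cluster
    diameter of the running centroid and within max_gap seconds of the
    previous sample (the gap tolerance absorbs eye-blink dropouts).
    Returns (start_time, end_time, centroid_x, centroid_y) tuples.
    """
    fixations = []
    cluster = []

    def flush():
        # Keep the cluster only if it meets the size and duration minima.
        if len(cluster) >= min_points and \
           cluster[-1][0] - cluster[0][0] >= min_duration:
            cx = sum(s[1] for s in cluster) / len(cluster)
            cy = sum(s[2] for s in cluster) / len(cluster)
            fixations.append((cluster[0][0], cluster[-1][0], cx, cy))

    for t, x, y in samples:
        if cluster:
            cx = sum(s[1] for s in cluster) / len(cluster)
            cy = sum(s[2] for s in cluster) / len(cluster)
            too_far = math.hypot(x - cx, y - cy) > max_diameter / 2
            too_late = t - cluster[-1][0] > max_gap
            if too_far or too_late:
                flush()
                cluster = []
        cluster.append((t, x, y))
    flush()
    return fixations
```

Loosening max_gap lets a fixation survive short tracking dropouts, which matches the stated purpose of the gap-time parameter.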
Many of the eye-tracking measures are also exportable to a text file so that researchers may input the values into their own statistical packages. Furthermore, all of these analysis options are also available in GazeTracker's application analysis mode.

3. APPLICATION ANALYSIS
Application analysis provides a method for effectively analyzing how individuals interface with a computer. This analysis mode records the gaze position on the computer display and the pupil diameter of a user as they operate the computer. It also records all keystrokes and mouse clicks and movements that the test subject performs while the analysis session is recording. In this recording mode, both the eye-gaze system and the application analysis software are running in the background of the computer, passively recording the actions of the user with little and generally unnoticed degradation in system performance. Application analysis incorporates many of the basic analysis concepts already described, but there are some analysis functions unique to this mode of analysis. Upon opening application analysis mode, the experimenter may either open a previously saved session from the database or create a new session via the application analysis setup wizard.

3.1 Setup Wizard
Application analysis's setup wizard offers a broad range of customizable features that allow experimenters to configure the analysis session to suit their specific needs. The experimenter may specify which processes they wish to analyze, or they may automatically have windows added to the analysis, thereby recording data for every window the user works in. Also, the experimenter may specify whether to record how the user interacts with the desktop. For the LookZone options, the experimenter can allow application analysis to automatically load the LookZone configurations associated with the windows to be analyzed. Application analysis also has questionnaires that function in the same way described previously.

Once the test subject presses the record button in the GazeTracker software, the system begins monitoring the test subject's actions. The system stores the positions of all mouse clicks and movements. It also records all keystrokes that occur. As the eye-gaze system determines the test subject's point of regard, the software receives and records this data. The data is separated and stored in coordinates relative to the upper left-hand corner of the window in which the data event is occurring, whether this data is gaze data or mouse input data. This allows the data to be stored independently of the window's actual position.
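The window-relative storage scheme just described amounts to a simple coordinate translation between screen space and window space. A minimal sketch follows; the function names are hypothetical, not GazeTracker's API.

```python
def to_window_coords(event_x, event_y, win_left, win_top):
    """Convert a screen-space event position to window-relative
    coordinates (origin at the window's upper-left corner), so the
    stored data is independent of where the window sits on screen."""
    return event_x - win_left, event_y - win_top

def to_screen_coords(rel_x, rel_y, win_left, win_top):
    """Inverse transform, applied when replaying a session with the
    window at its (possibly different) current position."""
    return rel_x + win_left, rel_y + win_top
```

Storing only window-relative coordinates means a replay remains correct even if the window is moved between recording and analysis.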
When the test subject looks into or performs an input action in a window that has no events recorded thus far, the system begins recording data relative to this window so long as the user is working in the window. If the window has a LookZone configuration associated with it, then the configuration is loaded. As is the case in image analysis, if multiple monitors are attached to the computer, then one monitor may be set up to replicate the contents of the monitor the test subject is using. A red X superimposed on the display of this secondary monitor will show the gaze position of the test subject. Also, the bottom left portion will show a graph of pupil diameter. This allows experimenters to verify the quality of the recorded data and see a real-time picture of how the experiment progresses.

Once the experimenter finishes recording the data, the data replay works in a similar manner to image analysis's replay. A red X appears on the screen where the user was looking; however, the system replays all input events as well, all at the proper time in the playback. This causes the test subject's actions to be completely simulated by the software, giving the experimenter an exact reproduction of what occurred on the computer during the testing. The experimenter may also look at the GazeTrails, delineate LookZones, and perform graphing functions in the same way described above; however, application analysis possesses some notable differences in how it implements these analysis modes due to the complexity of analyzing how users interact with software as opposed to images. These enhancements are geared toward facilitating analysis of test subjects who interact with web browsers, such as Netscape Navigator or Internet Explorer.

3.2 Internet GazeTrails
When the test subject interacts with a web page, it is not sufficient to merely store the relative offset of the gazepoint in the browser window.
If the test subject is looking at the web page, the page may have been scrolled. Storing the gazepoint with no indication of where the browser is currently scrolled renders the data useless: the gaze data would not be able to identify where on the page the subject was actually looking. Therefore, GazeTracker can communicate with web browsers and obtain the current scroll position. GazeTracker can then automatically determine if a gazepoint is in a scrollable region and store the scrollbar position. This allows GazeTracker to show whether or not a subject looked at a particular graphic on the web page regardless of where the web page was scrolled during recording or during analysis. This compensation is hidden from the user; no interaction is needed on their part for this to occur. The GazeTrails shown to researchers in application analysis automatically perform this compensation on the displayed data.

Furthermore, the test subject may scroll to an area, scroll to another area, and then scroll back to the original area. To accommodate this type of behavior, GazeTracker has scroll sets. A scroll set breaks the gaze data up into the different instances in which the scroll bar was at different values on the currently shown portion of the web page. This allows experimenters to easily see the data for each time the test subject scrolled that portion of the page into view. LookZones in application analysis also benefit from this compensation.

3.3 Internet LookZones
Experimenters create LookZones in the same manner as they do in image analysis. They press the LookZone button and then draw the zone over the actual application. For LookZones to be useful in application analysis, they also need to track the scroll position of the browser to update their positions. Application analysis's LookZones accomplish this by updating their position in a manner similar to the way gazepoints record their scroll location.
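The scroll compensation described above can be illustrated with a small sketch: a gazepoint recorded in viewport coordinates is offset by the scrollbar position stored with it to obtain page coordinates, which can then be tested against a page-anchored LookZone. This is an assumed illustration of the idea, not GazeTracker's implementation; the function names are hypothetical.

```python
def to_page_coords(gaze_x, gaze_y, scroll_x, scroll_y):
    """Map a gazepoint from viewport (visible-area) coordinates to
    page coordinates by adding the scrollbar offsets recorded with
    the sample."""
    return gaze_x + scroll_x, gaze_y + scroll_y

def in_lookzone(gaze, scroll, zone):
    """Test whether a recorded gazepoint fell inside a LookZone.

    gaze:   (x, y) in viewport coordinates at recording time.
    scroll: (scroll_x, scroll_y) scrollbar position at recording time.
    zone:   (left, top, right, bottom) anchored in page coordinates.
    """
    px, py = to_page_coords(gaze[0], gaze[1], scroll[0], scroll[1])
    left, top, right, bottom = zone
    return left <= px <= right and top <= py <= bottom
```

Because both gazepoints and zones resolve to page coordinates, the same gazepoint is matched to the same page region no matter where the page was scrolled during recording or analysis.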
Therefore, the LookZones are ensured to stay over the region for which the experimenter wishes them to provide data. Application analysis also has a number of other functions geared toward web analysis. GazeTracker can parse the HTML in a web page and automatically generate LookZones over regions of interest. Figure 5 shows the dialog box that presents the different options for parsing the HTML.

Figure 5. Automatic LookZone Creation. GazeTracker can automatically parse HTML and create LookZones with these options.

GazeTracker can identify all hyperlinks and all graphics on the web page. GazeTracker will then create zones around these regions. The experimenter can have GazeTracker inflate the zones by a particular number of pixels as well. This lets experimenters have a buffer region around the hyperlink or image in case the eye-tracking data is inaccurate. Furthermore, researchers can specify the minimum size for a LookZone. If GazeTracker finds an image or hyperlink that is too small, then it will not create the LookZone. This is needed because some web pages use transparent GIFs for formatting the page layout, and these images would be of no interest in analysis. Automatically generating LookZones on hyperlinks lets researchers easily see if a user clicked on the hyperlink in addition to identifying whether the test subject fixated on the link. Lastly, GazeTracker identifies when the user navigates to new web pages and associates the eye-tracking data with the particular page the subject viewed. This greatly eases analysis of the data.

4. CONCLUSION
The software described in this paper makes analysis of eye-tracking data far easier than previously possible. GazeTracker focuses on providing the greatest amount of flexibility for the researcher by providing an intuitive and simple means for recording data and an immediate, graphical means for analyzing data. The data may also be exported into other statistical packages. Further integration with browser applications is planned for the future, as is the development of more tools to facilitate analysis across test subjects. This software is currently available with the ERICA system and is used by a number of different researchers in their eye-tracking studies; however, the software can be easily modified to record eye-gaze data from other eye-gaze systems, such as an ASL eye-tracker.