
Video-Based Eye Tracking

Our Experience with Advanced Stimuli Design for Eye Tracking Software

A. RUFA,(a) G.L. MARIOTTINI,(b) D. PRATTICHIZZO,(b) D. ALESSANDRINI,(b) A. VICINO,(b) AND A. FEDERICO(a)

(a) Department of Neurological and Behavioral Sciences, Medical School, University of Siena, 53100 Siena, Italy
(b) Department of Information Engineering, Robotics and Systems Lab, University of Siena, 53100 Siena, Italy

ABSTRACT: We present an independent, flexible, and easily programmable software program for generating a wide set of visual stimulus paradigms in eye-movement studies. The software, called ASTIDET (Advanced Stimuli Design for Eye Tracking), has been interfaced in real time with a high-speed video-based eye-tracking system in order to obtain reliable measurements of saccades. Two saccadic paradigms (gap and memory-guided tasks) were tested in 10 normal subjects. The preliminary results confirm that ASTIDET is user-friendly software and can be interfaced with a video-based eye-tracking device to obtain reliable measurements of saccades.

KEYWORDS: video-based eye tracking; saccades; analysis; software

Address for correspondence: Antonio Federico, M.D., Dept. of Neurological and Behavioral Sciences, Medical School, University of Siena, Viale Bracci 2, 53100 Siena, Italy. Voice: +39-0577-585763; fax: +39-0577-40327. federico@unisi.it

Ann. N.Y. Acad. Sci. 1039: 575-579 (2005). © 2005 New York Academy of Sciences. doi: 10.1196/annals.1325.071

PURPOSE

The aim of this study was to develop an independent, flexible, and easily programmable software program for generating visual stimuli for eye-movement studies.[1] We developed a software program called ASTIDET (Advanced Stimuli Design for Eye Tracking) as an easy-to-use program for stimulus generation, real-time data acquisition, and analysis in video-based eye-tracking applications.

PROGRAM DESCRIPTION

ASTIDET allows quick design of two-dimensional video sequences to be used as stimuli for a wide spectrum of eye-tracking paradigms. ASTIDET was developed in Visual C++ in order to benefit from its flexibility and also to allow possible future

integration with advanced graphical libraries (such as VTK or OpenGL) and virtual reality applications.

ASTIDET works on a PC monitor (the "editor monitor"), allowing interactive design of a visual stimulus, which is then presented on a second monitor (the "scene monitor") in front of which the subject is seated. The editor interface allows design of a static version of the dynamic stimulus and specification of the motion (along straight lines) and timing parameters of the visual targets (colored dots) that make up the visual stimulus. Additionally, the program allows integration of multimedia stimuli (e.g., MPEG, MPG, and AVI movies) that can be loaded onto the scene monitor for specific purposes.

FIGURE 1. Eye-tracking functional scheme. ASTIDET acts as a stimulus generator for evoking eye movements and performs real-time acquisition of eye movements and blink removal.

As shown in the functional scheme (FIGURE 1), ASTIDET works together with a video-based eye-tracking system in which a remote infrared pan-tilt camera (ASL model 504 multispeed), running at up to 240 Hz, tracks the eye. The infrared camera works by capturing video images of the pupil and corneal reflection of the subject's eye. These video frames are processed in real time by the controller module supplied with the eye-tracking system. The controller defines the line of gaze by extrapolating the x and y coordinates relative to the screen being viewed. ASTIDET can read eye data in real time via a standard RS232 connection with the controller; thus, it is able to process online data on the subject's gaze and pupil size. Acquired data are prefiltered for blinks by the ASTIDET program and then filtered again (Gaussian filtering) for blinks and other noise components of the signal using ILAB for Matlab,[2] which includes functions for the detection and analysis of saccades.

The editing interface (FIGURE 2) consists of a grid reference with a simple and user-friendly graphic interface: by clicking the left mouse button, the operator can set the points through which the animated sequences will move. This task can be performed easily using the grid reference system, allowing the supervisor to design the static sequence precisely. The spacing between grid lines can be selected by the operator, depending on the subject-to-monitor distance and the cm/pixel ratio of the screen. The visual angle in degrees corresponding to the position of the mouse can be displayed on the grid interface. With the right mouse button the operator defines the parameters

describing the motion of the target, that is, the velocity, color, and size (in pixels) of the moving dot. It is also possible to make the target disappear for a predefined time interval and then reappear in another part of the monitor. These properties allow the system to generate a wide spectrum of eye-tracking paradigms.[3] In addition, a previously created static scene can be saved for later use with ASTIDET.

FIGURE 2. Static sequences for a wide spectrum of eye-tracking paradigms can be generated in the editor interface of ASTIDET. By clicking the left mouse button, the operator can set the points through which the animated sequences will move.

METHODS

In order to evaluate the performance of ASTIDET, we generated specific saccadic paradigms and tested them on 10 subjects (all of whom gave informed consent) at the University of Siena. In the experimental setup, the subject was seated in a dark room with the eyes 72 cm from the scene monitor. The visual angle was 25°. To minimize head movement, the subject's head was constrained with a chinrest. The spatial resolution of the eye tracker (0.1°), its sampling rate (240 Hz), and its linear range (±30° horizontally, ±20° vertically) were sufficient for reliable measurement of saccades. Calibration of the tracker was performed before each trial using the ASTIDET software by having the subject look at a sequence of nine targets on the scene monitor. Offsets and infrared camera parameters were adjusted iteratively by the experimenter until the line of gaze coincided with each of the nine points. Each subject was tested using two frequently used saccadic paradigms (gap task and memory-guided saccadic task) during two separate experimental sessions.[4]
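The nine-point calibration above can be illustrated with a simple least-squares sketch: fit an affine map from the tracker's raw gaze coordinates to the known target positions, then apply it to subsequent samples. This is an illustrative toy version, not the ASL controller's actual procedure, and `fit_affine_calibration` is a hypothetical name:

```python
import numpy as np

def fit_affine_calibration(raw_xy, target_xy):
    """Least-squares affine map from raw tracker coordinates to screen
    coordinates, estimated from corresponding points (e.g., the mean raw
    gaze recorded while the subject fixates each of nine targets)."""
    raw = np.asarray(raw_xy, dtype=float)
    tgt = np.asarray(target_xy, dtype=float)
    A = np.hstack([raw, np.ones((len(raw), 1))])    # rows: [x, y, 1]
    coef, *_ = np.linalg.lstsq(A, tgt, rcond=None)  # 3x2 coefficient matrix

    def to_screen(points_xy):
        p = np.atleast_2d(np.asarray(points_xy, dtype=float))
        return np.hstack([p, np.ones((len(p), 1))]) @ coef

    return to_screen
```

Fitting against the nine recorded fixations and inspecting the residuals would give a quantitative counterpart to the paper's visual check that the line of gaze coincides with each target.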
The gap stimulus may elicit both regular saccades and express saccades (short-latency saccades, latency <100 ms), which occur when a novel stimulus is presented after the fixation point has been turned off (the gap stimulus). In this paradigm the peripheral target was presented at ±4°, 6°, 8°, 10°, and 12°. Memory-guided saccades move the eyes toward the location at which a peripheral cue was previously presented for a few seconds. In this paradigm we used a central target as the fixation point, and the lateral cue was briefly presented randomly at the same eccentric target positions used in the gap paradigm. After a delay, during which central fixation is maintained, the central fixation target is switched off and the subject moves the eyes to where the cue had been presented. The gap and memory-guided paradigms were easily generated with the ASTIDET software by following the experimental parameters reported by Pierrot-Deseilligny.[5,6] In the gap paradigm we evaluated the peak velocity, duration, latency, and gain of each saccade, whereas in the memory-guided paradigm we considered the amplitude, latency, error, and gain of each saccadic movement.

RESULTS

Results for the gap-paradigm parameters are reported in TABLE 1. Saccades were detected using a velocity-threshold criterion of 30 deg/s. The mean values of duration, peak velocity, latency, and gain were comparable with those reported in the literature.[7] In this experiment we did not consider express saccades. FIGURE 3 shows the response to the memory-guided task for one trial in one subject. In this task we considered the saccadic error, latency, and gain of the eye movement. The results in the bottom right corner of FIGURE 3 compare well with those reported in the literature for normal subjects.

FIGURE 3. Memory-guided task and results. While the subject looks at a central red dot, a second visual target is briefly (50 ms) presented laterally. After a period during which fixation is maintained, the central dot is turned off (go signal), and the subject's eyes move to where the cue had been presented.
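The 30 deg/s velocity criterion above can be sketched as follows: compute sample-to-sample velocity and flag each contiguous run of above-threshold samples as one saccade. This is a minimal illustration with hypothetical function names and no minimum-duration check, not ASTIDET's or ILAB's actual implementation:

```python
import numpy as np

def detect_saccades(position_deg, fs=240.0, vel_thresh=30.0):
    """Flag saccades in a 1-D eye-position trace (degrees) as runs of
    samples whose absolute velocity exceeds vel_thresh (deg/s); defaults
    match the paper's 240-Hz sampling rate and 30 deg/s criterion."""
    pos = np.asarray(position_deg, dtype=float)
    vel = np.gradient(pos) * fs              # central-difference velocity, deg/s
    fast = np.abs(vel) > vel_thresh
    events, start = [], None
    for i, is_fast in enumerate(fast):
        if is_fast and start is None:
            start = i                        # saccade onset
        elif not is_fast and start is not None:
            events.append(_saccade_metrics(pos, vel, start, i - 1, fs))
            start = None
    if start is not None:                    # trace ends mid-saccade
        events.append(_saccade_metrics(pos, vel, start, len(pos) - 1, fs))
    return events

def _saccade_metrics(pos, vel, onset, offset, fs):
    return {
        "onset": onset,
        "offset": offset,
        "duration_ms": (offset - onset + 1) * 1000.0 / fs,
        "peak_velocity": float(np.max(np.abs(vel[onset:offset + 1]))),
        "amplitude": float(pos[offset] - pos[onset]),
    }
```

Gain then follows as saccade amplitude divided by target eccentricity, and latency as the time from target onset to saccade onset, which requires the stimulus event times that ASTIDET records.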

TABLE 1. Saccade analysis (mean values ± SD) for different amplitudes in terms of duration, peak velocity, latency, and gain

Amplitude (deg)    n    Duration (ms)     Peak velocity (deg/s)    Latency (ms)     Gain
 4                 9    21.405 ± 3.611     84.22 ± 11.818          232.6 ± 1.505    1.107 ± 0.002
 6                 9    22.043 ± 3.354    104.37 ± 17.668          199.5 ± 1.980    1.207 ± 0.002
 8                10    30.872 ± 4.352    127.22 ± 18.604          186.0 ± 2.107    1.207 ± 0.002
10                10    44.474 ± 2.861    195.92 ± 11.645          188.5 ± 0.011    1.200 ± 0.002
12                15    54.225 ± 6.853    270.92 ± 19.898          187.0 ± 3.360    1.107 ± 0.002

CONCLUSION

The preliminary results presented here confirm that ASTIDET can be interfaced with a video-based eye-tracking device in order to obtain reliable measurements of saccades. The system is very easy to use, and calibration is fairly accurate. We are currently working on implementing real-time digital filtering of data in the ASTIDET environment. We are also extending the software to interface it with a transcranial magnetic stimulator (TMS) and functional MRI.

REFERENCES

1. Duchowski, A.T. 2002. A breadth-first survey of eye-tracking applications. Behav. Res. Methods Instrum. Comput. 34: 455-470.
2. Gitelman, D.R. 2002. ILAB: a program for postexperimental eye movement analysis. Behav. Res. Methods Instrum. Comput. 34: 605-612.
3. Leigh, R.J. & D.S. Zee. 1999. The Neurology of Eye Movements, 3rd ed. Oxford University Press, New York.
4. Leigh, R.J. & C. Kennard. 2004. Using saccades as a research tool in the clinical neurosciences. Brain 127: 460-477.
5. Pierrot-Deseilligny, C.H., C.J. Ploner, R.M. Muri, et al. 2002. Effects of cortical lesions on saccadic eye movements in humans. Ann. N.Y. Acad. Sci. 956: 216-229.
6. Pierrot-Deseilligny, C.H., R.M. Muri, C.J. Ploner, et al. 2003. Decisional role of the dorsolateral prefrontal cortex in ocular motor behaviour. Brain 126: 1460-1473.
7. Becker, W. 1989. Metrics. In The Neurobiology of Saccadic Eye Movements. R.H. Wurtz & M.E. Goldberg, Eds.: 13-67. Elsevier, Amsterdam.