Automated corneal-reflection eye-tracking in infancy: Methodological developments and applications to cognition
Richard N. Aslin and Bob McMurray
University of Rochester

This is the introductory article for a Thematic Collection of four articles on the use of eye-tracking systems in studies of infant perception and cognition that will appear in a Special Issue of Infancy in 2004.
Since the mid-1800s, experimental psychologists have been using eye movements and the direction of gaze to make inferences about perception and cognition in adults (Müller, 1826). In the past 175 years, these oculomotor measures have been refined (see Kowler, 1990) and used to address similar questions in infants (see Aslin, 1985, 1987; Bronson, 1982; Haith, 1980; Maurer, 1975). The general rationale for relying on these visual behaviors is that where one is looking is closely tied to what one is seeing. This is not to deny the fact that we can detect visual stimuli in the peripheral visual field, but rather that there is a bias to attend to and process information primarily when it is located in the central portion of the retina. Thus, while the direction of gaze is not perfectly correlated with the uptake of visual information (e.g., as in a blank stare or a covert shift of attention), there is a strong presumption that the direction of gaze can provide important information about visual processing even in newborn infants (Haith, 1966; Salapatek, 1968; Salapatek & Kessen, 1966).

History of Eye Movement Techniques

Techniques for measuring eye movements and the direction of gaze have evolved over the past century, although even the most rudimentary of these, simple observation of the eyes, remains the most common technique in use today. The primary limitation of such observations is that even trained on-line coders can only reliably judge the direction of an infant's gaze when it falls within one of 3-4 regions of a stimulus field, and determining vertical gaze position is much more difficult than horizontal. Such global observations are the mainstay of single-stimulus habituation paradigms and two-stimulus preferential looking paradigms. In addition, while observers can judge the presence or absence of particular types of eye movements (e.g., optokinetic nystagmus, saccades, smooth
pursuit; see Aslin & Salapatek, 1975; McGinnis, 1930; Teller & Lindsey, 1993), the quantitative characteristics of these eye movements can only be specified with a more precise measurement technique. While off-line coding of films or videotapes of the eyes allows for more detailed temporal resolution through the use of single frames and slow-motion playback, it does not improve the spatial resolution of coding or the ability to make a quantitative assessment of different types of eye movements. Moreover, while reliability ratings of trained coders are often quite high, the difficulty of determining eye position from global video recordings renders the coding and analysis of these data extremely time consuming.

Two measurement techniques developed for use in studies of adult eye movements and direction of gaze were subsequently applied to infants in the early 1960s: electro-oculography (EOG) and corneal-reflection photography (see Maurer, 1975). EOG is based on the gradient of metabolic activity between the front and back of the eyeball. When passive electrodes are placed on the face of an infant, flanking one (or both) eyeballs, rotation of the eye with respect to these fixed electrodes creates a change in electrical potential that is approximately linear with the change in direction of gaze (up to +/- 30 deg). The fundamental problems with EOG (see Finocchio, Preston, & Fuchs, 1990) are that (a) a pair of electrodes is needed for each axis of rotation (two pairs for each eye often cannot be attached to the infant), and (b) metabolic activity varies over time (causing the EOG signal to drift in the absence of any change in gaze). The first problem has typically been resolved by limiting the question of interest to horizontal eye movements (especially since the placement of electrodes to assess vertical eye
movements often leads to artifacts from blinks). This single-axis (horizontal) solution, however, does not enable EOG to be used in studies of eye movements or gaze shifts in two dimensions. The second problem (drift) can be overcome in adults by conducting repeated calibrations of the EOG signal. This solution is often impractical in infants because of the limited time during which they maintain interest in visual displays without becoming fussy (especially with electrodes attached to their face) and because of the difficulties in calibrating pre-linguistic subjects. Moreover, because the EOG signal is relative to the position of the electrodes (which are fixed to the head), it raises another problem: any change in head position with respect to the stimulus field will alter the interpretation of the EOG signal. For example, if a subject's gaze is fixed on a stationary target and the head is moved, the EOG signal will change despite the fact that gaze remains on the target. Thus, either the head must be stabilized or some other device must be used to measure the position of the head, thereby compensating for the gaze error induced by the movement of the head.

Given the limitations of EOG, the second technique that emerged in the early 1960s was corneal-reflection photography (see Haith, 1969, 1980). The basis for this technique is remarkably simple. If a single eye is directed to fixate a small point of light, then that light creates a reflection off the front surface (cornea) of the eyeball. For a camera located just behind this light source and directed at the eyeball of the fixating subject, the corneal reflection will appear to be located in the center of the pupil. As the subject moves fixation to the right, left, up, or down with respect to the light, the corneal reflection will be displaced relative to the center of the pupil. That is, there is a lawful
(monotonic) relationship between the relative position of the corneal reflection with respect to the center of the pupil and the direction of gaze, and this relationship holds for both horizontal and vertical shifts of gaze. Initially, the corneal-reflection technique used more than one light to create several corneal reflections (Salapatek & Kessen, 1966), thereby reducing the parallax error (i.e., non-linearity) that occurs as the eyeball rotates away from the direction of the camera. Moreover, because these lights were salient to the infant and may have distracted them from attending to the experimental stimuli of interest, infrared filters were placed over the lights to render them invisible to the infant, and infrared-sensitive film was used in the movie camera so that the corneal reflections were visible on each film frame (typically taken at 4/sec). As a result, a series of film frames provided images of the relationship between the lights reflecting off the cornea and the center of the pupil. Since the location of the lights with respect to any visual stimulus being presented to the infant could be measured precisely, it was possible by simple geometric reconstruction to obtain a detailed measure of changes in gaze with a temporal resolution of 4 Hz.

With the development of infrared-sensitive video cameras in the late 1960s, it was rather simple (in principle) to move from film to videotape, thereby improving temporal resolution from 4 to 30 Hz (Haith, 1969). However, there were intrinsic problems with the corneal-reflection technique that video technology could not overcome. First, the inference that the center of the pupil corresponded with the axis of gaze was incorrect because the fovea (the visual axis) is displaced slightly with respect to a line (the optic axis) that passes through the center of the pupil (Salapatek, Haith, Maurer, & Kessen, 1972). Thus, as with EOG, the output of the eye-position measurement technique (EOG
signal or distance between a corneal reflection and the center of the pupil) has no objective status, but rather must be mapped onto the stimulus field by having the subject fixate several known (and preferably small) calibration targets (see discussion in the following section). Average estimates of this error suggest that it is much larger in infants than in adults and that there is considerable variation between infants (Maurer, 1975). As a result of this variation, calibration is critical.

Another problem with the corneal-reflection technique was its reliance on judging the center of the pupil. Under normal illumination conditions, the pupil is black (i.e., it is the absence of light created by a hole in the iris). For subjects with brown or dark-blue eye color, the border between the pupil and the iris is difficult to define from the film (or video) frame, thereby adding error to the measurement. One solution, which every flash photographer has experienced, was to place the light that creates the corneal reflection close to the axis of the lens of the camera. This creates a "red reflex," which is light being reflected back from the surface of the retina and filling the pupil. The so-called "bright pupil" technique uses this phenomenon to create images of the eyeball that consist of a dark iris, a light pupil (created by this reflected light), and an even brighter corneal reflection (since the cornea is a much better reflective surface than the retina). As the corneal-reflection technique migrated to video and more powerful digital circuitry became available in the early 1970s, it became possible to perform an electronic analysis of each video frame, extracting the coordinates of the center of the pupil and the coordinates of the corneal reflection, and compensating for parallax errors. This analysis computed the direction of gaze with respect to the location in space of the light source that creates the corneal reflection.
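As an illustration of this per-frame analysis, the following sketch (in Python with NumPy) locates the pupil and corneal-reflection centroids in a synthetic bright-pupil image and converts their difference vector into a rough gaze estimate with a linear gain. The thresholds, gain values, and image are assumptions made for illustration; they are not the parameters of any actual tracker, and a real system would add the parallax compensation and subject-specific calibration described in the text.

import numpy as np

def centroid(mask):
    """Return the (x, y) centroid of the True pixels in a binary mask."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        raise ValueError("feature not found in frame (track loss)")
    return np.array([xs.mean(), ys.mean()])

def pupil_cr_vector(frame, pupil_threshold=120.0, cr_threshold=230.0):
    """Difference vector (in pixels) between the corneal reflection and pupil center.

    In a bright-pupil image the pupil is light (retinal back-reflection) and the
    corneal reflection is an even brighter spot, so both can be isolated with
    simple intensity thresholds.
    """
    cr_mask = frame >= cr_threshold
    pupil_mask = (frame >= pupil_threshold) & ~cr_mask
    return centroid(cr_mask) - centroid(pupil_mask)

def gaze_from_vector(pc_vector, gain=(1.8, 1.8), offset=(0.0, 0.0)):
    """First-order (linear) mapping from the pupil-CR vector to gaze angle in degrees."""
    return np.asarray(gain) * pc_vector + np.asarray(offset)

if __name__ == "__main__":
    # Synthetic frame: dark background, bright pupil disc, brighter corneal reflection.
    frame = np.zeros((240, 320))
    yy, xx = np.mgrid[0:240, 0:320]
    frame[(xx - 160) ** 2 + (yy - 120) ** 2 < 30 ** 2] = 160.0  # bright pupil
    frame[(xx - 150) ** 2 + (yy - 115) ** 2 < 4 ** 2] = 255.0   # corneal reflection
    print(gaze_from_vector(pupil_cr_vector(frame)))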
Video circuitry by this time was fast enough to perform these computations in real-time at a rate of 60 Hz (e.g., Aslin, 1981). Later developments have increased this rate to 120 and even 250 Hz. Although these automated corneal-reflection systems had a temporal resolution that made it possible to extract quantitative parameters of eye movements as well as gaze direction, they continued to suffer from the most serious limitation of the corneal-reflection technique. To optimize the spatial resolution of the system, the video camera had a close-up lens that filled the video frame with the pupil. As a result of this very small video frame, any movement of the infant resulted in the pupil moving outside the video frame, thereby losing the opportunity to capture any usable data.

In the past several years, automated corneal-reflection systems have solved this out-of-frame problem using three innovations. First, the video camera is mounted on a motor-driven base that can quickly move the field-of-view of the camera to compensate for small (and slow) head movements. The second innovation was the development of relatively simple algorithms to drive these motors using information from the video image of the eye to maintain the pupil in the center of the camera's field-of-view. While these innovations proved superior to a fixed camera, they did not compensate for rapid head movements (note: head movements that place the eye completely outside the possible field-of-view of the camera are, of course, not relevant here because no single-camera system can deal effectively with such large head movements, and such head movements are the infant's attempt to look at non-experimental stimuli such as the mother). The final innovation was the use of a position-sensing system that monitors head movements and feeds that information to the motor-driven video camera, thereby maintaining the image of the eye in the field-of-view of the video camera (except when the eye is completely out of range of the camera).
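The control logic that keeps the eye in view can be sketched in a few lines: the head sensor reports where the eye is, and on every cycle the camera's motor-driven base is re-aimed at that position. The head_sensor and pan_tilt_base interfaces, the geometry, and the 60 Hz update rate below are illustrative assumptions rather than the interface of any particular commercial system.

import math
import time

def angles_to_eye(x, y, z):
    """Pan/tilt angles (degrees) that aim a camera at the origin toward the point (x, y, z) in cm."""
    pan = math.degrees(math.atan2(x, z))
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))
    return pan, tilt

def compensation_loop(head_sensor, pan_tilt_base, rate_hz=60.0):
    """Continuously re-aim the camera from the head-position signal.

    head_sensor.eye_position() is assumed to return the estimated (x, y, z) of
    the tracked eye; pan_tilt_base.aim(pan, tilt) is assumed to command the
    motor-driven base. Both are hypothetical stand-ins.
    """
    period = 1.0 / rate_hz
    while True:
        x, y, z = head_sensor.eye_position()
        pan, tilt = angles_to_eye(x, y, z)
        pan_tilt_base.aim(pan, tilt)
        time.sleep(period)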
Moreover, unlike optical methods, when the infant turns away from the camera, the position-sensing system can reacquire the image of the eye automatically as the eye returns to the possible range of the camera's field-of-view. The position-sensing system consists of a magnetic field within which the infant is placed and a small sensor worn on the infant's head. This system sends information to circuitry that (a) knows the relationship between the sensor's location on the infant's head and the location of the infant's eye, and (b) feeds that head-position signal to the motor-driven base of the video camera in real-time. As a result, even for an infant who makes substantial head movements, the automated corneal-reflection system with head-tracking can maintain a steady stream of eye-position data from most infants (see Figure 1 for a schematic diagram of the head- and eye-tracking equipment used in our lab).

Calibration of Eye Tracking Systems

The beauty of the original corneal-reflection photography technique was that it promised to deliver an objective method for measuring gaze position. Unfortunately, it could not deliver on this promise because of the uncertain relationship between the center of the pupil and the direction of gaze. Thus, every eye-tracking system must rely on calibration data to map the signals provided by the system (e.g., EOG voltages or video-frame coordinates) onto the stimulus field. Because every system suffers from non-linearities (e.g., a compression of signals as gaze position deviates from the axis of the camera), the calibration data provide a means of linearizing the system, thereby providing a metric for any gaze position within the stimulus field.
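Concretely, the mapping can be recovered by least squares from the fixations recorded on the calibration targets. The sketch below (Python with NumPy, using invented calibration points) fits an affine map from raw tracker output to screen coordinates; a denser calibration scheme would justify adding quadratic terms to absorb the compressive non-linearities just mentioned.

import numpy as np

def fit_calibration(raw, screen):
    """Least-squares fit of an affine map: screen ~ [1, x, y] @ coeffs.

    raw and screen are (n_points, 2) arrays of tracker output and known target
    locations. With a 9-point scheme, columns for x*y, x**2, and y**2 could be
    appended to model non-linear compression.
    """
    design = np.column_stack([np.ones(len(raw)), raw])
    coeffs, *_ = np.linalg.lstsq(design, screen, rcond=None)
    return coeffs

def apply_calibration(coeffs, raw_xy):
    """Map one raw (x, y) sample into calibrated screen coordinates."""
    x, y = raw_xy
    return np.array([1.0, x, y]) @ coeffs

if __name__ == "__main__":
    # Five fixations on targets at known screen locations (values are invented).
    raw = np.array([[-3.1, -2.4], [3.0, -2.5], [0.1, 0.0], [-2.9, 2.6], [3.2, 2.5]])
    screen = np.array([[160, 120], [480, 120], [320, 240], [160, 360], [480, 360]])
    coeffs = fit_calibration(raw, screen)
    print(apply_calibration(coeffs, [0.5, 0.3]))  # estimated gaze position in pixels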
In adults, calibration is a relatively simple matter of asking each subject to fixate (and hold) on several small targets placed at known locations in the stimulus field. However, the problem of calibrating infants is two-fold. First, they are less likely than adults to fixate (and hold) on small visual targets because such targets are relatively uninteresting and infants cannot be instructed to perform this demanding calibration task. Second, the more time that is expended on calibration, the less likely the infant will remain in a cooperative state for providing the real data that are the goal of the eye-tracking experiment. Given these problems, compromises have been made in studies of infants. First, the usual 9-point calibration schemes used with adults are typically reduced to 5- or 2-point schemes. Second, the calibration target that elicits fixation begins as a large, moving (or "looming") stimulus which attracts the infant's gaze and then shrinks and becomes stationary at each calibration location in the stimulus field. Most infants will exhibit interest in the large calibration target, follow its movement, and maintain fixation for 1-2 sec as it shrinks and remains stationary. Thus, a 2-point calibration routine can be completed in less than a minute so that the experiment can begin promptly, and this 2-point scheme can be repeated during the experiment to ensure that calibration has been maintained.

Data Analysis

The advantages of an automated corneal-reflection system with head-tracking are obvious: (a) access to both horizontal and vertical eye position, (b) a relatively high sample rate of 60 Hz, (c) rapid calibration, (d) minimal loss of data from off-camera head movements, and (e) minimal coding time. Of course, this last advantage is also a
potential problem because of the voluminous stream of data. Within this data stream there are undoubtedly some segments where the algorithms that compute eye position have made errors. How can these errors be detected and eliminated?

One strategy is the use of a time-stamped video record of the experimental session. Most commercial eye-trackers output two sources of video data (in addition to the digital data stream). The "scene" output is typically a video record of the visual scene in front of the infant (either from a fixed video camera, or from the television or monitor the infant is viewing). The "eye" output shows the raw video data used by the eye-tracker to determine the pupil and corneal reflection (often with these features marked in some way). In addition, our lab (and others) employs a second video camera (wide angle) that records the entire experimental session. These three video sources can be combined using picture-in-a-picture technology onto a single tape, or recorded separately on synced VHS or digital records (see Figure 2 for two examples). In this way, we always have a record of what the infant was doing during the experiment when the data stream shows anomalies. For some of these segments of output from the eye tracker, the detailed settings (e.g., the threshold for what video features define the pupil) may have been sub-optimal even though the global video image is perfectly scorable. In these cases, the back-up video enables an off-line scorer to code the direction of the infant's gaze (with admittedly less accuracy). Alternatively, the output of the eye camera can be used to determine whether track loss was the result of a blink, the loss of the eye (from the field of view of the camera), or sub-optimal settings. Moreover, as the number of regions of interest in the stimulus display grows, holistic coding becomes less reliable, while the close-up eye image may still remain useful.
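One way to make this triage systematic is to classify each run of missing samples in the digital data stream before turning to the video record. The sketch below treats short gaps during which the eye camera still has the eye in view as blinks, gaps with the eye out of view as head movements beyond the camera's range, and anything else as a segment to check against the back-up video. The field names and the 400 ms blink cutoff are assumptions made for illustration, not values taken from any particular system.

from dataclasses import dataclass
from typing import List, Optional, Tuple

SAMPLE_RATE_HZ = 60
BLINK_MAX_SAMPLES = int(0.4 * SAMPLE_RATE_HZ)  # assume blinks rarely exceed ~400 ms

@dataclass
class Sample:
    gaze: Optional[Tuple[float, float]]  # gaze position, or None when the tracker reports no gaze
    eye_in_view: bool                    # does the eye camera still contain the eye region?

def classify_gaps(samples: List[Sample]) -> List[Tuple[int, int, str]]:
    """Return (start_index, end_index, label) for each run of missing gaze samples."""
    sentinel = Sample(gaze=(0.0, 0.0), eye_in_view=True)  # closes a gap that runs to the end
    gaps, start = [], None
    for i, s in enumerate(samples + [sentinel]):
        if s.gaze is None and start is None:
            start = i
        elif s.gaze is not None and start is not None:
            run = samples[start:i]
            if all(x.eye_in_view for x in run) and len(run) <= BLINK_MAX_SAMPLES:
                label = "blink"
            elif not any(x.eye_in_view for x in run):
                label = "eye out of camera range"
            else:
                label = "check back-up video (possible sub-optimal settings)"
            gaps.append((start, i, label))
            start = None
    return gaps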
Preview of Thematic Collection Articles

The relative ease and power of today's automated corneal-reflection eye-trackers has enabled researchers to use eye movements as a powerful measure of infant cognitive and perceptual abilities. In each of the four articles that comprise this Thematic Collection, an automated corneal-reflection system was used to track infant eye movements and direction of gaze to either stationary or moving visual targets, with implications for a diverse set of research issues. Gredebäck and von Hofsten used predictive changes in gaze to assess 6- to 12-month-old infants' ability to represent the movement of an object behind an occluder. Johnson, Slemmer, and Amso recorded detailed fixations while 3-month-olds viewed rod-and-box displays in an object unity task. McMurray and Aslin used a two-choice spatial localization task to study the formation of categories and the generalization to novel stimuli, with anticipatory eye movements as the dependent measure of categorization. Finally, Hunnius and Geuze studied the changes in scanning of faces and non-face control stimuli between 1 and 6 months of age. In each of these studies of infant perception and cognition, eye movements and gaze direction were used as measures of processing. Without an automated corneal-reflection system, the questions under investigation in these studies could not have been addressed. Commentaries on these four target articles are provided by Haith, one of the pioneers in the use of corneal-reflection eye-trackers with young infants, and by Hayhoe, who employs head-mounted eye-tracking and virtual reality displays to study adult perception.
References

Aslin, R. N. (1981). Development of smooth pursuit in human infants. In D. F. Fisher, R. A. Monty, & J. W. Senders (Eds.), Eye movements: Cognition and visual perception. Hillsdale, NJ: Lawrence Erlbaum Associates.

Aslin, R. N. (1985). Oculomotor measures of visual development. In G. Gottlieb & N. Krasnegor (Eds.), Measurement of audition and vision during the first year of postnatal life: A methodological overview. Norwood, NJ: Ablex.

Aslin, R. N. (1987). Motor aspects of visual development in infancy. In P. Salapatek & L. B. Cohen (Eds.), Handbook of infant perception. New York: Academic Press.

Aslin, R. N., & Salapatek, P. (1975). Saccadic localization of visual targets by the very young human infant. Perception and Psychophysics, 17.

Bronson, G. W. (1982). The scanning patterns of human infants: Implications for visual learning. In L. P. Lipsitt (Ed.), Monographs on infancy. Norwood, NJ: Ablex.

Finocchio, D. V., Preston, K. L., & Fuchs, A. F. (1990). Obtaining a quantitative measure of eye movements in human infants: A method of calibrating the electrooculogram. Vision Research, 30.

Haith, M. M. (1966). The response of the human newborn to visual movement. Journal of Experimental Child Psychology, 3.

Haith, M. M. (1969). Infrared television recording and measurement of ocular behavior in the human infant. American Psychologist, 24.

Haith, M. M. (1980). Rules that babies look by: The organization of newborn visual activity. Hillsdale, NJ: Lawrence Erlbaum Associates.

Kowler, E. (Ed.) (1990). Eye movements and their role in visual and cognitive processes. Reviews of oculomotor research, Vol. 4. Amsterdam: Elsevier.

Maurer, D. (1975). Infant visual perception: Methods of study. In L. B. Cohen & P. Salapatek (Eds.), Infant perception: From sensation to cognition. Volume I: Basic visual processes. New York: Academic Press (pp. 1-76).

McGinnis, J. M. (1930). Eye movements and optic nystagmus in early infancy. Genetic Psychology Monographs, 8.

Müller, J. (1826). Beiträge zur vergleichenden Physiologie des Gesichtssinnes. [Cited in Boring, E. G. (1942). Sensation and perception in the history of experimental psychology. New York: D. Appleton-Century Co. (p. 230).]

Salapatek, P. (1968). Visual scanning of geometric figures by the human newborn. Journal of Comparative and Physiological Psychology, 66.

Salapatek, P., Haith, M. M., Maurer, D., & Kessen, W. (1972). Error in the corneal-reflection technique: A note on Slater and Findlay. Journal of Experimental Child Psychology, 14.

Salapatek, P., & Kessen, W. (1966). Visual scanning of triangles by the human newborn. Journal of Experimental Child Psychology, 3.

Teller, D. Y., & Lindsey, D. T. (1993). OKN nulling techniques and infant color vision. In C. Granrud (Ed.), Visual perception and cognition in infancy. Hillsdale, NJ: Lawrence Erlbaum Associates.
Figure 1: Components of the automated eye/head-tracking system used in our lab: an MHT transmitter and receiver, a remote eye tracker with an infrared video camera, the TV viewed by the baby (on the parent's lap), and the eye-tracker and MHT control units, which feed the eye-tracking and experimental control computers.
Figure 2: Panel 1: A screen used for coding infants' eye movements from the video record, which includes (a) output from a wide-angle, fixed video camera and (b) the output of the eye-tracker's scene camera, which shows the image the infant is watching with superimposed crosshairs (c) indicating gaze position. Panel 2: Alternatively, data can be coded from a screen showing the scene camera (d) and gaze position (e) along with the eye-tracker's eye camera (f), which shows a close-up of the infant's eye with the pupil and corneal reflection marked by crosshairs.