User Eye Fatigue Detection via Eye Movement Behavior




This is a pre-print.

Evgeniy Abdulin
Department of Computer Science, Texas State University
San Marcos, TX 78666
e_a146@txstate.edu

Oleg Komogortsev
Department of Computer Science, Texas State University
San Marcos, TX 78666
ok11@txstate.edu

Abstract
In this study we propose and evaluate a novel approach that allows detection of physical eye fatigue. The proposed approach is based on the analysis of recorded eye movements via what are called behavioral scores. These easy-to-compute scores can be obtained immediately after a calibration procedure, via processing of such basic eye movements as fixations and saccades extracted from the raw eye positional data recorded by an eye tracker. The results, based on data from 36 volunteers, indicate that one of the behavioral scores, the Fixation Qualitative Score, is more sensitive to the onset of eye fatigue than already established methods based on saccadic characteristics only.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author. Copyright is held by the owner/author(s).

Author Keywords
Eye tracking; eye movements; behavioral scores; eye fatigue; human factors.

ACM Classification Keywords
H.1.2 [Information Systems]: User/Machine Systems - Human information processing

Introduction
People who use electronic devices spend more and more time looking at displays [5]. Our eyes struggle to cope with such a high workload, which causes eye fatigue and Computer Vision Syndrome [4]. While it is impossible to completely avoid eye fatigue, we can aim to reduce it and thus potentially reduce the consequences it causes. Several factors influence the onset of eye fatigue. A poorly designed GUI can increase physical eye workload by using the wrong color scheme or a structure that makes a user conduct an excessive visual search, whereas a better-designed GUI requires fewer eye movements to complete a task [13].

Therefore, the ability to compute assessment metrics for fatigue detection in real time is very important, because such metrics will help develop better GUIs that reduce levels of physical eye fatigue. Eye movement driven metrics have the potential to aid eye fatigue assessment because of their high sensitivity and real-time computation capability. Contemporary eye trackers are user friendly and non-invasive [2]. In general, the cost and usability of eye trackers have recently improved considerably, with contemporary units available for around $100, e.g., [1]. The majority of human eye movements in HCI-related tasks are fixations and saccades [6]. Fixations are executed when a user is looking at a stationary target. Rapid, ballistic eye movements that move the eye from one fixation to another are called saccades. Our work employs scores that are based on fixations and saccades for the detection of physical eye fatigue.

Previous Fatigue-related Work
There is a substantial amount of work on detecting a user's eye fatigue, but the majority of it is saccade-based and considers characteristics such as amplitude, duration, and peak and mean velocity [3, 11, 12]. Only top-of-the-line, expensive eye trackers with sampling rates of 250 Hz and above can provide sufficiently accurate information for analyzing saccadic behavior, because saccades are very quick and short in duration: fixations are frequently longer than 200 ms, while the majority of saccades last less than 80 ms. Therefore, in addition to the saccade-based approach we propose a fixation-based approach that can potentially be employed on low-cost eye trackers with low sampling frequency, potentially working on mobile and wearable devices [7], while providing enough sensitivity to a user's eye fatigue to be used in real time. Ideally, the fatigue-related statistical sensitivity provided by such metrics would be higher than that of existing ones, and the metrics would be simple to compute. The Eye Movement Behavioral Scores described next aim to satisfy those requirements.

Eye Movement Behavioral Scores
Behavioral scores represent quantitative and qualitative characteristics of the eye movement behavior [9] recorded in response to a step stimulus, such as a jumping dot of light presented, for example, during an eye tracker's calibration. In general, the values of the behavioral scores can signal a) meaningfulness of eye movement classification, b) eye tracking quality, and c) normality of the captured eye movement signal. In this study, by controlling for a) and b), we investigate the ability to detect abnormality of the captured eye movement signal, which for the purposes of this study translates to eye fatigue detection.

Fixation Quantitative Score (FQnS) represents how many fixational samples were recorded near the current stimulus location. The FQnS, calculated by equation (1), also takes into account the latency of the user's reaction to the stimulus and considers only fixations in meaningful proximity to the presented stimulus. The FQnS is representative of both the number of exhibited fixations and their durations and has less variability than either of those metrics.

FQnS = 100 * (fixation_detection_counter / stimuli_fixation_points)    (1)

Fixation Qualitative Score (FQlS) indicates the spatial accuracy of the exhibited fixations in response to the presented stimulus and is calculated by equation (2). Intuitive considerations suggest that eye fatigue would cause a user to frequently miss a target on the initial fixation, and therefore some of the fixations would be located farther away from the target than during a fresh baseline. Part of our hypothesis is that the FQlS will increase when a user becomes fatigued.

FQlS = (1/N) * sum_{i=1..N} fixation_distance_i    (2)

Saccade Quantitative Score (SQnS) represents the amount of saccadic behavior captured in response to the presented stimuli and is calculated by equation (3). Our intuitive considerations regarding the change of the SQnS during eye fatigue are similar to those provided for the FQlS. When a user misses the target on the initial saccadic movement, the miss is subsequently corrected with an additional saccade. These additional corrective saccades, in cases when the eye overshoots or undershoots a target, lead to a larger amount of saccadic behavior when compared to a baseline. Therefore, our hypothesis is that the SQnS will increase when a user becomes fatigued.

SQnS = 100 * (total_detected_saccade_amplitude / total_stimuli_saccade_amplitude)    (3)

In addition to the behavioral scores, we consider metrics that are close to those investigated in previous research [3, 11, 12]: Average Number of Saccades (ANS), Average Saccade Duration (ASD), and Average Saccade Peak Velocity (ASPV).
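
As a concrete, illustrative sketch (ours, not the authors' implementation), the three behavioral scores of equations (1)-(3) can be computed from already-classified fixations and saccades roughly as follows. The array layouts, the 1-degree proximity check, and the omission of latency handling are simplifying assumptions.

```python
import numpy as np

def fqns(is_fixation_sample, gaze_deg, stimulus_deg, proximity_deg=1.0):
    """Fixation Quantitative Score, eq. (1): percentage of stimulus samples
    for which a fixation was detected near the stimulus position.
    is_fixation_sample : boolean array, one entry per gaze sample
    gaze_deg, stimulus_deg : (N, 2) arrays in degrees of visual angle
    Latency compensation is omitted in this sketch."""
    near_target = np.linalg.norm(gaze_deg - stimulus_deg, axis=1) <= proximity_deg
    fixation_detection_counter = np.sum(is_fixation_sample & near_target)
    stimuli_fixation_points = len(stimulus_deg)
    return 100.0 * fixation_detection_counter / stimuli_fixation_points

def fqls(fixation_centroids_deg, stimulus_targets_deg):
    """Fixation Qualitative Score, eq. (2): mean Euclidean distance (deg)
    between each detected fixation centroid and its stimulus target."""
    distances = np.linalg.norm(fixation_centroids_deg - stimulus_targets_deg, axis=1)
    return distances.mean()

def sqns(detected_saccade_amplitudes_deg, stimulus_jump_amplitudes_deg):
    """Saccade Quantitative Score, eq. (3): total detected saccade amplitude
    as a percentage of the total stimulus jump amplitude."""
    return 100.0 * (np.sum(detected_saccade_amplitudes_deg)
                    / np.sum(stimulus_jump_amplitudes_deg))
```

The point of the sketch is that all three scores reduce to sums and means over quantities that an eye movement classifier already produces, which is why they can be computed immediately after a calibration-like procedure.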

Experimental Methodology

Recording Equipment
The data were recorded using an EyeLink 1000 eye tracker with a sampling frequency of 1000 Hz. The recording was conducted on a tower mount with a chin and forehead rest to improve the accuracy and reduce the noise in the recorded data. The stimulus was presented on a 22-inch flat-panel wide-screen LCD display with a refresh rate of 60 Hz. The display measured 474 x 297 mm with a resolution of 1680 x 1050 pixels. The chinrest was located 550 mm from the display and was adjusted for each subject to ensure that primary eye position (eye staring straight ahead) was the same for all participants.

Recording Procedure & Stimulus
A horizontal step stimulus was selected for the experiment because the effects of fatigue upon saccadic eye movements, for example in myasthenia gravis, may be tested by asking the patient to repetitively refixate between two targets [10]. The stimulus was displayed on a black screen as a dot consisting of a white disc approximately 1º in size with a small black point in the center. The dot performed 100 horizontal jumps with an amplitude of 30º, corresponding to dot locations of +15º and -15º from the center of the display; each time, the dot remained stationary for 1 s before the next jump. This design follows the observation that after making only 30 saccades of 50º or 80 saccades of 30º, a subject can no longer produce normally shaped saccades, and all further saccades show symptoms of fatigue [3]. The initial dot location was the display's center. Each participant was instructed to follow the dot movements.

Table 1. Task-oriented questionnaire results, on a 7-pt. Likert scale.

Task          | S1, M(SD)   | S2, M(SD)
Horizontal    | 2.61 (1.40) | 2.50 (1.38)
Movie clip 1  | 1.25 (0.69) | 1.22 (0.42)
Fixational    | 1.58 (0.69) | 1.63 (0.79)
Random        | 2.56 (1.23) | 2.83 (1.38)
Reading       | 1.85 (0.87) | 1.97 (1.03)
Game          | 2.03 (1.08) | 2.16 (1.30)
Movie clip 2  | 1.13 (0.42) | 1.31 (0.67)

Figure 1. Average Saccade Duration.
Figure 2. Average Saccade Peak Velocity.

The presentation of this stimulus was part of a larger ocular biometrics study that consisted of two recording sessions, S1 and S2, conducted 10 minutes apart, with the total duration of the experiment not exceeding 1 hour. Each session consisted of a pure fixational stimulus, the horizontal stimulus (analyzed here), movie clip 1, random saccades, text reading, a gaze-controlled computer game, and movie clip 2. Participants filled out several questionnaires prior to the recordings, between the two sessions, and after all recordings. There were two 7-pt. Likert scale questionnaires related to the horizontal stimulus task: a state-oriented one ("What is your eye fatigue?") before the task and a task-oriented one ("How hard was the task on your eyes?") after the task. We believe that such a recording procedure allows monitoring of eye fatigue onset within a single repetitive task, such as the horizontal saccades stimulus, and also monitoring of eye fatigue related changes between sessions, assuming that during the S2 recordings participants' eyes were generally more fatigued than during S1.

Participants, Eye Movement Classification & Quality of the Recorded Data
A total of 36 participants (19 males / 17 females), ages 19-31 years with an average age of 21.8 (SD = 3.2), volunteered for the project. Verified mean positional accuracy of the recordings, averaged over all screen regions, was 0.88º (SD = 0.27º). Average recorded data validity was 98.96% (SD = 1.02%). The collected eye movement data were classified by an I-VT algorithm [8] with a velocity separation threshold of 70º/s [9]. Both the positional accuracy and the data validity numbers indicate that the quality of the captured data was high. Eye movement classification via the I-VT algorithm, based on the results of the previous study, together with the high quality of the recorded data, indicates that changes in the behavioral scores would represent changes in eye movement behavior, e.g., fatigue onset, rather than failures of eye movement classification and/or recording equipment. After fixations and saccades were classified with I-VT, all records were manually examined to ensure that they represent valid saccade velocity profiles. Saccades that contained blinks were removed from the final analysis. Blinks were detected by the mechanism specified by Bahill and Kallman [2]. Saccade velocity was computed by equation (1) of [2] to reduce the impact of noise and variability present in the signal.
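
For illustration, a generic velocity-threshold (I-VT) classification of the kind referenced above can be sketched as follows. This is not the specific tool of [8]; the two-point velocity estimate is a simplification of the filtered velocity computation of [2], and only the 70º/s threshold is taken from the text.

```python
import numpy as np

def ivt_classify(x_deg, y_deg, sampling_rate_hz=1000.0, velocity_threshold=70.0):
    """Label each gaze sample as 'fixation' or 'saccade' with a simple I-VT rule.

    x_deg, y_deg       : gaze position in degrees of visual angle
    sampling_rate_hz   : 1000 Hz for the EyeLink 1000 used in this study
    velocity_threshold : degrees per second
    """
    # Simple numerical derivative; a dedicated velocity filter would be used in practice.
    vx = np.gradient(np.asarray(x_deg, dtype=float)) * sampling_rate_hz
    vy = np.gradient(np.asarray(y_deg, dtype=float)) * sampling_rate_hz
    speed = np.hypot(vx, vy)
    return np.where(speed >= velocity_threshold, "saccade", "fixation")
```
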
Data Partitioning & Analysis Methods
The recordings for each session were broken into 10 groups, marked G1 to G10, in recording sessions S1 and S2, with 10 stimulus dot jumps per group and the corresponding recorded eye movement signal. We hypothesized that such data separation would allow tracking of the onset of fatigue by the metrics described earlier. To examine statistical differences among the scores computed between groups within each session, and for the averaged scores between the two whole sessions, we employed a General Linear Model repeated-measures ANOVA, because each participant was recorded for all factor levels, i.e., the partition groups in our case. For results that involve the comparison between all 10 groups, a Bonferroni correction was applied, which reduced the significance level from 0.05 to 0.005. All factors were within-subject; we did not have between-subject factors.
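
A minimal sketch of this analysis, assuming a long-format table with hypothetical column names ('subject', 'group', and the score of interest) and using the repeated-measures ANOVA available in statsmodels, might look like the following; it is illustrative only, not the authors' analysis code.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

def rm_anova(df: pd.DataFrame, score: str = "FQlS"):
    """One-way repeated-measures ANOVA of a score over groups G1..G10.
    'subject' and 'group' are assumed column names in the long-format frame."""
    result = AnovaRM(data=df, depvar=score, subject="subject", within=["group"]).fit()
    return result.anova_table

# Bonferroni correction for the 10-group comparisons:
# the per-test significance level drops from 0.05 to 0.05 / 10 = 0.005.
BONFERRONI_ALPHA = 0.05 / 10
```
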

Table 2. Repeated-measures ANOVA results for the tested scenarios.

Metric | Session 1 (10 groups, 10 saccades each) | Session 2 (10 groups, 10 saccades each) | Session 1 vs Session 2 (100 saccades each)
FQnS   | F(1,35)=1.378, p=0.233   | F(1,35)=2.246, p=0.047   | F(1,35)=0.412, p=0.525
FQlS   | F(1,35)=7.956, p<0.0005  | F(1,35)=8.760, p<0.0005  | F(1,35)=0.005, p=0.943
SQnS   | F(1,35)=1.068, p=0.376   | F(1,35)=1.2966, p=0.112  | F(1,35)=0.320, p=0.575
ASD    | F(1,35)=2.733, p=0.029   | F(1,35)=3.652, p<0.005   | F(1,35)=6.374, p<0.05
ASPV   | F(1,35)=0.777, p=0.579   | F(1,35)=1.596, p=0.154   | F(1,35)=16.340, p<0.001
ANS    | F(1,35)=1.166, p=0.326   | F(1,35)=2.427, p=0.046   | F(1,35)=0.441, p=0.511

Figure 3. Regression for Fixation Qualitative Score, Session 1.
Figure 4. Regression for Fixation Qualitative Score, Session 2.

Results
Results of the subjective study, presented in Table 1, confirm that the horizontal stimulus induced the second-highest workload on users' eyes among the tasks conducted. Results of the state-oriented question "What is your eye fatigue?" are, M(SD): 2.86 (1.31) before S1 and 3.67 (1.22) before S2. As shown in Table 2, significant effects were found for ASD and ASPV, which are used in existing methods: ASPV significantly decreases in the S1-S2 (between-sessions) comparison, and ASD significantly increases for S1-S2 and within S2 (Fig. 1, 2). Only one of the behavioral scores, the FQlS, showed strong effects for the S1 and S2 (within-session) scenarios, but not for S1-S2. No other effects were found, although the FQnS and ANS had relatively low p-values for the S2 scenario; we hypothesize that if the participants' task had lasted longer, these effects would also have been found. In addition, the linear regressions for the FQlS have high R-squared values for S1 and S2 and approximately equal slope and intercept for both sessions, which makes the FQlS potentially usable as an empirical formulation for evaluating a user's eye fatigue; this is shown in Fig. 3 and 4.

Discussion
The results indicate that the FQnS and the SQnS showed low sensitivity to eye fatigue. We hypothesize that this is due to the relatively stable user reaction time and total exhibited saccade amplitude over the duration of each recorded task. Our results support the sensitivity to fatigue of the saccade-related metrics that we adopted from previous research. The main contribution of our work is the finding that the FQlS, which is very simple and fast to compute, is able to indicate the progression of fatigue following the progression of a fatigue-inducing task. For example, Fig. 3 and 4 show how the FQlS increases over time and also gets refreshed between the sessions, when the subjects get some rest. No other metric showed similar dynamics of fatigue progression. It should be noted that metrics from previous research can detect a change in fatigue over a large amount of eye movements (i.e., the difference between whole recording sessions), but the proposed fixation-based metric can detect the increase in fatigue for smaller groups of eye movements, thus potentially making it more usable for detecting user eye fatigue over short periods of time and for monitoring fatigue progression over time.
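
As an illustration of the empirical FQlS-over-time formulation mentioned in the Results, the per-group trend could be fit with an ordinary least-squares regression as in the following sketch; the data layout is assumed, and no fitted coefficients from the paper are reproduced here.

```python
import numpy as np

def fqls_trend(group_index, fqls_values):
    """Least-squares fit of FQlS against group index (G1..G10) within one
    session, returning slope, intercept, and R-squared of the fit."""
    x = np.asarray(group_index, dtype=float)
    y = np.asarray(fqls_values, dtype=float)
    slope, intercept = np.polyfit(x, y, deg=1)
    predicted = slope * x + intercept
    ss_res = np.sum((y - predicted) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot
    return slope, intercept, r_squared
```
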

Conclusion and Future Work
We conducted a study to investigate the eye fatigue detection capabilities of eye movement based metrics called behavioral scores, which are simpler to derive and potentially more robust than metrics described in previous research. Our major finding is that one of those scores, the Fixation Qualitative Score, has better sensitivity to eye fatigue than existing saccade-based metrics in the short term, while task execution is in progress. Moreover, we hypothesize that the FQlS, as a fixational metric, can be effective when derived from the signal of inexpensive eye trackers, therefore making it potentially applicable on mobile and wearable devices with eye tracking capabilities. Future work will concentrate on assessing whether the achieved sensitivity to fatigue can be maintained on low-cost, low-sampling-frequency eye tracking equipment. We will also search for other objective metrics with high sensitivity to eye fatigue and integrate them, together with the FQlS and the best of the saccade-based metrics, into an application for mobile/wearable devices for next-generation GUIs that are sensitive and adjustable to different levels of user eye fatigue.

Acknowledgements
This research was supported by NSF CAREER Grant #CNS-1250718 and NIST Grants #60NANB10D213 and #60NANB12D234s.

References
[1] The Eye Tribe Eye Tracker, 2013. Available: https://theeyetribe.com/blog/
[2] A. T. Bahill and J. S. Kallman, Predicting final eye position halfway through a saccade, IEEE Transactions on Biomedical Engineering, 12 (1983), 781-786.
[3] A. T. Bahill and L. Stark, Overlapping saccades and glissades are produced by fatigue in the saccadic eye movement system, Experimental Neurology 48 (1975), 95-106.
[4] J. Bali, et al., Computer vision syndrome: A review, Journal of Clinical Ophthalmology and Research 2, 1 (2014), 61.
[5] A. Chang, Digital-media use to average 15.5 hours a day by 2015, study predicts, Los Angeles Times, 2013.
[6] A. Duchowski, Eye Tracking Methodology: Theory and Practice, 2nd ed., Springer, 2007.
[7] C. Holland, et al., Usability evaluation of eye tracking on an unmodified common tablet, in CHI '13 Extended Abstracts on Human Factors in Computing Systems, 295-300.
[8] O. V. Komogortsev, Eye Movement Classification Software, June 21, 2011. Available: http://www.cs.txstate.edu/~ok11/emd_offline.html
[9] O. V. Komogortsev, et al., Standardization of Automated Analyses of Oculomotor Fixation and Saccadic Behaviors, IEEE Transactions on Biomedical Engineering 57, 11 (2010), 2635-2645.
[10] R. J. Leigh and D. S. Zee, The Neurology of Eye Movements, vol. 90, Oxford University Press, New York, 1999.
[11] T. Megaw and T. Sen, Visual fatigue and saccadic eye movement parameters, in Proceedings of the 27th Annual Meeting of the Human Factors Society, 728-732.
[12] L. L. D. Stasi, et al., Saccadic peak velocity as an alternative index of operator attention: A short review, European Review of Applied Psychology 63, 6 (2013), 335-343.
[13] D. Tamir, et al., Detecting Software Usability Deficiencies Through Pinpoint Analysis, International Journal On Advances in Software 7, 1-2 (2014), 44-62.