A Guided User Experience Using Subtle Gaze Direction
Eli Ben-Joseph and Eric Greenstein
Stanford University
{ebj,

Abstract
This paper demonstrates how illumination modulation can guide a user's gaze on a display without the user knowing. The technique takes advantage of eye biology, specifically the difference in flicker fusion rate between the fovea and the periphery: flickering in the periphery of a viewer's attention attracts their gaze, yet is undetectable in their foveal vision. Applications for this technology include interactive media and gaming, visual training programs in medicine and defense, and advertising.

I. INTRODUCTION
In static images, human gaze is guided by a variety of factors, including image contrast, the presence of faces and other informative regions, and high edge density [1], [2], [3]. Some groups have altered images to attract a viewer's gaze toward regions of interest by adding contrast or non-realistic features [4], [5]. Digital screens that display dynamic images, however, offer new possibilities. Previous research has shown that illumination modulation can direct a user's gaze around a digital image [6]. That technique used active eye tracking and terminated the modulation before the user could focus on the modulating object. By taking advantage of eye biology, a similar type of illumination modulation can direct a user's gaze without the modulation being noticeable in foveal vision. Studies have shown that the flicker fusion threshold differs between foveal and peripheral vision [7]. Therefore, for images flashing at certain frequencies, the modulation is apparent when the object is in the periphery, but the images fuse together when viewed directly.

The applications of technology that can subtly direct gaze are varied. It would be valuable for visual training programs in medicine and defense, since experts show different eye movement patterns than novices; a trained radiologist, for example, scans an MRI image differently than an apprentice doctor. It would also enable more interactive media and gaming experiences. Overall, this research augments our knowledge of human perception.

Figure 1. Distribution of red, green, and blue cones in the fovea. Courtesy of [8].

II. RELATED WORK
Eye biology and human visual perception have been studied extensively. The eye contains two types of photoreceptors: rods and cones. Cones are concentrated at the center of gaze and are responsible for color sensitivity. The three types of cones (red, green, and blue) are asymmetrically distributed: red and green cones are more numerous, while blue cones are sparser and located away from the center of gaze. Figure 1 shows how the different cone types are distributed across the retina. Rods are concentrated in the periphery of one's gaze and are extremely sensitive to illumination and motion. Figure 2 plots the distribution of cone and rod cells as a function of the angle of gaze relative to center.

In 2009, Bailey et al. introduced a novel technique to direct a viewer's gaze about an image [6]. They used subtle modulation to attract the attention of a viewer, noting that peripheral vision responds to stimuli faster than foveal vision. By modulating regions of the scene in the periphery of a viewer's vision, they caused the viewer's eyes to saccade to that region. Luminance modulation and warm-cool modulation were chosen, as the human visual system is very sensitive to these changes [10].
A few groups have applied this method for medical training and visual searches [11], [12].
Figure 2. Distribution of rods and cones in the retina. Courtesy of [9].

While this technique was successful, it used active eye tracking to detect when a viewer's eye was moving towards the modulating object and then stopped the modulation. This limits potential applications, as it requires an eye tracker to be present. The method employed in our experiment instead takes advantage of differences in the flicker fusion threshold between the periphery and the fovea, which were first investigated in the 1930s. Humans notice flickering images at low frequencies, but there is a threshold frequency at which the flickering images fuse into one. This critical fusion frequency depends on the luminance of the object (the Ferry-Porter law) and on its size (the Granit-Harper law). Granit and Harper also demonstrated that the flicker fusion frequency depends on where the object is located in the visual field [7]. For large objects, the critical fusion frequency is higher if the object is in the periphery of a viewer's gaze than if it is centered in the viewer's fovea, as shown in Figure 3.

Figure 3. Critical fusion frequency vs. luminance. Adapted from Granit and Harper.
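For reference, both laws are commonly summarized as linear relations in the logarithm of luminance and stimulus area, respectively. The coefficients below are empirical fitting constants that vary with retinal eccentricity; they are not values measured or reported in this paper:

```latex
% Commonly cited forms of the two flicker-fusion laws
% (a, b, c, d are empirical constants, not values from this work).
\begin{align}
  \mathrm{CFF} &= a \,\log_{10} L + b \quad \text{(Ferry-Porter law, luminance } L\text{)} \\
  \mathrm{CFF} &= c \,\log_{10} A + d \quad \text{(Granit-Harper law, stimulus area } A\text{)}
\end{align}
```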
III. METHODS

A. Setup
OpenGL was used to render images that were then displayed on a monitor operating at 60 Hz with a resolution of 1920x1080 pixels. The images shown were 1024x768 pixels. An eye tracker manufactured by The Eye Tribe was used to track the gaze of test subjects at 30 Hz. A chin rest kept the viewers' heads steady at a fixed distance from the screen (approximately 2 feet). The experiments were all performed during the day but with the blinds closed, so the room brightness was kept roughly constant.

B. Subjects
Five male participants between the ages of 22 and 26 took part in this study. All participants had normal or corrected-to-normal vision and completed the following tests.

C. Modulation Intensity, Frequency, and Color
Modulation intensity, frequency, and color are three parameters that drastically alter the effect of the gaze guiding. The teapots were chosen to be blue, as people tend to have fewer blue cones than red or green cones, and blue cones are concentrated away from the gaze center. According to the research on critical fusion frequency and our estimates of object size and luminance, frequencies around 30 Hz were likely to be optimal (above the flicker fusion threshold in the fovea but below it in the periphery). Given that the monitor refreshed at 60 Hz, 30 Hz was also the maximum frequency that could be displayed, so 30 Hz was chosen as the modulation frequency. Modulation intensity is another critical parameter: if it is too high, the modulation is noticeable in foveal vision, but if it is too low, the effect may not be apparent in the periphery. Additionally, modulation intensity changes the average luminance of the object, which in turn affects the flicker fusion frequency. The modulation intensity was set to 30% (defined as high value/low value - 1), a value chosen based on our own perception. We also tested different levels of modulation intensity with our subjects at the end of the experiment to see at what point flickering became noticeable in their foveal vision. (A short sketch of this modulation schedule appears below, after the description of the gaze direction test.)

D. Gaze Direction Test
Figure 4 shows the image displayed to subjects. First, the subjects were shown an image without illumination modulation as a control experiment, and their eye movements were tracked for approximately 10 seconds. Next, the subjects were shown our gaze-directing test. Again, the image showed six blue teapots, but this time one of the six teapots was modulating: first the top left teapot, then the top middle teapot, and then the bottom right teapot. Figure 5 shows the intended path of a user's gaze during the test experiment. The test also lasted 10 seconds, with each of the three teapots modulating for one third of the total time.

Figure 4. Image of six identical teapots displayed to subjects.

Figure 5. Intended path of a user's gaze during the test experiment.
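The following is a minimal sketch of the modulation schedule described in Section III-C, assuming the simplest implementation of a 30 Hz flicker on a 60 Hz display (alternating one bright and one dark frame). It is not the authors' OpenGL implementation; the names and the base luminance value are ours:

```python
# Sketch of the 30 Hz luminance modulation on a 60 Hz display.
# A 30% modulation intensity means the bright value is 1.3x the dark value
# (high/low - 1 = 0.3), per the definition in the paper.

REFRESH_HZ = 60        # monitor refresh rate reported in the paper
MODULATION_HZ = 30     # chosen modulation frequency
INTENSITY = 0.30       # modulation intensity, defined as high/low - 1

FRAMES_PER_HALF_CYCLE = REFRESH_HZ // (2 * MODULATION_HZ)  # 1 bright frame, 1 dark frame

def luminance_scale(frame_index: int, low: float = 1.0) -> float:
    """Multiplier applied to the modulated teapot's color on this frame.

    'low' is the dark value; bright frames use low * (1 + INTENSITY),
    so the ratio high/low - 1 equals the intensity used in the experiment.
    """
    half_cycle = (frame_index // FRAMES_PER_HALF_CYCLE) % 2
    return low * (1.0 + INTENSITY) if half_cycle == 0 else low

if __name__ == "__main__":
    # First six frames of the 60 Hz render loop: 1.3, 1.0, 1.3, 1.0, 1.3, 1.0
    print([round(luminance_scale(i), 2) for i in range(6)])
```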
Figure 6. Example of how histograms are computed for the data.

E. Analysis
The data from the eye tracker was parsed using a Python script and then analyzed in MATLAB. A viewer's gaze was taken to be the average of the X and Y coordinates of the left and right eyes. Points where the eye tracker was not able to compute an eye position (for example, due to the subject blinking or looking too far away from the screen) were excluded from the analysis. One primary tool used to investigate the results was video renderings of the data, which showed how long subjects spent looking at each teapot as well as their general eye movement patterns throughout the test. Additionally, histograms were used to compare the test and control experiments. To compute the histogram, six bins were defined between the teapots as shown in Figure 6, and the number of gaze points in each section was counted. The test vs. control ratio for each section demonstrates how the illumination modulation affects the gaze direction of the user.
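Below is a minimal sketch of the gaze preprocessing and binning described above. It is not the authors' actual script; the sample format, field names, and the assumed 3x2 grid layout of the six sections are our own assumptions for illustration:

```python
# Sketch of the analysis in Section III-E: average the two eyes, drop
# invalid samples, and histogram gaze points into six sections.

IMAGE_W, IMAGE_H = 1024, 768   # stimulus size from the paper
COLS, ROWS = 3, 2              # assumed 3x2 grid of sections around the six teapots

def average_gaze(sample):
    """Average the left- and right-eye coordinates into one gaze point.
    Returns None if either eye is missing (e.g. a blink), so the sample
    is dropped from the analysis."""
    left, right = sample.get("left"), sample.get("right")
    if left is None or right is None:
        return None
    return ((left[0] + right[0]) / 2.0, (left[1] + right[1]) / 2.0)

def section_of(point):
    """Map a gaze point to one of the six sections (1..6), or None if off-image."""
    x, y = point
    if not (0 <= x < IMAGE_W and 0 <= y < IMAGE_H):
        return None
    col = int(x * COLS / IMAGE_W)
    row = int(y * ROWS / IMAGE_H)
    return row * COLS + col + 1

def section_counts(samples):
    """Histogram of valid gaze points over the six sections."""
    counts = {s: 0 for s in range(1, COLS * ROWS + 1)}
    for sample in samples:
        point = average_gaze(sample)
        if point is None:
            continue
        section = section_of(point)
        if section is not None:
            counts[section] += 1
    return counts

if __name__ == "__main__":
    demo = [{"left": (100, 100), "right": (110, 104)},   # valid sample -> section 1
            {"left": None, "right": (500, 400)}]          # dropped (missing left eye)
    print(section_counts(demo))
```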
IV. RESULTS
Eye-tracking data was gathered and analyzed for the five subjects. Figure 7 shows an example of the data from one subject. In the control experiment, the subject mainly focused on the top right teapot and spent little time observing the other teapots. In the test experiment, the subject's gaze concentrated on the top left, top middle, and bottom right teapots, which is where his gaze was directed. Figure 8 shows how his gaze was directed from the top left teapot, to the top middle teapot, to the bottom right teapot.

Figure 7. Eye tracking data from one subject. Control (top) and test (bottom).

Figure 8. Eye tracking data for a subject. Test data after the top left teapot modulates (top), after the top middle teapot modulates (middle), and after the bottom teapot modulates (bottom). Note that the data accumulates over the three images.

An initial analysis was conducted on the test results alone (i.e., where the teapots had modulation), counting the number of data points in each of the six sections. This heat map analysis showed that during the test, users tended to look at the sections where the teapots were modulating (sections 1, 2, and 6). Compiled across all five users, the test-only counts were 223, 470, and 342 for sections 1, 2, and 6, respectively, while the other three sections (3, 4, and 5) had counts of 272, 79, and 190, respectively. This indicates that users tended to focus most on the top-middle teapot, which may have to do with its central location on the screen.

The second analysis computed ratios of test vs. control on the counts of data points in each of the six sections. Aggregated across all five users, sections 1, 2, and 6 show ratios of 2.28, 2.16, and 2.42, respectively, whereas sections 3, 4, and 5 show ratios of 0.85, 0.91, and 1.02, respectively. This indicates that users were at least two times more likely to look at the teapots of interest compared to the other teapots, relative to control. Figure 9 shows the heat map of the test vs. control ratios.

Figure 9. Heat map of test vs. control ratios.

After running the experiment, users were asked if they noticed any differences between the test and control experiments. Most users responded that they did not notice any difference, or that they noticed slight flickering but were not sure about it. This is exactly what was intended: subtle gaze direction. Additionally, users were given a flicker sensitivity test, in which a single teapot was shown with increasing differences in modulation intensity. The results, shown in Table I, demonstrate that most users began to detect flickering (when looking directly at the image) when the difference in modulation intensity was 30%-40%. This confirms that the modulation intensity chosen was fairly optimal.

Table I. Modulation intensity difference at which users noticed the modulation in their foveal vision.
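The test vs. control ratios reported above follow directly from per-section counts; a trivial sketch, assuming the counts come from the section_counts helper sketched earlier (our naming, not the authors' MATLAB code):

```python
def test_control_ratios(test_counts, control_counts):
    """Per-section ratio of test to control gaze counts (the quantity shown
    in the Figure 9 heat map). Sections with no control samples are skipped."""
    return {
        section: test_counts[section] / control_counts[section]
        for section in test_counts
        if control_counts.get(section, 0) > 0
    }
```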
V. CONCLUSION
The results of the experiment validated the initial hypothesis that flickering visible only in the periphery of a user's field of view can subtly guide their gaze to an area of interest. Test vs. control ratios indicate that users were over two times more likely to look at the desired teapots than at other teapots, without consciously noticing that they were doing so. Users subjectively reported that they did not notice any flickering, and the flicker sensitivity test confirmed that the modulation used during the experiment would not be directly visible in their foveal vision. These results are a promising first step towards subtle gaze direction without the need for an active eye tracking system. We look forward to applying these techniques to more complicated images, such as art, web pages, and medical images.

VI. FUTURE WORK
Though this experiment was a successful proof of concept, it has opened up many avenues for improvement and additional research. First, a more accurate, high-frequency eye tracker should be used in future experiments. While the eye tracker used was suitable for a proof of concept, it was prone to being slightly inaccurate at times and required adjustments, which add a layer of human error that should be removed.

Second, more experimentation should be done on colors, sizes, and modulation intensity. Testing how changes in illumination modulation influence guidance effectiveness is a logical next step. Investigating how the size of the flickering object affects the result, specifically how small it can be and still induce the desired effect, would be another interesting study. If an image has multiple colors, can flickering only in the blue channel still guide a user's gaze, or do other colors work better? These are all questions that should be answered in future research.

Third, the applicability of the technology must be examined. Can a user's gaze be guided across a work of art or an MRI image? Investigating how this technique can create different experiences for users looking at web-based media would be interesting. Additionally, we would like to investigate how illumination modulation affects the perceived quality of images. These avenues also need to be explored to better understand where this exciting technology can be applied.

ACKNOWLEDGMENTS
We would like to thank Tom Malzbender for his guidance during the project. We would also like to thank Professor Gordon Wetzstein and Isaac Kauvar for their hard work and help. We really enjoyed the course!

REFERENCES
[1] Mannan, Sabira K., Keith H. Ruddock, and David S. Wooding. "The relationship between the locations of spatial features and those of fixations made during visual examination of briefly presented images." Spatial Vision 10.3 (1996).
[2] Parkhurst, Derrick J., and Ernst Niebur. "Scene content selected by active vision." Spatial Vision 16.2 (2003).
[3] Mackworth, Norman H., and Anthony J. Morandi. "The gaze selects informative details within pictures." Perception & Psychophysics 2.11 (1967).
[4] Lu, Weiquan, B-LH Duh, and Steven Feiner. "Subtle cueing for visual search in augmented reality." Mixed and Augmented Reality (ISMAR), 2012 IEEE International Symposium on. IEEE.
[5] Rensink, Ronald A. "The management of visual attention in graphic displays." Human Attention in Digital Environments (2011): 63.
[6] Bailey, Reynold, et al. "Subtle gaze direction." ACM Transactions on Graphics (TOG) 28.4 (2009): 100.
[7] Granit, Ragnar, and Phyllis Harper. "Comparative studies on the peripheral and central retina." American Journal of Physiology, Legacy Content 95.1 (1930).
[8]
[9] Osterberg, Gustav. Topography of the Layer of Rods and Cones in the Human Retina. Nyt Nordisk Forlag.
[10] Spillmann, Lothar, and John S. Werner, eds. Visual Perception: The Neurophysiological Foundations. Elsevier.
[11] Sridharan, Srinivas, et al. "Subtle gaze manipulation for improved mammography training." Proceedings of the Symposium on Eye Tracking Research and Applications. ACM.
[12] McNamara, Ann, Reynold Bailey, and Cindy Grimm. "Improving search task performance using subtle gaze direction." Proceedings of the 5th Symposium on Applied Perception in Graphics and Visualization. ACM.