Real-Time System for Monitoring Driver Vigilance
2004 IEEE Intelligent Vehicles Symposium, University of Parma, Parma, Italy, June 14-17, 2004

Real-Time System for Monitoring Driver Vigilance

Luis M. Bergasa, Jesus Nuevo, Miguel A. Sotelo
Departamento de Electrónica, Universidad de Alcalá, Alcalá de Henares, Madrid, Spain
Manuel Vázquez
Dpto. Sistemas Electrónicos y de Control, Universidad Politécnica de Madrid, Madrid, Spain
[email protected]

Abstract

In this paper we present a non-intrusive prototype computer vision system for real-time monitoring of a driver's vigilance. It is based on a hardware system for real-time acquisition of the driver's images using an active IR illuminator, and its software implementation for monitoring some visual behaviours that characterize a driver's level of vigilance: eyelid movements and face pose. The system has been tested with different sequences recorded in night and day driving conditions on a motorway and with different users. We show some experimental results and some conclusions about the performance of the system.

1. Introduction

The increasing number of traffic accidents in Europe due to a diminished driver vigilance level has become a serious problem for society. Statistics show that between 10% and 20% of all traffic accidents are due to drivers with a diminished vigilance level. In the trucking industry, about 60% of fatal truck accidents are caused by driver fatigue [1]. For this reason, developing systems for monitoring a driver's level of vigilance and alerting the driver when he is not paying adequate attention to the road is essential to prevent accidents.

In the last few years many researchers have been working on the development of these kinds of systems using different techniques. The most accurate techniques are based on physiological measures like brain waves, heart rate, pulse rate, respiration, etc. These techniques are intrusive, since they need electrodes to be attached to the driver, causing annoyance. Some representative projects in this line are the MIT Smart Car [2], and the ASV (Advanced Safety Vehicle) project performed by Toyota, Nissan and Honda [3]. Other techniques monitor eye and gaze movement using a helmet or special contact lenses. These, though less intrusive, are still not acceptable in practice [4].

A driver's state of vigilance can also be characterized by indirect vehicle behaviours like lateral position, steering wheel movements, time to line crossing, etc. Although these techniques are not intrusive, they are subject to several limitations such as the vehicle type, driver experience, geometric characteristics and state of the road, etc. Moreover, they take too much time to analyse the driver's behaviour and therefore do not work with the so-called micro-sleeps [5]. In this line we can find an important Spanish project called TCD (Tech CO Driver) [6] and the Mitsubishi advanced safety vehicle system [3].

People in fatigue show some visual behaviours easily observable from changes in their facial features like the eyes, head and face. Computer vision can therefore be a natural and non-intrusive technique to monitor a driver's vigilance. Many image-based driver alertness systems using this technique have been reported in the literature. Several of them have worked on head tracking. In [7], [8] two methods are presented based on 3D stereo matching. In [9] we can find a system based on template matching with one camera. All these systems rely on manual initialization.
A successful head/eye monitoring and tracking system for detecting driver drowsiness, using one camera and based on colour predicates, is presented in [10]. The systems described above are based on passive vision techniques, and their operation can be problematic in poor or very bright lighting conditions. Moreover, they do not work at night, when monitoring is most important. In order to work at night, some researchers use active illumination based on infrared LEDs [11], [12]. Almost all the active systems reported in the literature have been tested in laboratories but not in real moving vehicles. A moving vehicle presents new challenges, like variable lighting, changing backgrounds and vibrations, that must be borne in mind in real systems.
In [13] an industrial prototype called Copilot is presented. This system has been tested with truck drivers. It uses a simple subtraction process for finding the eyes and it only calculates PERCLOS (percentage of eye closure) in order to measure driver drowsiness. Systems relying on a single visual cue may encounter difficulty when the required visual features cannot be acquired accurately or reliably, as occurs in real conditions. Moreover, a single visual cue is not always indicative of one's mental condition [12]. The use of multiple visual cues reduces the uncertainty and the ambiguity present in the information from a single source. In this line, an ambitious European project called AWAKE is currently under development [1]. At the moment only some partial results have been presented.

This paper describes a real-time prototype computer vision system for monitoring driver vigilance using infrared illumination. As justified above, we consider that using active illumination is the best option, because our goal is to monitor the driver 24 hours a day in real conditions (moving vehicle) and in a very robust and accurate way. In contrast to previous systems, we propose to monitor several visual behaviours that typically characterize a person's level of alertness while driving. Moreover, we have tested our system in a car moving on a motorway and with different users.

The rest of the paper is structured as follows. In section 2 we present the general system architecture, explaining its main parts. Experimental results are presented in section 3. Finally, in section 4 we present the conclusions and future work.

2. System Architecture

The general architecture of our system is shown in figure 1. It consists of four major parts: 1) image acquisition, 2) pupil detection and tracking, 3) visual behaviours and 4) driver vigilance. As the image acquisition system we use a CCD micro-camera, a commercial frame-grabber and an IR illuminator controlled by the acquisition system. The pupil detection and tracking stage starts with pupil detection based on the bright pupil effect, similar to the red-eye effect in photography. Then we use two Kalman filters in order to track the pupils robustly in real time. In the visual behaviours stage we use visual cues such as pupil movements and face pose in order to detect some visual behaviours easily observable in people in fatigue, such as slow eyelid movement, smaller degree of eye openness, frequent nodding, blink frequency, face pose, etc. Finally, in the driver vigilance stage we fuse all the individual behaviours obtained in the previous stage using a fuzzy system. This way we can conclude whether the driver is fatigued and activate a warning.

Figure 1. General architecture

2.1. Image Acquisition System

The purpose of this stage is to acquire images of the driver's face in real time. The acquired images should be relatively invariant to lighting conditions (day and night) and should facilitate eye detection and tracking. The use of an infrared (IR) illuminator serves these goals. First, it minimizes the impact of changes in the ambient light, and second, it produces the bright pupil effect, which constitutes the foundation of our detection and tracking system. A bright pupil can be obtained if the eyes are illuminated with an IR illuminator beaming light along the camera optical axis.
At the IR wavelength, pupils reflect almost all the IR light they receive back along the path to the camera, so a bright pupil effect is produced in the image. If illuminated off the camera optical axis, the pupils appear dark, since the reflected light does not enter the camera lens. An example of the bright/dark pupil effect can be seen in figure 3. This pupil effect is clear with and without glasses or contact lenses, and it even works to a certain degree with sunglasses.

Figure 2 shows the image acquisition system configuration. It is composed of a miniature CCD camera, sensitive to the near-infrared, located on the dashboard of the vehicle.
This camera focuses on the driver's head for detecting the multiple visual cues. The IR illuminator is composed of two sets of IR LEDs distributed symmetrically along two rings, as shown in figure 2. An embedded PC with a low-cost frame-grabber is used for video signal acquisition and signal processing.

Figure 2. Block diagram of the prototype

The size of the rings has been calculated empirically in order to obtain a dark pupil image if the outer ring is turned on and a bright pupil image if the inner ring is turned on. The inner ring configuration produces the bright pupil effect because the centre of the ring coincides with the camera optical axis, acting as if it were a single LED located along the optical axis of the lens. To minimize interference from light sources other than the IR light emitted by our LEDs, a narrow bandpass filter has been attached between the CCD and the lens.

2.2. Pupil Detection and Tracking

This stage starts with pupil detection. We have designed specific hardware that detects, from each interlaced image frame (camera output), the even and odd field signal starts, which are then used to alternately turn the outer and inner IR rings on to produce the dark and bright pupil image fields. Each frame is separated into two image fields (even and odd), representing the bright and dark pupil images separately. The even image field is then digitally subtracted from the odd image field to produce the difference image. In this image the pupils appear as the brightest parts, as can be seen in figure 3. This scheme minimizes the influence of ambient illumination, because it is subtracted out in the generation of the difference image.

Figure 3. (a) Image obtained with internal IR, (b) image obtained with external IR, (c) difference image

Pupils are detected on the difference image by searching the entire image to locate two bright blobs that satisfy certain size, position and distance constraints. The image is binarized using an a priori threshold to detect the brightest blobs. Blobs that violate the size constraints are removed and, for the remaining ones, we take all possible pairs of blobs and apply constraints on their size and distance in order to select the most probable pair of pupils. The centroids of the blobs are returned as the positions of the detected pupils.
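The following minimal Python/OpenCV sketch illustrates this detection step under stated assumptions: the binarization threshold, the blob-area limits and the pair-distance limits are placeholders chosen for illustration, not the values used in the prototype.

import cv2
import numpy as np

def detect_pupils(bright_field, dark_field,
                  thresh=60, min_area=4, max_area=400,
                  min_dist=30, max_dist=200):
    """Return the centroids of the most probable pupil pair, or None."""
    # Difference image: pupils stay bright, ambient light is largely cancelled.
    diff = cv2.subtract(bright_field, dark_field)
    _, binary = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)

    # Connected components as candidate blobs, filtered by area.
    n, _, stats, centroids = cv2.connectedComponentsWithStats(binary)
    candidates = [(centroids[i], stats[i, cv2.CC_STAT_AREA])
                  for i in range(1, n)
                  if min_area <= stats[i, cv2.CC_STAT_AREA] <= max_area]

    # Score every pair by similarity of area and plausibility of separation.
    best, best_score = None, np.inf
    for i in range(len(candidates)):
        for j in range(i + 1, len(candidates)):
            (c1, a1), (c2, a2) = candidates[i], candidates[j]
            dist = np.linalg.norm(np.asarray(c1) - np.asarray(c2))
            if not (min_dist <= dist <= max_dist):
                continue
            score = abs(a1 - a2) / max(a1, a2)   # prefer similar-sized blobs
            if score < best_score:
                best, best_score = (tuple(c1), tuple(c2)), score
    return best

In practice the two inputs would be the de-interlaced even and odd fields of each camera frame, i.e. the bright-pupil and dark-pupil images described above.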
To continuously monitor the driver it is important to track the pupils from frame to frame. This can be done efficiently by using two Kalman filters, one for each pupil, to predict the pupil positions in the image. We have used a pupil tracker based on [12], but we have tested it with images obtained from a car moving on a motorway. In our case the search window is determined automatically based on the pupil size and the location error. The state vector of the filter is represented as x_t = (c_t, r_t, u_t, v_t)^T, where (c_t, r_t) is the pupil pixel position (its centroid) at time t and (u_t, v_t) is its velocity at time t in the c and r directions respectively. Figure 4 shows a sequence of the pupil tracker working. Rectangles on the images indicate the search window of the filter and crosses indicate the locations of the detected pupils; plots (a) and (b) show the estimated pupil positions for the sequence under test.

The tracker is found to be rather robust for different users without glasses, different lighting conditions, face orientations and distances inside a car. Performance of the tracker degrades when users wear eyeglasses, because additional bright blobs appear in the image due to IR reflections on the glasses. It automatically finds and tracks the pupils and can recover from tracking failures. The system runs at a frame rate of 25 frames/s.
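As an illustration of the tracking stage, the sketch below sets up one constant-velocity Kalman filter per pupil with the state x_t = (c_t, r_t, u_t, v_t), using OpenCV's KalmanFilter. The noise covariances and the search-window heuristic are assumptions for the example, not the values tuned for the prototype.

import cv2
import numpy as np

def make_pupil_filter(c0, r0, dt=1.0):
    """One constant-velocity Kalman filter per pupil, state (c, r, u, v)."""
    kf = cv2.KalmanFilter(4, 2)                       # 4 states, 2 measurements
    kf.transitionMatrix = np.array([[1, 0, dt, 0],
                                    [0, 1, 0, dt],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2    # assumed tuning
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
    kf.statePost = np.array([[c0], [r0], [0], [0]], np.float32)  # first detection
    return kf

def track_step(kf, centroid, pupil_size, base_win=20):
    """Predict the next pupil position and derive a search window whose size
    grows with the pupil size and the position uncertainty (assumed heuristic)."""
    pred = kf.predict()                               # predicted (c, r, u, v)
    win = base_win + pupil_size + int(np.trace(kf.errorCovPre[:2, :2]))
    if centroid is not None:                          # detection succeeded this frame
        kf.correct(np.array(centroid, np.float32).reshape(2, 1))
    return (float(pred[0, 0]), float(pred[1, 0])), win

When the detection fails in a frame, only the prediction is used and the window grows, which is one simple way to obtain the recovery-from-failure behaviour mentioned above.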
2.3. Visual Behaviours

Eyelid movements and face pose are some of the visual behaviours that reflect a person's level of fatigue. There are several ocular measures to characterize eyelid movements, such as eye closure duration, blink frequency, eye closing/opening speed, and the recently developed parameter PERCLOS [14]. This last measure is the percentage of eye closure over time, excluding the time spent on normal closures. It has been found to be the most valid ocular parameter for characterizing driver fatigue [3]. Nevertheless, we have calculated all of them in order to evaluate their performance. To obtain the ocular measures we continuously track the subject's pupils and fit an ellipse to each of them, using a modification of the LIN algorithm [15]. In order to distinguish between a blink and an error in the tracking of the pupils, we use a Finite State Automaton (FSM), as depicted in figure 5. Apart from the init state, five states have been defined: tracking-ok, closing, closed, opening and tracking-lost. Transitions between states are evaluated from frame to frame as a function of the width-height ratio of the pupils.

Figure 5. Finite State Automaton for ocular measures

The system starts in the init state. When the pupils are detected, the FSM passes to the tracking-ok state, indicating that pupil tracking is working correctly. In this state, if the pupils are not detected in a frame, a transition to the tracking-lost state is produced. The FSM stays in this state until the pupils are correctly detected again, at which point it returns to the tracking-ok state. If the width-height ratio of the pupil increases above a threshold while in the tracking-ok state, a closing-eye action is detected and the FSM changes to the closing state. Because the width-height ratio may also increase for other reasons, such as segmentation noise, it is possible to return to the tracking-ok state if the ratio does not keep increasing.

Loss of the pupils while in the closing state causes a transition of the FSM to the closed state, which means that the eyes are closed. A new detection of the pupils from the closed state produces a change to the opening state or to the tracking-ok state, depending on the degree of openness of the eyelid. From the closed state, a transition to the tracking-lost state is produced if the closed time exceeds a threshold, which means that the tracker is not working correctly. A transition from opening back to closing is possible if the width-height ratio increases again.
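A minimal Python sketch of this state machine is given below. The state names follow the paper; the ratio threshold, the closed-time limit and the exact form of the "ratio keeps increasing" test are assumptions for illustration.

from enum import Enum, auto

class EyeState(Enum):
    INIT = auto()
    TRACKING_OK = auto()
    CLOSING = auto()
    CLOSED = auto()
    OPENING = auto()
    TRACKING_LOST = auto()

CLOSING_RATIO = 2.0      # assumed width/height ratio marking a closing eye
MAX_CLOSED_FRAMES = 75   # assumed limit (3 s at 25 frames/s) before declaring tracker loss

def step(state, pupils_found, ratio, prev_ratio, closed_frames):
    """Advance the eyelid FSM by one frame; returns (next_state, closed_frames)."""
    if state in (EyeState.INIT, EyeState.TRACKING_LOST):
        return (EyeState.TRACKING_OK if pupils_found else state), 0
    if state == EyeState.TRACKING_OK:
        if not pupils_found:
            return EyeState.TRACKING_LOST, 0
        return (EyeState.CLOSING if ratio > CLOSING_RATIO else state), 0
    if state == EyeState.CLOSING:
        if not pupils_found:                  # pupils vanish: eyes are closed
            return EyeState.CLOSED, 0
        if ratio <= prev_ratio:               # ratio stops increasing: false alarm
            return EyeState.TRACKING_OK, 0
        return state, 0
    if state == EyeState.CLOSED:
        if pupils_found:                      # eyes reopen
            return (EyeState.OPENING if ratio > CLOSING_RATIO
                    else EyeState.TRACKING_OK), 0
        if closed_frames + 1 > MAX_CLOSED_FRAMES:   # closed too long: tracker lost
            return EyeState.TRACKING_LOST, 0
        return state, closed_frames + 1
    if state == EyeState.OPENING:
        if not pupils_found:                  # assumption: treat as a re-closure
            return EyeState.CLOSED, 0
        if ratio > prev_ratio:                # ratio rises again: closing again
            return EyeState.CLOSING, 0
        return (EyeState.TRACKING_OK if ratio <= CLOSING_RATIO else state), 0
    return state, closed_frames

Blink detection then reduces to observing a closing-closed-opening sequence, and the ocular measures described next follow from counting the frames spent in each state over a sliding window.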
Ocular parameters that characterize eyelid movements are calculated as a function of the FSM. PERCLOS is calculated from all the states except the tracking-lost state, by analysing the pupil height. We consider that an eye closure occurs when the pupil height is less than 30% of its nominal size, and we compute the PERCLOS measure as the percentage of eye closure over a 30 s window. Eye closure duration is the time the system spends in the closed state, evaluated over 30 s. Eye closing/opening speed measures are calculated in the same way but for the closing and opening states respectively. Blink frequency is the number of blinks detected in 30 s, where a blink action is detected as a transition through the following states: closing, closed and opening.

On the other hand, face pose can reveal fatigued and inattentive drivers. The nominal face orientation while driving is frontal. If the driver's face orientation is in another direction for an extended period of time or deviates frequently (in the case of repeated head tilts), this is due to fatigue or inattention. In our application, the precise degree of face orientation is not necessary. What we are interested in is detecting whether the driver's head deviates from its nominal position and orientation for an extended time or too frequently (nodding detection). We have used a model-based approach for recovering the face pose by establishing the relationship between a 3D face model and its two-dimensional (2D) projections. The inputs of the model are the pupil and nostril positions. We have developed a new tracker for the nostril positions in the same way as the one used for pupil tracking. Nostrils appear in the images as dark pixels surrounded by not-so-dark pixels (skin), and they are easily detectable. As a function of the angles calculated from the model, and using the speed data of the pupil movements from the Kalman filters, we quantize the face direction into nine areas: frontal, left, right, up, down, upper left, upper right, lower left and lower right. The nodding measure indicates the number of head tilts detected in the last 2 minutes. These actions are calculated by analysing the temporal evolution of the face model and the vertical speed data of the pupils.

2.4. Driver Vigilance Computation

The fatigue visual behaviours obtained in the previous section (eyelid movements and face pose) are subsequently combined to form a fatigue parameter that can robustly and accurately characterize one's vigilance level. This information fusion is obtained using a fuzzy system. We have chosen this technique for its well-known linguistic concept modelling ability: the fuzzy rule expression is close to expert natural language. On the other hand, as fuzzy inference systems are universal approximators, they can also be used for knowledge induction processes.

The objective of the fuzzy system is to provide a driver's vigilance level (DVL) from the fusion of several ocular and face pose measures, using expert knowledge. This knowledge has been extracted from the data analysis of the parameters in simulated tests with different users. Various studies have shown the possibility of detecting fatigue by means of ocular measures. Taking into account the study from the US Department of Transportation [3] and our own experience, we have concluded that the best set of input variables for the fuzzy system is: PERCLOS, eye closure duration, blink frequency, nodding frequency and face direction. This last variable evaluates the number of frames where the face direction is not frontal in the last 30 s. All the fuzzy inputs have been normalized, and different linguistic terms and their corresponding fuzzy sets have been distributed over each of them. For the output variable we have distributed fuzzy sets over a vigilance level range between 0 and 100%. Based on the variables mentioned above, we have defined a set of rules for correct vigilance monitoring. If this level is higher than a predefined threshold, an acoustic warning is activated.
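As an illustration of this fusion step, the sketch below implements a toy Mamdani-style fuzzy system with only two of the five inputs (PERCLOS and blink frequency, both assumed normalized to [0, 1]). The membership functions and rules are invented for the example and are not the rule base extracted by the authors.

import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with vertices a, b, c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def dvl(perclos, blink_freq):
    """Toy Mamdani fusion of two normalized inputs into a DVL in [0, 100]."""
    y = np.linspace(0.0, 100.0, 101)                       # output universe (%)
    # Fuzzify the inputs (low / high terms, assumed membership functions).
    p_low, p_high = tri(perclos, -0.5, 0.0, 0.5), tri(perclos, 0.3, 1.0, 1.5)
    b_low, b_high = tri(blink_freq, -0.5, 0.0, 0.5), tri(blink_freq, 0.3, 1.0, 1.5)
    # Output fuzzy sets: alert, drowsy, fatigued.
    alert, drowsy, fatigued = tri(y, -50, 0, 50), tri(y, 25, 50, 75), tri(y, 50, 100, 150)
    # Example rules (min as AND, max as OR), clipped consequents, max aggregation.
    r_alert = min(p_low, b_low)
    r_drowsy = max(min(p_low, b_high), min(p_high, b_low))
    r_fatigued = min(p_high, b_high)
    agg = np.maximum.reduce([np.minimum(r_alert, alert),
                             np.minimum(r_drowsy, drowsy),
                             np.minimum(r_fatigued, fatigued)])
    # Centroid defuzzification.
    return float(np.sum(agg * y) / (np.sum(agg) + 1e-9))

if __name__ == "__main__":
    print(dvl(perclos=0.1, blink_freq=0.2))   # low inputs -> low DVL (alert)
    print(dvl(perclos=0.8, blink_freq=0.7))   # high inputs -> high DVL (fatigued)

The real system fuses five normalized inputs in the same spirit; comparing the defuzzified DVL with a threshold is what triggers the acoustic warning.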
3. Experimental Results

The system currently runs on a Pentium IV PC in real time (25 frames/s) with a resolution of 400x320 pixels. To test its performance, ten sequences simulating drowsiness behaviours were recorded, following the physiological rules explained in [3] for identifying drowsiness in drivers. The test sequences were recorded from a car on a motorway with different users without glasses and under different lighting conditions. These images have been used as inputs to our algorithms, obtaining quite robust, reliable and accurate results.

Figure 6 plots the DVL obtained from the fuzzy system for a test sequence of 6 minutes. As can be seen, until frame 1000 (40 s) the DVL is below 45%, which represents the alert state. The DVL is over 50% between frames 1000 and 1900 (76 s), due to an increase in blink frequency. Then, between frames 1900 and 5500 (220 s), the DVL decreases because the blink frequency returns to its normal rate. Beyond frame 5500 the DVL rises due to the PERCLOS increase. Between frames 7000 (280 s) and 8000 (320 s) the eye closure duration rises, producing a further increase of the DVL. At frame 7900 (316 s) a nodding occurs. This drowsiness behaviour pushes the DVL over 60%, indicating a fatigue state. After the nodding, the user temporarily awakes, causing the DVL to decrease. Our experiments show that, if a driver is alert while driving, the DVL should be less than 50%. A DVL over 70% indicates a clear drowsiness state.
Figure 6. DVL parameter over a sequence of 6 min

The general performance of the measured variables is presented in table 1. Performance was measured by comparing the system output with observer measurements made on the recorded video sequences.

Table 1. Accuracy of the results (parameters vs. percentage detected)

As we can see, the percentage detected for all the variables is quite good, bearing in mind that the sequences were recorded in real situations. The system performance decreases when drivers wear eyeglasses.

4. Conclusions and Future Work

We have developed a non-intrusive prototype computer vision system for real-time monitoring of a driver's vigilance. It is based on a hardware system for real-time acquisition of the driver's images using an active IR illuminator, and its software implementation for real-time pupil tracking, ocular measures and face pose estimation. Finally, the driver's vigilance level is determined from the fusion of the measured parameters in a fuzzy system. In the future we intend to test the system with more users, in order to generalize the drowsiness behaviours, and to improve it for users with eyeglasses.

Acknowledgments

This work has been supported by grants FOM from the Spanish Ministry of Public Works and by grants DPI (SIRAF EM Project) from the Spanish Ministry of Science and Technology (MCyT).

References

[1] AWAKE Consortium (IST). http://www.awake-eu.org
[2] J. Healey and R. Picard, "SmartCar: Detecting Driver Stress," Proc. 15th International Conference on Pattern Recognition, Vol. 4, Barcelona.
[3] A. Kircher, M. Uddman and J. Sandin, "Vehicle Control and Drowsiness," Swedish National Road and Transport Research Institute.
[4] Anon, "PERCLOS and Eyetracking: Challenge and Opportunity," Technical Report, Applied Science Laboratories, Bedford, MA.
[5] H. Ueno, M. Kaneda and M. Tsukino, "Development of Drowsiness Detection System," Proceedings of the Vehicle Navigation and Information Systems Conference, Yokohama, Japan, 1994.
[6] J. Soria, "Llega el piloto tecnológico," Revista Tráfico, Dirección General de Tráfico, Ministerio del Interior, pp. 20-21, Marzo-Abril 2002.
[7] Y. Matsumoto and A. Zelinsky, "An Algorithm for Real-Time Stereo Vision Implementation of Head Pose and Gaze Direction Measurements," Procs. IEEE 4th Int. Conf. on Face and Gesture Recognition, March.
[8] T. Victor and A. Zelinsky, "Automating the Measurement of Driver Visual Behaviours Using Passive Stereo Vision," Procs. Int. Conf. Series Vision in Vehicles VIV9, Brisbane, Australia, August.
[9] S. Boverie, J.M. Leqellec and A. Hirl, "Intelligent Systems for Video Monitoring of Vehicle Cockpit," International Congress and Exposition ITS: Advanced Controls and Vehicle Navigation Systems.
[10] P. Smith, M. Shah and N. da Vitoria Lobo, "Determining Driver Visual Attention with One Camera," IEEE Transactions on Intelligent Transportation Systems, Vol. 4, No. 4, December.
[11] W. Shih and Liu, "A Calibration-Free Gaze Tracking Technique," Proc. 15th Conf. on Pattern Recognition, Vol. 4, Barcelona, Spain.
[12] Q. Ji and X. Yang, "Real-Time Eye, Gaze and Face Pose Tracking for Monitoring Driver Vigilance," Real-Time Imaging 8, pp. 357-377, 2002.
[13] R. Grace, "Drowsy Driver Monitor and Warning System," Int. Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design, August.
[14] D. F. Dinges, "PERCLOS: A Valid Psychophysiological Measure of Alertness as Assessed by Psychomotor Vigilance," Federal Highway Administration, Office of Motor Carriers, FHWA-MCRT.
[15] A. W. Fitzgibbon and R. B. Fisher, "A Buyer's Guide to Conic Fitting," Proceedings of the 6th British Machine Vision Conference (Vol. 2), Birmingham, United Kingdom.