Vision In and Out of Vehicles: Integrated Driver and Road Scene Monitoring
Nicholas Apostoloff and Alexander Zelinsky

The Australian National University, Robotic Systems Laboratory, Research School of Information Sciences and Engineering, Canberra ACT 0200, Australia

Abstract. More than a million people die in road crashes around the world each year, and it is estimated that up to 30% of these fatalities are caused by fatigue and inattention. This paper presents preliminary results of an Intelligent Transport System (ITS) project that has fused visual lane tracking and driver monitoring technologies as a first step towards closing the loop between vision inside and outside the vehicle. Experimental results of the active stereo-vision lane tracking system are discussed, focusing on the particle filter and cue fusion technology used. The results from the integration of the lane tracker and the driver monitoring system are presented with an analysis of the driver's visual behavior in several different scenarios.

1 Introduction

It is estimated that around 30% of fatal car crashes can be attributed to driver inattention and fatigue [4][11]. Numerous studies have been performed to analyse signs of driver fatigue through the measurement of the visual demand on the driver, often through frame-by-frame human-rater measurement or infrared corneal-reflection technologies. While these studies produce valuable results, they are often time consuming and too unreliable for many research purposes [11]. An ITS project has recently been initiated at The Australian National University (ANU) that is focused on autonomous driver monitoring and autonomous vehicle control to aid the driver [3]. A major aim of this project is the development of a system of cooperating internal and external vehicle sensors for research into the visual behavior of the driver.
This paper presents the first results from this study, in which a lane tracker was developed using particle filtering and visual cue fusion technology. This was integrated with a driver monitoring system (facelab [11]) to investigate the visual behavior of the driver in a number of common driving scenarios. There are many benefits associated with the development of reliable lane tracking systems. Traditional uses include autonomous control of vehicles, Adaptive Cruise Control (ACC) and Lane Departure Warning (LDW) systems. In combination with the driver monitoring software we can detect when a driver is looking at, and has their attention focused on, the road. This is the first step in characterising the environment surrounding the driver for a complete analysis of what holds the driver's attention. There are numerous uses for this technology, including:

- LDW systems and the reduction of false positives.
- Obstacle detection and warning systems.
- Automated driver visuo-attentional analysis systems.
- Fatigue and inattention warning systems.

2 The Experimental Platform

The testbed vehicle is a 1999 Toyota Landcruiser 4WD (see Fig. 1). Vision is the main form of sensing used on the vehicle, which has two different vision platforms installed (see Fig. 2). A passive set of cameras is mounted on the dashboard facing the driver as part of the facelab system for driver monitoring. CeDAR, an active vision head designed at ANU [10], carries four cameras: one pair used for stereo vision in the near field, and one pair for far-field stereo experiments and for mid-field to far-field scene coverage. Various other sensors (not used in this experiment) have been fitted to the vehicle, including a Global Positioning System (GPS), an Inertial Navigation Sensor (INS) and a Pulse FM Radar.

Fig. 1. The testbed vehicle.

3 System Architecture

The system architecture consists of the lane tracker, the driver monitor (facelab) and the high-level data correlation system. Both the lane tracker and facelab run independently on different CPUs, while the high-level data correlation system produces the focus of attention of the driver with respect to the road. The driver's focus of attention is determined to be one of the regions defined in Fig. 3. The regions are divided into three different ranges: the near field, the far field and a horizon region. The horizon region is used to collect eye gaze data that is roughly parallel to the ground plane and that doesn't intersect with the other two ranges.
Fig. 2. The vision platforms in the vehicle. Top: CeDAR active vision head. Right (above the steering wheel): facelab passive stereo cameras.

Fig. 3. The regions of interest for obtaining the focus of attention of the driver.
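As an illustrative aside, the mapping from an eye-gaze ray to one of the regions of Fig. 3 can be sketched as a ray/ground-plane intersection followed by a region lookup. Everything below is an assumption for illustration only (the vehicle-frame convention, the near/far split distance, the horizon-pitch threshold and the region boundaries), not the actual system:

```python
import numpy as np

def focus_of_attention(origin, direction, lane_center, road_width,
                       near_far_split=15.0, horizon_pitch=0.02):
    """Classify a driver's gaze ray into one of the road regions.

    origin, direction: gaze ray in a vehicle frame (x forward, y left, z up, metres).
    lane_center, road_width: lane geometry from the lane tracker (metres).
    """
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    origin = np.asarray(origin, dtype=float)
    # Gaze roughly parallel to the ground plane falls in the horizon region.
    if abs(d[2]) < horizon_pitch:
        return ("horizon", None)
    if d[2] > 0:
        return ("other", None)            # looking up: no ground intersection
    t = -origin[2] / d[2]                 # ray / ground-plane (z = 0) intersection
    hit = origin + t * d
    if hit[0] <= 0:
        return ("other", None)            # intersection behind the vehicle
    rng_label = "near" if hit[0] < near_far_split else "far"
    lateral = hit[1] - lane_center        # offset from the tracked lane centre
    half = road_width / 2.0
    if abs(lateral) <= half:
        return ("own lane", rng_label)
    if lateral > half:
        return ("left of road", rng_label)
    if lateral >= -3 * half:              # one lane width to the right
        return ("right lane", rng_label)  # oncoming traffic (left-hand traffic)
    return ("right of road", rng_label)

# Driver at ~1.2 m eye height glancing ahead and slightly down:
region = focus_of_attention(origin=np.array([0.0, 0.4, 1.2]),
                            direction=np.array([1.0, 0.0, -0.12]),
                            lane_center=0.0, road_width=3.61)
print(region)
```

The one-way-road case of Sect. 3 simply drops the "right lane" branch from the lookup.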
Each of these ranges is split into either three or four regions depending on the test being performed. On a two-lane road, four regions are defined: a region to the left of the road, the lane detected by the lane tracker, the right lane containing oncoming traffic and a region to the right of the road. If the test is on a one-way road such as a highway, the right lane containing oncoming traffic is not used. The focus of attention region is obtained by calculating the intercept of the eye gaze vector of the driver with the regions defined in Fig. 3, which are found by the lane tracker.

4 Lane Tracking

Despite many impressive results from lane trackers in the past [1][2][9][12], it is clear that no single cue can perform reliably in all situations. The lane tracking system presented here dynamically allocates computational resources over a suite of cues to robustly track the road in a variety of situations. Bayesian theory is used to fuse the cues, while a scheduler intelligently allocates computational resources to the individual cues based on their performance. A particle filter [5] is used to control hypothesis generation of the lane location. For a detailed description of the tracking system see [6]. Each cue is developed to work independently of the other cues and is customised to perform well in different situations (e.g. edge-based lane marker tracking, colour-based road tracking). Cues are allocated CPU time based on their performance and the time they require. Two metrics, the Kullback-Leibler divergence and the Uncertainty Deviation [8], are used to evaluate the performance of each cue with respect to the fused result and the individual result of the cue respectively. Additionally, the framework of the lane tracker was designed to allow the cues to run at different frequencies, enabling slow-running (but valuable) cues to run in the background.
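The Kullback-Leibler cue-performance metric can be illustrated with a small sketch. The snippet below is a simplified illustration, not the authors' implementation: the cue names, likelihood values and the naive multiplicative fusion are assumptions. It scores each cue by the divergence between its likelihoods over the particle set and the fused distribution; a low divergence means the cue agrees with the fused result and so deserves more CPU time:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for two discrete distributions over the same particle set."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Likelihoods assigned by three hypothetical cues to five particles.
cue_scores = {
    "lane_marker": np.array([0.10, 0.60, 0.20, 0.05, 0.05]),
    "road_colour": np.array([0.15, 0.50, 0.25, 0.05, 0.05]),
    "road_edge":   np.array([0.40, 0.10, 0.10, 0.20, 0.20]),  # disagrees
}

# Fused result: normalised product of the cue likelihoods (independent-cue fusion).
fused = np.prod(np.vstack(list(cue_scores.values())), axis=0)
fused /= fused.sum()

# Cues that diverge least from the fused distribution are performing best.
ranking = sorted(cue_scores, key=lambda c: kl_divergence(cue_scores[c], fused))
print(ranking)  # cues ordered from most to least consistent with the fusion
```

The dissenting "road_edge" cue ends up ranked last, which is exactly the signal the scheduler needs to shift resources towards the cues that currently agree with the fused estimate.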
A dual-phase particle filter system is proposed to reduce the search space for the lane tracker. The first particle filter searches for the road width, the lateral offset of the vehicle from the centerline of the road and the yaw of the vehicle with respect to the centerline. The second particle filter captures the horizontal and vertical road curvature in the mid- to far-field ranges using the state information captured by the first particle filter. In the work reported in this paper, the second-phase particle filter was not used and no road curvature was calculated. The state space for the particle filter is the lateral offset of the vehicle relative to the skeletal line of the road, the yaw of the vehicle with respect to the skeletal line and the road width (see Fig. 4).

4.1 Cues for Robust Lane Tracking

The cues chosen for this experiment were designed to be simple and efficient, with each cue suited to a different set of road scenarios. Individually, each of the cues would perform poorly, but when they are combined through the cue fusion process they
produce a robust solution to lane tracking. Each cue listed below uses the road model shown in Fig. 4 to compute the probability of each hypothesis from the particle filter.

Fig. 4. Road model used for the first-phase particle filter. The dark shaded region is used as the non-road boundary in the colour cues, while the light shaded region is the road region. Note that the figure is exaggerated for clarity.

1. Lane Marker Cue is designed for roads that have lane markings. A modified ternary correlation¹ is used to preprocess an intensity image of the road, and the cue returns the average value of the pixels along the hypothesised road edges.
2. Road Edge Cue is suited to roads with lane markings or roads with defined edges. It uses a preprocessed edge map and returns the average value of the pixels along the hypothesised road edges.
3. Road Colour Cue is useful for any road that has a different colour than its surroundings (both unmarked and marked roads). It returns the average pixel value in the hypothesised road region from a colour probability map that is dynamically regenerated each iteration using the estimated road parameters from the previous iteration.
4. Non-Road Colour Boundary Cue is the opposite of the Road Colour Cue and returns the average road colour probability of the non-road regions.
5. Road Width Cue is particularly useful on multi-lane roads, where it is possible for the other cues to see two or more lanes as one. It returns a value from a Gaussian function centered at a desired road width given the hypothesised road width. The desired road width used in this cue was 3.61 m, which was empirically determined from previous lane tracking experiments to be the average road width.
6. Elastic Lane Cue is used to move particles towards the lane that the vehicle is in. It returns 1 if the lateral offset of the vehicle is less than half the road width and 0.5 otherwise.
¹ The 1D ternary correlation function is modified to be two-sided, with a step from -1 to 1 and back to -1.
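As a concrete illustration, the sketch below runs a single iteration of the first-phase filter over the three-dimensional state (lateral offset, yaw, road width), using the two cues that are fully specified above: the Road Width Cue (a Gaussian centred at 3.61 m) and the Elastic Lane Cue. The image-based cues are omitted, and the particle count, priors and diffusion noise are assumptions rather than the authors' actual parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500  # particle count (assumed)
# State: [lateral offset (m), yaw (rad), road width (m)]
particles = np.column_stack([
    rng.uniform(-2.0, 2.0, N),   # lateral offset from the skeletal line
    rng.uniform(-0.2, 0.2, N),   # yaw relative to the skeletal line
    rng.uniform(2.5, 5.0, N),    # road width
])

def road_width_cue(width, desired=3.61, sigma=0.5):
    """Gaussian preference for the empirically determined average road width."""
    return np.exp(-0.5 * ((width - desired) / sigma) ** 2)

def elastic_lane_cue(offset, width):
    """1 if the vehicle lies inside the hypothesised lane, 0.5 otherwise."""
    return np.where(np.abs(offset) < width / 2, 1.0, 0.5)

# Fuse the cue likelihoods multiplicatively (independent cues) and normalise.
weights = (road_width_cue(particles[:, 2]) *
           elastic_lane_cue(particles[:, 0], particles[:, 2]))
weights /= weights.sum()

# State estimate: weighted mean over the particle set.
offset, yaw, width = weights @ particles

# Resample by weight and diffuse for the next iteration.
idx = rng.choice(N, size=N, p=weights)
particles = particles[idx] + rng.normal(0.0, [0.05, 0.01, 0.05], (N, 3))

print(round(width, 2))  # estimate is pulled towards the 3.61 m prior
```

Note how the cues only score hypotheses proposed by the filter; they never search the image themselves, which is the property the paper credits for the system's robustness.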
5 facelab

facelab [11] is a driver monitoring system commercialised by Seeing Machines [7], based on research and development work between ANU and the Volvo Technological Development Corporation. It uses a passive stereo pair of cameras mounted on the dashboard of the vehicle to capture 60 Hz video images of the driver's head. These images are processed in real time to determine the 3D position of matching features on the driver's face. The features are then used to calculate the 3D pose of the person's face as well as the eye gaze direction, blink rates and eye closure.

6 Integration of the Lane Tracker and facelab

By combining the gaze direction captured by facelab with the parameterised lane model determined by the lane tracker, we can track the visual behavior of the driver relative to the road. Calculating the intercept of the driver's gaze vector with the regions of interest shown in Fig. 3 determines the driver's focus of attention. Additionally, the visual scan patterns of the driver can be recorded, with emphasis on fixation points and saccade movements to different regions in the scene.

7 Experimental Results

Results from the lane tracker and the integrated system are presented below.

7.1 Lane Tracking

The lane tracker was tested in several different scenarios, including:

- Highway driving with light traffic.
- Outer city driving with high-curvature roads.
- Inner city driving with moderate levels of traffic.

Figure 5 shows the output of the lane tracker in the above scenarios using the six different cues. The lane tracker was found to work robustly and solved the problems typically associated with lane tracking, including:

- Dramatic lighting changes (see A in Fig. 5).
- Changes in road colour (see D-F in Fig. 5).
- Shadows across the road (see C and G-H in Fig. 5).
- Roads with miscellaneous lines that are not lane markings (see I in Fig. 5).
- Lane markings that disappear and reappear.
Fig. 5. Results from the lane tracker. The boxes indicate the ends of the lines that define the lane being tracked.

This robustness can be attributed to the combination of particle filtering and cue fusion. Because of the particle filter, cues only have to validate a hypothesis and do not have to search for the road. This indirectly incorporates a number of a priori constraints into the system (such as road edges meeting at the vanishing point in image space and the edges lying in the road plane) which assist it in its detection task. Cue fusion was found to dramatically increase the robustness of the solution due to the variety of conditions the cues were suited to. The final configuration of cues in the system is a direct result of earlier experiments uncovering conditions in which individual cues would fail.

7.2 Driver Monitoring and Lane Tracking

The integrated driver monitoring and lane tracking system was tested both on the highway and on outer city high-curvature roads. A comparison of the driver's focus-of-attention statistics is given in Fig. 7, while a comparison of the yaw of the vehicle (a good indicator in this case of the degree of curvature) and the yaw of the driver's gaze is given in Fig. 6. Figure 6 gives a good indication of the correlation between the driver monitoring system and the lane tracking system. The yaw of the driver's gaze closely follows the
vehicle yaw (a good indication of the curvature of the road) in both graphs, showing that the driver is following the road curvature with his gaze.

Fig. 6. Comparison between the vehicle yaw and the driver's gaze yaw for two tests along the same piece of road but in different directions. The dashed line is the yaw of the vehicle, while the solid line is the yaw of the driver's gaze. Note that the large peaks are generated by the driver through saccade movements to the left and right windows of the car.

Fig. 7. Comparison between the driver's focus of attention for three different tests along a high-curvature road and the highway. In the four pie charts to the left, the top and bottom rows are the data from the two different tests along the same high-curvature road, while column 1 is for the South-West route and column 2 is for the North-East route. The pie chart to the right is for the highway test.
Figure 7 shows the percentage of time the driver focused his attention on the different regions of interest in the scene. The Other field represents times when the driver's gaze did not intercept any of the regions of interest. These graphs clearly show the repeatability of the experiments as well as a certain characteristic of the road on which these tests were carried out. The road used for the four pie charts to the left has a predominantly right curvature in the North-East direction (column 2 of Fig. 7), shown by the greater proportion of time the driver spent focusing on the right lane as well as the right side of the road. Note that the Right Road regions in these pie charts indicate the lane occupied by the oncoming traffic. The amount of time the driver spent looking at the highway (right pie chart of Fig. 7) stayed approximately the same as with the high-curvature roads, even though it was actually a smaller region (no right lane). The time spent focusing on the left and right sides of the highway was approximately equal, which is to be expected considering the low curvature of the road.

8 Conclusions

An integrated visual driver monitoring and lane tracking system has been presented that was successfully used to close the loop between vision inside and outside the vehicle. The lane tracking system benefited greatly from the cue fusion and particle filtering technologies used and was shown to perform robustly in a number of situations. A strong correlation was found to exist between the eye gaze direction of the driver and the curvature of the road, while the repeatability of the experiments was high. The applications of such a system are numerous, possibly the most important being the driver warning systems of the future.
Acknowledgments

We would like to thank the staff of Seeing Machines for their help with facelab, particularly David Liebowitz, who was kind enough to spend time helping with the analysis of the data collected. We would also like to acknowledge Gareth Loy and Luke Fletcher for their help with the tracking system used in the lane tracker.

References

1. Parag H. Batavia, Dean A. Pomerleau, and Charles E. Thorpe. Overtaking vehicle detection using implicit optical flow. In Proc. IEEE Transport Systems Conference.
2. Ernst D. Dickmanns. An expectation-based, multi-focal, saccadic (EMS) vision system for vehicle guidance. In Proc. International Symposium on Robotics Research, Salt Lake City, Utah.
3. Luke Fletcher, Nicholas Apostoloff, Jason Chen, and Alexander Zelinsky. Computer vision for vehicle monitoring and control. In Proc. Australian Conference on Robotics and Automation.
4. HoRSCoCTA. Inquiry into Managing Fatigue in Transport. The Parliament of the Commonwealth of Australia.
5. M. Isard and A. Blake. Condensation - conditional density propagation for visual tracking. International Journal of Computer Vision, 29(1):5-28, 1998.
6. Gareth Loy, Luke Fletcher, Nicholas Apostoloff, and Alexander Zelinsky. An adaptive fusion architecture for target tracking. In Proc. 5th International Conference on Automatic Face and Gesture Recognition.
7. Seeing Machines. facelab face and eye tracking system.
8. A. Soto and P. Khosla. Probabilistic adaptive agent based system for dynamic state estimation using multiple visual cues. In Proc. International Symposium of Robotics Research (ISRR).
9. A. Suzuki, N. Yasui, N. Nakano, and M. Kaneko. Lane recognition system for guiding of autonomous vehicle. In Proc. Intelligent Vehicles Symposium, 1992.
10. Harley Truong, Samir Abdallah, Sebastien Rougeaux, and Alexander Zelinsky. A novel mechanism for stereo active vision. In Proc. Australian Conference on Robotics and Automation.
11. Trent Victor, Olle Blomberg, and Alexander Zelinsky. Automating driver visual behaviour measurement. In Proc. Vision in Vehicles.
12. Todd Williamson and Charles Thorpe. A trinocular stereo system for highway obstacle detection. In Proc. International Conference on Robotics and Automation (ICRA99), 1999.
More informationMulti-view Intelligent Vehicle Surveillance System
Multi-view Intelligent Vehicle Surveillance System S. Denman, C. Fookes, J. Cook, C. Davoren, A. Mamic, G. Farquharson, D. Chen, B. Chen and S. Sridharan Image and Video Research Laboratory Queensland
More informationFalse alarm in outdoor environments
Accepted 1.0 Savantic letter 1(6) False alarm in outdoor environments Accepted 1.0 Savantic letter 2(6) Table of contents Revision history 3 References 3 1 Introduction 4 2 Pre-processing 4 3 Detection,
More informationGlobal Automotive Conference
Global Automotive Conference New York, 19 November, 2014 Henrik Kaar, Director Corporate Communications Driven for Life. Autoliv, Inc. All Rights Reserved. Safe Harbor Statement * This presentation contains
More informationA Cognitive Approach to Vision for a Mobile Robot
A Cognitive Approach to Vision for a Mobile Robot D. Paul Benjamin Christopher Funk Pace University, 1 Pace Plaza, New York, New York 10038, 212-346-1012 benjamin@pace.edu Damian Lyons Fordham University,
More informationVisual Servoing using Fuzzy Controllers on an Unmanned Aerial Vehicle
Visual Servoing using Fuzzy Controllers on an Unmanned Aerial Vehicle Miguel A. Olivares-Méndez mig olivares@hotmail.com Pascual Campoy Cervera pascual.campoy@upm.es Iván Mondragón ivanmond@yahoo.com Carol
More informationAccident Prevention Using Eye Blinking and Head Movement
Accident Prevention Using Eye Blinking and Head Movement Abhi R. Varma Seema V. Arote Asst. prof Electronics Dept. Kuldeep Singh Chetna Bharti ABSTRACT This paper describes a real-time online prototype
More informationAAA AUTOMOTIVE ENGINEERING
AAA AUTOMOTIVE ENGINEERING Evaluation of Blind Spot Monitoring and Blind Spot Intervention Technologies 2014 AAA conducted research on blind-spot monitoring systems in the third quarter of 2014. The research
More informationPHOTOGRAMMETRIC TECHNIQUES FOR MEASUREMENTS IN WOODWORKING INDUSTRY
PHOTOGRAMMETRIC TECHNIQUES FOR MEASUREMENTS IN WOODWORKING INDUSTRY V. Knyaz a, *, Yu. Visilter, S. Zheltov a State Research Institute for Aviation System (GosNIIAS), 7, Victorenko str., Moscow, Russia
More informationNighttime Vehicle Distance Alarm System
Proceedings of the 7th WSEAS Int. Conf. on Signal Processing, Computational Geometry & Artificial Vision, Athens, Greece, August 24-26, 2007 226 ighttime Vehicle Distance Alarm System MIG-CI LU *, WEI-YE
More informationTraffic Monitoring Systems. Technology and sensors
Traffic Monitoring Systems Technology and sensors Technology Inductive loops Cameras Lidar/Ladar and laser Radar GPS etc Inductive loops Inductive loops signals Inductive loop sensor The inductance signal
More informationMODULAR TRAFFIC SIGNS RECOGNITION APPLIED TO ON-VEHICLE REAL-TIME VISUAL DETECTION OF AMERICAN AND EUROPEAN SPEED LIMIT SIGNS
MODULAR TRAFFIC SIGNS RECOGNITION APPLIED TO ON-VEHICLE REAL-TIME VISUAL DETECTION OF AMERICAN AND EUROPEAN SPEED LIMIT SIGNS Fabien Moutarde and Alexandre Bargeton Robotics Laboratory Ecole des Mines
More informationExmoR A Testing Tool for Control Algorithms on Mobile Robots
ExmoR A Testing Tool for Control Algorithms on Mobile Robots F. Lehmann, M. Ritzschke and B. Meffert Institute of Informatics, Humboldt University, Unter den Linden 6, 10099 Berlin, Germany E-mail: falk.lehmann@gmx.de,
More informationPASSIVE DRIVER GAZE TRACKING WITH ACTIVE APPEARANCE MODELS
PASSIVE DRIVER GAZE TRACKING WITH ACTIVE APPEARANCE MODELS Takahiro Ishikawa Research Laboratories, DENSO CORPORATION Nisshin, Aichi, Japan Tel: +81 (561) 75-1616, Fax: +81 (561) 75-1193 Email: tishika@rlab.denso.co.jp
More informationPRODUCT SHEET. info@biopac.com support@biopac.com www.biopac.com
EYE TRACKING SYSTEMS BIOPAC offers an array of monocular and binocular eye tracking systems that are easily integrated with stimulus presentations, VR environments and other media. Systems Monocular Part
More informationHAVEit. Reiner HOEGER Director Systems and Technology CONTINENTAL AUTOMOTIVE
HAVEit Reiner HOEGER Director Systems and Technology CONTINENTAL AUTOMOTIVE HAVEit General Information Project full title: Project coordinator: Highly Automated Vehicles for Intelligent Transport Dr. Reiner
More informationSMART DRUNKEN DRIVER DETECTION AND SPEED MONITORING SYSTEM FOR VEHICLES
SMART DRUNKEN DRIVER DETECTION AND SPEED MONITORING SYSTEM FOR VEHICLES Bandi Sree Geeta 1, Diwakar R. Marur 2 1,2 Department of Electronics and Communication Engineering, SRM University, (India) ABSTRACT
More informationAutomatic Traffic Estimation Using Image Processing
Automatic Traffic Estimation Using Image Processing Pejman Niksaz Science &Research Branch, Azad University of Yazd, Iran Pezhman_1366@yahoo.com Abstract As we know the population of city and number of
More informationA STUDY ON WARNING TIMING FOR LANE CHANGE DECISION AID SYSTEMS BASED ON DRIVER S LANE CHANGE MANEUVER
A STUDY ON WARNING TIMING FOR LANE CHANGE DECISION AID SYSTEMS BASED ON DRIVER S LANE CHANGE MANEUVER Takashi Wakasugi Japan Automobile Research Institute Japan Paper Number 5-29 ABSTRACT The purpose of
More informationSpeed Performance Improvement of Vehicle Blob Tracking System
Speed Performance Improvement of Vehicle Blob Tracking System Sung Chun Lee and Ram Nevatia University of Southern California, Los Angeles, CA 90089, USA sungchun@usc.edu, nevatia@usc.edu Abstract. A speed
More informationE70 Rear-view Camera (RFK)
Table of Contents (RFK) Subject Page Introduction..................................................3 Rear-view Camera..............................................3 Input/Output...................................................4
More informationAnalecta Vol. 8, No. 2 ISSN 2064-7964
EXPERIMENTAL APPLICATIONS OF ARTIFICIAL NEURAL NETWORKS IN ENGINEERING PROCESSING SYSTEM S. Dadvandipour Institute of Information Engineering, University of Miskolc, Egyetemváros, 3515, Miskolc, Hungary,
More informationSHOULDN T CARS REACT AS DRIVERS EXPECT? E-mail: jan-erik.kallhammer@autoliv.com. Linköping University, Sweden
SHOULDN T CARS REACT AS DRIVERS EXPECT? Jan-Erik Källhammer, 1,2 Kip Smith, 2 Johan Karlsson, 1 Erik Hollnagel 2 1 Autoliv Research, Vårgårda, Sweden E-mail: jan-erik.kallhammer@autoliv.com 2 Department
More informationTowards Zero Accidents and Increased Productivity in Roadside Construction
Towards Zero Accidents and Increased Productivity in Roadside Construction Project within FFI Vehicle and Traffic Safety Author: Stefan Bergquist and Peter Wallin Date: 2015-03-25 Content 1. Executive
More informationFeasibility of an Augmented Reality-Based Approach to Driving Simulation
Liberty Mutual Research Institute for Safety Feasibility of an Augmented Reality-Based Approach to Driving Simulation Matthias Roetting (LMRIS) Thomas B. Sheridan (MIT AgeLab) International Symposium New
More informationDeterministic Sampling-based Switching Kalman Filtering for Vehicle Tracking
Proceedings of the IEEE ITSC 2006 2006 IEEE Intelligent Transportation Systems Conference Toronto, Canada, September 17-20, 2006 WA4.1 Deterministic Sampling-based Switching Kalman Filtering for Vehicle
More informationChallenges for the European Automotive Software Industry
Challenges for the European Automotive Software Industry Viewpoint of a safety supplier 28 th April 2010 Franck Lesbroussart What Trends do we see? Integration of functions Functionalities are expanding
More informationRobotics. Lecture 3: Sensors. See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information.
Robotics Lecture 3: Sensors See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information. Andrew Davison Department of Computing Imperial College London Review: Locomotion Practical
More informationNaturalistic Cycling Studies
Naturalistic Cycling Studies November 26 th 2013 Marco Dozza CHALMERS University of Technology SAFER presentation for Japan SAFER Goals Phase 1 #1 071220 Summary The cycling safety problem Naturalistic
More informationAguantebocawertyuiopasdfghvdslpmj klzxcvbquieromuchoanachaguilleanic oyafernmqwertyuiopasdfghjklzxcvbn mqwertyuiopasdfghjklzxcvbnmqwerty
Aguantebocawertyuiopasdfghvdslpmj klzxcvbquieromuchoanachaguilleanic oyafernmqwertyuiopasdfghjklzxcvbn mqwertyuiopasdfghjklzxcvbnmqwerty GOOGLE CAR uiopasdfghjklzxcvbnmqwertyuiopasdf Gonzalo Ghigliazza
More informationIN the United States, tens of thousands of drivers and passengers
IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, VOL. 14, NO. 4, DECEMBER 2013 1773 Looking at Vehicles on the Road: A Survey of Vision-Based Vehicle Detection, Tracking, and Behavior Analysis
More informationTracking and integrated navigation Konrad Schindler
Tracking and integrated navigation Konrad Schindler Institute of Geodesy and Photogrammetry Tracking Navigation needs predictions for dynamic objects estimate trajectories in 3D world coordinates and extrapolate
More informationAccidents with Pedestrians and Cyclists in Germany Findings and Measures
Accidents with Pedestrians and Cyclists in Germany Findings and Measures Siegfried Brockmann Unfallforschung der Versicherer (UDV) May 7th, Geneva 2 Content 2 Accident situation in Germany based on National
More informationKeywords drowsiness, image processing, ultrasonic sensor, detection, camera, speed.
EYE TRACKING BASED DRIVER DROWSINESS MONITORING AND WARNING SYSTEM Mitharwal Surendra Singh L., Ajgar Bhavana G., Shinde Pooja S., Maske Ashish M. Department of Electronics and Telecommunication Institute
More informationCrater detection with segmentation-based image processing algorithm
Template reference : 100181708K-EN Crater detection with segmentation-based image processing algorithm M. Spigai, S. Clerc (Thales Alenia Space-France) V. Simard-Bilodeau (U. Sherbrooke and NGC Aerospace,
More informationScience Fiction to Reality: The Future of Automobile Insurance and Transportation Technology
Michael R. Nelson Kymberly Kochis October 13, 2015 Science Fiction to Reality: The Future of Automobile Insurance and Transportation Technology INSURANCE AND FINANCIAL SERVICES LITIGATION WEBINAR SERIES
More informationAutomated Process for Generating Digitised Maps through GPS Data Compression
Automated Process for Generating Digitised Maps through GPS Data Compression Stewart Worrall and Eduardo Nebot University of Sydney, Australia {s.worrall, e.nebot}@acfr.usyd.edu.au Abstract This paper
More informationArtificial Vision and Mobile Robots
Autonomous Robots 5, 215 231 (1998) c 1998 Kluwer Academic Publishers. Manufactured in The Netherlands. Automation of an Industrial Fork Lift Truck, Guided by Artificial Vision in Open Environments F.
More informationRACCOON: A Real-time Autonomous Car Chaser Operating Optimally at Night
Appears in: Proceedings of IEEE Intelligent Vehicles 1993 RACCOON: A Real-time Autonomous Car Chaser Operating Optimally at Night Rahul Sukthankar Robotics Institute Carnegie Mellon University 5000 Forbes
More information3 rd Workshop on Naturalistic Driving Data Analytics (Tentative) Programme 19 June 2016
3 rd Workshop on Naturalistic Driving Data Analytics (Tentative) Programme 19 June 2016 09:00-09:05 Welcome 09:05-09:40 Analysis of non-critical left turns at intersections and LTAP/OD crashes/nearcrashes
More informationEnterprise M2M Solutions. Fast, Flexible, Cost Effective
Enterprise M2M Solutions Fast, Flexible, Cost Effective The Procon Difference Procon provides M2M platforms and applications for the management of all mobile assets. We are an Australian owned company
More informationReal time vehicle detection and tracking on multiple lanes
Real time vehicle detection and tracking on multiple lanes Kristian Kovačić Edouard Ivanjko Hrvoje Gold Department of Intelligent Transportation Systems Faculty of Transport and Traffic Sciences University
More informationThe demonstration will be performed in the INTA high speed ring to emulate highway geometry and driving conditions.
Company / Contact: Description of your project Description of your vehicle/ mock up High speed CACC with lateral control AUTOPÍA Program Centro de Automática y Robótica (UPM-CSC) The goal of this demonstration
More informationPoker Vision: Playing Cards and Chips Identification based on Image Processing
Poker Vision: Playing Cards and Chips Identification based on Image Processing Paulo Martins 1, Luís Paulo Reis 2, and Luís Teófilo 2 1 DEEC Electrical Engineering Department 2 LIACC Artificial Intelligence
More informationSequence. Liang Zhao and Chuck Thorpe. Robotics Institute, Carnegie Mellon University, Pittsburgh, PA 15213. E-mail: flzhao, cetg@ri.cmu.
Proc. CVPR'98, Santa Barbara, CA, June -, pp. 9-, 998 Qualitative and Quantitative Car Tracking from a Range Image Sequence Liang Zhao and Chuck Thorpe Robotics Institute, Carnegie Mellon University, Pittsburgh,
More informationHow To Fuse A Point Cloud With A Laser And Image Data From A Pointcloud
REAL TIME 3D FUSION OF IMAGERY AND MOBILE LIDAR Paul Mrstik, Vice President Technology Kresimir Kusevic, R&D Engineer Terrapoint Inc. 140-1 Antares Dr. Ottawa, Ontario K2E 8C4 Canada paul.mrstik@terrapoint.com
More informationRemoving Moving Objects from Point Cloud Scenes
1 Removing Moving Objects from Point Cloud Scenes Krystof Litomisky klitomis@cs.ucr.edu Abstract. Three-dimensional simultaneous localization and mapping is a topic of significant interest in the research
More information