Vision In and Out of Vehicles: Integrated Driver and Road Scene Monitoring


Nicholas Apostoloff and Alexander Zelinsky
The Australian National University, Robotic Systems Laboratory, Research School of Information Sciences and Engineering, Canberra ACT 0200, Australia

Abstract. million people die in road crashes around the world each year. It is estimated that up to 30% of these fatalities are caused by fatigue and inattention. This paper presents preliminary results of an Intelligent Transport System (ITS) project that has fused visual lane tracking and driver monitoring technologies in the first step to closing the loop between vision inside and outside the vehicle. Experimental results of the active stereo-vision lane tracking system are discussed, focusing on the particle filter and cue fusion technology used. The results from the integration of the lane tracker and the driver monitoring system are presented with an analysis of the driver's visual behavior in several different scenarios.

1 Introduction

It is estimated that around 30% of fatal car crashes can be attributed to driver inattention and fatigue [4][11]. Numerous studies have analysed signs of driver fatigue through the measurement of the visual demand on the driver, often via frame-by-frame human-rater measurement or infrared corneal-reflection technologies. While these studies produce valuable results, they are often time consuming and too unreliable for many research purposes [11]. An ITS project has recently been initiated at The Australian National University (ANU) that is focused on autonomous driver monitoring and autonomous vehicle control to aid the driver [3]. A major aim of this project is the development of a system of cooperating internal and external vehicle sensors for research into the visual behavior of the driver.
This paper presents the first results from this study, in which a lane tracker was developed using particle filtering and visual cue fusion technology. It was integrated with a driver monitoring system (facelab [11]) to investigate the visual behavior of the driver in a number of common driving scenarios. There are many benefits associated with the development of reliable lane tracking systems. Traditional uses include autonomous control of vehicles, Adaptive Cruise Control (ACC) and Lane Departure Warning (LDW) systems. In combination with the driver monitoring software we can detect when a driver is looking at, and has their attention focused on, the road. This is the first step in characterising the environment surrounding the driver for a complete analysis of what holds the driver's attention. There are numerous uses for this technology, including:

- LDW systems and the reduction of false positives.

- Obstacle detection and warning systems.
- Automated driver visuo-attentional analysis systems.
- Fatigue and inattention warning systems.

2 The Experimental Platform

The testbed vehicle is a 1999 Toyota Landcruiser 4WD (see Fig. 1). Vision is the main form of sensing used on the vehicle, which has two different vision platforms installed (see Fig. 2). A passive set of cameras is mounted on the dashboard facing the driver as part of the facelab system for driver monitoring. CeDAR, an active vision head designed at ANU [10], carries 4 cameras: one pair used for stereo vision in the near field, and one pair for far-field stereo experiments and for mid-field to far-field scene coverage. Various other sensors (not used in this experiment) have been fitted to the vehicle, including a Global Positioning System (GPS), an Inertial Navigation Sensor (INS), and a Pulse FM Radar.

Fig. 1. The testbed vehicle.

3 System Architecture

The system architecture consists of the lane tracker, the driver monitor (facelab) and the high-level data correlation system. Both the lane tracker and facelab run independently on different CPUs, while the high-level data correlation system produces the focus of attention of the driver with respect to the road. The driver's focus of attention is determined to be one of the regions defined in Fig. 3. The regions are divided into 3 different ranges: the near field, the far field and a horizon region. The horizon region is used to collect eye gaze data that is roughly parallel to the ground plane and does not intersect with the other 2 ranges.

Fig. 2. The vision platforms in the vehicle. Top: CeDAR active vision head. Right (above the steering wheel): facelab passive stereo cameras.

Fig. 3. The regions of interest for obtaining the focus of attention of the driver.

Each of these ranges is split into either 3 or 4 regions depending on the test being performed. On a two-lane road, 4 regions are defined: a region to the left of the road, the lane detected by the lane tracker, the right lane containing on-coming traffic, and a region to the right of the road. If the test is on a one-way road such as a highway, the right lane containing on-coming traffic is not used. The focus of attention region is obtained by calculating the intercept of the eye gaze vector of the driver with the regions defined in Fig. 3, which are found by the lane tracker.

4 Lane Tracking

Despite many impressive results from lane trackers in the past [1][2][9][12], it is clear that no single cue can perform reliably in all situations. The lane tracking system presented here dynamically allocates computational resources over a suite of cues to robustly track the road in a variety of situations. Bayesian theory is used to fuse the cues, while a scheduler intelligently allocates computational resources to the individual cues based on their performance. A particle filter [5] is used to control hypothesis generation of the lane location. For a detailed description of the tracking system see [6]. Each cue is specifically developed to work independently from the other cues and is customised to perform well under different situations (e.g. edge-based lane marker tracking, colour-based road tracking). Cues are allocated CPU time based on their performance and the time they require. Two metrics, the Kullback-Leibler divergence and the Uncertainty Deviation [8], are used to evaluate the performance of each cue with respect to the fused result and the individual result of the cue respectively. Additionally, the framework of the lane tracker was designed to allow the cues to run at different frequencies, enabling slow-running (but valuable) cues to run in the background.
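The cue-performance evaluation described above can be sketched in a few lines; below is a minimal, illustrative implementation of the Kullback-Leibler divergence over discretised cue and fused weight distributions. The function name and example values are mine, not from [6]:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D(p || q) between two discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# A cue whose hypothesis weights agree with the fused result scores near 0;
# a disagreeing cue scores higher, so the scheduler can deprioritise it.
fused = [0.1, 0.2, 0.4, 0.3]
good_cue = [0.12, 0.18, 0.42, 0.28]
bad_cue = [0.4, 0.4, 0.1, 0.1]
assert kl_divergence(good_cue, fused) < kl_divergence(bad_cue, fused)
```

In this scheme a low divergence from the fused result suggests the cue is currently informative and worth its CPU time.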
A dual-phase particle filter system is proposed to reduce the search space for the lane tracker. The first particle filter searches for the road width, the lateral offset of the vehicle from the centerline of the road, and the yaw of the vehicle with respect to the centerline. The second particle filter captures the horizontal and vertical road curvature in the mid- to far-field ranges using the state information captured by the first particle filter. In the work reported in this paper, the second-phase particle filter was not used and no road curvature was calculated. The state space for the particle filter is the lateral offset of the vehicle relative to the skeletal line of the road, the yaw of the vehicle with respect to the skeletal line, and the road width (see Fig. 4).

4.1 Cues for Robust Lane Tracking

The cues chosen for this experiment were designed to be simple and efficient while each being suited to a different set of road scenarios. Individually, each of the cues would perform poorly, but when they are combined through the cue fusion process they
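The first-phase filter can be sketched as a standard condensation-style resample-predict-weight loop over the three-dimensional state (lateral offset, yaw, road width). The process-noise values, particle count, and the toy likelihood below are assumptions for illustration only, not the parameters used in the vehicle:

```python
import numpy as np

rng = np.random.default_rng(0)

def resample(particles, weights):
    """Draw a new particle set in proportion to the current weights."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

def predict(particles, noise=(0.05, 0.01, 0.02)):
    """Diffuse particles with Gaussian process noise (offset, yaw, width)."""
    return particles + rng.normal(0.0, noise, size=particles.shape)

def measure(particles, cue_fns):
    """Weight each hypothesis by the fused likelihood of all cues."""
    w = np.ones(len(particles))
    for cue in cue_fns:
        w *= np.array([cue(p) for p in particles])
    s = w.sum()
    return w / s if s > 0 else np.full(len(particles), 1.0 / len(particles))

# Toy cue: prefers hypotheses near a "true" state of
# (offset 0.0 m, yaw 0.0 rad, width 3.6 m).
true_state = np.array([0.0, 0.0, 3.6])
cue = lambda p: np.exp(-np.sum((p - true_state) ** 2))

# Initialise particles uniformly over plausible ranges, then iterate.
particles = rng.uniform([-2.0, -0.3, 2.5], [2.0, 0.3, 4.5], size=(500, 3))
for _ in range(20):
    weights = measure(particles, [cue])
    particles = predict(resample(particles, weights))

estimate = particles.mean(axis=0)  # concentrates near the true state
```

Because each cue only scores hypotheses generated by the filter, adding a cue is just another factor in `measure`, which is what makes the fusion architecture modular.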

produce a robust solution to lane tracking. Each cue listed below uses the road model shown in Fig. 4 to compute the probability of each hypothesis from the particle filter.

Fig. 4. Road model used for the first-phase particle filter. The dark shaded region is used as the non-road boundary in the colour cues, while the light shaded region is the road region. Note that the figure is exaggerated for clarity.

1. Lane Marker Cue is designed for roads that have lane markings. A modified ternary correlation 1 is used to preprocess an intensity image of the road, and the cue returns the average value of the pixels along the hypothesised road edges.
2. Road Edge Cue is suited to roads with lane markings or roads with defined edges. It uses a preprocessed edge map and returns the average value of the pixels along the hypothesised road edges.
3. Road Colour Cue is useful for any road that has a different colour than its surroundings (both unmarked and marked roads). It returns the average pixel value in the hypothesised road region from a colour probability map that is dynamically generated each iteration using the estimated road parameters from the previous iteration.
4. Non-Road Colour Boundary Cue is the opposite of the Road Colour Cue and returns the average road colour probability of the non-road regions.
5. Road Width Cue is particularly useful on multi-lane roads, where it is possible for the other cues to see two or more lanes as one. It returns a value from a Gaussian function centered at a desired road width given the hypothesised road width. The desired road width used in this cue was 3.61 m, which was empirically determined from previous lane tracking experiments to be the average road width.
6. Elastic Lane Cue is used to move particles towards the lane that the vehicle is in. It returns 1 if the lateral offset of the vehicle is less than half of the road width and 0.5 otherwise.
1 The 1D ternary correlation function is modified to be two-sided, with a step from -1 to 1 and back to -1.
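As a rough illustration of how individual cue scores are formed and fused, the sketch below implements the Road Width and Elastic Lane cues as described above and fuses them by multiplication under an independence assumption. The Gaussian spread is an assumed value, since the paper states only the 3.61 m mean:

```python
import math

ROAD_WIDTH_MEAN = 3.61  # average road width (m), from Sec. 4.1
ROAD_WIDTH_SIGMA = 0.5  # assumed spread; not stated in the paper

def road_width_cue(width):
    """Gaussian score centred on the expected road width (cue 5)."""
    return math.exp(-0.5 * ((width - ROAD_WIDTH_MEAN) / ROAD_WIDTH_SIGMA) ** 2)

def elastic_lane_cue(lateral_offset, width):
    """1 if the vehicle lies within its own lane, 0.5 otherwise (cue 6)."""
    return 1.0 if abs(lateral_offset) < width / 2.0 else 0.5

def fuse(scores):
    """Bayesian fusion under an independence assumption: multiply scores."""
    out = 1.0
    for s in scores:
        out *= s
    return out

# A hypothesis with a plausible width and the vehicle inside the lane
# outscores an implausibly wide hypothesis with the vehicle outside it.
good = fuse([road_width_cue(3.6), elastic_lane_cue(0.2, 3.6)])
bad = fuse([road_width_cue(7.2), elastic_lane_cue(4.0, 7.2)])
assert good > bad
```

The image-based cues (1-4) would plug into `fuse` the same way, each returning a normalised score for a hypothesised lane geometry.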

5 facelab

facelab [11] is a driver monitoring system commercialised by Seeing Machines [7], based on research and development work between ANU and Volvo Technological Development Corporation. It uses a passive stereo pair of cameras mounted on the dashboard of the vehicle to capture 60 Hz video images of the driver's head. These images are processed in real-time to determine the 3D position of matching features on the driver's face. The features are then used to calculate the 3D pose of the person's face as well as the eye gaze direction, blink rates and eye closure.

6 Integration of the Lane Tracker and facelab

Combining the gaze direction captured by facelab and the parameterised lane model determined by the lane tracker, we can track the visual behavior of the driver relative to the road. Calculating the intercept of the driver's gaze vector with the regions of interest shown in Fig. 3 determines the driver's focus of attention. Additionally, the visual scan patterns of the driver can be recorded, with emphasis on fixation points and saccade movements to different regions in the scene.

7 Experimental Results

Results from the lane tracker and the integrated system are presented below.

7.1 Lane Tracking

The lane tracker was tested in several different scenarios including:

- Highway driving with light traffic.
- Outer city driving with high-curvature roads.
- Inner city driving with moderate levels of traffic.

Figure 5 shows the output of the lane tracker in the above scenarios using the 6 different cues. The lane tracker was found to work robustly and solved the problems typically associated with lane tracking, including:

- Dramatic lighting changes (see A in Fig. 5).
- Changes in road colour (see D-F in Fig. 5).
- Shadows across the road (see C and G-H in Fig. 5).
- Roads with miscellaneous lines that are not lane markings (see I in Fig. 5).
- Lane markings that disappear and reappear.
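The gaze-road intercept step can be sketched as a ray-ground-plane intersection followed by a lateral region test. The coordinate frame (vehicle-fixed, z up, ground plane at z = 0) and all function names are assumptions for illustration, not the system's actual API:

```python
import numpy as np

def gaze_ground_intercept(eye_pos, gaze_dir):
    """Intersect the gaze ray with the ground plane z = 0.

    Returns the (x, y) intercept, or None when the gaze does not point
    downward (a roughly horizontal gaze falls in the 'horizon' region).
    """
    d = gaze_dir[2]
    if d >= -1e-6:  # parallel to or above the ground plane
        return None
    t = -eye_pos[2] / d
    hit = np.asarray(eye_pos, dtype=float) + t * np.asarray(gaze_dir, dtype=float)
    return float(hit[0]), float(hit[1])

def classify_region(x, lane_left, lane_right):
    """Bin the lateral intercept into the regions of Fig. 3 (one-way road)."""
    if x < lane_left:
        return "left of road"
    if x <= lane_right:
        return "own lane"
    return "right of road"

# Eye point 1.2 m above the road, gazing ahead, slightly down and right.
hit = gaze_ground_intercept([0.0, 0.0, 1.2], [0.1, 1.0, -0.1])
region = classify_region(hit[0], lane_left=-1.8, lane_right=1.8)
```

In the real system the lane boundaries come from the tracker's current estimate rather than fixed constants, and the regions are further split by range (near field, far field, horizon).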

Fig. 5. Results from the lane tracker. The boxes indicate the ends of the lines that define the lane being tracked.

This robustness can be attributed to the combination of particle filtering and cue fusion. Because of the particle filter, cues only have to validate a hypothesis and do not have to search for the road. This indirectly incorporates a number of a priori constraints into the system (such as road edges meeting at the vanishing point in image space and the edges lying in the road plane), which assist it in its detection task. Cue fusion was found to dramatically increase the robustness of the solution due to the variety of conditions the cues were suited to. The final configuration of cues in the system is a direct result of earlier experiments uncovering certain conditions in which the cues would fail.

7.2 Driver Monitoring and Lane Tracking

The integrated driver monitoring and lane tracking system was tested both on the highway and on outer city high-curvature roads. A comparison of the driver's focus-of-attention statistics is given in Fig. 7, while a comparison of the yaw of the vehicle (a good indicator in this case of the degree of curvature) and the driver's gaze yaw is given in Fig. 6. Figure 6 gives a good indication of the correlation between the driver monitoring system and the lane tracking system. The yaw of the driver's gaze closely follows the

vehicle yaw (a good indication of the curvature of the road) in both graphs, showing that the driver is following the road curvature with his gaze.

Fig. 6. Comparison between the vehicle yaw and the driver's gaze yaw for two tests along the same piece of road but in different directions. The dashed line is the yaw of the vehicle while the solid line is the yaw of the driver's gaze. Note that the large peaks are generated by the driver through saccade movements to the left and right windows of the car.

Fig. 7. Comparison of the driver's focus of attention for 3 different tests along a high-curvature road and the highway. In the 4 pie charts to the left, the top and bottom rows are the data from the two different tests along the same high-curvature road, while column 1 is for the South-West route and column 2 for the North-East route. The pie chart to the right is for the highway test.
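The gaze-yaw correlation visible in Fig. 6 can be quantified with an ordinary Pearson correlation over the two time series. The sketch below uses synthetic data as a stand-in for the recorded signals; the signal shapes and magnitudes are invented for illustration:

```python
import numpy as np

def pearson_r(a, b):
    """Pearson correlation between two equally sampled time series."""
    a = np.array(a, dtype=float)
    b = np.array(b, dtype=float)
    a -= a.mean()
    b -= b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

# Synthetic stand-in: gaze yaw tracks vehicle yaw with measurement noise
# plus a brief saccade toward a side window (cf. the peaks in Fig. 6).
t = np.linspace(0, 60, 600)
vehicle_yaw = 0.2 * np.sin(0.3 * t)
gaze_yaw = vehicle_yaw + np.random.default_rng(1).normal(0, 0.02, t.size)
gaze_yaw[100:103] += 0.8  # saccade to the right window

r = pearson_r(vehicle_yaw, gaze_yaw)  # remains strongly positive
```

A robust variant would mask out saccade intervals before correlating, so the statistic reflects only road-following gaze.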

Figure 7 shows the percentage of time the driver focused his attention on the different regions of interest in the scene. The "Other" field represents times during which the driver's gaze did not intercept any of the fields of interest. These graphs clearly show the repeatability of the experiments as well as a certain characteristic of the road on which these tests were carried out. The road used for the 4 pie charts to the left has a predominantly right curvature in the North-East direction (column 2 of Fig. 7), shown by the greater proportion of time the driver spent focusing on the right lane as well as the right side of the road. Note that the "Right Road" region in these pie charts indicates the lane that the on-coming traffic occupies. The amount of time the driver spent looking at the highway lane (right pie chart of Fig. 7) stayed approximately the same as with the high-curvature roads, even though it was actually a smaller region (no right lane). The time spent focusing on the left and right sides of the highway was approximately equal, which is to be expected considering the low curvature of the road.

8 Conclusions

An integrated visual driver monitoring and lane tracking system has been presented that was successfully used to close the loop between vision inside and outside the vehicle. The lane tracking system benefited greatly from the cue fusion and particle filtering technologies used and was shown to perform robustly in a number of situations. A strong correlation was found to exist between the eye gaze direction of the driver and the curvature of the road, and the repeatability of the experiments was high. The applications of such a system are numerous, perhaps the most important being the driver warning systems of the future.
Acknowledgments

We would like to thank the staff of Seeing Machines for their help with facelab, particularly David Liebowitz, who was kind enough to spend time helping with the analysis of the data collected. We would also like to acknowledge Gareth Loy and Luke Fletcher for their help with the tracking system used in the lane tracker.

References

1. Parag H. Batavia, Dean A. Pomerleau, and Charles E. Thorpe. Overtaking vehicle detection using implicit optical flow. In Proc. IEEE Transport Systems Conference.
2. Ernst D. Dickmanns. An expectation-based, multi-focal, saccadic (EMS) vision system for vehicle guidance. In Proc. International Symposium on Robotics Research, Salt Lake City, Utah, October 1999.
3. Luke Fletcher, Nicholas Apostoloff, Jason Chen, and Alexander Zelinsky. Computer vision for vehicle monitoring and control. In Proc. Australian Conference on Robotics and Automation.
4. HoRSCoCTA. Inquiry into Managing Fatigue in Transport. The Parliament of the Commonwealth of Australia.
5. M. Isard and A. Blake. Condensation: conditional density propagation for visual tracking. International Journal of Computer Vision, 29(1):5-28, 1998.
6. Gareth Loy, Luke Fletcher, Nicholas Apostoloff, and Alexander Zelinsky. An adaptive fusion architecture for target tracking. In Proc. 5th International Conference on Automatic Face and Gesture Recognition, 2002.
7. Seeing Machines. facelab face and eye tracking system.
8. A. Soto and P. Khosla. Probabilistic adaptive agent based system for dynamic state estimation using multiple visual cues. In Proceedings of the International Symposium of Robotics Research (ISRR).
9. A. Suzuki, N. Yasui, N. Nakano, and M. Kaneko. Lane recognition system for guiding of autonomous vehicle. In Proceedings of the Intelligent Vehicles Symposium, 1992.
10. Harley Truong, Samir Abdallah, Sebastien Rougeaux, and Alexander Zelinsky. A novel mechanism for stereo active vision. In Proc. Australian Conference on Robotics and Automation.
11. Trent Victor, Olle Blomberg, and Alexander Zelinsky. Automating driver visual behaviour measurement. In Proc. Vision in Vehicles.
12. Todd Williamson and Charles Thorpe. A trinocular stereo system for highway obstacle detection. In Proc. International Conference on Robotics and Automation (ICRA'99), 1999.

Monitoring Head/Eye Motion for Driver Alertness with One Camera Monitoring Head/Eye Motion for Driver Alertness with One Camera Paul Smith, Mubarak Shah, and N. da Vitoria Lobo Computer Science, University of Central Florida, Orlando, FL 32816 rps43158,shah,niels @cs.ucf.edu

More information

IP-S2 Compact+ 3D Mobile Mapping System

IP-S2 Compact+ 3D Mobile Mapping System IP-S2 Compact+ 3D Mobile Mapping System 3D scanning of road and roadside features Delivers high density point clouds and 360 spherical imagery High accuracy IMU options without export control Simple Map,

More information

Advanced Vehicle Safety Control System

Advanced Vehicle Safety Control System Hitachi Review Vol. 63 (2014), No. 2 116 Advanced Vehicle Safety Control System Hiroshi Kuroda, Dr. Eng. Atsushi Yokoyama Taisetsu Tanimichi Yuji Otsuka OVERVIEW: Hitachi has been working on the development

More information

INTERNET FOR VANET NETWORK COMMUNICATIONS -FLEETNET-

INTERNET FOR VANET NETWORK COMMUNICATIONS -FLEETNET- ABSTRACT INTERNET FOR VANET NETWORK COMMUNICATIONS -FLEETNET- Bahidja Boukenadil¹ ¹Department Of Telecommunication, Tlemcen University, Tlemcen,Algeria Now in the world, the exchange of information between

More information

REAL TIME TRAFFIC LIGHT CONTROL USING IMAGE PROCESSING

REAL TIME TRAFFIC LIGHT CONTROL USING IMAGE PROCESSING REAL TIME TRAFFIC LIGHT CONTROL USING IMAGE PROCESSING Ms.PALLAVI CHOUDEKAR Ajay Kumar Garg Engineering College, Department of electrical and electronics Ms.SAYANTI BANERJEE Ajay Kumar Garg Engineering

More information

RESEARCH ON SPOKEN LANGUAGE PROCESSING Progress Report No. 29 (2008) Indiana University

RESEARCH ON SPOKEN LANGUAGE PROCESSING Progress Report No. 29 (2008) Indiana University RESEARCH ON SPOKEN LANGUAGE PROCESSING Progress Report No. 29 (2008) Indiana University A Software-Based System for Synchronizing and Preprocessing Eye Movement Data in Preparation for Analysis 1 Mohammad

More information

Test-bed for Unified Perception & Decision Architecture

Test-bed for Unified Perception & Decision Architecture Test-bed for Unified Perception & Decision Architecture Luca Bombini, Stefano Cattani, Pietro Cerri, Rean Isabella Fedriga, Mirko Felisa, and Pier Paolo Porta Abstract This paper presents the test-bed

More information

ANALYZING THE TRAVEL PATTERNS OF CONSTRUCTION WORKERS

ANALYZING THE TRAVEL PATTERNS OF CONSTRUCTION WORKERS ANALYZING THE TRAVEL PATTERNS OF CONSTRUCTION WORKERS Jochen Teizer, Ph.D., Assistant Professor Uday Mantripragada, M.S. Student Manu Venugopal, Ph.D. Student School of Civil and Environmental Engineering

More information

Robotics. Chapter 25. Chapter 25 1

Robotics. Chapter 25. Chapter 25 1 Robotics Chapter 25 Chapter 25 1 Outline Robots, Effectors, and Sensors Localization and Mapping Motion Planning Motor Control Chapter 25 2 Mobile Robots Chapter 25 3 Manipulators P R R R R R Configuration

More information

A Computer Vision System on a Chip: a case study from the automotive domain

A Computer Vision System on a Chip: a case study from the automotive domain A Computer Vision System on a Chip: a case study from the automotive domain Gideon P. Stein Elchanan Rushinek Gaby Hayun Amnon Shashua Mobileye Vision Technologies Ltd. Hebrew University Jerusalem, Israel

More information

Tracking of Small Unmanned Aerial Vehicles

Tracking of Small Unmanned Aerial Vehicles Tracking of Small Unmanned Aerial Vehicles Steven Krukowski Adrien Perkins Aeronautics and Astronautics Stanford University Stanford, CA 94305 Email: spk170@stanford.edu Aeronautics and Astronautics Stanford

More information

Challenges for the European Automotive Software Industry

Challenges for the European Automotive Software Industry Challenges for the European Automotive Software Industry Viewpoint of a safety supplier 28 th April 2010 Franck Lesbroussart What Trends do we see? Integration of functions Functionalities are expanding

More information

Online Learning for Offroad Robots: Using Spatial Label Propagation to Learn Long Range Traversability

Online Learning for Offroad Robots: Using Spatial Label Propagation to Learn Long Range Traversability Online Learning for Offroad Robots: Using Spatial Label Propagation to Learn Long Range Traversability Raia Hadsell1, Pierre Sermanet1,2, Jan Ben2, Ayse Naz Erkan1, Jefferson Han1, Beat Flepp2, Urs Muller2,

More information

automatic road sign detection from survey video

automatic road sign detection from survey video automatic road sign detection from survey video by paul stapleton gps4.us.com 10 ACSM BULLETIN february 2012 tech feature In the fields of road asset management and mapping for navigation, clients increasingly

More information

Freescale s Advanced Processor for Next Generation ADAS: S32V234

Freescale s Advanced Processor for Next Generation ADAS: S32V234 Technology industry Reporting In-sights Advisory Services Freescale s S32V ADAS Processor Family Whitepaper Sponsored by Freescale March 02, 2015 Freescale s Advanced Processor for Next Generation ADAS:

More information

Colorado School of Mines Computer Vision Professor William Hoff

Colorado School of Mines Computer Vision Professor William Hoff Professor William Hoff Dept of Electrical Engineering &Computer Science http://inside.mines.edu/~whoff/ 1 Introduction to 2 What is? A process that produces from images of the external world a description

More information

Multi-view Intelligent Vehicle Surveillance System

Multi-view Intelligent Vehicle Surveillance System Multi-view Intelligent Vehicle Surveillance System S. Denman, C. Fookes, J. Cook, C. Davoren, A. Mamic, G. Farquharson, D. Chen, B. Chen and S. Sridharan Image and Video Research Laboratory Queensland

More information

Deterministic Sampling-based Switching Kalman Filtering for Vehicle Tracking

Deterministic Sampling-based Switching Kalman Filtering for Vehicle Tracking Proceedings of the IEEE ITSC 2006 2006 IEEE Intelligent Transportation Systems Conference Toronto, Canada, September 17-20, 2006 WA4.1 Deterministic Sampling-based Switching Kalman Filtering for Vehicle

More information

Visual Perception and Tracking of Vehicles for Driver Assistance Systems

Visual Perception and Tracking of Vehicles for Driver Assistance Systems 3-11 Visual Perception and Tracking of Vehicles for Driver Assistance Systems Cristina Hilario, Juan Manuel Collado, Jose Maria Armingol and Arturo de la Escalera Intelligent Systems Laboratory, Department

More information

Communicating Agents Architecture with Applications in Multimodal Human Computer Interaction

Communicating Agents Architecture with Applications in Multimodal Human Computer Interaction Communicating Agents Architecture with Applications in Multimodal Human Computer Interaction Maximilian Krüger, Achim Schäfer, Andreas Tewes, Rolf P. Würtz Institut für Neuroinformatik, Ruhr-Universität

More information

AAA AUTOMOTIVE ENGINEERING

AAA AUTOMOTIVE ENGINEERING AAA AUTOMOTIVE ENGINEERING Evaluation of Blind Spot Monitoring and Blind Spot Intervention Technologies 2014 AAA conducted research on blind-spot monitoring systems in the third quarter of 2014. The research

More information

Nighttime Vehicle Distance Alarm System

Nighttime Vehicle Distance Alarm System Proceedings of the 7th WSEAS Int. Conf. on Signal Processing, Computational Geometry & Artificial Vision, Athens, Greece, August 24-26, 2007 226 ighttime Vehicle Distance Alarm System MIG-CI LU *, WEI-YE

More information

A Multisensor Multiobject Tracking System for an Autonomous Vehicle Driving in an Urban Environment *

A Multisensor Multiobject Tracking System for an Autonomous Vehicle Driving in an Urban Environment * A Multisensor Multiobject Tracking ystem for an Autonomous Vehicle Driving in an Urban Environment * Michael Darms 1, Paul E. Rybski 2, Chris Urmson 2 1 Continental, Chassis & afety Division, 2 Carnegie

More information

PASSIVE DRIVER GAZE TRACKING WITH ACTIVE APPEARANCE MODELS

PASSIVE DRIVER GAZE TRACKING WITH ACTIVE APPEARANCE MODELS PASSIVE DRIVER GAZE TRACKING WITH ACTIVE APPEARANCE MODELS Takahiro Ishikawa Research Laboratories, DENSO CORPORATION Nisshin, Aichi, Japan Tel: +81 (561) 75-1616, Fax: +81 (561) 75-1193 Email: tishika@rlab.denso.co.jp

More information

MODULAR TRAFFIC SIGNS RECOGNITION APPLIED TO ON-VEHICLE REAL-TIME VISUAL DETECTION OF AMERICAN AND EUROPEAN SPEED LIMIT SIGNS

MODULAR TRAFFIC SIGNS RECOGNITION APPLIED TO ON-VEHICLE REAL-TIME VISUAL DETECTION OF AMERICAN AND EUROPEAN SPEED LIMIT SIGNS MODULAR TRAFFIC SIGNS RECOGNITION APPLIED TO ON-VEHICLE REAL-TIME VISUAL DETECTION OF AMERICAN AND EUROPEAN SPEED LIMIT SIGNS Fabien Moutarde and Alexandre Bargeton Robotics Laboratory Ecole des Mines

More information

HAVEit. Reiner HOEGER Director Systems and Technology CONTINENTAL AUTOMOTIVE

HAVEit. Reiner HOEGER Director Systems and Technology CONTINENTAL AUTOMOTIVE HAVEit Reiner HOEGER Director Systems and Technology CONTINENTAL AUTOMOTIVE HAVEit General Information Project full title: Project coordinator: Highly Automated Vehicles for Intelligent Transport Dr. Reiner

More information

Automatic Traffic Estimation Using Image Processing

Automatic Traffic Estimation Using Image Processing Automatic Traffic Estimation Using Image Processing Pejman Niksaz Science &Research Branch, Azad University of Yazd, Iran Pezhman_1366@yahoo.com Abstract As we know the population of city and number of

More information

Speed Performance Improvement of Vehicle Blob Tracking System

Speed Performance Improvement of Vehicle Blob Tracking System Speed Performance Improvement of Vehicle Blob Tracking System Sung Chun Lee and Ram Nevatia University of Southern California, Los Angeles, CA 90089, USA sungchun@usc.edu, nevatia@usc.edu Abstract. A speed

More information

A STUDY ON WARNING TIMING FOR LANE CHANGE DECISION AID SYSTEMS BASED ON DRIVER S LANE CHANGE MANEUVER

A STUDY ON WARNING TIMING FOR LANE CHANGE DECISION AID SYSTEMS BASED ON DRIVER S LANE CHANGE MANEUVER A STUDY ON WARNING TIMING FOR LANE CHANGE DECISION AID SYSTEMS BASED ON DRIVER S LANE CHANGE MANEUVER Takashi Wakasugi Japan Automobile Research Institute Japan Paper Number 5-29 ABSTRACT The purpose of

More information

Service Training. Self-study Programme 396. Lane Change Assist. Design and function

Service Training. Self-study Programme 396. Lane Change Assist. Design and function Service Training Self-study Programme 396 Lane Change Assist Design and function Lane change assist is a further technical innovation in driver assistance systems. This system is designed to prevent accidents.

More information

RACCOON: A Real-time Autonomous Car Chaser Operating Optimally at Night

RACCOON: A Real-time Autonomous Car Chaser Operating Optimally at Night Appears in: Proceedings of IEEE Intelligent Vehicles 1993 RACCOON: A Real-time Autonomous Car Chaser Operating Optimally at Night Rahul Sukthankar Robotics Institute Carnegie Mellon University 5000 Forbes

More information

E70 Rear-view Camera (RFK)

E70 Rear-view Camera (RFK) Table of Contents (RFK) Subject Page Introduction..................................................3 Rear-view Camera..............................................3 Input/Output...................................................4

More information

Analecta Vol. 8, No. 2 ISSN 2064-7964

Analecta Vol. 8, No. 2 ISSN 2064-7964 EXPERIMENTAL APPLICATIONS OF ARTIFICIAL NEURAL NETWORKS IN ENGINEERING PROCESSING SYSTEM S. Dadvandipour Institute of Information Engineering, University of Miskolc, Egyetemváros, 3515, Miskolc, Hungary,

More information

False alarm in outdoor environments

False alarm in outdoor environments Accepted 1.0 Savantic letter 1(6) False alarm in outdoor environments Accepted 1.0 Savantic letter 2(6) Table of contents Revision history 3 References 3 1 Introduction 4 2 Pre-processing 4 3 Detection,

More information

Visual Servoing using Fuzzy Controllers on an Unmanned Aerial Vehicle

Visual Servoing using Fuzzy Controllers on an Unmanned Aerial Vehicle Visual Servoing using Fuzzy Controllers on an Unmanned Aerial Vehicle Miguel A. Olivares-Méndez mig olivares@hotmail.com Pascual Campoy Cervera pascual.campoy@upm.es Iván Mondragón ivanmond@yahoo.com Carol

More information

Accidents with Pedestrians and Cyclists in Germany Findings and Measures

Accidents with Pedestrians and Cyclists in Germany Findings and Measures Accidents with Pedestrians and Cyclists in Germany Findings and Measures Siegfried Brockmann Unfallforschung der Versicherer (UDV) May 7th, Geneva 2 Content 2 Accident situation in Germany based on National

More information

SHOULDN T CARS REACT AS DRIVERS EXPECT? E-mail: jan-erik.kallhammer@autoliv.com. Linköping University, Sweden

SHOULDN T CARS REACT AS DRIVERS EXPECT? E-mail: jan-erik.kallhammer@autoliv.com. Linköping University, Sweden SHOULDN T CARS REACT AS DRIVERS EXPECT? Jan-Erik Källhammer, 1,2 Kip Smith, 2 Johan Karlsson, 1 Erik Hollnagel 2 1 Autoliv Research, Vårgårda, Sweden E-mail: jan-erik.kallhammer@autoliv.com 2 Department

More information

A Cognitive Approach to Vision for a Mobile Robot

A Cognitive Approach to Vision for a Mobile Robot A Cognitive Approach to Vision for a Mobile Robot D. Paul Benjamin Christopher Funk Pace University, 1 Pace Plaza, New York, New York 10038, 212-346-1012 benjamin@pace.edu Damian Lyons Fordham University,

More information

SMART DRUNKEN DRIVER DETECTION AND SPEED MONITORING SYSTEM FOR VEHICLES

SMART DRUNKEN DRIVER DETECTION AND SPEED MONITORING SYSTEM FOR VEHICLES SMART DRUNKEN DRIVER DETECTION AND SPEED MONITORING SYSTEM FOR VEHICLES Bandi Sree Geeta 1, Diwakar R. Marur 2 1,2 Department of Electronics and Communication Engineering, SRM University, (India) ABSTRACT

More information

Naturalistic Cycling Studies

Naturalistic Cycling Studies Naturalistic Cycling Studies November 26 th 2013 Marco Dozza CHALMERS University of Technology SAFER presentation for Japan SAFER Goals Phase 1 #1 071220 Summary The cycling safety problem Naturalistic

More information

INTRODUCTION TO NEURAL NETWORKS

INTRODUCTION TO NEURAL NETWORKS INTRODUCTION TO NEURAL NETWORKS Pictures are taken from http://www.cs.cmu.edu/~tom/mlbook-chapter-slides.html http://research.microsoft.com/~cmbishop/prml/index.htm By Nobel Khandaker Neural Networks An

More information

Robotics. Lecture 3: Sensors. See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information.

Robotics. Lecture 3: Sensors. See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information. Robotics Lecture 3: Sensors See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information. Andrew Davison Department of Computing Imperial College London Review: Locomotion Practical

More information

PHOTOGRAMMETRIC TECHNIQUES FOR MEASUREMENTS IN WOODWORKING INDUSTRY

PHOTOGRAMMETRIC TECHNIQUES FOR MEASUREMENTS IN WOODWORKING INDUSTRY PHOTOGRAMMETRIC TECHNIQUES FOR MEASUREMENTS IN WOODWORKING INDUSTRY V. Knyaz a, *, Yu. Visilter, S. Zheltov a State Research Institute for Aviation System (GosNIIAS), 7, Victorenko str., Moscow, Russia

More information

Crater detection with segmentation-based image processing algorithm

Crater detection with segmentation-based image processing algorithm Template reference : 100181708K-EN Crater detection with segmentation-based image processing algorithm M. Spigai, S. Clerc (Thales Alenia Space-France) V. Simard-Bilodeau (U. Sherbrooke and NGC Aerospace,

More information

CSCI 445 Amin Atrash. Ultrasound, Laser and Vision Sensors. Introduction to Robotics L. Itti & M. J. Mataric

CSCI 445 Amin Atrash. Ultrasound, Laser and Vision Sensors. Introduction to Robotics L. Itti & M. J. Mataric Introduction to Robotics CSCI 445 Amin Atrash Ultrasound, Laser and Vision Sensors Today s Lecture Outline Ultrasound (sonar) Laser range-finders (ladar, not lidar) Vision Stereo vision Ultrasound/Sonar

More information

Accident Prevention Using Eye Blinking and Head Movement

Accident Prevention Using Eye Blinking and Head Movement Accident Prevention Using Eye Blinking and Head Movement Abhi R. Varma Seema V. Arote Asst. prof Electronics Dept. Kuldeep Singh Chetna Bharti ABSTRACT This paper describes a real-time online prototype

More information

Global Automotive Conference

Global Automotive Conference Global Automotive Conference New York, 19 November, 2014 Henrik Kaar, Director Corporate Communications Driven for Life. Autoliv, Inc. All Rights Reserved. Safe Harbor Statement * This presentation contains

More information

IN the United States, tens of thousands of drivers and passengers

IN the United States, tens of thousands of drivers and passengers IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, VOL. 14, NO. 4, DECEMBER 2013 1773 Looking at Vehicles on the Road: A Survey of Vision-Based Vehicle Detection, Tracking, and Behavior Analysis

More information

3 rd Workshop on Naturalistic Driving Data Analytics (Tentative) Programme 19 June 2016

3 rd Workshop on Naturalistic Driving Data Analytics (Tentative) Programme 19 June 2016 3 rd Workshop on Naturalistic Driving Data Analytics (Tentative) Programme 19 June 2016 09:00-09:05 Welcome 09:05-09:40 Analysis of non-critical left turns at intersections and LTAP/OD crashes/nearcrashes

More information

Science Fiction to Reality: The Future of Automobile Insurance and Transportation Technology

Science Fiction to Reality: The Future of Automobile Insurance and Transportation Technology Michael R. Nelson Kymberly Kochis October 13, 2015 Science Fiction to Reality: The Future of Automobile Insurance and Transportation Technology INSURANCE AND FINANCIAL SERVICES LITIGATION WEBINAR SERIES

More information

Keywords drowsiness, image processing, ultrasonic sensor, detection, camera, speed.

Keywords drowsiness, image processing, ultrasonic sensor, detection, camera, speed. EYE TRACKING BASED DRIVER DROWSINESS MONITORING AND WARNING SYSTEM Mitharwal Surendra Singh L., Ajgar Bhavana G., Shinde Pooja S., Maske Ashish M. Department of Electronics and Telecommunication Institute

More information

Common platform for automated trucks and construction equipment

Common platform for automated trucks and construction equipment Common platform for automated trucks and construction equipment Erik Nordin, Advanced Technology and Research Common platform for automated trucks and construction equipment What basic principles should

More information

Toyota s Safety Initiatives

Toyota s Safety Initiatives JITI Safer Vehicle Seminar Toyota s Safety Initiatives Toyota Motor Corporation March 20, 2013 Countermeasures taken into vehicles 1. Manufactures have continued developing and innovating various safety

More information

Team Information.

Team Information. Picture of vehicle: Name of vehicle: Kurt3D Picture of team leader: Name of team leader: Andreas Nüchter Team Name: Kurt3D Team E-mail: andreas.nuechter@informatik.uni-osnabrueck.de Website: http://www.informatik.uni-osnabrueck.de/nuechter

More information

LOCAL SURFACE PATCH BASED TIME ATTENDANCE SYSTEM USING FACE. indhubatchvsa@gmail.com

LOCAL SURFACE PATCH BASED TIME ATTENDANCE SYSTEM USING FACE. indhubatchvsa@gmail.com LOCAL SURFACE PATCH BASED TIME ATTENDANCE SYSTEM USING FACE 1 S.Manikandan, 2 S.Abirami, 2 R.Indumathi, 2 R.Nandhini, 2 T.Nanthini 1 Assistant Professor, VSA group of institution, Salem. 2 BE(ECE), VSA

More information

Tracking and integrated navigation Konrad Schindler

Tracking and integrated navigation Konrad Schindler Tracking and integrated navigation Konrad Schindler Institute of Geodesy and Photogrammetry Tracking Navigation needs predictions for dynamic objects estimate trajectories in 3D world coordinates and extrapolate

More information

Traffic Monitoring Systems. Technology and sensors

Traffic Monitoring Systems. Technology and sensors Traffic Monitoring Systems Technology and sensors Technology Inductive loops Cameras Lidar/Ladar and laser Radar GPS etc Inductive loops Inductive loops signals Inductive loop sensor The inductance signal

More information

Poker Vision: Playing Cards and Chips Identification based on Image Processing

Poker Vision: Playing Cards and Chips Identification based on Image Processing Poker Vision: Playing Cards and Chips Identification based on Image Processing Paulo Martins 1, Luís Paulo Reis 2, and Luís Teófilo 2 1 DEEC Electrical Engineering Department 2 LIACC Artificial Intelligence

More information

Sequence. Liang Zhao and Chuck Thorpe. Robotics Institute, Carnegie Mellon University, Pittsburgh, PA 15213. E-mail: flzhao, cetg@ri.cmu.

Sequence. Liang Zhao and Chuck Thorpe. Robotics Institute, Carnegie Mellon University, Pittsburgh, PA 15213. E-mail: flzhao, cetg@ri.cmu. Proc. CVPR'98, Santa Barbara, CA, June -, pp. 9-, 998 Qualitative and Quantitative Car Tracking from a Range Image Sequence Liang Zhao and Chuck Thorpe Robotics Institute, Carnegie Mellon University, Pittsburgh,

More information