Development of Collision Avoidance Systems at Delphi Automotive Systems


Glenn R. Widmann, William A. Bauson, and Steven W. Alland

Delphi Automotive Systems, Delphi Delco Electronics Systems, 5725 Delphi Drive, Troy, MI 48098-2815 USA
Delphi Automotive Systems, Delphi Delco Electronics Systems, P.O. Box 9005, Kokomo, IN 46904-9005 USA
HE Microwave, P.O. Box 23340, Tucson, AZ 85734 USA

Abstract

Since the late 1980s, Delphi Automotive Systems has been deeply involved in the practical development of a variety of collision avoidance products for the near- and long-term automotive market. Many of these products will incorporate millimeter-wave radar as the primary detection sensor for detecting and identifying objects that pose a conflicting threat to the host vehicle. The design of these products has been approached from a system perspective, with particular emphasis placed on performance, package size, and cost. The critical issues and other technical challenges in developing these systems are explored.

1. INTRODUCTION

Tremendous progress has been made since the 1960s with regard to vehicle safety. Early safety approaches emphasized precaution (e.g., surviving a crash) and focused on such passive devices as seat belts, air bags, crash zones, and lighting. These improvements have dramatically reduced the rate of crash-related injury severity and fatalities. For example, the fatality rate per hundred million vehicle miles traveled fell from 5.5 in the mid-1960s to 1.7 in 1994. However, in spite of these impressive improvements, each year in the United States motor vehicle crashes still account for a staggering 40,000 deaths, more than 3 million injuries, and over $150 billion in economic losses [1]. Greater demand for improvements in vehicular transportation safety, fueled by government and consumers alike, is compelling the automotive manufacturing community to constantly seek innovative technologies and products that can help achieve further reductions in crash statistics. The emphasis of these future systems will migrate from passive safety (e.g., crash precaution) to active safety (e.g., crash prevention). Consequently, the introduction of collision warning/avoidance systems has the potential to represent the next significant leap in vehicle safety technology: such systems attempt to actively warn the driver of a potential impending collision, thereby allowing the driver adequate time to take appropriate corrective action to mitigate, or completely avoid, the event.

Crash statistics and numerical analysis strongly suggest that collision warning systems will be effective. Crash-related data collected by the U.S. National Highway Traffic Safety Administration (NHTSA) show that approximately 88% of rear-end collisions are caused by driver inattention and following too closely. These types of crash events could derive a positive benefit from such systems. In fact, NHTSA countermeasure effectiveness modeling has determined that headway detection systems of this kind can theoretically prevent 37% to 74% of all police-reported rear-end crashes [2][3]. For such systems to gain widespread acceptance by the general automotive consumer, they must be reasonably priced and must provide highly reliable performance and functionality. To achieve these goals, Delphi Automotive Systems has relied heavily on system engineering principles as a framework to guide this highly focused product design effort.
This paper discusses some of the critical issues encountered in the steps toward achieving the ultimate goal of a collision avoidance system.

2. SYSTEM EVOLUTION & ARCHITECTURE

The technology roadmap toward a complete collision avoidance product is shown in Figure 1. The strategy begins with an Adaptive Cruise Control (ACC) system, and each succeeding product provides increased functionality over the preceding one. The various combinations of subsystems will eventually yield a complete family of collision avoidance products, such as ACC, Lane Change, Lane Departure, Roadway Departure, Lane Keeping, Parking Aid, Reversing Aid, etc. These systems will be introduced based on a variety of factors (e.g., technology maturity, performance, packaging, costs, etc.).

Figure 1: Collision Avoidance Evolution. The roadmap progresses from today's products toward future ones as follows:
- Cruise Control (today): driver-controlled system; no dynamics.
- Adaptive Cruise Control: throttle control with limited braking; no stopped-object identification; no warning.
- Enhanced Adaptive Cruise Control: identifies stopped objects and provides limited warning; provides Low Speed Cruise and Stop & Go capabilities.
- Forward Collision Warning: enhanced ACC capabilities with full alert functionality; vision required.
- Collision Warning: partial all-vehicle coverage with alert function; lane/road departure alerts.
- Collision Intervention: limited all-vehicle coverage; complete throttle and brake control with limited steering.
- Collision Avoidance (future): complete 360° vehicle coverage; braking and steering to avoid the object.

The development processes for these products are not unique; each fits within a common framework that builds upon the achievements of the preceding subsystem. Figure 2 illustrates the hierarchical structure that guides all of the integration and development processes toward Collision Avoidance. Information from the primary active sensing subsystems (e.g., forward radar and vision sensors, GPS/map system, side/rear sensors) and vehicle state sensors (e.g., speed, yaw, accelerometers, etc.) is processed by the Processing Module in order to reconstruct the traffic environment about the host vehicle. Within this module, sensor fusion techniques are employed to assess, evaluate, and combine the parametric information yielded by all of the active sensing subsystems, in conjunction with the host vehicle states, into reliable parametric features that improve the performance of object detection, tracking, and in-path target identification and selection. Sophisticated model-based scene tracking techniques are also employed to improve the in-path target identification process [4]. Once the in-path target has been identified, situational awareness algorithms evaluate whether this target presents a potential threat to the host vehicle. If a potential threat exists, appropriate smooth corrective vehicle control actions (e.g., brakes, throttle, steering, etc.) and driver-vehicle interface cues (e.g., visual, auditory, and tactile warnings) are applied in order to minimize the risk. Selected instrument panel subsystems (e.g., windshield wiper control, HVAC, radio adjustments, etc.) are continually monitored and are used to further enhance the threat assessment process.
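To make the flow through the Processing Module concrete, the following Python sketch mimics the chain from sensed targets and host state to in-path target selection and a simple threat metric, in the spirit of the Figure 2 architecture summarized below. It is a minimal illustration only: the class names, the yaw-rate-based path estimate, the lane-width gate, and the time-to-collision metric are assumptions for this example and are not Delphi's algorithms.

import math
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class RadarTarget:
    range_m: float         # longitudinal distance to the target
    range_rate_mps: float  # closing speed (negative = closing)
    azimuth_deg: float     # bearing relative to the host boresight


@dataclass
class HostState:
    speed_mps: float
    yaw_rate_rps: float    # used to estimate the host's projected path curvature


def estimated_path_radius(host: HostState) -> float:
    """Approximate the host path radius of curvature from speed and yaw rate."""
    if abs(host.yaw_rate_rps) < 1e-4:
        return float("inf")  # effectively a straight path
    return host.speed_mps / host.yaw_rate_rps


def select_in_path_target(targets: List[RadarTarget],
                          host: HostState,
                          lane_half_width_m: float = 1.8) -> Optional[RadarTarget]:
    """Pick the nearest target whose lateral offset from the predicted host path
    is within roughly half a lane width (a crude stand-in for in-path selection)."""
    radius = estimated_path_radius(host)
    in_path = []
    for t in targets:
        lateral = t.range_m * math.sin(math.radians(t.azimuth_deg))
        # Lateral displacement of the predicted circular path at this range.
        path_offset = 0.0 if radius == float("inf") else (t.range_m ** 2) / (2.0 * radius)
        if abs(lateral - path_offset) <= lane_half_width_m:
            in_path.append(t)
    return min(in_path, key=lambda t: t.range_m) if in_path else None


def time_to_collision_s(target: RadarTarget) -> float:
    """Simple kinematic threat metric: range divided by closing speed."""
    closing = -target.range_rate_mps
    return target.range_m / closing if closing > 0 else float("inf")


if __name__ == "__main__":
    host = HostState(speed_mps=30.0, yaw_rate_rps=0.02)
    targets = [RadarTarget(60.0, -8.0, 1.0), RadarTarget(45.0, 2.0, 8.0)]
    threat = select_in_path_target(targets, host)
    if threat is not None:
        print(f"In-path target at {threat.range_m:.0f} m, TTC = {time_to_collision_s(threat):.1f} s")

In a production system, the sensor fusion, scene model, and driver model stages described above would replace the crude lane-width gate and single-sensor input used in this sketch.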

Figure 2: Conceptual Architecture of Collision Avoidance. The block diagram groups the elements as follows:
- Sensing inputs: side/rear sensors (target detection, target kinematic attributes); forward radar sensor (target detection, target kinematic attributes); forward vision (lane detection, roadway curvature, host vehicle lateral position in lane); GPS/map (roadway type, roadway curvature, off/on ramps, bridges); in-vehicle sensors (differential speed, speed, steering angle, yaw rate).
- Processing Module: sensor fusion (sensor/data fusion algorithms); target vehicle tracking (scene model, path determination); host vehicle tracking (vehicle model, path determination); in-path target identification (in-path target discrimination, identification, and selection); situational assessment (threat assessment, situation model, control action, driver model, in-vehicle driver monitoring).
- Outputs: DVI warning cues (audio, visual, haptic/tactile, HUD); active vehicle control (brakes, throttle, transmission, steering).

The architecture of Figure 2 will be seamlessly integrated into the vehicle infrastructure to provide a cohesive collision avoidance product. Figure 3 provides an example of the vehicle/system integration. The central processor for the system is the Collision Avoidance Processor (CAP), which provides the functionality of the Processing Module. It is envisioned that such a collision avoidance system will provide 360° coverage about the host vehicle.

Figure 3: Vehicle Mechanization of Collision Avoidance. The CAP communicates over an interface bus with the driver/vehicle interface (visual: HUD alerts such as "BRAKE! BRAKE! BRAKE!", instrument panel, rear/side mirrors; audio: chimes/voice, radio control; tactile: seat vibration, brake pulse, steering shaker, resistive steering; plus a driver monitor), the vehicle state interface (forward radar sensor, rear detection sensor, side detection sensor, vision, GPS/map, and vehicle sensors such as inertial, speed, and steering), and the active vehicle control actuators (brakes, steering, throttle, transmission).

3. ACC SYSTEM

The first product introduction on the path toward Collision Avoidance functionality will be Adaptive Cruise Control (ACC). ACC is a customer convenience product that enhances the functionality of standard cruise control. It automatically adjusts the host vehicle speed in order to maintain a driver-specified, adjustable timed-headway (e.g., 1 sec, 2 sec, etc.) distance behind a lead vehicle. A forward-looking detection sensor (e.g., radar or lidar) is used to detect and assess the

kinematic target attributes (e.g., range, range rate, angle, etc.) of objects in front of the host vehicle. Once the in-path target is identified, automatic braking and throttle control are applied to appropriately adjust the host vehicle's speed in order to maintain the predetermined timed headway to the lead vehicle.

The minimum sensor requirements for the ACC system are based on the ability to provide smooth, adequate vehicle control without the need for driver intervention. The minimum requirements are a function of:
(a) maximum timed headway
(b) maximum allowable longitudinal deceleration
(c) maximum difference in velocity
(d) minimum closest approach
(e) maximum allowable lateral acceleration
(f) maximum operating velocity

A reasonable ACC design guideline that assists in defining the maximum sensor range is based on a driver-selectable timed headway of between 1 and 2 seconds. Figure 4 illustrates the resulting following distance as a function of host vehicle speed and timed headway. Thus, if the maximum allowable ACC set speed is 100 mph (approximately 160 kph) with a timed headway of 2 seconds, then the required following distance is approximately 90 meters. However, in order to maintain continuous steady-state ACC control, the sensor range needs to exceed the following range by an appropriate amount (e.g., 10%). Thus, for this analysis, the maximum sensor detection range needs to be approximately 100 m.

In addition to maintaining steady-state timed-headway control, the ACC system is also required to react reasonably as the host vehicle approaches a lead vehicle traveling at a much slower speed. That is, without driver intervention, the ACC system should be capable of slowing the host vehicle to a distance no closer than a given minimum distance to the lead vehicle (which may temporarily violate the desired timed headway) and then successfully re-establishing the appropriate timed headway. Figure 5 provides the required sensor range as a function of the point of closest approach to the lead vehicle and the difference in vehicle velocities, given a maximum ACC-brake deceleration of 0.2 g. As shown, a maximum sensor range of 80 to 100 m is again required for a velocity difference of 30 to 40 mph, assuming a closest point of approach of 20 to 30 m.

The minimum azimuth sensor field-of-view (FOV) required is based on providing ACC operation through curves. The FOV requirement is driven by the minimum roadway curvature (i.e., radius of curvature) and the maximum range. Highway design standards that establish the minimum radius of curvature as a function of vehicle speed are shown in Figure 6 [5]. These standards are conservatively based on a maximum lateral acceleration of 0.13 g. Figure 7 provides the minimum sensor FOV as a function of range and radius of curvature. Figures 4-7 are used together to determine the sensor FOV. For example, at a speed of 55 mph the ACC following distance is 50 m (from Figure 4) and the minimum radius of curvature is 300 m (see Figure 6), which leads to a minimum sensor FOV of ±5° (see Figure 7). The situation of approaching a lead vehicle with a large difference in velocity leads to a similar requirement.
Given a host vehicle speed of 70 mph, a lead vehicle speed of 35 mph (i.e., a 35 mph differential speed), and a closest point of approach of 30 m, the required sensor range is 90 m (from Figure 5) and the minimum radius of curvature is 500 m (see Figure 6), which also leads to a minimum sensor FOV of ±5°. Additional FOV (i.e., overscan) is needed to accommodate mechanical or electrical misalignment of the antenna relative to the sensor enclosure (e.g., ±0.5°) and mechanical misalignment of the sensor enclosure caused by installation tolerances during automotive assembly (e.g., ±2°). Otherwise, precise alignment by mechanical or RF means would be required during installation, which would be time consuming and costly. Accounting for these tolerances (±2.5° total) leads to a desirable sensor FOV of ±7.5°.
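The range and FOV sizing above can be reproduced numerically. The Python sketch below assumes a simple constant-deceleration closing model and a chord-geometry approximation for the FOV on a curve; the function names and exact formulas are illustrative assumptions rather than Delphi's design equations, so the outputs only approximately match Figures 4-7.

import math

G = 9.81          # m/s^2
MPH_TO_MPS = 0.44704


def following_range_m(speed_mph: float, headway_s: float) -> float:
    """Steady-state ACC following distance (cf. Figure 4): range = speed x timed headway."""
    return speed_mph * MPH_TO_MPS * headway_s


def closing_sensor_range_m(delta_v_mph: float, closest_approach_m: float,
                           decel_g: float = 0.2) -> float:
    """Range needed to shed a velocity difference delta_v before reaching the
    closest allowed approach, at constant ACC-brake deceleration (cf. Figure 5)."""
    dv = delta_v_mph * MPH_TO_MPS
    return closest_approach_m + dv ** 2 / (2.0 * decel_g * G)


def min_fov_deg(range_m: float, radius_of_curvature_m: float) -> float:
    """Half-angle azimuth FOV needed to keep a target at the given range in view
    on a curve of the given radius, using chord geometry (cf. Figure 7)."""
    return math.degrees(math.asin(range_m / (2.0 * radius_of_curvature_m)))


if __name__ == "__main__":
    # 100 mph set speed, 2 s headway, +10% margin -> roughly 100 m sensor range.
    r = following_range_m(100, 2.0)
    print(f"Following range: {r:.0f} m, with 10% margin: {1.1 * r:.0f} m")

    # 35 mph velocity difference, 30 m closest approach, 0.2 g -> roughly 90 m.
    print(f"Closing sensor range: {closing_sensor_range_m(35, 30):.0f} m")

    # 50 m range on a 300 m curve, or 90 m on a 500 m curve -> roughly +/-5 deg.
    print(f"FOV, 55 mph case: +/-{min_fov_deg(50, 300):.1f} deg")
    print(f"FOV, 70 mph case: +/-{min_fov_deg(90, 500):.1f} deg")

Running the sketch gives approximately 89 m and 98 m for the following and sensor ranges, 92 m for the closing case, and ±4.8° and ±5.2° for the two FOV cases, consistent with the values read from the figures.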

Figure 4: Steady-state ACC Following Range (following range in meters versus host speed in mph, for timed headways of 1 s and 2 s).

Figure 5: Required Sensor Range for a Slow-Moving Lead Vehicle (sensor range in meters versus closest approach in meters, for velocity differences of 20 to 40 mph at 0.2 g deceleration).

Figure 6: Minimum Radius of Curvature versus Speed on a Curve (minimum radius of curvature in meters versus speed in mph, for roadway bank angles e = 0.0, 0.04, and 0.1).

Figure 7: Azimuth FOV Requirement (azimuth FOV in degrees versus range in meters, for radii of curvature of 300 m and 500 m).

4. DATA ACQUISITION SYSTEMS (DAS)

It was recognized very early in the development of these systems that an extensive, user-friendly DAS would be required. These DAS tools are used to: (a) observe real-time system behavior during on-road evaluation, (b) provide system analysis, (c) quantify system performance, and (d) quantitatively measure system improvements resulting from algorithm refinements. It was also required that the DAS architecture be robust, flexible, and extensible in order to allow future growth.

Laptop computers are interfaced to the various system components (e.g., detection sensor, CAP, etc.) and are used to dynamically collect a large variety of system events (at 100 millisecond intervals), such as: (a) host vehicle states (e.g., speed, steering angle, yaw rate, etc.), (b) detected target states (e.g., range, relative speed, angle, angular extent, etc.), (c) the in-path target, and (d) vehicle control actions and warning cues. In addition, a video system (i.e., camera and video recorder) is used to provide time-stamped video. This video is used only for DAS purposes and does not perform any lane tracking functionality.

The DAS provides the ability to preview real-time system performance while performing on-road testing activities in system vehicles. It also has the capability to replay the collected data in a laboratory setting for more detailed post-processing engineering evaluation. The collected data set can be used to reconstruct the state of the system and the environment about the vehicle.
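As an illustration of the record-and-replay workflow described above, the Python sketch below logs hypothetical host and target states at 100 ms intervals to a CSV file and feeds them back through a candidate algorithm. The record fields and file format are invented for this example and do not reflect the actual Delphi DAS.

import csv
from dataclasses import dataclass, asdict, fields


@dataclass
class DasRecord:
    time_s: float             # timestamp, sampled at 100 ms intervals
    host_speed_mps: float
    host_steer_deg: float
    host_yaw_rate_rps: float
    target_range_m: float
    target_rel_speed_mps: float
    target_azimuth_deg: float
    in_path: bool             # in-path target flag from the path algorithms
    warning_issued: bool      # DVI warning cue active


def log_records(path: str, records: list) -> None:
    """Write sampled records to a CSV log for later laboratory replay."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(DasRecord)])
        writer.writeheader()
        for rec in records:
            writer.writerow(asdict(rec))


def replay(path: str, algorithm) -> None:
    """Feed logged data back through a candidate algorithm so its behavior can be
    compared against what was recorded on the road."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rec = DasRecord(
                time_s=float(row["time_s"]),
                host_speed_mps=float(row["host_speed_mps"]),
                host_steer_deg=float(row["host_steer_deg"]),
                host_yaw_rate_rps=float(row["host_yaw_rate_rps"]),
                target_range_m=float(row["target_range_m"]),
                target_rel_speed_mps=float(row["target_rel_speed_mps"]),
                target_azimuth_deg=float(row["target_azimuth_deg"]),
                in_path=row["in_path"] == "True",
                warning_issued=row["warning_issued"] == "True",
            )
            algorithm(rec)


if __name__ == "__main__":
    samples = [DasRecord(0.1 * i, 30.0, 0.5, 0.01, 60.0 - i, -1.0, 0.5, True, False)
               for i in range(5)]
    log_records("das_log.csv", samples)
    replay("das_log.csv", lambda rec: print(f"t={rec.time_s:.1f}s range={rec.target_range_m:.1f} m"))

The same replay hook can be pointed at a refined algorithm, which mirrors how logged drives are reused to quantify improvements without repeating the on-road test.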

The data set may also be replayed into the system algorithms in order to quantify the performance improvements resulting from algorithm refinements. This ability to replay measured data has provided a real benefit in understanding system performance issues and has provided an ideal engineering environment for the rapid prototyping and evaluation of system algorithms.

The video-based DAS is a useful tool for reviewing lengthy time segments of on-road data and isolating those small time segments that reveal marginal or questionable system performance. Once such a segment is identified, the voluminous files of more detailed data can be examined carefully to investigate the precise cause of the observed anomaly.

The addition of the video system provided a novel real-time approach to simultaneously displaying the synthetically generated rectangular targets, as detected by the detection sensor, together with the actual visually detected targets, as seen by the video system. This video-based diagnostic configuration is presented in a split-screen format, as illustrated by Figure 8 (Video-based Data Acquisition). In the upper portion, box-like imagery displays the detected targets, while the lower portion displays the real-time video imagery of the roadway environment ahead of the host vehicle. The box-like imagery is synthetically generated by the DAS, based on the target data states as seen by the detection sensor. These boxes are color-coded and variable-sized and represent the detected targets from the perspective of the host vehicle. The size of a target box is based on the target distance and on whether the target is classified as being in the host vehicle's motion path (as determined by the path algorithms). For example, wide and narrow boxes represent in-path and not-in-path targets, respectively. The box color indicates the relative speed of the target. Also overlaid are the perceived host vehicle lane boundaries, which are generated by the DAS from the host vehicle yaw rate sensor. As shown in Figure 8, within the camera FOV, three vehicles (a passenger vehicle, a motorcycle, and a partially obscured passenger vehicle) are visually seen on a straight roadway. The detection sensor detects two targets (confirmed by the two boxes synthetically generated in the upper portion). The visually seen, partially obscured passenger vehicle is outside the radar detection sensor's FOV and consequently cannot be seen by the sensor. The path algorithms correctly classified the motorcycle as being in-lane (wide box) and the passenger vehicle as being out-of-lane (narrow box).

5. REFERENCES

[1] J.L. Blincoe; The Economic Cost of Motor Vehicle Crashes 1994, U.S. Department of Transportation (Report DOT HS 808-425), 1996.
[2] R. Knipling, et al.; Rear-End Crashes: Problem Size Assessment and Statistical Description, U.S. Department of Transportation (NHTSA Technical Report HS 807-994), Springfield, VA, 1993.
[3] R. Knipling, et al.; Assessment of IVHS Countermeasures for Collision Avoidance: Rear-End Crashes, U.S. Department of Transportation (NHTSA Technical Report HS 807-995), Springfield, VA, 1993.
[4] J. Shiffmann and G.R. Widmann; Model-Based Scene Tracking Using Radar Sensors for Intelligent Automotive Vehicle Systems, Proceedings of the IEEE Intelligent Transportation Systems Conference, Boston, MA, 1997.
[5] American Association of State Highway and Transportation Officials (AASHTO); A Policy on Geometric Design of Highways and Streets, AASHTO, Washington, DC, p. 174, 1984.