Vehicle Collision Avoidance System by CarDAR Brian Beckrest, CpE Nicholas Martinek, EE Nathan Pax, EE Advisor: Dr. Nesreen Alsbou April 30, 2015
Summary
In the 2014-15 academic year, a team of three senior engineering students at Ohio Northern University, guided by an advising professor, developed a vehicle collision avoidance system. This paper details the prototype system, which alerts drivers to their surroundings and to potentially hazardous driving situations. Such a system is motivated by the alarmingly high number of vehicle accidents that occur on roadways. Widespread implementation of collision avoidance systems would not only improve the safety and efficiency of roadway usage, but also limit human and economic loss. The prototype system works toward all of these potential benefits by utilizing ultrasonic sensors to provide side and blind spot coverage and long-range radar to detect possible frontal collisions. After demonstrating the need for vehicle collision avoidance systems, the paper discusses the early-stage development of possible solutions, including a comparison of initially proposed ideas and how the final solution was reached. A literature survey, as well as realistic constraints and marketing requirements, is provided to help validate the final decisions made by the team. The most important information for the project is shared in the design implementation and verification sections, which explain in technical detail how the prototype system works and the testing completed to verify its operation. The last sections of the paper contain information regarding the cost of the system, including a break-even analysis, along with a time schedule of how the project progressed throughout the school year.
Acknowledgements
There are numerous people we would like to thank because without their help this project would not have been a success. A huge thank you to our advisor, Dr. Nesreen Alsbou, for guiding us and providing valuable input throughout the entirety of this project. Another thank you goes to Dr. Khalid Al-Olimat, head of Senior Design, for keeping the course well organized for all of the many ECCS projects. We also greatly appreciate the rest of the engineering faculty for their support and advice. A special thank you goes to Dr. David Mikesell and the great people at the Transportation Research Center in East Liberty, Ohio for allowing us to visit the center and gain insight into vehicle collision avoidance systems currently being developed in industry. Nathan Rosenbaum receives our gratitude for allowing us to use his van to test our system. Additionally, thank you to AutonomouStuff for responding to our questions about the Delphi radar and supplying us with helpful additional documentation. Finally, thank you to our classmates, friends, and families who have supported and encouraged us during this entire endeavor.
Table of Contents
List of Figures
List of Tables
Problem Statement
Project Objective
Literature Survey
Marketing Requirements
Realistic Constraints
Design Solution
Design Integration
Design Verification
Cost of Project
Break-Even Analysis
Time Schedule
Conclusion
References
Appendix
List of Figures
Figure 1. Placement of Ultrasonic Sensors and Radar and their Coverage Areas
Figure 2. Level 0 Block Diagram Vehicle Collision Avoidance System
Figure 3. Level 1 Block Diagram Vehicle Collision Avoidance System
Figure 4. Visual Collision Warning System with All Warnings Displayed
List of Tables
Table 1. Marketing and Engineering Requirements for System
Table 2. Top Level System Decision Matrix
Table 3. Blind Spot Detection Sensors Decision Matrix
Table 4. Central Processing Decision Matrix
Table 5. Functional Requirements Vehicle Collision Avoidance System
Table 6. Functional Requirements Voltage Converter
Table 7. Functional Requirements Front Radar Sensor
Table 8. Functional Requirements Side & Rear Ultrasonic Sensors
Table 9. Functional Requirements BeagleBone Black (Processor)
Table 10. Functional Requirements Vehicle Map Display
Table 11. Functional Requirements Audible Collision Warning System
Table 12. Radar CAN Messages
Table 13. Ultrasonic Sensor Voltage Comparison
Problem Statement
According to the National Highway Traffic Safety Administration (NHTSA), there were over 5.6 million motor vehicle accidents in the United States in 2012, leading to nearly 34,000 fatalities and 2,632,000 injuries [9]. The majority of vehicular collisions and deaths occur on roads with higher speed limits. In fact, NHTSA data from 2012 shows that around 90 percent of crashes that year involved vehicles on roads with speed limits of 30 mph or greater [10]. A staggering 25,510 deaths resulted from these high-speed crashes, a number far too high to overlook [10]. Not only would a vehicle collision avoidance system increase safety on roadways, but it could also lead to more efficient road usage, since fewer accidents would mean less congestion. To prevent motor vehicle injuries and fatalities, limit economic loss, and increase roadway usage efficiency, a vehicle collision avoidance system is vital.

Project Objective
The objective of this project was to design and implement a vehicle collision avoidance system. The system was to monitor the area surrounding the host vehicle for other vehicles and warn the driver if any unsafe situation arose. For example, part of the system would alert the driver to any vehicles present in their blind spot. Another feature would monitor the roadway ahead of the car and provide warnings if the car was approaching a vehicle in front too quickly. These features were all to be made possible by using sensors on the vehicle's exterior to detect surrounding vehicles. When a situation was determined to be dangerous, the driver would be alerted with both audio and visual cues within the car. An audible sound would beep in more critical circumstances, while a visual representation of the vehicle's surroundings would be rendered on a small display mounted on the vehicle's dashboard. It would provide a top-down view of the car, with detected objects causing the appropriate location zones to be highlighted on the screen.
Literature Survey
A literature survey was conducted to gain a better understanding of the potential obstacles that come with implementing a vehicle collision avoidance system. The survey revealed a few key problem areas that the proposed system should address. These technological barriers include developing accurate sensing methods and implementing effective driver warning systems.

Vehicle Sensing
The first order of business for a vehicle collision avoidance system is detecting the location of all nearby vehicles. This task can be carried out in numerous ways, some more widely implemented than others. One popular method uses sensors installed on various sections of the vehicle to monitor areas where collisions are more likely to occur. This sensing can be done using various technologies, such as radar and video [7]. Radar is most useful for long-range detection in frontal collision avoidance, while other methods are sometimes used in short-range detection [1]. There is a stark contrast between the use of radar and the use of video. Radar is an active type of detection, in that the energy the sensor detects is the same energy it emits. Video, on the other hand, is passive, meaning that it simply receives light from its surroundings. Active detection systems are often more expensive because they require more equipment. On the other hand, they are typically more reliable than passive systems, because the algorithms required to process video of the road are very complex and prone to false positives, and sensitivity to weather conditions further limits video's effectiveness [5]. Needless to say, radar is not without its own complexities. Determining which frequency band to use has been a topic of debate for several years, although most researchers seem to agree that 77 GHz is appropriate, especially for long-range applications [1]. Research has also been done into the type of waveform that the radar should emit.
Various forms of frequency modulation have been used to improve radar quality and efficiency [2][7]. Another method of detecting other vehicles that is becoming more viable involves car-to-car communication and GPS. The theory is that every car on the road carries a Wi-Fi device that allows it to communicate with other vehicles. The vehicles repeatedly broadcast their location and speed based on a GPS device. This data lets each car create a live map of its surroundings, and analysis of that map lets each car determine any potential collisions [12]. Although this solution has a bright future, implementing it was determined to be infeasible due to the accuracy limitations of GPS. A foundational collision avoidance system consisting of sensors, such as our prototype system, would need to work in conjunction with the car-to-car communication system in order to be effective.
Driver Warning Systems
Collision warning systems have been shown to improve inattentive or distracted driver response time in rear-end collision situations. However, most previous rear-end collision warning research has not directly compared auditory, visual, and tactile warnings [11]. It is also not clear how effective these warnings are when the driver is engaged in other tasks, such as talking on a cell phone [8]. Several tests have been done to determine how different types of warnings affect driver reaction time. In these tests, sixteen participants in a fixed-base driving simulator experienced four warning conditions (no warning, visual, auditory, and tactile) under three conversation conditions (none, simple hands-free, and complex hands-free). The warnings activated when the time to collision reached a critical threshold of 3.0 or 5.0 seconds. Driver reaction time was measured from the moment the time to collision fell below the critical threshold to the initiation of braking. From the results of these tests, it was concluded that drivers who received a tactile warning had the shortest mean reaction time, with a significant advantage over drivers who received no warning and drivers who received only a visual warning [11]. Auditory warnings also significantly outperformed visual warnings when no conversation was occurring. However, during both simple and complex conversation, those who received the visual warning had a shorter reaction time. Tactile warnings, however, did not have much of an advantage over auditory warnings when no conversation was occurring, and during simple and complex conversation, auditory warnings were only slightly better than no warning at all [8]. From this study, it was determined that visual, audible, and tactile warnings are all viable warning methods. Although tactile warnings produce low reaction times, the difficulty of installing a tactile warning system offsets its effectiveness.
Therefore, it was determined that both visual and audible warnings are sufficient for the prototype system.
Marketing Requirements
1. The system should be able to detect vehicles in the blind spots.
2. The system should be able to alert the driver of potential collisions with all sides of the vehicle.
3. The system should have a screen displaying important information.
4. The system should not distract the driver.
5. The system should be able to work with any vehicle.
6. The system should be easy to use.

Table 1: Marketing and Engineering Requirements for System

Marketing Requirements | Engineering Requirements | Justification
1,2,3 | The driver will be alerted to objects in the area surrounding the car within 5 meters. | With the top-down view of the car on the screen, the driver will know when cars are in areas that cannot be seen.
5,6 | A small LCD screen will provide visual guidance, with speakers emitting a high-frequency beep to gain the driver's attention. | The screen will feature easy-to-understand information and alerts.
1,2,5 | The number of collisions will be reduced. | Collisions will be reduced because the driver will be alerted to dangers from other cars, as well as prevented from causing accidents.

Realistic Constraints
Performance: The visual and audible warning system latency should be less than 1 second. The system should produce very few false warnings so as not to distract the driver.
Functionality: The system should alert the driver under any conditions, especially ones in which the driver is distracted.
The user should not be distracted by the system under normal driving conditions. All processing should be done on one processor.
Economic: System development costs should be less than $10,000.
Energy: The system should be able to run off a 12 volt car battery. The system should not have a significant impact on fuel economy.
Health and Safety: The system should not expose drivers to any harmful amounts of electromagnetic radiation. There should be no risk of electric shock to any user of the system.
Legal: The design should adhere to all U.S. vehicle laws. The system should meet all FCC guidelines for wireless communication.
Maintainability: The system should be modular so that individual parts can be replaced if they break down.
Operational: The system should be able to operate under all typical driving conditions (sun, rain, wind, light snow, etc.). All modules and connections outside of the cab of the vehicle should be weatherproof. The system should be able to operate in a temperature range of -40 °C to 50 °C.
Availability: The system will be operational whenever the vehicle is driven in standard driving conditions.
Social: The system will be intuitive to the driver and will easily alert anyone familiar with vehicle operation.
Design Solution
Developing a design solution for this system was particularly tricky because the collision avoidance system essentially consists of three modular subsystems that must all work flawlessly together: detection, processing, and warning. The detection subsystem consists of the sensors that detect and gather data about surrounding objects. The processing subsystem reads in all of the sensors' data and runs it through an algorithm to determine if a warning should be issued to the driver. The warning subsystem waits for warning signals from the processing subsystem and outputs the appropriate warnings visually and/or audibly. The final solution to the vehicle collision avoidance system was derived after careful consideration of possible alternatives. Before commenting on the actual details of the final solution, it is important to discuss the initial solutions that were brainstormed and reveal why the components of the final design were chosen over these original ideas. The main decision involved determining what top-level system would be best to implement. Concerning the specific subsystems, the types of sensors and the single-board computer to use also required research. To analyze each decision factor, decision matrices were constructed.

Table 2: Top Level System Decision Matrix

Passive detection systems are becoming more and more popular due to the amount of information one can gather from video. However, the algorithms for detecting objects at high speeds are still very complicated and unreliable. This, compounded with the fact that cameras are
very susceptible to weather conditions, means that active detection remains the dominant approach among collision avoidance products.
Verdict: Active Detection

Blind Spot Detection Sensors
Table 3: Blind Spot Detection Sensors Decision Matrix

Radar is a suitable choice because of its commonality in automotive applications and great accuracy, but ultrasonic is inexpensive and has the ability to function in all environments experienced on the road, such as rain, fog, and dust. The detection range of ultrasonic sensors is limited; however, the necessary range for the proposed system is equal to or slightly greater than the width of a road lane. While cameras are very accurate and have a long detection range, they scored very poorly because they are expensive and would be difficult to implement given the complex computer vision algorithms needed.
Verdict: Ultrasonic Sensors
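Ultrasonic ranging works by time of flight: the sensor emits a pulse and times the returning echo, so the round-trip travel time sets both the measurement latency and the maximum practical pulse rate. The back-of-the-envelope sketch below is plain physics rather than the project's code (343 m/s assumes dry air at about 20 °C) and shows why lane-width coverage is comfortable for an ultrasonic sensor:

```python
SPEED_OF_SOUND = 343.0  # m/s in dry air at roughly 20 degrees C

def echo_time(distance_m):
    """Round-trip time for an ultrasonic pulse to reach a target and return."""
    return 2.0 * distance_m / SPEED_OF_SOUND

def distance_from_echo(echo_s):
    """Target distance implied by a measured round-trip echo time."""
    return SPEED_OF_SOUND * echo_s / 2.0

# A target one lane-width away (~3.7 m) echoes back in about 22 ms, so a
# sensor covering a single lane can pulse several times per second with
# plenty of timing margin.
lane_echo_s = echo_time(3.7)
```

The same arithmetic also shows why ultrasonic sensing does not scale to the frontal, highway-speed case: a 100 m target would tie up the sensor for over half a second per reading, which is why radar handles the front zone.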
Central Processing
Table 4: Central Processing Decision Matrix

The Raspberry Pi is the favorite single-board computer of many electronics enthusiasts due to its low price, versatility, and tremendous community support, but the BeagleBone Black's advantages are very alluring. With the BeagleBone Black, an increased CPU speed provides faster operation in a system that requires timely response. Its plethora of connectivity options makes it perfect for interfacing with the subsystems, whereas the Raspberry Pi has a more limited external interface. The PCDuino3, another option, features a fast processor with large RAM; however, its poor support community and high price knock it out of the running. Although the Raspberry Pi community is more vibrant than that of the BeagleBone Black, the BeagleBone Black was determined to be a good fit for an embedded project that requires high-speed processing and expansive external interfacing. However, because the Raspberry Pi is better at handling graphics, it was chosen to power the dashboard vehicle map display.
Verdict: BeagleBone Black for processing; Raspberry Pi for GUI
Coding
Although not a hardware decision, determining how to implement the multiple programs necessary for the system garnered much attention. As stated previously, the entire system is made up of three major subsystems: detection, processing, and warning. The detection subsystem, consisting of the radar and ultrasonic sensors, requires no programming, as the sensors continuously monitor their environment and return their data. The bulk of the coding resides in the processing module, which can be considered the BeagleBone Black for all intents and purposes. The programs in the processing module handle reading in sensor and radar data, processing the data through the collision detection algorithm, and outputting warning signals to the warning subsystems. Initial thoughts were to write all of the programs in a low-level language such as C because of its execution speed and quick access to the kernel for reading general-purpose input and output (GPIO). However, because C does not support object-oriented programming (OOP) features, it was scrapped, as OOP dramatically reduces the complexity of programming the system. After further research and contemplation, Python was agreed upon as the programming language for development. Python is a dynamically typed language, which makes it simpler, leading to faster development and less mental overhead. Although it is a scripting language, it still supports OOP features and provides an interactive interpreter for rapid testing and exploration. Another major advantage is that Python code is extremely readable, and a team member had a great deal of experience with it. To keep consistency with the rest of the system, the visual warning subsystem was programmed in Python. However, the audible warning subsystem was programmed in C because that subsystem utilizes an Arduino microcontroller, which makes it easy to output the pulse-width-modulated signals required for the buzzers.
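As an illustration of why OOP suits this architecture, the three subsystems map naturally onto objects: detection sources produce (zone, distance) readings, a processing step turns those readings into per-zone warning decisions, and the warning subsystems consume the decisions. The following minimal Python sketch is hypothetical, not the project's actual code; the class names are invented, though the 0.75 m and 5 m distances echo figures used elsewhere in this report.

```python
class DetectionSource:
    """Base class for anything that yields (zone, distance) readings."""
    def read(self):
        raise NotImplementedError

class StubSensor(DetectionSource):
    """Stand-in detector that reports a fixed distance, for illustration."""
    def __init__(self, zone, distance_m):
        self.zone = zone
        self.distance_m = distance_m

    def read(self):
        return self.zone, self.distance_m

def decide_warnings(readings, basic_m=5.0, imminent_m=0.75):
    """Map (zone, distance) readings to 'basic' or 'imminent' warnings per zone."""
    warnings = {}
    for zone, dist in readings:
        if dist <= imminent_m:
            warnings[zone] = "imminent"   # collision unless action is taken
        elif dist <= basic_m and zone not in warnings:
            warnings[zone] = "basic"      # object present, stay aware
    return warnings

# One processing cycle: poll every source, then hand the decisions to the
# visual and audible warning subsystems.
sources = [StubSensor("left", 0.5), StubSensor("right", 3.0), StubSensor("front", 40.0)]
decisions = decide_warnings(src.read() for src in sources)
```

Because each detector subclasses the same base class, a radar track and an ultrasonic reading flow through the same decision code, which is the modularity the team was after.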
Final Solution
Now that it is clear why the final solution was chosen over the alternatives, the end design can be explored in detail. In overview, the system features a network of six ultrasonic sensors to monitor the blind spots of the vehicle and a long-range radar to detect frontal collisions. This means that there are three coverage zones surrounding the car being monitored, which will be referred to as right, left, and front. Figure 1 below shows the approximate placement of the sensors and radar, as well as their coverage areas.

Figure 1: Placement of Ultrasonic Sensors and Radar and their Coverage Areas

The data from the sensors and radar is sent to a BeagleBone Black, which processes the data and determines if a warning should be issued to the driver. Two types of warnings can be issued for each of the coverage zones: basic and imminent. Basic warnings simply alert the driver that an object detected by the sensors or radar is in a potentially hazardous location. Imminent warnings are much more serious and communicate that a collision is going to occur unless immediate action is taken. If the algorithm on the BeagleBone Black determines that warnings should be given, appropriate signals are sent to the visual warning system and the audible warning system. The visual warning system is a top-down map of the vehicle displayed on an LCD screen situated on the dashboard. It shows both basic and imminent warnings for all three coverage zones. A driver can occasionally glance at the display to become more aware of
their surroundings, such as when changing lanes. The audible warning system features two buzzers in the driver's headrest, one behind each ear. The buzzers beep only to issue imminent warnings. The left buzzer corresponds to the left coverage zone, while the right corresponds to the right coverage zone. Both buzzers beeping represents a front zone imminent warning. Functional block diagrams were created for the vehicle collision avoidance system to provide a greater understanding of how the system is pieced together. The Level 0 diagram below in Figure 2 is the top-level diagram, simply detailing the vehicle collision avoidance system's inputs and outputs. The Level 1 diagram in Figure 3 further describes the subsystems contained within the vehicle collision avoidance system.

Level 0
Figure 2: Level 0 Block Diagram Vehicle Collision Avoidance System

Table 5: Functional Requirements – Vehicle Collision Avoidance System
Module: Vehicle Collision Avoidance System
Inputs: Ultrasonic waves: vehicle detection for the vehicle's sides and rear; Radar waves: vehicle detection for the vehicle's front; Power: 12 volts DC
Outputs: Audible warning: 1.5 or 2 kHz beep emitted from two piezo buzzers (right and left), corresponding to the respective side of the host vehicle; Digital vehicle map: rendered overhead view of the host vehicle's surrounding area based upon sensor data
Functionality: Detect the presence of vehicles in the area surrounding the host vehicle. Warn the driver of their presence with audible and visual warnings.
Level 1
Figure 3: Level 1 Block Diagram Vehicle Collision Avoidance System

Table 6: Functional Requirements – Voltage Converter
Module: Voltage Converter
Inputs: 12 V DC
Outputs: 5 V DC
Functionality: Down-converts the 12 V input from the car battery to 5 V in order to power the BeagleBone Black and vehicle map display.

Table 7: Functional Requirements – Front Radar Sensor
Module: Front Radar Sensor
Inputs: Radar waves: reflected waves; Power: 12 volts DC
Outputs: CAN bus data: 64 bytes every 50 ms
Functionality: Emits a Simultaneous Transmit-Receive Pulse Doppler (STAR PD) signal that determines the distance and relative velocity of any objects in range. Tracks up to 64 objects within a view of 174 m at ±10° and 60 m at ±45°. Sends data over the CAN bus to the processor.

Table 8: Functional Requirements – Side & Rear Ultrasonic Sensors
Module: Side & Rear Ultrasonic Sensors
Inputs: Ultrasonic waves: reflected waves; Power: 12 volts DC
Outputs: Analog signal: 0-3.3 volt DC signal
Functionality: Sends an ultrasonic signal and receives the reflected ultrasonic wave, then processes the signal, determines if an object is within the detection range, and outputs the information to the central processor.
Table 9: Functional Requirements – BeagleBone Black (Processor)
Module: BeagleBone Black
Inputs: Digital signal: front radar sensor information; Digital signal: side and rear ultrasonic sensor information; Power: 5 volts DC
Outputs: Video signal: 800x480 @ 24 Hz through micro-HDMI; PWM'd digital signal (R): 3.3 volt DC digital signal with PWM for right piezo buzzer; PWM'd digital signal (L): 3.3 volt DC digital signal with PWM for left piezo buzzer
Functionality: Receives input from the sensors and processes it, determining if vehicles are present and where they are. Outputs appropriate signals to the vehicle map display and audible warning system.

Table 10: Functional Requirements – Vehicle Map Display
Module: Vehicle Map Display (Visual Collision Warning System)
Inputs: Video signal: 800x480 @ 24 Hz through HDMI; User settings control: buttons for adjusting screen brightness, color, and contrast; Power: 5 volts DC, 500 mA
Outputs: Digital vehicle map: 800x480 resolution image
Functionality: Displays a map of the rendered overhead view of the host vehicle's surrounding area based upon sensor data. Changes the color of an area if a vehicle is detected there. Displays imminent warnings if a potential collision is detected.

Table 11: Functional Requirements – Audible Collision Warning System
Module: Audible Collision Warning System
Inputs: PWM'd digital signal (Right): 3.3 volt DC digital signal with PWM for right piezo buzzer; PWM'd digital signal (Left): 3.3 volt DC digital signal with PWM for left piezo buzzer
Outputs: Audible warning: 1.5 or 2 kHz beep emitted from two piezo buzzers (right and left), corresponding to the respective side of the host vehicle
Functionality: Audibly alerts the driver of vehicle detection by having the right piezo buzzer beep if a vehicle is detected along the right side of the host vehicle, and the left piezo buzzer beep if a vehicle is detected along the left side. Both will beep for a forward collision warning.
Design Integration
Ultrasonic Sensors
In order to integrate the ultrasonic sensors with the rest of the system, some voltage conversion had to be done, as well as extensive wiring. First, the ultrasonic sensors only accept voltages between 2.7 and 5.5 volts, so the 12-volt car battery supply needs to be stepped down: a voltage divider was built to take the 12 volts from the car battery and supply the sensors with 3 volts. The sensors also have power filter kits to ensure the power received is clean and stable. Each ultrasonic sensor then makes a distance reading and outputs an analog voltage based on how far away an object is from the sensor: the farther the object, the greater the voltage. All six sensors are wired together to pulse at the same time in order to reduce interference between the sensors. Once a sensor detects an object and registers a voltage for it, the voltage is output to the processor, which computes how far away the object is and whether the driver needs to be warned. However, the voltages must first be stepped down, because the analog pins on the BeagleBone Black can only accept voltages up to 1.8 volts and the sensors can output more than that. After the voltage division, the BeagleBone Black reads the voltage and calculates how far the object is from the sensor.

Radar
Integration of the Delphi ESR mostly involves understanding the data that it outputs. The radar can track up to 64 objects at once, gathering information such as each object's velocity, position, acceleration, and heading. The ESR communicates over the CAN protocol and packages its data heavily before sending it, in order to simplify the work needed by the user. The ESR completes the processing of the target data with a cycle time of 50 ms ± 5 ms. At the completion of this processing, the ESR transmits all its CAN messages in one group of 74 lines of 8 bytes each. The structure of this transmission can be seen in Table 12.
Table 12: Radar CAN Messages
IDs 4E0h through 4E3h
IDs 500h through 53Fh (Track1 through Track64)
ID 540h (Msg 1 through 10)
IDs 5E4h through 5E8h

The lines of interest are IDs 500h through 53Fh, denoted as Track1 through Track64, as those are the lines carrying the object tracking data.

Processing
The BeagleBone Black serves as the central processing unit for the system. Processing of the sensor and radar data is handled with Python scripts. Python modules (classes) were written for the ultrasonic sensors as well as the radar. Ultrasonic sensor data and radar data are handled concurrently by a single script that imports both modules. The ultrasonic sensor Python module allows for the creation of Sensor objects, one for each of the six ultrasonic sensors. A Sensor object stores data such as the distance of a detected object and the sensor range. The Sensor object also has methods that can be called, including read_adc(), which is used to read the ADC value from one of the BeagleBone's
ADC pin. This method is particularly valuable because, as mentioned previously, each ultrasonic sensor feeds its output analog voltage into one of the BeagleBone's ADC pins. The six ADC pins are read sequentially by the BeagleBone every 100 ms using read_adc(). The cycle time was chosen to be 100 ms because the ultrasonic sensors pulse at around 7 Hz. The analog voltage is converted to a distance by another Sensor method; the calculations behind this method can be seen in the Appendix. The distances to the nearest objects detected by the ultrasonic sensors are sent to the collision warning algorithm, which is discussed below. The BeagleBone also processes the radar data using a Radar Python module. Physically, the BeagleBone connects to the radar's CAN bus through an add-on cape that attaches to the top of the BeagleBone board. The Radar module sets up an interface to this CAN bus and captures important messages from the radar every 30 ms. A 30 ms cycle time was chosen because the radar dumps its data onto the CAN bus every 50 ms. The module captures only relevant messages, such as those that contain object tracking data (Table 12), by filtering out messages based upon their IDs. CAN message filtering and capturing is performed by the Linux package can-utils, which provides userspace utilities for the Linux SocketCAN subsystem. The script then extracts the necessary data from the captured messages and passes the data on to the collision detection algorithm. The collision detection algorithm is currently in its infancy due to the limited amount of testing that has been performed on the prototype system. However, the algorithm can be greatly improved with further testing, allowing it to be tweaked to account for more of the countless scenarios encountered while driving.
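The front-zone portion of that algorithm reduces to straightforward time-to-collision arithmetic: divide a tracked object's distance by its closing rate and compare the result against warning thresholds. The sketch below is an illustration rather than the project's actual script, though the three-second basic and two-second imminent thresholds mirror the prototype's values.

```python
def time_to_collision(distance_m, closing_rate_mps):
    """Seconds until impact if paths and speeds stay constant;
    None if the object is holding distance or pulling away."""
    if closing_rate_mps <= 0:
        return None
    return distance_m / closing_rate_mps

def front_zone_warning(distance_m, closing_rate_mps, basic_s=3.0, imminent_s=2.0):
    """Classify a radar track as None, 'basic', or 'imminent'."""
    ttc = time_to_collision(distance_m, closing_rate_mps)
    if ttc is None:
        return None
    if ttc <= imminent_s:
        return "imminent"
    if ttc <= basic_s:
        return "basic"
    return None

# Closing at 10 m/s on a car 25 m ahead leaves a 2.5 s time to collision,
# which falls between the two thresholds and yields a basic warning.
warning = front_zone_warning(25.0, 10.0)
```

Guarding against a non-positive closing rate matters in practice: a car pulling away would otherwise produce a negative "time to collision" and could trip a spurious warning.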
Currently, the algorithm determines that a basic warning for the left and right coverage zones should be output when an ultrasonic sensor detects an object in its view. Because the range of the sensors approximately covers the width of the adjacent traffic lanes, any object detected by the sensors can be considered a potential collision hazard. Additionally, for a basic warning to be issued, the sensors must detect the object in their view for more than 0.5 seconds, so that objects that show up as quick blips (for example, passing telephone poles) do not trigger basic warnings. Imminent warnings for the left and right coverage zones are issued when a detected object crosses a hard-coded threshold distance, currently 0.75 m. The algorithm determines front coverage zone warnings based upon radar data. An object's velocity and distance are used to calculate a time to collision: the amount of time before a collision would occur if the vehicles' paths and velocities stayed constant. The time to collision is calculated by dividing each tracked object's distance in meters by its relative closing rate in meters per second. Before the time to collision becomes small enough that the driver will not have time to react, the BeagleBone issues warning signals to the warning systems. Right now, the threshold time to collision is set at three seconds for basic warnings and two seconds for imminent warnings. These values are on the conservative side to maintain safety while testing the system, and they pertain only to potential collisions detected by the front radar.

Audible Warning System
An Arduino Uno handles the audible warnings. Three input pins are connected to the BeagleBone and are fed with digital values for imminent collisions on the front and both sides.
Two output pins are connected to piezoelectric buzzers. If an input pin goes high, then the corresponding output pin is sent a square wave of a set frequency with a 50% duty cycle. The buzzers are mounted to the driver's headrest, with the left buzzer corresponding to the left warning and the right buzzer to the right warning. If a side warning is detected, a 1.5 kHz tone is sent to the corresponding buzzer. If the front warning pin goes high, then a 2 kHz tone is sent to both buzzers simultaneously.

Visual Warning System

A Raspberry Pi powers the 7" LCD screen that serves as the visual warning system. The Pi is connected to the dashboard-mounted display through HDMI and runs a Python script that renders a top-down view of the vehicle and its three coverage zones. The Python script takes advantage of the Qt framework to update the display based upon warning signals arriving at the Pi from the BeagleBone. The screen indicates basic warnings with a red highlighted zone and imminent warnings with a yellow warning symbol. Figure 4 below shows the visual warning system with all warnings displayed.

Figure 4: Visual Collision Warning System with All Warnings Displayed

Design Verification

Ultrasonic Sensors

The ultrasonic sensor testing was done in the lab by placing an object in front of a sensor at various distances and reading the output voltage to ensure the reported distance was correct. The distance from the sensor to the object was verified with a tape measure and compared to the value from the ultrasonic sensor. Once all the sensors were successfully tested on a small scale in the lab, the sensor network was taken into the hall to detect distances more similar to those seen in a roadway environment. All the distances determined by the sensors were found to be accurate to within three inches of the tape measure distance.
Table 13: Ultrasonic Sensor Voltage Comparison

                    5 m Sensor                                      10 m Sensor
Analog       Calculated    Measured      Analog       Calculated    Measured
Voltage (V)  Distance (m)  Distance (m)  Voltage (V)  Distance (m)  Distance (m)
0.129        0.25          0.25          0.064        0.25          0.25
0.259        0.5           0.5           0.132        0.51          0.5
0.528        1.02          1             0.256        0.99          1
1.029        1.99          2             0.523        2.02          2
2.593        5.01          5             1.291        4.99          5
                                         2.59         10.01         10

Radar

The ESR was tested using proprietary software provided by Delphi called DataView. Initial testing was done in the lab to understand the initialization of the radar and to make sure it could track objects. Due to the long-range nature of the radar, actual data testing needed to be done outside, where the environment was less cluttered. Stationary testing was performed first, setting up the radar and moving a car around within its range. DataView values were confirmed against actual measured values. For example, it was confirmed that the radar was tracking an object's velocity properly because the velocity DataView showed agreed with the speedometer of the car being driven. After this, the sensor was attached to the front of a vehicle and driven around town while a passenger monitored the tracking software to verify its accuracy with vehicles in range.

Processing

BeagleBone Black operation was verified through initial unit testing of the Python scripts. This was mostly done through the use of the Python interpreter, which allows for quick testing and validation of methods. To verify that the script that reads in the ultrasonic sensor data was working properly, each sensor was wired to the BeagleBone Black and positioned a specified distance from a wall. Using values from the voltage comparison test above (shown in Table 13), the sensor's distance from the wall was progressively increased until its maximum range was reached. For each distance, the script output was verified to have voltage and distance readings equivalent to those in Table 13. This testing was performed outside as well, with an actual car being detected instead of a wall.
To validate that the BeagleBone could read in all six of the sensors at one time, two cars were placed at varying positions along the sides and in the blind spot of the sensor network. The script output was validated against the cars' positions. Initial testing of the processing of the radar output by the BeagleBone Black also took place inside the lab. In the beginning, focus was placed on validating the CAN frames on the bus. Extensive testing was performed to make sure that CAN messages were being filtered based upon their IDs. Once this was established, considerable effort was put into making sure that objects being tracked by the radar, and their corresponding CAN messages, were being correctly read and converted to values by the Python script. To do so, the DataView software was run at the same time as the Python radar script. By comparing the DataView software values
with the script output, it was easy to tell whether the script was reading and converting the bits to the correct values. For example, if the DataView software showed an object (say, a chair in the lab) at 10 meters away, the script was confirmed to show that same object at 10 meters. This same approach was taken for every tracked object attribute (position, velocity, acceleration, etc.) that the script reads. Testing then moved outdoors into a parking lot, where a car was driven around while being tracked by the radar. Script output was once again validated against DataView values. To more closely simulate traffic, two more cars were added to the parking lot and tracked by the radar, with the script output validated for all three car objects. Unfortunately, three ended up being the maximum number of tracked vehicles that could be tested, because additional vehicles could not be borrowed at the time. As a whole, these tests confirmed that the radar output was being properly read from the CAN bus by the BeagleBone Black script. The next testing phase involved validating the collision warning algorithm. This algorithm takes the sensor and radar data and determines what warnings, if any, should be outputted. To validate the algorithm with ultrasonic sensor input, the first tests involved placing stationary objects at specified distances from the sensors, because the algorithm currently determines whether right and left warnings should be output based on a hard-coded distance for the ultrasonic sensors. It was verified that a warning was outputted when the objects were within the hard-coded distance; each warning was validated by reading the corresponding BeagleBone Black warning output pin voltage with a multimeter. Validating the collision warning algorithm with radar input was a bit more complicated. As described in the design integration section, the algorithm determines if a warning should be outputted based upon the tracked object's position and velocity.
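The front-zone decision rule just described can be sketched as a small Python function. The function name and the sign convention for the relative rate (taken here as positive when the gap is closing) are our assumptions; the three-second and two-second thresholds come from the design description above.

```python
BASIC_TTC_S = 3.0     # conservative thresholds used while testing the prototype
IMMINENT_TTC_S = 2.0

def front_warning(range_m, closing_rate_mps):
    """Classify a radar-tracked object by its time to collision.

    closing_rate_mps is the relative rate in m/s, positive when the gap
    is shrinking. A non-closing object cannot collide, so it is safe.
    Returns "imminent", "basic", or None.
    """
    if closing_rate_mps <= 0:
        return None
    # Seconds until contact if both paths and velocities stay constant.
    ttc = range_m / closing_rate_mps
    if ttc < IMMINENT_TTC_S:
        return "imminent"
    if ttc < BASIC_TTC_S:
        return "basic"
    return None
```

For example, an object 12 m ahead closing at 5 m/s has a time to collision of 2.4 s, which falls between the two thresholds and would raise a basic warning.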
Testing involved moving the vehicle with the radar attached at a constant speed relative to a leading vehicle. The system was monitored, and as soon as a warning was emitted the brakes were applied. If the driver of the leading vehicle ever became uncomfortable with the approach of the test vehicle, they accelerated, ending the test. Approach speeds ranged from one mile per hour up to ten miles per hour in one mile per hour increments.

Visual Warning System

Testing the visual warning system was done by sending digital high (3.3 V) and low signals to the Raspberry Pi GPIO pins and verifying that the correct warning was displayed on the screen. Initially, each of the six input pins was tested individually. A basic test circuit was created with a push button that, when pressed, closed the loop and sent a digital high signal to the connected pin. The button was pressed at varying intervals in order to monitor any latency in warnings being displayed. When the button was pressed rapidly, there was noticeable lag in the updating of the display. This delay escalated when multiple pins were receiving signals at once; the more pins receiving signals, the greater the lag. Using the Python time module to measure the latency of the display showed a maximum latency of around 200 ms, which is considerable when moving at high speeds.

Audible Warning System

The audible warning system was verified in a similar fashion to the visual warning system. Each of the two pins was fed digital high signals (5 V) at varying intervals using the
same test circuit from the visual warning system test. Beeps at 2 kHz were emitted with no noticeable lag, unlike the visual display.

Final Testing

After each subsystem finished testing, all of the subsystems were put onto one vehicle and integrated into one complete system. The vehicle was then taken on the road to see how it responded in typical traffic situations. No specific tests were run at this stage, as targeted testing had been done in previous stages. Instead, the focus was on making sure the system ran as expected with all subsystems in place. Without putting the vehicle in any precarious situations, the system ran fine, with one minor issue of the Raspberry Pi crashing. Further complete-system testing would need to be performed in a controlled environment in order to test the system in more dangerous circumstances.

Cost

Table 14: Project Cost

Item                    Cost        Quantity                           Total
BeagleBone Black        $55.00      1                                  $55.00
Ultrasonic Sensors      $110.00     6                                  $660.00
Delphi ESR              $3,600.00   1                                  $3,600.00
Raspberry Pi            $40.00      1                                  $40.00
CAN Cape for BBB        $39.95      1                                  $39.95
Power Supply for BBB    $7.95       1                                  $7.95
Arduino                 $24.95      1                                  $24.95
Piezo Elements          $0.95       2                                  $1.90
7" Display              $59.95      1                                  $59.95
Power Supply Filters    $2.25       6                                  $13.50
Mounts                  $3.00       6                                  $18.00
Wires                   $3.00       1                                  $3.00
Component Cost                                                         $4,524.20
Labor (Fall Semester)   $30.00/hr   8 hr/week * 15 weeks * 3 people    $10,800
Labor (Spring Semester) $30.00/hr   10 hr/week * 15 weeks * 3 people   $13,500
Break-Even Analysis

Table 15: Break-Even Analysis

Fixed Costs (estimate)                          Total
Factory/Labor                                   $10,000,000

Variable Costs          Cost        Quantity    Total
Mounts                  $3.00       6           $18.00
Wires                   $3.00       1           $3.00
BeagleBone Black        $55.00      1           $55.00
Ultrasonic Sensors      $110.00     6           $660.00
Delphi ESR              $3,600.00   1           $3,600.00
7" Display              $59.95      1           $59.95
Raspberry Pi            $40.00      1           $40.00
CAN Cape for BBB        $39.95      1           $39.95
Power Supply for BBB    $7.95       1           $7.95
Arduino                 $24.95      1           $24.95
Piezo Elements          $0.95       2           $1.90
Power Supply Filters    $2.25       6           $13.50
Total Variable Costs                            $4,524.20

Unit Price                                      $5,000.00
Profit per Unit                                 $475.80

5000x = 10,000,000 + 4524.20x
x = 21,018 units to break even
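The break-even arithmetic above can be reproduced in a few lines of Python, rounding up because a fractional unit cannot be sold:

```python
import math

FIXED_COST = 10_000_000.00    # estimated factory/labor cost (Table 15)
VARIABLE_COST = 4_524.20      # component cost per unit (Table 15)
UNIT_PRICE = 5_000.00

# Solving 5000x = 10,000,000 + 4524.20x for x:
profit_per_unit = UNIT_PRICE - VARIABLE_COST                 # $475.80
break_even_units = math.ceil(FIXED_COST / profit_per_unit)   # 21,018 units
```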
Time Schedule

The time schedule for this project can be found in the Appendix.

Conclusion

In 2012, 25,510 people died in vehicular collisions at speeds of over 30 mph [10]. To help reduce these roadway fatalities, a vehicle collision avoidance system was created. The system utilizes ultrasonic sensors to monitor blind spots, as well as radar to provide warning of potential forward collisions. All of the sensor data is processed by a BeagleBone Black and outputted to a dashboard screen showing a top-down digital map of the area surrounding the vehicle, including potential threats. Two speakers and flashing alerts on the screen provide sufficient warning to the driver of impending collisions or roadway hazards. Although many of today's popular car manufacturers offer some type of vehicle collision warning system, these systems are expensive and often available only in higher-end cars. Additionally, these systems only come prepackaged on newer vehicles. Our system, on the other hand, is implementable on a variety of standard cars, no matter how old. While the system does not perform any autonomous driving action to avoid collisions, it warns the driver through both audible and visual warnings with adequate time to mitigate accidents. Work on this project could be carried into the future through further refinement of the collision detection algorithm. More features could also be incorporated into the current system, such as V2V communication, lane departure warning, and autonomous control. The speed of the system could be improved by using more computational resources, such as faster processors, to offload the heavy work from the BeagleBone. Rewriting the current code in a lower-level language like C may also increase system speed because of lower overhead and faster access to input/output.
It is the opinion of the group that this project was successful in providing a proof of concept for the design idea, as well as establishing a solid base for future projects to build upon.
References

[1] Abou-Jaoude, Ramzi. "ACC Radar Sensor Technology, Test Requirements, and Test Solutions." IEEE Transactions on Intelligent Transportation Systems 4.3 (2003): 115-122. IEEE Xplore. Web. 3 Oct. 2014.

[2] Alland, Stephen W., and James F. Searcy. Radar System and Method of Digital Beamforming. Delphi Technologies, Inc., assignee. Patent US7639171 B2. 29 Dec. 2009. Web. 3 Oct. 2014.

[3] Alsbou, N., Prigent, S., and Refai, H. H. "Analysis of Priority R-ALOHA (PR-ALOHA) Protocol." Wireless Communications and Mobile Computing (6 May 2013). Web. 2 Oct. 2014.

[4] Bin, Hu, and Hamid Gharavi. "A Joint Vehicle-Vehicle/Vehicle-Roadside Communication Protocol for Highway Traffic Safety." International Journal of Vehicular Technology 2011 (2011): 1-10. Computers & Applied Sciences Complete. Web. 3 Oct. 2014.

[5] Bing-Fei, Wu, et al. "A Vision-Based Collision Warning System by Surrounding Vehicles Detection." KSII Transactions on Internet & Information Systems 6.4 (2012): 1203-1222. Computers & Applied Sciences Complete. Web. 3 Oct. 2014.

[6] Huihuan, Qian, et al. "Vehicle Safety Enhancement System: Sensing and Communication." International Journal of Distributed Sensor Networks (2013): 1-19. Computers & Applied Sciences Complete. Web. 3 Oct. 2014.

[7] Luo, Tang-Nian, Chi-Hung Evelyn Wu, and Yi-Jan Emery Chen. "A 77-GHz CMOS Automotive Radar Transceiver with Anti-Interference Function." IEEE Transactions on Circuits & Systems. Part I: Regular Papers 60.12 (2013): 3247-3255. Computers & Applied Sciences Complete. Web. 3 Oct. 2014.

[8] Mohebbi, Rayka, Rob Gray, and Hong Z. Tan. "Driver Reaction Time to Tactile and Auditory Rear-End Collision Warnings While Talking on a Cell Phone." Human Factors 51.1 (2009): 102-110. Computers & Applied Sciences Complete. Web. 2 Oct. 2014.

[9] National Highway Traffic Safety Administration. 2012 Quick Facts. (2014): http://www-nrd.nhtsa.dot.gov/. Web. 15 Sept. 2014.

[10] National Highway Traffic Safety Administration. Fatality Analysis Reporting System (FARS) Encyclopedia. (2014): http://www-fars.nhtsa.dot.gov/. Web. 14 Sept. 2014.
[11] Scott, J. J., and Robert Gray. "A Comparison of Tactile, Visual, and Auditory Warnings for Rear-End Collision Prevention in Simulated Driving." Human Factors 50.2 (2008): 265-275. Computers & Applied Sciences Complete. Web. 2 Oct. 2014.

[12] Seada, Karim. "Insights from a Freeway Car-to-Car Real-World Experiment." MobiCom: International Conference on Mobile Computing & Networking (2008): 49-55. Computers & Applied Sciences Complete. Web. 3 Oct. 2014.
Appendix

Gantt Chart
Analog Voltage Calculations

5 m Sensor
V_CC = 2.65 V                                   (voltage going into the sensor)
V_5mm = V_CC / 1024 = 0.00258789 V              (output voltage per five millimeters of range)
R_mm = (V_m / V_5mm) * 5 = 1932.0754 * V_m      (distance in millimeters per measured voltage V_m)

10 m Sensor
V_CC = 2.65 V                                   (voltage going into the sensor)
V_10mm = V_CC / 1024 = 0.00258789 V             (output voltage per ten millimeters of range)
R_mm = (V_m / V_10mm) * 10 = 3864.1509 * V_m    (distance in millimeters per measured voltage V_m)
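The constants above can be double-checked with a short Python snippet; the variable names are ours:

```python
VCC = 2.65                         # supply voltage into each sensor (V)
ADC_STEPS = 1024                   # 10-bit ADC resolution

volts_per_step = VCC / ADC_STEPS   # ~0.00258789 V per ADC step
scale_5m = 5.0 / volts_per_step    # ~1932.08 mm of range per volt (5 m sensor)
scale_10m = 10.0 / volts_per_step  # ~3864.15 mm of range per volt (10 m sensor)

def distance_mm(v_measured, scale):
    """Distance in millimeters implied by a measured output voltage."""
    return v_measured * scale
```

For example, the 0.528 V reading from Table 13 gives distance_mm(0.528, scale_5m) of roughly 1020 mm, matching the table's 1.02 m calculated distance.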