A Method for Image Processing and Distance Measuring Based on Laser Distance Triangulation



Saulo Vinicius Ferreira Barreto, University of Pernambuco (UPE), saulo.barreto@gmail.com
Remy Eskinazi Sant'Anna, University of Pernambuco (UPE)/IFPE/Unibratec, remy.eskinazi@gmail.com
Marcílio Feitosa, University of Pernambuco (UPE)/Unibratec, marciliofeitosa@uol.com.br

Abstract — Distance measuring using image processing is a relatively simple task on personal computers such as the x86 architecture. For embedded systems with less processing power, however, the task becomes more difficult. A laser triangulation distance measurement technique running on an embedded microcontrolled system is attractive because it enables new applications, such as autonomy and mobility aids for blind people through image processing devices. In this work, laser triangulation distance measurement is implemented with a laser and a CMOS camera for an electronic virtual white cane. For this application to be considered viable, the frames-per-second (fps) processing rate must be properly measured.

Keywords — white cane; assistive technology; accessibility; embedded system; image processing.

I. INTRODUCTION

The demand for assistive technologies for persons with various types of disabilities has always existed, and advances in technology have made it possible to develop devices capable of meeting these demands, albeit in a rudimentary way. In this scenario, one of the most difficult disabilities to compensate for through technology is the loss of vision, mainly because of the physical complexity of the visual process. Despite this difficulty, several attempts have been made to develop devices capable of giving visually impaired people information about the obstacles around them. Most new devices for this purpose are based on ultrasound technology.
Although these ultrasound-based devices work satisfactorily, they still present intrinsic problems such as ultrasound reflection and interference from the environment. Within this context, this work seeks new possibilities for electronic canes, evaluating the proposed technology in terms of effectiveness and reliability. Embedded image processing is already used in several industrial applications on computing platforms, and with the portable cameras widely available today it is possible to develop assistive technologies based on image processing for embedded systems with highly reliable metering.

The main purpose of this study is to evaluate the performance of image processing in an embedded system based on the LPC1768 microcontroller working together with a TCM8230MD CMOS camera and a laser, in order to detect and measure distances so that this technique can contribute to implementing equipment such as an electronic cane. Related work on distance measurement and image processing is presented by Yuan and Manduchi [1], who use a camera and a laser line instead of the punctiform laser used in this work. Shumaker [2] describes a way to optimize image buffer handling on an ARM7 architecture using the TCM8230MD. The initial goal of this work is for the microcontroller to process images at a rate of 15 frames per second (fps), and later at 30 fps, so that the technique can be applied to an embedded system that detects distances with a laser for an electronic cane. The motivation is to try a new approach, in contrast to existing electronic cane studies, which mostly use ultrasound sensors to measure distances, thereby extending the possibilities of this type of device and identifying possible improvements to its functionality.

II. PROPOSED ARCHITECTURE

The electronic cane is composed of four main elements: a TCM8230MD CMOS image sensor [3] as the input element; a 650 nm red laser with punctiform emission; an LPC1768 ARM Cortex-M3 microcontroller [4] that processes the images received from the camera; and a vibration motor. As shown in fig. 1, the system also includes a battery, a buzzer speaker, and buttons that improve its operation. The microcontroller communicates with the camera through an SPI bus to send commands and through an 8-bit-wide parallel bus to receive image data [3]. Microcontroller pins configured as GPIOs are used to read the buttons and to turn the laser on. The analog-to-digital converter detects the battery level and, through the buzzer speaker, informs the user that the device needs to be recharged. The buzzer is also used to indicate problems in detecting the laser spot. The PWM module controls the vibration motor by varying its current.

The laser and camera work together to detect the distance between the cane, held in the user's hand, and the obstacle in front of the user. The micro motor emits mechanical vibrations whose intensity varies according to the distance calculated by image processing. The distance calculation is based on the laser's position in the image captured by the camera.
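The distance-to-vibration mapping can be sketched as follows. This is an illustrative assumption, not the paper's actual firmware: the paper states only that intensity varies with distance and, later, that the motor needs a PWM duty cycle above 30% to overcome its inertia. Function and parameter names are hypothetical; the range limits match the d_a and d_b values reported in the results.

```python
def vibration_duty(distance_m, d_min=0.033, d_max=2.22, duty_floor=0.30):
    """Map a measured obstacle distance (meters) to a PWM duty cycle in [0, 1].

    Closer obstacles produce stronger vibration. Because the motor cannot
    overcome its own inertia below duty_floor, any nonzero vibration is
    clamped to [duty_floor, 1.0]; no detection turns the motor off.
    """
    if distance_m is None or distance_m > d_max:
        return 0.0  # no obstacle detected within range: motor off
    # Normalize: 1.0 at the closest detectable distance, 0.0 at d_max.
    strength = (d_max - max(distance_m, d_min)) / (d_max - d_min)
    return duty_floor + (1.0 - duty_floor) * strength
```

A linear map is the simplest choice; given the later observation that small distance changes are hard to perceive, a stepped or logarithmic map might work better in practice.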

Fig. 1. Device system architecture.

III. DESIGN FLOW

The design flow presented in fig. 2 details the development steps, which are described ahead: the method for distance detection using a laser, hardware assembly, image processing algorithm validation, and system validation tests.

Fig. 2. Project design flow.

A. Laser detection in the image

In order to validate the image processing strategy for locating the laser in the image, a device composed of a webcam and a 650 nm laser with punctiform emission was used. The processing was initially done in MATLAB [5] to avoid possible hardware problems on the embedded system. In fig. 3 the laser was pointed at a region near the center of the image. As can be seen, there are other red points from objects scattered around different areas of the image. It is also possible to note that the laser point region appears as a white area, indicating saturation of the CMOS sensor. Thus, it is difficult to distinguish between the laser center and white areas from other parts of the image using only the RGB values as reference.

Fig. 3. Laser beam pointed near the image center area.

Fig. 4 shows the result of an attempt to detect the laser point by searching for pixels with the laser border color. In this case, several points were falsely reported as laser border pixels after the image processing. These false points lead the system to calculate and indicate the obstacle distance incorrectly.

Fig. 4. Result of processing the image from fig. 3 with a pixel-color-based algorithm.

B. Solving the camera saturation problem

Considering that this image treatment is done by a microcontroller, the use of conventional techniques for image

p is an auxiliary variable representing the proportion of the distance between the laser point and one image border relative to the entire horizontal length of the image. Its value can be calculated according to (4), where n is the laser point's horizontal pixel position and N is the image's horizontal resolution in pixels:

p = n / N   (4)

l is a variable related to the real horizontal size of the image at a distance d: it represents the real length between the laser point and the left image border at that distance. Since a camera with horizontal half-angle β covers a real width of 2·d·tan(β) at distance d, l can be calculated according to (5):

l = p · 2 · d · tan(β)   (5)

For the proposed model of fig. 5 to be valid, conditions (6) and (7) should be respected, where (7) requires 0 ≤ p ≤ 1. Finally, the variable x represents the distance between the laser emitter and the camera. The line in fig. 8 was determined using the following parameters: x = 3 cm, α = 22°, β = 22.5°. From these parameters it was possible to calculate the limit values for the telemetry process: d_a = 3.3 cm and d_b = 2.22 m. The value of x was defined by the real distance between the laser and the camera used in the prototype for tests on an x86 architecture; α was defined experimentally, and β was calculated in order to limit the maximum detection distance to 2.22 m.

Fig. 8. Obstacle distance (m) vs. laser position on screen (pixels). The x-axis is the horizontal pixel number of the image (1 to 305); the ordinate axis is the calculated real distance, in meters, between the obstacle and the camera for each pixel. The plot compares the calculated distance with the measured distance.

IV. RESULTS

A. Processor performance

Fig. 9 shows a system performance analysis made with a tool provided by Keil Software [6]. The result shows that the embedded system based on the LPC1768 microcontroller was able to process every frame from the camera in less than 2 milliseconds, as can be confirmed by looking at task number 5, where the time grid is set to 2 ms. This performance is enough to process images at a rate of 30 frames per second without using the microcontroller's maximum processing power, while providing a satisfactory response time for the user.

Fig. 9. Time used by each RTOS task on a 2-millisecond scale.

B. Laser/camera set performance

The device presented good performance in indoor applications. In this scenario the system easily identified the laser in the image captured by the camera on several types of surfaces. However, surfaces such as glass and mirrors represent limitations for the system: depending on how the laser reflection occurs, the device may report a wrong distance or be unable to detect the laser at all, which results in the absence of a distance value and an audible indication from the buzzer. In external environments the device works with the same performance as indoors; in external daytime environments, however, the excess of light makes it harder for the camera to find the laser spot. In this scenario it was difficult to detect the laser spot at distances beyond 1 meter, which implies a significant decrease in system performance.

C. Vibration feedback analysis

Feedback through vibration proved valid. However, the environment scanning tests showed that, for small variations in obstacle distance, it is difficult to perceive the variation in vibration intensity.
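The triangulation geometry can be sanity-checked numerically. The sketch below assumes a pinhole camera, so the viewing angle γ of a normalized horizontal position p ∈ [0, 1] satisfies tan γ = (2p − 1)·tan β, and a laser tilted by α toward the optical axis; these modeling assumptions and the function names are illustrative, since the paper does not fully specify its geometry.

```python
import math

def obstacle_distance(p, x=0.03, alpha_deg=22.0, beta_deg=22.5):
    """Distance (m) to the laser spot from its normalized image position p.

    Intersects the camera ray through the pixel (tan(gamma) = (2p-1)*tan(beta)
    for a pinhole camera) with the laser ray, offset by the baseline x and
    tilted by alpha toward the optical axis: d = x / (tan(alpha) + tan(gamma)).
    Returns None when the rays do not intersect in front of the camera.
    """
    tan_gamma = (2.0 * p - 1.0) * math.tan(math.radians(beta_deg))
    denom = math.tan(math.radians(alpha_deg)) + tan_gamma
    if denom <= 0.0:
        return None  # spot geometrically out of range
    return x / denom
```

With x = 3 cm, α = 22° and β = 22.5° this model gives a minimum distance of about 3.7 cm at the image edge, close to the reported d_a = 3.3 cm (the difference would come from details of the real geometry), and the distance grows without bound as the spot approaches the viewing angle −α, which is why β was chosen to cap the useful range at d_b = 2.22 m.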

D. Power consumption

In laboratory measurements, the device's average current consumption in the active state (camera, laser, and vibration motor on) was 300 mA. Considering that the device is powered by a 2500 mAh battery, the theoretical autonomy is about 8 hours. This autonomy is acceptable for an embedded device that does not stay active all the time, since it starts to work only when the user requires it. One factor that decreases system performance is that the vibration motor needs a PWM duty cycle larger than 30% to overcome its inertia. When the system is in the inactive state, the average current consumption is about 1 mA. This is high for an inactive embedded device, but it can be reduced by managing some of the microcontroller's power features.

V. CONCLUSIONS

This paper presented a methodology for distance measurement based on a geometrical model using a laser and a CMOS camera. The model considers an embedded system with an LPC1768 microcontroller to calculate the obstacle distance and detect the laser in the camera image. The methodology applies to a low-cost white cane design for helping visually impaired people. Experiments show the effectiveness of the distance measurement and of the vibration motor as distance feedback to the user. Although the device works as expected, it is possible to implement new strategies to improve power consumption.

REFERENCES
[1] D. Yuan and R. Manduchi, "Dynamic environment exploration using a virtual white cane," IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2005), vol. 1, pp. 243-249, IEEE, 2005.
[2] J.L. Shumaker, "Interfacing the TCM8230MD CMOS Camera with an ARM7 Microcontroller," Army Research Laboratory, ARL-TN-344, 2009.
[3] K.J. Kuchenbecker and Y. Wang, "HALO: Haptic Alerts for Low-hanging Obstacles in White Cane Navigation," 2012, unpublished.
[4] S.A. Bouhamed, J.F. Eleuch, I.K. Kallel and D.S. Masmoudi, "New electronic cane for visually impaired people for obstacle detection and recognition," IEEE International Conference on Vehicular Electronics and Safety, Istanbul, Turkey, 2012.
[5] NXP Semiconductors, UM10360 LPC17xx User Manual, 2010.
[6] Toshiba, CMOS Digital Integrated Circuit Silicon Monolithic TCM8230MD(A), V1.2, 2005.
[7] ARM Ltd and ARM Germany GmbH, Getting Started: Building Applications with RL-ARM, 2009.
[8] T.C. Poon and P.P. Banerjee, Contemporary Optical Image Processing with MATLAB, Elsevier Science, 2012.
[9] A.N. Sloss, D. Symes and C. Wright, ARM System Developer's Guide: Designing and Optimizing System Software, Morgan Kaufmann, 2004.
[10] J. Yiu, The Definitive Guide to the ARM Cortex-M3, Newnes, 2009.