Accurate and Cheap Robot Range Finder


Ivan Papusha

December 12, 2008

Abstract

A novel high-quality distance sensor for robotics applications is proposed. The sensor relies on triangulation using the offset of a laser line as it is reflected off an object into a cheap webcam. An ML approach to fitting the error model showed that the sensor was very accurate whenever the laser line was found. It is suggested that better line-finding methods than simple color filtering would make the sensor more viable.

1 Introduction

Any self-respecting mobile robot requires some sort of distance measurement sensor or range-finding mechanism. Among the most accessible are ultrasonic range finders, which time the propagation and reflection of a sonic pulse. While cheap and ubiquitous, ultrasonic rangers, especially when placed in a ring configuration, are plagued with fundamental difficulties that make them unattractive for serious localization. At the other end of the spectrum are high-end laser rangefinders, which use light instead of sound to determine distance from time of flight. These are often bundled with precision optics and mechanisms that allow one to infer 3D point clouds over large and small human-scale distances. Their precision and reliability make them ideal for industrial and research applications, but their cost is prohibitive to the hobbyist or researcher on a low budget.

To balance user cost against sensor precision and reliability, we consider a different sensor design. Here, the only materials needed are a laser line generator and a camera (e.g., a cheap webcam). The principle of operation is simple: one points the line generator and the camera in the same direction, but offsets them by some known distance $d_{\text{off}}$ (about 10 cm). The camera sees the laser line, filtered from the rest of the environment based on brightness and color, which is offset whenever it strikes an object. By calculating the offset from the expected position of the laser line, the distance to the object can be inferred.

This sensor, of course, is not perfect and is subject to its own errors. In particular, its error model and corresponding error parameters are quite different from those of time-of-flight rangers such as the ones discussed above. This paper formulates the error model and uses an automatic maximum likelihood (ML) method that learns the fundamental parameters of the error model from a proposed calibration scheme [3] [1]. Results of the sensor and sensor model in action are also discussed.

I thank Morgan Quigley for his direction, as well as the STAIR Project for the materials.
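To illustrate the triangulation principle before the detailed geometry of Section 2.1, the following minimal sketch converts a pixel offset into a distance estimate under a simplified flat geometry that ignores camera tilt. The baseline and angle-per-pixel values are assumed for illustration only; the actual sensor uses the line-plane intersection of Section 2.1.

```python
import math

def distance_from_offset(pixel_offset, d_off=0.10, rad_per_pixel=0.001):
    """Estimate object distance from the laser line's apparent offset.

    Simplified sketch: the laser plane lies d_off meters below the camera
    axis, and each pixel of vertical offset corresponds to rad_per_pixel
    radians of viewing angle (both values are illustrative assumptions).
    By similar triangles, tan(phi) = d_off / d_obj.
    """
    phi = pixel_offset * rad_per_pixel   # angle below the camera axis
    if phi <= 0:
        return float("inf")              # line at the horizon: object at infinity
    return d_off / math.tan(phi)

# Example: a 40-pixel offset maps to 0.10 / tan(0.04), about 2.5 m.
print(distance_from_offset(40))
```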

2 Range Sensor Design

2.1 Geometry and Raycasting

The relations among the fundamental lengths and angles described in Figure 1 are derived using a simple ray-casting scheme. The sensor is extremely sensitive to its construction parameters, so to eliminate systematic error, a sturdy construction is necessary. Note that the camera must be appropriately calibrated for radial distortion (using, e.g., a standard checkerboard algorithm [2]) so that these figures make sense.

[Figure 1: Single-point simplified ranger geometry, side view, showing the laser line, the camera's field of view, the object-laser intersection, and the quantities $d_{\text{off}}$, $d_{\text{obj}}$, $\phi$, $\theta$, and $\theta_{\text{rot}}$. The camera is tilted such that the actual horizon corresponds to the bottom field-of-view limit. The object is $d_{\text{obj}}$ away from the laser line generator, which is separated from the camera by $d_{\text{off}}$, a known distance. The camera sees the point of intersection $y_{rc}/y_{res}$ of the way from the bottom of its viewing plane.]

In order to triangulate the distance to a given object, we use a standard line-plane intersection method. The camera is assumed to be at the origin of a 3D spherical coordinate system. The plane of intersection is the one generated by the laser line, offset a distance $d_{\text{off}}$ below the camera. After calibration [2] and laser line detection (see below), the horizontal and vertical angles, $\theta_{\text{horiz}}$ and $\theta_{\text{vert}}$ respectively, are inferred from the location of the laser line in each column of the image. Finally, the distance $d_{\text{obj}}$ to the object is calculated by finding the intersection of a line and a plane.

2.2 Detecting the Laser Line

2.2.1 Color Filtering

Assume that the laser line has color represented by the RGB vector $v = (r, g, b)^T$, where $r$, $g$, and $b$ are intensity values for the red, green, and blue components of the color, respectively. Experimentally, typical values for the components of $v$ are $(68, 127, 12)^T$ on a scale from 0 to 255. For every pixel with color $w$, we can project $w$ onto $v$ to see how strongly $w$ corresponds to $v$ using the dot product:

$$y = w \cdot \hat{v} = \frac{w \cdot v}{\|v\|}, \tag{1}$$

where $y \in \mathbb{R}$ is the strength of the projection onto $v$ and $\hat{v}$ is the unit vector in the direction of $v$. Colors strongly correlated with $v$ give large values of $y$, and colors weakly correlated with $v$ give small values.
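As a concrete sketch of this filtering step (a minimal example; the laser color vector, image shape, and function name are illustrative assumptions), the per-pixel projection of Equation 1 can be computed for a whole frame at once:

```python
import numpy as np

def laser_strength(image, v=(68, 127, 12)):
    """Project every pixel's RGB color onto the laser color vector v.

    image: H x W x 3 array of intensities in [0, 255].
    Returns an H x W array y, where y = (w . v) / ||v|| for each pixel w
    (Equation 1). Larger values indicate colors closer to the laser's.
    """
    v = np.asarray(v, dtype=float)
    v_hat = v / np.linalg.norm(v)        # unit vector in the direction of v
    return image.astype(float) @ v_hat   # dot product per pixel

# Example on a random frame at webcam resolution.
frame = np.random.randint(0, 256, size=(480, 640, 3))
y = laser_strength(frame)
print(y.shape)  # (480, 640)
```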

2.2.2 Fitting a Gaussian

Since each column of pixels corresponds to one offset, and hence one measurement, we can find where the line is by a simple fit to a Gaussian. To make this more concrete, suppose that in a given column the projections onto $v$ are the scalars $y^{(1)}, y^{(2)}, \ldots, y^{(r)}$, where $r$ is the camera's vertical resolution in pixels. We can therefore treat the values $y^{(1)}, \ldots, y^{(r)}$ as entries in a histogram with $r$ bins of size 1. The expected position of the laser line is simply the weighted mean

$$\mathbb{E}[y] = \left( \sum_{i=1}^{r} y^{(i)} \right)^{-1} \sum_{i=1}^{r} i \, y^{(i)}. \tag{2}$$

Notice that $\mathbb{E}[y]$ can be a fraction, which effectively allows a little extra resolution beyond the camera's pixel resolution. In practice, detection works better if the camera is taken slightly out of focus, which blurs out some noise at the expense of a wider distribution over the $y$ values.
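A minimal sketch of this subpixel estimate, assuming a column of projection strengths from the previous step (the function name is illustrative):

```python
import numpy as np

def line_position(column_strengths):
    """Subpixel laser line position in one pixel column (Equation 2).

    column_strengths: length-r array of projections y^(1), ..., y^(r).
    Returns the strength-weighted mean row index, which may be
    fractional, giving resolution beyond the camera's pixel grid.
    """
    y = np.asarray(column_strengths, dtype=float)
    rows = np.arange(1, len(y) + 1)      # bin indices 1..r
    return np.sum(rows * y) / np.sum(y)

# Example: a blurred line centered between rows 3 and 4.
print(line_position([0.0, 0.1, 0.8, 0.8, 0.1, 0.0]))  # -> 3.5
```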

3 Sensor Calibration

3.1 Gathering Data

[Figure 2: (a) Raw data and (b) histogram for 10,000 measurements of a box 2.48 meters from the sensor. About 63% of the measurements are centered around the true distance. A small number of values are scattered randomly through the entire measurement space, and the rest return $z_{\text{max}}$, when the laser line was not found.]

A box is placed a known distance $d_{\text{obj}}$ (on the order of 1 m or more) in the center third of the camera's field of view. Only those pixels which correspond to the object are considered. After the laser line is found and ray casting is used to extract the distance to the object, one should expect all the measurements in the point cloud to center around the actual distance $d_{\text{obj}} = \mu$ to the box. Label all the measurements $z^{(1)}, z^{(2)}, \ldots, z^{(m)}$, where in this case $m = 10{,}000$.

3.2 Error Model

We use a modified decomposition of the measurement density discussed in [3]. In particular, the probability $p(z^{(i)} \mid \mu)$ of each measurement $z^{(i)}$, given that the actual distance is $\mu$, is split into three distributions:

Correct measurement: A single-variable Gaussian distribution $p_{\text{hit}}(z^{(i)} \mid \mu; \sigma_{\text{hit}}^2)$ centered around the correct measurement. Its variance $\sigma_{\text{hit}}^2$ is a parameter of the model that measures noise. We include in the model only those values corresponding to the measurement space that were not failures, i.e.,

$$p_{\text{hit}}(z^{(i)} \mid \mu; \sigma_{\text{hit}}^2) = \begin{cases} \eta \, \mathcal{N}(z^{(i)}; \mu, \sigma_{\text{hit}}^2) & \text{if } 0 \le z^{(i)} < z_{\text{max}}, \\ 0 & \text{otherwise}, \end{cases}$$

where $\eta$ is a normalization parameter equal to the inverse of the cumulative density over the measurement space,

$$\eta = \left( \int_0^{z_{\text{max}}} \mathcal{N}(z; \mu, \sigma_{\text{hit}}^2) \, dz \right)^{-1}.$$

Random measurement: A uniform distribution $p_{\text{rand}}(z^{(i)} \mid \mu)$ corresponding to measurement noise through the entire measurement space, i.e.,

$$p_{\text{rand}}(z^{(i)} \mid \mu) = \begin{cases} 1/z_{\text{max}} & \text{if } 0 \le z^{(i)} < z_{\text{max}}, \\ 0 & \text{otherwise}. \end{cases}$$

Failure: A point-mass distribution $p_{\text{max}}(z^{(i)} \mid \mu) = \mathbf{1}\{z^{(i)} = z_{\text{max}}\}$ equal to 1 only if the measurement is a failure, e.g., the camera failed to detect a line.

The probabilities are mixed using the mixing parameters $\alpha_{\text{hit}}$, $\alpha_{\text{rand}}$, $\alpha_{\text{max}}$, where we constrain $\alpha_{\text{hit}} + \alpha_{\text{rand}} + \alpha_{\text{max}} = 1$:

$$p(z^{(i)} \mid \mu; \sigma_{\text{hit}}^2, \alpha_{\text{hit}}, \alpha_{\text{rand}}, \alpha_{\text{max}}) = \begin{pmatrix} \alpha_{\text{hit}} \\ \alpha_{\text{rand}} \\ \alpha_{\text{max}} \end{pmatrix}^T \begin{pmatrix} p_{\text{hit}}(z^{(i)} \mid \mu; \sigma_{\text{hit}}^2) \\ p_{\text{rand}}(z^{(i)} \mid \mu) \\ p_{\text{max}}(z^{(i)} \mid \mu) \end{pmatrix}. \tag{3}$$

3.3 Learning from Data to Infer Error Parameters

We formulate the learning problem of finding the probability density function by maximizing the total likelihood over the parameter $\theta = (\sigma_{\text{hit}}^2, \alpha_{\text{hit}}, \alpha_{\text{rand}}, \alpha_{\text{max}})^T$. That is,

$$\theta^* = \arg\max_{\theta} \prod_{i=1}^{m} p(z^{(i)} \mid \mu; \sigma_{\text{hit}}^2, \alpha_{\text{hit}}, \alpha_{\text{rand}}, \alpha_{\text{max}}) \tag{4}$$

subject to $\alpha_{\text{hit}} + \alpha_{\text{rand}} + \alpha_{\text{max}} = 1$, where $m$ is the number of training examples, in this case 10,000.
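The following is a minimal EM-style sketch of this fit. It is an illustration under stated assumptions, not the author's exact implementation: the measurements are synthetic, readings equal to $z_{\text{max}}$ are treated as exact failures, and the truncation correction to the $\sigma$ update is omitted for brevity.

```python
import numpy as np
from scipy.stats import norm

def fit_error_model(z, mu, z_max, iters=100):
    """EM for the three-component mixture of Section 3.2 (Equation 4).

    z: array of range measurements; mu: true distance; z_max: failure value.
    Returns (sigma_hit, alpha_hit, alpha_rand, alpha_max).
    """
    z = np.asarray(z, dtype=float)
    ok = z < z_max                       # non-failure measurements
    sigma, a_hit, a_rand, a_max = 0.5, 1/3, 1/3, 1/3
    for _ in range(iters):
        # E-step: responsibilities. Failures belong to the point mass.
        eta = 1.0 / (norm.cdf(z_max, mu, sigma) - norm.cdf(0.0, mu, sigma))
        p_hit = np.where(ok, eta * norm.pdf(z, mu, sigma), 0.0)
        p_rand = np.where(ok, 1.0 / z_max, 0.0)
        p_max = np.where(ok, 0.0, 1.0)
        w = np.vstack([a_hit * p_hit, a_rand * p_rand, a_max * p_max])
        w /= w.sum(axis=0)
        # M-step: update mixing weights and the Gaussian width.
        a_hit, a_rand, a_max = w.mean(axis=1)
        sigma = np.sqrt(np.sum(w[0] * (z - mu) ** 2) / np.sum(w[0]))
    return sigma, a_hit, a_rand, a_max

# Synthetic check with roughly the proportions reported in Figure 3:
# 64% hits, 9% random readings, 27% failures.
rng = np.random.default_rng(0)
z = np.concatenate([rng.normal(2.48, 0.37, 6400),
                    rng.uniform(0, 5.5, 900),
                    np.full(2700, 5.5)])
print(fit_error_model(z, mu=2.48, z_max=5.5))
```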

3.4 Results

The Expectation Maximization (EM) algorithm [1] was used to solve the constrained optimization problem of Equation 4. The results are summarized in Figure 3. Having run the algorithm for several distances, the mixing parameters remain about the same under the same lighting conditions, while the Gaussian width $\sigma_{\text{hit}}$ increases with distance. This makes sense, because as the distance to the object increases, the offsets approach the resolution of the camera.

[Figure 3: Plot of the fitted probability density function for measurements of distances to a box 2.48 m away ($\sigma = 0.36536$, $\mu = 2.4785$, $z_{\text{hit}} = 0.63889$, $z_{\text{max}} = 0.2727$, $z_{\text{rand}} = 0.08841$). Notice the uniform density spread throughout the entire space and the Gaussian centered around the actual distance. The mixing parameters show that about 64% of the results come from the Gaussian, and 27% belong to the point-mass failure function. The rest are random readings.]

4 Conclusion

4.1 Advantages Over Ultrasonic Rangers

The horizontal resolution of the sensor is limited only by the horizontal resolution of the camera. That is, the distances provided by the sensor are much finer-grained than those of a single, or even several, sonar elements. In addition, the webcam sensor does not suffer from sonar cross-talk, and it can make measurements as quickly as the camera can take pictures and the hardware can process them. In ultrasonic sensors, one needs to wait a long time in clock cycles after sending a ping before the ping returns (or fails to return). During this time, no more measurements can be made. The slow measurement-to-measurement time can be detrimental in a dynamic environment, as the robot can fail to see objects that move quickly in and out of view. Because the webcam sensor relies on light, the camera sees the laser and can calculate offsets almost instantaneously.

4.2 Disadvantages

The main problem with the sensor is the sensitivity of the laser line detection. The measurement $z_{\text{max}}$ is returned whenever the laser line is not found, which happens quite often. Because the scheme used is based on simple color filtering, any environment containing colors similar to the laser's makes it hard to find the line. In particular, the sensor is currently unsuitable for use in bright environments, e.g., outside in daylight.

It is possible to make the laser line detection more robust by using knowledge about what the line should look like in the camera. We know, for instance, that the line is horizontal or almost horizontal. One can train a classification algorithm to look for horizontal lines in an image, and then decide whether a detected horizontal line is part of the environment or the laser line. Indeed, this can be the subject of further research.

References

[1] Andrew Ng. EM algorithm. CS229 Class Notes, 2008.

[2] Hai Nguyen. ROS wiki: Camera calibration, 2008.

[3] Sebastian Thrun, Wolfram Burgard, and Dieter Fox. Probabilistic Robotics, chapter 6.3, pages 124-139. MIT Press, 2005.