Distance measuring based on stereoscopic pictures



Similar documents
How To Use 3D On A Computer Or Tv

CASE HISTORY #2. APPLICATION: Piping Movement Survey using Permalign Laser Measurement System

Lines. We have learned that the graph of a linear equation. y = mx +b

Optics. Kepler's telescope and Galileo's telescope. f 1. f 2. LD Physics Leaflets P Geometrical optics Optical instruments

Affine Transformations

Polynomials and Factoring

Assessment of Camera Phone Distortion and Implications for Watermarking

Basic Theory of Intermedia Composing with Sounds and Images

Measuring. User Manual

A CLASSROOM NOTE ON PARABOLAS USING THE MIRAGE ILLUSION

THE PARABOLA section

INFITEC - A NEW STEREOSCOPIC VISUALISATION TOOL BY WAVELENGTH MULTIPLEX IMAGING

Geometric Camera Parameters

Reflection and Refraction

Differentiation of 3D scanners and their positioning method when applied to pipeline integrity

Basler. Line Scan Cameras

Protocol for Microscope Calibration

Spatial location in 360 of reference points over an object by using stereo vision

Traffic Monitoring Systems. Technology and sensors

Theremino System Theremino Spectrometer Technology

Electrical Resonance

Synthetic Sensing: Proximity / Distance Sensors

Method for measuring stereo camera depth accuracy based on stereoscopic vision

Interference. Physics 102 Workshop #3. General Instructions

Plane Stress Transformations

Fric-3. force F k and the equation (4.2) may be used. The sense of F k is opposite

How To Draw A Circle Of Radius 1 On A Computer (For A Computer)

Glencoe. correlated to SOUTH CAROLINA MATH CURRICULUM STANDARDS GRADE 6 3-3, , , 4-9

Auto Head-Up Displays: View-Through for Drivers

Ground Rules. PC1221 Fundamentals of Physics I. Kinematics. Position. Lectures 3 and 4 Motion in One Dimension. Dr Tay Seng Chuan

Robot Perception Continued

Optical laser beam scanner lens relay system

Introduction.

Understanding astigmatism Spring 2003

HIGH-PERFORMANCE INSPECTION VEHICLE FOR RAILWAYS AND TUNNEL LININGS. HIGH-PERFORMANCE INSPECTION VEHICLE FOR RAILWAY AND ROAD TUNNEL LININGS.

Optical Illusions Essay Angela Wall EMAT 6690

PHYSIOLOGICALLY-BASED DETECTION OF COMPUTER GENERATED FACES IN VIDEO

COMPONENTS OF VECTORS

T-REDSPEED White paper

STRAND: ALGEBRA Unit 3 Solving Equations

Addition and Subtraction of Vectors

MEMORANDUM. All students taking the CLC Math Placement Exam PLACEMENT INTO CALCULUS AND ANALYTIC GEOMETRY I, MTH 145:

Mouse Control using a Web Camera based on Colour Detection

SECTION 2.2. Distance and Midpoint Formulas; Circles

Rodenstock Photo Optics

A PHOTOGRAMMETRIC APPRAOCH FOR AUTOMATIC TRAFFIC ASSESSMENT USING CONVENTIONAL CCTV CAMERA

Introduction to 3D Imaging

Section 11.4: Equations of Lines and Planes

E70 Rear-view Camera (RFK)

Stereo 3D Monitoring & Correction

Jitter Measurements in Serial Data Signals

DICOM Correction Item

A technical overview of the Fuel3D system.

waves rays Consider rays of light from an object being reflected by a plane mirror (the rays are diverging): mirror object

Calculating Astronomical Unit from Venus Transit

REAL TIME TRAFFIC LIGHT CONTROL USING IMAGE PROCESSING

Subjective evaluation of a 3D videoconferencing system

Virtual Fitting by Single-shot Body Shape Estimation

PHOTOGRAMMETRIC TECHNIQUES FOR MEASUREMENTS IN WOODWORKING INDUSTRY

Light and its effects

Characteristic and use

Using Photorealistic RenderMan for High-Quality Direct Volume Rendering

KINECT PROJECT EITAN BABCOCK REPORT TO RECEIVE FINAL EE CREDIT FALL 2013

LEICA TRI-ELMAR-M mm f/4 ASPH. 1

Anamorphic Projection Photographic Techniques for setting up 3D Chalk Paintings

ε: Voltage output of Signal Generator (also called the Source voltage or Applied

Note monitors controlled by analog signals CRT monitors are controlled by analog voltage. i. e. the level of analog signal delivered through the

P R E A M B L E. Facilitated workshop problems for class discussion (1.5 hours)

Template-based Eye and Mouth Detection for 3D Video Conferencing

Object Recognition and Template Matching

Understanding Line Scan Camera Applications

Endoscope Optics. Chapter Introduction

Synchronization in. Distributed Systems. Cooperation and Coordination in. Distributed Systems. Kinds of Synchronization.

A Binary Adaptable Window SoC Architecture for a StereoVision Based Depth Field Processor

A method of generating free-route walk-through animation using vehicle-borne video image

In this this review we turn our attention to the square root function, the function defined by the equation. f(x) = x. (5.1)

6. LECTURE 6. Objectives

Color and Light. DELTA SCIENCE READER Overview Before Reading Guide the Reading After Reading

3. Solve the equation containing only one variable for that variable.

Understanding Shaft Alignment: Thermal Growth. Published in Maintenance Technology 1/2003

Integrating algebraic fractions

Machine Tool Inspection & Analyzer solutions

TS-E24mm f/3.5l TS-E45mm f/2.8 TS-E90mm f/2.8 Instructions

HDTV: A challenge to traditional video conferencing?

Chapter 4 One Dimensional Kinematics

Scanners and How to Use Them

Static Environment Recognition Using Omni-camera from a Moving Vehicle

LIST OF CONTENTS CHAPTER CONTENT PAGE DECLARATION DEDICATION ACKNOWLEDGEMENTS ABSTRACT ABSTRAK

OPERATING P L AYA B I L I T Y P R E C I S I O N D E S I G N

Space Perception and Binocular Vision

3D Drawing. Single Point Perspective with Diminishing Spaces

Adobe Lens Profile Creator User Guide. Version 1.0 Wednesday, April 14, 2010 Adobe Systems Inc

AUDIO POINTER V2 JOYSTICK & CAMERA IN MATLAB

A System for Capturing High Resolution Images

Optimao. In control since Machine Vision: The key considerations for successful visual inspection

Displaying Stereoscopic 3D (S3D) with Intel HD Graphics Processors for Software Developers August 2011

Digital Photogrammetric System. Version USER MANUAL. Block adjustment

SPINDLE ERROR MOVEMENTS MEASUREMENT ALGORITHM AND A NEW METHOD OF RESULTS ANALYSIS 1. INTRODUCTION

Making High Dynamic Range (HDR) Panoramas with Hugin

INTRODUCTION TO ERRORS AND ERROR ANALYSIS

Transcription:

9th International Ph Workshop on Systems and Control: Young Generation Viewpoint 1. - 3. October 8, Izola, Slovenia istance measuring d on stereoscopic pictures Jernej Mrovlje 1 and amir Vrančić Abstract Stereoscopy is a technique used for recording and representing stereoscopic (3) images. It can create an illusion of depth using two pictures taken at slightly different positions. There are two possible way of taking stereoscopic pictures: by using special two-lens stereo cameras or systems with two single-lens cameras joined together. Stereoscopic pictures allow us to calculate the distance from the camera(s) to the chosen object within the picture. The distance is calculated from differences between the pictures and additional technical data like focal length and distance between the cameras. The certain object is selected on the left picture, while the same object on the right picture is automatically detected by means of optimisation algorithm which searches for minimum difference between both pictures. The calculation of object s position can be calculated by doing some geometrical derivations. The accuracy of the position depends on picture resolution, optical distortions and distance between the cameras. The results showed that the calculated distance to the subject is relatively accurate. I. INTROUCTION Methods for measuring the distance to some objects can be divided into active and passive ones. The active methods are measuring the distance by sending some signals to the object [, 4] (e.g. laser beam, radio signals, ultra-sound, etc.) while passive ones only receive information about object s position (usually by light). Among passive ones, the most popular are those relying on stereoscopic measuring method. The main characteristic of the method is to use two cameras. The object s distance can be calculated from relative difference of object s position on both cameras [6]. The paper shows implementation of such algorithm within program package Matlab and gives results of eperiments d on some stereoscopic pictures taken in various locations in space. II. STEREOSCOPIC MEASUREMENT METHO Stereoscopy is a technique used for recording and representing stereoscopic (3) images. It can create an illusion of depth using two pictures taken at slightly different positions. In 1838, ritish scientist Charles Wheatstone invented stereoscopic pictures and viewing devices [1, 5, 7]. Stereoscopic picture can be taken with a pair of cameras, 1 Jernej Morvlje is with the Faculty of Electrical Engineering, University of Ljubljana, Slovenia; (e-mail: jernej.mrovlje@gmail.com). amir Vrančić is was with epartment of Systems and Control, J. Stefan Institute, Jamova 39, 1 Ljubljana, Slovenia (e-mail: damir.vrancic@ijs.si). similarly to our own eyes. The most important restrictions in taking a pair of stereoscopic pictures are the following: cameras should be horizontally aligned (see Figure 1), and the pictures should be taken at the same instant [3]. Note that the latest requirement is not so important for static pictures (where no object is moving). horizontal vertical error vertical error Fig. 1. Proper alignment of the cameras (a) and alignments with vertical errors (b) and (c). Stereoscopic pictures allow us to calculate the distance between the camera(s) and the chosen object within the picture. Let the right picture be taken in location S R and the left picture in location S L. represents the distance between the cameras and φ is camera s horizontal angle of view (see Figure ). Object s position (distance ) can be calculated by doing some geometrical derivations. We can epress distance as a sum of distances 1 and : ( a) ( b) ( c) = + ϕ, (1) 1 + = tanϕ1 tan if optical aes of the cameras are parallel, where φ 1 and φ are angles between optical ais of camera lens and the chosen object. istance is as follows: = () tanϕ 1 + tanϕ

9th International Ph Workshop on Systems and Control: Young Generation Viewpoint 1. - 3. October 8, Izola, Slovenia where φ is camera s horizontal angle of view and picture resolution (in piels). 1 ϕ1 ϕ ϕ 1 ϕ ϕ 1 ϕ S L Fig.. The picture of the object (tree) taken with two cameras. S R S L Using images 3 and 4 and basic trigonometry, we find: 1 tanϕ1 =, (3) ϕ tan Fig. 3. The picture of the object (tree) taken with left camera. istance can be calculated as follows: tanϕ =. (4) ϕ tan ϕ = ϕ tan L ( ) Therefore, if the distance between the cameras (), number of horizontal piels ( ), the viewing angle of the camera (ϕ ) and the horizontal difference between the same object on both pictures ( L - ) are known, then the distance to the object () can be calculated as given in epression (5). The accuracy of the calculated position (distance ) depends on several variables. Location of the object in the right picture can be found within accuracy of one piel (see Fig. 5). Each piel corresponds to the following angle of view ϕ ϕ = (5), (6) Fig. 4. The picture of the object (tree) taken with right camera. Image 5 shows one piel angle of view φ which results in distance error. In image we find: ϕ S R

9th International Ph Workshop on Systems and Control: Young Generation Viewpoint 1. - 3. October 8, Izola, Slovenia tanϕ tan ( ϕ ϕ) + =, (7) Using basic trigonometry identities, the distance error can be epressed as follows: subtracted images are, lower is the mean absolute value of matri I i. Image 6 shows two eamples of subtraction. Images in the first eample do not match, while in the second eample images match almost perfectly. = tan( ϕ). (8) Note that the actual error might be higher due to optical errors (barrel or pincushion distortion, aberration, etc.) [8]. L R ϕ IL ( NN ) I ( MN) Fig. 6. Selected object on the left picture (red colour) and search area in the right picture (green colour). R ϕ STEP 1 I R1 MN NN 1p STEP I R STEP3 p I R3 Fig. 5. istance error caused by 1 piel positioning error. III. OJECT RECOGNITION When the certain object is selected on the left picture, the same object on the right picture has to be automatically located. Let the square matri I L represent selected object on the left picture (see Figure 6). Knowing the properties of stereoscopic pictures (the objects are horizontally shifted), we can define search area within the right picture matri I R. Vertical dimensions of I R and I L should be the same, while horizontal dimension of I R should be higher. Our net step is to find the location within the search area I R, where the picture best fits matri I L. We do that by subtracting matri I L from all possible submatries (in size of matri I L ) within the matri I R. When size of the matri I L is NN and size of the matri I R is MN, where M>N, M-N+1 submatries have to be checked. Image 7 shows the search process. The result of each subtraction is another matri I i which tells us how similar subtracted images are. More similar two STEP M N STEP M N + 1... 1p I RM N I RM N + 1 Fig. 7. Finding appropriate submatri in the right picture.... The matri I k with the lowest mean of its elements, represents the location where matri I L best fits to matri I R. This is also the location of the chosen object within the right image. Knowing the location of the object in the left and right image allow us to calculate the distance between the pictures: L N M = + k 1. (9)

9th International Ph Workshop on Systems and Control: Young Generation Viewpoint 1. - 3. October 8, Izola, Slovenia Eample 1: I Rj E j = mean( I R j I L ) I j = I Rj I L IL Eample : I Rk Ek = mean( I Rk I L ) Fig. 1. Picture with targets (small white panel boards) located at 1m, m, 3m, 4m, 5m and 6m. I k = I Rk I L IL Fig. 8. The difference between selected object in the left picture and submatri on the right picture. y N / + k 1 CL C L Fig. 11. etailed view of Figure 1 (markers). Fig. 9. Calculation of the horizontal distance of the object between left and right picture. After adjusting the cameras to be parallel, the distances were calculated in program package Matlab. The calculated distances for different locations and different markers (1m, m, 3m, 4m, 5m and 6m) are given in Tables 1 to 6. The calculated difference (9) can be used in calculation of object s distance (5). Table 1. Calculated distance to marker at 1m in various locations M / N / N / M / Marker at 1m IV. EXAMPLES Four eperiments have been conducted in order to test the accuracy of our distance-measuring method. We have used si markers placed at distances 1m, m, 3m, 4m, 5m and 6m. Those markers have been placed at various locations in space (nature) and then photographed by two cameras. One of the pictures (with detail view) is shown in Figs. 1 and 11. [m] location 1 location location 3 location 4, 1,81 1,58 1,1 1,34,3 1,63 1,4 9,77 1,47,4 1,45 1,4 1,1 1,8,5 1,9 1,4 9,81 1,13,6 1,18 1,3 9,9 1,18,7 1,18 9,84 9,96 1,13

9th International Ph Workshop on Systems and Control: Young Generation Viewpoint 1. - 3. October 8, Izola, Slovenia Table. Calculated distance to marker at m in various locations Table 6. Calculated distance to marker at 6m in various locations Marker at m Marker at 6m [m] location 1 location location 3 location 4, 1,3 1,57 1,57,56,3,86 1,87 1,87,68,4,81,49,49,33,5,4,6,6,6,6,19,59,59,14,7,44,41,41 19,86 Table 3. Calculated distance to marker at 3m in various locations Marker at 3m [m] location 1 location location 3 location 4, 3,8 3,47 33, 9,57,3 31,81 31,56 7,97 31,13,4 3,5 3,93 3,1 3,65,5 3,5 31,55 9, 9,77,6 3,88 31,58 3,6 3,8,7 3,74 3,33 31,5 3,71 [m] location 1 location location 3 location 4, 63,36 67,69 7,67 53,6,3 63,56 64,63 59,9 64,99,4 64,97 6,6 6, 6,95,5 61,95 63,97 6,1 59,4,6 6,7 63,75 6,13 61,1,7 61,57 61,4 61,75 6,55 Figures 1 to 17 show the results graphically. It can be seen that the distance to the subjects is calculated quite well taking into account the camera s resolution, barrel distortion [8] and chosen. As epected, better accuracy is obtained with wider between the cameras. 11 1.5 Location Table 4. Calculated distance to marker at 4m in various locations Marker at 4m [m] location 1 location location 3 location 4 1, 4,49 45,8 4,9 37,98,3 41,67 4,7 4,76 39,78,4 4,4 4,86 38,61 4,93,5 4,17 41,4 4,5 38,4,6 4,69 41,34 41,8 4,4,7 41,1 4,84 39,73 4,5 9.5.1..3.4.5.6.7.8 Fig. 1. Calculated distances to marker set at 1m at 4 different locations..5 Location Table 5. Calculated distance to marker at 5m in various locations Marker at 5m [m] location 1 location location 3 location 4, 5,7 57,53 47,38 46,8,3 54,4 5,74 45,9 5,,4 53,1 51,16 57,5 5,15,5 51, 5,5 49,1 48,1,6 51,41 5,34 51,48 5,3,7 5,3 5,5 53,85 5,7 1.5 1.5 19.5.1..3.4.5.6.7.8 Fig. 13. Calculated distances to marker set at m at 4 different locations.

.1..3.4.5.6.7.8 9th International Ph Workshop on Systems and Control: Young Generation Viewpoint 1. - 3. October 8, Izola, Slovenia 35 8 34 Location 75 Location 33 7 3 31 3 65 6 9 55 8 5 7 45.1..3.4.5.6.7.8 Fig. 14. Calculated distances to marker set at 3m at 4 different locations. Fig. 176. Calculated distances to marker set at 6m at 4 different locations. 48 46 44 4 4 38 Location 36.1..3.4.5.6.7.8 Fig. 15. Calculated distances to marker set at 5m at 4 different locations. V. CONCLUSION The proposed method for non-invasive distance measurement is d upon the pictures taken from two horizontally displaced cameras. The user should select the object on left camera and the algorithm finds similar object on the right camera. From displacement of the same object on both pictures, the distance to the object can be calculated. Although the method is d on relatively simple algorithm, the calculated distance is still quite accurate. etter results are obtained with wider (distance between the cameras). This is all according to theoretical derivations. In future research, the proposed method will be tested on some other target objects. The influence of lens barrel distortion on measurement accuracy will be studied as well. 6 6 58 56 54 5 5 48 46 44 Location 4.1..3.4.5.6.7.8 Fig. 165. Calculated distances to marker set at 5m at 4 different locations. REFERENCES [1] M. Vidmar, AC sodobne stereofotografije z maloslikovnimi kamerami, Cetera, 1998. [] H. Walcher, Position sensing Angle and distance measurement for engineers, Second edition, utterworth- Heinemann Ltd., 1994. [3]. Vrančić and S. L. Smith, Permanent synchronization of camcorders via LANC protocol, Stereoscopic displays and virtual reality systems XIII : 16-19 January, 6, San Jose, California, USA, (SPIE, vol. 655). [4] Navigation, Radio Navigation, Revised edition, Oford Aviation Training, 4. [5] P. Gedei, Ena in ena je tri, Monitor, july 6, http://www.monitor.si/clanek/ena-in-ena-je-tri/, (1.1.7). [6] J. Carnicelli, Stereo vision: measuring object distance using piel offset, http://www.aleandria.nu/ai/blog/entry.asp?e=3, (9.11.7). [7] A. Klein, Questions and answers about stereoscopy, http://www.stereoscopy.com/faq/inde.html, (8.1.7). [8] A. Woods, Image distortions and stereoscopic video systems, http://www.3d.curtin.edu.au/spie93pa.html, (7.1.7).