A Short Tutorial on Three-Dimensional Cameras

A Short Tutorial on Three-Dimensional Cameras. Radu Horaud, INRIA Grenoble Rhone-Alpes. Radu.Horaud@inria.fr http://perception.inrialpes.fr/

Binocular Vision This is also called stereo vision: depth can be inferred by computing a disparity field at each pixel location. It is also referred to as passive range sensing. Most animals have two eyes, but only a few species have stereo vision: predators (tigers, cats), birds (owls), primates (monkeys, humans). Hint: observe eye placement, eye movements, and nose size! Other animals have developed active range strategies, e.g., sonar (bats, whales, etc.).

3D Sensors They measure depth by illuminating the scene with a controlled light source and measuring the backscattered light. There are two types of such sensors: projected-light sensors, which combine the projection of a light pattern with a standard 2D camera and measure depth via triangulation, and time-of-flight sensors, which measure depth by estimating the time delay from light emission to light detection. There are two TOF principles: (i) modulated light and (ii) pulsed light. There are two types of TOF cameras: a point-wise TOF sensor mounted on a two-dimensional (pan-tilt) scanning mechanism, also referred to as a LiDAR (Light Detection And Ranging), and matricial TOF cameras that estimate depth in a single shot using a matrix of TOF sensors (in practice they use CMOS or CCD image sensors coupled with a lens system).

Examples Projected-light cameras: Kinect, Asus Xtion Pro Live, Ensenso, etc. Modulated-light time of flight: SR4000 (SwissRanger). Pulsed-light time of flight: TigerEye (Advanced Scientific Concepts, Inc.). Pulsed-light rotating scanner: Velodyne.

The Kinect Projected-Light Depth Camera

The ENSENSO Stereo 3D Camera http://fr.ids-imaging.com/ensenso.html

The Time of Flight SR4000 Depth Camera http://www.mesa-imaging.ch/

The VELODYNE LiDAR Scanners HDL-64E HDL-32E http://velodynelidar.com/lidar/lidar.aspx

TigerEye 3D Flash LIDAR Camera http://www.advancedscientificconcepts.com/index.html

The Geometry of Passive Stereo (figure: a 3D point observed by the left and right cameras; epipolar lines, projection centers, epipoles)

The Geometry of Active Stereo (figure: a light projector emits light onto an object; the diffused light is observed by the camera; projection centers)

The Geometry of Active Stereo Perspective effects (foreshortening). Surface discontinuities. Cluttered scenes.

Foreshortening

Discontinuities

Clutter

Surface Reflectivity and External Illumination This is a scientific topic in its own right. Reflectance (or absorption) depends on the material properties and on the wavelength of the incident light. A rough way to think of surface reflectivity is to divide surfaces into diffuse (matte) and specular (mirror). In general a surface combines these effects with others: metallic, glossy, rough, transparent, and translucent surfaces all cause problems in practice. External illumination (daylight, artificial light, etc.) acts as noise added to the infra-red light emitted by the projector.

Reflectivity

Artifacts with the Kinect Camera

Back to Geometry Consider an IR camera and a projector in general positions (before rectification). Parameters to be estimated: the internal parameters of the IR camera, the external parameters of the IR camera, and the projection matrix of the light projector.

IR-Camera and Projector Geometry and Calibration

Geometry of Kinect (Rectified)

Geometry of a TOF camera

Depth Estimation The distance is computed with d = (1/2) c τ. In practice τ cannot be measured directly. Continuous-wave modulation: the phase difference between the sent and received signals is measured. The modulation frequency is in the range 10 to 100 MHz. Relationship between phase shift and time of flight (f is the modulation frequency): φ = 2πfτ.
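As a quick numeric illustration (a minimal Python sketch with hypothetical values, not part of the original slides), an object at 2.5 m produces a phase shift of exactly π at a 30 MHz modulation frequency, by d = (1/2)cτ and φ = 2πfτ:

import math

c = 3.0e8          # speed of light, m/s
f = 30e6           # modulation frequency, Hz
d_true = 2.5       # hypothetical object distance, m

tau = 2.0 * d_true / c          # round-trip time of flight: d = (1/2)*c*tau
phi = 2.0 * math.pi * f * tau   # phase shift between sent and received signals

print(tau, phi)                 # about 1.67e-08 s and 3.14 rad (i.e. pi)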

Cross-Correlation Between Sent and Received Signals Emitted signal: s(t) = a cos(2πft). Received signal: r(t) = A cos(2πf(t − τ)) + B. Cross-correlation between emitted and received signals: C(x) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} r(t) s(t + x) dt. The solution is: C(x) = (aA/2) cos(2πfτ + 2πfx) + B, with φ = 2πfτ and ψ = 2πfx.

Demodulation Parameters φ is the phase and is defined up to 2π; this is called phase wrapping. A is the amplitude of the received signal; it depends on the object's reflectivity and on the sensor's sensitivity. The amplitude decreases as 1/d², mainly due to light spreading. B is an offset coefficient due to the ambient illumination.

The 4-Bucket Method Estimate four values of C(x) at ψ0 = 0°, ψ1 = 90°, ψ2 = 180°, ψ3 = 270°.

Phase, Amplitude, and Offset From these four values of the cross-correlation signal we obtain the following solutions for the phase, amplitude and offset: φ = 2πfτ = arctan((C(x3) − C(x1)) / (C(x0) − C(x2))), A = (1/2a) √((C(x3) − C(x1))² + (C(x0) − C(x2))²), B = (1/4)(C(x0) + C(x1) + C(x2) + C(x3)).
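The following minimal Python sketch (hypothetical values, not from the slides) checks the 4-bucket recovery: four samples of C(ψ) = (aA/2)cos(φ + ψ) + B taken at ψ = 0°, 90°, 180°, 270° are enough to recover the phase, the offset, and the amplitude of the correlation signal:

import numpy as np

a, A, B_true, phi_true = 1.0, 0.6, 0.3, 1.2          # hypothetical demodulation parameters
psi = np.deg2rad([0.0, 90.0, 180.0, 270.0])
C = (a * A / 2) * np.cos(phi_true + psi) + B_true    # the four correlation samples

phi = np.arctan2(C[3] - C[1], C[0] - C[2])           # phase; arctan2 keeps the correct quadrant
B = C.mean()                                         # offset
amp = np.hypot(C[3] - C[1], C[0] - C[2]) / 2         # amplitude of C, i.e. a*A/2

print(phi, B, amp)                                   # 1.2, 0.3, 0.3 (= a*A/2)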

Implementation Based on CCD/CMOS Technologies The CCD (charge-coupled device) plays several roles: data acquisition or readout operation (the incoming photons are converted into electron charges), clocking, and signal processing (demodulation). After the demodulation, the signal C(ψ) is integrated at four equally spaced intervals, each of equal length Δt, within one modulation period T. These four signal values are stored independently. The cycle of integration and storage can be repeated over many periods. Example: for f = 30 MHz and at 30 frames per second (FPS), 10^6 integration periods are possible.

Depth from Phase and Modulation Frequency Depth: d = (1/2) c τ = (c/2f)(φ/2π). Minimum depth: d_min = 0 (φ = 0). Maximum depth: d_max = c/2f (φ = 2π). Phase wrapping: an inherent 2π phase ambiguity. Hence the distance is computed up to a wrapping ambiguity: d(i, j) = (φ(i, j)/2π + n(i, j)) d_max, where n = 0, 1, 2, ... is the number of wrappings. This can also be written as: d(i, j) = d_tof(i, j) + n(i, j) d_max, where d is the real depth value and d_tof is the measured depth value.
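The wrapping ambiguity can be illustrated with a small Python sketch (hypothetical values): at f = 30 MHz the unambiguous range d_max = c/2f is 5 m, so a point at 7 m is measured at 2 m (n = 1):

import math

c = 3.0e8                                        # speed of light, m/s
f = 30e6                                         # modulation frequency, Hz
d_max = c / (2 * f)                              # unambiguous range: 5 m

d_true = 7.0                                     # hypothetical true depth, one wrap beyond d_max
phi = 2 * math.pi * (d_true % d_max) / d_max     # wrapped phase, in [0, 2*pi)
d_tof = d_max * phi / (2 * math.pi)              # measured (wrapped) depth

print(d_max, d_tof)                              # 5.0 and 2.0: d_true = d_tof + n * d_max with n = 1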

More on Phase Wrapping For f = 30 MHz, the unambiguous range is from d_min = 0 to d_max = 5 meters. Solving for this ambiguity is called phase unwrapping. The modulation frequency of the SR4000 camera can be selected by the user. The camera can be operated at 29/30/31 MHz, corresponding to maximum depths of 5.17 m, 5 m and 4.84 m, or at 14.5/15/15.5 MHz, corresponding to maximum depths of 10.34 m, 10 m and 9.67 m. The accuracy increases with the modulation frequency.
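These maximum depths follow directly from d_max = c/2f; the short check below reproduces them (up to rounding):

c = 3.0e8
for f_mhz in (29, 30, 31, 14.5, 15, 15.5):
    print(f_mhz, "MHz ->", round(c / (2 * f_mhz * 1e6), 2), "m")
# 5.17, 5.0, 4.84, 10.34, 10.0 and 9.68 m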

Observations A TOF camera works at a very precise modulation frequency. Consequently, it is possible to simultaneously and synchronously use several TOF cameras by assigning a different modulation frequency to each of the cameras, e.g., six cameras in the case of the SR4000. A TOF camera needs a long integration time (IT), over several time periods, to increase the SNR and hence the accuracy. In turn, this introduces motion blur in the presence of moving objects. Because of the need for a long IT, fast shutter speeds (as done with standard cameras) cannot be envisaged. Sources of errors: demodulation error, integration error, temperature error.

From Depth Values to Euclidean Coordinates The TOF camera measures the depth d from the 3D point M to the optical center, hence d = ‖M‖. Relationship between the image coordinates p = (u, v, 1)ᵀ and the camera coordinates m = (x, y, 1)ᵀ of the point m: m = A⁻¹ p. Hence: Z = ‖M‖/‖m‖ = d/‖A⁻¹ p‖. We obtain for the Euclidean coordinates of the measured point M: (X, Y, Z)ᵀ = (d/‖A⁻¹ p‖) A⁻¹ (u, v, 1)ᵀ.
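This conversion can be applied to a whole depth image to produce a point cloud. The minimal Python sketch below (hypothetical intrinsic matrix and image size, not from the slides) computes, for every pixel p = (u, v, 1), the unit viewing ray A⁻¹p/‖A⁻¹p‖ and scales it by the measured depth d:

import numpy as np

A = np.array([[250.0,   0.0, 88.0],          # hypothetical TOF intrinsic matrix
              [  0.0, 250.0, 72.0],          # (focal lengths and principal point, in pixels)
              [  0.0,   0.0,  1.0]])

depth = np.full((144, 176), 2.5)             # hypothetical 176 x 144 depth image, in meters

v, u = np.mgrid[0:depth.shape[0], 0:depth.shape[1]]
p = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T   # pixels as a 3 x N array
rays = np.linalg.inv(A) @ p                                       # A^{-1} p
rays /= np.linalg.norm(rays, axis=0)                              # unit-norm viewing rays
points = rays * depth.reshape(1, -1)                              # X, Y, Z for every pixel

print(points.shape)                          # (3, 25344): one 3D point per pixel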

TOF Camera Calibration The CCD image sensor of a TOF camera provides both a depth image and an amplitude+offset image. The amplitude+offset image can be used by the OpenCV packages to calibrate the camera, namely its intrinsic and lens-distortion parameters. However, the low resolution of the TOF images (approximately 170 × 150 pixels) implies some practical considerations.
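A minimal calibration sketch with OpenCV along these lines (the checkerboard size, square size and file names are assumptions for illustration): checkerboard corners are detected in the amplitude images and passed to cv2.calibrateCamera, which returns the intrinsic matrix A and the lens-distortion coefficients. Because the images are low resolution, a board with large squares and many views helps.

import glob
import cv2
import numpy as np

board = (7, 5)                                       # assumed inner-corner count of the checkerboard
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * 0.04   # assumed 4 cm squares

obj_points, img_points = [], []
for name in glob.glob("amplitude_*.png"):            # assumed file names of the amplitude images
    img = cv2.imread(name, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(img, board)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

rms, A, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, img.shape[::-1], None, None)
print(rms, A, dist)                                  # reprojection error, intrinsics, distortion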

Practical Calibration of a TOF-Stereo Setup

Time of Flight with Pulsed Light A light pulse of a few nanoseconds is generated by a laser; distance is determined directly from the time delay between the emitted light pulse and its reflection; it can use very short pulses with high optical power: the pulse irradiance is much higher than the background irradiance while the emitted laser energy remains low (class 1); it does not suffer from the phase ambiguity problem; it is the technology of choice in a number of outdoor applications under adverse conditions: surveying (static and mobile), autonomous driving, cultural heritage, planetary missions.
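A small numeric illustration of pulsed-light ranging (hypothetical values, not from the slides): the distance is d = cΔt/2, so a 2 µs round trip corresponds to 300 m, and every nanosecond of timing error costs about 15 cm of range:

c = 3.0e8                        # speed of light, m/s

dt = 2e-6                        # hypothetical round-trip time of a pulse, s
print(c * dt / 2)                # 300.0 m

timing_error = 1e-9              # 1 ns of timing uncertainty
print(c * timing_error / 2)      # 0.15 m of range error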

Sensor Types Laser scanners use a rotating mirror; 2D (line) scanners (horizontal or vertical); 3D scanners (horizontal and vertical); Multiple-laser 2D line scanners (horizontal and vertical); 3D flash Lidar cameras (provide a depth image without any rotating mechanism).

The Principle of 3D Flash Cameras Elkhalili et al. (2004). IEEE Journal of Solid-State Circuits, Vol. 39, No. 7, pages 1208-1212.

3D Flash Lidar Cameras from Advanced Scientific Concepts Inc. Tiger Eye Portable 3D Dragon Eye

TigerEye 3D Video Camera 128 × 128 pixel APD (avalanche photodiode) array; 30 Hz; 1570 nm eye-safe laser. 3° field of view (actual full FOV = 3° × 3°): range up to 1100 meters. 9° field of view (actual full FOV = 8.6° × 8.6°): range up to 450 meters. 45° × 22° field of view: range up to 150 meters. 45° field of view: range up to 60 meters.

DragonEye 3D Flash LIDAR Space Camera 128 × 128 pixels; 10 FPS; range and intensity video. 45° × 45° field of view (17 mm). Range up to 1.5 km (greater depending on diffuser/lens choice). Tested and used by NASA.

Use of Lidar Technology for Planetary Exploration

Landing on Moon and Mars NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT). Lidar sensors: 3-D imaging flash Lidar, Doppler Lidar, and laser altimeter. Five sensor functions: altimetry, velocimetry, Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA) and Hazard Relative Navigation (HRN). The Inertial Measurement Units (IMU) suffer from drastic drift over the travel time from the Earth. The IMU drift error can be over 1 km for a Moon-bound vehicle and over 10 km for Mars.

Conclusions and Discussion Depth sensors are based on structured light combined with CCD/CMOS technology. Depth is measured either by triangulation or by travel time. Projected-light sensors (Kinect) are more accurate in the range 1-4 meters than time-of-flight sensors. Several modulated-light time-of-flight cameras can work together and/or in combination with other color cameras; this opens the door to a wide range of devices that combine (low-resolution) depth cameras with (high-resolution) color cameras. Pulsed-light time-of-flight (3D flash Lidar) cameras can work outdoors, on the Moon, Mars, etc. They are expensive! At INRIA in Montbonnot we investigate mixed camera systems (contact me for a short visit).

A Course on 3D Sensors (3DS) http://perception.inrialpes.fr/people/horaud/courses/3DS_2013.html