Geometric and Radiometric Camera Calibration


Geometric and Radiometric Camera Calibration

Shape From Stereo requires geometric knowledge of:
- Camera extrinsic parameters, i.e. the geometric relationship between the two cameras.
- Camera intrinsic parameters, i.e. each camera's internal geometry (e.g. focal length) and lens distortion effects.

Shape From Shading requires radiometric knowledge of:
- Camera detector uniformity (e.g. flat-field images)
- Camera detector temperature noise (e.g. dark frame images)
- Camera detector bad pixels
- Camera Digital Number (DN) to radiance transfer function

Camera Radiometric Calibration - 1

All cameras require radiometric calibration, which is essential for the correct interpretation of science image data. Basic radiometric calibration requires three types of image exposure:

Dark Frame - Typically a long exposure with no light reaching the camera detector (e.g. lens cap on). This is required to remove extraneous detector noise, and is very easy to perform for terrestrial applications. However, dark frames must be captured at the temperatures that the detector will experience when capturing images.

[Figure: example dark frame showing hot pixels]
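The dark frame procedure above can be sketched in a few lines of numpy. This is a minimal illustration, not the course's actual pipeline: it assumes several dark exposures (taken at the science images' detector temperature and exposure time) are averaged into a master dark, which is then subtracted from a raw frame.

```python
import numpy as np

def master_dark(dark_frames):
    """Average a stack of dark frames (list of 2D arrays) into one master dark."""
    return np.mean(np.stack(dark_frames), axis=0)

def subtract_dark(raw_image, dark):
    """Dark-correct a raw image, clipping negative values to zero."""
    corrected = raw_image.astype(np.float64) - dark
    return np.clip(corrected, 0.0, None)

# Tiny synthetic demonstration (2x2 frames with made-up values)
darks = [np.array([[10.0, 12.0], [11.0, 9.0]]),
         np.array([[12.0, 10.0], [9.0, 11.0]])]
raw = np.array([[110.0, 120.0], [105.0, 95.0]])
print(subtract_dark(raw, master_dark(darks)))
```

Averaging several dark frames suppresses the random component of the detector noise while preserving the systematic hot-pixel pattern to be subtracted.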

Camera Radiometric Calibration - 2

Bias Frame - A zero-length detector exposure. This is required to compensate for differing pixel start-up values (due to the bias offset applied to the A-D converter). Bias frames should also be captured at the temperatures that the detector will experience, and the camera software must permit this type of exposure. However, bias frames are not needed if the dark frames match the exposure time of the images from which they are to be subtracted, since dark frames already contain the bias information.

Camera Radiometric Calibration - 3

Flat Field - An exposure of a uniform white surface. This is required to remove artefacts from 2D images caused by variations in the pixel-to-pixel sensitivity of the detector and/or by distortions in the optical path (e.g. dust on the lens, vignetting, unequal pixel light sensitivity). Flat fields are harder to set up: the main problem is obtaining a white surface that is truly uniform, and an integrating sphere is typically used to overcome this. Ideally, a flat field image should be dark frame corrected to create a calibrated flat field image.

Camera Radiometric Calibration - 4

[Figure: a raw image; a calibrated flat field image derived from the average of 4 captured flat field images (minus dark frame); and the resulting flat field corrected image.]

The flat field correction is applied per pixel:

corrected_pixel(x,y) = (raw_pixel(x,y) − dark_pixel(x,y)) × mean_of_calibrated_flat_field_image / calibrated_flat_field_pixel(x,y)
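The per-pixel flat field correction can be sketched in numpy as follows. This is a minimal sketch, assuming raw, dark, and flat frames are 2D arrays of equal shape; the calibrated flat field is the average of the dark-subtracted flat field exposures, and dividing by it (after scaling by its mean, so the overall image level is preserved) removes the pixel-to-pixel sensitivity pattern.

```python
import numpy as np

def flat_field_correct(raw, dark, flats):
    """Apply corrected = (raw - dark) * mean(calibrated_flat) / calibrated_flat,
    where calibrated_flat is the average of the dark-subtracted flat exposures."""
    calibrated_flat = np.mean([f - dark for f in flats], axis=0)
    return (raw - dark) * calibrated_flat.mean() / calibrated_flat

# Synthetic demonstration: a flat field twice as sensitive in the right column.
dark = np.zeros((2, 2))
flats = [np.array([[1.0, 2.0], [1.0, 2.0]]),
         np.array([[1.0, 2.0], [1.0, 2.0]])]
raw = np.array([[2.0, 4.0], [2.0, 4.0]])  # uniform scene seen through that flat
print(flat_field_correct(raw, dark, flats))  # uniform output
```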

Camera Radiometric Calibration - 5

Cameras generate a Digital Number (DN) for each pixel. The camera detector (e.g. a CCD) measures a voltage for each pixel that represents the amount of light the pixel has received. This voltage is converted to a DN using an analogue-to-digital converter (ADC). For a commercial off-the-shelf (COTS) 8-bit camera, the pixel DN range is 0 to 255. For applications such as shape from shading, the DN value for each pixel needs to be converted to a physical quantity called radiance. This is a radiometric quantity, and is useful because it indicates how much of the (light) power emitted by an emitting or reflecting surface will be received by an optical system viewing the surface from some angle of view. The units of radiance are W/m²/sr; the steradian (sr) is the 3D analogue of the 2D radian. If filters are used on the camera, for example, then spectral radiance is used, with units W/m²/sr/nm. A transfer function from DN to radiance is required, which may or may not be linear.
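As a hedged sketch of the conversion step: assuming the laboratory calibration yielded a *linear* transfer function with a known gain and offset (the values below are illustrative, not real camera data; a nonlinear camera would instead need a lookup table or polynomial fit), converting a DN image to radiance is a single affine map.

```python
import numpy as np

def dn_to_radiance(dn, gain, offset):
    """Convert pixel Digital Numbers to radiance (W/m^2/sr) via a linear
    transfer function: radiance = gain * DN + offset."""
    return gain * np.asarray(dn, dtype=float) + offset

image_dn = np.array([[0, 128], [255, 64]])     # 8-bit DN values
radiance = dn_to_radiance(image_dn, gain=0.5, offset=1.0)
print(radiance)
```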

Camera Radiometric Calibration - 6

The DN to radiance transfer function can be obtained via laboratory measurements; one method is shown below.

[Figure: laboratory setup. The spectral radiance at the source aperture is measured using a spectrometer; the spectral irradiance (W/m²) and spectral radiance at the receiving aperture are computed from the radii of the source and receiving apertures and the nominal distance between them. Example camera images and an example transfer function are shown.]

Shape From Shading - 1

The Shape From Shading (SFS) problem is how to compute the 3D shape (e.g. a height-map) of a surface from a single black and white image of that surface, i.e. an image that shows the brightness (radiance) of the surface under known illumination conditions.

Diagram courtesy E. Prados and O. Faugeras

Shape From Shading - 2

In the 1970s, Horn [1] was the first to formulate the Shape From Shading problem, and to realise that it required solving a nonlinear first-order Partial Differential Equation (PDE) referred to as the brightness equation:

I(x1, x2) = R(n(x1, x2))

where (x1, x2) are the coordinates of a point x in the image, R is the reflectance map, I is the brightness image, and n is the surface normal vector at the point x. Many SFS methods assume that the surface has Lambertian reflectance properties.

[1] Horn, B.K.P., Shape from Shading: A Method for Obtaining the Shape of a Smooth Opaque Object from One View, PhD thesis, Department of Electrical Engineering, MIT, 1970.

Shape From Shading - 3

For a Lambertian surface, the reflectance map R is the cosine of the angle between the light vector L(x) and the surface normal vector n(x) (Lambert's Law):

R = cos(L, n) = (L · n) / (|L| |n|)

where L · n denotes the vector dot product. The apparent brightness of a Lambertian surface is the same regardless of the observer's angle of view; such a surface is an ideal diffuse reflector. Note: most real surfaces are not Lambertian (see BRDF link).
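Lambert's Law as used in the reflectance map can be written as a small function; this is a sketch of the formula above, with the cosine clamped at zero for surfaces facing away from the light (which receive no direct illumination).

```python
import numpy as np

def lambertian_reflectance(light, normal):
    """R = cos(angle between light vector and surface normal),
    clamped to zero for surfaces facing away from the light."""
    light = np.asarray(light, dtype=float)
    normal = np.asarray(normal, dtype=float)
    cos_angle = np.dot(light, normal) / (np.linalg.norm(light) * np.linalg.norm(normal))
    return max(cos_angle, 0.0)

# A surface normal aligned with the light gives maximum brightness:
print(lambertian_reflectance([0, 0, 1], [0, 0, 1]))  # 1.0
```

Note that brightness depends only on the light and normal directions, not on the observer vector, which is exactly the view-independence property stated above.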

Shape From Shading - 4

[Figure: a Lambertian surface with two slopes, A and B, illuminated by the Sun and viewed by an observer. The angle θ1 between the light vector and the surface normal on Slope A is larger than the angle θ2 on Slope B, so cos(θ1) < cos(θ2): Slope A is steeper than Slope B and appears darker. Recall cos(0°) = 1 and cos(90°) = 0.]

Shape From Shading - 5

Many algorithms have been developed to solve the Shape From Shading problem; see: Zhang et al., Shape from Shading: A Survey, IEEE Trans. Pattern Analysis and Machine Intelligence, 21(8), 690-706, 1999.

A recent (AU) solution, called the Large Deformation Optimisation Shape From Shading (LDO-SFS) algorithm, shows good results with Mars HRSC images from the Mars Express orbiter; see: R. O'Hara and D. Barnes, A new shape from shading technique with application to Mars Express HRSC images, ISPRS Journal of Photogrammetry and Remote Sensing, 67, 27-34, 2012.

LDO-SFS can use different surface reflectance models, e.g. Lambertian or Oren-Nayar.

Shape From Shading: LDO-SFS

[Figure: original single Martian surface (2D) image from HiRISE (MRO), alongside ortho-image rendered (3D) DEM views created using shape-from-shading.]

Shape From Shading: LDO-SFS

The left image is the single 2D HRSC (H1022) image used as the input to the AU SFS algorithm. The right image is the 3D DEM data generated by the SFS algorithm. The DEM has been rendered with reversed lighting (compared to the left input image) to demonstrate the 3D nature of the data. Note that the 3D DEM has not been rendered with the H1022 ortho-image.

Shape From Shading: LDO-SFS DEM Visualisation and Slope Maps

The left image is a topographic colour-coded image of the SFS-generated DEM: the white areas are the highest regions, and the dark blue areas are the lowest. The right image shows a colour-coded slope map of the SFS DEM data. Green: 0° to < 10°, Blue: 10° to < 20°, Red: ≥ 20°.
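A slope map like the one described can be derived from a DEM with finite-difference gradients. This is an illustrative sketch, not the course's actual tool: it assumes a regular grid with a known cell size, and uses the slide's thresholds to colour-code each cell.

```python
import numpy as np

def slope_map(dem, cell_size):
    """Slope in degrees at each DEM cell, from central-difference gradients."""
    dz_dy, dz_dx = np.gradient(dem.astype(float), cell_size)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

def classify_slopes(slopes):
    """Colour classes: 0 = green (< 10 deg), 1 = blue (10 to < 20 deg),
    2 = red (>= 20 deg)."""
    return np.digitize(slopes, [10.0, 20.0])

# A ramp rising 1 unit per 1-unit cell has a 45 degree slope everywhere:
ramp = np.tile(np.arange(4.0), (4, 1))
print(classify_slopes(slope_map(ramp, 1.0)))  # all red (class 2)
```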

Shape From Shading: LDO-SFS

[Figure: single input image (left) and SFS output image (right). NOTE - SFS now with perspective projection.]

Shape From Shading: LDO-SFS

[Figure: single input image (left) and SFS output image (right). NOTE - SFS now with perspective projection.]

Stereo Vision (SV) versus Shape From Shading (SFS)

- Both SV and SFS require accurate and precise calibration. SFS additionally requires accurate and precise knowledge of the lighting and observer vectors relative to the scene surface.
- SV requires two images; SFS requires only one image.
- SV provides absolute scene scale and dimensions; SFS has no concept of absolute scene scale and dimensions.
- SV accuracy falls off with distance (remember disparity D ∝ 1/d); SFS accuracy is not dependent on scene distance.
- SV works well when texture is present for the disparity algorithm, e.g. good on rocks, but poor on sand dunes; SFS does not require texture, but does require that the surface reflectance assumptions model reality (e.g. Oren-Nayar etc.).
- SV is good at modelling low-frequency scene structure, whereas SFS is good at modelling high-frequency scene structure.

Solution: combine the strengths of both methods. For an example see: Cryer, J.E., et al., Integration of Shape From Shading and Stereo, Pattern Recognition, 28(7), 1033-1043, 1995.