Module 6: Pinhole camera model
Lecture 30: Intrinsic camera parameters, Perspective projection using homogeneous coordinates




The Lecture Contains:

Pinhole camera model
6.1 Intrinsic camera parameters
A. Perspective projection using homogeneous coordinates
B. Principal-point offset
C. Image-sensor characteristics

Pinhole camera model

In this section, we describe the widely used image-acquisition model known as the pinhole camera model. More specifically, we first discuss the model that integrates the internal or intrinsic camera parameters, such as the focal length and the lens distortion. Second, we extend this simple camera model to integrate the external or extrinsic camera parameters, which correspond to the position and orientation of the camera.

6.1 Intrinsic camera parameters

The pinhole camera model defines the geometric relationship between a 3D point and its corresponding 2D projection onto the image plane. When using a pinhole camera model, this geometric mapping from 3D to 2D is called a perspective projection. We denote the center of the perspective projection (the point at which all rays intersect) as the optical center or camera center, and the line perpendicular to the image plane passing through the optical center as the optical axis (see Figure 6.2). Additionally, the intersection point of the image plane with the optical axis is called the principal point. The pinhole camera that models a perspective projection of 3D points onto the image plane can be described as follows.

A. Perspective projection using homogeneous coordinates

Let us consider a camera with the optical axis being collinear to the Z-axis and the optical center located at the origin of a 3D coordinate system (see Figure 6.1).

Figure 6.1 The ideal pinhole camera model describes the relationship between a 3D point and its corresponding 2D projection.

The projection of a 3D world point $(X, Y, Z)$ onto the image plane at pixel position $(x, y)$ can be written as

$$x = f\,\frac{X}{Z}, \qquad y = f\,\frac{Y}{Z} \qquad (6.3)$$

where $f$ denotes the focal length. To avoid such a non-linear division operation, the previous relation can be reformulated using the projective geometry framework as

$$(x, y, 1)^T \sim (fX,\; fY,\; Z)^T. \qquad (6.4)$$

This relation can then be expressed in matrix notation by

$$\lambda \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \qquad (6.5)$$

where $\lambda = Z$ is the homogeneous scaling factor.
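As an illustration (not part of the original lecture), the following minimal NumPy sketch applies the projection matrix of Equation (6.5) to a 3D point; the focal length and point coordinates are arbitrary assumed values.

```python
import numpy as np

# Focal length in pixels (assumed value for illustration).
f = 800.0

# 3x4 perspective-projection matrix of Equation (6.5).
P = np.array([[f,   0.0, 0.0, 0.0],
              [0.0, f,   0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])

# A 3D world point in homogeneous coordinates (X, Y, Z, 1).
X_h = np.array([0.2, -0.1, 4.0, 1.0])

# Linear projection followed by dehomogenization (division by lambda = Z).
x_h = P @ X_h
x, y = x_h[:2] / x_h[2]
print(x, y)   # equals (f*X/Z, f*Y/Z) = (40.0, -20.0)
```

The division by the third homogeneous component is exactly the non-linear operation of Equation (6.3); the homogeneous formulation simply postpones it until after a purely linear mapping.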

B. Principal-point offset

Most current imaging systems define the origin of the pixel coordinate system at the top-left pixel of the image. However, it was previously assumed that the origin of the pixel coordinate system corresponds to the principal point, located at the center of the image (see Figure 6.2(a)). A conversion of coordinate systems is thus necessary. Using homogeneous coordinates, the principal-point position $(o_x, o_y)$ can be readily integrated into the projection matrix. The perspective projection equation now becomes

$$\lambda \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & o_x & 0 \\ 0 & f & o_y & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}. \qquad (6.6)$$

C. Image-sensor characteristics

To derive the relation described by Equation (6.6), it was implicitly assumed that the pixels of the image sensor are square, i.e., the aspect ratio is 1:1 and the pixels are not skewed. However, both assumptions may not always be valid. First, for example, the NTSC TV system defines non-square pixels with an aspect ratio of 10:11. In practice, the pixel aspect ratio is often provided by the image-sensor manufacturer. Second, pixels can potentially be skewed, especially in the case that the image is acquired by a frame grabber.
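Continuing the earlier sketch (again with assumed, illustrative values: a 640x480 sensor with the principal point at its center), the offset of Equation (6.6) simply shifts the projection into top-left-origin pixel coordinates:

```python
import numpy as np

f = 800.0
ox, oy = 320.0, 240.0   # assumed principal point of a 640x480 sensor

# Projection matrix of Equation (6.6) including the principal-point offset.
P = np.array([[f,   0.0, ox,  0.0],
              [0.0, f,   oy,  0.0],
              [0.0, 0.0, 1.0, 0.0]])

X_h = np.array([0.2, -0.1, 4.0, 1.0])   # homogeneous 3D point
x_h = P @ X_h
x, y = x_h[:2] / x_h[2]
print(x, y)   # (f*X/Z + ox, f*Y/Z + oy) = (360.0, 220.0)
```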

In this particular case, the pixel grid may be skewed due to an inaccurate synchronization of the pixel-sampling process. Both previously mentioned imperfections of the imaging system can be taken into account in the camera model using the parameters $\eta$ and $\tau$, which model the pixel aspect ratio and the skew of the pixels, respectively (see Figure 6.2(b)). The projection mapping can now be updated as

$$\lambda \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & \tau & o_x & 0 \\ 0 & \eta f & o_y & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} = \begin{bmatrix} K & \mathbf{0}_3 \end{bmatrix} \mathbf{P}_h \qquad (6.7)$$

with $\mathbf{P}_h = (X, Y, Z, 1)^T$ being a 3D point defined with homogeneous coordinates. In practice, when employing recent digital cameras, it can be safely assumed that pixels are square and non-skewed. The $3 \times 3$ matrix that incorporates the intrinsic parameters is denoted as $K$. The all-zero element vector is denoted by $\mathbf{0}_3$.

Figure 6.2 (a) The image and camera coordinate systems. (b) Non-ideal image sensor with non-square, skewed pixels.
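To tie the pieces together, here is a short sketch (the parameter values are placeholders, not taken from the lecture) that assembles the intrinsic matrix $K$ of Equation (6.7) and applies the full mapping $[K \;|\; \mathbf{0}_3]$:

```python
import numpy as np

def intrinsic_matrix(f, eta, tau, ox, oy):
    """Build the 3x3 intrinsic matrix K with focal length f, pixel aspect
    ratio eta, skew tau, and principal point (ox, oy)."""
    return np.array([[f,   tau,     ox],
                     [0.0, eta * f, oy],
                     [0.0, 0.0,     1.0]])

# Illustrative values; for a recent digital camera one typically sets
# eta = 1 (square pixels) and tau = 0 (no skew), as noted above.
K = intrinsic_matrix(f=800.0, eta=1.0, tau=0.0, ox=320.0, oy=240.0)

# Full projection matrix [K | 0_3] as in Equation (6.7).
P = np.hstack([K, np.zeros((3, 1))])

X_h = np.array([0.2, -0.1, 4.0, 1.0])   # homogeneous 3D point
x_h = P @ X_h
print(x_h[:2] / x_h[2])                 # pixel coordinates (360.0, 220.0)
```

With $\eta = 1$ and $\tau = 0$ this reduces to the matrix of Equation (6.6), which is why the last line reproduces the same pixel coordinates as the previous sketch.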