Computer Vision: Lecture 3


Carl Olsson, 2013-01-28

Today's Lecture

- Camera calibration: the inner parameters K
- Projective vs. Euclidean reconstruction
- Finding the camera matrix: DLT (Direct Linear Transformation)
- Normalization of uncalibrated cameras
- Radial distortion

The Inner Parameters - K

The matrix K is the upper triangular matrix

$$K = \begin{pmatrix} \gamma f & sf & x_0 \\ 0 & f & y_0 \\ 0 & 0 & 1 \end{pmatrix}$$

where f is the focal length, γ the aspect ratio, s the skew, and (x_0, y_0) the principal point.

The Inner Parameters - K

The focal length f:

$$\begin{pmatrix} fx \\ fy \\ 1 \end{pmatrix} = \begin{pmatrix} f & 0 & 0 \\ 0 & f & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$

Rescales the image (e.g. meters → pixels).

The Inner Parameters - K

The principal point (x_0, y_0):

$$\begin{pmatrix} fx + x_0 \\ fy + y_0 \\ 1 \end{pmatrix} = \begin{pmatrix} f & 0 & x_0 \\ 0 & f & y_0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$

Recenters the image; typically it moves the point (0, 0, 1) to the middle of the image.

[Figure: a 640 × 480 image with the recentered principal point.]

The Inner Parameters - K

Aspect ratio γ:

$$\begin{pmatrix} \gamma f x + x_0 \\ f y + y_0 \\ 1 \end{pmatrix} = \begin{pmatrix} \gamma f & 0 & x_0 \\ 0 & f & y_0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$

Pixels are not always square but can be rectangular. In such cases the scaling in the x-direction should differ from that in the y-direction.

Skew s:

$$\begin{pmatrix} \gamma f & sf & x_0 \\ 0 & f & y_0 \\ 0 & 0 & 1 \end{pmatrix}$$

Corrects for tilted pixels. Typically zero.
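As a concrete illustration, here is a small numpy sketch that assembles K from the four effects above and applies it to a point (all parameter values are made up for the example):

```python
import numpy as np

# Hypothetical inner parameters, chosen only for illustration.
f, gamma, s = 1000.0, 1.05, 0.0   # focal length, aspect ratio, skew
x0, y0 = 320.0, 240.0             # principal point

# K collects all four effects in one upper triangular matrix.
K = np.array([[gamma * f, s * f, x0],
              [0.0,       f,     y0],
              [0.0,       0.0,   1.0]])

# Applying K to a homogeneous image point (x, y, 1): the point is
# scaled by f (and gamma in the x-direction) and shifted by (x0, y0).
x = np.array([0.1, -0.2, 1.0])
print(K @ x)  # [425.  40.   1.]
```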

Projective vs. Euclidean Reconstruction

Calibrated cameras: a camera P = K[R t] whose inner parameters K are known is called calibrated. If we change coordinates in the image using $\tilde{x} = K^{-1} x$, we get a so-called normalized (calibrated) camera:

$$\tilde{x} = K^{-1} K [R \; t] X = [R \; t] X.$$
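A minimal sketch of this normalization step, assuming the K from the previous sketch and a couple of hypothetical pixel coordinates; solving the system with K is numerically nicer than forming its inverse:

```python
import numpy as np

# Measured pixel coordinates in homogeneous form, one point per column
# (arbitrary example values).
x_pix = np.array([[420.0, 100.0],
                  [ 30.0, 500.0],
                  [  1.0,   1.0]])

# Normalized coordinates K^{-1} x, computed without forming K^{-1}.
x_norm = np.linalg.solve(K, x_pix)
# In these coordinates the camera is simply [R t].
```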

Projective vs. Euclidean Reconstruction

Projective: the reconstruction is determined only up to a projective transformation. If λx = PX, then for any projective transformation H, with $\tilde{X} = H^{-1} X$, we have

$$\lambda x = P H H^{-1} X = (PH) \tilde{X}.$$

PH is also a valid camera.

Projective vs. Euclidean Reconstruction

Euclidean: the reconstruction is determined up to a similarity transformation. If λx = [R t]X, then for any similarity transformation

$$H = \begin{bmatrix} sQ & v \\ 0 & 1 \end{bmatrix}, \qquad \tilde{X} = H^{-1} X,$$

we have

$$\frac{\lambda}{s}\, x = [R \; t]\, \frac{1}{s} \begin{bmatrix} sQ & v \\ 0 & 1 \end{bmatrix} \tilde{X} = \begin{bmatrix} RQ & \dfrac{Rv + t}{s} \end{bmatrix} \tilde{X}.$$

Since RQ is a rotation, this is a normalized camera.
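This identity is easy to check numerically. The sketch below uses scipy's Rotation class to draw random rotations and verifies that [R t]H equals s times a normalized camera (all values are arbitrary test data):

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Arbitrary normalized camera [R t] and similarity H = [sQ v; 0 1].
R = Rotation.random(random_state=0).as_matrix()
Q = Rotation.random(random_state=1).as_matrix()
t = np.array([0.3, -0.1, 2.0])
v = np.array([1.0, 2.0, 3.0])
s = 0.5

P = np.hstack([R, t[:, None]])
H = np.vstack([np.hstack([s * Q, v[:, None]]),
               np.array([[0.0, 0.0, 0.0, 1.0]])])

# [R t] H = s [RQ  (Rv + t)/s]: a rotation part times a global scale.
np.testing.assert_allclose(
    P @ H, s * np.hstack([R @ Q, ((R @ v + t) / s)[:, None]]))
```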

Projective vs. Euclidean Reconstruction

[Figure: projective and Euclidean reconstructions of the Arc de Triomphe, Paris.]

The two reconstructions have exactly the same reprojection error, but the projective coordinate system makes things look strange.

Finding K

See lecture notes.

RQ-factorization

Theorem: If A is an n × n matrix, then there is an orthogonal matrix Q and an upper (right) triangular matrix R such that A = RQ. (If A is invertible and the diagonal elements of R are chosen to be positive, then the factorization is unique.)

Note: in our case we will use K for the triangular matrix and R for the rotation.
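The slides defer the computation to the lecture notes, but one practical route is scipy.linalg.rq plus a sign correction so the diagonal of the triangular factor is positive. The helper below is a sketch under those assumptions, not the course's reference code; it factors the left 3 × 3 block A of a camera P = K[R t]:

```python
import numpy as np
from scipy.linalg import rq

def camera_kr(A):
    """Split the left 3x3 block A of a camera P = K[R t] into K and R."""
    K, R = rq(A)                      # A = K R, K upper triangular, R orthogonal
    D = np.diag(np.sign(np.diag(K)))  # D = D^{-1}, so K R = (K D)(D R)
    K, R = K @ D, D @ R               # now diag(K) > 0
    if np.linalg.det(R) < 0:
        R = -R                        # cameras are only defined up to scale,
                                      # so -A = K(-R) is equally valid and R
                                      # becomes a proper rotation
    return K / K[2, 2], R             # normalize so K[2, 2] = 1
```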


Direct Linear Transformation - DLT

Finding the camera matrix: use images of a known object to eliminate the projective ambiguity. If the X_i are 3D points of a known object and the x_i the corresponding projections, we have

$$\lambda_i x_i = P X_i, \qquad i = 1, \ldots, N.$$

There are 3N equations and 11 + N unknowns, so we need 3N ≥ 11 + N, that is N ≥ 6 points, to solve the problem.

Direct Linear Transformation - DLT

Matrix formulation: write

$$P = \begin{bmatrix} p_1^T \\ p_2^T \\ p_3^T \end{bmatrix},$$

where the $p_i^T$ are the rows of P. The first equality $\lambda_1 x_1 = P X_1$ is

$$X_1^T p_1 - \lambda_1 x_1 = 0, \qquad X_1^T p_2 - \lambda_1 y_1 = 0, \qquad X_1^T p_3 - \lambda_1 = 0,$$

where $x_1 = (x_1, y_1, 1)$. In matrix form:

$$\begin{bmatrix} X_1^T & 0 & 0 & -x_1 \\ 0 & X_1^T & 0 & -y_1 \\ 0 & 0 & X_1^T & -1 \end{bmatrix} \begin{bmatrix} p_1 \\ p_2 \\ p_3 \\ \lambda_1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}.$$

Direct Linear Transformation - DLT

More equations:

$$\underbrace{\begin{bmatrix}
X_1^T & 0 & 0 & -x_1 & 0 & 0 & \cdots \\
0 & X_1^T & 0 & -y_1 & 0 & 0 & \cdots \\
0 & 0 & X_1^T & -1 & 0 & 0 & \cdots \\
X_2^T & 0 & 0 & 0 & -x_2 & 0 & \cdots \\
0 & X_2^T & 0 & 0 & -y_2 & 0 & \cdots \\
0 & 0 & X_2^T & 0 & -1 & 0 & \cdots \\
X_3^T & 0 & 0 & 0 & 0 & -x_3 & \cdots \\
0 & X_3^T & 0 & 0 & 0 & -y_3 & \cdots \\
0 & 0 & X_3^T & 0 & 0 & -1 & \cdots \\
\vdots & & & & & & \ddots
\end{bmatrix}}_{=M}
\underbrace{\begin{bmatrix} p_1 \\ p_2 \\ p_3 \\ \lambda_1 \\ \lambda_2 \\ \lambda_3 \\ \vdots \end{bmatrix}}_{=v} = 0$$
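For concreteness, here is a sketch of how M could be assembled in numpy, following the block structure above (the function name and argument conventions are mine, not from the slides):

```python
import numpy as np

def dlt_matrix(X, x):
    """Build the DLT coefficient matrix M from 3D points X (4 x N,
    homogeneous) and image points x (3 x N, homogeneous)."""
    N = X.shape[1]
    M = np.zeros((3 * N, 12 + N))     # unknowns: p1, p2, p3 and N depths
    for i in range(N):
        M[3 * i,     0:4]  = X[:, i]  # X_i^T p_1 - lambda_i x_i = 0
        M[3 * i + 1, 4:8]  = X[:, i]  # X_i^T p_2 - lambda_i y_i = 0
        M[3 * i + 2, 8:12] = X[:, i]  # X_i^T p_3 - lambda_i     = 0
        M[3 * i:3 * i + 3, 12 + i] = -x[:, i]
    return M
```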

Direct Linear Transformation - DLT

Homogeneous least squares: see lecture notes.

Singular Value Decomposition

Theorem: Every m × n matrix M (with real coefficients) can be factorized as $M = U S V^T$, where U and V are orthogonal (m × m and n × n respectively),

$$S = \begin{bmatrix} \mathrm{diag}(\sigma_1, \sigma_2, \ldots, \sigma_r) & 0 \\ 0 & 0 \end{bmatrix}, \qquad \sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_r > 0,$$

and r is the rank of the matrix.
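The SVD gives the standard recipe for the homogeneous least squares problem the slides defer to the lecture notes: minimize ||Mv|| over ||v|| = 1 by taking the right singular vector belonging to the smallest singular value. A sketch:

```python
import numpy as np

def solve_homogeneous(M):
    """argmin ||Mv|| subject to ||v|| = 1: the right singular vector of
    the smallest singular value (last row of Vt in numpy's convention)."""
    _, _, Vt = np.linalg.svd(M)
    return Vt[-1]

# Typical use with the DLT matrix from the previous sketch:
# v = solve_homogeneous(dlt_matrix(X, x))
# P = v[:12].reshape(3, 4)   # the first twelve entries are p1, p2, p3
```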


Direct Linear Transformation - DLT

Improving the numerics (normalization of uncalibrated cameras): the matrix M contains the entries x_i, y_i and ones. Since x_i and y_i can be on the order of a thousand, the numerics are often greatly improved by translating the coordinates so that their center of mass is zero and then rescaling them to be roughly 1. Change coordinates according to

$$\tilde{x} = \begin{bmatrix} s & 0 & -s\bar{x} \\ 0 & s & -s\bar{y} \\ 0 & 0 & 1 \end{bmatrix} x.$$

Solve the homogeneous linear least squares system and transform back to the original coordinate system. Similar transformations of the 3D points X_i may also improve the results.
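A sketch of one way to build this normalization; choosing s as the reciprocal standard deviation of the coordinates is a common convention, not something the slide prescribes:

```python
import numpy as np

def normalization_matrix(x):
    """Translation and scaling for image points x (3 x N, homogeneous):
    center of mass goes to zero, coordinates become roughly of size 1."""
    xbar, ybar = x[:2].mean(axis=1)
    s = 1.0 / x[:2].std()             # one common choice of scale
    return np.array([[s,   0.0, -s * xbar],
                     [0.0, s,   -s * ybar],
                     [0.0, 0.0, 1.0]])

# Solve the DLT with T @ x instead of x, then undo the change of
# coordinates: if P_tilde is the solution, P = inv(T) @ P_tilde.
```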

Pose estimation using DLT

3D points measured using a scanning arm.

Pose estimation using DLT

14 points used for computing the camera matrix.


Texturing the chair

Project the rest of the points into the image.
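Once P is known, projecting the remaining points is a one-liner; a small helper (names are mine):

```python
import numpy as np

def project(P, X):
    """Project homogeneous 3D points X (4 x N) with the camera P (3 x 4)
    and divide by the third coordinate to get image coordinates."""
    x = P @ X
    return x[:2] / x[2]
```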

Texturing the chair

Form triangles. Use the texture from the image.

Textured chair

Radial Distortion

Radial distortion is not modeled by the K-matrix. It cannot be removed by a projective mapping, since lines are not mapped onto lines (see Szeliski).
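The slides do not give a distortion model, but a common choice (see e.g. Szeliski) is the polynomial model $x_d = (1 + k_1 r^2 + k_2 r^4)\, x_u$ applied to normalized image coordinates; a sketch:

```python
import numpy as np

def apply_radial(x, k1, k2):
    """Distort normalized image points x (2 x N) with the polynomial
    model x_d = (1 + k1 r^2 + k2 r^4) x. Undistorting inverts this,
    e.g. by fixed-point iteration."""
    r2 = (x ** 2).sum(axis=0)
    return x * (1.0 + k1 * r2 + k2 * r2 ** 2)
```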

To do

Start working on assignment 2. Exercises: 1, 2, 3, 4, 5, 6 and 7. Computer exercises: 1, 2 and 3.