Orthogonal Projections and Reflections (with exercises)
by D. Klain
Version .. Corrections and comments are welcome!

Orthogonal Projections

Let X_1, ..., X_k be a family of linearly independent (column) vectors in R^n, and let

    W = Span(X_1, ..., X_k).

In other words, the vectors X_1, ..., X_k form a basis for the k-dimensional subspace W of R^n.

Suppose we are given another vector Y in R^n. How can we project Y onto W orthogonally? In other words, can we find a vector Yhat in W so that Y - Yhat is orthogonal (perpendicular) to all of W? See Figure 1.

To begin, translate this question into the language of matrices and dot products. We need to find a vector Yhat in W such that

    (Y - Yhat) ⊥ Z,  for all vectors Z in W.                        (1)

Actually, it is enough to know that Y - Yhat is perpendicular to the vectors X_1, ..., X_k that span W, since this would imply that (1) holds. (Why?) Expressing this using dot products, we need to find Yhat in W so that

    X_i^T (Y - Yhat) = 0,  for all i = 1, 2, ..., k.                (2)

This condition involves taking k dot products, one for each X_i. We can do them all at once by setting up a matrix A using the X_i as the columns of A; that is, let

    A = [ X_1  X_2  ...  X_k ].

Note that each vector X_i in R^n has n coordinates, so that A is an n × k matrix. The set of conditions listed in (2) can now be rewritten:

    A^T (Y - Yhat) = 0,  which is equivalent to  A^T Y = A^T Yhat.  (3)

Figure 1: Projection of a vector onto a subspace.

Meanwhile, we need the projected vector Yhat to be a vector in W, since we are projecting onto W. This means that Yhat lies in the span of the vectors X_1, ..., X_k. In other words,

    Yhat = c_1 X_1 + c_2 X_2 + ... + c_k X_k = A (c_1, c_2, ..., c_k)^T = AC,

where C is a k-dimensional column vector. On combining this with the matrix equation (3) we have

    A^T Y = A^T A C.

If we knew what C was, then we would also know Yhat, since we were given the columns X_i of A, and Yhat = AC. To solve for C, just invert the k × k matrix A^T A to get

    (A^T A)^{-1} A^T Y = C.                                         (4)

How do we know that (A^T A)^{-1} exists? Let's assume it does for now, and address this question later on.

Now, finally, we can find our projected vector Yhat. Since Yhat = AC, multiply both sides of (4) by A to obtain

    A (A^T A)^{-1} A^T Y = AC = Yhat.

The matrix Q = A (A^T A)^{-1} A^T is called the projection matrix for the subspace W. According to our derivation above, the projection matrix Q maps a vector Y in R^n to its orthogonal projection (i.e. its shadow) QY = Yhat in the subspace W.

It is easy to check that Q has the following nice properties:

1. Q^T = Q.
2. Q^2 = Q.

One can show that any matrix satisfying these two properties is in fact a projection matrix for its own column space. You can prove this using the hints given in the exercises.

There remains one problem. At a crucial step in the derivation above we took the inverse of the k × k matrix A^T A. But how do we know this matrix is invertible? It is invertible because the columns of A, the vectors X_1, ..., X_k, were assumed to be linearly independent. But this claim of invertibility needs a proof.

Lemma 1. Suppose A is an n × k matrix, where k ≤ n, such that the columns of A are linearly independent. Then the k × k matrix A^T A is invertible.

Proof of Lemma 1: Suppose that A^T A is not invertible. In this case, there exists a nonzero vector X such that A^T A X = 0. It then follows that

    (AX) · (AX) = (AX)^T AX = X^T A^T A X = X^T 0 = 0,

so that the length |AX| = sqrt((AX) · (AX)) = 0. In other words, the length of AX is zero, so that AX = 0. Since X ≠ 0, this implies that the columns of A are linearly dependent. Therefore, if the columns of A are linearly independent, then A^T A must be invertible.

Example: Compute the projection matrix Q for the 2-dimensional subspace W of R^4 spanned by the vectors (,,, ) and (,,, ). What is the orthogonal projection of the vector (,,, ) onto W?

Solution: Let A be the 4 × 2 matrix whose columns are the two spanning vectors. Computing A^T A and inverting it, the projection matrix Q is given by

    Q = A (A^T A)^{-1} A^T,

and the orthogonal projection of the given vector onto W is obtained by multiplying that vector by Q.
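These formulas are easy to experiment with numerically. The following is a minimal sketch in Python with NumPy (the library choice and the sample vectors are my own, not from the notes): it builds Q = A (A^T A)^{-1} A^T for a small subspace of R^4 and checks the two properties above, along with the defining condition (2).

    import numpy as np

    # Two linearly independent columns spanning a 2-dimensional subspace W of R^4.
    # (Illustrative values; the worked example in the notes uses different vectors.)
    A = np.column_stack(([1, 1, 0, 1], [0, 1, 1, 0]))

    # Projection matrix Q = A (A^T A)^{-1} A^T onto W = col(A).
    Q = A @ np.linalg.inv(A.T @ A) @ A.T

    assert np.allclose(Q.T, Q)    # property 1: Q is symmetric
    assert np.allclose(Q @ Q, Q)  # property 2: Q is idempotent

    # Project an arbitrary Y; the residual Y - Yhat is orthogonal to every
    # column of A, which is exactly condition (2).
    Y = np.array([1.0, 2.0, 3.0, 4.0])
    Yhat = Q @ Y
    assert np.allclose(A.T @ (Y - Yhat), 0)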

Reflections

We have seen earlier in the course that reflection of space across (i.e., through) a plane is a linear transformation. Like a rotation, a reflection preserves lengths and angles, although, unlike a rotation, a reflection reverses orientation ("handedness").

Once we have projection matrices it is easy to compute the matrix of a reflection. Let W denote a plane passing through the origin, and suppose we want to reflect a vector v across this plane, as in Figure 2. Let u denote a unit vector along the orthogonal complement W^⊥; that is, let u be a unit normal to the plane W. We will think of u and v as column vectors. The projection of v along the line through u is then given by

    vhat = Proj_u(v) = u (u^T u)^{-1} u^T v.

But since we chose u to be a unit vector, u^T u = u · u = 1, so that

    vhat = Proj_u(v) = u u^T v.

Let Q_u denote the matrix u u^T, so that vhat = Q_u v.

What is the reflection of v across W? It is the vector Refl_W(v) that lies on the other side of W from v, exactly the same distance from W as v, and having the same projection into W as v. See Figure 2. The distance between v and its reflection is exactly twice the distance of v to W, and the difference between v and its reflection is perpendicular to W. That is, the difference between v and its reflection is exactly twice the projection of v along the unit normal u to W. This observation yields the equation

    v - Refl_W(v) = 2 Q_u v,

so that

    Refl_W(v) = v - 2 Q_u v = Iv - 2 Q_u v = (I - 2uu^T) v.

The matrix H_W = I - 2uu^T is called the reflection matrix for the plane W, and is also sometimes called a Householder matrix.

Example: Compute the reflection of the vector v = (2, 4, 3) across the plane x - 2y + 7z = 0.

Solution: The vector w = (1, -2, 7) is normal to the plane, and w^T w = 1^2 + (-2)^2 + 7^2 = 54, so a unit normal will be

    u = w / |w| = (1/√54) (1, -2, 7).

The reflection matrix is then given by

    H = I - 2uu^T = I - (2/54) ww^T = I - (1/27) ww^T,

Figure 2: Reflection of the vector v across the plane W.

so that

    H = I - (1/27) [  1   -2    7 ]   [  26/27    2/27   -7/27 ]
                   [ -2    4  -14 ] = [   2/27   23/27   14/27 ]
                   [  7  -14   49 ]   [  -7/27   14/27  -22/27 ].

The reflection of v across W is then given by

    Hv = [  26/27    2/27   -7/27 ] [ 2 ]   [  39/27 ]   [ 13/9 ]
         [   2/27   23/27   14/27 ] [ 4 ] = [ 138/27 ] = [ 46/9 ]
         [  -7/27   14/27  -22/27 ] [ 3 ]   [ -24/27 ]   [ -8/9 ].
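A quick numerical check of this example (a sketch assuming Python with NumPy; the plane, v, and H are the ones reconstructed above):

    import numpy as np

    w = np.array([1.0, -2.0, 7.0])                    # normal to x - 2y + 7z = 0
    H = np.eye(3) - (2.0 / (w @ w)) * np.outer(w, w)  # Householder: I - 2uu^T

    v = np.array([2.0, 4.0, 3.0])
    print(H @ v)                                      # [13/9, 46/9, -8/9]

    assert np.allclose(H @ H, np.eye(3))              # reflecting twice returns v
    assert np.isclose(np.linalg.norm(H @ v), np.linalg.norm(v))  # length preserved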

Reflections and Projections

Notice in Figure 2 that the projection of v into W is the midpoint of the vector v and its reflection Hv = Refl_W(v); that is,

    Qv = (1/2)(v + Hv),  or, equivalently,  Hv = 2Qv - v,

where Q = Q_W denotes the projection onto W. (This is not the same as Q_u in the previous section, which was the projection onto the normal of W.) Recall that v = Iv, where I is the identity matrix. Since these identities hold for all v, we obtain the matrix identities

    Q = (1/2)(I + H)  and  H = 2Q - I.

So once you have computed either the projection matrix or the reflection matrix for a subspace of R^n, the other is quite easy to obtain.
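These identities can also be verified numerically. Here is a short sketch (again assuming NumPy; the plane is the one from the reflection example) comparing Q = (1/2)(I + H) with the direct formula for projection onto the plane, namely I - uu^T, the identity minus the projection onto the normal line:

    import numpy as np

    w = np.array([1.0, -2.0, 7.0])          # normal to the plane W
    u = w / np.linalg.norm(w)               # unit normal
    H = np.eye(3) - 2.0 * np.outer(u, u)    # reflection across W

    Q = 0.5 * (np.eye(3) + H)               # projection onto W via Q = (1/2)(I + H)

    assert np.allclose(Q, np.eye(3) - np.outer(u, u))  # agrees with I - uu^T
    assert np.allclose(2.0 * Q - np.eye(3), H)         # and H = 2Q - I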

Exercises. Suppose that M is an n n matrix such that M T = M = M. Let W denote the column space of M. (a) Suppose that Y W. (This means that Y = MX for some X.) Prove that MY = Y. (b) Suppose that v is a vector in R n. Why is Mv W? (c) If Y W, why is v Mv Y? (d) Conclude that Mv is the projection of v into W.. Compute the projection of the vector v = (,, ) onto the plane x + y z =.. Compute the projection matrix Q for the subspace W of R spanned by the vectors (,,, ) and (,,, ).. Compute the orthogonal projection of the vector z = (,,, ) onto the subspace W of Problem. above. What does your answer tell you about the relationship between the vector z and the subspace W?. Recall that a square matrix P is said to be an orthogonal matrix if P T P = I. Show that Householder matrices are always orthogonal matrices; that is, show that H T H = I. 6. Compute the Householder matrix for reflection across the plane x + y z =. 7. Compute the reflection of the vector v = (,, ) across the plane x + y z =. What happens when you add v to its reflection? How does this sum compare to your answer from Exercise? Draw a sketch to explain this phenomenon. 8. Compute the reflection of the vector v = (, ) across the line l in R spanned by the vector (, ). Sketch the vector v, the line l and the reflection of v across l. (Do not confuse the spanning vector for l with the normal vector to l.) 9. Compute the Householder matrix H for reflection across the hyperplane x + x x x = in R. Then compute the projection matrix Q for this hyperplane.. Compute the Householder matrix for reflection across the plane z = in R. Sketch the reflection involved. Your answer should not be too surprising! 7

Selected Solutions to Exercises:

2. We describe two ways to solve this problem.

Solution 1: Pick a basis for the plane. Since the plane is 2-dimensional, any two linearly independent vectors in the plane will do; say, (1, 0, 1) and (0, 1, 1). Set

    A = [ 1  0 ]
        [ 0  1 ]          so that   A^T A = [ 2  1 ]
        [ 1  1 ],                           [ 1  2 ].

The projection matrix Q for the plane is

    Q = A (A^T A)^{-1} A^T = (1/3) [  2  -1   1 ]
                                   [ -1   2   1 ]
                                   [  1   1   2 ].

We can now project any vector onto the plane by multiplying by Q:

    Projection(v) = Qv = (1/3) [  2  -1   1 ] [ 1 ]   [ 2/3 ]
                               [ -1   2   1 ] [ 1 ] = [ 2/3 ]
                               [  1   1   2 ] [ 1 ]   [ 4/3 ].

Solution 2: First, project v onto the normal vector n = (1, 1, -1) to the plane:

    y = Proj_n(v) = ((v · n)/(n · n)) n = (1/3, 1/3, -1/3).

Since y is the component of v orthogonal to the plane, the vector v - y is the orthogonal projection of v onto the plane:

    v - y = (1, 1, 1) - (1/3, 1/3, -1/3) = (2/3, 2/3, 4/3),

in agreement with the previous solution (given here in row vector notation).

Note: Which method is better? The second way is shorter for hyperplanes (subspaces of R^n having dimension n - 1), but finding the projection matrix Q is needed if you are projecting from R^n to some intermediate dimension k, where you no longer have a single normal vector to work with. For example, a 2-subspace in R^4 has a 2-dimensional orthogonal complement as well, so one must compute a projection matrix in order to project to either component of R^4, as in the next problem.

3. Set A equal to the 4 × 2 matrix whose columns are the two given spanning vectors, and compute the 2 × 2 matrix A^T A of their dot products.

The projection matrix Q for W is then

    Q = A (A^T A)^{-1} A^T.

5. Here is a hint: Use the fact that H = I - 2uu^T, where I is the identity matrix and u is a unit column vector. What is H^T? What is H^T H?

6. The vector v = (1, 1, -1) is normal to the plane x + y - z = 0, so the vector u = (1/√3)(1, 1, -1) = v/√3 is a unit normal. Expressing u and v as column vectors, we find that

    H = I - 2uu^T = I - (2/3) vv^T = I - (2/3) [  1   1  -1 ]   [  1/3  -2/3   2/3 ]
                                               [  1   1  -1 ] = [ -2/3   1/3   2/3 ]
                                               [ -1  -1   1 ]   [  2/3   2/3   1/3 ].
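As a final check, the Householder matrix from Solution 6 can be confirmed numerically; a minimal sketch assuming NumPy:

    import numpy as np

    n = np.array([1.0, 1.0, -1.0])                    # normal to x + y - z = 0
    H = np.eye(3) - (2.0 / (n @ n)) * np.outer(n, n)  # I - (2/3) nn^T

    print(3 * H)   # [[ 1. -2.  2.], [-2.  1.  2.], [ 2.  2.  1.]], as in Solution 6
    assert np.allclose(H.T @ H, np.eye(3))            # H is orthogonal (Exercise 5)

    # Exercise 7: v plus its reflection is twice the projection of v onto the
    # plane, i.e. twice the answer (2/3, 2/3, 4/3) found in Solution 2.
    v = np.array([1.0, 1.0, 1.0])
    assert np.allclose(v + H @ v, 2 * np.array([2/3, 2/3, 4/3]))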