Lecture 16: Cauchy-Schwarz, triangle inequality, orthogonal projection, and Gram-Schmidt orthogonalization (1)
Travis Schedler
Thurs, Nov 4, 2010 (version: Thurs, Nov 4, 4:40 PM)

Goals (2)

- Corrected change-of-basis formula!
- The Cauchy-Schwarz inequality
- Triangle inequality
- Orthogonal projection and least-squares approximations
- Orthonormal bases
- Gram-Schmidt orthogonalization

Corrected change-of-basis formula! (3)

Take the linear transformation $T := L_A \in \mathcal{L}(\mathrm{Mat}(2,1,\mathbf{F}))$ of left-multiplication by the matrix $A := \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$. In the standard basis $(v_1, v_2)$, $M(T) = A$. Consider the new basis $v_1' := \begin{pmatrix} 1 \\ 1 \end{pmatrix}$, $v_2' := \begin{pmatrix} 0 \\ 1 \end{pmatrix}$. Then we get
$$A v_1' = \begin{pmatrix} 3 \\ 7 \end{pmatrix} = 3 v_1' + 4 v_2', \qquad A v_2' = \begin{pmatrix} 2 \\ 4 \end{pmatrix} = 2 v_1' + 2 v_2'.$$
Thus, $M_{(v_1', v_2')}(T) = \begin{pmatrix} 3 & 2 \\ 4 & 2 \end{pmatrix}$.

Change-of-basis continued (4)

Incorrect change-of-basis matrix: $\begin{pmatrix} v_1' \\ v_2' \end{pmatrix} = S \begin{pmatrix} v_1 \\ v_2 \end{pmatrix}$, i.e., $S = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$. Then
$$S^{-1} M_{(v_1, v_2)}(T) S = \begin{pmatrix} 1 & -1 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} -2 & -2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} -2 & -4 \\ 3 & 7 \end{pmatrix} \neq M_{(v_1', v_2')}(T).$$
The correct one is the transpose: $(v_1' \ v_2') = (v_1 \ v_2)\, S$, i.e., $S = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}$. Then
$$S^{-1} M_{(v_1, v_2)}(T) S = \begin{pmatrix} 1 & 0 \\ -1 & 1 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 2 \\ 2 & 2 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 3 & 2 \\ 4 & 2 \end{pmatrix} = M_{(v_1', v_2')}(T).$$
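
As a concrete sanity check of the two conventions, here is a minimal numpy sketch (the code and variable names are illustrative, not from the lecture) reproducing both computations above:

```python
# Check of the slide's example: the column convention for S reproduces
# M_{(v1',v2')}(T), while the row convention does not.
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])            # matrix of T in the standard basis
S = np.array([[1.0, 0.0],
              [1.0, 1.0]])            # columns are v1' = (1,1) and v2' = (0,1)

print(np.linalg.inv(S) @ A @ S)       # [[3. 2.] [4. 2.]] = M_{(v1',v2')}(T)

S_wrong = S.T                         # the "incorrect" (row) convention
print(np.linalg.inv(S_wrong) @ A @ S_wrong)  # [[-2. -4.] [3. 7.]], not M_{(v1',v2')}(T)
```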

Correct change-of-basis and triangular changes-of-basis (5)

Correct: the change-of-basis matrix from $(v_i)$ to $(v_i')$ is $S = M_{(v_i'),(v_i)}(I)$: put in the $i$-th column the coefficients of $v_i'$ in terms of $v_1, \ldots, v_n$ (cf. Homework 5). I.e., $S = (s_{ij})$ with $v_i' = s_{1i} v_1 + \cdots + s_{ni} v_n$.

If $T \in \mathcal{L}(\mathrm{Mat}(n, 1, \mathbf{F}))$ and $(v_i)$ is the standard basis, then the change-of-basis matrix is $S = (v_1' \ \cdots \ v_n')$: put the columns together.

Corollary (similar to Proposition 5.12): The change-of-basis matrix $S = (s_{ij})$ from $(v_i)$ to $(v_i')$ is upper-triangular if and only if $v_i' = s_{1i} v_1 + \cdots + s_{ii} v_i$ for all $i$, i.e., $s_{ji} = 0$ for $j > i$.

Corollary (cf. Proposition 5.12): Let $S$ be an upper-triangular change of basis. Then $M_{(v_i)}(T)$ is upper-triangular iff $M_{(v_i')}(T)$ is upper-triangular.

Proof: If $A$ is upper-triangular, so is $S^{-1} A S$, since inverses and products of upper-triangular matrices are upper-triangular; and vice versa.
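
The second corollary can also be checked numerically. A small sketch (assuming numpy; the random matrices are my own illustration): conjugating an upper-triangular matrix by an upper-triangular change of basis stays upper-triangular.

```python
# S^{-1} A S is upper-triangular whenever A and S are (S invertible),
# since inverses and products of upper-triangular matrices are upper-triangular.
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = np.triu(rng.standard_normal((n, n)))   # upper-triangular A
S = np.triu(rng.standard_normal((n, n)), 1) + np.diag([1.0, 2.0, 3.0, 4.0])
                                           # invertible, upper-triangular S
B = np.linalg.inv(S) @ A @ S
print(np.allclose(B, np.triu(B)))          # True: the lower part vanishes (up to rounding)
```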

The Cauchy-Schwarz inequality (6)

Theorem (6.6: Cauchy-Schwarz inequality): $|\langle u, v \rangle| \leq \|u\| \, \|v\|$. Moreover, this is an equality iff one of $u$ and $v$ is a scalar multiple of the other.

Proof. If $v = 0$, both sides are zero, so it is enough to suppose $v \neq 0$. Write $u = \mathrm{proj}_v(u) + w$ with $w \perp v$. Then
$$\|u\|^2 = \|\mathrm{proj}_v(u) + w\|^2 = \|\mathrm{proj}_v(u)\|^2 + \|w\|^2 \geq \left\| \frac{\langle u, v \rangle}{\langle v, v \rangle}\, v \right\|^2 = \frac{|\langle u, v \rangle|^2}{\langle v, v \rangle^2}\, \langle v, v \rangle = \frac{|\langle u, v \rangle|^2}{\|v\|^2}.$$
Hence, $\|u\|^2 \|v\|^2 \geq |\langle u, v \rangle|^2$, as desired. Equality holds iff $w = 0$, i.e., $\mathrm{proj}_v(u) = u$, i.e., $u$ is a multiple of $v$.
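
A quick numerical illustration (an assumed example with numpy, not part of the lecture) of both the inequality and the decomposition $u = \mathrm{proj}_v(u) + w$ used in the proof:

```python
import numpy as np

rng = np.random.default_rng(1)
u, v = rng.standard_normal(5), rng.standard_normal(5)

proj = (u @ v) / (v @ v) * v                   # proj_v(u) = (<u,v>/<v,v>) v
w = u - proj
print(np.isclose(w @ v, 0.0))                  # True: w is orthogonal to v

print(abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v))  # True (strict, generically)

u2 = -3.0 * v                                  # equality case: u2 is a multiple of v
print(np.isclose(abs(u2 @ v), np.linalg.norm(u2) * np.linalg.norm(v)))  # True
```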

Proof of triangle inequality (7)

Theorem: $\|u + v\| \leq \|u\| + \|v\|$. Equality holds iff one of $u$, $v$ is a nonnegative multiple of the other.

$$\|u + v\|^2 = \langle u + v, u + v \rangle = \langle u, u \rangle + \langle v, v \rangle + (\langle u, v \rangle + \langle v, u \rangle) = \|u\|^2 + \|v\|^2 + 2\, \mathrm{Re} \langle u, v \rangle$$
$$\leq \|u\|^2 + \|v\|^2 + 2\, |\langle u, v \rangle| \leq \|u\|^2 + \|v\|^2 + 2\, \|u\| \|v\| = (\|u\| + \|v\|)^2.$$

Here, $\mathrm{Re}$ is the real part: $\mathrm{Re}(a + bi) := a$ for $a, b \in \mathbf{R}$. Thus $2\, \mathrm{Re}(z) = z + \bar{z}$.

Equality holds in the first inequality iff $\langle u, v \rangle$ is a nonnegative real number. Equality holds in the second inequality (Cauchy-Schwarz) iff one of $u$, $v$ is a multiple of the other. For both equalities to hold: say $u = \lambda v$ and $\langle u, v \rangle \geq 0$; then $\lambda \langle v, v \rangle \geq 0$, i.e., $\lambda \geq 0$.
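
The equality criterion is easy to see numerically. A short sketch (an assumed example, assuming numpy): equality holds for $u = 2v$ but fails for $u = -2v$, matching the sign condition on $\lambda$.

```python
import numpy as np

rng = np.random.default_rng(2)
v = rng.standard_normal(4)
u = rng.standard_normal(4)
norm = np.linalg.norm

print(norm(u + v) <= norm(u) + norm(v))                       # True (strict, generically)
print(np.isclose(norm(2 * v + v), norm(2 * v) + norm(v)))     # True: u = 2v, lambda >= 0
print(np.isclose(norm(-2 * v + v), norm(-2 * v) + norm(v)))   # False: u = -2v, lambda < 0
```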

Orthogonal complement (8)

Definition: Let $U \subseteq V$ be a subspace. Then $U^\perp := \{ u' \in V : \langle u', u \rangle = 0 \text{ for all } u \in U \}$.

Theorem (Theorem 6.29): $U^\perp$ is a complement to $U$, i.e., $V = U \oplus U^\perp$.

Proof. First, $u \in U \cap U^\perp$ implies $\langle u, u \rangle = 0$, i.e., $u = 0$. So $U \cap U^\perp = \{0\}$. Let $(u_1, \ldots, u_k)$ be a basis of $U$. Consider the map $T : V \to \mathbf{F}^k$ given by $T(v) = (\langle v, u_1 \rangle, \ldots, \langle v, u_k \rangle)$. Then $U^\perp = \operatorname{null} T$. Since $U \cap U^\perp = \{0\}$, $T|_U$ is injective. Since $\dim U = k = \dim \mathbf{F}^k$, $T|_U$ is also surjective; in particular $T$ is surjective. By rank-nullity, $\dim V = \dim \operatorname{null} T + \dim \operatorname{range} T = \dim U^\perp + \dim U$. Together with $U \cap U^\perp = \{0\}$, this gives $V = U \oplus U^\perp$.
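
The map $T$ in the proof also gives a practical way to compute $U^\perp$: over $\mathbf{R}$ it is the null space of the matrix whose rows are the $u_i$. A sketch using numpy's SVD (the example subspace is my own choice, not from the lecture):

```python
# Compute U^perp in R^5 as the null space of the matrix with rows u_1, u_2.
import numpy as np

rng = np.random.default_rng(3)
U_basis = rng.standard_normal((2, 5))        # rows u_1, u_2 span U inside R^5

_, s, Vt = np.linalg.svd(U_basis)
rank = int(np.sum(s > 1e-10))
U_perp = Vt[rank:]                           # remaining right-singular vectors span U^perp

print(U_perp.shape[0])                       # 3 = dim V - dim U
print(np.allclose(U_basis @ U_perp.T, 0.0))  # True: U^perp is orthogonal to every u_i
```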

Orthogonal projection (9)

Let $U \subseteq V$ be a subspace. Then $P_{U, U^\perp}$ is the orthogonal projection onto $U$.

Proposition (Proposition 6.36): Let $v \in V$. Then, for all $u \in U$, $\|v - P_{U, U^\perp}(v)\| \leq \|v - u\|$. Thus, $u := P_{U, U^\perp}(v)$ is the best approximation of $v$ by an element of $U$. It is also called the least-squares approximation because, when $V = \mathbf{F}^n$, it minimizes $\|v - u\|^2 = \sum_{i=1}^n |v_i - u_i|^2$.

Proof. Write $v - u = (v - P_{U, U^\perp}(v)) + (P_{U, U^\perp}(v) - u)$. By definition, $v - P_{U, U^\perp}(v) \in U^\perp$, and $P_{U, U^\perp}(v) - u \in U$. By the Pythagorean theorem,
$$\|v - u\|^2 = \|v - P_{U, U^\perp}(v)\|^2 + \|P_{U, U^\perp}(v) - u\|^2 \geq \|v - P_{U, U^\perp}(v)\|^2.$$
Note: equality holds iff $u = P_{U, U^\perp}(v)$.
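
This is exactly the problem numpy's least-squares routine solves. A minimal sketch (assuming numpy; the matrices are illustrative): project $v$ onto the column space of $B$ and check that no other element of $U$ does better.

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((6, 2))            # columns span U inside R^6
v = rng.standard_normal(6)

x, *_ = np.linalg.lstsq(B, v, rcond=None)  # least-squares coefficients
p = B @ x                                  # p = P_{U,U^perp}(v)

print(np.allclose(B.T @ (v - p), 0.0))     # True: the residual v - p lies in U^perp
u_other = B @ (x + np.array([0.1, -0.2]))  # any other element of U
print(np.linalg.norm(v - p) < np.linalg.norm(v - u_other))  # True: p is strictly best
```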

Orthonormal bases (10)

Lemma: Suppose that $v_1, \ldots, v_k$ are nonzero vectors such that $v_i \perp v_j$ for all $i \neq j$. Then $v_1, \ldots, v_k$ are linearly independent.

Proof. Observe $\langle \lambda_1 v_1 + \cdots + \lambda_k v_k, v_i \rangle = \lambda_i \langle v_i, v_i \rangle$. If $\lambda_i \neq 0$, this is nonzero, hence $\lambda_1 v_1 + \cdots + \lambda_k v_k \neq 0$. Thus, there can be no nontrivial linear dependence.

Definition: An orthonormal basis $v_1, \ldots, v_n$ of $(V, \langle \cdot, \cdot \rangle)$ is a basis of $V$ such that $\langle v_i, v_j \rangle = \delta_{ij}$. More generally, a list $(v_1, \ldots, v_k)$ is orthonormal if $\langle v_i, v_j \rangle = \delta_{ij}$.

Corollary: If $\dim V = n$, then any orthonormal list of length $n$ is an orthonormal basis.
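
A tiny check of the lemma (an assumed example with numpy): for pairwise-orthogonal nonzero columns, the Gram matrix is diagonal and the rank is full, i.e., the columns are linearly independent.

```python
import numpy as np

Q = np.array([[1.0,  1.0, 0.0],
              [1.0, -1.0, 0.0],
              [0.0,  0.0, 2.0]])            # columns are pairwise orthogonal and nonzero
G = Q.T @ Q                                 # Gram matrix of inner products
print(np.allclose(G, np.diag(np.diag(G))))  # True: off-diagonal inner products vanish
print(np.linalg.matrix_rank(Q))             # 3: the columns are linearly independent
```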

Formula for orthogonal projection (11)

Proposition (6.35): Let $e_1, \ldots, e_k$ be an orthonormal basis of $U$. Then
$$P_{U, U^\perp}(v) = \langle v, e_1 \rangle e_1 + \cdots + \langle v, e_k \rangle e_k = \mathrm{proj}_{e_1}(v) + \cdots + \mathrm{proj}_{e_k}(v).$$

Corollary (Theorem 6.17): When $U = V$ and $\dim V = n$,
$$v = \langle v, e_1 \rangle e_1 + \cdots + \langle v, e_n \rangle e_n.$$

Proof. Let $u$ be the RHS. Then $\langle v - u, e_i \rangle = \langle v, e_i \rangle - \langle v, e_i \rangle = 0$ for all $i$. Hence, $v - u \in U^\perp$. Since $u \in U$ and $v = u + (v - u)$, we deduce $u = P_{U, U^\perp}(v)$.
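
Here is the formula in action (a small assumed example, assuming numpy): project $v \in \mathbf{R}^3$ onto the plane $U$ spanned by two orthonormal vectors and verify $v - P(v) \in U^\perp$.

```python
import numpy as np

e1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)
e2 = np.array([0.0, 0.0, 1.0])            # (e1, e2): an orthonormal basis of U
v = np.array([3.0, 1.0, 4.0])

p = (v @ e1) * e1 + (v @ e2) * e2         # P_{U,U^perp}(v) = <v,e1>e1 + <v,e2>e2
print(p)                                  # [2. 2. 4.]
print(np.isclose((v - p) @ e1, 0.0), np.isclose((v - p) @ e2, 0.0))  # True True
```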

Gram-Schmidt orthogonalization (12)

Theorem (6.20: Gram-Schmidt): Given a basis $(v_1, \ldots, v_n)$ of $V$, there is an upper-triangular change of basis $e_i := s_{1i} v_1 + \cdots + s_{ii} v_i$ so that $(e_1, \ldots, e_n)$ is orthonormal. Moreover, the $s_{ij}$ are computable by an effective algorithm!

Proof. By induction on $n = \dim V$; the base case $\dim V = 0$ is vacuous. Let $U := \mathrm{Span}(v_1, \ldots, v_{n-1})$. Inductively, form the orthonormal basis $(e_1, \ldots, e_{n-1})$ of $U$ via an upper-triangular change of basis. It suffices to find $e_n$ such that $\langle e_n, e_j \rangle = \delta_{nj}$ (the change of basis is then automatically upper-triangular). Set $e_n' := v_n - P_{U, U^\perp}(v_n) \in U^\perp$, so $\langle e_n', e_j \rangle = 0$ for $j < n$. Also $e_n'$ is nonzero, since $v_n \notin U$. Set $e_n := e_n' / \|e_n'\|$. Then $\langle e_n, e_j \rangle = \delta_{nj}$.
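
The proof is constructive, and the resulting algorithm is short. A minimal sketch (the function name and test vectors are my own, assuming numpy): at each step, subtract the projection onto the span of the earlier vectors, then normalize.

```python
import numpy as np

def gram_schmidt(vs):
    """Orthonormalize a list of linearly independent real vectors."""
    es = []
    for v in vs:
        w = v - sum((v @ e) * e for e in es)  # w = v - P_U(v), U = span(es)
        es.append(w / np.linalg.norm(w))      # w != 0 since v is not in U
    return es

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
E = np.column_stack(gram_schmidt(vs))
print(np.allclose(E.T @ E, np.eye(3)))        # True: the e_i are orthonormal
```

In exact arithmetic this matches the proof; in floating point this "classical" form can lose orthogonality, and numpy's np.linalg.qr computes an orthonormal basis of the same nested spans (up to signs) more stably.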

Corollaries (13)

Corollary (Corollary 6.24): Every finite-dimensional inner-product space has an orthonormal basis.
Proof: Apply Gram-Schmidt to an arbitrary basis.

Corollary (Corollary 6.25): Every orthonormal list of vectors in $V$ can be extended to an orthonormal basis of $V$.
Proof: Extend to an arbitrary basis and perform Gram-Schmidt; Gram-Schmidt leaves the vectors that are already orthonormal unchanged.

Corollary (Corollary 6.27): Suppose that $M(T)$ is (block) upper-triangular in some basis. Then there exists an orthonormal basis where it is still (block) upper-triangular.
Proof: Pick an arbitrary basis in which the matrix is as desired, and then apply Gram-Schmidt. It is an upper-triangular change of basis, so it preserves (block) upper-triangularity.

Further corollaries (14)

Corollary (Corollary 6.27+): Over $\mathbf{C}$, there exists an orthonormal basis in which $M(T)$ is upper-triangular. Over $\mathbf{R}$, there exists an orthonormal basis in which it is block upper-triangular with diagonal blocks of size $1 \times 1$ or $2 \times 2$.

Corollary (Corollary 6.33): $(U^\perp)^\perp = U$.
Proof: Clearly $U \subseteq (U^\perp)^\perp$. Both are complements to $U^\perp$, so they have the same dimension. Hence they are equal.

Norm formulas (15)

Proposition (Proposition 6.15): Let $(e_1, \ldots, e_k)$ be an orthonormal list. Then
$$\|a_1 e_1 + \cdots + a_k e_k\|^2 = |a_1|^2 + \cdots + |a_k|^2.$$
Proof: Repeated application of the Pythagorean theorem, since $\langle a_i e_i, a_i e_i \rangle = |a_i|^2$.

Corollary: Let $(e_1, \ldots, e_k)$ be an orthonormal basis of $U$. Then
$$\|P_{U, U^\perp}(v)\|^2 = |\langle v, e_1 \rangle|^2 + \cdots + |\langle v, e_k \rangle|^2.$$
Proof: Apply the previous proposition and $P_{U, U^\perp}(v) = \langle v, e_1 \rangle e_1 + \cdots + \langle v, e_k \rangle e_k$.

When $U = V$, we get $\|v\|^2 = |\langle v, e_1 \rangle|^2 + \cdots + |\langle v, e_n \rangle|^2$ (cf. Theorem 6.17).
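
A closing numerical check (an assumed example, assuming numpy) of the $U = V$ case, with the orthonormal basis taken from a QR factorization:

```python
import numpy as np

rng = np.random.default_rng(5)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # columns: an orthonormal basis of R^4
v = rng.standard_normal(4)

coeffs = Q.T @ v                                  # coeffs[i] = <v, e_i>
print(np.isclose(np.sum(coeffs**2), v @ v))       # True: ||v||^2 = sum |<v,e_i>|^2
```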