Inner products on $\mathbb{R}^n$, and more

Peyam Ryan Tabrizian

Friday, April 12th, 2013

1 Introduction

You might be wondering: are there inner products on $\mathbb{R}^n$ other than the usual dot product $x \cdot y = x_1 y_1 + \cdots + x_n y_n$? The answer is yes and no. For example, the following are inner products on $\mathbb{R}^2$:

$$\langle x, y \rangle = 2x_1 y_1 + 3x_2 y_2$$

$$\langle x, y \rangle = 3x_1 y_1 + 2x_1 y_2 + 2x_2 y_1 + 3x_2 y_2$$

But the answer is also no: there are no fancier examples! In fact, this result is even true for finite-dimensional vector spaces over $\mathbb{F}$.

Note: In what follows, we denote vectors in $\mathbb{R}^n$ and $\mathbb{C}^n$ by column vectors, not row vectors; this will simplify matters later on. For example, instead of writing $x = (1, 2)$ (here we mean the point, not a dot product), we write $x = \begin{bmatrix} 1 \\ 2 \end{bmatrix}$.

2 Inner products on $\mathbb{R}^n$

In this section, we will prove the following result:

Prop: $\langle x, y \rangle$ is an inner product on $\mathbb{R}^n$ if and only if $\langle x, y \rangle = x^T A y$, where $A$ is a symmetric matrix whose eigenvalues are strictly positive. (Such a matrix is called symmetric and positive-definite.)
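Before looking at the examples below, here is a quick numerical illustration of this criterion. This is a minimal sketch assuming NumPy; the helper names are my own, not from the text.

```python
import numpy as np

def is_inner_product_matrix(A, tol=1e-12):
    """Check the Prop's criterion: A symmetric with strictly positive eigenvalues."""
    A = np.asarray(A, dtype=float)
    if not np.allclose(A, A.T):
        return False
    # eigvalsh is designed for symmetric matrices and returns real eigenvalues
    return bool(np.all(np.linalg.eigvalsh(A) > tol))

def inner(x, y, A):
    """Evaluate <x, y> = x^T A y."""
    return np.asarray(x) @ np.asarray(A) @ np.asarray(y)

print(is_inner_product_matrix([[2, 0], [0, 3]]))   # True: the first example above
print(is_inner_product_matrix([[1, 2], [2, 1]]))   # False: eigenvalues are 3 and -1
print(inner([1, 0], [0, 1], [[2, 0], [0, 3]]))     # 0: e1 and e2 remain orthogonal
```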

Example 1: If $n = 2$ and $A = \begin{bmatrix} 3 & 2 \\ 2 & 3 \end{bmatrix}$, we get

$$\langle x, y \rangle = 3x_1 y_1 + 2x_1 y_2 + 2x_2 y_1 + 3x_2 y_2.$$

Example 2: If $A = I$, then $\langle x, y \rangle$ just becomes the usual dot product!

The point is that these are the fanciest examples we can get!

Note: From now on, let us denote by $(e_1, \dots, e_n)$ the standard basis of $\mathbb{R}^n$.

Proof: ($\Rightarrow$) Suppose $\langle \cdot, \cdot \rangle$ is an inner product on $\mathbb{R}^n$, and let $x, y \in \mathbb{R}^n$. Since $(e_1, \dots, e_n)$ is a basis for $\mathbb{R}^n$, there are scalars $x_1, \dots, x_n$ and $y_1, \dots, y_n$ such that

$$x = x_1 e_1 + \cdots + x_n e_n, \qquad y = y_1 e_1 + \cdots + y_n e_n.$$

Hence

$$\langle x, y \rangle = \langle x_1 e_1 + \cdots + x_n e_n,\; y_1 e_1 + \cdots + y_n e_n \rangle = \sum_{i,j=1}^n x_i y_j \langle e_i, e_j \rangle = \sum_{i,j=1}^n x_i a_{ij} y_j,$$

where we define $a_{ij} = \langle e_i, e_j \rangle$. (In the second equality we used a distributive property similar to $(a+b)(c+d) = ac + ad + bc + cd$, but for $n$ terms.)

Now if we let $A$ be the matrix whose $(i,j)$-th entry is $a_{ij}$, the above becomes

$$\langle x, y \rangle = x^T A y.$$

Moreover, $a_{ij} = \langle e_i, e_j \rangle = \langle e_j, e_i \rangle = a_{ji}$, so $A$ is symmetric. And since $A$ is symmetric, by a theorem in Chapter 7 we know that $A$ has $n$ (possibly repeated) real eigenvalues $\lambda_1, \dots, \lambda_n$. Let $v_1, \dots, v_n$ be the corresponding eigenvectors, which are all nonzero. Then

$$\langle v_i, v_i \rangle = v_i^T A v_i = v_i^T (\lambda_i v_i) = \lambda_i v_i^T v_i.$$

However, $\langle v_i, v_i \rangle > 0$, hence $\lambda_i v_i^T v_i > 0$; and since $v_i^T v_i > 0$ (it is the usual dot product of $v_i$ with itself, and $v_i \neq 0$), this implies $\lambda_i > 0$. So all the eigenvalues of $A$ are strictly positive.
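Before the converse, note that the key construction in this direction, $a_{ij} = \langle e_i, e_j \rangle$, is completely concrete: given any inner product as a black box, it recovers the matrix $A$. A small sketch, again assuming NumPy (the function names are my own):

```python
import numpy as np

def recover_matrix(inner, n):
    """Build A with entries a_ij = <e_i, e_j>, exactly as in the proof."""
    e = np.eye(n)  # rows are the standard basis vectors e_1, ..., e_n
    return np.array([[inner(e[i], e[j]) for j in range(n)] for i in range(n)])

# The inner product from Example 1, written out coordinate by coordinate:
form = lambda x, y: 3*x[0]*y[0] + 2*x[0]*y[1] + 2*x[1]*y[0] + 3*x[1]*y[1]

A = recover_matrix(form, 2)
print(A)                                   # [[3. 2.]
                                           #  [2. 3.]]
x, y = np.array([1.0, 2.0]), np.array([3.0, -1.0])
print(np.isclose(form(x, y), x @ A @ y))   # True: <x, y> = x^T A y
```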

($\Leftarrow$) Suppose $\langle x, y \rangle = x^T A y$. You can check that $\langle \cdot, \cdot \rangle$ is linear in each variable. Moreover,

$$\langle y, x \rangle = y^T A x = (x^T A^T y)^T = (x^T A y)^T = x^T A y = \langle x, y \rangle,$$

where the third equality follows from $A^T = A$ and the last equality holds because $x^T A y$ is just a scalar. Finally, consider

$$\langle x, x \rangle = x^T A x.$$

Now since $A$ is symmetric, $A$ is normal (you will see this later), and hence there exists an invertible matrix $P$ with $P^{-1} = P^T$ such that $A = P D P^T$ (you will learn this later too, i.e. $A$ is orthogonally diagonalizable), where $D$ is the diagonal matrix of eigenvalues $\lambda_i$ of $A$, and by assumption $\lambda_i > 0$ for all $i$. Then

$$\langle x, x \rangle = x^T A x = x^T P D P^T x = (P^T x)^T D (P^T x) = y^T D y,$$

where $y = P^T x$. But if $y$ has components $a_1, \dots, a_n$ and you calculate this out, you get

$$\langle x, x \rangle = y^T D y = \lambda_1 a_1^2 + \cdots + \lambda_n a_n^2 \geq 0$$

(since each $\lambda_i > 0$), so $\langle x, x \rangle \geq 0$. Moreover, if $\langle x, x \rangle = 0$, then $\lambda_1 a_1^2 + \cdots + \lambda_n a_n^2 = 0$, which forces $a_1 = \cdots = a_n = 0$ (again since $\lambda_i > 0$). But then $y = 0$, and so $x = (P^T)^{-1} y = (P^T)^T y = P y = P 0 = 0$. Hence $\langle \cdot, \cdot \rangle$ satisfies all the requirements, and so it is an inner product! $\square$
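The diagonalization step at the heart of this direction can also be checked numerically. NumPy's `eigh` returns exactly the orthogonal $P$ and eigenvalues $\lambda_i$ used above; a minimal sketch:

```python
import numpy as np

A = np.array([[3.0, 2.0], [2.0, 3.0]])         # symmetric, eigenvalues 1 and 5
lam, P = np.linalg.eigh(A)                     # A = P @ diag(lam) @ P.T, P orthogonal
print(np.allclose(A, P @ np.diag(lam) @ P.T))  # True

rng = np.random.default_rng(0)
x = rng.standard_normal(2)
a = P.T @ x   # the components a_1, ..., a_n of y = P^T x from the proof
# <x, x> = x^T A x equals lambda_1 a_1^2 + ... + lambda_n a_n^2:
print(np.isclose(x @ A @ x, np.sum(lam * a**2)))   # True
```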

3 Inner products on $\mathbb{C}^n$

In fact, a similar proof works for $\mathbb{C}^n$, except that you have to replace all the transposes by adjoints (i.e. replace every $T$ with $*$). Hence, we get the following result:

Prop: $\langle x, y \rangle$ is an inner product on $\mathbb{C}^n$ if and only if $\langle x, y \rangle = x^* A y$, where $A$ is a self-adjoint matrix whose eigenvalues are strictly positive. (Note that $A$ self-adjoint implies that $A$ has only real eigenvalues.)

4 Inner products on finite-dimensional vector spaces

In fact, if $V$ is a finite-dimensional vector space over $\mathbb{F}$, then a version of the above result still holds, using the following trick: let $n = \dim(V)$ and let $(v_1, \dots, v_n)$ be a basis for $V$. We will prove the following result, which gives an explicit description of all inner products on $V$:

Theorem: $\langle x, y \rangle$ is an inner product on $V$ if and only if

$$\langle x, y \rangle = (M(x))^* A\, M(y),$$

where $A$ is a self-adjoint matrix with strictly positive eigenvalues (if $V$ is a vector space over $\mathbb{R}$, replace "self-adjoint" with "symmetric" and $*$ with $T$), and $M : V \to \mathbb{F}^n$ is the usual coordinate map given by

$$M(v) = M(a_1 v_1 + \cdots + a_n v_n) = \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix}.$$

Example: If $V = P_2(\mathbb{R})$, then the following is an inner product on $V$:

$$\langle a_0 + a_1 x + a_2 x^2,\; b_0 + b_1 x + b_2 x^2 \rangle = 2a_0 b_0 + a_0 b_1 + a_0 b_2 + a_1 b_0 + 2a_1 b_1 + a_1 b_2 + a_2 b_0 + a_2 b_1 + 2a_2 b_2.$$

Here

$$A = \begin{bmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{bmatrix}.$$
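In coordinates, this Example is just the $\mathbb{R}^3$ machinery from Section 2 applied to coefficient vectors. A sketch of how one might compute with it, assuming NumPy (the names are illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

# Strictly positive eigenvalues, so this really is an inner product:
print(np.linalg.eigvalsh(A))   # [1. 1. 4.]

def poly_inner(p, q):
    """<p, q> for p, q in P_2(R), given as coefficient vectors (a_0, a_1, a_2)."""
    return np.asarray(p) @ A @ np.asarray(q)

p = [1.0, 0.0, 2.0]      # the polynomial 1 + 2x^2
q = [0.0, 1.0, -1.0]     # the polynomial x - x^2
print(poly_inner(p, q))  # -2.0 (negative values are fine for distinct vectors)
```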

Proof: ($\Leftarrow$) Check that $\langle x, y \rangle := (M(x))^* A\, M(y)$ is an inner product on $V$; this is similar to the proof in Section 2.

($\Rightarrow$) First of all, note from Chapter 3 that $M$ is invertible, with inverse

$$M^{-1} \begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix} = a_1 v_1 + \cdots + a_n v_n.$$

Now let $\langle \cdot, \cdot \rangle$ be an inner product on $V$.

Lemma: $(x', y') := \langle M^{-1}(x'), M^{-1}(y') \rangle$ is an inner product on $\mathbb{F}^n$.

Proof: The only tricky thing to prove is that $(x', x') = 0$ implies $x' = 0$. However,

$$(x', x') = 0 \;\Rightarrow\; \langle M^{-1}(x'), M^{-1}(x') \rangle = 0 \;\Rightarrow\; M^{-1}(x') = 0 \;\Rightarrow\; x' = 0,$$

where in the second implication we used that $\langle \cdot, \cdot \rangle$ is an inner product on $V$, and in the third implication we used that $M^{-1}$ is injective. $\square$

But since $(x', y')$ is an inner product on $\mathbb{F}^n$, by Sections 2 and 3 we get that

$$(x', y') = (x')^* A y'$$

for some self-adjoint (or symmetric) matrix $A$ with strictly positive eigenvalues. It then follows that

$$\langle M^{-1}(x'), M^{-1}(y') \rangle = (x')^* A y'.$$

Now let $x, y$ be arbitrary vectors in $V$. Then we can write $x = M^{-1}(M(x))$ and $y = M^{-1}(M(y))$, so

$$\langle x, y \rangle = \langle M^{-1}(M(x)), M^{-1}(M(y)) \rangle = (M(x))^* A\, M(y),$$

where in the second equality we used the result above with $x' = M(x)$ and $y' = M(y)$. $\square$
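To see the theorem in action on a familiar inner product, take $\langle p, q \rangle = \int_0^1 p(x) q(x)\, dx$ on $P_2(\mathbb{R})$ with basis $(1, x, x^2)$. Since $\int_0^1 x^i x^j\, dx = \frac{1}{i+j+1}$, the matrix produced by the proof is the $3 \times 3$ Hilbert matrix, and, as the theorem predicts, its eigenvalues are strictly positive. A quick check, assuming NumPy:

```python
import numpy as np

# Gram matrix a_ij = <x^i, x^j> = integral_0^1 x^(i+j) dx = 1/(i+j+1)
n = 3
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
print(A)
# [[1.         0.5        0.33333333]
#  [0.5        0.33333333 0.25      ]
#  [0.33333333 0.25       0.2       ]]

print(np.allclose(A, A.T))     # True: symmetric
print(np.linalg.eigvalsh(A))   # all strictly positive, as the theorem guarantees
```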