MATH 304 Linear Algebra Lecture 20: Inner product spaces. Orthogonal sets.




Norm

The notion of norm generalizes the notion of length of a vector in R^n.

Definition. Let V be a vector space. A function α : V → R is called a norm on V if it has the following properties:
(i) α(x) ≥ 0, and α(x) = 0 only for x = 0 (positivity);
(ii) α(rx) = |r| α(x) for all r ∈ R (homogeneity);
(iii) α(x + y) ≤ α(x) + α(y) (triangle inequality).

Notation. The norm of a vector x ∈ V is usually denoted ‖x‖. Different norms on V are distinguished by subscripts, e.g., ‖x‖_1 and ‖x‖_2.

Examples. V = R^n, x = (x_1, x_2, …, x_n) ∈ R^n.
‖x‖_∞ = max(|x_1|, |x_2|, …, |x_n|).
‖x‖_p = (|x_1|^p + |x_2|^p + … + |x_n|^p)^{1/p}, p ≥ 1.

Examples. V = C[a,b], f : [a,b] → R.
‖f‖_∞ = max_{a≤x≤b} |f(x)|.
‖f‖_p = (∫_a^b |f(x)|^p dx)^{1/p}, p ≥ 1.
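As a quick sanity check, the vector norms above can be computed directly. This is a minimal Python sketch; the helper names p_norm and max_norm are mine, not from the lecture.

```python
import math

def p_norm(x, p):
    """The l^p norm: ||x||_p = (|x_1|^p + ... + |x_n|^p)^(1/p), for p >= 1."""
    return sum(abs(xi) ** p for xi in x) ** (1.0 / p)

def max_norm(x):
    """The sup norm ||x||_inf = max |x_i| (the limit of ||x||_p as p -> infinity)."""
    return max(abs(xi) for xi in x)

x = [3.0, -4.0]
print(p_norm(x, 1))   # 7.0
print(p_norm(x, 2))   # 5.0  (the usual Euclidean length)
print(max_norm(x))    # 4.0
```

Note how the three norms assign different lengths to the same vector; all of them satisfy the three axioms above.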

Normed vector space

Definition. A normed vector space is a vector space endowed with a norm.

The norm defines a distance function on the normed vector space: dist(x,y) = ‖x − y‖. Then we say that a sequence x_1, x_2, … converges to a vector x if dist(x, x_n) → 0 as n → ∞. Also, we say that a vector x is a good approximation of a vector x_0 if dist(x, x_0) is small.

Inner product

The notion of inner product generalizes the notion of dot product of vectors in R^n.

Definition. Let V be a vector space. A function β : V × V → R, usually denoted β(x,y) = ⟨x,y⟩, is called an inner product on V if it is positive, symmetric, and bilinear. That is, if
(i) ⟨x,x⟩ ≥ 0, and ⟨x,x⟩ = 0 only for x = 0 (positivity);
(ii) ⟨x,y⟩ = ⟨y,x⟩ (symmetry);
(iii) ⟨rx,y⟩ = r⟨x,y⟩ (homogeneity);
(iv) ⟨x + y, z⟩ = ⟨x,z⟩ + ⟨y,z⟩ (distributive law).

An inner product space is a vector space endowed with an inner product.

Examples. V = R^n.
⟨x,y⟩ = x · y = x_1 y_1 + x_2 y_2 + … + x_n y_n.
⟨x,y⟩ = d_1 x_1 y_1 + d_2 x_2 y_2 + … + d_n x_n y_n, where d_1, d_2, …, d_n > 0.
⟨x,y⟩ = (Dx) · (Dy), where D is an invertible n×n matrix.

Problem. Find an inner product on R^2 such that ⟨e_1,e_1⟩ = 2, ⟨e_2,e_2⟩ = 3, and ⟨e_1,e_2⟩ = −1, where e_1 = (1,0), e_2 = (0,1).

Let x = (x_1, x_2), y = (y_1, y_2) ∈ R^2. Then x = x_1 e_1 + x_2 e_2 and y = y_1 e_1 + y_2 e_2. Using bilinearity, we obtain
⟨x,y⟩ = ⟨x_1 e_1 + x_2 e_2, y_1 e_1 + y_2 e_2⟩
= x_1 ⟨e_1, y_1 e_1 + y_2 e_2⟩ + x_2 ⟨e_2, y_1 e_1 + y_2 e_2⟩
= x_1 y_1 ⟨e_1,e_1⟩ + x_1 y_2 ⟨e_1,e_2⟩ + x_2 y_1 ⟨e_2,e_1⟩ + x_2 y_2 ⟨e_2,e_2⟩
= 2 x_1 y_1 − x_1 y_2 − x_2 y_1 + 3 x_2 y_2.

It remains to check that ⟨x,x⟩ > 0 for x ≠ 0:
⟨x,x⟩ = 2 x_1^2 − 2 x_1 x_2 + 3 x_2^2 = (x_1 − x_2)^2 + x_1^2 + 2 x_2^2.
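The constructed inner product can be verified numerically. A small Python check (the function name ip is mine): it confirms the prescribed values on e_1 and e_2, and spot-checks positivity on a few nonzero vectors.

```python
def ip(x, y):
    # the inner product derived above: <x,y> = 2x1y1 - x1y2 - x2y1 + 3x2y2
    return 2*x[0]*y[0] - x[0]*y[1] - x[1]*y[0] + 3*x[1]*y[1]

e1, e2 = (1, 0), (0, 1)
print(ip(e1, e1), ip(e2, e2), ip(e1, e2))   # 2 3 -1

# positivity: <x,x> = (x1 - x2)^2 + x1^2 + 2x2^2 > 0 whenever x != 0
for x in [(1, 0), (0, 1), (1, 1), (-2, 3), (0.5, -0.5)]:
    assert ip(x, x) > 0
```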

Example. V = M_{m,n}(R), the space of m×n matrices.
⟨A,B⟩ = trace(AB^T). If A = (a_ij) and B = (b_ij), then ⟨A,B⟩ = Σ_{i=1}^m Σ_{j=1}^n a_ij b_ij.

Examples. V = C[a,b].
⟨f,g⟩ = ∫_a^b f(x) g(x) dx.
⟨f,g⟩ = ∫_a^b f(x) g(x) w(x) dx, where w is bounded, piecewise continuous, and w > 0 everywhere on [a,b]; w is called the weight function.

Theorem. Suppose ⟨x,y⟩ is an inner product on a vector space V. Then
⟨x,y⟩^2 ≤ ⟨x,x⟩ ⟨y,y⟩ for all x, y ∈ V.

Proof: For any t ∈ R let v_t = x + t y. Then
⟨v_t, v_t⟩ = ⟨x,x⟩ + 2t ⟨x,y⟩ + t^2 ⟨y,y⟩.
The right-hand side is a quadratic polynomial in t (provided that y ≠ 0). Since ⟨v_t, v_t⟩ ≥ 0 for all t, the discriminant D is nonpositive. But D = 4⟨x,y⟩^2 − 4⟨x,x⟩⟨y,y⟩.

Cauchy–Schwarz Inequality: |⟨x,y⟩| ≤ ⟨x,x⟩^{1/2} ⟨y,y⟩^{1/2}.

Cauchy–Schwarz Inequality: |⟨x,y⟩| ≤ ⟨x,x⟩^{1/2} ⟨y,y⟩^{1/2}.

Corollary 1. |x · y| ≤ ‖x‖ ‖y‖ for all x, y ∈ R^n. Equivalently, for all x_i, y_i ∈ R,
(x_1 y_1 + … + x_n y_n)^2 ≤ (x_1^2 + … + x_n^2)(y_1^2 + … + y_n^2).

Corollary 2. For any f, g ∈ C[a,b],
(∫_a^b f(x) g(x) dx)^2 ≤ (∫_a^b f(x)^2 dx)(∫_a^b g(x)^2 dx).
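Corollary 1 is easy to stress-test numerically. This is a rough Python spot-check on random vectors, not part of the lecture; the tiny slack in the assertion allows for floating-point rounding.

```python
import random

def dot(x, y):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(x, y))

random.seed(0)
for _ in range(1000):
    x = [random.uniform(-1, 1) for _ in range(5)]
    y = [random.uniform(-1, 1) for _ in range(5)]
    # Cauchy-Schwarz in the squared form: (x.y)^2 <= (x.x)(y.y)
    assert dot(x, y) ** 2 <= dot(x, x) * dot(y, y) + 1e-12

print("Cauchy-Schwarz held on 1000 random pairs")
```

Equality holds exactly when one vector is a scalar multiple of the other, which is why random pairs almost always give a strict inequality.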

Norms induced by inner products

Theorem. Suppose ⟨x,y⟩ is an inner product on a vector space V. Then ‖x‖ = ⟨x,x⟩^{1/2} is a norm.

Proof: Positivity is obvious.
Homogeneity: ‖rx‖ = ⟨rx, rx⟩^{1/2} = (r^2 ⟨x,x⟩)^{1/2} = |r| ⟨x,x⟩^{1/2}.
Triangle inequality (follows from the Cauchy–Schwarz inequality):
‖x + y‖^2 = ⟨x+y, x+y⟩ = ⟨x,x⟩ + ⟨x,y⟩ + ⟨y,x⟩ + ⟨y,y⟩
≤ ⟨x,x⟩ + |⟨x,y⟩| + |⟨y,x⟩| + ⟨y,y⟩
≤ ‖x‖^2 + 2‖x‖‖y‖ + ‖y‖^2 = (‖x‖ + ‖y‖)^2.

Examples. The length of a vector in R^n,
‖x‖ = (x_1^2 + x_2^2 + … + x_n^2)^{1/2},
is the norm induced by the dot product
x · y = x_1 y_1 + x_2 y_2 + … + x_n y_n.

The norm ‖f‖_2 = (∫_a^b f(x)^2 dx)^{1/2} on the vector space C[a,b] is induced by the inner product
⟨f,g⟩ = ∫_a^b f(x) g(x) dx.

Angle

Since |⟨x,y⟩| ≤ ‖x‖ ‖y‖, we can define the angle between nonzero vectors in any vector space with an inner product (and induced norm):
∠(x,y) = arccos( ⟨x,y⟩ / (‖x‖ ‖y‖) ).
Then ⟨x,y⟩ = ‖x‖ ‖y‖ cos ∠(x,y).
In particular, vectors x and y are orthogonal (denoted x ⊥ y) if ⟨x,y⟩ = 0.

[Figure: right triangle with legs x and y and hypotenuse x + y.]

Pythagorean Law: x ⊥ y ⟹ ‖x + y‖^2 = ‖x‖^2 + ‖y‖^2.

Proof: ‖x + y‖^2 = ⟨x+y, x+y⟩ = ⟨x,x⟩ + ⟨x,y⟩ + ⟨y,x⟩ + ⟨y,y⟩ = ⟨x,x⟩ + ⟨y,y⟩ = ‖x‖^2 + ‖y‖^2.

[Figure: parallelogram with sides x and y and diagonals x + y and x − y.]

Parallelogram Identity: ‖x + y‖^2 + ‖x − y‖^2 = 2‖x‖^2 + 2‖y‖^2.

Proof: ‖x + y‖^2 = ⟨x+y, x+y⟩ = ⟨x,x⟩ + ⟨x,y⟩ + ⟨y,x⟩ + ⟨y,y⟩. Similarly, ‖x − y‖^2 = ⟨x,x⟩ − ⟨x,y⟩ − ⟨y,x⟩ + ⟨y,y⟩. Then
‖x + y‖^2 + ‖x − y‖^2 = 2⟨x,x⟩ + 2⟨y,y⟩ = 2‖x‖^2 + 2‖y‖^2.
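Both identities can be confirmed on concrete vectors. A small Python sketch with the standard dot product (all helper names are mine); the vectors are chosen so that x ⊥ y, as the Pythagorean law requires.

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def norm_sq(x):
    # squared induced norm: ||x||^2 = <x,x>
    return dot(x, x)

def add(x, y):
    return [a + b for a, b in zip(x, y)]

def sub(x, y):
    return [a - b for a, b in zip(x, y)]

x, y = [3.0, 0.0, 4.0], [0.0, 2.0, 0.0]
assert dot(x, y) == 0   # x is orthogonal to y

# Pythagorean law (needs x ⊥ y)
assert norm_sq(add(x, y)) == norm_sq(x) + norm_sq(y)

# Parallelogram identity (holds for ANY x, y)
assert norm_sq(add(x, y)) + norm_sq(sub(x, y)) == 2*norm_sq(x) + 2*norm_sq(y)
```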

Orthogonal sets

Let V be an inner product space with an inner product ⟨·,·⟩ and the induced norm ‖·‖.

Definition. A nonempty set S ⊂ V of nonzero vectors is called an orthogonal set if all vectors in S are mutually orthogonal. That is, 0 ∉ S and ⟨x,y⟩ = 0 for any x, y ∈ S, x ≠ y. An orthogonal set S ⊂ V is called orthonormal if ‖x‖ = 1 for any x ∈ S.

Remark. Vectors v_1, v_2, …, v_k ∈ V form an orthonormal set if and only if
⟨v_i, v_j⟩ = 1 if i = j, and ⟨v_i, v_j⟩ = 0 if i ≠ j.

Examples. V = R^n, ⟨x,y⟩ = x · y.
The standard basis e_1 = (1,0,0,…,0), e_2 = (0,1,0,…,0), …, e_n = (0,0,0,…,1) is an orthonormal set.

V = R^3, ⟨x,y⟩ = x · y. Let v_1 = (3,5,4), v_2 = (3,−5,4), v_3 = (4,0,−3). Then
v_1 · v_2 = 0, v_1 · v_3 = 0, v_2 · v_3 = 0,
v_1 · v_1 = 50, v_2 · v_2 = 50, v_3 · v_3 = 25.
Thus the set {v_1, v_2, v_3} is orthogonal but not orthonormal. An orthonormal set is formed by the normalized vectors
w_1 = v_1/‖v_1‖, w_2 = v_2/‖v_2‖, w_3 = v_3/‖v_3‖.
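The R^3 example above can be checked and normalized in a few lines of Python (the helper normalize is mine):

```python
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

v1, v2, v3 = [3, 5, 4], [3, -5, 4], [4, 0, -3]

# pairwise orthogonality
assert dot(v1, v2) == dot(v1, v3) == dot(v2, v3) == 0
# squared lengths: 50, 50, 25, so the set is not orthonormal
print(dot(v1, v1), dot(v2, v2), dot(v3, v3))   # 50 50 25

def normalize(v):
    """w = v / ||v||, so that ||w|| = 1."""
    n = math.sqrt(dot(v, v))
    return [a / n for a in v]

for w in map(normalize, (v1, v2, v3)):
    assert abs(dot(w, w) - 1.0) < 1e-12   # each w_i is now a unit vector
```

Normalizing does not disturb orthogonality, since ⟨cv, w⟩ = c⟨v, w⟩ = 0 whenever ⟨v, w⟩ = 0.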

V = C[−π, π], ⟨f,g⟩ = ∫_{−π}^{π} f(x) g(x) dx.
Let f_1(x) = sin x, f_2(x) = sin 2x, …, f_n(x) = sin nx, …
⟨f_m, f_n⟩ = ∫_{−π}^{π} sin(mx) sin(nx) dx = π if m = n, and 0 if m ≠ n.
Thus the set {f_1, f_2, f_3, …} is orthogonal but not orthonormal. It is orthonormal with respect to the scaled inner product
⟨f,g⟩ = (1/π) ∫_{−π}^{π} f(x) g(x) dx.
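These integrals can be observed numerically. A rough Python sketch using a simple midpoint rule (the helpers integral and ip are mine; the tolerances reflect the quadrature error, not the exact values):

```python
import math

def integral(f, a, b, n=100000):
    # midpoint rule; very accurate here since the integrands are smooth
    h = (b - a) / n
    return h * sum(f(a + (k + 0.5) * h) for k in range(n))

def ip(m, n):
    """Numerical <f_m, f_n> = integral of sin(mx) sin(nx) over [-pi, pi]."""
    return integral(lambda x: math.sin(m * x) * math.sin(n * x),
                    -math.pi, math.pi)

assert abs(ip(1, 2)) < 1e-6             # m != n: orthogonal
assert abs(ip(3, 3) - math.pi) < 1e-6   # m == n: <f_n, f_n> = pi, not 1
```

The value π (rather than 1) for ⟨f_n, f_n⟩ is exactly why the set is orthogonal but not orthonormal under the unscaled inner product.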

Orthogonality ⟹ linear independence

Theorem. Suppose v_1, v_2, …, v_k are nonzero vectors that form an orthogonal set. Then v_1, v_2, …, v_k are linearly independent.

Proof: Suppose t_1 v_1 + t_2 v_2 + … + t_k v_k = 0 for some t_1, t_2, …, t_k ∈ R. Then for any index 1 ≤ i ≤ k we have
⟨t_1 v_1 + t_2 v_2 + … + t_k v_k, v_i⟩ = ⟨0, v_i⟩ = 0,
hence t_1 ⟨v_1, v_i⟩ + t_2 ⟨v_2, v_i⟩ + … + t_k ⟨v_k, v_i⟩ = 0.
By orthogonality, this reduces to t_i ⟨v_i, v_i⟩ = 0, and since v_i ≠ 0 it follows that t_i = 0.

Orthonormal bases

Let v_1, v_2, …, v_n be an orthonormal basis for an inner product space V.

Theorem. Let x = x_1 v_1 + x_2 v_2 + … + x_n v_n and y = y_1 v_1 + y_2 v_2 + … + y_n v_n, where x_i, y_j ∈ R. Then
(i) ⟨x,y⟩ = x_1 y_1 + x_2 y_2 + … + x_n y_n;
(ii) ‖x‖ = (x_1^2 + x_2^2 + … + x_n^2)^{1/2}.

Proof: (ii) follows from (i) when y = x. As for (i),
⟨x,y⟩ = ⟨Σ_{i=1}^n x_i v_i, Σ_{j=1}^n y_j v_j⟩ = Σ_{i=1}^n Σ_{j=1}^n x_i y_j ⟨v_i, v_j⟩ = Σ_{i=1}^n x_i y_i.

Orthogonal projection

Theorem. Let V be an inner product space and V_0 a finite-dimensional subspace of V. Then any vector x ∈ V is uniquely represented as x = p + o, where p ∈ V_0 and o ⊥ V_0.

The component p is the orthogonal projection of the vector x onto the subspace V_0. We have
‖o‖ = ‖x − p‖ = min_{v ∈ V_0} ‖x − v‖.
That is, the distance from x to the subspace V_0 is ‖o‖.

[Figure: the vector x, its orthogonal projection p onto the subspace V_0, and the orthogonal component o = x − p.]

Let V be an inner product space. Let p be the orthogonal projection of a vector x ∈ V onto a finite-dimensional subspace V_0.

If V_0 is a one-dimensional subspace spanned by a vector v, then
p = (⟨x,v⟩ / ⟨v,v⟩) v.

If v_1, v_2, …, v_n is an orthogonal basis for V_0, then
p = (⟨x,v_1⟩/⟨v_1,v_1⟩) v_1 + (⟨x,v_2⟩/⟨v_2,v_2⟩) v_2 + … + (⟨x,v_n⟩/⟨v_n,v_n⟩) v_n.

Indeed,
⟨p, v_i⟩ = Σ_{j=1}^n (⟨x,v_j⟩/⟨v_j,v_j⟩) ⟨v_j, v_i⟩ = (⟨x,v_i⟩/⟨v_i,v_i⟩) ⟨v_i, v_i⟩ = ⟨x,v_i⟩,
so ⟨x − p, v_i⟩ = 0, that is, x − p ⊥ v_i for every i, and therefore x − p ⊥ V_0.
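The projection formula translates directly into code. A minimal Python sketch for R^n with the standard dot product (the basis and vector below are my own illustrative choices, not from the lecture):

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def project(x, basis):
    """Orthogonal projection of x onto span(basis), where the basis vectors
    are pairwise orthogonal: p = sum_j (<x,v_j>/<v_j,v_j>) v_j."""
    p = [0.0] * len(x)
    for v in basis:
        c = dot(x, v) / dot(v, v)
        p = [pi + c * vi for pi, vi in zip(p, v)]
    return p

# an orthogonal (not orthonormal) basis of the xy-plane inside R^3
basis = [[1.0, 1.0, 0.0], [1.0, -1.0, 0.0]]
x = [2.0, 3.0, 5.0]

p = project(x, basis)
o = [xi - pi for xi, pi in zip(x, p)]

# x - p is orthogonal to every basis vector, confirming p is the projection
assert all(abs(dot(o, v)) < 1e-12 for v in basis)
print(p)   # [2.0, 3.0, 0.0]
```

As expected, projecting onto the xy-plane keeps the first two coordinates and discards the third; ‖o‖ = 5 is the distance from x to the plane.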