Inner product. Definition of inner product




Math 20F Linear Algebra, Lecture 25

Inner product

Review: definition of inner product; norm and distance; orthogonal vectors; orthogonal complement; orthogonal basis.

Definition of inner product

Definition 1 (Inner product). Let $V$ be a vector space over $\mathbb{R}$. An inner product $(\cdot,\cdot)$ is a function $V \times V \to \mathbb{R}$ with the following properties:

1. For all $u \in V$, $(u,u) \ge 0$, and $(u,u) = 0 \iff u = 0$;
2. For all $u, v \in V$, $(u,v) = (v,u)$;
3. For all $u, v, w \in V$ and all $a, b \in \mathbb{R}$, $(au + bv, w) = a(u,w) + b(v,w)$.

Notation: $V$ together with $(\cdot,\cdot)$ is called an inner product space.

Examples

The Euclidean inner product in $\mathbb{R}^2$. Let $V = \mathbb{R}^2$ and let $\{e_1, e_2\}$ be the standard basis. Given two arbitrary vectors $x = x_1 e_1 + x_2 e_2$ and $y = y_1 e_1 + y_2 e_2$,
$$(x,y) = x_1 y_1 + x_2 y_2.$$
Notice that $(e_1,e_1) = 1$, $(e_2,e_2) = 1$, and $(e_1,e_2) = 0$. This inner product is also called the dot product, and is denoted $x \cdot y$.

The Euclidean inner product in $\mathbb{R}^n$. Let $V = \mathbb{R}^n$ and let $\{e_i\}_{i=1}^n$ be the standard basis. Given two arbitrary vectors $x = \sum_{i=1}^n x_i e_i$ and $y = \sum_{i=1}^n y_i e_i$,
$$(x,y) = \sum_{i=1}^n x_i y_i.$$
Notice that $(e_i,e_j) = I_{ij}$, the entries of the identity matrix.

An inner product in the vector space of continuous functions on $[0,1]$, denoted $V = C([0,1])$, is defined as follows. Given two arbitrary vectors $f(x)$ and $g(x)$, introduce the inner product
$$(f,g) = \int_0^1 f(x)\,g(x)\,dx.$$

An inner product in the vector space of functions with one continuous derivative on $[0,1]$, denoted $V = C^1([0,1])$, is defined as follows. Given two arbitrary vectors $f(x)$ and $g(x)$,
$$(f,g) = \int_0^1 \left[f(x)\,g(x) + f'(x)\,g'(x)\right] dx.$$
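As a quick numerical check of these examples, here is a minimal sketch in Python (assuming NumPy and SciPy are available; the helper names `ip_C` and `ip_C1` are mine, not from the lecture):

```python
import numpy as np
from scipy.integrate import quad

# Euclidean inner product in R^n: (x, y) = sum_i x_i y_i
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -1.0, 2.0])
print(np.dot(x, y))  # 1*4 + 2*(-1) + 3*2 = 8.0

# Inner product on C([0,1]): (f, g) = integral_0^1 f(x) g(x) dx
def ip_C(f, g):
    value, _ = quad(lambda t: f(t) * g(t), 0.0, 1.0)
    return value

print(ip_C(lambda t: t, lambda t: t**2))  # integral of t^3 on [0,1] = 0.25

# Inner product on C^1([0,1]): (f, g) = integral_0^1 [f g + f' g'] dx,
# with the derivatives passed in explicitly
def ip_C1(f, fp, g, gp):
    value, _ = quad(lambda t: f(t) * g(t) + fp(t) * gp(t), 0.0, 1.0)
    return value

print(ip_C1(lambda t: t, lambda t: 1.0,
            lambda t: t, lambda t: 1.0))  # 1/3 + 1 = 1.3333...
```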

Norm

An inner product space induces a norm, that is, a notion of length of a vector.

Definition 2 (Norm). Let $V, (\cdot,\cdot)$ be an inner product space. The norm, or length, is the function $V \to \mathbb{R}$ denoted $\|\cdot\|$ and defined as
$$\|u\| = \sqrt{(u,u)}.$$

Example: The Euclidean norm in $\mathbb{R}^2$ is given by
$$\|x\| = \sqrt{(x,x)} = \sqrt{(x_1)^2 + (x_2)^2}.$$

Examples

The Euclidean norm in $\mathbb{R}^n$ is given by
$$\|x\| = \sqrt{(x,x)} = \sqrt{(x_1)^2 + \cdots + (x_n)^2}.$$

A norm on the space of continuous functions $V = C([0,1])$ is given by
$$\|f\| = \sqrt{(f,f)} = \sqrt{\int_0^1 [f(x)]^2\,dx}.$$
For example, one can check that the length of $f(x) = \sqrt{3}\,x$ is 1, since $\int_0^1 3x^2\,dx = 1$.
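The function-space example can be verified numerically; a small sketch, again with NumPy/SciPy and a helper name of my choosing:

```python
import numpy as np
from scipy.integrate import quad

# Norm induced by (f, g) = integral_0^1 f g dx:  ||f|| = sqrt((f, f))
def norm_C(f):
    value, _ = quad(lambda t: f(t) ** 2, 0.0, 1.0)
    return np.sqrt(value)

# ||sqrt(3) x|| = sqrt(integral_0^1 3 x^2 dx) = sqrt(1) = 1
print(norm_C(lambda t: np.sqrt(3.0) * t))  # 1.0

# Euclidean norm in R^2 for comparison
u = np.array([3.0, 4.0])
print(np.sqrt(np.dot(u, u)))  # 5.0
```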

Distance

A norm on a vector space, in turn, induces a notion of distance between two vectors, defined as the length of their difference.

Definition 3 (Distance). Let $V, (\cdot,\cdot)$ be an inner product space, and let $\|\cdot\|$ be its associated norm. The distance between $u$ and $v \in V$ is given by
$$\mathrm{dist}(u,v) = \|u - v\|.$$

Example: The Euclidean distance between two points $x$ and $y \in \mathbb{R}^3$ is
$$\|x - y\| = \sqrt{(x_1 - y_1)^2 + (x_2 - y_2)^2 + (x_3 - y_3)^2}.$$

Orthogonal vectors

Theorem 1. Let $V$ be an inner product space and $u, v \in V$. Then
$$\|u + v\| = \|u - v\| \iff (u,v) = 0.$$

Proof: Expanding both norms,
$$\|u + v\|^2 = (u + v, u + v) = \|u\|^2 + \|v\|^2 + 2(u,v),$$
$$\|u - v\|^2 = (u - v, u - v) = \|u\|^2 + \|v\|^2 - 2(u,v),$$
so
$$\|u + v\|^2 - \|u - v\|^2 = 4(u,v).$$
Hence the left-hand side vanishes if and only if $(u,v) = 0$.
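The identity $\|u+v\|^2 - \|u-v\|^2 = 4(u,v)$ from the proof is easy to test numerically; a small sketch (the random vectors and the concrete pair below are arbitrary choices of mine):

```python
import numpy as np

# Check ||u+v||^2 - ||u-v||^2 = 4 (u, v) on random vectors in R^3
rng = np.random.default_rng(0)
u, v = rng.standard_normal(3), rng.standard_normal(3)

lhs = np.linalg.norm(u + v) ** 2 - np.linalg.norm(u - v) ** 2
rhs = 4 * np.dot(u, v)
print(np.isclose(lhs, rhs))  # True

# For orthogonal u, v the two diagonals of the parallelogram have equal length
u, v = np.array([1.0, 2.0, 0.0]), np.array([-2.0, 1.0, 0.0])
print(np.linalg.norm(u + v), np.linalg.norm(u - v))  # both sqrt(10)
```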

Math 20F Linear Algebra, Lecture 26

Orthogonal vectors

Definition 4 (Orthogonal vectors). Let $V, (\cdot,\cdot)$ be an inner product space. Two vectors $u, v \in V$ are orthogonal, or perpendicular, if and only if
$$(u,v) = 0.$$
We call them orthogonal because the diagonals of the parallelogram formed by $u$ and $v$ then have the same length.

Theorem 2. Let $V$ be an inner product space and let $u, v \in V$ be orthogonal vectors. Then
$$\|u + v\|^2 = \|u\|^2 + \|v\|^2.$$

Example

The vectors $u = [1, 2]^T$ and $v = [2, -1]^T$ in $\mathbb{R}^2$ are orthogonal with respect to the inner product $(u,v) = u_1 v_1 + u_2 v_2$, because
$$(u,v) = 2 - 2 = 0.$$

The vectors $\cos(x), \sin(x) \in C([0, 2\pi])$ are orthogonal with respect to the inner product $(f,g) = \int_0^{2\pi} f g\,dx$, because
$$(\cos(x), \sin(x)) = \int_0^{2\pi} \sin(x)\cos(x)\,dx = \frac{1}{2}\int_0^{2\pi} \sin(2x)\,dx = -\frac{1}{4}\cos(2x)\Big|_0^{2\pi} = 0.$$
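Both orthogonality examples can be checked in a few lines; a minimal sketch assuming NumPy and SciPy:

```python
import numpy as np
from scipy.integrate import quad

# (cos, sin) = integral_0^{2 pi} cos(x) sin(x) dx should vanish
value, _ = quad(lambda x: np.cos(x) * np.sin(x), 0.0, 2.0 * np.pi)
print(abs(value) < 1e-12)  # True

# The R^2 example: u = [1, 2]^T and v = [2, -1]^T
u, v = np.array([1.0, 2.0]), np.array([2.0, -1.0])
print(np.dot(u, v))  # 2 - 2 = 0.0
```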

Review: orthogonal vectors; orthogonal projection along a vector; orthogonal bases; orthogonal projection onto a subspace; Gram-Schmidt orthogonalization process.

Review of orthogonal vectors

Definition 5 (Orthogonal vectors). Let $V, (\cdot,\cdot)$ be an inner product vector space. Two vectors $u, v \in V$ are orthogonal, or perpendicular, if and only if $(u,v) = 0$.

Theorem 3. Let $V, (\cdot,\cdot)$ be an inner product vector space. Then
$$u, v \in V \text{ are orthogonal} \iff \|u + v\| = \|u - v\| \iff \|u + v\|^2 = \|u\|^2 + \|v\|^2.$$

Orthogonal projection along a vector

Fix $V, (\cdot,\cdot)$ and $u \in V$ with $u \ne 0$. Can any vector $x \in V$ be decomposed into parts orthogonal with respect to $u$, that is, $x = \hat{x} + x^\perp$ with $(\hat{x}, x^\perp) = 0$ and $\hat{x} = cu$? Is this decomposition unique?

Theorem 4 (Orthogonal decomposition along a vector). Let $V, (\cdot,\cdot)$ be an inner product vector space and let $u \in V$, with $u \ne 0$. Then any vector $x \in V$ can be uniquely decomposed as
$$x = \hat{x} + x^\perp, \qquad \hat{x} = \frac{(x,u)}{\|u\|^2}\,u, \qquad x^\perp = x - \hat{x}.$$
Therefore $\hat{x}$ is proportional to $u$, and $(\hat{x}, x^\perp) = 0$.

Proof: Introduce $\hat{x} = cu$ and write $x = cu + x^\perp$. The condition $(\hat{x}, x^\perp) = 0$ implies $(u, x^\perp) = 0$, so
$$(x,u) = c(u,u), \qquad c = \frac{(x,u)}{\|u\|^2},$$
and then $\hat{x} = \frac{(x,u)}{\|u\|^2}\,u$ and $x^\perp = x - \hat{x}$. The decomposition is unique: given a second decomposition $x = \hat{y} + y^\perp$ with $\hat{y} = du$ and $(\hat{y}, y^\perp) = 0$, then $(u, y^\perp) = 0$ and $cu + x^\perp = du + y^\perp$. Taking the inner product with $u$ gives $c = d$, and then $x^\perp = y^\perp$.
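A minimal NumPy sketch of Theorem 4 (the function name `project_along` is mine):

```python
import numpy as np

def project_along(x, u):
    """Orthogonal decomposition x = x_hat + x_perp along u != 0,
    with x_hat = ((x, u) / ||u||^2) u  (Theorem 4)."""
    x_hat = (np.dot(x, u) / np.dot(u, u)) * u
    return x_hat, x - x_hat

x = np.array([3.0, 1.0])
u = np.array([1.0, 1.0])
x_hat, x_perp = project_along(x, u)
print(x_hat)                  # [2. 2.], proportional to u
print(x_perp)                 # [ 1. -1.]
print(np.dot(x_hat, x_perp))  # 0.0: the two parts are orthogonal
```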

Orthogonal bases

Definition 6. Let $V, (\cdot,\cdot)$ be an $n$-dimensional inner product vector space, and let $\{u_1, \dots, u_n\}$ be a basis of $V$. The basis is orthogonal $\iff$ $(u_i, u_j) = 0$ for all $i \ne j$. The basis is orthonormal $\iff$ it is orthogonal and, in addition, $\|u_i\| = 1$ for all $i$, where $i, j = 1, \dots, n$.

Theorem 5. Let $V, (\cdot,\cdot)$ be an $n$-dimensional inner product vector space, and let $\{u_1, \dots, u_n\}$ be an orthogonal basis. Then any $x \in V$ can be written as
$$x = c_1 u_1 + \cdots + c_n u_n, \qquad c_i = \frac{(x, u_i)}{\|u_i\|^2}, \quad i = 1, \dots, n.$$

Proof: The set $\{u_1, \dots, u_n\}$ is a basis, so there exist coefficients $c_i$ such that $x = c_1 u_1 + \cdots + c_n u_n$. The basis is orthogonal, so taking the inner product of this expression with $u_i$, and recalling that $(u_i, u_j) = 0$ for all $i \ne j$, one gets $(x, u_i) = c_i (u_i, u_i)$. The $u_i$ are nonzero, so $(u_i, u_i) = \|u_i\|^2 \ne 0$, hence $c_i = (x, u_i)/\|u_i\|^2$.
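A minimal numerical sketch of Theorem 5 (the function name is mine): it expands a vector in an orthogonal basis of $\mathbb{R}^3$ via $c_i = (x, u_i)/\|u_i\|^2$. Truncating the sum to the first $p$ basis vectors gives the orthogonal projection onto their span, which is exactly the decomposition on the next slide:

```python
import numpy as np

def orthogonal_coefficients(x, basis):
    """Coefficients of x in an orthogonal basis: c_i = (x, u_i) / ||u_i||^2."""
    return [np.dot(x, u) / np.dot(u, u) for u in basis]

# An orthogonal (not orthonormal) basis of R^3
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
u3 = np.array([0.0, 0.0, 2.0])
x = np.array([2.0, 3.0, 5.0])

c = orthogonal_coefficients(x, [u1, u2, u3])
print(c)                            # [2.5, -0.5, 2.5]
print(c[0]*u1 + c[1]*u2 + c[2]*u3)  # recovers x = [2. 3. 5.]

# Truncating to the first p = 2 vectors projects x onto span{u1, u2}
x_hat = c[0]*u1 + c[1]*u2
print(x_hat, x - x_hat)             # [2. 3. 0.] [0. 0. 5.]
```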

Orthogonal projections onto subspaces

Notice: to write $x$ in an orthogonal basis is to perform an orthogonal decomposition of $x$ along each basis vector. (All of this holds for vector spaces of functions.)

Theorem 6. Let $V, (\cdot,\cdot)$ be an $n$-dimensional inner product vector space, and let $W \subset V$ be a $p$-dimensional subspace with orthogonal basis $\{u_1, \dots, u_p\}$. Then any $x \in V$ can be decomposed as $x = \hat{x} + x^\perp$ with $(\hat{x}, x^\perp) = 0$ and
$$\hat{x} = c_1 u_1 + \cdots + c_p u_p, \qquad c_i = \frac{(x, u_i)}{\|u_i\|^2}, \quad i = 1, \dots, p.$$

Gram-Schmidt orthogonalization process

Orthogonal bases are convenient for carrying out computations. Around 1900, Jørgen Gram and Erhard Schmidt standardized a process for computing an orthogonal basis from an arbitrary basis. (They needed it for vector spaces of functions; Laplace had already used the process on $\mathbb{R}^n$ around 1800.)

Math 20F Linear Algebra, Lecture 27

Gram-Schmidt orthogonalization process

Theorem 7. Let $V, (\cdot,\cdot)$ be an inner product vector space, and let $\{u_1, \dots, u_n\}$ be an arbitrary basis of $V$. Then an orthogonal basis of $V$ is given by the vectors $\{v_1, \dots, v_n\}$, where
$$v_1 = u_1,$$
$$v_2 = u_2 - \frac{(u_2, v_1)}{\|v_1\|^2}\,v_1,$$
$$v_3 = u_3 - \frac{(u_3, v_1)}{\|v_1\|^2}\,v_1 - \frac{(u_3, v_2)}{\|v_2\|^2}\,v_2,$$
$$\vdots$$
$$v_n = u_n - \sum_{i=1}^{n-1} \frac{(u_n, v_i)}{\|v_i\|^2}\,v_i.$$

Least-squares approximation

Review: Gram-Schmidt orthogonalization process. Least-squares approximation: definition, normal equation, examples.
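Before turning to least squares, here is a minimal sketch of the process in Theorem 7 (the function name `gram_schmidt` is mine); it subtracts from each $u_k$ its projections along the previously constructed $v_i$:

```python
import numpy as np

def gram_schmidt(basis):
    """Gram-Schmidt (Theorem 7): turn an arbitrary basis {u_1,...,u_n}
    into an orthogonal basis {v_1,...,v_n} by subtracting, from each u_k,
    its projections along the previously built v_i."""
    ortho = []
    for u in basis:
        v = np.array(u, dtype=float)
        for w in ortho:
            v -= (np.dot(u, w) / np.dot(w, w)) * w
        ortho.append(v)
    return ortho

u1, u2, u3 = np.array([1, 1, 0]), np.array([1, 0, 1]), np.array([0, 1, 1])
v1, v2, v3 = gram_schmidt([u1, u2, u3])
# All pairwise inner products vanish (up to round-off):
print(np.dot(v1, v2), np.dot(v1, v3), np.dot(v2, v3))
```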

Review: Gram-Schmidt orthogonalization process

Theorem 8 restates Theorem 7: given an arbitrary basis $\{u_1, \dots, u_n\}$ of an inner product vector space $V$, the vectors
$$v_1 = u_1, \qquad v_k = u_k - \sum_{i=1}^{k-1} \frac{(u_k, v_i)}{\|v_i\|^2}\,v_i, \quad k = 2, \dots, n,$$
form an orthogonal basis of $V$.

Least-squares approximation

Let $V, W$ be vector spaces and let $A : V \to W$ be linear. Given $b \in W$, the linear equation $Ax = b$ either has a solution $x$ or it has no solutions. Suppose now that there is an inner product on $W$, say $(\cdot,\cdot)_W$, with associated norm $\|\cdot\|_W$. Then there is a notion of approximate solution, given as follows.

Least-squares approximation

Definition 7 (Approximate solution). Let $V, W$ be vector spaces and let $(\cdot,\cdot)_W$, $\|\cdot\|_W$ be an inner product and its associated norm on $W$. Let $A : V \to W$ be linear, and let $b \in W$ be an arbitrary vector. An approximate solution to the linear equation
$$Ax = b$$
is a vector $\hat{x} \in V$ such that
$$\|A\hat{x} - b\|_W \le \|Ax - b\|_W \quad \text{for all } x \in V.$$

Remark: $\hat{x}$ is also called a least-squares approximation, because $\hat{x}$ minimizes the number
$$\|Ax - b\|_W^2 = [(Ax)_1 - b_1]^2 + \cdots + [(Ax)_m - b_m]^2$$
over $x \in V$, where $m = \dim(W)$.

Theorem 9. Let $V, W$ be vector spaces and let $(\cdot,\cdot)_W$, $\|\cdot\|_W$ be an inner product and its associated norm on $W$. Let $A : V \to W$ be linear, and let $b \in W$ be an arbitrary vector. If $\hat{x} \in V$ satisfies
$$(A\hat{x} - b) \perp \mathrm{Range}(A),$$
then $\hat{x}$ is a least-squares solution of $Ax = b$.

Proof: For all $x \in V$,
$$Ax - b = A\hat{x} - b + Ax - A\hat{x} = (A\hat{x} - b) + A(x - \hat{x}).$$
The two terms on the right-hand side are orthogonal by hypothesis, since $A(x - \hat{x}) \in \mathrm{Range}(A)$, so the Pythagorean theorem gives
$$\|Ax - b\|_W^2 = \|A\hat{x} - b\|_W^2 + \|A(x - \hat{x})\|_W^2 \ge \|A\hat{x} - b\|_W^2,$$
so $\hat{x}$ is a least-squares solution.

Theorem 10. Let $\mathbb{R}^n, (\cdot,\cdot)_n$ and $\mathbb{R}^m, (\cdot,\cdot)_m$ be the Euclidean inner product spaces and let $A : \mathbb{R}^n \to \mathbb{R}^m$ be linear, identified with an $m \times n$ matrix. Fix $b \in \mathbb{R}^m$. Then
$$\hat{x} \in \mathbb{R}^n \text{ is a solution of } A^T A\hat{x} = A^T b \iff (A\hat{x} - b) \perp \mathrm{Col}(A).$$

Proof: Let $\hat{x}$ be such that $A^T A\hat{x} = A^T b$. Then
$$A^T (A\hat{x} - b) = 0 \iff (a_i, A\hat{x} - b)_m = 0 \text{ for every column vector } a_i \text{ of } A,$$
where we used the notation $A = [a_1, \dots, a_n]$. Therefore the condition $A^T A\hat{x} = A^T b$ is equivalent to $(A\hat{x} - b) \perp \mathrm{Col}(A)$.

The previous two results can be summarized as follows.

Theorem 11. Let $\mathbb{R}^n, (\cdot,\cdot)_n$ and $\mathbb{R}^m, (\cdot,\cdot)_m$ be the Euclidean inner product spaces and let $A : \mathbb{R}^n \to \mathbb{R}^m$ be linear, identified with an $m \times n$ matrix. Fix $b \in \mathbb{R}^m$. If $\hat{x} \in \mathbb{R}^n$ is a solution of $A^T A\hat{x} = A^T b$, then $\hat{x}$ is a least-squares solution of $Ax = b$.
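A small NumPy illustration of Theorems 10 and 11: solving the normal equation $A^T A\hat{x} = A^T b$ for an overdetermined system (the line-fitting data here is made up for illustration):

```python
import numpy as np

# Least-squares solution of Ax = b via the normal equation
# A^T A x_hat = A^T b. Example: fit a line y = c0 + c1 t to
# three points that are not collinear.
t = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 0.0, 3.0])
A = np.column_stack([np.ones_like(t), t])  # 3 x 2, so Ax = b is overdetermined

x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)  # least-squares coefficients [c0, c1] = [1/3, 1]

# The residual A x_hat - b is orthogonal to Col(A), as Theorem 10 predicts:
print(A.T @ (A @ x_hat - b))  # ~ [0, 0]

# Same answer from NumPy's built-in least-squares routine:
print(np.linalg.lstsq(A, b, rcond=None)[0])
```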