Chapter 6. Orthogonality




6.3 Orthogonal Matrices

Definition 6.4. An n × n matrix A is orthogonal if A^T A = I.

Note. We will see that the columns of an orthogonal matrix must be unit vectors and that the columns of an orthogonal matrix are mutually orthogonal (inspiring a desire to call them "orthonormal matrices," but this is not standard terminology).

Theorem 6.5. Characterizing Properties of an Orthogonal Matrix.
Let A be an n × n matrix. The following conditions are equivalent:
1. The rows of A form an orthonormal basis for R^n.
2. The columns of A form an orthonormal basis for R^n.
3. The matrix A is orthogonal; that is, A is invertible and A^{-1} = A^T.
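The three equivalent conditions of Theorem 6.5 are easy to check numerically. The following sketch (assuming NumPy is available; the 2 × 2 rotation matrix and the angle 0.7 are illustrative choices, not from the notes) verifies all three for a standard example of an orthogonal matrix.

```python
import numpy as np

# A plane rotation is a standard example of an orthogonal matrix
# (the angle 0.7 is an arbitrary choice for illustration).
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Condition 3: A^T A = I, so A is invertible with A^{-1} = A^T.
assert np.allclose(A.T @ A, np.eye(2))
assert np.allclose(np.linalg.inv(A), A.T)

# Conditions 1 and 2: the rows of A and the columns of A are orthonormal.
assert np.allclose(A @ A.T, np.eye(2))   # rows are orthonormal
assert np.allclose(A.T @ A, np.eye(2))   # columns are orthonormal
```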

Proof. Suppose the columns of A are the vectors a_1, a_2, ..., a_n, so that the rows of A^T are a_1^T, a_2^T, ..., a_n^T. The (i, j) entry of the product A^T A is then a_i · a_j. Now A is orthogonal if and only if I = A^T A; the diagonal entries of the product give a_j · a_j = 1, and therefore each column is a unit vector. All off-diagonal entries of I are 0, and so for i ≠ j we have a_i · a_j = 0. Therefore the columns of A are orthonormal (and conversely, if the columns of A are orthonormal then A^T A = I). Now A^{-1} = A^T if and only if A is orthogonal, so A is orthogonal if and only if A A^T = I, that is, (A^T)^T A^T = I. So A is orthogonal if and only if A^T is orthogonal, and hence the rows of A (the columns of A^T) are orthonormal if and only if A is orthogonal.

Example. Page 358 number 2.

Theorem 6.6. Properties of Ax for an Orthogonal Matrix A.
Let A be an orthogonal n × n matrix and let x and y be any column vectors in R^n. Then
1. (Ax) · (Ay) = x · y,
2. ‖Ax‖ = ‖x‖, and
3. the angle between nonzero vectors x and y equals the angle between Ax and Ay.

Proof. Recall that x · y = x^T y. Then since A is orthogonal,
(Ax) · (Ay) = (Ax)^T (Ay) = x^T A^T A y = x^T I y = x^T y = x · y,
and the first property is established. For the second property, ‖Ax‖ = √((Ax) · (Ax)) = √(x · x) = ‖x‖. Since dot products and norms are preserved under multiplication by A, the angle
cos^{-1}( (x · y) / (√(x · x) √(y · y)) ) = cos^{-1}( ((Ax) · (Ay)) / (√((Ax) · (Ax)) √((Ay) · (Ay))) ),
which establishes the third property.
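All three properties of Theorem 6.6 can be illustrated numerically. In the sketch below (an illustration only; the random seed and the use of a QR factorization to produce an orthogonal matrix are my choices, not from the notes), a random 3 × 3 orthogonal matrix is checked against random vectors x and y.

```python
import numpy as np

rng = np.random.default_rng(0)

# Obtain an orthogonal matrix from the QR factorization of a random matrix:
# the factor Q has orthonormal columns, hence is orthogonal when square.
A, _ = np.linalg.qr(rng.standard_normal((3, 3)))
x = rng.standard_normal(3)
y = rng.standard_normal(3)

# 1. Dot products are preserved.
assert np.isclose((A @ x) @ (A @ y), x @ y)

# 2. Norms are preserved.
assert np.isclose(np.linalg.norm(A @ x), np.linalg.norm(x))

# 3. Angles are preserved.
def angle(u, v):
    return np.arccos(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

assert np.isclose(angle(x, y), angle(A @ x, A @ y))
```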

Example. Page 358 number 12.

Theorem 6.7. Orthogonality of Eigenspaces of a Real Symmetric Matrix.
Eigenvectors of a real symmetric matrix that correspond to different eigenvalues are orthogonal. That is, the eigenspaces of a real symmetric matrix are orthogonal.

Proof. Let A be an n × n symmetric matrix, and let v_1 and v_2 be eigenvectors corresponding to distinct eigenvalues λ_1 and λ_2, respectively. Then Av_1 = λ_1 v_1 and Av_2 = λ_2 v_2. We need to show that v_1 and v_2 are orthogonal. Notice that
λ_1 (v_1 · v_2) = (λ_1 v_1) · v_2 = (Av_1) · v_2 = (Av_1)^T v_2 = (v_1^T A^T) v_2.
Similarly, λ_2 (v_1 · v_2) = v_1 · (λ_2 v_2) = v_1 · (Av_2) = v_1^T A v_2. Since A is symmetric, A = A^T, and so λ_1 (v_1 · v_2) = λ_2 (v_1 · v_2), or (λ_1 - λ_2)(v_1 · v_2) = 0. Since λ_1 - λ_2 ≠ 0, it must be the case that v_1 · v_2 = 0, and hence v_1 and v_2 are orthogonal.
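Theorem 6.7 can be observed directly in a small computation. The sketch below (the particular symmetric tridiagonal matrix is an illustrative choice; `numpy.linalg.eigh` is NumPy's eigensolver for symmetric matrices) confirms that eigenvectors for different eigenvalues have dot product zero.

```python
import numpy as np

# A real symmetric matrix with distinct eigenvalues (chosen for illustration).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
assert np.allclose(A, A.T)

# eigh returns eigenvalues in ascending order and eigenvectors as columns of V.
eigenvalues, V = np.linalg.eigh(A)

# Eigenvectors belonging to different eigenvalues are orthogonal (Theorem 6.7).
for i in range(3):
    for j in range(3):
        if i != j:
            assert np.isclose(V[:, i] @ V[:, j], 0.0)
```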

Theorem 6.8. Fundamental Theorem of Real Symmetric Matrices.
Every real symmetric matrix A is diagonalizable. The diagonalization C^{-1} A C = D can be achieved by using a real orthogonal matrix C.

Proof. By Theorem 5.5, the characteristic polynomial of A has only real roots, and the algebraic multiplicity of each eigenvalue equals its geometric multiplicity. Therefore we can find a basis for R^n which consists of eigenvectors of A. Next, we can use the Gram-Schmidt process to create an orthonormal basis for each eigenspace. We know by Theorem 6.7 that the basis vectors from different eigenspaces are perpendicular, and so we have a basis of mutually orthogonal eigenvectors of unit length. As in Section 5.2, we make the matrix C by using these unit eigenvectors as its columns, and we have C^{-1} A C = D, where the diagonal matrix D consists of the eigenvalues of A. Since the columns of C form an orthonormal set, matrix C is a real orthogonal matrix, as claimed.
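The conclusion of Theorem 6.8 is exactly what `numpy.linalg.eigh` computes for a symmetric matrix: an orthogonal C whose columns are unit eigenvectors, and a diagonal D of eigenvalues. A minimal sketch (the 2 × 2 matrix is an illustrative choice, not an example from the notes):

```python
import numpy as np

# Orthogonally diagonalize a real symmetric matrix, as in Theorem 6.8.
A = np.array([[4.0, 1.0],
              [1.0, 4.0]])

eigenvalues, C = np.linalg.eigh(A)   # columns of C are orthonormal eigenvectors
D = np.diag(eigenvalues)

# C is orthogonal, so C^{-1} = C^T, and C^T A C = D.
assert np.allclose(C.T @ C, np.eye(2))
assert np.allclose(C.T @ A @ C, D)
```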

Note. The converse of Theorem 6.8 is also true: if D = C^{-1} A C is a diagonal matrix and C is an orthogonal matrix, then A is symmetric (see Exercise 24). The equation D = C^{-1} A C is called the orthogonal diagonalization of A.

Example. Page 355 Example 4.

Definition 6.5. A linear transformation T : R^n → R^n is orthogonal if it satisfies T(v) · T(w) = v · w for all v, w ∈ R^n.

Theorem 6.9. Orthogonal Transformations vis-à-vis Matrices.
A linear transformation T of R^n into itself is orthogonal if and only if its standard matrix representation A is an orthogonal matrix.

Proof. Suppose A is orthogonal. Then by Theorem 6.6, T(v) · T(w) = (Av) · (Aw) = v · w for all v, w ∈ R^n, so T preserves dot products and hence is orthogonal. Conversely, suppose T is orthogonal. By Theorem 3.10, the columns of A are T(e_1), T(e_2), ..., T(e_n), where e_j is the jth unit coordinate vector of R^n. We have

T(e_i) · T(e_j) = e_i · e_j = { 0 if i ≠ j, 1 if i = j },

and so the columns of A form an orthonormal basis of R^n. By Theorem 6.5, A is an orthogonal matrix.

Example. Page 359 number 34.
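The two ingredients of Theorem 6.9 can be sketched concretely: build the standard matrix A of a transformation column by column from the images T(e_j), then check that A is orthogonal and that T preserves dot products. The reflection T used here is a hypothetical example of my choosing, not one from the notes.

```python
import numpy as np

# Reflection across the line y = x in R^2, an orthogonal transformation
# (a hypothetical example chosen for illustration).
def T(v):
    return np.array([v[1], v[0]])

# Per Theorem 3.10, the columns of the standard matrix A are T(e_1), T(e_2).
e = np.eye(2)
A = np.column_stack([T(e[:, 0]), T(e[:, 1])])

# A is orthogonal, so T preserves dot products (Theorem 6.9).
assert np.allclose(A.T @ A, np.eye(2))

rng = np.random.default_rng(1)
v, w = rng.standard_normal(2), rng.standard_normal(2)
assert np.isclose(T(v) @ T(w), v @ w)
```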