18.03 LA.6: Diagonalization and Orthogonal Matrices

[1] Diagonal factorization
[2] Solving systems of first order differential equations
[3] Symmetric and Orthonormal Matrices

[1] Diagonal factorization

Recall: if Ax = λx, then the system ẏ = Ay has a general solution of the form

    y = c_1 e^{λ_1 t} x_1 + c_2 e^{λ_2 t} x_2,

where the λ_i are eigenvalues with corresponding eigenvectors x_i.

I'm never going to see eigenvectors without putting them into a matrix. And I'm never going to see eigenvalues without putting them into a matrix. Let's look at an example from last class.

    A = [ 5  2 ]
        [ 2  5 ]

We found that this had eigenvectors

    [ 1 ]        [  1 ]
    [ 1 ]  and   [ -1 ].

I'm going to form a matrix out of these eigenvectors called the eigenvector matrix S:

    S = [ 1   1 ]
        [ 1  -1 ]

Then let's look at what happens when we multiply AS, and see that we can factor this into S and a diagonal matrix Λ:

    [ 5  2 ] [ 1   1 ]   [ 7   3 ]   [ 1   1 ] [ 7  0 ]
    [ 2  5 ] [ 1  -1 ] = [ 7  -3 ] = [ 1  -1 ] [ 0  3 ]
        A        S                       S         Λ

We call the matrix Λ with the eigenvalues λ_i on the diagonal the eigenvalue matrix. So we see that AS = SΛ, but we can multiply both sides on the right by S^{-1} and we get a factorization A = SΛS^{-1}. We've factored A into 3 pieces.
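As a sanity check, the factorization just described can be reproduced numerically. This is a short NumPy sketch (not part of the original notes); `np.linalg.eig` returns the eigenvalues and an eigenvector matrix S whose columns are the eigenvectors:

```python
import numpy as np

# The example matrix from the lecture.
A = np.array([[5.0, 2.0],
              [2.0, 5.0]])

# eig returns the eigenvalues and the eigenvector matrix S (eigenvectors as columns).
eigvals, S = np.linalg.eig(A)
Lam = np.diag(eigvals)

# Verify AS = SΛ and the factorization A = S Λ S^{-1}.
print(np.allclose(A @ S, S @ Lam))                 # True
print(np.allclose(A, S @ Lam @ np.linalg.inv(S)))  # True
print(np.allclose(sorted(eigvals), [3.0, 7.0]))    # True: eigenvalues 3 and 7
```

Note that `eig` normalizes the eigenvectors to unit length, so S here is the lecture's eigenvector matrix up to a scaling of each column.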

Properties of Diagonalization

    A^2 = SΛS^{-1} SΛS^{-1} = SΛ^2 S^{-1}

    A^{-1} = (SΛS^{-1})^{-1} = (S^{-1})^{-1} Λ^{-1} S^{-1} = SΛ^{-1} S^{-1}

Diagonal matrices are easy to square and invert because you simply square or invert the elements along the diagonal!

[2] Solving systems of first order differential equations

The entire reason we are finding eigenvectors is to solve differential equations. Let's express our solution to the differential equation in terms of S and Λ:

    y = [ x_1  x_2 ] [ e^{λ_1 t}     0     ] [ c_1 ]
                     [     0     e^{λ_2 t} ] [ c_2 ]
      =       S              e^{Λt}             c

What determines c? Suppose we have an initial condition y(0). Plugging t = 0 into our vector equation above we can solve for c:

    y(0) = S I c
    S^{-1} y(0) = c

The first line simply expresses our initial condition as a linear combination of the eigenvectors, y(0) = c_1 x_1 + c_2 x_2. The second equation just multiplies the first by S^{-1} on both sides to solve for c in terms of y(0) and S, which we know, or can compute from what we know.

Steps for solving a differential equation

Step 0. Find λ_i and x_i.

Step. Use the initial condition to compute the parameters: c = S y(0) Step 2. Multiply c by e Λt and S: y = Se Λt S y(0). [3 Symmetric and Orthonormal Matrices In our example, we saw that A was symmetric (A = A T ) implied that the eigenvectors were perpendicular, or orthogonal. Perpendicular and orthogonal are two words that mean the same thing. Now, the eigenvectors we chose [ [ and had length 2. If we make them unit length, we can choose eigenvectors that are both orthogonal and unit length. This is called orthonormal. Question: Are the unit length vectors also eigenvectors? [ [ / 2 / and 2 / 2 / 2 Yes! If Ax = λx, then A x x = λ x x. It turns out that finding the inverse of a matrix whose columns are orthonormal is extremely easy! All you have to do is take the transpose! 3

Claim. If S has orthonormal columns, then S^{-1} = S^T.

Example

    [ cos(θ)  -sin(θ) ] [  cos(θ)  sin(θ) ]   [ 1  0 ]
    [ sin(θ)   cos(θ) ] [ -sin(θ)  cos(θ) ] = [ 0  1 ]
            S                  S^T                I

If the inverse exists, it is unique, so S^T must be the inverse!

If we set θ = π/4 in the rotation matrix we get

    (1/√2) [ 1  -1 ]
           [ 1   1 ],

but what we found was

    (1/√2) [ 1   1 ]
           [ 1  -1 ].

Fortunately we can multiply the second column by -1, and it is still an eigenvector. So in the 2 by 2 case, we can always choose the eigenvectors of a symmetric matrix so that the eigenvector matrix is not only orthonormal, but also so that it is a rotation matrix!

In general, a set of vectors x_1, ..., x_n is said to be orthonormal if the dot product of any vector with itself is 1:

    x_i · x_i = x_i^T x_i = 1,

and the dot product of any two distinct vectors is zero:

    x_i · x_j = x_i^T x_j = 0  when i ≠ j.

This tells us that the matrix product:
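A quick numerical check of the claim, using the rotation-matrix example (a NumPy sketch, not part of the original notes):

```python
import numpy as np

theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Orthonormal columns mean R^T R = I, so the transpose is the inverse.
print(np.allclose(R.T @ R, np.eye(2)))     # True
print(np.allclose(np.linalg.inv(R), R.T))  # True

# Flipping the sign of the second column turns the θ = π/4 rotation into the
# eigenvector matrix (1/√2)[[1, 1], [1, -1]] found for A = [[5, 2], [2, 5]].
S = R.copy()
S[:, 1] *= -1
print(np.allclose(S, np.array([[1, 1], [1, -1]]) / np.sqrt(2)))  # True
```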

    [ x_1^T ]                     [ x_1^T x_1  x_1^T x_2  x_1^T x_3 ]   [ 1  0  0 ]
    [ x_2^T ] [ x_1  x_2  x_3 ] = [ x_2^T x_1  x_2^T x_2  x_2^T x_3 ] = [ 0  1  0 ]
    [ x_3^T ]                     [ x_3^T x_1  x_3^T x_2  x_3^T x_3 ]   [ 0  0  1 ]

is the identity: S^T S = I.

Example

We've seen that 2 by 2 orthonormal eigenvector matrices can be chosen to be rotation matrices. Let's look at a 3 by 3 rotation matrix:

    S = (1/3) [  2  -1   2 ]
              [  2   2  -1 ]
              [ -1   2   2 ]

As an exercise, check that the dot product of any two distinct columns is zero, and that the dot product of any column with itself is one. This is a particularly nice matrix because there are no square roots! And it is also a rotation matrix, but a rotation in 3 dimensions.

Find a symmetric matrix A whose eigenvector matrix is S. All we have to do is choose any Λ with real entries along the diagonal, and then A = SΛS^T is symmetric!

Recall that (AB)^T = B^T A^T. We can use this to check that this A is in fact symmetric:

    A^T = (SΛS^T)^T = (S^T)^T Λ^T S^T = SΛS^T = A

This works because transposing a matrix twice returns the original matrix, and transposing a diagonal matrix does nothing!

In physics and engineering this is called the principal axis theorem. In math, it is the spectral theorem.
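The recipe A = SΛS^T can be checked numerically. The sketch below is not from the notes: it uses one arrangement of the ±1/3 and ±2/3 entries that has determinant +1 (so it is a rotation, with no square roots), and an arbitrary choice Λ = diag(1, 2, 3):

```python
import numpy as np

# A 3 by 3 orthogonal matrix with entries that are thirds: no square roots.
S = np.array([[ 2, -1,  2],
              [ 2,  2, -1],
              [-1,  2,  2]]) / 3.0

print(np.allclose(S.T @ S, np.eye(3)))    # True: columns are orthonormal
print(np.isclose(np.linalg.det(S), 1.0))  # True: determinant +1, a rotation

# Any real diagonal Λ makes A = S Λ S^T symmetric, with the diagonal entries
# of Λ as its eigenvalues (1, 2, 3 here is an arbitrary choice).
Lam = np.diag([1.0, 2.0, 3.0])
A = S @ Lam @ S.T
print(np.allclose(A, A.T))                                           # True
print(np.allclose(sorted(np.linalg.eigvalsh(A)), [1.0, 2.0, 3.0]))   # True
```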

Why is it called the principal axis theorem? An ellipsoid whose principal axes are along the standard x, y, and z axes can be written as the equation

    ax^2 + by^2 + cz^2 = 1,

which in matrix form is

    [ x  y  z ] [ a  0  0 ] [ x ]
                [ 0  b  0 ] [ y ] = 1.
                [ 0  0  c ] [ z ]

However, for what you would consider a general ellipsoid, the 3 principal directions can point in any direction. They are still orthogonal directions, though! And this means that we can get back to the standard basis elements by applying a rotation matrix S whose columns are orthonormal. Thus our equation for a general ellipsoid is:

    (   [ x ] )^T [ a  0  0 ] (   [ x ] )
    ( S [ y ] )   [ 0  b  0 ] ( S [ y ] ) = 1
    (   [ z ] )   [ 0  0  c ] (   [ z ] )

that is,

    [ x  y  z ] S^T [ a  0  0 ]   [ x ]
                    [ 0  b  0 ] S [ y ] = 1.
                    [ 0  0  c ]   [ z ]
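The change of variables can be verified numerically. The sketch below is not from the notes: it uses an arbitrary rotation about the z-axis and arbitrary values a = 1, b = 2, c = 3:

```python
import numpy as np

# Principal-axis form: a x^2 + b y^2 + c z^2 = 1, with D = diag(a, b, c).
a, b, c = 1.0, 2.0, 3.0   # arbitrary positive values (not from the notes)
D = np.diag([a, b, c])

# A rotation S (here about the z-axis; the angle is an arbitrary choice).
t = 0.7
S = np.array([[np.cos(t), -np.sin(t), 0.0],
              [np.sin(t),  np.cos(t), 0.0],
              [0.0,        0.0,       1.0]])

# For any point v, (S v)^T D (S v) = v^T (S^T D S) v: the rotated ellipsoid's
# matrix is M = S^T D S, still symmetric, with the same a, b, c as eigenvalues.
M = S.T @ D @ S
v = np.array([0.3, -1.2, 0.5])
print(np.isclose((S @ v) @ D @ (S @ v), v @ M @ v))       # True
print(np.allclose(M, M.T))                                # True
print(np.allclose(np.linalg.eigvalsh(M), [1.0, 2.0, 3.0]))  # True
```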

M.I.T. 18.03 Ordinary Differential Equations. 18.03 Extra Notes and Exercises © Haynes Miller, David Jerison, Jennifer French and M.I.T., 2013