5. Orthogonal matrices




L. Vandenberghe, EE133A (Spring 2016)

5. Orthogonal matrices

- matrices with orthonormal columns
- orthogonal matrices
- tall matrices with orthonormal columns
- complex matrices with orthonormal columns

Orthonormal vectors

A set of real m-vectors {a_1, a_2, ..., a_n} is orthonormal if

- the vectors have unit norm: ||a_i|| = 1
- they are mutually orthogonal: a_i^T a_j = 0 if i ≠ j

Example:

  a_1 = (0, 0, 1),   a_2 = (1/√2)(1, 1, 0),   a_3 = (1/√2)(1, −1, 0)
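
A quick numerical check of the two defining properties, sketched in NumPy (the three vectors are the example from this slide):

```python
import numpy as np

# The three example vectors: (0,0,1), (1/sqrt(2))(1,1,0), (1/sqrt(2))(1,-1,0)
a1 = np.array([0.0, 0.0, 1.0])
a2 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
a3 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)

vectors = [a1, a2, a3]
# unit norm: ||a_i|| = 1
norms = [np.linalg.norm(a) for a in vectors]
# mutual orthogonality: a_i^T a_j = 0 for i != j
inner = [a1 @ a2, a1 @ a3, a2 @ a3]
```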

Matrix with orthonormal columns

A ∈ R^(m×n) has orthonormal columns if its Gram matrix is the identity:

  A^T A = [a_1 a_2 ⋯ a_n]^T [a_1 a_2 ⋯ a_n]

        = [ a_1^T a_1  a_1^T a_2  ⋯  a_1^T a_n ]     [ 1 0 ⋯ 0 ]
          [ a_2^T a_1  a_2^T a_2  ⋯  a_2^T a_n ]  =  [ 0 1 ⋯ 0 ]
          [     ⋮          ⋮              ⋮    ]     [ ⋮ ⋮    ⋮ ]
          [ a_n^T a_1  a_n^T a_2  ⋯  a_n^T a_n ]     [ 0 0 ⋯ 1 ]

There is no standard short name for a matrix with orthonormal columns.

Matrix-vector product

If A ∈ R^(m×n) has orthonormal columns, then the linear function f(x) = Ax

- preserves inner products: (Ax)^T (Ay) = x^T A^T A y = x^T y
- preserves norms: ||Ax|| = ((Ax)^T (Ax))^(1/2) = (x^T x)^(1/2) = ||x||
- preserves distances: ||Ax − Ay|| = ||x − y||
- preserves angles:

    ∠(Ax, Ay) = arccos( (Ax)^T (Ay) / (||Ax|| ||Ay||) )
              = arccos( x^T y / (||x|| ||y||) )
              = ∠(x, y)
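
These four properties can be verified numerically; a small sketch, where A is a 3×2 matrix with orthonormal columns and the test points x, y are arbitrary choices for illustration:

```python
import numpy as np

# A 3x2 matrix with orthonormal columns (two orthonormal 3-vectors)
A = np.column_stack([np.array([0.0, 0.0, 1.0]),
                     np.array([1.0, 1.0, 0.0]) / np.sqrt(2)])

x = np.array([1.0, 2.0])
y = np.array([-3.0, 0.5])

def angle(u, v):
    # angle between nonzero vectors: arccos of the normalized inner product
    return np.arccos((u @ v) / (np.linalg.norm(u) * np.linalg.norm(v)))
```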

Left invertibility

If A ∈ R^(m×n) has orthonormal columns, then

- A is left invertible with left inverse A^T: by definition, A^T A = I
- A has linearly independent columns (from page 4-24 or page 5-2):

    Ax = 0  ⟹  A^T A x = x = 0

- A is tall or square: m ≥ n (see page 4-13)


Orthogonal matrix

Orthogonal matrix: a square real matrix with orthonormal columns

Nonsingularity (from the equivalences on page 4-14): if A is orthogonal, then

- A is invertible, with inverse A^T:

    A^T A = I and A square  ⟹  A A^T = I

- A^T is also an orthogonal matrix
- the rows of A are orthonormal (have norm one and are mutually orthogonal)

Note: if A ∈ R^(m×n) has orthonormal columns and m > n, then A A^T ≠ I.
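
A sketch of these facts in NumPy, using a 2×2 rotation as the orthogonal matrix (the angle is an arbitrary choice), plus the tall counterexample in the note:

```python
import numpy as np

# a 2x2 orthogonal matrix (here, a rotation)
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

ATA = A.T @ A                          # = I (orthonormal columns)
AAT = A @ A.T                          # = I as well, because A is square
row_norms = np.linalg.norm(A, axis=1)  # rows are orthonormal too

# tall matrix with orthonormal columns: A^T A = I but A A^T != I
B = A[:, :1]                           # 2x1, one orthonormal column
```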

Permutation matrix

Let π = (π_1, π_2, ..., π_n) be a permutation (reordering) of (1, 2, ..., n).

We associate with π the n×n permutation matrix A:

  A_{i,π_i} = 1,   A_{ij} = 0 if j ≠ π_i

- Ax is a permutation of the elements of x: Ax = (x_{π_1}, x_{π_2}, ..., x_{π_n})
- A has exactly one element equal to 1 in each row and each column

Orthogonality: permutation matrices are orthogonal, A^T A = I, because A has exactly one element equal to one in each row and each column:

  (A^T A)_{ij} = Σ_{k=1}^n A_{ki} A_{kj} = { 1 if i = j
                                           { 0 otherwise

A^T = A^{-1} is the inverse permutation matrix.

Example

Permutation on {1, 2, 3, 4}: (π_1, π_2, π_3, π_4) = (2, 4, 1, 3)

Corresponding permutation matrix and its inverse:

  A = [ 0 1 0 0 ]        A^{-1} = A^T = [ 0 0 1 0 ]
      [ 0 0 0 1 ]                       [ 1 0 0 0 ]
      [ 1 0 0 0 ]                       [ 0 0 0 1 ]
      [ 0 0 1 0 ]                       [ 0 1 0 0 ]

A^T is the permutation matrix associated with the inverse permutation (3, 1, 4, 2).
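
The example can be reproduced in a few lines; a sketch that builds A from π = (2, 4, 1, 3) using the rule A_{i,π_i} = 1:

```python
import numpy as np

# permutation pi = (2, 4, 1, 3) of (1, 2, 3, 4)
pi = [2, 4, 1, 3]
n = len(pi)

# A[i, pi_i] = 1 (0-based indices internally)
A = np.zeros((n, n))
for i, p in enumerate(pi):
    A[i, p - 1] = 1.0

x = np.array([10.0, 20.0, 30.0, 40.0])
Ax = A @ x   # (x_2, x_4, x_1, x_3)
```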

Plane rotation

Rotation in a plane:

  A = [ cos θ  −sin θ ]
      [ sin θ   cos θ ]

(figure: Ax is the vector x rotated over the angle θ)

Rotation in a coordinate plane in R^n: for example,

  A = [ cos θ  0  −sin θ ]
      [   0    1    0    ]
      [ sin θ  0   cos θ ]

describes a rotation in the (x_1, x_3) plane in R^3.
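
A sketch of the 3×3 coordinate-plane rotation, with θ = π/2 chosen so the effect is easy to see: x_2 is untouched while (x_1, x_3) is rotated by a quarter turn:

```python
import numpy as np

# rotation in the (x1, x3) plane of R^3
theta = np.pi / 2
A = np.array([[np.cos(theta), 0.0, -np.sin(theta)],
              [0.0,           1.0,  0.0],
              [np.sin(theta), 0.0,  np.cos(theta)]])

x = np.array([1.0, 5.0, 0.0])
Ax = A @ x   # x2 unchanged; (x1, x3) = (1, 0) rotated to (0, 1)
```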

Reflector

  A = I − 2 a a^T,  with a a unit-norm vector (||a|| = 1)

- a reflector matrix is symmetric and orthogonal:

    A^T A = (I − 2 a a^T)(I − 2 a a^T) = I − 4 a a^T + 4 a a^T a a^T = I

- Ax is the reflection of x through the hyperplane H = {u | a^T u = 0}

(figure: the line through a and the origin, the hyperplane H, the point x, its projection y = x − (a^T x) a = (I − a a^T) x on H, and its reflection z = y + (y − x) = (I − 2 a a^T) x; see page 2-35)
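
A numerical sketch of a reflector (the particular vectors a and x are arbitrary choices): reflecting x through H flips the component of x along a and leaves the component in H unchanged.

```python
import numpy as np

# unit-norm vector a and the reflector A = I - 2 a a^T
a = np.array([1.0, 2.0, 2.0])
a = a / np.linalg.norm(a)
A = np.eye(3) - 2.0 * np.outer(a, a)

x = np.array([1.0, 0.0, 1.0])
z = A @ x                  # reflection of x through H = {u | a^T u = 0}
y = x - (a @ x) * a        # projection of x on H
```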

Product of orthogonal matrices

If A_1, ..., A_k are orthogonal matrices of equal size, then the product A = A_1 A_2 ⋯ A_k is orthogonal:

  A^T A = (A_1 A_2 ⋯ A_k)^T (A_1 A_2 ⋯ A_k) = A_k^T ⋯ A_2^T A_1^T A_1 A_2 ⋯ A_k = I

Linear equation with orthogonal matrix

Linear equation with orthogonal coefficient matrix A of size n×n:

  Ax = b

- the solution is x = A^{-1} b = A^T b
- it can be computed in 2n^2 flops by matrix-vector multiplication
- the cost is less than order n^2 if A has special properties; for example,

    permutation matrix:    0 flops
    reflector (given a):   order n flops
    plane rotation:        order 1 flops
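
A sketch of the point above: with an orthogonal A, one matrix-vector product A^T b gives the same solution as a general linear solver, with no factorization needed (A here is a plane rotation, chosen for illustration):

```python
import numpy as np

# orthogonal coefficient matrix (a plane rotation) and right-hand side
theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
b = np.array([2.0, -1.0])

x = A.T @ b   # solution of Ax = b in one matrix-vector product
```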


Tall matrix with orthonormal columns

Suppose A ∈ R^(m×n) is tall (m > n) and has orthonormal columns:

- A^T is a left inverse of A: A^T A = I
- A has no right inverse; in particular, A A^T ≠ I

On the next pages, we give a geometric interpretation of the matrix A A^T.

Range

The span of a set of vectors is the set of all their linear combinations:

  span(a_1, a_2, ..., a_n) = { x_1 a_1 + x_2 a_2 + ⋯ + x_n a_n | x ∈ R^n }

The range of a matrix A ∈ R^(m×n) is the span of its column vectors:

  range(A) = { Ax | x ∈ R^n }

Example: for

  A = [ 1 0 1 ]
      [ 1 1 2 ]
      [ 0 1 1 ]

  range(A) = { (x_1 + x_3, x_1 + x_2 + 2x_3, x_2 + x_3) | x_1, x_2, x_3 ∈ R }
           = { (u, u + v, v) | u, v ∈ R }

Projection on range of matrix with orthonormal columns

Suppose A ∈ R^(m×n) has orthonormal columns; we will show that the vector A A^T b is the orthogonal projection of an m-vector b on range(A):

  x̂ = A^T b  satisfies  ||A x̂ − b|| < ||A x − b||  for all x ≠ x̂

(figure: the point b and its projection A x̂ = A A^T b on range(A) = {Ax | x ∈ R^n})

This extends the result on page 2-11 (where A = (1/||a||) a).

Proof

The squared distance of b to an arbitrary point Ax in range(A) is (with x̂ = A^T b)

  ||Ax − b||^2 = ||A(x − x̂) + A x̂ − b||^2
               = ||A(x − x̂)||^2 + ||A x̂ − b||^2 + 2 (x − x̂)^T A^T (A x̂ − b)
               = ||A(x − x̂)||^2 + ||A x̂ − b||^2
               = ||x − x̂||^2 + ||A x̂ − b||^2
               ≥ ||A x̂ − b||^2

with equality only if x = x̂.

- line 3 follows because A^T (A x̂ − b) = x̂ − A^T b = 0
- line 4 follows from A^T A = I
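
The minimality claim can be checked numerically; a sketch where A is a 3×2 matrix with orthonormal columns (the particular columns, b, and the random probe points are choices made for illustration):

```python
import numpy as np

# 3x2 matrix with orthonormal columns
A = np.column_stack([np.array([1.0, 1.0, 0.0]) / np.sqrt(2),
                     np.array([0.0, 0.0, 1.0])])
b = np.array([1.0, 2.0, 3.0])

xhat = A.T @ b
proj = A @ xhat   # orthogonal projection of b on range(A)

# distance from b to the projection vs. to 100 other points in range(A)
rng = np.random.default_rng(0)
dists = [np.linalg.norm(A @ (xhat + d) - b)
         for d in rng.standard_normal((100, 2))]
```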

Orthogonal decomposition

The vector b is decomposed as a sum b = z + y with z ∈ range(A) and y ⊥ range(A):

  z = A A^T b,   y = b − A A^T b

(figure: b, its component z = A A^T b in range(A), and y = b − A A^T b orthogonal to range(A))

Such a decomposition exists and is unique for every b:

  b = Ax + y, A^T y = 0  ⟹  x = A^T b,  y = b − A A^T b
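
A sketch of the decomposition (same kind of 3×2 example matrix; b is an arbitrary choice): z = A A^T b lies in range(A) and the remainder y is orthogonal to every column of A.

```python
import numpy as np

# 3x2 matrix with orthonormal columns
A = np.column_stack([np.array([1.0, 1.0, 0.0]) / np.sqrt(2),
                     np.array([0.0, 0.0, 1.0])])
b = np.array([3.0, -1.0, 2.0])

z = A @ (A.T @ b)   # component in range(A)
y = b - z           # component orthogonal to range(A)
```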


Gram matrix

A ∈ C^(m×n) has orthonormal columns if its Gram matrix is the identity:

  A^H A = [a_1 a_2 ⋯ a_n]^H [a_1 a_2 ⋯ a_n]

        = [ a_1^H a_1  a_1^H a_2  ⋯  a_1^H a_n ]     [ 1 0 ⋯ 0 ]
          [ a_2^H a_1  a_2^H a_2  ⋯  a_2^H a_n ]  =  [ 0 1 ⋯ 0 ]
          [     ⋮          ⋮              ⋮    ]     [ ⋮ ⋮    ⋮ ]
          [ a_n^H a_1  a_n^H a_2  ⋯  a_n^H a_n ]     [ 0 0 ⋯ 1 ]

- the columns have unit norm: ||a_i||^2 = a_i^H a_i = 1
- the columns are mutually orthogonal: a_i^H a_j = 0 for i ≠ j

Unitary matrix

Unitary matrix: a square complex matrix with orthonormal columns is called unitary.

Inverse:

  A^H A = I and A square  ⟹  A A^H = I

- a unitary matrix is nonsingular, with inverse A^H
- if A is unitary, then A^H is unitary

Discrete Fourier transform matrix

Recall the definition from page 3-33 (with ω = e^(2πj/n) and j = √(−1)):

  W = [ 1    1           1             ⋯  1               ]
      [ 1    ω^(−1)      ω^(−2)        ⋯  ω^(−(n−1))      ]
      [ 1    ω^(−2)      ω^(−4)        ⋯  ω^(−2(n−1))     ]
      [ ⋮    ⋮           ⋮                ⋮               ]
      [ 1    ω^(−(n−1))  ω^(−2(n−1))   ⋯  ω^(−(n−1)(n−1)) ]

The matrix (1/√n) W is unitary (proof on the next page):

  (1/n) W^H W = (1/n) W W^H = I

- the inverse of W is W^{-1} = (1/n) W^H
- the inverse discrete Fourier transform of an n-vector x is W^{-1} x = (1/n) W^H x

Gram matrix of DFT matrix

We show that W^H W = n I. The conjugate transpose of W is

  W^H = [ 1    1          1            ⋯  1              ]
        [ 1    ω          ω^2          ⋯  ω^(n−1)        ]
        [ 1    ω^2        ω^4          ⋯  ω^(2(n−1))     ]
        [ ⋮    ⋮          ⋮               ⋮              ]
        [ 1    ω^(n−1)    ω^(2(n−1))   ⋯  ω^((n−1)(n−1)) ]

The i, j element of the Gram matrix is

  (W^H W)_{ij} = 1 + ω^(i−j) + ω^(2(i−j)) + ⋯ + ω^((n−1)(i−j))

so the diagonal elements are (W^H W)_{ii} = n, and for i ≠ j,

  (W^H W)_{ij} = (ω^(n(i−j)) − 1) / (ω^(i−j) − 1) = 0

(the last step follows from ω^n = 1).
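
As a numerical sketch of the two slides above (with n = 4 chosen for illustration): the entries of W are e^(−2πj·ik/n), so W x matches NumPy's np.fft.fft, (1/√n) W is unitary, and (1/n) W^H inverts the transform.

```python
import numpy as np

# DFT matrix W with entries omega^(-i*k), omega = exp(2*pi*j/n)
n = 4
omega = np.exp(2j * np.pi / n)
k = np.arange(n)
W = omega ** (-np.outer(k, k))

U = W / np.sqrt(n)              # unitary: U^H U = I
x = np.array([1.0, 2.0, 3.0, 4.0])
y = W @ x                       # DFT of x
xrec = (W.conj().T @ y) / n     # inverse DFT: W^{-1} = (1/n) W^H
```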