Lecture notes on orthogonal matrices (with exercises), 92.222 - Linear Algebra II, Spring 2004, by D. Klain




1. Orthogonal matrices and orthonormal sets

An $n \times n$ real-valued matrix $A$ is said to be an orthogonal matrix if

$$A^T A = I, \qquad (1)$$

or, equivalently, if $A^T = A^{-1}$.

If we view the matrix $A$ as a family of column vectors,

$$A = \begin{bmatrix} A_1 & A_2 & \cdots & A_n \end{bmatrix},$$

then

$$A^T A = \begin{bmatrix} A_1^T \\ A_2^T \\ \vdots \\ A_n^T \end{bmatrix}
\begin{bmatrix} A_1 & A_2 & \cdots & A_n \end{bmatrix}
= \begin{bmatrix}
A_1^T A_1 & A_1^T A_2 & \cdots & A_1^T A_n \\
A_2^T A_1 & A_2^T A_2 & \cdots & A_2^T A_n \\
\vdots & \vdots & \ddots & \vdots \\
A_n^T A_1 & A_n^T A_2 & \cdots & A_n^T A_n
\end{bmatrix}.$$

So condition (1) asserts that $A$ is an orthogonal matrix iff

$$A_i^T A_j = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{if } i \neq j, \end{cases}$$

that is, iff the columns of $A$ form an orthonormal set of vectors.
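A quick numerical way to test condition (1) is to compare $A^T A$ with the identity, which amounts to checking the columns pairwise. The following Python sketch (using NumPy; the helper name and the test matrices are illustrative choices, not part of the original notes) does exactly that.

```python
import numpy as np

def is_orthogonal(A, tol=1e-12):
    """Return True if A^T A = I, i.e. the columns of A form an orthonormal set."""
    A = np.asarray(A, dtype=float)
    if A.ndim != 2 or A.shape[0] != A.shape[1]:
        return False
    return bool(np.allclose(A.T @ A, np.eye(A.shape[0]), atol=tol))

# A matrix with orthonormal columns (a plane rotation) passes the test...
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(is_orthogonal(R))             # True

# ...while a shear, whose columns are not orthonormal, does not.
print(is_orthogonal([[1.0, 1.0],
                     [0.0, 1.0]]))  # False
```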

Orthogonal matrices are also characterized by the following theorem.

Theorem 1. Suppose that $A$ is an $n \times n$ matrix. The following statements are equivalent:

1. $A$ is an orthogonal matrix.
2. $\|AX\| = \|X\|$ for all $X \in \mathbb{R}^n$.
3. $AX \cdot AY = X \cdot Y$ for all $X, Y \in \mathbb{R}^n$.

In other words, a matrix $A$ is orthogonal iff $A$ preserves distances and iff $A$ preserves dot products.

Proof: We will prove that 1. $\Rightarrow$ 2. $\Rightarrow$ 3. $\Rightarrow$ 1.

1. $\Rightarrow$ 2.: Suppose that $A$ is orthogonal, so that $A^T A = I$. For all column vectors $X \in \mathbb{R}^n$, we have

$$\|AX\|^2 = (AX)^T AX = X^T A^T A X = X^T I X = X^T X = \|X\|^2,$$

so that $\|AX\| = \|X\|$.

2. $\Rightarrow$ 3.: Suppose that $A$ is a square matrix such that $\|AX\| = \|X\|$ for all $X \in \mathbb{R}^n$. Then, for all $X, Y \in \mathbb{R}^n$, we have

$$\|X + Y\|^2 = (X + Y)^T (X + Y) = X^T X + 2 X^T Y + Y^T Y = \|X\|^2 + 2 X^T Y + \|Y\|^2,$$

and similarly,

$$\|A(X + Y)\|^2 = \|AX + AY\|^2 = (AX + AY)^T (AX + AY) = \|AX\|^2 + 2 (AX)^T AY + \|AY\|^2.$$

Since $\|AX\| = \|X\|$, $\|AY\| = \|Y\|$, and $\|A(X + Y)\| = \|X + Y\|$, it follows that $(AX)^T AY = X^T Y$. In other words, $AX \cdot AY = X \cdot Y$.

3. $\Rightarrow$ 1.: Suppose that $A$ is a square matrix such that $AX \cdot AY = X \cdot Y$ for all $X, Y \in \mathbb{R}^n$. Let $e_i$ denote the $i$-th standard basis vector of $\mathbb{R}^n$, and let $A_i$ denote the $i$-th column of $A$, as above. Then

$$A_i^T A_j = (A e_i)^T A e_j = A e_i \cdot A e_j = e_i \cdot e_j = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{if } i \neq j, \end{cases}$$

so that the columns of $A$ are an orthonormal set, and $A$ is an orthogonal matrix. $\square$

We conclude this section by observing two useful properties of orthogonal matrices.

Proposition 2. Suppose that $A$ and $B$ are orthogonal matrices.

1. $AB$ is an orthogonal matrix.
2. Either $\det(A) = 1$ or $\det(A) = -1$.

The proof is left to the exercises. Note: the converse of part 2 is false. There exist matrices with determinant $\pm 1$ that are not orthogonal.

2. The n-Reflections Theorem

Recall that if $u \in \mathbb{R}^n$ is a unit vector and $W = u^\perp$, then $H = I - 2uu^T$ is the reflection matrix for the subspace $W$. Since reflections preserve distances, it follows from Theorem 1 that $H$ must be an orthogonal matrix. (You can also verify condition (1) directly.) We also showed earlier in the course that $H = H^{-1} = H^T$ and $H^2 = I$.

It turns out that every orthogonal matrix can be expressed as a product of reflection matrices.
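As a small illustration of this reflection construction (a sketch with an arbitrarily chosen vector $u$; the function name is mine, not from the notes), the Python snippet below builds $H = I - 2uu^T$ and verifies the properties just listed.

```python
import numpy as np

def reflection_matrix(u):
    """Build H = I - 2 u u^T, the reflection across the hyperplane u-perp."""
    u = np.asarray(u, dtype=float)
    u = u / np.linalg.norm(u)          # H is a reflection only when u is a unit vector
    return np.eye(u.size) - 2.0 * np.outer(u, u)

H = reflection_matrix([1.0, 2.0, 2.0])

print(bool(np.allclose(H.T @ H, np.eye(3))))   # H satisfies condition (1): orthogonal
print(bool(np.allclose(H, H.T)))               # H = H^T
print(bool(np.allclose(H @ H, np.eye(3))))     # H^2 = I, so H = H^{-1}
print(bool(np.isclose(np.linalg.det(H), -1)))  # det H = -1 (Proposition 4 below)
```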

Theorem 3 (n-Reflections Theorem). Let $A$ be an $n \times n$ orthogonal matrix. There exist $n \times n$ reflection matrices $H_1, H_2, \ldots, H_k$ such that

$$A = H_1 H_2 \cdots H_k, \quad \text{where } 0 \le k \le n.$$

In other words, every $n \times n$ orthogonal matrix can be expressed as a product of at most $n$ reflections.

Proof: The theorem is trivial in dimension 1. Assume it holds in dimension $n - 1$.

For the $n$-dimensional case, let $z = A e_n$, and let $H$ be a reflection of $\mathbb{R}^n$ that exchanges $z$ and $e_n$. Then $H A e_n = H z = e_n$, so $HA$ fixes $e_n$. Moreover, $HA$ is also an orthogonal matrix by Proposition 2, so $HA$ preserves distances and angles. In particular, if we view $\mathbb{R}^{n-1}$ as the hyperplane $e_n^\perp$, then $HA$ must map $\mathbb{R}^{n-1}$ to itself. By the induction assumption, $HA$ must be expressible as a product of at most $n - 1$ reflections of $\mathbb{R}^{n-1}$, which extend (along the $e_n$ direction) to reflections of $\mathbb{R}^n$ as well. In other words, either $HA = I$ or $HA = H_2 \cdots H_k$, where $k \le n$. Setting $H_1 = H$, and keeping in mind that $HH = I$ (since $H$ is a reflection!), we have

$$A = HHA = H_1 H_2 \cdots H_k. \qquad \square$$

Proposition 4. If $H$ is a reflection matrix, then $\det H = -1$.

Proof: See Exercises.

Corollary 5. If $A$ is an orthogonal matrix and $A = H_1 H_2 \cdots H_k$, then $\det A = (-1)^k$.

So an orthogonal matrix $A$ has determinant equal to $+1$ iff $A$ is a product of an even number of reflections.
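To see Corollary 5 numerically, the sketch below (an illustrative check with randomly chosen unit vectors, not part of the original notes) multiplies together $k$ Householder reflections and confirms that the product is orthogonal with determinant $(-1)^k$.

```python
import numpy as np

rng = np.random.default_rng(0)

def reflection_matrix(u):
    """Householder reflection H = I - 2 u u^T for a unit vector u."""
    u = np.asarray(u, dtype=float)
    u = u / np.linalg.norm(u)
    return np.eye(u.size) - 2.0 * np.outer(u, u)

n, k = 4, 3
A = np.eye(n)
for _ in range(k):
    A = A @ reflection_matrix(rng.standard_normal(n))

print(bool(np.allclose(A.T @ A, np.eye(n))))        # a product of reflections is orthogonal
print(bool(np.isclose(np.linalg.det(A), (-1)**k)))  # det A = (-1)^k, as in Corollary 5
```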

3. Classifying 2 × 2 Orthogonal Matrices

Suppose that $A$ is a $2 \times 2$ orthogonal matrix. We know from the first section that the columns of $A$ are unit vectors and that the two columns are perpendicular (orthonormal!). Any unit vector $u$ in the plane $\mathbb{R}^2$ lies on the unit circle centered at the origin, and so can be expressed in the form $u = (\cos\theta, \sin\theta)$ for some angle $\theta$. So we can describe the first column of $A$ as follows:

$$A = \begin{bmatrix} \cos\theta & ? \\ \sin\theta & ? \end{bmatrix}$$

What are the possibilities for the second column? Since the second column must be a unit vector perpendicular to the first column, there remain only two choices:

$$A = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}
\quad \text{or} \quad
A = \begin{bmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{bmatrix}$$

The first case is the rotation matrix, which rotates $\mathbb{R}^2$ counterclockwise around the origin by the angle $\theta$. The second case is a reflection across the line that makes an angle $\theta/2$ with the $x$-axis (measured counterclockwise). But which is which? You can check that the rotation matrix (on the left) has determinant $1$, while the reflection matrix (on the right) has determinant $-1$. This is consistent with Proposition 4 and Corollary 5, since a rotation of $\mathbb{R}^2$ can always be expressed as a product of two reflections (how?).

4. Classifying 3 × 3 Orthogonal Matrices

The n-Reflections Theorem (Theorem 3) leads to a complete description of the $3 \times 3$ orthogonal matrices. In particular, a $3 \times 3$ orthogonal matrix must be a product of 0, 1, 2, or 3 reflections.

Theorem 6. Let $A$ be a $3 \times 3$ orthogonal matrix.

1. If $\det A = 1$, then $A$ is a rotation matrix.
2. If $\det A = -1$ and $A^T = A$, then either $A = -I$ or $A$ is a reflection matrix.
3. If $\det A = -1$ and $A^T \neq A$, then $A$ is a product of 3 reflections (that is, $A$ is a non-trivial rotation followed by a reflection).

Proof: By Theorem 3, $A$ is a product of 0, 1, 2, or 3 reflections. Note that if $A = H_1 \cdots H_k$, then $\det A = (-1)^k$.

If $\det A = 1$, then $A$ must be a product of an even number of reflections: either 0 reflections (so that $A = I$, the trivial rotation) or 2 reflections, so that $A$ is a rotation.

If $\det A = -1$, then $A$ must be a product of an odd number of reflections, either 1 or 3. If $A$ is a single reflection, then $A = H$ for some Householder matrix $H$. In this case we observed earlier that $H^T = H$, so $A^T = A$. Conversely, if $\det A = -1$ and $A^T = A$, then $\det(-A) = 1$ (since $A$ is a $3 \times 3$ matrix) and $(-A)^T = -A = (-A)^{-1}$ as well. It follows that $-A$ is a rotation that squares to the identity. If $-A \neq I$, then the only time this happens is when we rotate by the angle $\pi$ (that is, $180^\circ$) around some axis. But if $-A$ is a $180^\circ$ rotation around some axis, then $A = -(-A)$ must be the reflection across the equatorial plane for that axis (draw a picture!). So $A$ is a single reflection.

Finally, if $\det A = -1$ and $A^T \neq A$, then $A$ cannot be a rotation or a pure reflection, so $A$ must be a product of at least 3 reflections. $\square$
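As a numerical companion to Theorem 6 (the specific matrices are illustrative choices of mine, not examples from the notes), the sketch below builds a $3 \times 3$ rotation about the $z$-axis, a reflection across the plane $z = 0$, and their product, and checks the determinant and symmetry tests used in the theorem.

```python
import numpy as np

theta = 0.9

# Case 1: a rotation about the z-axis (a product of two reflections), det = +1.
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# Case 2: a single reflection across the plane z = 0 (H = I - 2uu^T with u = e_3),
# so det = -1 and H^T = H.
H = np.diag([1.0, 1.0, -1.0])

# Case 3: a rotation followed by a reflection, det = -1 but not symmetric.
A = H @ R

for M in (R, H, A):
    print(int(round(np.linalg.det(M))), bool(np.allclose(M, M.T)))
# Expected output:
#  1 False   (rotation)
# -1 True    (single reflection)
# -1 False   (product of three reflections)
```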

Corollary 7. Let $A$ be a $3 \times 3$ orthogonal matrix.

1. If $\det A = 1$, then $A$ is a rotation matrix.
2. If $\det A = -1$, then $-A$ is a rotation matrix.

Proof: If $\det A = 1$, then $A$ is a rotation matrix by Theorem 6. If $\det A = -1$, then $\det(-A) = (-1)^3 \det A = 1$. Since $-A$ is also orthogonal, $-A$ must be a rotation. $\square$

Corollary 8. Suppose that $A$ and $B$ are $3 \times 3$ rotation matrices. Then $AB$ is also a rotation matrix.

Proof: If $A$ and $B$ are $3 \times 3$ rotation matrices, then $A$ and $B$ are both orthogonal with determinant $+1$. It follows that $AB$ is orthogonal, and $\det AB = \det A \det B = 1 \cdot 1 = 1$. Theorem 6 then implies that $AB$ is also a rotation matrix. $\square$

Note that the rotations represented by $A$, $B$, and $AB$ may each have completely different angles and axes of rotation! Given two rotations $A$ and $B$ around two different axes of rotation, it is far from obvious that $AB$ will also be a rotation (around some mysterious third axis). But this is true, by Corollary 8. Later on we will see how to compute precisely the angle and axis of rotation of a rotation matrix.
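The following sketch illustrates Corollary 8 with two concrete rotations (the axes, angles, and helper names are arbitrary choices for the example): it checks that the product of a rotation about the $z$-axis and a rotation about the $x$-axis is again orthogonal with determinant $+1$, and recovers the axis of the resulting rotation as an eigenvector for the eigenvalue $1$.

```python
import numpy as np

def rot_z(t):
    """Rotation of R^3 about the z-axis by angle t."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x(t):
    """Rotation of R^3 about the x-axis by angle t."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

A, B = rot_z(0.6), rot_x(1.1)
AB = A @ B

print(bool(np.allclose(AB.T @ AB, np.eye(3))))   # AB is orthogonal
print(bool(np.isclose(np.linalg.det(AB), 1.0)))  # det AB = +1, so AB is a rotation

# The "mysterious third axis" is an eigenvector of AB with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(AB)
axis = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
print(axis / np.linalg.norm(axis))
```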

Exercises:

1. (a) Suppose that $A$ is an orthogonal matrix. Prove that either $\det A = 1$ or $\det A = -1$.
   (b) Find a $2 \times 2$ matrix $A$ such that $\det A = 1$, but also such that $A$ is not an orthogonal matrix.

2. Suppose that $A$ and $B$ are orthogonal matrices. Prove that $AB$ is an orthogonal matrix.

3. Suppose that $H = I - 2uu^T$ is a reflection matrix. Let $v_1, \ldots, v_{n-1}$ be an orthonormal basis for the subspace $u^\perp$. (Here $u^\perp$ denotes the orthogonal complement of the line spanned by $u$.)
   (a) What is $Hu$?
   (b) What is $Hv_i$?
   (c) Let $M$ be the matrix with columns as follows:
       $$M = \begin{bmatrix} v_1 & \cdots & v_{n-1} & u \end{bmatrix}$$
       What is $HM$? What are the columns of $HM$?
   (d) By comparing $\det HM$ to $\det M$, prove that $\det H = -1$.

4. (a) Give a geometric description (and sketch) of the reflection performed by the matrix
       $$H_1 = \begin{bmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{bmatrix}$$
   (b) Give a geometric description (and sketch) of the reflection performed by the matrix
       $$H_2 = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}$$
   (c) Show that a rotation in $\mathbb{R}^2$ is a product of two reflections by showing that $H_1 H_2$ is a rotation matrix. Give a geometric description (and sketch) of the rotation performed by the matrix $H_1 H_2$.

5. Let $u, v, w$ be an orthonormal basis (of column vectors) for $\mathbb{R}^3$, and let $A$ be the matrix given by $A = uu^T + vv^T + ww^T$.
   (a) What is $Au$? What is $Av$? What is $Aw$?
   (b) Let $P$ be the matrix
       $$P = \begin{bmatrix} u & v & w \end{bmatrix}$$
       What is $AP$?
   (c) Prove that $A = I$ (the identity!).

6. Give geometric descriptions of what happens when you multiply a vector in $\mathbb{R}^3$ by each of the following orthogonal matrices:
   $$\begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}, \qquad
   \begin{bmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{bmatrix}, \qquad
   \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{bmatrix}$$
   Sketch what happens in each case.