
ORTHOGONAL MATRICES

Informally, an orthogonal $n \times n$ matrix is the $n$-dimensional analogue of the rotation matrices $R_\theta$ in $\mathbb{R}^2$. When does a linear transformation of $\mathbb{R}^3$ (or $\mathbb{R}^n$) deserve to be called a rotation? Rotations are rigid motions, in the geometric sense of preserving the lengths of vectors and the angles between vectors. The cosine of the angle formed by two vectors is their inner product (or "dot product") divided by the product of their lengths. Thus if our linear transformation preserves lengths of vectors and also the inner product of two vectors, it will automatically be a rigid motion. In fact, preservation of inner products already implies preservation of lengths, since $\|v\|^2 = v \cdot v$ for any $v$.

Definition. An $n \times n$ matrix $A$ is orthogonal if $A^t A = I_n$.

Recall the basic property of the transpose (for any $A$):
$$Av \cdot w = v \cdot A^t w, \qquad v, w \in \mathbb{R}^n.$$
It implies that requiring $A$ to have the property
$$Av \cdot Aw = v \cdot w, \qquad v, w \in \mathbb{R}^n,$$
is the same as requiring
$$v \cdot A^t A w = v \cdot w, \qquad v, w \in \mathbb{R}^n.$$
This is certainly true for orthogonal matrices; thus the action of an orthogonal matrix on vectors in $\mathbb{R}^n$ preserves lengths and angles.

Basic properties.

(1) A matrix is orthogonal exactly when its column vectors have length one and are pairwise orthogonal; likewise for the row vectors. In short, the columns (or the rows) of an orthogonal matrix form an orthonormal basis of $\mathbb{R}^n$, and any orthonormal basis gives rise to a number of orthogonal matrices.

(2) Any orthogonal matrix is invertible, with $A^{-1} = A^t$. If $A$ is orthogonal, so are $A^t$ and $A^{-1}$.

(3) The product of orthogonal matrices is orthogonal: if $A^t A = I_n$ and $B^t B = I_n$, then
$$(AB)^t(AB) = (B^t A^t)AB = B^t (A^t A) B = B^t B = I_n.$$
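Properties (1)-(3) are easy to check numerically. The sketch below (plain Python; the helper functions are ours, not part of the notes) verifies that a rotation matrix $R_\theta$ satisfies $R_\theta^t R_\theta = I_2$ and preserves dot products:

```python
import math

def transpose(A):
    # Swap rows and columns.
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    # Ordinary matrix product.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def dot(v, w):
    return sum(x * y for x, y in zip(v, w))

def rotation(theta):
    # The 2x2 rotation matrix R_theta.
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

R = rotation(0.7)
# R^t R = I_2, i.e. R is orthogonal.
RtR = matmul(transpose(R), R)
assert all(abs(RtR[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(2) for j in range(2))
# Preservation of the dot product: Rv . Rw = v . w.
v, w = [1.0, 2.0], [-3.0, 0.5]
assert abs(dot(matvec(R, v), matvec(R, w)) - dot(v, w)) < 1e-12
```

The same check applied to a product of two such matrices illustrates property (3).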

(2) and (3) (plus the fact that the identity is orthogonal) can be summarized by saying that the $n \times n$ orthogonal matrices form a matrix group, the orthogonal group $O_n$.

(4) The $2 \times 2$ rotation matrices $R_\theta$ are orthogonal. Recall:
$$R_\theta = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}.$$
($R_\theta$ rotates vectors by $\theta$ radians, counterclockwise.)

(5) The determinant of an orthogonal matrix is equal to 1 or $-1$. The reason is that, since $\det(A) = \det(A^t)$ for any $A$, and the determinant of a product is the product of the determinants, we have, for $A$ orthogonal:
$$1 = \det(I_n) = \det(A^t A) = \det(A^t)\det(A) = (\det A)^2.$$

(6) Any real eigenvalue of an orthogonal matrix has absolute value 1. To see this, note that $\|Rv\| = \|v\|$ for any $v$, if $R$ is orthogonal. But if $v \neq 0$ is an eigenvector with eigenvalue $\lambda$:
$$\|v\| = \|Rv\| = \|\lambda v\| = |\lambda|\,\|v\|,$$
hence $|\lambda| = 1$. (Actually, it is also true that each complex eigenvalue must have modulus 1, and the argument is similar.)

Reflections. A linear transformation $T$ of $\mathbb{R}^n$ is a reflection if there is a one-dimensional subspace $L$ (a line through 0) so that $Tv = -v$ for $v \in L$ and $Tv = v$ for $v$ in the orthogonal complement $L^\perp$. Letting $n$ be a unit vector spanning $L$, we find the expression for $T$:
$$Tv = v - 2(v \cdot n)n, \qquad v \in \mathbb{R}^n.$$
If $\{v_1, \ldots, v_{n-1}\}$ is a basis of $L^\perp$, then in the basis $B = \{v_1, \ldots, v_{n-1}, n\}$ of $\mathbb{R}^n$ the matrix of the reflection $T$ takes the form (say, for $n = 3$):
$$[T]_B = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1 \end{bmatrix}.$$

Remark. Like rotations, reflections are rigid motions of space. How are they different? (Not at all a trivial question!) The following distinction is easy to work with: a rotation $R$ is really the end result of a continuous motion, where we start by doing nothing (the identity map) and gradually move all vectors by rotations $R(t)$ (where $t$ denotes "time"), until we arrive at the desired rotation $R = R(1)$. Since $\det(I_n) = 1$ and $\det(R(t))$ depends continuously on $t$ (and is always 1 or $-1$), we must have $\det R = 1$ if $R$ is a rotation. On the other hand, from the matrix expression above we see that a reflection (as defined above) clearly has determinant $-1$.

Ex 1. Reflection in a given plane in $\mathbb{R}^3$. Find the orthogonal matrix (in the standard basis) that implements reflection in the plane with equation:
$$2x_1 + x_2 + 3x_3 = 0.$$

Solution. The orthogonal line $L$ is spanned by the unit vector:
$$n = \frac{1}{\sqrt{14}}(2, 1, 3).$$
From the expression given above:
$$Tv = v - \frac{2}{14}\,\big(v \cdot (2, 1, 3)\big)\,(2, 1, 3);$$
in particular:
$$Te_1 = e_1 - (4/14)(2, 1, 3) = (3/7, -2/7, -6/7)$$
$$Te_2 = e_2 - (2/14)(2, 1, 3) = (-2/7, 6/7, -3/7)$$
$$Te_3 = e_3 - (6/14)(2, 1, 3) = (-6/7, -3/7, -2/7)$$
These are the columns of the matrix of $T$ in the standard basis:
$$[T]_{B_0} = \frac{1}{7}\begin{bmatrix} 3 & -2 & -6 \\ -2 & 6 & -3 \\ -6 & -3 & -2 \end{bmatrix}.$$

Rotations.

Ex. 2. Rotation in $\mathbb{R}^3$ with given axis and given angle. Find the $3 \times 3$ orthogonal matrix that implements the rotation $R$ in $\mathbb{R}^3$ with axis the subspace spanned by $(1, 2, 3)$ (a line $L$), by an angle of $\pi/6$ radians ("counterclockwise" when looking down the axis).

Solution. The first step is to find a positive orthonormal basis of $\mathbb{R}^3$, $B = \{u, v_1, v_2\}$, where $u$ is a unit vector spanning $L$ and $v_1, v_2$ are an orthonormal

basis for the orthogonal plane $L^\perp$. "Positive" means $\det[u\; v_1\; v_2] = 1$. We take $u = (1/\sqrt{14})(1, 2, 3)$. Start with any basis of $L^\perp$, say (by "trial and error": take any two linearly independent vectors satisfying the equation $x_1 + 2x_2 + 3x_3 = 0$ of $L^\perp$):
$$w_1 = (3, 0, -1), \qquad w_2 = (0, 3, -2).$$
Since
$$\det \begin{bmatrix} 1 & 3 & 0 \\ 2 & 0 & 3 \\ 3 & -1 & -2 \end{bmatrix} = 42 > 0,$$
the basis $\{u, w_1, w_2\}$ of $\mathbb{R}^3$ is positive. Applying Gram-Schmidt to $w_1, w_2$, we obtain an orthonormal basis of the plane $L^\perp$:
$$v_1 = (1/\sqrt{10})(3, 0, -1), \qquad v_2 = (1/\sqrt{35})(-1, 5, -3).$$
(The basis $\{u, v_1, v_2\}$ is still positive.)

Second step: computing the action of $R$ on the basis vectors. Since $u$ is a vector on the axis, it is fixed by $R$: $Ru = u$. In the plane $L^\perp$, $R$ acts exactly like the counterclockwise rotation by $\pi/6$ radians in $\mathbb{R}^2$, so we know what $R$ does to an orthonormal basis. We obtain:
$$Ru = (1/\sqrt{14})(1, 2, 3)$$
$$Rv_1 = \cos(\pi/6)\,v_1 + \sin(\pi/6)\,v_2 = (\sqrt{3}/2)v_1 + (1/2)v_2$$
$$Rv_2 = -\sin(\pi/6)\,v_1 + \cos(\pi/6)\,v_2 = -(1/2)v_1 + (\sqrt{3}/2)v_2.$$
This means that the matrix of $R$ in the adapted basis $B$ is:
$$[R]_B = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \sqrt{3}/2 & -1/2 \\ 0 & 1/2 & \sqrt{3}/2 \end{bmatrix}.$$
This is orthogonal; in fact, in block form, it has a 1 (corresponding to the axis $u$, an eigenvector of $R$ with eigenvalue 1) and a $2 \times 2$ block given by the rotation $R_{\pi/6}$ in the plane.

Third step. Now use the change of basis formula. We have:
$$[R]_{B_0} = P\,[R]_B\,P^{-1} = P\,[R]_B\,P^t,$$
since $P$ is orthogonal. The change of basis matrix $P$ is given by the calculation in the first step:
$$P = \begin{bmatrix} 1/\sqrt{14} & 3/\sqrt{10} & -1/\sqrt{35} \\ 2/\sqrt{14} & 0 & 5/\sqrt{35} \\ 3/\sqrt{14} & -1/\sqrt{10} & -3/\sqrt{35} \end{bmatrix}.$$
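The construction in Ex. 2 can be sanity-checked numerically. The sketch below (plain Python; the helper functions are ours) rebuilds $P$ and $[R]_B$, forms $[R]_{B_0} = P\,[R]_B\,P^t$, and verifies that the result is orthogonal and fixes the axis vector $(1, 2, 3)$:

```python
import math

def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

s14, s10, s35 = math.sqrt(14), math.sqrt(10), math.sqrt(35)

# Columns of P: the adapted orthonormal basis u, v1, v2.
P = [[1/s14,  3/s10, -1/s35],
     [2/s14,  0.0,    5/s35],
     [3/s14, -1/s10, -3/s35]]

c, s = math.cos(math.pi/6), math.sin(math.pi/6)
# [R]_B: eigenvalue 1 on the axis, a 2x2 rotation block on the plane.
RB = [[1.0, 0.0, 0.0],
      [0.0, c,  -s ],
      [0.0, s,   c ]]

R = matmul(matmul(P, RB), transpose(P))   # [R]_B0 = P [R]_B P^t

# R is orthogonal: R^t R = I_3.
RtR = matmul(transpose(R), R)
assert all(abs(RtR[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(3) for j in range(3))
# The axis is fixed: R(1,2,3) = (1,2,3).
axis = [1.0, 2.0, 3.0]
assert all(abs(a - b) < 1e-12 for a, b in zip(matvec(R, axis), axis))
```

This is exactly the "best left to a computer" multiplication mentioned in the notes.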

And that's it. Anyone so inclined can go ahead and multiply out the matrices, but this is best left to a computer.

Remark on eigenvalues.

(i) Reflections: the eigenvalues are $-1$ (with eigenspace $L$) and $1$ (with eigenspace $L^\perp$). Reflections are diagonalizable.

(ii) Rotations. As remarked above, any vector on the axis of a rotation in $\mathbb{R}^3$ is an eigenvector for the eigenvalue $\lambda = 1$. In fact, the eigenspace $E(1)$ is the axis. Is it possible to see algebraically that a rotation $A$ in $\mathbb{R}^3$ (as defined above: an orthogonal matrix with determinant 1) has 1 as an eigenvalue? Well, since the determinant is 1, the characteristic equation has the form:
$$-\lambda^3 + a_2\lambda^2 + a_1\lambda + 1 = 0.$$
A polynomial of degree three has at least one real root, and each real root is either 1 or $-1$ (since $A$ is orthogonal; property (6) above). If all the roots are real, at least one must be 1 (since their product is the determinant, namely 1). If there are two complex conjugate roots $a \pm ib$ and the real root is $-1$, the product of the three roots is
$$(a + ib)(a - ib)(-1) = -(a^2 + b^2),$$
which can't possibly be equal to 1. This shows that at least one of the roots is 1; pictorially: every rotation in $\mathbb{R}^3$ has an axis. From the geometric construction of a rotation we see that, in fact, the other two eigenvalues must be complex when the rotation angle is not 0 or $\pi$ (since in the plane orthogonal to the axis there are then no fixed directions, hence no real eigenvectors).

Orthogonal projections. Let $V \subseteq \mathbb{R}^n$ be a subspace, $\dim(V) = r$. The orthogonal projection $P: \mathbb{R}^n \to \mathbb{R}^n$ is defined by:
$$Pv = v \ \text{ if } v \in V; \qquad Pv = 0 \ \text{ if } v \in V^\perp.$$
Note: $P$ is not given by an orthogonal matrix! Let $\{v_1, \ldots, v_r\}$ be an orthonormal basis of $V$; form the corresponding matrix $A = [v_1\; v_2\; \ldots\; v_r] \in M_{n \times r}$. The matrix of $P$ in the standard basis $\{e_1, \ldots, e_n\}$ of $\mathbb{R}^n$ is given by:
$$[P]_{std} = [Pe_1 \;\ldots\; Pe_n].$$
From:
$$Pe_i = (e_i \cdot v_1)v_1 + \ldots + (e_i \cdot v_r)v_r,$$
we see that the columns of $[P]_{std}$ are linear combinations of the columns of $A$. Thus $[P]_{std} = AB$ for an $r \times n$ matrix $B$, with entries in the $i$-th column given by $B_{ji} = e_i \cdot v_j$, $j = 1, \ldots, r$.

That is, $B_{ji}$ is the $i$-th component of $v_j$ (in the standard basis), which is the entry $A_{ij}$ of $A$. So $B = A^t$, and:
$$[P]_{std} = AA^t.$$
From this it is clear that $[P]_{std}$ is symmetric. (Contrast with $A^t A = I_r$.)
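A quick numerical sketch of the last formula (plain Python; the helper functions are ours): build $A$ from the orthonormal basis of the plane $V = \{x_1 + 2x_2 + 3x_3 = 0\}$ found in Ex. 2, form $[P]_{std} = AA^t$, and check that $P$ is symmetric, idempotent ($P^2 = P$), fixes $V$, and kills $V^\perp$:

```python
import math

def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

s10, s35, s14 = math.sqrt(10), math.sqrt(35), math.sqrt(14)
# Orthonormal basis v1, v2 of the plane V (from Ex. 2).
v1 = [3/s10, 0.0, -1/s10]
v2 = [-1/s35, 5/s35, -3/s35]

# A is n x r: its columns are v1, v2.
A = transpose([v1, v2])
P = matmul(A, transpose(A))            # [P]_std = A A^t

# P is symmetric and idempotent.
P2 = matmul(P, P)
assert all(abs(P[i][j] - P[j][i]) < 1e-12 for i in range(3) for j in range(3))
assert all(abs(P2[i][j] - P[i][j]) < 1e-12 for i in range(3) for j in range(3))
# P fixes V and kills V-perp.
assert all(abs(a - b) < 1e-12 for a, b in zip(matvec(P, v1), v1))
n = [1/s14, 2/s14, 3/s14]              # unit vector spanning V-perp
assert all(abs(x) < 1e-12 for x in matvec(P, n))
```

Note that $A^t A = I_2$ here (the columns are orthonormal) while $AA^t$ is the $3 \times 3$ projection, illustrating the contrast stated above.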