Orthogonal Transformations Math 217 Professor Karen Smith


(c) 2015 UM Math Dept, licensed under a Creative Commons By-NC-SA 4.0 International License.

Definition: A linear transformation $T: \mathbb{R}^n \to \mathbb{R}^n$ is orthogonal if $\|T(\vec x)\| = \|\vec x\|$ for all $\vec x \in \mathbb{R}^n$.

Theorem: If $T: \mathbb{R}^n \to \mathbb{R}^n$ is orthogonal, then $\vec x \cdot \vec y = T\vec x \cdot T\vec y$ for all vectors $\vec x$ and $\vec y$ in $\mathbb{R}^n$.

A. Discuss this definition and theorem with your table: what does each say in words? What, intuitively and geometrically, really is an orthogonal transformation? What structure on $\mathbb{R}^n$ is preserved by $T$? What are some of the important consequences? What does an orthogonal transformation do to the unit $n$-cube? What about to a circle? Any shape? How is this different from a linear transformation that is not orthogonal? If you are stuck, think first only about orthogonal transformations in $\mathbb{R}^2$. What does an orthogonal transformation do to the unit square? Line segments? Other shapes?

B. Which of the following maps are orthogonal transformations? Think geometrically.
1. $\mathbb{R}^2 \to \mathbb{R}^2$ given by rotation counterclockwise through $\theta$
2. The identity map $\mathbb{R}^3 \to \mathbb{R}^3$
3. The reflection $\mathbb{R}^3 \to \mathbb{R}^3$ over a plane (through the origin)
4. The projection $\mathbb{R}^3 \to \mathbb{R}^3$ onto a subspace $V$ of dimension 2
5. Multiplication by $\begin{bmatrix} 3 & 1 \\ 2 & 5 \end{bmatrix}$
6. Dilation $\mathbb{R}^3 \to \mathbb{R}^3$ by a factor of 3
7. Multiplication by $\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$

Solution note: 1, 2, 3, and 7 all preserve lengths of vectors, so they are orthogonal. The others do not. For (4), note that every nonzero vector in $V^\perp$ goes to zero, so its length is not preserved. For (5), note that $\vec e_1$ goes to the first column, which is not of length 1. For (6), lengths get scaled by 3.

C. Let $T: \mathbb{R}^n \to \mathbb{R}^n$ be an orthogonal transformation.
1. Prove that $T$ is injective. [Hint: consider the kernel.]
2. Prove that $T$ is an isomorphism.
3. Is the inverse of an orthogonal transformation also orthogonal? Prove it!
4. Is the composition of orthogonal transformations also orthogonal? Prove it!
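The length-preservation test in Part B can be sketched numerically: a matrix is orthogonal exactly when $A^T A$ is the identity. This check (with numpy, and an arbitrary choice of angle; neither appears in the worksheet) is only a sanity check, not a substitute for the geometric reasoning asked for above.

```python
import numpy as np

theta = 0.7  # arbitrary angle; any value works for the rotation

# Candidate standard matrices from Part B:
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])   # B.7 (and B.1)
other = np.array([[3.0, 1.0],
                  [2.0, 5.0]])                           # B.5
dilation = 3.0 * np.eye(3)                               # B.6

def is_orthogonal(A, tol=1e-12):
    """A is orthogonal iff A^T A is the identity, i.e. columns are orthonormal."""
    return np.allclose(A.T @ A, np.eye(A.shape[0]), atol=tol)

print(is_orthogonal(rotation))  # True: rotations preserve length
print(is_orthogonal(other))     # False: e_1 maps to (3,2), length sqrt(13)
print(is_orthogonal(dilation))  # False: every length is scaled by 3
```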

Solution note: (1) It suffices to show the kernel is zero. Assume $\vec x \in \ker T$, so $T(\vec x) = \vec 0$. But if $T$ is orthogonal, also $\|T(\vec x)\| = \|\vec x\|$, which means $\|\vec x\| = 0$. So $\vec x = \vec 0$ and $T$ is injective.
(2) This follows from rank-nullity: the kernel has dimension zero, so the image has dimension $n$. Since the target is $\mathbb{R}^n$, the map must be surjective. So $T$ is both injective and surjective, which means it is bijective. By definition, a bijective linear transformation is an isomorphism.
(3) Yes. Since $\|T(\vec x)\| = \|\vec x\|$ for all $\vec x$, we have $\|T^{-1}(T(\vec x))\| = \|\vec x\| = \|T(\vec x)\|$ for all $\vec x$, which means, substituting $\vec y = T(\vec x)$, that $\|T^{-1}(\vec y)\| = \|\vec y\|$ for all $\vec y \in \mathbb{R}^n$ (every $\vec y$ arises this way because $T$ is surjective).
(4) Yes. If both $S$ and $T$ are orthogonal, then $\|S(T(\vec x))\| = \|T(\vec x)\| = \|\vec x\|$ for all $\vec x$. So $S \circ T$ is orthogonal.

D. Suppose $T: \mathbb{R}^2 \to \mathbb{R}^2$ is given by left multiplication by $A = \begin{bmatrix} a & c \\ b & d \end{bmatrix}$. Assuming $T$ is orthogonal, illustrate with an example what such a $T$ does to a unit square. Label your picture with $a, b, c, d$. What do you think $\det A$ can be?

Solution note: You should see a square with one corner on the origin and sides of length 1. Since the columns $(a, b)$ and $(c, d)$ are orthonormal, $\det A = ad - bc = \pm 1$.

E. Let $A$ be the standard matrix of an orthogonal transformation $T: \mathbb{R}^n \to \mathbb{R}^n$ (meaning that $T(\vec x) = A\vec x$ for all $\vec x \in \mathbb{R}^n$). Prove that the columns of $A$ are an orthonormal basis for $\mathbb{R}^n$.

Solution note: Each column has length one, since the columns are the images $T(\vec e_i)$ of the $\vec e_i$, which have length 1. Also, since $\vec e_i \perp \vec e_j$ (for $i \neq j$) and $T$ preserves dot products, $T(\vec e_i) \cdot T(\vec e_j) = 0$. These images are the columns of $A$.
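Parts C.3 and E can be checked on a concrete orthogonal matrix: the columns should have length 1 and be pairwise perpendicular, and the inverse should coincide with the transpose. A small numpy sketch (the rotation and angle are my choice, not from the worksheet):

```python
import numpy as np

# A concrete orthogonal matrix: rotation through an arbitrary angle.
theta = 1.1
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Part E: each column has length 1, and distinct columns are perpendicular.
col_lengths = np.linalg.norm(A, axis=0)
dot_12 = A[:, 0] @ A[:, 1]
print(np.allclose(col_lengths, 1.0))  # True
print(abs(dot_12) < 1e-12)            # True

# Part C.3 in matrix form: the inverse of an orthogonal matrix is its transpose.
print(np.allclose(np.linalg.inv(A), A.T))  # True
```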

F. Definition: An $n \times n$ matrix is orthogonal if its columns are orthonormal.
1. Is $A = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$ orthogonal? Is $A^T$ orthogonal? Compute $A^T A$.
2. Is $B = \begin{bmatrix} 3/5 & -4/5 & 0 \\ 4/5 & 3/5 & 0 \\ 0 & 0 & 1 \end{bmatrix}$ orthogonal? Is $B^T$ orthogonal? Compute $B^T B$.
3. Prove that if $M$ is an orthogonal matrix, then $M^{-1} = M^T$. [Hint: write $M$ as a row of columns and $M^T$ as a column of rows. Compute $M^T M$.]

G. Theorem: Let $T: \mathbb{R}^n \to \mathbb{R}^n$ be a linear transformation with (standard) matrix $A$. Then $T$ is orthogonal if and only if $A$ has orthonormal columns.
1. Scaffold a proof of this theorem. What needs to be shown?
2. Show that if $T$ is orthogonal, the matrix of $T$ has orthonormal columns. [Hint: use the Theorem on page 1.]
3. Show the converse.

H. Prove that the rows of an orthogonal matrix are also orthonormal.

I. Can you prove the very important Theorem on the first page? [Hint: consider $\|\vec x + \vec y\|^2$.]

J. An Application of QR factorization. Suppose that we are given a system of $n$ linear equations in $n$ unknowns: $A\vec x = \vec b$.
1. Assuming $A$ is invertible, express the solutions to this system in terms of $A^{-1}$ and $\vec b$.
2. Assume that $A = QR$ is the QR-factorization of $A$. What does this mean? What are the sizes of the matrices in this case?
3. What happens if we multiply both sides of the equation $A\vec x = \vec b$ by $Q^T$? How might this simplify the problem of finding the solution to this system?
4. Suppose that $A = \begin{bmatrix} 4 & 4/5 & -11/5 \\ 0 & 10 & 1 \\ 3 & 3/5 & 23/5 \end{bmatrix}$. Applying the Gram-Schmidt process to the columns of this matrix we get $\begin{bmatrix} 4/5 & 0 & -3/5 \\ 0 & 1 & 0 \\ 3/5 & 0 & 4/5 \end{bmatrix}$. Find the QR factorization of $A$.

Solution note: We have $A = QR$, where $Q$ is the orthogonal matrix above and $R = \begin{bmatrix} 5 & 1 & 1 \\ 0 & 10 & 1 \\ 0 & 0 & 5 \end{bmatrix}$.
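The factorization in J.4 can be verified by direct multiplication. In the sketch below (numpy, not part of the worksheet) I take the entries $a_{13} = -11/5$ and $q_{13} = -3/5$, the signs under which $A = QR$ holds exactly with the stated upper-triangular $R$; note that a library routine such as `np.linalg.qr` may return $Q$ and $R$ with some columns/rows negated, which is an equally valid factorization.

```python
import numpy as np

# The matrix from J.4 and the hand-computed Gram-Schmidt output
# (signs chosen so that A = QR with the stated R; an assumption noted above).
A = np.array([[4.0,  4/5, -11/5],
              [0.0, 10.0,   1.0],
              [3.0,  3/5,  23/5]])
Q = np.array([[4/5, 0.0, -3/5],
              [0.0, 1.0,  0.0],
              [3/5, 0.0,  4/5]])
R = np.array([[5.0,  1.0, 1.0],
              [0.0, 10.0, 1.0],
              [0.0,  0.0, 5.0]])

print(np.allclose(Q.T @ Q, np.eye(3)))  # True: Q has orthonormal columns
print(np.allclose(Q @ R, A))            # True: A = QR
```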

5. Use your QR factorization to quickly solve the system $A\vec x = [0 \ 0 \ 25]^T$ without row reducing!

Solution note: To solve $A\vec x = QR\vec x = \vec b$, multiply both sides by $Q^{-1}$, which is $Q^T$ since $Q$ is orthogonal. We get the equivalent system $R\vec x = Q^T \vec b = \begin{bmatrix} 15 \\ 0 \\ 20 \end{bmatrix}$. This is easy to solve because $R$ is upper triangular. The bottom row gives $5z = 20$, so $z = 4$. The second row gives $10y + 4 = 0$, so $y = -2/5$. The top row gives $5x - 2/5 + 4 = 15$, so $x = 57/25$.
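The two-step recipe (multiply by $Q^T$, then back-substitute through the triangular $R$) can be written out directly. A numpy sketch, using the $Q$ and $R$ from the factorization above with the sign convention I noted there:

```python
import numpy as np

Q = np.array([[4/5, 0.0, -3/5],
              [0.0, 1.0,  0.0],
              [3/5, 0.0,  4/5]])
R = np.array([[5.0,  1.0, 1.0],
              [0.0, 10.0, 1.0],
              [0.0,  0.0, 5.0]])
b = np.array([0.0, 0.0, 25.0])

# Since Q^{-1} = Q^T, the system QRx = b is equivalent to Rx = Q^T b.
c = Q.T @ b
print(c)  # Q^T b = (15, 0, 20)

# Back-substitute through the upper-triangular R, bottom row first.
x = np.zeros(3)
for i in range(2, -1, -1):
    x[i] = (c[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]
print(x)  # x = (57/25, -2/5, 4)
print(np.allclose(Q @ R @ x, b))  # True: the solution checks out
```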

K. TRUE OR FALSE. Justify. In all problems, $T$ denotes a linear transformation from $\mathbb{R}^n$ to itself, and $A$ is its matrix in the standard basis.
1. If $T$ is orthogonal, then $\vec x \cdot \vec y = T\vec x \cdot T\vec y$ for all vectors $\vec x$ and $\vec y$ in $\mathbb{R}^n$.
2. If $T$ sends every pair of orthogonal vectors to another pair of orthogonal vectors, then $T$ is orthogonal.
3. If $T$ is orthogonal, then $T$ is invertible.
4. An orthogonal projection is orthogonal.
5. If $A$ is the matrix of an orthogonal transformation $T$, then the columns of $A$ are orthonormal.
6. The transpose of an orthogonal matrix is orthogonal.
7. The product of two orthogonal matrices (of the same size) is orthogonal.
8. If $A$ is the matrix of an orthogonal transformation $T$, then $AA^T$ is the identity matrix.
9. If $A^{-1} = A^T$, then $A$ is the matrix of an orthogonal transformation of $\mathbb{R}^n$.
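Two of these statements lend themselves to a quick numerical illustration (a sketch with numpy; the matrices are my choices, not from the worksheet). Statement 2 is false: the dilation $2I$ sends every orthogonal pair to an orthogonal pair, since scaling never changes angles, yet it doubles lengths. Statement 8 holds because the rows of an orthogonal matrix are also orthonormal.

```python
import numpy as np

# Statement 2 counterexample: S = 2I preserves orthogonality but not length.
S = 2.0 * np.eye(2)
u, v = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print((S @ u) @ (S @ v))       # 0.0: the images are still orthogonal
print(np.linalg.norm(S @ u))   # 2.0: but length is not preserved, so S is not orthogonal

# Statement 8: for the matrix A of an orthogonal transformation, A A^T = I.
theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(A @ A.T, np.eye(2)))  # True
```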