Chapter 6. Linear Transformation. 6.1 Intro. to Linear Transformation




Chapter 6. Linear Transformations

6.1 Introduction to Linear Transformations

Homework: Textbook, 6.1, Ex. 5, 9, 5, 7, 9, 5, 55, 57, 6(a,b), 6; page 7-.

In this section, we discuss linear transformations.

Recall from calculus courses that a function f : X → Y from a set X to a set Y associates to each x ∈ X a unique element f(x) ∈ Y. The following is some commonly used terminology:

1. X is called the domain of f.
2. Y is called the codomain of f.
3. If f(x) = y, then we say y is the image of x. The preimage of y is preimage(y) = {x ∈ X : f(x) = y}.
4. The range of f is the set of images of elements of X.

In this section we deal with functions from a vector space V to another vector space W that respect the vector space structures. Such a function will be called a linear transformation, defined as follows.

Definition 6.1.1 Let V and W be two vector spaces. A function T : V → W is called a linear transformation of V into W if the following two properties hold for all u, v ∈ V and all scalars c:

1. T(u + v) = T(u) + T(v) (we say that T preserves additivity);
2. T(cu) = cT(u) (we say that T preserves scalar multiplication).

Reading assignment: Read Textbook, Examples, p 6-.
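The two defining properties can be spot-checked numerically. The following is a small illustrative sketch (not from the textbook) using NumPy and randomly chosen vectors; the map T, the helper name is_linear, and the sample maps are all assumptions made here for illustration. Such a check can only give evidence of linearity, never a proof, but it quickly exposes a map that is not linear.

    import numpy as np

    def is_linear(T, n, trials=100, tol=1e-9):
        # Spot-check additivity and scalar multiplication of T : R^n -> R^m on random inputs.
        rng = np.random.default_rng(0)
        for _ in range(trials):
            u, v = rng.normal(size=n), rng.normal(size=n)
            c = rng.normal()
            if not np.allclose(T(u + v), T(u) + T(v), atol=tol):
                return False      # additivity fails
            if not np.allclose(T(c * u), c * T(u), atol=tol):
                return False      # scalar multiplication fails
        return True

    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    print(is_linear(lambda v: A @ v, 2))   # True: a matrix map satisfies both properties
    print(is_linear(lambda v: v**2, 2))    # False: squaring each entry breaks additivity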

Trivial examples: Following are two easy examples.

1. Let V, W be two vector spaces. Define T : V → W by T(v) = 0 for all v ∈ V. Then T is a linear transformation, called the zero transformation.
2. Let V be a vector space. Define T : V → V by T(v) = v for all v ∈ V. Then T is a linear transformation, called the identity transformation of V.

Properties of linear transformations

Theorem 6.1.2 Let V and W be two vector spaces and suppose T : V → W is a linear transformation. Then:

1. T(0) = 0.
2. T(−v) = −T(v) for all v ∈ V.
3. T(u − v) = T(u) − T(v) for all u, v ∈ V.
4. If v = c_1 v_1 + c_2 v_2 + · · · + c_n v_n, then
   T(v) = T(c_1 v_1 + c_2 v_2 + · · · + c_n v_n) = c_1 T(v_1) + c_2 T(v_2) + · · · + c_n T(v_n).

Proof. By property (2) of Definition 6.1.1, we have T(0) = T(0 · 0) = 0 · T(0) = 0.

So, (1) is proved. Similarly, T(−v) = T((−1)v) = (−1)T(v) = −T(v), so (2) is proved. Then, by property (1) of Definition 6.1.1, we have T(u − v) = T(u + (−1)v) = T(u) + T((−1)v) = T(u) − T(v), where the last equality follows from (2). So, (3) is proved.

To prove (4), we use induction on n. For n = 1 we have T(c_1 v_1) = c_1 T(v_1), by property (2) of Definition 6.1.1. For n = 2, by the two properties of Definition 6.1.1, we have T(c_1 v_1 + c_2 v_2) = T(c_1 v_1) + T(c_2 v_2) = c_1 T(v_1) + c_2 T(v_2). So, (4) is proved for n = 2. Now assume that the formula (4) is valid for n − 1 vectors and prove it for n. We have

T(c_1 v_1 + c_2 v_2 + · · · + c_n v_n) = T(c_1 v_1 + c_2 v_2 + · · · + c_{n-1} v_{n-1}) + T(c_n v_n)
= (c_1 T(v_1) + c_2 T(v_2) + · · · + c_{n-1} T(v_{n-1})) + c_n T(v_n).

So, the proof is complete.

Linear transformations given by matrices

Theorem 6.1.3 Suppose A is a matrix of size m × n. Given a vector v = (v_1, v_2, ..., v_n)^T ∈ R^n, define T(v) = Av. Then T is a linear transformation from R^n to R^m.

Proof. From the properties of matrix multiplication, for u, v ∈ R^n and a scalar c we have

T(u + v) = A(u + v) = Au + Av = T(u) + T(v)   and   T(cu) = A(cu) = cAu = cT(u).

The proof is complete.

Remark: Most (or all) of our examples of linear transformations come from matrices, as in this theorem.

Reading assignment: Read Textbook, Examples, p 65-6.

Projections along a vector in R^n

Projections in R^n are a good class of examples of linear transformations. We define projection along a vector. Recall the definition of orthogonal projection from Chapter 5, in the context of the Euclidean spaces R^n.

Definition 6.1.4 Suppose v ∈ R^n is a nonzero vector. Then, for u ∈ R^n, define

proj_v(u) = ((v · u)/(v · v)) v.

Then proj_v : R^n → R^n is a linear transformation.

Proof. This is because, for another vector w ∈ R^n and a scalar c, it is easy to check that proj_v(u + w) = proj_v(u) + proj_v(w) and proj_v(cu) = c proj_v(u).

The point of such projections is that any vector u ∈ R^n can be written uniquely as a sum of a vector along v and another one perpendicular to v:

u = proj_v(u) + (u − proj_v(u)).

It is easy to check that (u − proj_v(u)) ⊥ proj_v(u).

Exercise 6.1.5 (Ex 4, p 7) Let T(v_1, v_2, v_3) = (v + v, v v, v v), as given in the textbook.

1. Compute T(4, 5, ).
Solution: Substitute the coordinates into each component: T(4, 5, ) = ((4) + 5, 5 (4), 4) = (, , 5).

2. Compute the preimage of w = (4, , ).
Solution: Suppose (v_1, v_2, v_3) is in the preimage of (4, , ). Then each component of T(v_1, v_2, v_3) must equal the corresponding entry of w, which gives a system of three linear equations in v_1, v_2, v_3. Row reducing the augmented matrix of this system to its Gauss-Jordan form yields a unique solution, so Preimage((4, , )) = {(574, 8574, 574)}.
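Computing a preimage is always a matter of solving the linear system T(v) = w. The following is a minimal sketch of that step with hypothetical data (the map T and the vector w below are assumptions chosen for illustration, not the exercise's values):

    import numpy as np

    # Hypothetical linear map T(v1, v2, v3) = (v1 + v2, v1 - v2, v3), written as T(v) = A v.
    A = np.array([[1.0,  1.0, 0.0],
                  [1.0, -1.0, 0.0],
                  [0.0,  0.0, 1.0]])
    w = np.array([4.0, 2.0, 1.0])

    # Since this A is square and invertible, the preimage of w is the unique solution of A v = w.
    v = np.linalg.solve(A, w)
    print(v)                       # [3. 1. 1.]
    print(np.allclose(A @ v, w))   # True: T(v) really is w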

Exercise 6.1.6 (Ex, p 7) Determine whether the function T : R^2 → R^2, T(x, y) = (x^2, y), is linear.

Solution: We have

T((x, y) + (z, w)) = T(x + z, y + w) = ((x + z)^2, y + w) ≠ (x^2, y) + (z^2, w) = T(x, y) + T(z, w).

So, T does not preserve additivity, and therefore T is not linear. Alternately, you could check that T does not preserve scalar multiplication. Alternately, you could check this failure numerically with specific vectors, since (x + z)^2 ≠ x^2 + z^2 in general; for instance, the first entry of T((x, y) + (z, w)) can be 9 while the first entries of T(x, y) and T(z, w) sum to a different number.

Exercise 6.1.7 (Ex 4, p 7) Let T : R^3 → R^3 be a linear transformation such that T(1, 0, 0) = (, 4, ), T(0, 1, 0) = (, , ), T(0, 0, 1) = (, , ). Compute T(, 4, ).

Solution: Writing the input in terms of the standard basis, (, 4, ) = (1, 0, 0) + 4(0, 1, 0) (0, 0, 1). So, by Theorem 6.1.2(4),

T(, 4, ) = T(1, 0, 0) + 4 T(0, 1, 0) T(0, 0, 1) = (, 4, ) + (, , ) + (, , ) = (, 5, ).

Remark: A linear transformation T : V → V can be defined simply by assigning values T(v_i) on any basis {v_1, v_2, ..., v_n} of V. In the case of our problem, the values were assigned on the standard basis {e_1, e_2, e_3} of R^3.
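The remark above is easy to mechanize: once the values of T on a basis are known, T(v) is obtained by expressing v in that basis and applying linearity. A minimal sketch with hypothetical basis values (the numbers below are assumptions for illustration, not the exercise's data):

    import numpy as np

    # Hypothetical values of T on the standard basis of R^3, stored as columns:
    # column j is T(e_j).
    T_e = np.array([[ 2.0,  1.0,  0.0],
                    [ 4.0,  3.0, -2.0],
                    [-1.0, -2.0,  2.0]])

    v = np.array([1.0, 4.0, -1.0])

    # Since v = v1*e1 + v2*e2 + v3*e3, linearity gives
    # T(v) = v1*T(e1) + v2*T(e2) + v3*T(e3), i.e. the matrix-vector product below.
    print(T_e @ v)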

Exercise 6.1.8 (Ex 8, p 7) Let A be the 2 × 5 matrix given in the textbook and let T : R^5 → R^2 be the linear transformation T(x) = Ax.

1. Compute T(, , , , ).
Solution: Multiplying the column vector by A, T(, , , , ) = A(, , , , )^T = (7, 5).

2. Compute the preimage, under T, of (, 8).
Solution: The preimage consists of the solutions x = (x_1, x_2, x_3, x_4, x_5) of the linear system Ax = (, 8)^T. Row reduce the augmented matrix [A | (, 8)] to its Gauss-Jordan form.

We use the parameters x_2 = t, x_4 = s, x_5 = u, and the solutions are given by

x_1 = 5 + t + 5s + 4u,  x_2 = t,  x_3 = 4 + 5s,  x_4 = s,  x_5 = u.

So, the preimage is T^{-1}(, 8) = {(5 + t + 5s + 4u, t, 4 + 5s, s, u) : t, s, u ∈ R}.

Exercise 6.1.9 (Ex 54 (edited), p 7) Let T : R^2 → R^2 be the linear transformation such that T(, ) = (, ) and T(, ) = (, ), with the vectors as given in the textbook.

1. Compute T(, 4).
Solution: We have to write (, 4) = a(, ) + b(, ). Solving for a and b and applying linearity, T(, 4) = a T(, ) + b T(, ) = (, 5).

2. Compute T(, ).
Solution: Again we write (, ) = a(, ) + b(, ), solve for a and b, and apply linearity to get T(, ) = a T(, ) + b T(, ).

Exercise 6.1.10 (Ex 6 (edited), p 7) Let T : R^3 → R^3 be the projection T(u) = proj_v(u), where v = (1, 1, 1).

1. Find T(x, y, z).
Solution: By Definition 6.1.4, with v = (1, 1, 1),

proj_v(x, y, z) = ((v · (x, y, z))/(v · v)) v = ((x + y + z)/3)(1, 1, 1) = ((x + y + z)/3, (x + y + z)/3, (x + y + z)/3).

2. Compute T(5, , 5).
Solution: Substituting x = 5, z = 5, and the given second coordinate y into the formula above,

T(5, y, 5) = ((10 + y)/3, (10 + y)/3, (10 + y)/3).

3. Compute the matrix of T.
Solution: The matrix is

A = [ 1/3  1/3  1/3
      1/3  1/3  1/3
      1/3  1/3  1/3 ],

because T(x, y, z) = A(x, y, z)^T.
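As a quick numerical check of the formula just derived, the matrix of proj_v is (1/(v · v)) v v^T; for v = (1, 1, 1) every entry of this 3 × 3 matrix is 1/3. A short NumPy sketch (the sample input u is an assumption chosen for illustration):

    import numpy as np

    v = np.array([1.0, 1.0, 1.0])
    P = np.outer(v, v) / np.dot(v, v)   # matrix of proj_v; every entry equals 1/3 here

    u = np.array([5.0, 0.0, 5.0])       # sample input vector
    print(P @ u)                        # equals ((u . v)/(v . v)) v
    print(np.allclose(P @ P, P))        # True: projecting twice changes nothing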

6.2 Kernel and Range of a Linear Transformation

We will essentially skip this section. Before we do that, let us give a few definitions.

Definition 6.2.1 Let V, W be two vector spaces and T : V → W a linear transformation.

1. The kernel of T, denoted by ker(T), is the set of v ∈ V such that T(v) = 0. Notationally,
   ker(T) = {v ∈ V : T(v) = 0}.
   It is easy to see that ker(T) is a subspace of V.
2. Recall, the range of T, denoted by range(T), is given by
   range(T) = {w ∈ W : w = T(v) for some v ∈ V}.
   It is easy to see that range(T) is a subspace of W.
3. We say that T is an isomorphism if T is one-to-one and onto. It follows that T is an isomorphism if and only if ker(T) = {0} and range(T) = W.
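Even though the section is skipped, note that for a matrix-defined transformation T(x) = Ax the kernel is just the null space of A, which can be computed numerically. The sketch below is one common approach (via the singular value decomposition); the function name kernel_basis and the sample matrix are assumptions for illustration, not anything from the textbook.

    import numpy as np

    def kernel_basis(A, tol=1e-10):
        # Return an orthonormal basis (as columns) for ker(T), where T(x) = A x.
        _, s, Vt = np.linalg.svd(A)
        rank = int(np.sum(s > tol))
        return Vt[rank:].T          # right singular vectors belonging to (numerically) zero singular values

    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0]])  # rank 1, so the kernel is 2-dimensional
    K = kernel_basis(A)
    print(K.shape)                   # (3, 2)
    print(np.allclose(A @ K, 0))     # True: every kernel basis vector maps to 0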

6.3 Matrices for Linear Transformations

Homework: Textbook, 6.3, Ex 5, 7, 7, 9, 5, 9, 5(a,b), 7(a,b), 9, 4, 45, 47; p 97. Optional homework: Textbook, 6.3, Ex 57, 59; p 98 (we will not grade them).

In this section, to each linear transformation we associate a matrix. Linear transformations and matrices have very deep relationships; in fact, the study of linear transformations can be reduced to the study of matrices, and conversely. First, we will study this relationship for linear transformations T : R^n → R^m, and later study the same for linear transformations T : V → W of general vector spaces.

In this section, we will write the vectors in R^n as column matrices. Recall that, written as columns, the standard basis of R^n is given by

B = {e_1, e_2, ..., e_n},  where  e_1 = (1, 0, ..., 0)^T, e_2 = (0, 1, ..., 0)^T, ..., e_n = (0, 0, ..., 1)^T.

Theorem 6.3.1 Let T : R^n → R^m be a linear transformation. Write

T(e_1) = (a_11, a_21, ..., a_m1)^T,  T(e_2) = (a_12, a_22, ..., a_m2)^T,  ...,  T(e_n) = (a_1n, a_2n, ..., a_mn)^T.

These columns T(e_1), T(e_2), ..., T(e_n) form an m × n matrix A as follows:

A = [ a_11  a_12  · · ·  a_1n
      a_21  a_22  · · ·  a_2n
      ...
      a_m1  a_m2  · · ·  a_mn ],

that is, A = [T(e_1) T(e_2) · · · T(e_n)]. This matrix A has the property that T(v) = Av for all v ∈ R^n. It is called the standard matrix of T.

Proof. We can write v ∈ R^n as v = (v_1, v_2, ..., v_n)^T = v_1 e_1 + v_2 e_2 + · · · + v_n e_n. We have

Av = v_1 (a_11, a_21, ..., a_m1)^T + v_2 (a_12, a_22, ..., a_m2)^T + · · · + v_n (a_1n, a_2n, ..., a_mn)^T
   = v_1 T(e_1) + v_2 T(e_2) + · · · + v_n T(e_n)
   = T(v_1 e_1 + v_2 e_2 + · · · + v_n e_n) = T(v).

The proof is complete.

Reading assignment: Read Textbook, Examples; page 89-9.
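Theorem 6.3.1 translates directly into code: apply T to each standard basis vector and use the images as columns. A small sketch, assuming T is given as a Python function on NumPy arrays (the helper name standard_matrix and the example map are assumptions for illustration):

    import numpy as np

    def standard_matrix(T, n):
        # Standard matrix of a linear T : R^n -> R^m; column j is T(e_j).
        eye = np.eye(n)
        return np.column_stack([T(eye[:, j]) for j in range(n)])

    # Illustrative map T(x, y, z) = (x + 2y, y - z)
    T = lambda v: np.array([v[0] + 2 * v[1], v[1] - v[2]])
    A = standard_matrix(T, 3)
    print(A)                            # columns are T(e1), T(e2), T(e3)

    v = np.array([3.0, 1.0, 2.0])
    print(np.allclose(A @ v, T(v)))     # True, as the theorem asserts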

In our context of linear transformations, we recall the following definition of composition of functions.

Definition 6.3.2 Let T_1 : R^n → R^m and T_2 : R^m → R^p be two linear transformations. Define the composition T : R^n → R^p of T_2 with T_1 by

T(v) = T_2(T_1(v))  for v ∈ R^n.

The composition T is denoted by T = T_2 ∘ T_1. Diagrammatically:

R^n --T_1--> R^m --T_2--> R^p,  with  T = T_2 ∘ T_1 : R^n → R^p.

Theorem 6.3.3 Suppose T_1 : R^n → R^m and T_2 : R^m → R^p are two linear transformations. Then:

1. The composition T = T_2 ∘ T_1 : R^n → R^p is a linear transformation.
2. Suppose A_1 is the standard matrix of T_1 and A_2 is the standard matrix of T_2. Then the standard matrix of the composition T = T_2 ∘ T_1 is the product A = A_2 A_1.

Proof. For u, v ∈ R^n and scalars c, we have

T(u + v) = T_2(T_1(u + v)) = T_2(T_1(u) + T_1(v)) = T_2(T_1(u)) + T_2(T_1(v)) = T(u) + T(v)

and

T(cu) = T_2(T_1(cu)) = T_2(cT_1(u)) = cT_2(T_1(u)) = cT(u).

So, T preserves addition and scalar multiplication. Therefore T is a linear transformation and (1) is proved.

To prove (2), we have T(u) = T_2(T_1(u)) = T_2(A_1 u) = A_2(A_1 u) = (A_2 A_1)u.

Therefore T(e_1) is the first column of A_2 A_1, T(e_2) is the second column of A_2 A_1, and so on. Therefore, the standard matrix of T is A_2 A_1. The proof is complete.

Reading assignment: Read Textbook, Examples; page 9.

Definition 6.3.4 Let T_1 : R^n → R^n and T_2 : R^n → R^n be two linear transformations such that for every v ∈ R^n we have T_2(T_1(v)) = v and T_1(T_2(v)) = v. Then we say that T_2 is the inverse of T_1, and we say that T_1 is invertible. Such an inverse T_2 of T_1 is unique and is denoted by T_1^{-1}.

(Remark: Let End(R^n) denote the set of all linear transformations T : R^n → R^n. Then End(R^n) has a binary operation given by composition. The identity transformation I : R^n → R^n acts as the identity under this composition operation. The definition of the inverse of T above just corresponds to the inverse under this composition operation.)

Theorem 6.3.5 Let T : R^n → R^n be a linear transformation and let A be the standard matrix of T. Then the following are equivalent:

1. T is invertible.
2. T is an isomorphism.
3. A is invertible.

Moreover, if T is invertible, then the standard matrix of T^{-1} is A^{-1}.

Proof. (First, recall from Definition 6.2.1 that T is an isomorphism if T is one-to-one and onto.)

(Proof of (1) ⇒ (2):) Suppose T is invertible and let T^{-1} be the inverse. Suppose T(u) = T(v). Then u = T^{-1}(T(u)) = T^{-1}(T(v)) = v, so T is one-to-one. Also, given u ∈ R^n we have u = T(T^{-1}(u)), so T maps onto R^n. So, T is an isomorphism.

(Proof of (2) ⇒ (3):) Assume T is an isomorphism. If Ax = 0, then T(x) = 0 = T(0), and since T is one-to-one, x = 0. So Ax = 0 has only the unique solution x = 0. Therefore A is invertible, and (3) follows from (2).

(Proof of (3) ⇒ (1):) Suppose A is invertible. Let T'(x) = A^{-1}x. Then T' is a linear transformation, and it is easily checked that T' is the inverse of T. So, (1) follows from (3).

The proof is complete.

Reading assignment: Read Textbook, Examples 4; page 9.
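Theorems 6.3.3 and 6.3.5 can both be checked numerically: the standard matrix of T_2 ∘ T_1 is the product A_2 A_1, and the standard matrix of T^{-1} is A^{-1}. A brief sketch with illustrative matrices (A1, A2, and v below are assumptions chosen for the example):

    import numpy as np

    A1 = np.array([[1.0, 2.0],      # standard matrix of a hypothetical T1 : R^2 -> R^2
                   [0.0, 1.0]])
    A2 = np.array([[0.0, -1.0],     # standard matrix of a hypothetical T2 : R^2 -> R^2
                   [1.0,  0.0]])
    v = np.array([3.0, 4.0])

    # Composition: applying T1 and then T2 is the same as multiplying by A2 A1.
    print(np.allclose(A2 @ (A1 @ v), (A2 @ A1) @ v))   # True

    # Inverse: the standard matrix of T1^{-1} is the matrix inverse of A1.
    A1_inv = np.linalg.inv(A1)
    print(np.allclose(A1_inv @ (A1 @ v), v))           # True: T1^{-1}(T1(v)) = v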

Nonstandard bases and general vector spaces

The above discussion about (standard) matrices of linear transformations had to be restricted to linear transformations T : R^n → R^m. This was because R^n has a standard basis {e_1, e_2, ..., e_n} that we could use. Suppose T : V → W is a linear transformation between two abstract vector spaces V, W. Since V and W have no standard bases, we cannot associate a matrix to T in the same way. But if we fix a basis B of V and a basis B' of W, we can associate a matrix to T. We do it as follows.

Theorem 6.3.6 Suppose T : V → W is a linear transformation between two vector spaces V, W. Let B = {v_1, v_2, ..., v_n} be a basis of V and B' = {w_1, w_2, ..., w_m} be a basis of W. We can write

T(v_1) = a_11 w_1 + a_21 w_2 + · · · + a_m1 w_m.

Writing similar equations for T(v_2), ..., T(v_n), we get

[T(v_1) T(v_2) · · · T(v_n)] = [w_1 w_2 · · · w_m] A,

where A = [a_ij] is the m × n matrix whose j-th column holds the B'-coordinates of T(v_j). Then, for v = x_1 v_1 + x_2 v_2 + · · · + x_n v_n, we have

T(v) = [w_1 w_2 · · · w_m] A (x_1, x_2, ..., x_n)^T.

So T is determined by A, with respect to the bases B, B'.

Exercise 6.3.7 (Ex 6, p 97) Let T(x, y, z) = (5x − y + z, z + 4y, 5x + y).

What is the standard matrix of T?

Solution: We have T : R^3 → R^3. We write vectors in R^3 as columns (x, y, z)^T instead of as rows (x, y, z). Recall the standard basis of R^3: e_1 = (1, 0, 0)^T, e_2 = (0, 1, 0)^T, e_3 = (0, 0, 1)^T. Evaluating T at each of these gives T(e_1) = (5, 0, 5)^T together with T(e_2) and T(e_3), and the standard matrix of T is the 3 × 3 matrix A = [T(e_1) T(e_2) T(e_3)] whose columns are these images.

Exercise 6.3.8 (Ex, p 97) Let T(x, y, z) = (x + y, y − z). Write down the standard matrix of T and use it to find T(, , ).

Solution: In this case, T : R^3 → R^2. With the standard basis e_1, e_2, e_3 as in Exercise 6.3.7, we compute T(e_1), T(e_2), T(e_3) as columns.

So, the standard matrix of T is A = [T(e_1) T(e_2) T(e_3)]. Therefore, T(, , ) = A(, , )^T = (, 4)^T. We will write our answer as a row: T(, , ) = (, 4).

Exercise 6.3.9 (Ex 8, p 97) Let T be the reflection in the line y = x in R^2, so T(x, y) = (y, x).

1. Write down the standard matrix of T.
Solution: In this case, T : R^2 → R^2. With the standard basis e_1 = (1, 0)^T, e_2 = (0, 1)^T, we have T(e_1) = (0, 1)^T and T(e_2) = (1, 0)^T. So, the standard matrix of T is

A = [ 0  1
      1  0 ].

2. Use the standard matrix to compute T(, 4).
Solution: Of course, we know T(, 4) = (4, ); the point is to use the standard matrix to get the same answer. We have T(, 4) = A(, 4)^T = (4, )^T, that is, T(, 4) = (4, ).

Lemma 6.3.10 Suppose T : R^2 → R^2 is the counterclockwise rotation by a fixed angle θ. Then

T(x, y) = [ cos θ  −sin θ
            sin θ   cos θ ] (x, y)^T.

Proof. We can write x = r cos φ, y = r sin φ, where r = √(x^2 + y^2) and tan φ = y/x. By definition,

T(x, y) = (r cos(φ + θ), r sin(φ + θ)).

Using the trig formulas,

r cos(φ + θ) = r cos φ cos θ − r sin φ sin θ = x cos θ − y sin θ

and

r sin(φ + θ) = r sin φ cos θ + r cos φ sin θ = y cos θ + x sin θ.

So,

T(x, y) = [ cos θ  −sin θ
            sin θ   cos θ ] (x, y)^T.

The proof is complete.
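Lemma 6.3.10 in code: a rotation matrix built from cos θ and sin θ. A small sketch (the angle 30° and the test point are assumptions chosen only for illustration):

    import numpy as np

    def rotation_matrix(theta):
        # Standard matrix of the counterclockwise rotation of R^2 by angle theta (in radians).
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s],
                         [s,  c]])

    A = rotation_matrix(np.radians(30))
    print(A)                          # approximately [[0.866, -0.5], [0.5, 0.866]]
    print(A @ np.array([1.0, 1.0]))   # the rotated image of the point (1, 1)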

Exercise 6.3.11 (Ex, p 97) Let T be the counterclockwise rotation of R^2 by the given angle.

1. Write down the standard matrix of T.
Solution: We use Lemma 6.3.10 with the given θ:

A = [ cos θ  −sin θ
      sin θ   cos θ ].

2. Compute T(, ).
Solution: We have T(, ) = A(, )^T, and we write the answer as a row.

Exercise 6.3.12 (Ex, p 97) Let T be the projection onto the vector w = (1, 5) in R^2: T(u) = proj_w(u).

1. Find the standard matrix.
Solution: By Definition 6.1.4,

T(x, y) = proj_w(x, y) = ((w · (x, y))/(w · w)) w = ((x + 5y)/26)(1, 5) = ((x + 5y)/26, (5x + 25y)/26).

So, with e_1 = (1, 0)^T and e_2 = (0, 1)^T (write/think of everything as columns), we have T(e_1) = (1/26, 5/26)^T and T(e_2) = (5/26, 25/26)^T. So, the standard matrix is

A = [ 1/26   5/26
      5/26  25/26 ].

2. Compute T(, ).
Solution: We have T(, ) = A(, )^T, and we write the answer in row form.

Exercise 6.3.13 (Ex 6, p 97) Let T(x, y, z) = (x − y + z, x − y, y − 4z).

1. Write down the standard matrix of T.
Solution: With e_1 = (1, 0, 0)^T, e_2 = (0, 1, 0)^T, e_3 = (0, 0, 1)^T (write/think of everything as columns), compute T(e_1), T(e_2), T(e_3). The standard matrix is A = [T(e_1) T(e_2) T(e_3)].

2. Compute T(, , ).
Solution: We have T(, , ) = A(, , )^T.

Exercise 6.3.14 (Ex 4, p 97) Let T_1 : R^2 → R^2, T_1(x, y) = (x − y, x + y), and T_2 : R^2 → R^2, T_2(x, y) = (y, 0). Compute the standard matrices of T = T_2 ∘ T_1 and T' = T_1 ∘ T_2.

Solution: We solve it in three steps:

Step 1: First, compute the standard matrix of T_1. With e_1 = (1, 0)^T and e_2 = (0, 1)^T (write/think of everything as columns), we have T_1(e_1) = (1, 1)^T and T_1(e_2) = (−1, 1)^T. So, the standard matrix of T_1 is

A_1 = [ 1  −1
        1   1 ].

Step 2: Now compute the standard matrix of T_2. With the same e_1, e_2, we have T_2(e_1) = (0, 0)^T and T_2(e_2) = (1, 0)^T. So, the standard matrix of T_2 is

A_2 = [ 0  1
        0  0 ].

Step 3: By Theorem 6.3.3, the standard matrix of T = T_2 ∘ T_1 is

A_2 A_1 = [ 1  1
            0  0 ],

so T(x, y) = (x + y, 0). Similarly, the standard matrix of T' = T_1 ∘ T_2 is

A_1 A_2 = [ 0  1
            0  1 ],

so T'(x, y) = (y, y).

Exercise 6.3.15 (Ex 46, p 97) Determine whether T(x, y) = (x + y, x − y)

is invertible or not.

Solution: Because of Theorem 6.3.5, we check whether the standard matrix of T is invertible. With e_1 = (1, 0)^T and e_2 = (0, 1)^T, the columns of the standard matrix A are T(e_1) and T(e_2). Note that det A ≠ 0, so A is invertible, and hence T is invertible.

Exercise 6.3.16 (Ex 58, p 97) Let T(x, y) = (x − y, 0, x + y). Use B = {v_1, v_2} as a basis of the domain R^2 and B' = {w_1, w_2, w_3} as a basis of the codomain R^3, with the vectors v_i and w_j as given in the textbook. Compute the matrix of T with respect to B, B'.

Solution: We use Theorem 6.3.6. We first compute T(v_1) and T(v_2). Then we solve the equation

T(v_1) = a w_1 + b w_2 + c w_3

for a, b, c, which expresses T(v_1) as a combination of w_1, w_2, w_3.

Similarly, we solve

T(v_2) = a w_1 + b w_2 + c w_3

for a, b, c, which expresses T(v_2) as a combination of w_1, w_2, w_3. The coordinates found in these two computations, written as columns, give the matrix A of T with respect to the bases B, B'.
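The recipe of Theorem 6.3.6 and Exercise 6.3.16 is easy to automate: for each basis vector v_j of the domain, solve a linear system for the B'-coordinates of T(v_j) and use those coordinates as column j. A sketch with hypothetical data (the map T and the bases below are assumptions in the spirit of the exercise, not its actual data):

    import numpy as np

    def matrix_wrt_bases(T, B, Bp):
        # Matrix of a linear T relative to bases B (domain) and Bp (codomain).
        # B and Bp are lists of NumPy vectors; column j holds the Bp-coordinates of T(B[j]).
        W = np.column_stack(Bp)                        # codomain basis vectors as columns
        cols = [np.linalg.solve(W, T(v)) for v in B]   # coordinates of T(v_j) in the basis Bp
        return np.column_stack(cols)

    T = lambda v: np.array([v[0] - v[1], 0.0, v[0] + v[1]])
    B = [np.array([1.0, 2.0]), np.array([1.0, 1.0])]
    Bp = [np.array([1.0, 1.0, 0.0]), np.array([0.0, 1.0, 1.0]), np.array([1.0, 0.0, 1.0])]
    print(matrix_wrt_bases(T, B, Bp))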

6.4 Transition Matrices and Similarity

We will skip this section; I will just explain the section heading. You know what the matrices of a linear transformation T : V → W with respect to a pair of bases are: they are the matrices described in Theorem 6.3.6.

Definition 6.4.1 Suppose A, B are two square matrices of size n × n. We say A and B are similar if A = P^{-1}BP for some invertible matrix P.

6.5 Applications of Linear Transformations

Homework: Textbook, 6.5, Ex (a), (a), 5, 7, 9, 5, 7, 9, 4, 49, 5, 5, 55, 6, 65; page 44-45.

In this section, we discuss geometric interpretations of the linear transformations represented by elementary matrices.

Proposition 6.5.1 Let A be a 2 × 2 matrix and T(x, y) = A(x, y)^T. (We will write the right-hand side as a row, which is an abuse of notation.)

1. If A = [ −1  0
             0  1 ],
then T(x, y) = A(x, y)^T = (−x, y) represents the reflection in the y axis. See Textbook, Example (a), p 47, for the diagram.

2. If A = [ 1   0
            0  −1 ],
then T(x, y) = A(x, y)^T = (x, −y) represents the reflection in the x axis. See Textbook, Example (b), p 47, for the diagram.

3. If A = [ 0  1
            1  0 ],
then T(x, y) = A(x, y)^T = (y, x) represents the reflection in the line y = x. See Textbook, Example (c), p 47, for the diagram.

4. If A = [ k  0
            0  1 ],
then T(x, y) = A(x, y)^T = (kx, y). If k > 1, then T represents an expansion in the horizontal direction, and if 0 < k < 1, then T represents a contraction in the horizontal direction. See Textbook, Example Fig 6, p 49, for diagrams.

5. If A = [ 1  0
            0  k ],
then T(x, y) = A(x, y)^T = (x, ky). If k > 1, then T represents an expansion in the vertical direction, and if 0 < k < 1, then T represents a contraction in the vertical direction. See Textbook, Example Fig 6, p 49, for diagrams.

6. If A = [ 1  k
            0  1 ],
then T(x, y) = A(x, y)^T = (x + ky, y). Then T represents a horizontal shear (assume k > 0): the upper half-plane is sheared to the right and the lower half-plane is sheared to the left, while the points on the x axis remain fixed. See Textbook, Example, Fig 6.4, p 49, for diagrams.

7. If A = [ 1  0
            k  1 ],
then T(x, y) = A(x, y)^T = (x, kx + y). Then T represents a vertical shear (assume k > 0): the right half-plane is sheared upward and the left half-plane is sheared downward, while the points on the y axis remain fixed. See Textbook, Example, Fig 6.5, p 49, for diagrams.

Computer Graphics

Linear transformations are used in computer graphics to move figures on computer screens. I am sure all kinds of linear (and nonlinear) transformations are used; here, we will only deal with rotations by an angle θ around (1) the x axis, (2) the y axis, and (3) the z axis, as follows.

Proposition 6.5.2 Suppose θ is an angle and suppose we want to rotate the point (x, y, z) counterclockwise about the z axis through the angle θ. Let us denote this transformation by T and write T(x, y, z) = (x', y', z')^T. Then, using Lemma 6.3.10, we have

T(x, y, z) = (x', y', z')^T = [ cos θ  −sin θ  0
                                sin θ   cos θ  0
                                0       0      1 ] (x, y, z)^T = (x cos θ − y sin θ, x sin θ + y cos θ, z)^T.

Similarly, we can write down the linear transformations corresponding to rotation around the x axis and the y axis. We write down the standard matrices for these three rotations as follows:

1. The standard matrix for the counterclockwise rotation by an angle θ about the x axis is

[ 1   0        0
  0   cos θ   −sin θ
  0   sin θ    cos θ ].

2. The standard matrix for the counterclockwise rotation by an angle θ about the y axis is

[ cos θ   0   sin θ
  0       1   0
 −sin θ   0   cos θ ].

3. The standard matrix for the counterclockwise rotation by an angle θ about the z axis is

[ cos θ  −sin θ  0
  sin θ   cos θ  0
  0       0      1 ].

Reading assignment: Read Textbook, Examples 4 and 5; page 4-4.
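The three rotation matrices above translate directly into code. A compact sketch (the function names rot_x, rot_y, rot_z and the test point are assumptions for illustration):

    import numpy as np

    def rot_x(t):
        c, s = np.cos(t), np.sin(t)
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]], dtype=float)

    def rot_y(t):
        c, s = np.cos(t), np.sin(t)
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]], dtype=float)

    def rot_z(t):
        c, s = np.cos(t), np.sin(t)
        return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], dtype=float)

    # Rotating (1, 0, 0) by 90 degrees about the z axis sends it to (0, 1, 0).
    print(np.round(rot_z(np.radians(90)) @ np.array([1.0, 0.0, 0.0]), 10))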

Exercise 6.5.3 (Ex 6, p 44) Let T be the vertical expansion T(x, y) = (x, ky), for the given constant k > 1. Sketch the image of the unit square with vertices (0, 0), (1, 0), (1, 1), (0, 1).

Solution: We have T(0, 0) = (0, 0), T(1, 0) = (1, 0), T(1, 1) = (1, k), T(0, 1) = (0, k).

Diagram: the unit square and its image; the solid arrows represent the original square and the broken arrows represent the image.

Exercise 6.5.4 (Ex, p 44) Let T be the reflection in the line y = x. Sketch the image of the rectangle with the given vertices.

Solution: Recall (see Proposition 6.5.1(3)) that T(x, y) = (y, x), so each vertex (a, b) is sent to (b, a).

Diagram: the rectangle and its image; the solid arrows represent the original rectangle and the broken arrows represent the image.

Exercise 6.5.5 (Ex 8, p 44) Suppose T is the expansion and contraction represented by T(x, y) = (kx, y/m), for the given constants k, m > 1 (a horizontal expansion combined with a vertical contraction). Sketch the image of the rectangle with the given vertices.

Solution: By Proposition 6.5.1(4) and (5), T stretches each vertex horizontally by the factor k and compresses it vertically by the factor m; apply this to each of the four vertices.

Diagram: the rectangle and its image; the solid arrows represent the original rectangle and the broken arrows represent the image.

Exercise 6.5.6 (Ex 44, p 44) Give the geometric description of the linear transformation defined by the elementary matrix

A = [ 1  k
      0  1 ].

Solution: By Proposition 6.5.1(6), this is a horizontal shear. Here, T(x, y) = (x + ky, y), where k is the given entry of A.

Exercise 6.5.7 (Ex 5 and 54, p 45) Find the matrix of the transformation T that produces a 60° rotation about the x axis. Then compute the image of the given point under T.

Solution: By Proposition 6.5.2(1), the matrix is given by

A = [ 1   0         0
      0   cos 60°  −sin 60°
      0   sin 60°   cos 60° ]
  = [ 1   0       0
      0   1/2    −√3/2
      0   √3/2    1/2 ].

So, T(x, y, z) = A(x, y, z)^T = (x, (y − √3 z)/2, (√3 y + z)/2)^T, which can now be evaluated at the given point.

Exercise 6.5.8 (Ex 64, p 45) Determine the matrix that produces a 45° rotation about the y axis followed by a 90° rotation about the z axis. Then also compute the image of the given line segment.

Solution: We will do it in three (or four) steps.

Step 1: Let T_1 be the rotation by 45° about the y axis. By Proposition 6.5.2(2), the matrix of T_1 is given by

A = [ cos 45°   0   sin 45°
      0         1   0
     −sin 45°   0   cos 45° ]
  = [ √2/2   0   √2/2
      0      1   0
     −√2/2   0   √2/2 ].

Step 2: Let T_2 be the rotation by 90° about the z axis. By Proposition 6.5.2(3), the matrix of T_2 is given by

B = [ cos 90°  −sin 90°  0
      sin 90°   cos 90°  0
      0         0        1 ]
  = [ 0  −1  0
      1   0  0
      0   0  1 ].

Step 3: So, the matrix of the composite transformation T = T_2 ∘ T_1 is the product

BA = [ 0      −1   0
       √2/2    0   √2/2
      −√2/2    0   √2/2 ].

The last part: So, the image of the given line segment is obtained by applying BA to each of its two endpoints.
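The computation in Exercise 6.5.8 is a short calculation in code as well. A self-contained check (the sample point fed to BA is an assumption for illustration, not the exercise's endpoint):

    import numpy as np

    c45 = np.sqrt(2) / 2
    A = np.array([[ c45, 0.0, c45],     # rotation about the y axis by 45 degrees
                  [ 0.0, 1.0, 0.0],
                  [-c45, 0.0, c45]])
    B = np.array([[0.0, -1.0, 0.0],     # rotation about the z axis by 90 degrees
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])

    BA = B @ A                          # matrix of the composite transformation
    print(BA)
    print(BA @ np.array([1.0, 1.0, 1.0]))   # image of a sample point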
