Section 7.4 Matrix Representations of Linear Operators


Definition. Let B = {v_1, v_2, ..., v_n} be a basis of V. The coordinate mapping Φ_B : V → R^n is defined by Φ_B(c_1 v_1 + c_2 v_2 + ... + c_n v_n) = [c_1 c_2 ... c_n]^T.

Property: [u + v]_B = [u]_B + [v]_B and [cu]_B = c[u]_B for all u, v in V and every scalar c.

Example: Let V = Span B, where B = {e^t cos t, e^t sin t} is linearly independent and thus a basis of V. Consider the function v = e^t cos(t − π/4). Is it in V? Yes, since

v = (1/√2) e^t (cos t + sin t) = (1/√2) e^t cos t + (1/√2) e^t sin t.

In addition, [v]_B = [1/√2  1/√2]^T.
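The coordinates can also be recovered numerically by sampling the functions; a minimal numpy sketch (the sample points t = 0 and t = 1 are an arbitrary choice):

```python
import numpy as np

# Basis functions for V = Span{e^t cos t, e^t sin t}.
b1 = lambda t: np.exp(t) * np.cos(t)
b2 = lambda t: np.exp(t) * np.sin(t)
v  = lambda t: np.exp(t) * np.cos(t - np.pi / 4)

# Sampling at two points gives a 2x2 linear system for the
# coordinates c1, c2 with v = c1*b1 + c2*b2.
ts = np.array([0.0, 1.0])
M = np.column_stack([b1(ts), b2(ts)])
c = np.linalg.solve(M, v(ts))

print(c)  # both coordinates equal 1/sqrt(2) ≈ 0.7071
```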

Consider a linear transformation T : V → W.

Questions: 1) Can we define a standard matrix for T? 2) If not, what kind of matrix representation of T can we formulate?

In this course, we consider only the simpler case where T is a linear operator (i.e., the domain and the codomain are the same vector space).

Let T : V → V be a linear operator on an n-dimensional vector space V with a basis B. Define the linear operator Φ_B ∘ T ∘ (Φ_B)^{-1} : R^n → R^n, and consider its standard matrix A, called the matrix representation of T with respect to B and denoted [T]_B. With this notation, [T]_B = A and T_A = Φ_B ∘ T ∘ (Φ_B)^{-1}.

                 T
          V ----------> V
          ^             |
  (Φ_B)^{-1}            | Φ_B
          |             v
         R^n ---------> R^n
          Φ_B ∘ T ∘ (Φ_B)^{-1}

Question: How can we express [T]_B in terms of T and the basis vectors b_1, b_2, ..., b_n?

Property: If B = {v_1, v_2, ..., v_n}, then [T]_B = [ [T(v_1)]_B  [T(v_2)]_B  ...  [T(v_n)]_B ].

Proof: With [T]_B = A, we have A e_j = T_A(e_j) = Φ_B ∘ T ∘ (Φ_B)^{-1}(e_j) = Φ_B(T(v_j)) = [T(v_j)]_B.

Example: Let T : P_2 → P_2 be defined by T(p(x)) = p(0) + 3p(1)x + p(2)x^2 for all p(x) in P_2. Then T is linear. For B = {1, x, x^2}, [T]_B = A = [ a_1 a_2 a_3 ], where

a_1 = [T(1)]_B = [1 + 3x + x^2]_B = [1 3 1]^T,
a_2 = [T(x)]_B = [3x + 2x^2]_B = [0 3 2]^T,
a_3 = [T(x^2)]_B = [3x + 4x^2]_B = [0 3 4]^T,

so

[T]_B = [ 1 0 0
          3 3 3
          1 2 4 ].
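The column-by-column recipe can be checked in a few lines of numpy, representing each polynomial in P_2 by its B-coordinates (the helper T below is an illustrative encoding, not from the text):

```python
import numpy as np

# Represent p(x) = a0 + a1*x + a2*x^2 by its coordinate vector
# [a0, a1, a2] relative to B = {1, x, x^2}.
def T(p):
    """T(p)(x) = p(0) + 3*p(1)*x + p(2)*x^2, returned in B-coordinates."""
    val = lambda t: p[0] + p[1] * t + p[2] * t**2
    return np.array([val(0), 3 * val(1), val(2)])

# [T]_B has columns [T(1)]_B, [T(x)]_B, [T(x^2)]_B.
basis = np.eye(3)
A = np.column_stack([T(basis[:, j]) for j in range(3)])
print(A)  # [[1 0 0], [3 3 3], [1 2 4]]
```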

Example: Let V = Span B, where B = {e^t cos t, e^t sin t} is linearly independent and thus a basis of V, and let the linear operator D : V → V be defined by D(f) = f′ for all f in V. Then

D(e^t cos t) = (1) e^t cos t + (−1) e^t sin t,
D(e^t sin t) = (1) e^t cos t + (1) e^t sin t,

so

[D]_B = [  1  1
          −1  1 ].

Theorem 7.10: Let T : V → V be a linear operator on a finite-dimensional vector space V with a basis B. Then [T(v)]_B = [T]_B [v]_B for all v in V.

Proof: [T(v)]_B = Φ_B(T(v)) = Φ_B ∘ T ∘ (Φ_B)^{-1} ∘ Φ_B(v) = T_A([v]_B) = [T]_B [v]_B, where A = [T]_B.

Example: Relative to the basis B = {1, x, x^2} of P_2, the coordinate vector of p(x) = 5 − 4x + 3x^2 is [p(x)]_B = [5 −4 3]^T. Then [p′(x)]_B = [D(p(x))]_B = [D]_B [p(x)]_B, where D : P_2 → P_2 is defined by D(p(x)) = p′(x), and

[D]_B = [ 0 1 0
          0 0 2
          0 0 0 ],

so [p′(x)]_B = [D]_B [5 −4 3]^T = [−4 6 0]^T, i.e., p′(x) = −4 + 6x.
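A quick numpy check of this matrix-vector multiplication:

```python
import numpy as np

# [D]_B for differentiation on P_2 relative to B = {1, x, x^2}.
D = np.array([[0, 1, 0],
              [0, 0, 2],
              [0, 0, 0]])

p = np.array([5, -4, 3])  # p(x) = 5 - 4x + 3x^2
print(D @ p)              # [-4  6  0], i.e. p'(x) = -4 + 6x
```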

The Matrix Representation of the Inverse of an Invertible Linear Operator

Theorem: Let T : V → V be a linear operator on a finite-dimensional vector space V with a basis B, and let A = [T]_B. Then (a) T is invertible if and only if A is invertible, and (b) if T is invertible, then [T^{-1}]_B = A^{-1}.

Proof (a): Note that Φ_B is an isomorphism with an inverse (Φ_B)^{-1}, which is also an isomorphism. If T is invertible, then T_A = Φ_B ∘ T ∘ (Φ_B)^{-1} is a composition of isomorphisms, so T_A is invertible and has an invertible standard matrix A. Conversely, if A is invertible, then T_A = Φ_B ∘ T ∘ (Φ_B)^{-1} is invertible, so T = (Φ_B)^{-1} ∘ T_A ∘ Φ_B is invertible.

Proof (b): By (a) and the invertibility of T, T_C = Φ_B ∘ T^{-1} ∘ (Φ_B)^{-1}, where C = [T^{-1}]_B. Also by (a), T_{A^{-1}} = (T_A)^{-1} = Φ_B ∘ T^{-1} ∘ (Φ_B)^{-1}. Hence T_C = T_{A^{-1}}, so C = A^{-1}.

Example: In the vector space V with basis B = {e^t cos t, e^t sin t} and the linear operator D : V → V defined by D(f) = f′ for all f in V, it is known that

[D]_B = [  1  1
          −1  1 ].

An antiderivative of e^t sin t is D^{-1}(e^t sin t). Since [D^{-1}]_B = ([D]_B)^{-1} and [e^t sin t]_B = [0 1]^T,

[D^{-1}(e^t sin t)]_B = ([D]_B)^{-1} [0 1]^T = [−1/2  1/2]^T,

i.e., D^{-1}(e^t sin t) = −(e^t cos t)/2 + (e^t sin t)/2.
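A numpy sketch of the same antiderivative computation:

```python
import numpy as np

# [D]_B relative to B = {e^t cos t, e^t sin t}.
D = np.array([[1.0, 1.0],
              [-1.0, 1.0]])

# Antiderivative of e^t sin t: apply ([D]_B)^{-1} to [0, 1]^T.
coords = np.linalg.solve(D, np.array([0.0, 1.0]))
print(coords)  # [-0.5  0.5]: -(e^t cos t)/2 + (e^t sin t)/2
```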

Definition. Let T : V → V be a linear operator. A nonzero vector v in V is an eigenvector of T corresponding to the eigenvalue λ if T(v) = λv.

Example: For the linear operator D : C → C defined by D(f) = f′, where C = {f : R → R | f has derivatives of all orders}, each function e^{λt} in C is an eigenvector corresponding to the eigenvalue λ, since D(f)(t) = (e^{λt})′ = λe^{λt} = λf(t). Thus any scalar λ is an eigenvalue of D, so D has infinitely many eigenvalues.

Example: A function f is a solution of y′′ + 4y = 0 if and only if f is in C, since a solution must be twice-differentiable with f′′ = −4f, which implies that the fourth derivative of f exists (f′′′′ = −4f′′), and so on. Hence a nonzero solution f is an eigenvector of D^2 : C → C corresponding to the eigenvalue −4, i.e., f lies in the eigenspace of D^2 corresponding to the eigenvalue −4. Conversely, every vector in the eigenspace of D^2 corresponding to the eigenvalue −4 is a solution of y′′ + 4y = 0.

Example: U : R^{n×n} → R^{n×n} defined by U(A) = A^T is an isomorphism. The eigenspace of U corresponding to the eigenvalue 1 is the set of A in R^{n×n} such that U(A) = A^T = A, i.e., the set of symmetric matrices. The eigenspace of U corresponding to the eigenvalue −1 is the set of A in R^{n×n} such that U(A) = A^T = −A, i.e., the set of skew-symmetric matrices. U has only the eigenvalues 1 and −1, since U(A) = A^T = λA implies A = (A^T)^T = (λA)^T = λA^T = λ(λA) = λ^2 A, so λ^2 = 1.
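A small numerical illustration of the two eigenspaces (the particular matrices S and K below are arbitrary examples, not from the slides):

```python
import numpy as np

# U(A) = A^T. Check eigenvectors for eigenvalues 1 and -1.
S = np.array([[1.0, 2.0], [2.0, 5.0]])   # symmetric: U(S) = S
K = np.array([[0.0, 3.0], [-3.0, 0.0]])  # skew-symmetric: U(K) = -K

print(np.allclose(S.T, 1 * S))   # True: S is in the eigenspace for 1
print(np.allclose(K.T, -1 * K))  # True: K is in the eigenspace for -1
```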

Eigenvalues and Eigenvectors of a Matrix Representation of a Linear Operator

Theorem: Let T : V → V be a linear operator on a finite-dimensional vector space V with a basis B, and let A = [T]_B. Then v ≠ 0 is an eigenvector of T corresponding to the eigenvalue λ if and only if [v]_B is an eigenvector of A corresponding to the eigenvalue λ.

Proof (only if): Suppose v ≠ 0 satisfies T(v) = λv. Then [v]_B ≠ 0 and A[v]_B = [T]_B [v]_B = [T(v)]_B = [λv]_B = λ[v]_B.

Proof (if): Suppose w ≠ 0 satisfies Aw = [T]_B w = λw. Let v = (Φ_B)^{-1}(w) ≠ 0. Then Φ_B(T(v)) = [T(v)]_B = [T]_B [v]_B = λ[v]_B = Φ_B(λv), so T(v) = λv since Φ_B is an isomorphism.

Example: Let T : P_2 → P_2 be defined by T(p(x)) = p(0) + 3p(1)x + p(2)x^2 for all p(x) in P_2. Then T is linear, and

[T]_B = A = [ 1 0 0
              3 3 3
              1 2 4 ],

where B = {1, x, x^2} is a basis of P_2. The characteristic polynomial of A is −(t − 1)^2 (t − 6).

Span{[0 −3 2]^T} is the eigenspace of A corresponding to the eigenvalue 1, so a·p(x) with a ≠ 0 and p(x) = −3x + 2x^2 is an eigenvector of T corresponding to the eigenvalue 1.

Span{[0 1 1]^T} is the eigenspace of A corresponding to the eigenvalue 6, so b·q(x) with b ≠ 0 and q(x) = x + x^2 is an eigenvector of T corresponding to the eigenvalue 6.
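These eigenvalues and the eigenvector for the eigenvalue 6 can be confirmed numerically:

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [3.0, 3.0, 3.0],
              [1.0, 2.0, 4.0]])

vals, vecs = np.linalg.eig(A)
print(np.round(np.sort(vals.real), 6))  # eigenvalues 1 (twice) and 6

# The eigenvector for 6 is a multiple of [0, 1, 1]^T.
w = vecs[:, np.argmax(vals.real)]
print(w / w[2])
```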

Example: U : R^{2×2} → R^{2×2} defined by U(A) = A^T is an isomorphism and has only the eigenvalues 1 and −1. Let B = {E_11, E_12, E_21, E_22} be the standard basis of R^{2×2}, where E_ij has a 1 in entry (i, j) and 0 elsewhere. Then

[U]_B = [ 1 0 0 0
          0 0 1 0
          0 1 0 0
          0 0 0 1 ].

The characteristic polynomial of [U]_B is (t − 1)^3 (t + 1), for which indeed the only roots are 1 and −1.
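The matrix [U]_B can be built programmatically and its eigenvalues checked (row-major flattening matches the ordering E_11, E_12, E_21, E_22):

```python
import numpy as np

# Build [U]_B for U(A) = A^T on 2x2 matrices, one basis matrix at a time.
n = 2
cols = []
for j in range(n * n):
    E = np.zeros(n * n)
    E[j] = 1.0
    cols.append(E.reshape(n, n).T.reshape(n * n))  # [U(E_j)]_B
U = np.column_stack(cols)
print(U)

# Eigenvalues: 1 with multiplicity 3, -1 with multiplicity 1.
print(np.sort(np.linalg.eigvals(U).real))  # [-1.  1.  1.  1.]
```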

Homework Set for Section 7.4

Section 7.4: Problems 1, 5, 9, 11, 19, 21, 23, 28, 32, 36, 39, 40, 43, 44, 46.