CHAPTER 5. EIGENVALUES AND EIGENVECTORS

5.6 Solutions

5.1 Suppose T ∈ L(V). Prove that if U_1, ..., U_M are subspaces of V invariant under T, then U_1 + ⋯ + U_M is invariant under T.

Suppose v = u_1 + ⋯ + u_M ∈ U_1 + ⋯ + U_M, with u_m ∈ U_m. Then Tv = Tu_1 + ⋯ + Tu_M. Since each U_m is invariant under T, we have Tu_m ∈ U_m, so Tv ∈ U_1 + ⋯ + U_M.

5.2 Suppose T ∈ L(V). Prove that the intersection of any collection of subspaces of V invariant under T is invariant under T.

Suppose we have a collection of subspaces {U_r}, each invariant under T. Let v ∈ ∩_r U_r. Then v ∈ U_r for every r, so Tv ∈ U_r for every r, and hence Tv ∈ ∩_r U_r. Thus ∩_r U_r is invariant under T.

5.3 Prove or give a counterexample: if U is a subspace of V that is invariant under every operator on V, then U = {0} or U = V.

We'll prove the contrapositive: if U is a subspace of V with U ≠ {0} and U ≠ V, then there is an operator T on V such that U is not invariant under T. Let (u_1, ..., u_M) be a basis for U, which we extend to a basis (u_1, ..., u_M, v_1, ..., v_N) of V. The assumption U ≠ {0} and U ≠ V means that M ≥ 1 and N ≥ 1. Define a linear map T by Tu_1 = v_1 and T = 0 on the remaining basis vectors. Since v_1 ∉ U, the subspace U is not invariant under the operator T.

5.4 Suppose S, T ∈ L(V) are such that ST = TS. Prove that null(T − λI) is invariant under S for every λ ∈ F.

Suppose Tv = λv. Then Sv satisfies T(Sv) = S(Tv) = S(λv) = λ(Sv), so Sv ∈ null(T − λI). Thus null(T − λI) is invariant under S.
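As an illustrative numerical check of Exercise 5.4 (not part of the printed solution), the sketch below uses an arbitrarily chosen commuting pair S, T and verifies that S maps null(T − 2I) into itself.

```python
import numpy as np

# Arbitrary commuting pair chosen for the illustration: T is diagonal,
# S respects T's block structure, so ST = TS.
T = np.diag([2.0, 2.0, 5.0])
S = np.array([[1.0, 2.0, 0.0],
              [3.0, 4.0, 0.0],
              [0.0, 0.0, 7.0]])
assert np.allclose(S @ T, T @ S)  # ST = TS

# null(T - 2I) is spanned by e1 and e2; S should map it into itself.
for v in (np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])):
    Sv = S @ v
    assert np.allclose((T - 2 * np.eye(3)) @ Sv, 0)  # Sv stays in null(T - 2I)
print("S maps null(T - 2I) into itself, as Exercise 5.4 predicts.")
```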

5.5 Define T ∈ L(F^2) by T(w, z) = (z, w). Find all eigenvalues and eigenvectors of T.

Suppose (w, z) ≠ (0, 0) and T(w, z) = (z, w) = λ(w, z). This gives

z = λw, w = λz,

and hence

w = λz = λ^2 w, z = λw = λ^2 z.

Since w ≠ 0 or z ≠ 0, we see that λ^2 = 1, so λ = ±1. A basis of eigenvectors is (w_1, z_1) = (1, 1), (w_2, z_2) = (−1, 1), with eigenvalues 1 and −1 respectively.

5.6 Define T ∈ L(F^3) by T(z_1, z_2, z_3) = (2z_2, 0, 5z_3). Find all eigenvalues and eigenvectors of T.

Suppose (z_1, z_2, z_3) ≠ (0, 0, 0) and T(z_1, z_2, z_3) = (2z_2, 0, 5z_3) = λ(z_1, z_2, z_3). If λ = 0 then z_2 = z_3 = 0, and one checks easily that v_1 = (1, 0, 0) is an eigenvector with eigenvalue 0. If λ ≠ 0 then z_2 = 0, so 2z_2 = λz_1 = 0 and 5z_3 = λz_3, which gives z_1 = 0 and λ = 5. An eigenvector for λ = 5 is v_2 = (0, 0, 1). These are the only eigenvalues, and each eigenspace is one dimensional.
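A quick numerical cross-check of Exercises 5.5 and 5.6, added here rather than part of the printed solution; the matrices are just the standard-basis matrices of the two operators.

```python
import numpy as np

# Exercise 5.5: matrix of T(w, z) = (z, w) in the standard basis.
T1 = np.array([[0.0, 1.0],
               [1.0, 0.0]])
print(np.linalg.eigvals(T1))   # 1 and -1, matching λ = ±1 above

# Exercise 5.6: matrix of T(z1, z2, z3) = (2 z2, 0, 5 z3).
T2 = np.array([[0.0, 2.0, 0.0],
               [0.0, 0.0, 0.0],
               [0.0, 0.0, 5.0]])
print(np.linalg.eigvals(T2))   # characteristic roots 0, 0, 5
# The only eigenvalues are 0 and 5; the eigenspace for 0 is spanned by (1, 0, 0) alone.
```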

5.7 Suppose N is a positive integer and T ∈ L(F^N) is defined by T(x_1, ..., x_N) = (x_1 + ⋯ + x_N, ..., x_1 + ⋯ + x_N). Find all eigenvalues and eigenvectors of T.

First, any vector of the form v_1 = (α, ..., α), α ∈ F, is an eigenvector with eigenvalue N. If v_2 = (x_1, ..., x_N) is any vector with x_1 + ⋯ + x_N = 0, then v_2 is an eigenvector with eigenvalue 0. Here are N independent eigenvectors:

v_1 = (1, 1, ..., 1) and v_n = (1, 0, ..., 0) − E_n for n ≥ 2,

where E_n denotes the n-th standard basis vector.

5.10 Suppose T ∈ L(V) is invertible and λ ∈ F \ {0}. Prove that λ is an eigenvalue of T if and only if 1/λ is an eigenvalue of T^{-1}.

Suppose v ≠ 0 and Tv = λv. Then v = T^{-1}Tv = λT^{-1}v, or T^{-1}v = (1/λ)v. The other direction is similar.

5.11 Suppose S, T ∈ L(V). Prove that ST and TS have the same eigenvalues.

Suppose v ≠ 0 and STv = λv. Multiply by T to get TS(Tv) = λ(Tv). Thus if Tv ≠ 0, then λ is also an eigenvalue of TS, with nonzero eigenvector Tv. On the other hand, if Tv = 0, then λv = STv = 0, so λ = 0 is an eigenvalue of ST; moreover T is not invertible, so range(TS) ⊆ range(T) is not all of V, hence TS is not surjective, has a nontrivial null space, and therefore also has 0 as an eigenvalue.
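The following sketch, again an added check rather than part of the solutions, confirms the eigenvalues found in Exercise 5.7 and spot-checks Exercise 5.11 with arbitrarily chosen random matrices.

```python
import numpy as np

# Exercise 5.7: in the standard basis, T is the N-by-N all-ones matrix.
N = 5
A = np.ones((N, N))
print(np.round(np.linalg.eigvals(A), 6))   # N once, 0 with multiplicity N - 1

# Exercise 5.11: ST and TS share a characteristic polynomial, hence the same eigenvalues.
rng = np.random.default_rng(0)
S = rng.standard_normal((4, 4))
T = rng.standard_normal((4, 4))
print(np.allclose(np.poly(S @ T), np.poly(T @ S)))   # True
```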

5.12 Suppose T ∈ L(V) is such that every vector in V is an eigenvector of T. Prove that T is a scalar multiple of the identity operator.

Pick a basis (v_1, ..., v_N) for V. By assumption, Tv_n = λ_n v_n for each n. Pick any two distinct indices m, n. By assumption we also have, for some scalar λ,

T(v_m + v_n) = λ(v_m + v_n) = λ_m v_m + λ_n v_n.

Write this as

0 = (λ − λ_m)v_m + (λ − λ_n)v_n.

Since v_m and v_n are independent, λ = λ_m = λ_n, so all the λ_n are equal.

5.14 Suppose S, T ∈ L(V) and S is invertible. Prove that if p ∈ P(F) is a polynomial, then p(STS^{-1}) = S p(T) S^{-1}.

First let's show that for positive integers n, (STS^{-1})^n = S T^n S^{-1}. We may do this by induction, with nothing to show if n = 1. Assume it's true for n = k. Then

(STS^{-1})^{k+1} = (STS^{-1})^k (STS^{-1}) = S T^k S^{-1} S T S^{-1} = S T^{k+1} S^{-1}.

Now suppose p(z) = a_N z^N + ⋯ + a_1 z + a_0. Then

p(STS^{-1}) = Σ_n a_n (STS^{-1})^n = Σ_n a_n S T^n S^{-1} = S (Σ_n a_n T^n) S^{-1} = S p(T) S^{-1}.
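As an added illustration of Exercise 5.14, the snippet below verifies p(STS^{-1}) = S p(T) S^{-1} for one sample polynomial and arbitrarily chosen matrices.

```python
import numpy as np

# Sample polynomial p(z) = z^2 + 3z + 2, evaluated on a square matrix.
def p(A):
    I = np.eye(A.shape[0])
    return A @ A + 3 * A + 2 * I

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))       # a random matrix is invertible with probability 1
S_inv = np.linalg.inv(S)

print(np.allclose(p(S @ T @ S_inv), S @ p(T) @ S_inv))   # True
```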

5.15 Suppose F = C, T ∈ L(V), p ∈ P(C), and α ∈ C. Prove that α is an eigenvalue of p(T) if and only if α = p(λ) for some eigenvalue λ of T.

Suppose first that v ≠ 0 is an eigenvector of T with eigenvalue λ, that is, Tv = λv. Then for positive integers n,

T^n v = T^{n−1}(λv) = ⋯ = λ^n v,

and so p(T)v = p(λ)v. That is, α = p(λ) is an eigenvalue of p(T) whenever λ is an eigenvalue of T.

Suppose now that α is an eigenvalue of p(T), so there is a v ≠ 0 with

p(T)v = αv, or (p(T) − αI)v = 0.

Since F = C, we may factor the polynomial p(z) − α into linear factors, so that

0 = (p(T) − αI)v = c(T − λ_1 I) ⋯ (T − λ_N I)v

for some constant c ≠ 0. Since v ≠ 0, at least one of the factors is not invertible, so at least one of the λ_n, say λ_1, is an eigenvalue of T. Let w ≠ 0 be an eigenvector for T with eigenvalue λ_1. Because the factors commute,

0 = c(T − λ_N I) ⋯ (T − λ_1 I)w = (p(T) − αI)w,

so w is an eigenvector for p(T) with eigenvalue α. But by the first part of the argument, p(T)w = p(λ_1)w = αw, and hence α = p(λ_1).

5.16 Show that the result in the previous exercise does not hold if C is replaced with R.

Take T : R^2 → R^2 given by T(x, y) = (−y, x). We've seen previously that T has no real eigenvalues. On the other hand, with p(z) = z^2,

p(T)(x, y) = T^2(x, y) = (−x, −y) = −1 · (x, y),

so −1 is an eigenvalue of p(T) even though it is not p(λ) for any eigenvalue λ of T.
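An added numerical illustration of the counterexample in Exercise 5.16:

```python
import numpy as np

# Exercise 5.16: T(x, y) = (-y, x) as a matrix, with p(z) = z^2.
T = np.array([[0.0, -1.0],
              [1.0,  0.0]])
print(np.linalg.eigvals(T))      # +i and -i, so T has no real eigenvalues
print(np.linalg.eigvals(T @ T))  # -1 twice: an eigenvalue of p(T) that is not p(λ) for any real λ
```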

5.17 Suppose V is a complex vector space and T ∈ L(V). Prove that T has an invariant subspace of dimension j for each j = 1, ..., dim V.

Let (v_1, ..., v_N) be a basis with respect to which T has an upper triangular matrix. Then by Proposition 5.12,

T : span(v_1, ..., v_j) → span(v_1, ..., v_j),

so span(v_1, ..., v_j) is a j-dimensional invariant subspace.

5.18 Give an example of an operator whose matrix with respect to some basis contains only 0's on the diagonal, but the operator is invertible.

T = [ 0  1 ]
    [ 1  0 ]

5.19 Give an example of an operator whose matrix with respect to some basis contains only nonzero numbers on the diagonal, but the operator is not invertible.

We can use the idea of problem 5.7, taking

T = [ 1  1 ]
    [ 1  1 ]

If v = (1, −1), then Tv = 0, so T is not invertible.
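Finally, an added check of the two examples in Exercises 5.18 and 5.19:

```python
import numpy as np

# Exercise 5.18: zero diagonal, yet invertible.
T18 = np.array([[0.0, 1.0],
                [1.0, 0.0]])
print(np.linalg.det(T18))               # -1.0, so T18 is invertible

# Exercise 5.19: nonzero diagonal, yet singular.
T19 = np.array([[1.0, 1.0],
                [1.0, 1.0]])
print(np.linalg.det(T19))               # 0.0
print(T19 @ np.array([1.0, -1.0]))      # [0. 0.]: the vector (1, -1) is in the null space
```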