Solutions, Chapter 8: Simple Dynamical Systems. © 2004 Peter J. Olver


8.1.1. Solve the following initial value problems: (a) du/dt = u, u(0) = 1; (b) du/dt = u, u(1) = 1; (c) du/dt = u, u(−1) = 1.
Solution: (a) u(t) = e^t; (b) u(t) = e^(t−1); (c) u(t) = e^(t+1).

8.1.2. Suppose a radioactive material has a half-life of 100 years. What is the decay rate γ? Starting with an initial sample of 100 grams, how much will be left after 10 years? 100 years? 1000 years?
Solution: γ = log 2/100 ≈ .00693. After 10 years: 93.3 grams; after 100 years, 50 grams; after 1000 years, .0977 gram.

8.1.3. Carbon-14 has a half-life of 5730 years. Human skeletal fragments discovered in a cave are analyzed and found to have only 64% of the Carbon-14 that living tissue would have. How old are the remains?
Solution: Solve e^(−(log 2) t/5730) = .64 for t = 5730 log .64/log .5 ≈ 3690 years.

8.1.4. Prove that if t* is the half-life of a radioactive material, then u(n t*) = 2^(−n) u(0). Explain the meaning of this equation in your own words.
Solution: By (8.6), u(t) = u(0) e^(−(log 2) t/t*) = u(0) 2^(−t/t*) = 2^(−n) u(0) when t = n t*. After every time period of duration t*, the amount of material is cut in half.

8.1.5. A bacteria colony grows according to the equation du/dt = 1.3 u. How long until the colony doubles? quadruples? If the initial population is u(0), how long until the population reaches 1 million?
Solution: u(t) = u(0) e^(1.3 t). To double, we need e^(1.3 t) = 2, so t = log 2/1.3 ≈ .533. To quadruple takes twice as long, t ≈ 1.0664. To reach 1 million, solve u(0) e^(1.3 t) = 10^6 for t.

8.1.6. Deer in Northern Minnesota reproduce according to the linear differential equation du/dt = .27 u, where t is measured in years. If the initial population is u(0) and the environment can sustain at most u_max deer, how long until the deer run out of resources?
Solution: The solution is u(t) = u(0) e^(.27 t). For the given initial conditions, u(t) = u_max when t = log(u_max/u(0))/.27 ≈ 19.64 years.

8.1.7. Consider the inhomogeneous differential equation du/dt = a u + b, where a, b are constants. (a) Show that u* = −b/a is a constant equilibrium solution. (b) Solve the differential equation. Hint: Look at the differential equation satisfied by v = u − u*. (c) Discuss the stability of the equilibrium solution u*.
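The half-life computations above all reduce to u(t) = u(0) e^(−γt) with γ = log 2/t*. A minimal numerical sketch, using the values reconstructed in Exercises 8.1.2 and 8.1.5 above:

```python
import math

def decay(u0, half_life, t):
    """Amount left after time t, given initial amount u0 and the half-life."""
    gamma = math.log(2) / half_life      # decay rate: gamma = log 2 / t*
    return u0 * math.exp(-gamma * t)

def doubling_time(rate):
    """Time for a solution of u' = rate*u to double."""
    return math.log(2) / rate

# Exercise 8.1.2: 100-gram sample with a 100-year half-life
gamma = math.log(2) / 100
after_100 = decay(100, 100, 100)         # one half-life leaves 50 grams

# Exercise 8.1.5: colony growing by u' = 1.3 u
t_double = doubling_time(1.3)
```

One half-life always halves the sample, so `after_100` is exactly 50; ten half-lives leave the factor 2^(−10) of Exercise 8.1.4.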

Solution: (a) If u(t) ≡ u* = −b/a, then du*/dt = 0 = a u* + b, hence it is a solution. (b) v = u − u* satisfies dv/dt = a v, so v(t) = c e^(a t), and so u(t) = c e^(a t) − b/a. (c) The equilibrium solution is asymptotically stable if and only if a < 0, and is stable if a = 0.

8.1.8. Use the method of Exercise 8.1.7 to solve the following initial value problems: (a) du/dt = u, u(0) = 1; (b) du/dt = u + 1, u(0) = −1; (c) du/dt = u + 6, u(0) = 1.
Solution: (a) u(t) = e^t; (b) u(t) is the constant equilibrium solution; (c) u(t) is obtained by shifting by the equilibrium, u(t) = c e^t − 6.

8.1.9. The radioactive waste from a nuclear reactor has a half-life of t* years. Waste is continually produced at the rate of r tons per year and stored in a dump site. (a) Set up an inhomogeneous differential equation of the form in Exercise 8.1.7 to model the amount of radioactive waste. (b) Determine whether the amount of radioactive material at the dump increases indefinitely, decreases to zero, or eventually stabilizes at some fixed amount. (c) Starting with a brand new site, how long until the dump contains a given number of tons of radioactive material?
Solution: (a) du/dt = −γ u + r, where γ = log 2/t* is the decay rate and r the production rate. (b) It stabilizes at the equilibrium solution u* = r/γ tons. (c) The solution is u(t) = (r/γ)(1 − e^(−γ t)), which reaches the target amount u when t = −(1/γ) log(1 − γ u/r) years.

8.1.10. Suppose that hunters are allowed to shoot a fixed number of the Northern Minnesota deer in Exercise 8.1.6 each year. (a) Explain why the population model takes the form du/dt = .27 u − b, where b is the number killed yearly. (Ignore the seasonal aspects of hunting.) (b) How long until the deer run out of resources? Hint: See Exercise 8.1.7. (c) What is the maximal rate at which deer can be hunted without causing their extinction?
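The shift v = u − u* in Exercise 8.1.7 turns the inhomogeneous equation into a homogeneous one, giving the solution formula used in Exercises 8.1.8 through 8.1.10. A sketch (the growth rate, harvesting rate, and initial population below are illustrative stand-ins, not the values lost from the exercises):

```python
import math

def solve_inhomogeneous(a, b, u0, t):
    """Solution of du/dt = a*u + b, u(0) = u0, via the shift v = u - u*."""
    u_star = -b / a                      # equilibrium: a*u* + b = 0
    return (u0 - u_star) * math.exp(a * t) + u_star

# Illustrative deer model in the spirit of Exercise 8.1.10:
# growth rate .27, harvesting term b = -100, initial population 5000
check_u0 = solve_inhomogeneous(0.27, -100.0, 5000.0, 0.0)
```

Starting exactly at the equilibrium u* = −b/a keeps the solution constant for all time, which is the stability statement of Exercise 8.1.7(c).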
(a) The rate of growth remains proportional to the population, while hunting decreases the population by a fixed amount. (This assumes hunting is done continually throughout the year, which is not what happens in real life.) (b) The solution is u(t) = (u(0) − b/.27) e^(.27 t) + b/.27; solving u(t) = u_max for the time at which the resources run out gives t ≈ 24.69 years. (c) We need the equilibrium u* = b/.27 to be less than the initial population, so b < .27 u(0) deer per year.

8.1.11. (a) Prove that if u_1(t) and u_2(t) are any two distinct solutions to du/dt = a u with a > 0, then |u_1(t) − u_2(t)| → ∞ as t → ∞. (b) Given a and the initial values u_1(0) ≠ u_2(0), how long do you have to wait until |u_1(t) − u_2(t)| exceeds a prescribed amount?
Solution: (a) |u_1(t) − u_2(t)| = e^(a t) |u_1(0) − u_2(0)| → ∞ when a > 0. (b) t = (1/a) log(amount/|u_1(0) − u_2(0)|).

8.1.12. (a) Write down the exact solution to the initial value problem du/dt = u/1.7, u(0) = 1.

(b) Suppose you make a small approximation to the initial value u(0). At what point does your solution differ from the true solution by 1 unit? By more? (c) Answer the same question if you also approximate the coefficient in the differential equation, solving du/dt = .587 u.
Solution: (a) u(t) = e^(t/1.7). (b) The two solutions differ by e^(t/1.7) |Δu(0)|; set this equal to 1, respectively the larger amount, and solve for t. (c) Solve e^(t/1.7) − e^(.587 t) = 1, respectively the larger amount. (Use a numerical equation solver.)

8.1.13. Let a be complex. Prove that u(t) = c e^(a t) is the (complex) solution to our scalar ordinary differential equation (8.1). Describe the asymptotic behavior of the solution as t → ∞, and the stability properties of the zero equilibrium solution.
Solution: The solution formula remains valid as a complex solution. If Re a > 0, then u(t) → ∞ as t → ∞, and the origin is an unstable equilibrium. If Re a = 0, then u(t) remains bounded as t → ∞, and the origin is a stable equilibrium. If Re a < 0, then u(t) → 0 as t → ∞, and the origin is an asymptotically stable equilibrium.

8.2 Eigenvalues and Eigenvectors

8.2.1. Find the eigenvalues and eigenvectors of the following matrices: (a)–(k).
Solution: (a)–(c) have real eigenvalues with the listed eigenvectors; (d) has a complex conjugate pair of eigenvalues ± i with complex conjugate eigenvectors; (e) has real eigenvalues, including 4, with the listed eigenvectors.

(f) Eigenvalues 6 and −6, with the listed eigenvectors; (g) a complex conjugate pair of eigenvalues with complex conjugate eigenvectors; (h) real eigenvalues with the listed eigenvectors; (i) a simple eigenvalue together with a double eigenvalue carrying two independent eigenvectors; (j) two double eigenvalues, each with two independent eigenvectors; (k) real eigenvalues, including 4, with the listed eigenvectors.

8.2.2. (a) Find the eigenvalues of the rotation matrix R_θ = [cos θ, −sin θ; sin θ, cos θ]. For what values of θ are the eigenvalues real? (b) Explain why your answer gives an immediate solution to Exercise 7c.
Solution: (a) The eigenvalues are cos θ ± i sin θ = e^(± i θ), with eigenvectors (1, ∓ i)^T. They are real only for θ = 0 and π. (b) Because R_θ − a I has an inverse if and only if a is not an eigenvalue.

8.2.3. Answer Exercise 8.2.2a for the reflection matrix F_θ = [cos 2θ, sin 2θ; sin 2θ, −cos 2θ].
Solution: The eigenvalues are ±1, with eigenvectors (cos θ, sin θ)^T and (−sin θ, cos θ)^T.

8.2.4. Write down (a) a matrix that has 0 as one of its eigenvalues and a given vector as a corresponding eigenvector; (b) a matrix that has a given vector as an eigenvector for the eigenvalue 1.
Solution: (a) A = O and (b) A = I are trivial examples.

8.2.5. (a) Write out the characteristic equation for the 3×3 companion matrix with parameters α, β, γ. (b) Show that, given any numbers a, b and c, there is a matrix with characteristic equation λ³ + a λ² + b λ + c = 0.
Solution: (a) The characteristic equation is λ³ + α λ² + β λ + γ = 0 (up to sign). (b) Use the matrix in part (a) with α = a, β = b, γ = c.

8.2.6. Find the eigenvalues and eigenvectors of the cross product matrix A = [0, −c, b; c, 0, −a; −b, a, 0].
Solution: The eigenvalues are 0 and ± i √(a² + b² + c²). If a = b = c = 0, then A = O and all vectors

are eigenvectors. Otherwise, the eigenvector for the eigenvalue 0 is (a, b, c)^T, while the eigenvalues ± i √(a² + b² + c²) have complex conjugate eigenvectors.

8.2.7. Find all eigenvalues and eigenvectors of the following complex matrices: (a)–(d).
Solution: (a)–(c) each have distinct (complex) eigenvalues with the listed eigenvectors; (d) has a simple eigenvalue together with a double eigenvalue admitting two independent complex eigenvectors.

8.2.8. Find all eigenvalues and eigenvectors of (a) the n×n zero matrix O; (b) the n×n identity matrix I.
Solution: (a) Since O v = 0 = 0 v, we conclude that 0 is the only eigenvalue; all nonzero vectors are eigenvectors. (b) Since I v = v = 1 v, we conclude that 1 is the only eigenvalue; all nonzero vectors are eigenvectors.

8.2.9. Find the eigenvalues and eigenvectors of an n×n matrix with every entry equal to 1. Hint: Try n = 2 and n = 3 first, and then generalize.
Solution: For n = 2, the eigenvalues are 0 and 2, with eigenvectors (−1, 1)^T and (1, 1)^T. For n = 3, the eigenvalues are 0 and 3, with eigenvectors (−1, 1, 0)^T, (−1, 0, 1)^T and (1, 1, 1)^T. In general, the eigenvalues are 0, with multiplicity n − 1, and n, which is simple. The eigenvectors corresponding to the eigenvalue 0 are the nonzero vectors (v_1, v_2, ..., v_n)^T with v_1 + v_2 + ⋯ + v_n = 0. The eigenvector corresponding to n is (1, 1, ..., 1)^T.

8.2.10. Let A be a given square matrix. (a) Explain in detail why any nonzero scalar multiple of an eigenvector of A is also an eigenvector. (b) Show that any nonzero linear combination of two eigenvectors v, w corresponding to the same eigenvalue is also an eigenvector. (c) Prove that a linear combination c v + d w, with c, d ≠ 0, of two eigenvectors corresponding to different eigenvalues is never an eigenvector.
Solution: (a) If A v = λ v, then A(c v) = c A v = c λ v = λ(c v), and so c v satisfies the eigenvector equation with eigenvalue λ. Moreover, since v ≠ 0, also c v ≠ 0 for c ≠ 0, and so c v is a bona fide eigenvector. (b) If A v = λ v and A w = λ w, then A(c v + d w) = c A v + d A w = c λ v + d λ w = λ(c v + d w). (c) Suppose A v = λ v and A w = µ w. Then v and w must be linearly independent, as otherwise they would be scalar multiples of each other and hence have the same eigenvalue. Thus, A(c v + d w) = c A v + d A w = c λ v + d µ w = ν(c v
+ d w) if and only if c λ = c ν and d µ = d ν, which, when λ ≠ µ, is only possible if either c = 0 or d = 0.
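Eigenpairs like those found in Exercises 8.2.1 through 8.2.3 can be cross-checked numerically. A sketch using the rotation matrix of Exercise 8.2.2 (the angle is an arbitrary choice):

```python
import numpy as np

theta = 0.7                                  # illustrative angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# np.linalg.eig returns the eigenvalues and unit-norm eigenvector columns
evals, evecs = np.linalg.eig(R)

# each computed pair should satisfy R v = lambda v
residual = [np.linalg.norm(R @ evecs[:, k] - evals[k] * evecs[:, k])
            for k in range(2)]
```

The two eigenvalues come out as the complex conjugate pair e^(± i θ), whose product is 1, matching det R = 1.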

8.2.11. True or false: If v is a real eigenvector of a real matrix A, then a nonzero complex multiple w = c v, for c ∈ C, is a complex eigenvector of A.
Solution: True; by the same computation as in Exercise 8.2.10(a), c v is an eigenvector for the same (real) eigenvalue λ.

8.2.12. Define the shift map S: C^n → C^n by S(v_1, v_2, ..., v_{n−1}, v_n)^T = (v_2, v_3, ..., v_n, v_1)^T. (a) Prove that S is a linear map, and write down its matrix representation A. (b) Prove that A is an orthogonal matrix. (c) Prove that the sampled exponential vectors ω_0, ..., ω_{n−1} in (8.??) form an eigenvector basis of A. What are the eigenvalues?
Solution: (a) A has 1's in positions (j, j+1) for j = 1, ..., n−1 and in position (n, 1), with all other entries 0. (b) A^T A = I by direct computation; equivalently, the columns of A are the standard orthonormal basis vectors e_n, e_1, e_2, ..., e_{n−1}, written in a slightly different order. (c) Since ω_k = (1, e^(2kπi/n), e^(4kπi/n), ..., e^(2(n−1)kπi/n))^T, we find S ω_k = (e^(2kπi/n), e^(4kπi/n), ..., e^(2(n−1)kπi/n), 1)^T = e^(2kπi/n) ω_k, and so ω_k is an eigenvector with corresponding eigenvalue e^(2kπi/n).

8.2.13. (a) Compute the eigenvalues and corresponding eigenvectors of the given 3×3 matrix A. (b) Compute the trace of A and check that it equals the sum of the eigenvalues. (c) Find the determinant of A and check that it is equal to the product of the eigenvalues.
Solution: (a) The eigenvalues and eigenvectors are as listed. (b) tr A equals the sum of the three eigenvalues. (c) det A equals their product.

8.2.14. Verify the trace and determinant formulae (8.??) for the matrices in Exercise 8.2.1.
Solution: In each of cases (a)–(j), one checks directly that tr A is the sum, and det A the product, of the eigenvalues found in Exercise 8.2.1; for instance, in case (f), tr A = 0 = 6 + (−6) and det A = −36 = 6 · (−6), while in case (j), det A = 49.

(k) tr A and det A likewise agree with the sum and the product of the eigenvalues.

8.2.15. (a) Find the explicit formula for the characteristic polynomial det(A − λ I) = −λ³ + a λ² − b λ + c of a general 3×3 matrix A. Verify that a = tr A and c = det A. What is the formula for b? (b) Prove that if A has eigenvalues λ_1, λ_2, λ_3, then a = tr A = λ_1 + λ_2 + λ_3, b = λ_1 λ_2 + λ_1 λ_3 + λ_2 λ_3, c = det A = λ_1 λ_2 λ_3.
Solution: (a) a = a_11 + a_22 + a_33 = tr A; b = a_11 a_22 − a_12 a_21 + a_11 a_33 − a_13 a_31 + a_22 a_33 − a_23 a_32, the sum of the principal 2×2 minors; c = a_11 a_22 a_33 + a_12 a_23 a_31 + a_13 a_21 a_32 − a_11 a_23 a_32 − a_12 a_21 a_33 − a_13 a_22 a_31 = det A. (b) When the factored form of the characteristic polynomial is multiplied out, we obtain −(λ − λ_1)(λ − λ_2)(λ − λ_3) = −λ³ + (λ_1 + λ_2 + λ_3) λ² − (λ_1 λ_2 + λ_1 λ_3 + λ_2 λ_3) λ + λ_1 λ_2 λ_3, giving the eigenvalue formulas for a, b, c.

8.2.16. Prove that the eigenvalues of an upper triangular (or lower triangular) matrix are its diagonal entries.
Solution: If U is upper triangular, so is U − λ I, and hence p(λ) = det(U − λ I) is the product of its diagonal entries, p(λ) = ∏ (u_ii − λ), so the roots of the characteristic equation are the diagonal entries u_11, ..., u_nn.

8.2.17. Let J_a be the n×n Jordan block matrix (8.??). Prove that its only eigenvalue is λ = a, and the only eigenvectors are the nonzero scalar multiples of the standard basis vector e_1.
Solution: Since J_a − λ I is an upper triangular matrix with a − λ on the diagonal, its determinant is det(J_a − λ I) = (a − λ)^n, and hence its only eigenvalue is λ = a, of multiplicity n. (Or use Exercise 8.2.16.) Moreover, (J_a − a I) v = (v_2, v_3, ..., v_n, 0)^T = 0 if and only if v = c e_1.

8.2.18. Suppose that λ is an eigenvalue of A. (a) Prove that c λ is an eigenvalue of the scalar multiple c A. (b) Prove that λ + d is an eigenvalue of A + d I. (c) More generally, c λ + d is an eigenvalue of B = c A + d I for scalars c, d.
Solution: Parts (a), (b) are special cases of part (c): if A v = λ v, then B v = (c A + d I) v = (c λ + d) v.

8.2.19. Show that if λ is an eigenvalue of A, then λ² is an eigenvalue of A².
Solution: If A v = λ v, then A² v = λ A v = λ² v, and hence v is also an eigenvector of A², with eigenvalue λ².

8.2.20. True or false: (a) If λ is an eigenvalue of both A and B, then it is an eigenvalue of the sum A + B. (b) If v is an eigenvector of both A and B, then it is an eigenvector of A + B.
Solution: (a) False. For example, 0 is an eigenvalue of both
A and B, but the eigenvalues of A + B are ± i. (b) True. If A v = λ v and B v = µ v, then (A + B) v = (λ + µ) v, and so v is an eigenvector with eigenvalue λ + µ.

8.2.21. True or false: If λ is an eigenvalue of A and µ is an eigenvalue of B, then λ µ is an eigenvalue of the matrix product C = A B.
Solution: False in general, but true if the eigenvectors coincide: if A v = λ v and B v = µ v, then A B v = µ A v = (λ µ) v, and so v is an eigenvector of A B with eigenvalue λ µ.

8.2.22. Let A and B be n×n matrices. Prove that the matrix products A B and B A have the same eigenvalues.
Solution: If A B v = λ v, then B A w = λ w, where w = B v. Thus, as long as w ≠ 0, it

is an eigenvector of B A with eigenvalue λ. However, if w = B v = 0, then A B v = 0 = λ v, and so the eigenvalue is λ = 0, which implies that B is singular. But then so is B A, which also has 0 as an eigenvalue. Thus every eigenvalue of A B is an eigenvalue of B A; the converse follows by the same reasoning. Note: This does not imply that their null eigenspaces, or kernels, have the same dimension; compare Exercise 88. In anticipation of Section 8.6: even though A B and B A have the same eigenvalues, they may have different Jordan canonical forms.

8.2.23. (a) Prove that if λ ≠ 0 is a nonzero eigenvalue of A, then 1/λ is an eigenvalue of A^(−1). (b) What happens if A has 0 as an eigenvalue?
Solution: (a) Starting with A v = λ v, multiply both sides by A^(−1) and divide by λ to obtain A^(−1) v = (1/λ) v. Therefore, v is an eigenvector of A^(−1) with eigenvalue 1/λ. (b) If 0 is an eigenvalue, then A is not invertible.

8.2.24. (a) Prove that if |det A| > 1, then A has at least one eigenvalue with |λ| > 1. (b) If |det A| < 1, are all eigenvalues |λ| < 1? Prove or find a counter-example.
Solution: (a) If all |λ_j| ≤ 1, then so is the modulus of their product, |λ_1 ⋯ λ_n| = |det A| ≤ 1, which is a contradiction. (b) False: the listed matrix has an eigenvalue of modulus exceeding 1 even though |det A| < 1.

8.2.25. Prove that A is a singular matrix if and only if 0 is an eigenvalue.
Solution: Recall that A is singular if and only if ker A ≠ {0}. Any 0 ≠ v ∈ ker A satisfies A v = 0 = 0 v. Thus ker A is nonzero if and only if A has a null eigenvector.

8.2.26. Prove that every nonzero vector v ∈ R^n is an eigenvector of A if and only if A is a scalar multiple of the identity matrix.
Solution: Let v, w be any two linearly independent vectors. Then A v = λ v and A w = µ w for some λ, µ. But v + w is an eigenvector if and only if A(v + w) = λ v + µ w = ν(v + w), which requires λ = µ = ν. Thus A v = λ v, with the same λ, for every v, which implies A = λ I.

8.2.27. How many unit eigenvectors correspond to a given eigenvalue of a matrix?
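The determinant-as-product argument of Exercise 8.2.24, together with the trace and minor identities of Exercises 8.2.13 through 8.2.15, can be spot-checked numerically. A sketch on a random 3×3 matrix (the matrix itself is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

evals = np.linalg.eigvals(A)

# a = tr A is the sum, c = det A the product, of the eigenvalues;
# b is the sum of the pairwise products (the principal 2x2 minors)
a = np.trace(A)
c = np.linalg.det(A)
b = (evals[0] * evals[1] + evals[0] * evals[2] + evals[1] * evals[2]).real
```

Even when some eigenvalues are complex, the conjugate pairs make all three symmetric functions real, matching the real coefficients of the characteristic polynomial.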
Solution: If λ is a simple real eigenvalue, then there are two real unit eigenvectors: u and −u. For a complex eigenvalue, if u is a unit complex eigenvector, so is e^(iθ) u, and so there are infinitely many complex unit eigenvectors. (The same holds for a real eigenvalue if we also allow complex eigenvectors.) If λ is a multiple real eigenvalue with eigenspace of dimension greater than 1, then there are infinitely many unit real eigenvectors in the eigenspace.

8.2.28. True or false: (a) Performing an elementary row operation of type #1 does not change the eigenvalues of a matrix. (b) Interchanging two rows of a matrix changes the sign of its eigenvalues. (c) Multiplying one row of a matrix by a scalar multiplies one of its eigenvalues by the same scalar.
Solution: All false. Simple examples suffice to disprove them.

8.2.29. (a) True or false: If λ_1, v_1 and λ_2, v_2 solve the eigenvalue equation (8.??) for a given matrix A, so does λ_1 + λ_2, v_1 + v_2. (b) Explain what this has to do with linearity.
Solution: False. The eigenvalue equation A v = λ v is not linear in the eigenvalue and eigenvector, since A(v_1 + v_2) ≠ (λ_1 + λ_2)(v_1 + v_2) in general.

8.2.30. An elementary reflection matrix has the form Q = I − 2 u u^T, where u ∈ R^n is a unit vector. (a) Find the eigenvalues and eigenvectors for the elementary reflection matrices

corresponding to the following unit vectors: (i)–(iv). (b) What are the eigenvalues and eigenvectors of a general elementary reflection matrix?
Solution: (a) In each of cases (i)–(iv), the eigenvalue −1 has the given unit vector u as eigenvector, while the eigenvalue +1 has the vectors orthogonal to u as eigenvectors. (b) u is an eigenvector with eigenvalue −1; all vectors orthogonal to u are eigenvectors with eigenvalue +1.

8.2.31. Let A and B be similar matrices, so B = S^(−1) A S for some nonsingular matrix S. (a) Prove that A and B have the same characteristic polynomial: p_B(λ) = p_A(λ). (b) Explain why similar matrices have the same eigenvalues. (c) Do they have the same eigenvectors? If not, how are their eigenvectors related? (d) Prove that the converse is not true, by exhibiting two matrices that have the same eigenvalues but are not similar.
Solution: (a) p_B(λ) = det(B − λ I) = det(S^(−1) A S − λ I) = det[S^(−1)(A − λ I) S] = det S^(−1) det(A − λ I) det S = det(A − λ I) = p_A(λ). (b) The eigenvalues are the roots of the common characteristic equation. (c) Not usually. If w is an eigenvector of B, then v = S w is an eigenvector of A, and conversely. (d) Both exhibited matrices have 1 as a double eigenvalue. Writing out B = S^(−1) A S, or equivalently S B = A S, for S = [x, y; z, w], and equating entries forces x = y = z = w = 0, so S = O, which is not invertible.

8.2.32. Let A be a nonsingular n×n matrix with characteristic polynomial p_A(λ). (a) Explain how to construct the characteristic polynomial p_{A^(−1)}(λ) of its inverse directly from p_A(λ). (b) Check your result for the two given matrices.
Solution: (a) p_{A^(−1)}(λ) = det(A^(−1) − λ I) = det[−λ A^(−1)(A − λ^(−1) I)] = ((−λ)^n/det A) p_A(1/λ).
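The ±1 eigenstructure of the elementary reflection Q = I − 2uu^T from Exercise 8.2.30 can be confirmed directly; the unit vector below is an illustrative choice:

```python
import numpy as np

u = np.array([3.0, 0.0, 4.0]) / 5.0        # a unit vector
Q = np.eye(3) - 2.0 * np.outer(u, u)       # elementary reflection matrix

# u itself is flipped: eigenvalue -1
flip = np.linalg.norm(Q @ u + u)

# any vector orthogonal to u is fixed: eigenvalue +1
w = np.array([0.0, 1.0, 0.0])              # w . u = 0
fix = np.linalg.norm(Q @ w - w)
```

Since Q u = u − 2u(u·u) = −u and Q w = w when u·w = 0, both residuals vanish; note also Q² = I, as befits a reflection.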

Or equivalently, if p_A(λ) = (−1)^n λ^n + c_{n−1} λ^{n−1} + ⋯ + c_1 λ + c_0, then, since c_0 = det A, p_{A^(−1)}(λ) = (−1)^n [λ^n + (c_1/c_0) λ^{n−1} + ⋯ + (c_{n−1}/c_0) λ] + 1/c_0; the coefficients appear in reverse order, divided by c_0. (b) In both cases (i) and (ii), one checks directly that the characteristic polynomials of A and A^(−1) are so related.

8.2.33. A square matrix A is called nilpotent if A^k = O for some k. (a) Prove that the only eigenvalue of a nilpotent matrix is 0. (The converse is also true; see Exercise 86.) (b) Find examples where A^(k−1) ≠ O but A^k = O, for k = 2, 3, and in general.
Solution: (a) If A v = λ v, then 0 = A^k v = λ^k v, and hence λ^k = 0, so λ = 0. (b) Take A of size k×k, all zero except for a_{i,i+1} = 1 on the supra-diagonal, i.e., a k×k Jordan block J_0 with zeros along the diagonal.

8.2.34. (a) Prove that every eigenvalue of a matrix A is also an eigenvalue of its transpose A^T. (b) Do they have the same eigenvectors? (c) Prove that if v is an eigenvector of A with eigenvalue λ, and w is an eigenvector of A^T with a different eigenvalue µ ≠ λ, then v and w are orthogonal vectors with respect to the dot product. (d) Illustrate this result for the two given matrices.
Solution: (a) det(A^T − λ I) = det(A − λ I)^T = det(A − λ I), and hence A and A^T have the same characteristic polynomial. (b) No; see the examples. (c) λ v · w = (A v)^T w = v^T A^T w = µ v · w, so, if µ ≠ λ, then v · w = 0 and the vectors are orthogonal. (d) (i) The eigenvectors v_1, v_2 of A and w_1, w_2 of A^T satisfy v_1 · w_2 = 0 = v_2 · w_1. (ii) Likewise, v_i is orthogonal to w_j whenever i ≠ j.

8.2.35. (a) Prove that every real 3×3 matrix has at least one real eigenvalue. (b) Find a real

4×4 matrix with no real eigenvalues. (c) Can you find a real 5×5 matrix with no real eigenvalues?
Solution: (a) The characteristic equation of a 3×3 matrix is a cubic polynomial with real coefficients, and hence has at least one real root. (b) For example, the block diagonal matrix built from two 2×2 rotation blocks [0, −1; 1, 0] has eigenvalues ± i and no real eigenvalues. (c) No, since the characteristic polynomial has degree 5, and hence has at least one real root.

8.2.36. (a) Show that if A is a matrix such that A⁴ = I, then the only possible eigenvalues of A are ±1 and ± i. (b) Give an example of a real 4×4 matrix that has all four numbers as eigenvalues.
Solution: (a) If A v = λ v, then v = A⁴ v = λ⁴ v, and hence any eigenvalue must satisfy λ⁴ = 1. (b) For example, the block diagonal matrix with diagonal entries 1, −1 followed by the rotation block [0, −1; 1, 0] has eigenvalues 1, −1, ± i.

8.2.37. A projection matrix satisfies P² = P. Find all eigenvalues and eigenvectors of P.
Solution: If P v = λ v, then P² v = λ² v. Since P² v = P v, we find λ v = λ² v, and so, since v ≠ 0, λ² = λ, so the only eigenvalues are λ = 0, 1. All v ∈ rng P are eigenvectors with eigenvalue 1, since if v = P u, then P v = P² u = P u = v, whereas all w ∈ ker P are null eigenvectors.

8.2.38. True or false: All n×n permutation matrices have real eigenvalues.
Solution: False. For example, the permutation matrix of a 4-cycle has the non-real eigenvalues ± i.

8.2.39. (a) Show that if all the row sums of A are equal to 1, then A has 1 as an eigenvalue. (b) Suppose all the column sums of A are equal to 1. Does the same result hold?
Hint: Use Exercise 8.2.34.
Solution: (a) If z = (1, 1, ..., 1)^T, then A z is the vector of row sums of A, and hence, by the assumption, A z = z, so z is an eigenvector with eigenvalue 1. (b) Yes, since the column sums of A are the row sums of A^T, and Exercise 8.2.34 says that A and A^T have the same eigenvalues.

8.2.40. Let Q be an orthogonal matrix. (a) Prove that if λ is an eigenvalue, then so is 1/λ. (b) Prove that all its eigenvalues are complex numbers of modulus |λ| = 1. In particular, the only possible real eigenvalues of an orthogonal matrix are ±1. (c) Suppose v = x + i y is a complex eigenvector corresponding to a non-real eigenvalue. Prove that its real and imaginary parts are orthogonal vectors having the same Euclidean norm.
Solution: (a) If Q v = λ v, then Q^T v = Q^(−1) v = (1/λ) v, and so 1/λ is an eigenvalue of Q^T; Exercise 8.2.34 says that a matrix and its transpose have the same eigenvalues. (b) If Q v = λ v, then ‖v‖ = ‖Q v‖ = |λ| ‖v‖, and hence |λ| = 1. Note that this proof also applies to complex eigenvalues and eigenvectors, using the Hermitian norm. (c) If e^(iθ) = cos θ + i sin θ is the eigenvalue, then Q x = cos θ x − sin θ y and Q y = sin θ x + cos θ y,

while ‖Q x‖² = cos²θ ‖x‖² − 2 cos θ sin θ x · y + sin²θ ‖y‖² and ‖Q y‖² = sin²θ ‖x‖² + 2 sin θ cos θ x · y + cos²θ ‖y‖². Equating ‖Q x‖² = ‖x‖² and ‖Q y‖² = ‖y‖² and adding the resulting identities gives 4 cos θ sin θ x · y = 0, so x · y = 0 provided θ ≠ 0, ± π/2, π; either identity then gives sin²θ (‖x‖² − ‖y‖²) = 0, so ‖x‖ = ‖y‖. For θ = ± π/2, we have Q x = ∓ y and Q y = ± x, so ‖x‖ = ‖Q x‖ = ‖y‖, while x · y = (Q x) · (Q y) = − y · x is again zero.

8.2.41. (a) Prove that every 3×3 proper orthogonal matrix has +1 as an eigenvalue. (b) True or false: An improper 3×3 orthogonal matrix has −1 as an eigenvalue.
Solution: (a) According to Exercise 8.2.35, a 3×3 orthogonal matrix Q has at least one real eigenvalue, which, by Exercise 8.2.40, must be ±1. If the other two eigenvalues are a complex conjugate pair µ ± i ν, then the product of the eigenvalues is ±(µ² + ν²). Since this must equal the determinant of Q, which by assumption is positive, we conclude that the real eigenvalue must be +1. Otherwise, all the eigenvalues of Q are real, equal to ±1, and they cannot all equal −1, as otherwise its determinant would be negative. (b) True. Either Q has three real eigenvalues ±1, of which at least one must be −1, as otherwise its determinant would be +1; or it has one real eigenvalue r and a complex conjugate pair λ, λ̄ with |λ| = 1, and then its determinant is r |λ|² = r = −1.

8.2.42. (a) Show that the linear transformation defined by a 3×3 proper orthogonal matrix Q corresponds to rotating through an angle around a line through the origin in R³, the axis of the rotation. Hint: Use Exercise 8.2.41(a). (b) Find the axis and angle of rotation of the given orthogonal matrix.
Solution: (a) The axis of the rotation is the eigenvector v corresponding to the eigenvalue +1. Since Q v = v, the rotation fixes the axis, and hence must rotate around it. Choosing an orthonormal basis u_1, u_2, u_3, where u_1 is a unit eigenvector in the direction of the axis of rotation, while u_2 + i u_3 is a complex eigenvector for the eigenvalue e^(iθ), in this basis Q takes the block form [1, 0, 0; 0, cos θ, −sin θ; 0, sin θ, cos θ], where θ is the angle of rotation. (b) The axis is the eigenvector for the eigenvalue +1; the complex eigenvalue λ = e^(iθ) then determines the angle
θ = cos^(−1)(Re λ).

8.2.43. Find all invariant subspaces, cf. Exercise 7.4, of a rotation in R³.
Solution: In general, besides the trivial invariant subspaces {0} and R³, the axis of rotation and its orthogonal complement plane are invariant. If the rotation is by 180°, then any line in the orthogonal complement plane is also invariant. If R = I, then every subspace is invariant.

8.2.44. Suppose Q is an orthogonal matrix. (a) Prove that K = 2 I − Q − Q^T is a positive semi-definite matrix. (b) Under what conditions is K > 0?
Solution: (a) (Q − I)^T (Q − I) = Q^T Q − Q − Q^T + I = 2 I − Q − Q^T = K, and hence K is a Gram matrix, and so positive semi-definite. (b) The Gram matrix is positive definite if and only if ker(Q − I) = {0}, which means that

Q does not have an eigenvalue of 1.

8.2.45. Prove that every proper affine isometry F(x) = Q x + b of R³, where det Q = +1, is one of the following: (a) a translation x + b; (b) a rotation centered at some point of R³; or (c) a screw, consisting of a rotation around an axis followed by a translation in the direction of the axis. Hint: Use Exercise 8.2.42.
Solution: If Q = I, then we have a translation. Otherwise, we can write F(x) = Q(x − c) + c in the form of a rotation around the center point c, provided we can solve (Q − I) c = −b for c. By the Fredholm alternative, this requires b to be orthogonal to coker(Q − I), which is spanned by the rotation axis v, i.e., the eigenvector for the eigenvalue +1 of Q^T = Q^(−1). More generally, we write F(x) = Q(x − c) + c + t v, and identify the affine map as a screw around the axis in the direction of v passing through c.

8.2.46. Let M_n be the n×n tridiagonal matrix whose diagonal entries are all equal to 0 and whose sub- and super-diagonal entries all equal 1. (a) Find the eigenvalues and eigenvectors of M_2 and M_3 directly. (b) Prove that the eigenvalues and eigenvectors of M_n are explicitly given by λ_k = 2 cos(kπ/(n+1)) and v_k = (sin(kπ/(n+1)), sin(2kπ/(n+1)), ..., sin(nkπ/(n+1)))^T, for k = 1, ..., n. How do you know that there are no other eigenvalues?
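The tridiagonal eigenvalue formula here, in the shifted form a + 2b cos(kπ/(n+1)) of the following exercise, can be verified numerically; the values of n, a, b below are arbitrary test choices:

```python
import numpy as np

n, a, b = 5, 2.0, -1.0                       # arbitrary test values
# tridiagonal matrix: a on the diagonal, b on the sub- and super-diagonals
A = a * np.eye(n) + b * (np.eye(n, k=1) + np.eye(n, k=-1))

k = np.arange(1, n + 1)
predicted = np.sort(a + 2 * b * np.cos(k * np.pi / (n + 1)))
computed = np.sort(np.linalg.eigvalsh(A))    # A is symmetric
```

With a = 2, b = −1 this is the standard second-difference matrix, so the predicted spectrum 2 − 2 cos(kπ/(n+1)) is real and positive.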
Solution: (a) For M_2 = [0, 1; 1, 0]: eigenvalues ±1, with eigenvectors (1, 1)^T and (−1, 1)^T. For M_3: eigenvalues √2, 0, −√2, with eigenvectors (1, √2, 1)^T, (1, 0, −1)^T, (1, −√2, 1)^T. (b) The j-th entry of the eigenvalue equation M_n v_k = λ_k v_k reads sin((j−1)kπ/(n+1)) + sin((j+1)kπ/(n+1)) = 2 cos(kπ/(n+1)) sin(jkπ/(n+1)), which is a standard trigonometric identity: sin α + sin β = 2 cos((α−β)/2) sin((α+β)/2). These are all the eigenvalues, because an n×n matrix has at most n distinct eigenvalues.

8.2.47. Let a, b be fixed scalars. Determine the eigenvalues and eigenvectors of the n×n tridiagonal matrix with all diagonal entries equal to a and all sub- and super-diagonal entries equal to b. Hint: See Exercises 8.2.18 and 8.2.46.
Solution: We have A = a I + b M_n, so, by Exercises 8.2.18 and 8.2.46, it has the same eigenvectors as M_n, while its corresponding eigenvalues are a + b λ_k = a + 2 b cos(kπ/(n+1)) for k = 1, ..., n.

8.2.48. Find a formula for the eigenvalues of the tricirculant n×n matrix Z_n that has 1's on the sub- and super-diagonals as well as its (1, n) and (n, 1) entries, while all other entries are 0. Hint: Use Exercise 8.2.46 as a guide.
Solution: λ_k = 2 cos(2kπ/n), with eigenvector v_k = (1, cos(2kπ/n), cos(4kπ/n), cos(6kπ/n), ..., cos(2(n−1)kπ/n))^T, for k = 1, ..., n.

8.2.49. Let A be an n×n matrix with eigenvalues λ_1, ..., λ_k, and B an m×m matrix with

eigenvalues µ_1, ..., µ_l. Show that the (m+n)×(m+n) block diagonal matrix D = [A, O; O, B] has eigenvalues λ_1, ..., λ_k, µ_1, ..., µ_l, and no others. How are the eigenvectors related?
Solution: Note first that if A v = λ v, then D (v, 0)^T = (A v, 0)^T = λ (v, 0)^T, and so (v, 0)^T is an eigenvector of D with eigenvalue λ. Similarly, each eigenvalue µ and eigenvector w of B gives the eigenvector (0, w)^T of D. Finally, to check that D has no other eigenvalues, we compute D (v, w)^T = (A v, B w)^T = λ (v, w)^T, and hence, if v ≠ 0, then λ is an eigenvalue of A, while if w ≠ 0, then it must also be an eigenvalue of B.

8.2.50. Let A = [a, b; c, d] be a 2×2 matrix. (a) Prove that A satisfies its own characteristic equation, meaning p_A(A) = A² − (tr A) A + (det A) I = O. Remark: This result is a special case of the Cayley–Hamilton Theorem, to be developed in Exercise 8.6.6. (b) Prove the inverse formula A^(−1) = [(tr A) I − A]/det A, valid when det A ≠ 0. (c) Check the Cayley–Hamilton and inverse formulas for the given matrix A.
Solution: (a) Follows by direct computation. (b) Multiply the characteristic equation by A^(−1) and rearrange terms. (c) Here tr A = 4 and det A = 7, and one checks A² − 4 A + 7 I = O and A^(−1) = (4 I − A)/7.

8.2.51. Deflation: Suppose A has eigenvalue λ and corresponding eigenvector v. (a) Let b be any vector. Prove that the matrix B = A − v b^T also has v as an eigenvector, now with eigenvalue λ − β, where β = v · b. (b) Prove that if µ ≠ λ − β is any other eigenvalue of A, then it is also an eigenvalue of B. Hint: Look for an eigenvector of the form w + c v, where w is the eigenvector of A. (c) Given a nonsingular matrix A with eigenvalues λ_1, λ_2, ..., λ_n and λ_1 ≠ λ_j for j ≥ 2, explain how to construct a deflated matrix whose eigenvalues are 0, λ_2, ..., λ_n. (d) Try out your method on the two sample matrices.
Solution: (a) B v = (A − v b^T) v = A v − (b · v) v = (λ − β) v. (b) B(w + c v) = (A − v b^T)(w + c v) = µ w + (c(λ − β) − b · w) v = µ(w + c v), provided c = b · w/(λ − β − µ). (c) Set B = A − λ_1 v_1 b^T, where v_1 is the first eigenvector of A and b is any vector such that b · v_1 = 1. For example, we can set b = v_1/‖v_1‖². (Wielandt deflation [ ] chooses b = r_j/(λ_1 v_j), where v_j is any nonzero entry of v_1 and r_j is the corresponding row of A.) (d) (i) The first matrix has eigenvalues 6 and another, with the listed eigenvectors; the deflated matrix B = A − λ_1 v_1 v_1^T/‖v_1‖² has eigenvalues 0 and the remaining eigenvalue, with the corresponding eigenvectors.
(ii) The second matrix has eigenvalues 4 and another, with the listed eigenvectors; the deflated matrix B = A − λ_1 v_1 v_1^T/‖v_1‖² again has eigenvalues 0 and the remaining eigenvalue, with the corresponding eigenvectors.

8.3 Eigenvector Bases and Diagonalization

8.3.1. Which of the following are complete eigenvalues for the indicated matrix? What is the dimension of the associated eigenspace? (a)–(h).
Solution: (a) Complete; the eigenspace is one-dimensional. (b) Not complete; dim = 1. (c) Complete; dim = 1. (d) Not an eigenvalue. (e) Complete; dim = 2. (f) Complete; dim = 1, with a complex basis eigenvector. (g) Not an eigenvalue. (h) Not complete; dim = 1.

8.3.2. Find the eigenvalues and a basis for each of the eigenspaces of the following matrices. Which are complete? (a)–(i).
Solution: (a) A single eigenvalue with a single independent eigenvector: not complete. (b) Distinct eigenvalues with independent eigenvectors: complete.

(c) Eigenvalues a complex conjugate pair ± i, with complex conjugate eigenvectors: complete. (d) Complex eigenvalues with independent eigenvectors: complete. (e) A single eigenvalue whose eigenspace basis is too small: not complete. (f) Two eigenvalues whose eigenspace bases together span the space: complete. (g) Two eigenvalues, but their eigenspaces span only part of the space: not complete. (h) Three eigenvalues with independent eigenvectors: complete. (i) Two eigenvalues with too few independent eigenvectors: not complete.

8.3.3. Which of the following matrices admit eigenvector bases of R^n? For those that do, exhibit such a basis. If not, what is the dimension of the subspace of R^n spanned by the eigenvectors? (a)–(h).
Solution: (a) Eigenvalues include 4; the eigenvectors form a basis of R². (b) The eigenvalues are a complex conjugate pair; the eigenvectors are not real, so the real eigenvectors span the zero subspace. (c) There is only one independent eigenvector v, spanning a one-dimensional subspace of R². (d) One eigenvalue is simple and the other double, with two independent eigenvectors; the eigenvectors form a basis of R³. (e) The eigenvectors span only a two-dimensional subspace of R³.
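Completeness, i.e. diagonalizability, as tested in Exercises 8.3.1 through 8.3.3, can be checked numerically by asking whether the computed eigenvectors span the whole space. A sketch (the rank tolerance and the sample matrices are illustrative choices):

```python
import numpy as np

def is_complete(A, tol=1e-8):
    """A matrix is complete (diagonalizable) when its eigenvectors span C^n."""
    evals, V = np.linalg.eig(A)            # columns of V are eigenvectors
    return np.linalg.matrix_rank(V, tol=tol) == A.shape[0]

D = np.diag([1.0, 2.0, 3.0])               # diagonal, hence complete
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])                 # Jordan block: e1 is the only eigenvector
complete_D = is_complete(D)
complete_J = is_complete(J)
```

For the Jordan block, the numerically returned eigenvector columns are (nearly) parallel, so the rank test reports incompleteness, matching Exercise 8.3.6(b).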

(f) The eigenvectors form a basis of R³. (g) Two of the eigenvalues are a complex conjugate pair; the real eigenvectors span only a one-dimensional subspace of R⁴. (h) The real eigenvectors span a two-dimensional subspace of R⁴.

8.3.4. Answer Exercise 8.3.3 with R^n replaced by C^n.
Solution: Cases (a), (b), (d), (f), (g), (h) all have eigenvector bases of C^n.

8.3.5. (a) Give an example of a matrix that has only a single eigenvalue and has only one linearly independent eigenvector. (b) Give an example with a single eigenvalue that has two linearly independent eigenvectors.
Solution: (a) For example, the 2×2 Jordan block [1, 1; 0, 1]. (b) For example, the identity matrix I.

8.3.6. True or false: (a) Every diagonal matrix is complete. (b) Every upper triangular matrix is complete.
Solution: (a) True. The standard basis vectors are eigenvectors. (b) False. A nontrivial Jordan matrix is incomplete, since e_1 is, up to scalar multiple, the only eigenvector.

8.3.7. Prove that if A is a complete matrix, so is c A + d I, where c, d are any scalars. Hint: Use Exercise 8.2.18.
Solution: According to Exercise 8.2.18, every eigenvector of A is an eigenvector of c A + d I, with eigenvalue c λ + d, and hence, if A has a basis of eigenvectors, so does c A + d I.

8.3.8. (a) Prove that if A is complete, so is A². (b) Give an example of an incomplete matrix A such that A² is complete.
Solution: (a) Every eigenvector of A is an eigenvector of A², with eigenvalue λ², and hence, if A has a basis of eigenvectors, so does A². (b) A = [0, 1; 0, 0], with A² = O.

8.3.9. Suppose v_1, ..., v_n forms an eigenvector basis for the complete matrix A, with λ_1, ..., λ_n the corresponding eigenvalues. Prove that every eigenvalue of A is one of the λ_1, ..., λ_n.
Solution: Suppose A v = λ v. Write v = Σ_{i=1}^n c_i v_i. Then A v = Σ_{i=1}^n c_i λ_i v_i, and hence, by linear independence, λ_i c_i = λ c_i. Thus either λ = λ_i or c_i = 0; since some c_i ≠ 0, λ is one of the λ_i. QED

8.3.10. (a) Prove that if λ is an eigenvalue of A, then λ^n is an eigenvalue of A^n. (b) State and prove a converse when A is complete. Hint: Use Exercise 8.3.9. (The completeness hypothesis is not essential, but the general case is harder, relying on the Jordan canonical form.)
Solution: (a) If A v = λ v, then, by induction, A^n v = λ^n v, and hence v is an eigenvector of A^n with eigenvalue λ^n. (b) Conversely, if A is complete and A^n has eigenvalue
µ then there is a (complex) n th root λ = n µ that is an eigenvalue of Indeed the eigenvector basis of is an evv 9/9/4 4 c 4 Peter J Olver

18 eigenvector basis of n and hence using Exercise 89 every eigenvalue of n is the n th power of an eigenvalue of 8 Show that if is complete then any similar matrix = S S is also complete s in Exercise 8 if v is an eigenvector of then S v is an eigenvector of Moreover if v v n form a basis so do S v S v n ; see Exercise 4 for details 8 Let U be an upper triangular matrix with all its diagonal entries equal Prove that U is complete if and only if U is a diagonal matrix ccording to Exercise 86 its only eigenvalue is λ the common value of its diagonal entries and so all eigenvectors belong to ker(u λ I ) Thus U is complete if and only if dim ker(u λ I ) = n if and only if U λ I = O QED 8 Show that each eigenspace of an n n matrix is an invariant subspace as defined in Exercise 74 Let V = ker( λ I ) If v V then v V since ( λ I )v = ( λ I )v = QED 84 (a) Prove that if v = x ± i y is a complex conjugate pair of eigenvectors of a real matrix corresponding to complex conjugate eigenvalues µ ± i ν with ν then x and y are linearly independent real vectors (b) More generally if v j = x j ± i y j j = k are complex conjugate pairs of eigenvectors corresponding to distinct pairs of complex conjugate eigenvalues µ j ± i ν j ν j then the real vectors x x k y y k are linearly independent (a) Let µ ± i ν be the corresponding eigenvalues Then the complex eigenvalue equation (x + i y) = (µ + i ν)(x + i y) implies that x = µx ν y and y = ν x + µy Now suppose cx + dy = for some (c d) ( ) Then = (cx + dy) = (cµ + dν)x + ( cν + dµ)y The determinant of the coefficient matrix of these two sets of equations for x y is c d det = (c cµ + dν cν + dµ + d )ν because we are assuming the eigenvalues are truly complex This implies x = y = which contradicts the fact that we have an eigenvector (b) This is proved by induction Suppose we know x x k y y k are linearly independent If x x k y y k were linearly dependent there would exist (c k d k ) ( ) such that z k = c k x k + d k y k is some linear 
combination of x_1, …, x_{k−1}, y_1, …, y_{k−1}. But then A z_k = (c_k µ_k + d_k ν_k) x_k + (− c_k ν_k + d_k µ_k) y_k is also such a linear combination, as is A^2 z_k = (c_k (µ_k^2 − ν_k^2) + 2 d_k µ_k ν_k) x_k + (− 2 c_k µ_k ν_k + d_k (µ_k^2 − ν_k^2)) y_k. Since the coefficient matrix of the first two such vectors is nonsingular, a suitable linear combination would vanish: 0 = a z_k + b A z_k + c A^2 z_k, which would give a vanishing linear combination of x_1, …, x_{k−1}, y_1, …, y_{k−1}, which, by the induction hypothesis, must be trivial. A little more work demonstrates that this implies z_k = 0, and so, in contradiction to part (a), would imply that x_k, y_k are linearly dependent. Q.E.D.

8 Diagonalize the following matrices: (a) 9 (b) 6 4 (c) 4
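Diagonalizations like these are easy to check numerically. The sketch below (an illustration in NumPy, not part of the original solutions) diagonalizes the Fibonacci matrix F = [[1, 1], [1, 0]] of Exercise 86 below, and uses the factorization A = S D S^{-1} to compute a power of F:

```python
import numpy as np

# The Fibonacci matrix and its diagonalization F = S @ D @ S^{-1};
# the eigenvalues are (1 + sqrt(5))/2 (the golden ratio) and (1 - sqrt(5))/2.
F = np.array([[1.0, 1.0],
              [1.0, 0.0]])
eigvals, S = np.linalg.eig(F)

# Powers of F are computed by powering the diagonal factor:
# F^9 = S @ D^9 @ S^{-1}, whose (0, 0) entry is the Fibonacci number F_10 = 55.
F9 = S @ np.diag(eigvals**9) @ np.linalg.inv(S)
fib10 = round(float(F9[0, 0].real))

golden = (1 + np.sqrt(5)) / 2
```

The same three lines diagonalize any complete matrix; for the complex-eigenvalue cases below, `eigvals` and `S` simply come out complex.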

(d)–(j) In each case, the answer consists of the eigenvector matrix S and the diagonal eigenvalue matrix D with A = S D S^{-1}; for the matrices with complex eigenvalues, the columns of S and the diagonal entries of D occur in complex conjugate pairs.

86 Diagonalize the Fibonacci matrix F = [[1, 1], [1, 0]].
The eigenvalues are λ_± = (1 ± √5)/2, with eigenvectors (λ_±, 1)^T, so F = S D S^{-1}, where S = [[λ_+, λ_−], [1, 1]] and D = diag(λ_+, λ_−).

87 Diagonalize the matrix of rotation through 90°. How would you interpret the result?
S = [[ i, − i ], [1, 1]], D = diag( i, − i ). A rotation does not stretch any real vectors, but

somehow corresponds to two complex stretches.

88 Diagonalize the rotation matrices (a) (b) .
In each case the eigenvalues and the corresponding eigenvectors occur in complex conjugate pairs.

89 Which of the following matrices have real diagonal forms? (a) (b) 4 8 (c) (d) (e) (f ) 4
(a) Yes: distinct real eigenvalues. (b) No: complex eigenvalues ± i 6. (c) No: complex eigenvalues ± i. (d) No: one eigenvalue is incomplete (the other is complete). (e) Yes: distinct real eigenvalues 4. (f ) Yes: complete real eigenvalues.

8 Diagonalize the following complex matrices: (a) (b) (c) (d)
In each case the answer lists the eigenvector matrix S and the diagonal eigenvalue matrix D, whose entries involve ± i.

8 Write down a real matrix that has (a) eigenvalues and corresponding eigenvectors ; (b) eigenvalues and associated eigenvectors ; (c) an eigenvalue of and corresponding eigenvectors ; (d) an eigenvalue

+ i and corresponding eigenvector ; (e) an eigenvalue and corresponding eigenvector + i, i ; (f ) an eigenvalue + i and corresponding eigenvector i, i.
Use A = S Λ S^{-1}. For parts (e), (f ) you can choose any other eigenvalues and eigenvectors you want to fill in S and Λ.

8 A matrix A has eigenvalues and , and associated eigenvectors and . Write down the matrix form of the linear transformation L[u] = A u in terms of (a) the standard basis e_1, e_2; (b) the basis consisting of its eigenvectors; (c) the basis .

8 Prove that two complete matrices A, B have the same eigenvalues (with multiplicities) if and only if they are similar, i.e., B = S^{-1} A S for some nonsingular matrix S.
Let S_A be the eigenvector matrix for A and S_B the eigenvector matrix for B. Then, by the hypothesis, S_A^{-1} A S_A = Λ = S_B^{-1} B S_B, and hence B = S_B S_A^{-1} A S_A S_B^{-1} = S^{-1} A S, where S = S_A S_B^{-1}.

84 Let B be obtained from A by permuting both its rows and columns using the same permutation π, so b_{ij} = a_{π(i) π(j)}. Prove that A and B have the same eigenvalues. How are their eigenvectors related?
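Before proving the claim, it can be confirmed numerically. A minimal NumPy sketch, with a sample symmetric matrix and a sample permutation of my own choosing (not from the exercise):

```python
import numpy as np

# b_ij = a_{pi(i) pi(j)} for a sample symmetric matrix and permutation pi.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
pi = np.array([2, 0, 1])
B = A[np.ix_(pi, pi)]          # simultaneous row/column permutation of A

# Same spectrum:
same_spectrum = np.allclose(np.sort(np.linalg.eigvalsh(A)),
                            np.sort(np.linalg.eigvalsh(B)))

# Eigenvectors are related by the same permutation of their entries:
w, V = np.linalg.eigh(A)
vt = V[pi, 0]                   # permuted copy of the first eigenvector of A
permuted_eigvec = np.allclose(B @ vt, w[0] * vt)
```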
Let v = (v_1, v_2, …, v_n)^T be an eigenvector for A, so A v = λ v. Let ṽ be obtained by applying the permutation to the entries of v, so ṽ_i = v_{π(i)}. Then the i-th entry of B ṽ is Σ_{j=1}^n b_{ij} ṽ_j = Σ_{j=1}^n a_{π(i) π(j)} v_{π(j)} = Σ_{j=1}^n a_{π(i) j} v_j = λ v_{π(i)} = λ ṽ_i, and hence ṽ is an eigenvector of B with the same eigenvalue λ. Q.E.D.

8 True or false: If A is a complete upper triangular matrix, then it has an upper triangular eigenvector matrix S.
True. Let λ_j = a_{jj} denote the j-th diagonal entry of A, which is the same as the j-th eigenvalue. We will prove that the corresponding eigenvector is a linear combination of e_1, …, e_j, which is equivalent to the eigenvector matrix S being upper triangular. We use induction on the size n. Since A is upper triangular, it leaves the subspace V spanned by e_1, …, e_{n−1} invariant, and hence its restriction to V is an (n − 1) × (n − 1) upper triangular matrix. Thus, by induction and completeness, A possesses n − 1 eigenvectors of the required form. The remaining eigenvector v_n cannot belong to V (otherwise the eigenvectors would be linearly dependent), and hence must involve e_n. Q.E.D.

86 Suppose the n × n matrix A is diagonalizable. How many different diagonal forms does it have?
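The upper-triangular claim just proved can also be checked numerically: once the computed eigenvectors are ordered to match the diagonal entries, the eigenvector matrix is upper triangular. A NumPy sketch with a sample matrix of my own choosing:

```python
import numpy as np

# An upper triangular matrix with distinct diagonal entries 1, 2, 3
# (hence complete); its eigenvector matrix can be taken upper triangular.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])

w, V = np.linalg.eig(A)
S = V[:, np.argsort(w)]    # order columns to match the ascending diagonal

# Column j only involves e_1, ..., e_{j+1}: the strictly lower part vanishes.
upper_triangular = np.allclose(np.tril(S, k=-1), 0)
```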

The diagonal entries are all eigenvalues, and so any two diagonal forms are obtained from each other by permutation. If all eigenvalues are distinct, there are n! different diagonal forms; otherwise, if A has k distinct eigenvalues of multiplicities j_1, …, j_k, there are n!/(j_1! ⋯ j_k!) distinct diagonal forms.

87 Characterize all complete matrices that are their own inverses: A = A^{-1}. Write down a non-diagonal example.
A^2 = I if and only if D^2 = I, where D is the diagonal form, and so all the eigenvalues are ± 1. Examples: any A = S D S^{-1} where D is diagonal with entries ± 1; a non-diagonal instance arises whenever S is not diagonal and D ≠ ± I.

88 Two n × n matrices A, B are said to be simultaneously diagonalizable if there is a nonsingular matrix S such that both S^{-1} A S and S^{-1} B S are diagonal matrices. (a) Show that simultaneously diagonalizable matrices commute: A B = B A. (b) Prove that the converse is valid, provided one of the matrices has no multiple eigenvalues. (c) Is every pair of commuting matrices simultaneously diagonalizable?
(a) If A = S Λ S^{-1} and B = S D S^{-1}, where Λ, D are diagonal, then A B = S Λ D S^{-1} = S D Λ S^{-1} = B A, since diagonal matrices commute. (b) According to Exercise (e), the only matrices that commute with an n × n diagonal matrix with distinct entries are the other diagonal matrices. Thus, if A B = B A and A = S Λ S^{-1}, where all entries of Λ are distinct, then D = S^{-1} B S commutes with Λ, and hence is a diagonal matrix. (c) No: an incomplete matrix commutes with the identity matrix, but is not diagonalizable. See also Exercise 4.

84 Eigenvalues of Symmetric Matrices

84 Find the eigenvalues and an orthonormal eigenvector basis for the following symmetric matrices: (a) (b) (c) (d) (e)
(a) eigenvalues: ; eigenvectors: . (b) eigenvalues: 7, ; eigenvectors: 7 + . (c) eigenvalues: 7, ; eigenvectors: q 6 6 q .

(d) eigenvalues: 6, 4; eigenvectors: . (e) eigenvalues: 9, ; eigenvectors: .

84 Determine whether the following symmetric matrices are positive definite by computing their eigenvalues. Validate your conclusions by using the methods from Chapter 4. (a) (b) (c) (d)
(a) eigenvalues ± 7: positive definite. (b) eigenvalues 7: not positive definite. (c) eigenvalues : positive semi-definite. (d) eigenvalues 6 ± : positive definite.

84 Prove that a symmetric matrix K is negative definite if and only if all its eigenvalues are negative.
K is negative definite if and only if N = − K is positive definite, which holds if and only if all the eigenvalues of N are positive. The eigenvalues of K = − N are − λ_j, where λ_j are the eigenvalues of N. Alternatively, mimic the proof in the book for the positive definite case.

844 How many orthonormal eigenvector bases does a symmetric n × n matrix have?
If all eigenvalues are distinct, there are 2^n different bases, governed by the choice of sign in the unit eigenvectors ± u_k. If the eigenvalues are repeated, there are infinitely many, since any orthonormal basis of each eigenspace will contribute to an orthonormal eigenvector basis of the matrix.

84 Let A = [[a, b], [c, d]]. (a) Write down necessary and sufficient conditions on the entries a, b, c, d that ensure that A has only real eigenvalues. (b) Verify that all symmetric matrices satisfy your conditions.
(a) The characteristic equation p(λ) = λ^2 − (a + d) λ + (a d − b c) = 0 has real roots if and only if its discriminant is non-negative: (a + d)^2 − 4 (a d − b c) = (a − d)^2 + 4 b c ≥ 0, which is thus the necessary and sufficient condition for real eigenvalues. (b) If A is symmetric, then b = c, and so the discriminant is (a − d)^2 + 4 b^2 ≥ 0.

846 Let A^T = − A be a real skew-symmetric n × n matrix. (a) Prove that the only possible real eigenvalue of A is λ = 0. (b) More generally, prove that all eigenvalues λ of A are purely imaginary, i.e., Re λ = 0. (c) Explain why 0 is an eigenvalue of A whenever n is odd. (d) Explain why, if n = 3, the eigenvalues of A ≠ O are 0, i ω, − i ω for some real ω ≠ 0. (e) Verify these facts for the particular matrices (i) (ii) 4 (iii) (iv) 4
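Beyond the particular matrices of part (e), the skew-symmetric facts can be verified numerically. A NumPy sketch for the 3 × 3 case, with sample values a, b, c of my own choosing (not the matrices printed above):

```python
import numpy as np

# A real skew-symmetric matrix (A^T = -A); its eigenvalues should be
# purely imaginary, with 0 occurring since the size n = 3 is odd.
a, b, c = 1.0, 2.0, 2.0
A = np.array([[0.0,   c,  -b],
              [-c,  0.0,   a],
              [ b,   -a, 0.0]])

eigvals = np.linalg.eigvals(A)
purely_imaginary = np.allclose(eigvals.real, 0)

# The nonzero eigenvalues are +/- i*omega with omega = sqrt(a^2+b^2+c^2) = 3.
omega = np.sqrt(a**2 + b**2 + c**2)
```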

(a) If A v = λ v and v is real, then λ ‖v‖^2 = (A v) · v = (A v)^T v = v^T A^T v = − v^T A v = − v · (A v) = − λ ‖v‖^2, and hence λ = 0. (b) Using the Hermitian dot product, λ ‖v‖^2 = (A v) · v = v^T A^T v̄ = − v^T A v̄ = − v · (A v) = − λ̄ ‖v‖^2, and hence λ = − λ̄, so λ is purely imaginary. (c) Since det A = 0 when n is odd (cf. Exercise 9), at least one of the eigenvalues of A must be 0. (d) The characteristic polynomial of A = [[0, c, − b], [− c, 0, a], [b, − a, 0]] is − λ^3 − λ (a^2 + b^2 + c^2), and hence the eigenvalues are 0, ± i √(a^2 + b^2 + c^2); they are all zero if and only if A = O. (e) The eigenvalues are: (i) ± i ; (ii) ± i ; (iii) ± i ; (iv) ± i , ± i .

847 (a) Prove that every eigenvalue of a Hermitian matrix (so A^T = Ā, as in Exercise 649) is real. (b) Show that the eigenvectors corresponding to distinct eigenvalues are orthogonal under the Hermitian dot product on C^n. (c) Find the eigenvalues and eigenvectors of the following Hermitian matrices, and verify orthogonality: (i) (ii) (iii)
(a) Let A v = λ v. Using the Hermitian dot product, λ ‖v‖^2 = (A v) · v = v^T A^T v̄ = v^T Ā v̄ = v · (A v) = λ̄ ‖v‖^2, and hence λ = λ̄, which implies that the eigenvalue λ is real. (b) Let A v = λ v, A w = µ w. Then λ v · w = (A v) · w = v^T A^T w̄ = v · (A w) = µ̄ v · w = µ v · w, since µ is real. Thus, if λ ≠ µ, then v · w = 0. (c) (i) eigenvalues ± ; eigenvectors: ( ) i, ( + ) i. (ii) eigenvalues 4, ; eigenvectors: i, + i. (iii) eigenvalues ± ; eigenvectors: i, i.

848 Let M > 0 be a fixed positive definite n × n matrix. A nonzero vector v ≠ 0 is called a generalized eigenvector of the n × n matrix K if K v = λ M v, (8) where the scalar λ is the corresponding generalized eigenvalue. (a) Prove that λ is a generalized eigenvalue of the matrix K if and only if it is an ordinary eigenvalue of the matrix M^{-1} K. How are the eigenvectors related?
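The Hermitian facts of Exercise 847 can be confirmed numerically with `numpy.linalg.eigh`, which is designed for Hermitian (and real symmetric) matrices. A sketch with a sample Hermitian matrix of my own choosing:

```python
import numpy as np

# A sample Hermitian matrix (equal to its conjugate transpose).
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])

eigvals, U = np.linalg.eigh(A)    # eigh assumes Hermitian input

# The eigenvalues come out real (here 1 and 4), and the eigenvectors are
# orthonormal with respect to the Hermitian dot product v . conj(w).
real_spectrum = np.isrealobj(eigvals)
unitary = np.allclose(U.conj().T @ U, np.eye(2))
```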
(b) Now suppose K is a symmetric matrix. Prove that its generalized eigenvalues are all real. Hint: First explain why this does not follow from part (a). Instead, mimic the proof of part (a) of Theorem 8, using the weighted Hermitian inner product ⟨v, w⟩ = v^T M w̄ in place of the dot product. (c) Show that if K > 0, then its generalized eigenvalues are all positive: λ > 0. (d) Prove that the eigenvectors corresponding to different generalized eigenvalues are orthogonal under the weighted inner product ⟨v, w⟩ = v^T M w. (e) Show that if the matrix pair K, M has n distinct generalized eigenvalues, then the eigenvectors form an orthogonal basis for R^n. Remark: One can, by mimicking the proof of part (c) of Theorem 8, show that this holds even when there are repeated generalized eigenvalues.

(a) Rewrite (8) as M^{-1} K v = λ v, and so v is an eigenvector for M^{-1} K with eigenvalue λ. (b) If v is a generalized eigenvector, then, since K, M are real matrices, K v̄ = λ̄ M v̄. Therefore, λ ‖v‖^2 = λ v^T M v̄ = (λ M v)^T v̄ = (K v)^T v̄ = v^T K v̄ = λ̄ v^T M v̄ = λ̄ ‖v‖^2, and hence λ is real. (c) If K v = λ M v and K w = µ M w, with λ, µ and v, w real, then λ ⟨v, w⟩ = (λ M v)^T w = (K v)^T w = v^T (K w) = µ v^T M w = µ ⟨v, w⟩, and so if λ ≠ µ, then ⟨v, w⟩ = 0, proving orthogonality. (d) If K > 0, then λ ⟨v, v⟩ = v^T (λ M v) = v^T K v > 0, and so, by positive definiteness of M, λ > 0. (e) Part (c) proves that the eigenvectors are orthogonal with respect to the inner product induced by M, and so the result follows immediately from Theorem 8.

849 Compute the generalized eigenvalues and eigenvectors, as in (8), for the following matrix pairs. Verify orthogonality of the eigenvectors under the appropriate inner product. (a) K = , M = ; (b) K = , M = ; (c) K = , M = ; (d) K = , M = ; (e) K = , M = ; (f ) K = , M = .
(a) eigenvalues: , ; eigenvectors: , . (b) eigenvalues: , ; eigenvectors: , . (c) eigenvalues: 7, ; eigenvectors: , 6 6 . (d) eigenvalues: 9, ; eigenvectors: , 4 . (e) eigenvalues: , ; eigenvectors: , . (f ) One eigenvalue is double, with a two-dimensional eigenvector basis, while the other is simple. For orthogonality, you need to select an M-orthogonal basis of the two-dimensional eigenspace, say by using Gram–Schmidt.

84 Let L = L*: R^n → R^n be a self-adjoint linear transformation with respect to the inner product ⟨·, ·⟩. Prove that all its eigenvalues are real and that the eigenvectors are orthogonal. Hint: Mimic the proof of Theorem 8, replacing the dot product by the given inner product.
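The generalized eigenvalue computations of Exercises 848–849 can be reproduced numerically. Plain NumPy has no generalized solver, but one standard route (sketched here with sample matrices K, M of my own choosing, not those of the exercise) is the Cholesky reduction M = L L^T, which converts K v = λ M v into an ordinary symmetric eigenvalue problem:

```python
import numpy as np

# Generalized problem K v = lambda M v with symmetric K, positive definite M.
K = np.array([[2.0, 1.0],
              [1.0, 2.0]])
M = np.array([[2.0, 0.0],
              [0.0, 1.0]])

L = np.linalg.cholesky(M)                      # M = L @ L.T
Linv = np.linalg.inv(L)
lam, U = np.linalg.eigh(Linv @ K @ Linv.T)     # same generalized eigenvalues
V = Linv.T @ U                                 # generalized eigenvectors

# The eigenvectors are orthonormal in the weighted inner product <v, w> = v^T M w,
# and satisfy K V = M V diag(lam).
M_orthonormal = np.allclose(V.T @ M @ V, np.eye(2))
residual = np.allclose(K @ V, M @ V @ np.diag(lam))
```

Since K here is positive definite, the computed generalized eigenvalues are all positive, matching part (c).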

If L[v] = λ v, then, using the inner product, λ ‖v‖^2 = ⟨L[v], v⟩ = ⟨v, L[v]⟩ = λ̄ ‖v‖^2, which proves that the eigenvalue λ is real. Similarly, if L[w] = µ w, then λ ⟨v, w⟩ = ⟨L[v], w⟩ = ⟨v, L[w]⟩ = µ ⟨v, w⟩, and so if λ ≠ µ, then ⟨v, w⟩ = 0.

84 The difference map Δ: C^n → C^n is defined as Δ = S − I, where S is the shift map of Exercise 8. (a) Write down the matrix corresponding to Δ. (b) Prove that the sampled exponential vectors ω_0, …, ω_{n−1} from (84) form an eigenvector basis of Δ. What are the eigenvalues? (c) Prove that K = Δ^T Δ has the same eigenvectors as Δ. What are its eigenvalues? (d) Is K positive definite? (e) According to Theorem 8, the eigenvectors of a symmetric matrix are real and orthogonal. Use this to explain the orthogonality of the sampled exponential vectors. But why aren't they real?
(a) The matrix of Δ has − 1 in each diagonal slot and 1 in each subdiagonal slot, with the 1 coming from the cyclic shift appearing in the corner. (b) Using Exercise 8(c), Δ ω_k = (S − I) ω_k = (e^{2 k π i /n} − 1) ω_k, and so ω_k is an eigenvector of Δ with corresponding eigenvalue e^{2 k π i /n} − 1. (c) Since S is an orthogonal matrix, S^T = S^{-1}, and so S^T ω_k = e^{− 2 k π i /n} ω_k. Therefore, K ω_k = (S^T − I)(S − I) ω_k = (2 I − S − S^T) ω_k = (2 − e^{2 k π i /n} − e^{− 2 k π i /n}) ω_k = (2 − 2 cos(2 k π/n)) ω_k, and hence ω_k is an eigenvector of K with corresponding eigenvalue 2 − 2 cos(2 k π/n). (d) K = Δ^T Δ is a Gram matrix, so K ≥ 0; but it is only positive semi-definite, since the eigenvalue corresponding to k = 0 is 0, reflecting the fact that the constant vector ω_0 lies in ker Δ. (e) Each eigenvalue 2 − 2 cos(2 k π/n) = 2 − 2 cos(2 (n − k) π/n), for 0 < k < n/2, is double, with a two-dimensional eigenspace spanned by ω_k and ω_{n−k} = ω̄_k. The corresponding real eigenvectors are Re ω_k = (ω_k + ω_{n−k})/2 and Im ω_k = (ω_k − ω_{n−k})/(2 i). On the other hand, if k = n/2 (which requires that n be even), the eigenvector ω_{n/2} = (1, −1, 1, −1, …)^T is real.

84 An n × n circulant matrix has the form C = [ c_0 c_1 c_2 … c_{n−1} ; c_{n−1} c_0 c_1 … c_{n−2} ; c_{n−2} c_{n−1} c_0 … c_{n−3} ; … ; c_1 c_2 c_3 … c_0 ], in which the entries of each succeeding row are obtained by moving all the previous row's entries one slot to the right, the last entry moving to the front. (a) Check that the shift matrix S of Exercise 8, the difference matrix Δ, and its symmetric product K of Exercise 84 are all circulant matrices. (b) Prove that the sampled exponential vectors ω_0, …, ω_{n−1}, cf. (84), are eigenvectors of C. Thus, all circulant matrices have the same eigenvectors. What are the eigenvalues? (c) Prove that Ω_n^{-1} C Ω_n = Λ, where Ω_n is the Fourier matrix in Exercise 76 and Λ is the diagonal matrix with the eigenvalues of C along the diagonal. (d) Find the eigenvalues and eigenvectors of the following circulant matrices: (i) (ii) (iii) (iv) (e) Find the eigenvalues of the tricirculant matrices in Exercise 7. Can you find a general formula for the n × n version? Explain why the eigenvalues must be real and positive. Does your formula reflect this fact? (f ) Which of the preceding matrices are invertible? Write down a general criterion for checking the invertibility of circulant matrices.
(a) The eigenvector equation C ω_k = (c_0 + c_1 e^{2 k π i /n} + c_2 e^{4 k π i /n} + … + c_{n−1} e^{2 (n−1) k π i /n}) ω_k can either be proved directly, or by noting that C = c_0 I + c_1 S + c_2 S^2 + … + c_{n−1} S^{n−1} and using Exercise 8(c). (b) In each case the eigenvalues are the values c_0 + c_1 e^{2 k π i /n} + … + c_{n−1} e^{2 (n−1) k π i /n}, for k = 0, …, n − 1, with the sampled exponentials ω_k as the corresponding eigenvectors. (c) The eigenvalues are (i) 6, ; (ii) ; (iii) ; (iv) ; in the n × n case they are 4 + 2 cos(2 k π/n), for k = 0, …, n − 1. The eigenvalues are real and positive because the matrices are symmetric, positive definite. (d) Cases (iii) in (d), and all the matrices in part (e), are invertible. In general, an n × n circulant matrix is invertible if and only if no n-th root of unity e^{2 k π i /n} is a root of the polynomial c_0 + c_1 x + … + c_{n−1} x^{n−1}.

84 Write out the spectral decomposition of the following matrices: (a) (b) (c) (d)
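The circulant eigenvalue formula is easy to test numerically: the eigenvalues are the values of the polynomial c_0 + c_1 x + … + c_{n−1} x^{n−1} at the n-th roots of unity. A NumPy sketch with a sample first row of my own choosing (a symmetric tricirculant, so the eigenvalues come out real):

```python
import numpy as np

# Sample circulant: first row (4, 1, 0, 1); each succeeding row is the
# previous one shifted one slot to the right, last entry moving to the front.
c = np.array([4.0, 1.0, 0.0, 1.0])
n = len(c)
C = np.array([np.roll(c, i) for i in range(n)])

# Predicted eigenvalues: p(x) = c0 + c1 x + ... + c_{n-1} x^{n-1} evaluated
# at the n-th roots of unity.
k = np.arange(n)
roots = np.exp(2j * np.pi * k / n)
predicted = np.array([np.sum(c * root**k) for root in roots])

actual = np.linalg.eigvals(C)
match = np.allclose(np.sort_complex(predicted), np.sort_complex(actual))
```

For this sample row the formula gives 4 + 2 cos(2 k π/4), i.e. the eigenvalues 6, 4, 2, 4.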


More information

3. INNER PRODUCT SPACES

3. INNER PRODUCT SPACES . INNER PRODUCT SPACES.. Definition So far we have studied abstract vector spaces. These are a generalisation of the geometric spaces R and R. But these have more structure than just that of a vector space.

More information

Similar matrices and Jordan form

Similar matrices and Jordan form Similar matrices and Jordan form We ve nearly covered the entire heart of linear algebra once we ve finished singular value decompositions we ll have seen all the most central topics. A T A is positive

More information

Matrix Representations of Linear Transformations and Changes of Coordinates

Matrix Representations of Linear Transformations and Changes of Coordinates Matrix Representations of Linear Transformations and Changes of Coordinates 01 Subspaces and Bases 011 Definitions A subspace V of R n is a subset of R n that contains the zero element and is closed under

More information

Math 115A HW4 Solutions University of California, Los Angeles. 5 2i 6 + 4i. (5 2i)7i (6 + 4i)( 3 + i) = 35i + 14 ( 22 6i) = 36 + 41i.

Math 115A HW4 Solutions University of California, Los Angeles. 5 2i 6 + 4i. (5 2i)7i (6 + 4i)( 3 + i) = 35i + 14 ( 22 6i) = 36 + 41i. Math 5A HW4 Solutions September 5, 202 University of California, Los Angeles Problem 4..3b Calculate the determinant, 5 2i 6 + 4i 3 + i 7i Solution: The textbook s instructions give us, (5 2i)7i (6 + 4i)(

More information

CONTROLLABILITY. Chapter 2. 2.1 Reachable Set and Controllability. Suppose we have a linear system described by the state equation

CONTROLLABILITY. Chapter 2. 2.1 Reachable Set and Controllability. Suppose we have a linear system described by the state equation Chapter 2 CONTROLLABILITY 2 Reachable Set and Controllability Suppose we have a linear system described by the state equation ẋ Ax + Bu (2) x() x Consider the following problem For a given vector x in

More information

Orthogonal Projections

Orthogonal Projections Orthogonal Projections and Reflections (with exercises) by D. Klain Version.. Corrections and comments are welcome! Orthogonal Projections Let X,..., X k be a family of linearly independent (column) vectors

More information

CS3220 Lecture Notes: QR factorization and orthogonal transformations

CS3220 Lecture Notes: QR factorization and orthogonal transformations CS3220 Lecture Notes: QR factorization and orthogonal transformations Steve Marschner Cornell University 11 March 2009 In this lecture I ll talk about orthogonal matrices and their properties, discuss

More information

x1 x 2 x 3 y 1 y 2 y 3 x 1 y 2 x 2 y 1 0.

x1 x 2 x 3 y 1 y 2 y 3 x 1 y 2 x 2 y 1 0. Cross product 1 Chapter 7 Cross product We are getting ready to study integration in several variables. Until now we have been doing only differential calculus. One outcome of this study will be our ability

More information

9 MATRICES AND TRANSFORMATIONS

9 MATRICES AND TRANSFORMATIONS 9 MATRICES AND TRANSFORMATIONS Chapter 9 Matrices and Transformations Objectives After studying this chapter you should be able to handle matrix (and vector) algebra with confidence, and understand the

More information

Notes on Symmetric Matrices

Notes on Symmetric Matrices CPSC 536N: Randomized Algorithms 2011-12 Term 2 Notes on Symmetric Matrices Prof. Nick Harvey University of British Columbia 1 Symmetric Matrices We review some basic results concerning symmetric matrices.

More information

Linear Algebra I. Ronald van Luijk, 2012

Linear Algebra I. Ronald van Luijk, 2012 Linear Algebra I Ronald van Luijk, 2012 With many parts from Linear Algebra I by Michael Stoll, 2007 Contents 1. Vector spaces 3 1.1. Examples 3 1.2. Fields 4 1.3. The field of complex numbers. 6 1.4.

More information

( ) which must be a vector

( ) which must be a vector MATH 37 Linear Transformations from Rn to Rm Dr. Neal, WKU Let T : R n R m be a function which maps vectors from R n to R m. Then T is called a linear transformation if the following two properties are

More information

Chapter 20. Vector Spaces and Bases

Chapter 20. Vector Spaces and Bases Chapter 20. Vector Spaces and Bases In this course, we have proceeded step-by-step through low-dimensional Linear Algebra. We have looked at lines, planes, hyperplanes, and have seen that there is no limit

More information

8 Square matrices continued: Determinants

8 Square matrices continued: Determinants 8 Square matrices continued: Determinants 8. Introduction Determinants give us important information about square matrices, and, as we ll soon see, are essential for the computation of eigenvalues. You

More information

BANACH AND HILBERT SPACE REVIEW

BANACH AND HILBERT SPACE REVIEW BANACH AND HILBET SPACE EVIEW CHISTOPHE HEIL These notes will briefly review some basic concepts related to the theory of Banach and Hilbert spaces. We are not trying to give a complete development, but

More information

Inner products on R n, and more

Inner products on R n, and more Inner products on R n, and more Peyam Ryan Tabrizian Friday, April 12th, 2013 1 Introduction You might be wondering: Are there inner products on R n that are not the usual dot product x y = x 1 y 1 + +

More information

Section 5.3. Section 5.3. u m ] l jj. = l jj u j + + l mj u m. v j = [ u 1 u j. l mj

Section 5.3. Section 5.3. u m ] l jj. = l jj u j + + l mj u m. v j = [ u 1 u j. l mj Section 5. l j v j = [ u u j u m ] l jj = l jj u j + + l mj u m. l mj Section 5. 5.. Not orthogonal, the column vectors fail to be perpendicular to each other. 5..2 his matrix is orthogonal. Check that

More information

Linear Algebra Done Wrong. Sergei Treil. Department of Mathematics, Brown University

Linear Algebra Done Wrong. Sergei Treil. Department of Mathematics, Brown University Linear Algebra Done Wrong Sergei Treil Department of Mathematics, Brown University Copyright c Sergei Treil, 2004, 2009, 2011, 2014 Preface The title of the book sounds a bit mysterious. Why should anyone

More information

Linear Algebra Notes for Marsden and Tromba Vector Calculus

Linear Algebra Notes for Marsden and Tromba Vector Calculus Linear Algebra Notes for Marsden and Tromba Vector Calculus n-dimensional Euclidean Space and Matrices Definition of n space As was learned in Math b, a point in Euclidean three space can be thought of

More information

Systems of Linear Equations

Systems of Linear Equations Systems of Linear Equations Beifang Chen Systems of linear equations Linear systems A linear equation in variables x, x,, x n is an equation of the form a x + a x + + a n x n = b, where a, a,, a n and

More information

a 11 x 1 + a 12 x 2 + + a 1n x n = b 1 a 21 x 1 + a 22 x 2 + + a 2n x n = b 2.

a 11 x 1 + a 12 x 2 + + a 1n x n = b 1 a 21 x 1 + a 22 x 2 + + a 2n x n = b 2. Chapter 1 LINEAR EQUATIONS 1.1 Introduction to linear equations A linear equation in n unknowns x 1, x,, x n is an equation of the form a 1 x 1 + a x + + a n x n = b, where a 1, a,..., a n, b are given

More information

18.06 Problem Set 4 Solution Due Wednesday, 11 March 2009 at 4 pm in 2-106. Total: 175 points.

18.06 Problem Set 4 Solution Due Wednesday, 11 March 2009 at 4 pm in 2-106. Total: 175 points. 806 Problem Set 4 Solution Due Wednesday, March 2009 at 4 pm in 2-06 Total: 75 points Problem : A is an m n matrix of rank r Suppose there are right-hand-sides b for which A x = b has no solution (a) What

More information

ALGEBRAIC EIGENVALUE PROBLEM

ALGEBRAIC EIGENVALUE PROBLEM ALGEBRAIC EIGENVALUE PROBLEM BY J. H. WILKINSON, M.A. (Cantab.), Sc.D. Technische Universes! Dsrmstedt FACHBEREICH (NFORMATiK BIBL1OTHEK Sachgebieto:. Standort: CLARENDON PRESS OXFORD 1965 Contents 1.

More information

3 Orthogonal Vectors and Matrices

3 Orthogonal Vectors and Matrices 3 Orthogonal Vectors and Matrices The linear algebra portion of this course focuses on three matrix factorizations: QR factorization, singular valued decomposition (SVD), and LU factorization The first

More information

5. Orthogonal matrices

5. Orthogonal matrices L Vandenberghe EE133A (Spring 2016) 5 Orthogonal matrices matrices with orthonormal columns orthogonal matrices tall matrices with orthonormal columns complex matrices with orthonormal columns 5-1 Orthonormal

More information

T ( a i x i ) = a i T (x i ).

T ( a i x i ) = a i T (x i ). Chapter 2 Defn 1. (p. 65) Let V and W be vector spaces (over F ). We call a function T : V W a linear transformation form V to W if, for all x, y V and c F, we have (a) T (x + y) = T (x) + T (y) and (b)

More information

Section 4.4 Inner Product Spaces

Section 4.4 Inner Product Spaces Section 4.4 Inner Product Spaces In our discussion of vector spaces the specific nature of F as a field, other than the fact that it is a field, has played virtually no role. In this section we no longer

More information

Linear Maps. Isaiah Lankham, Bruno Nachtergaele, Anne Schilling (February 5, 2007)

Linear Maps. Isaiah Lankham, Bruno Nachtergaele, Anne Schilling (February 5, 2007) MAT067 University of California, Davis Winter 2007 Linear Maps Isaiah Lankham, Bruno Nachtergaele, Anne Schilling (February 5, 2007) As we have discussed in the lecture on What is Linear Algebra? one of

More information

Factorization Theorems

Factorization Theorems Chapter 7 Factorization Theorems This chapter highlights a few of the many factorization theorems for matrices While some factorization results are relatively direct, others are iterative While some factorization

More information

Introduction to Matrix Algebra

Introduction to Matrix Algebra Psychology 7291: Multivariate Statistics (Carey) 8/27/98 Matrix Algebra - 1 Introduction to Matrix Algebra Definitions: A matrix is a collection of numbers ordered by rows and columns. It is customary

More information

December 4, 2013 MATH 171 BASIC LINEAR ALGEBRA B. KITCHENS

December 4, 2013 MATH 171 BASIC LINEAR ALGEBRA B. KITCHENS December 4, 2013 MATH 171 BASIC LINEAR ALGEBRA B KITCHENS The equation 1 Lines in two-dimensional space (1) 2x y = 3 describes a line in two-dimensional space The coefficients of x and y in the equation

More information

Linear algebra and the geometry of quadratic equations. Similarity transformations and orthogonal matrices

Linear algebra and the geometry of quadratic equations. Similarity transformations and orthogonal matrices MATH 30 Differential Equations Spring 006 Linear algebra and the geometry of quadratic equations Similarity transformations and orthogonal matrices First, some things to recall from linear algebra Two

More information

Solving Systems of Linear Equations

Solving Systems of Linear Equations LECTURE 5 Solving Systems of Linear Equations Recall that we introduced the notion of matrices as a way of standardizing the expression of systems of linear equations In today s lecture I shall show how

More information

α = u v. In other words, Orthogonal Projection

α = u v. In other words, Orthogonal Projection Orthogonal Projection Given any nonzero vector v, it is possible to decompose an arbitrary vector u into a component that points in the direction of v and one that points in a direction orthogonal to v

More information

Chapter 12 Modal Decomposition of State-Space Models 12.1 Introduction The solutions obtained in previous chapters, whether in time domain or transfor

Chapter 12 Modal Decomposition of State-Space Models 12.1 Introduction The solutions obtained in previous chapters, whether in time domain or transfor Lectures on Dynamic Systems and Control Mohammed Dahleh Munther A. Dahleh George Verghese Department of Electrical Engineering and Computer Science Massachuasetts Institute of Technology 1 1 c Chapter

More information

MAT 242 Test 2 SOLUTIONS, FORM T

MAT 242 Test 2 SOLUTIONS, FORM T MAT 242 Test 2 SOLUTIONS, FORM T 5 3 5 3 3 3 3. Let v =, v 5 2 =, v 3 =, and v 5 4 =. 3 3 7 3 a. [ points] The set { v, v 2, v 3, v 4 } is linearly dependent. Find a nontrivial linear combination of these

More information

Math 312 Homework 1 Solutions

Math 312 Homework 1 Solutions Math 31 Homework 1 Solutions Last modified: July 15, 01 This homework is due on Thursday, July 1th, 01 at 1:10pm Please turn it in during class, or in my mailbox in the main math office (next to 4W1) Please

More information

1 2 3 1 1 2 x = + x 2 + x 4 1 0 1

1 2 3 1 1 2 x = + x 2 + x 4 1 0 1 (d) If the vector b is the sum of the four columns of A, write down the complete solution to Ax = b. 1 2 3 1 1 2 x = + x 2 + x 4 1 0 0 1 0 1 2. (11 points) This problem finds the curve y = C + D 2 t which

More information

The Matrix Elements of a 3 3 Orthogonal Matrix Revisited

The Matrix Elements of a 3 3 Orthogonal Matrix Revisited Physics 116A Winter 2011 The Matrix Elements of a 3 3 Orthogonal Matrix Revisited 1. Introduction In a class handout entitled, Three-Dimensional Proper and Improper Rotation Matrices, I provided a derivation

More information

Solutions to Math 51 First Exam January 29, 2015

Solutions to Math 51 First Exam January 29, 2015 Solutions to Math 5 First Exam January 29, 25. ( points) (a) Complete the following sentence: A set of vectors {v,..., v k } is defined to be linearly dependent if (2 points) there exist c,... c k R, not

More information

Lecture notes on linear algebra

Lecture notes on linear algebra Lecture notes on linear algebra David Lerner Department of Mathematics University of Kansas These are notes of a course given in Fall, 2007 and 2008 to the Honors sections of our elementary linear algebra

More information

Matrix Algebra. Some Basic Matrix Laws. Before reading the text or the following notes glance at the following list of basic matrix algebra laws.

Matrix Algebra. Some Basic Matrix Laws. Before reading the text or the following notes glance at the following list of basic matrix algebra laws. Matrix Algebra A. Doerr Before reading the text or the following notes glance at the following list of basic matrix algebra laws. Some Basic Matrix Laws Assume the orders of the matrices are such that

More information

Examination paper for TMA4115 Matematikk 3

Examination paper for TMA4115 Matematikk 3 Department of Mathematical Sciences Examination paper for TMA45 Matematikk 3 Academic contact during examination: Antoine Julien a, Alexander Schmeding b, Gereon Quick c Phone: a 73 59 77 82, b 40 53 99

More information

4.5 Linear Dependence and Linear Independence

4.5 Linear Dependence and Linear Independence 4.5 Linear Dependence and Linear Independence 267 32. {v 1, v 2 }, where v 1, v 2 are collinear vectors in R 3. 33. Prove that if S and S are subsets of a vector space V such that S is a subset of S, then

More information

26. Determinants I. 1. Prehistory

26. Determinants I. 1. Prehistory 26. Determinants I 26.1 Prehistory 26.2 Definitions 26.3 Uniqueness and other properties 26.4 Existence Both as a careful review of a more pedestrian viewpoint, and as a transition to a coordinate-independent

More information

Name: Section Registered In:

Name: Section Registered In: Name: Section Registered In: Math 125 Exam 3 Version 1 April 24, 2006 60 total points possible 1. (5pts) Use Cramer s Rule to solve 3x + 4y = 30 x 2y = 8. Be sure to show enough detail that shows you are

More information

MATH10212 Linear Algebra. Systems of Linear Equations. Definition. An n-dimensional vector is a row or a column of n numbers (or letters): a 1.

MATH10212 Linear Algebra. Systems of Linear Equations. Definition. An n-dimensional vector is a row or a column of n numbers (or letters): a 1. MATH10212 Linear Algebra Textbook: D. Poole, Linear Algebra: A Modern Introduction. Thompson, 2006. ISBN 0-534-40596-7. Systems of Linear Equations Definition. An n-dimensional vector is a row or a column

More information

Linear Algebra. A vector space (over R) is an ordered quadruple. such that V is a set; 0 V ; and the following eight axioms hold:

Linear Algebra. A vector space (over R) is an ordered quadruple. such that V is a set; 0 V ; and the following eight axioms hold: Linear Algebra A vector space (over R) is an ordered quadruple (V, 0, α, µ) such that V is a set; 0 V ; and the following eight axioms hold: α : V V V and µ : R V V ; (i) α(α(u, v), w) = α(u, α(v, w)),

More information

Inner product. Definition of inner product

Inner product. Definition of inner product Math 20F Linear Algebra Lecture 25 1 Inner product Review: Definition of inner product. Slide 1 Norm and distance. Orthogonal vectors. Orthogonal complement. Orthogonal basis. Definition of inner product

More information

MATH 304 Linear Algebra Lecture 20: Inner product spaces. Orthogonal sets.

MATH 304 Linear Algebra Lecture 20: Inner product spaces. Orthogonal sets. MATH 304 Linear Algebra Lecture 20: Inner product spaces. Orthogonal sets. Norm The notion of norm generalizes the notion of length of a vector in R n. Definition. Let V be a vector space. A function α

More information

Linear Algebra Notes

Linear Algebra Notes Linear Algebra Notes Chapter 19 KERNEL AND IMAGE OF A MATRIX Take an n m matrix a 11 a 12 a 1m a 21 a 22 a 2m a n1 a n2 a nm and think of it as a function A : R m R n The kernel of A is defined as Note

More information