More Linear Algebra Study Problems

The final exam will cover chapters 1-23, except for one excluded chapter. About half of the exam will cover the material up to chapter 18 and half will cover the material in chapters 19-23. Refer to the previous study problems for more exercises on the material up to chapter 18, as well as the homework problems for all covered chapters. The problems below are in no particular order.

1. Find the kernel and image of the matrix

   A = [ 1 2 3 4 5 6 ]
       [ 1 2 3 4 5 6 ]
       [ 1 2 3 4 5 6 ]
       [ 1 2 3 4 5 6 ]

Solution: The kernel of A is the hyperplane in R^6 with equation x_1 + 2x_2 + 3x_3 + 4x_4 + 5x_5 + 6x_6 = 0. The image is the line in R^4 through the vector (1, 1, 1, 1).

2. Find the 3 x 3 matrix that reflects about the plane x + y + z = 0.

Solution: Use the reflection formula from chap. 17: Av = v - 2<v, u>u, where u is a unit vector normal to the plane. Here u = (1/sqrt(3))(1, 1, 1) and the matrix is

   A = (1/3) [  1 -2 -2 ]
             [ -2  1 -2 ]
             [ -2 -2  1 ]

(You can check your answer by computing the eigenspaces of your matrix.)

3. Find the angle and axis of rotation of the matrix

   A = (1/9) [ -8  4  1 ]
             [  4  7  4 ]
             [  1  4 -8 ]

Solution: The cosine of the rotation angle θ is given by cos(θ) = (tr(A) - 1)/2 = -1, so θ = π. The axis is ker(A - I) = R(1, 4, 1).

4. Find the inverse of the matrix

   A = [ 1 1 0 ]
       [ 1 2 1 ]
       [ 0 1 2 ]

Check your answer. Then use your answer to solve the system

   x + y      = 1
   x + 2y + z = 1
       y + 2z = 1

Solution:

   A^{-1} = [  3 -2  1 ]
            [ -2  2 -1 ]
            [  1 -1  1 ]

and the system has the solution x = 2, y = -1, z = 1.

5. Find the ranks and the kernels of the following matrices.

   A = [ 1 1 0 0 ]        B = [ 1 2 3 4 ]
       [ 0 1 1 0 ]            [ 5 6 7 8 ]
       [ 0 0 1 0 ]            [ 8 7 6 5 ]
       [ 0 0 0 0 ]            [ 4 3 2 1 ]

Solutions: rank(A) = 3 and ker A = Re_4; rank(B) = 2 and ker(B) is the plane spanned by (2, -3, 0, 1) and (1, -2, 1, 0).

6. Find the inverse of the matrix

   A = [ 1 1 0 0 ]
       [ 0 1 1 0 ]
       [ 0 0 1 1 ]
       [ 0 0 0 1 ]

Check your answer.

Solutions:

   A^{-1} = [ 1 -1  1 -1 ]
            [ 0  1 -1  1 ]
            [ 0  0  1 -1 ]
            [ 0  0  0  1 ]

7. Find, if possible, planes in R^4 which meet the hyperplane x + y + z + w = 0 in a) a point; b) a line; c) a plane.

Solutions: A plane is the intersection of two hyperplanes, so we are looking at the intersection of three hyperplanes, one of which is x + y + z + w = 0. Let the other two hyperplanes be ax + by + cz + dw = 0 and ex + fy + gz + hw = 0. This intersection is the kernel of the matrix

   A = [ 1 1 1 1 ]
       [ a b c d ]
       [ e f g h ]

By the Kernel-Image Theorem we have dim(ker A) = 4 - rank(A) >= 1, since rank(A) <= 3. Hence it is not possible for the intersection to be a point. The intersection is a line exactly when rank(A) = 3, for example

   A = [ 1 1 1 1 ]
       [ 0 1 0 0 ]
       [ 0 0 1 0 ]

and the intersection is a plane exactly when rank(A) = 2, for example

   A = [ 1 1 1 1 ]
       [ 0 1 0 0 ]
       [ 0 0 0 0 ]

8. A 4 x 4 matrix A with real entries satisfies A^4 = I. What are the possible characteristic polynomials of A?

Solutions: The eigenvalues of A must satisfy λ^4 = 1, so λ ∈ {1, -1, i, -i}. Since A is real, its characteristic polynomial P_A(x) has real coefficients, so the eigenvalues ±i must appear in conjugate pairs. The possibilities for P_A(x) are:

   (x - 1)^4,  (x - 1)^3 (x + 1),  (x - 1)^2 (x + 1)^2,  (x - 1)(x + 1)^3,  (x + 1)^4,
   (x^2 + 1)(x - 1)^2,  (x^2 + 1)(x - 1)(x + 1),  (x^2 + 1)(x + 1)^2,  (x^2 + 1)^2.

9. Find a 4 x 4 non-diagonal matrix with eigenvalues 1, 2, 3, 4.

Solutions: Let D be the diagonal matrix with diagonal entries 1, 2, 3, 4. For almost any invertible matrix B we will have BDB^{-1} non-diagonal. Just don't choose B to have exactly one nonzero entry in each row and column.

10. Let

   M = [ 1/6 1/3 1/3 ]
       [ 1/3 1/2 1/6 ]
       [ 1/2 1/6 1/2 ]

Find a nonzero vector v such that Mv = v.

Solutions: We must find a 1-eigenvector of M, or equivalently a 6-eigenvector for the matrix

   A = 6M = [ 1 2 2 ]
            [ 2 3 1 ]
            [ 3 1 3 ]

This means we must compute the kernel of

   A - 6I = [ -5  2  2 ]
            [  2 -3  1 ]
            [  3  1 -3 ]

The answer is any nonzero scalar multiple of (8, 9, 11).

11. Determine whether the following sets of vectors are linearly independent or not.

(a) (1, 0, 1, 0), (1, 0, 0, 1), (0, 1, 0, 1), (0, 1, 1, 0)

(b) (1, 0, 1, 0), (1, 0, 0, 1), (0, 1, 0, 1), (0, 0, 1, 0)

(c) (1, 2, 3), (4, 5, 6), (7, 8, 9)

(d) (a, b, c), (d, e, f), (0, 0, 0)

(e) (1, 2, 3, 4), (2, 3, 4, 5), (3, 4, 5, 6), (4, 5, 6, 7), (5, 6, 7, 8)

Solutions:

(a) Linearly dependent, since (1, 0, 1, 0) - (1, 0, 0, 1) + (0, 1, 0, 1) - (0, 1, 1, 0) = (0, 0, 0, 0).

(b) Linearly independent, because setting

   c_1 (1, 0, 1, 0) + c_2 (1, 0, 0, 1) + c_3 (0, 1, 0, 1) + c_4 (0, 0, 1, 0) = (0, 0, 0, 0)

leads to the equations

   c_1 + c_2 = 0,  c_3 = 0,  c_1 + c_4 = 0,  c_2 + c_3 = 0,

and the only solution of these equations is c_1 = c_2 = c_3 = c_4 = 0.

(c) Linearly dependent, since (1, 2, 3) - 2(4, 5, 6) + (7, 8, 9) = (0, 0, 0).

(d) Linearly dependent, since 0(a, b, c) + 0(d, e, f) + 1(0, 0, 0) = (0, 0, 0). Any set of vectors which contains the zero vector is linearly dependent.

(e) Linearly dependent, because there are more vectors than components in the vectors.

12. Suppose A is a 3 x 3 matrix and u, v, w are nonzero vectors in R^3 such that

   Au = 0,  Av = v,  Aw = 2w.

Show that u, v, w are linearly independent. (This is a special case of a theorem from class (which one?), but prove it here from scratch.)

Solution: Suppose c_1 u + c_2 v + c_3 w = 0. Apply A to both sides of this equation. We get c_2 v + 2c_3 w = 0. Apply A again, and get c_2 v + 4c_3 w = 0. Subtract these last two equations, and get 2c_3 w = 0, so c_3 = 0. Then c_2 = 0, and then c_1 = 0 as well.

13. Give an example of four vectors u_1, u_2, u_3, u_4 in R^3 which are linearly dependent, but such that any three of the four are linearly independent.

Solution: One such example is (1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1).

14. Suppose A is an n x m matrix, and the columns of A are linearly independent. What is ker A?

Solution: Suppose x = (x_1, ..., x_m) belongs to ker A. Let u_1, ..., u_m be the columns of A. Then

   Ax = x_1 u_1 + ... + x_m u_m = 0.

Since the u_i are linearly independent, we must have x_1 = ... = x_m = 0. So x = 0. Thus ker A is zero.

15. Determine whether or not the following sets of vectors u_1, u_2, u_3 are bases of R^3.

(a) u_1 = (1, 1, 1), u_2 = (1, 1, 0), u_3 = (1, 0, 0)

(b) u_1 = (1, 1, 0), u_2 = (0, 1, 1), u_3 = (1, 0, -1)

(c) u_1 = (1, 2, 3), u_2 = (4, 5, 6), u_3 = (7, 8, 9)

(d) u_1 = (1, 2, 3), u_2 = (6, 5, 4), u_3 = (7, 8, 9)

Solutions: It suffices to check whether the sets are linearly independent.

(a) basis

(b) not a basis (u_3 = u_1 - u_2)

(c) not a basis (2u_2 = u_1 + u_3)

(d) not a basis (7u_3 = 13u_1 + 6u_2)

16. The intersection of two subspaces V and W of R^n is the set of vectors which belong to both V and W. This intersection is denoted by V ∩ W. The union of V and W is the set of vectors belonging to either V or W, and is denoted V ∪ W.

(a) Show that V ∩ W is a subspace of R^n.

(b) Show that V ∪ W is not, in general, a subspace of R^n.

(c) Let A = [a_ij] be an n x m matrix. Describe the kernel of A as an intersection of hyperplanes. (How many hyperplanes, and where do they live?)

Solutions:

(a) Check closure under addition and scalar multiplication.

(b) For example, the union of the line through e_1 and the line through e_2 is not a subspace, because it doesn't contain e_1 + e_2.

(c) For each i such that the i-th row of A is not all zero, consider the hyperplane with equation a_i1 x_1 + a_i2 x_2 + ... + a_im x_m = 0. Then ker A is the intersection of these hyperplanes in R^m, one hyperplane for every nonzero row of A.

17. In R^3, two planes intersect in either a line or a plane. The minimum possible dimension of the intersection is 1 (which happens when the intersection is a line). What is the minimum possible
dimension of the intersection of two hyperplanes in R^4? In general, for two subspaces V, W of R^n, with dimensions dim V = k and dim W = l, what is the minimum possible dimension of V ∩ W?

Solution: The intersection of two hyperplanes in R^4 is the kernel of a 2 x 4 matrix A. We have dim(ker A) = 4 - rank(A). As rank(A) <= 2, the minimum possible intersection is two-dimensional. In general, V is the kernel of an (n - k) x n matrix and W is the kernel of an (n - l) x n matrix, so V ∩ W is the kernel of a (2n - k - l) x n matrix A, and

   dim(ker A) = n - rank(A) >= n - (2n - k - l) = k + l - n,

with equality if and only if rank(A) = 2n - k - l. If k + l >= n this rank is achievable, so the minimum possible dimension of V ∩ W is k + l - n. If k + l < n we can choose the subspaces so that rank(A) = n, which gives ker(A) = {0}; in this case the minimum possible dimension of V ∩ W is 0.

18. Let A be the reflection about a plane in R^3 with normal vector n, and let B be a 3 x 3 rotation matrix. Show that BAB^{-1} is the reflection about the plane with normal vector Bn.

Solution: BAB^{-1} is a product of three orthogonal matrices, hence is orthogonal. Since An = -n we have BAB^{-1}(Bn) = BA(n) = B(-n) = -Bn. If u is in the plane perpendicular to Bn then B^{-1}u is in the plane perpendicular to n, and the same calculation shows that BAB^{-1}u = u. So BAB^{-1} fixes the plane perpendicular to Bn and negates Bn. It follows that BAB^{-1} is the reflection about the plane with normal vector Bn.

19. Let A be a rotation about an axis u in R^3 by an angle θ, measured counterclockwise as u points at you. Let B be a 3 x 3 rotation matrix. Show that BAB^{-1} is a rotation, and give its axis and angle in terms of u and θ.

Solution: BAB^{-1} is a product of three orthogonal matrices, hence is orthogonal. And det(BAB^{-1}) = det(A) = 1 since A is a rotation. Hence BAB^{-1} is also a rotation. Since u is on the axis for A we have BAB^{-1}(Bu) = BAu = Bu, so Bu is on the axis for BAB^{-1}. Finally, the angle of rotation is determined by the trace, which is the same for A and BAB^{-1}. Therefore BAB^{-1} also rotates by θ. And since B is a rotation, it preserves handedness, so the direction of rotation by θ is also counterclockwise as Bu points at you.
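The conjugation claims in Problems 18 and 19 lend themselves to a quick numerical spot-check. The sketch below is an addition, not part of the original solutions; the rotation B is just an arbitrary orthogonal matrix from a QR factorization, sign-corrected to have determinant +1, and the choices n = e3, u = e3, θ = 0.7 are sample values.

```python
import numpy as np

# Build an "arbitrary" rotation B: orthogonalize a random matrix, then
# flip the sign if necessary so that det(B) = +1.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
B = Q if np.linalg.det(Q) > 0 else -Q
assert np.isclose(np.linalg.det(B), 1.0)

# Problem 18: conjugating the reflection with normal n = e3 gives the
# reflection with normal Bn.  (B^{-1} = B^T since B is orthogonal.)
n = np.array([0.0, 0.0, 1.0])
A_ref = np.eye(3) - 2 * np.outer(n, n)
C = B @ A_ref @ B.T
assert np.allclose(C @ (B @ n), -(B @ n))                      # C negates Bn
assert np.allclose(C, np.eye(3) - 2 * np.outer(B @ n, B @ n))  # reflection formula

# Problem 19: conjugating a rotation by θ about u = e3 gives a rotation
# with axis Bu and the same trace, hence the same angle θ.
theta = 0.7
A_rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
u = np.array([0.0, 0.0, 1.0])
D = B @ A_rot @ B.T
assert np.allclose(D @ (B @ u), B @ u)            # Bu lies on the axis of D
assert np.isclose(np.trace(D), np.trace(A_rot))   # same angle of rotation
assert np.isclose(np.linalg.det(D), 1.0)          # D is again a rotation
```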
[If B had been a reflection everything above would be the same, but BAB^{-1} would rotate by θ in the other direction.]

20. Let D : P_3 → P_3 be the linear map given by D(f) = df/dx.

(a) Find the matrix of D with respect to the basis {1, x, x^2, x^3}.

(b) Find the matrix of D with respect to the basis of Legendre polynomials {P_0, P_1, P_2, P_3}.

(c) Find a matrix B which conjugates the matrix in part (a) to the matrix in part (b).

Solution:

(a) We compute the effect of D on the basis vectors 1, x, x^2, x^3:

   D(1) = 0,  D(x) = 1,  D(x^2) = 2x,  D(x^3) = 3x^2.

So the matrix of D with respect to the basis {1, x, x^2, x^3} is

   [ 0 1 0 0 ]
   [ 0 0 2 0 ]
   [ 0 0 0 3 ]
   [ 0 0 0 0 ]

(b) We compute the effect of D on the basis vectors P_0, P_1, P_2, P_3, where we recall that

   P_0 = 1,  P_1 = x,  P_2 = (1/2)(3x^2 - 1),  P_3 = (1/2)(5x^3 - 3x).

We find

   D(P_0) = 0,  D(P_1) = 1 = P_0,  D(P_2) = 3x = 3P_1,  D(P_3) = (1/2)(15x^2 - 3) = 5P_2 + P_0.

So the matrix of D with respect to the basis {P_0, P_1, P_2, P_3} is

   [ 0 1 0 1 ]
   [ 0 0 3 0 ]
   [ 0 0 0 5 ]
   [ 0 0 0 0 ]

(c) Such a matrix B is the change of basis matrix. More precisely, if A_1 is the matrix in part (a) and A_2 is the matrix in part (b), then B^{-1} A_1 B = A_2, where

   B = [ 1 0 -1/2   0  ]
       [ 0 1   0  -3/2 ]
       [ 0 0  3/2   0  ]
       [ 0 0   0   5/2 ]

The columns of B are the coordinates of P_0, P_1, P_2, P_3 with respect to the basis {1, x, x^2, x^3}.
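As a closing appendix (an addition, not part of the original problem set), here is a NumPy sketch that spot-checks several of the numerical answers above; the matrices and vectors are transcribed from the solutions to Problems 2, 3, 8, 10, 11, 15 and 20.

```python
import numpy as np

# Problem 2: reflection about x + y + z = 0 via A = I - 2 u u^T.
u = np.ones(3) / np.sqrt(3)
refl = np.eye(3) - 2 * np.outer(u, u)
assert np.allclose(refl, np.array([[1, -2, -2], [-2, 1, -2], [-2, -2, 1]]) / 3)

# Problem 3: the matrix is orthogonal, rotates by pi (cos = -1) about (1, 4, 1).
rot = np.array([[-8, 4, 1], [4, 7, 4], [1, 4, -8]]) / 9
assert np.allclose(rot @ rot.T, np.eye(3))
assert np.isclose((np.trace(rot) - 1) / 2, -1.0)
assert np.allclose(rot @ [1, 4, 1], [1, 4, 1])

# Problem 8: a block-diagonal matrix with characteristic polynomial
# (x^2 + 1)(x - 1)(x + 1) satisfies A^4 = I.
A8 = np.zeros((4, 4))
A8[0, 0], A8[1, 1] = 1.0, -1.0
A8[2:, 2:] = [[0.0, -1.0], [1.0, 0.0]]       # 2x2 block with eigenvalues i, -i
assert np.allclose(np.linalg.matrix_power(A8, 4), np.eye(4))

# Problem 10: (8, 9, 11) is fixed by M.
M = np.array([[1.0, 2, 2], [2, 3, 1], [3, 1, 3]]) / 6
assert np.allclose(M @ [8, 9, 11], [8, 9, 11])

# Problem 11: a set is independent iff the rank of the matrix with the
# vectors as rows equals the number of vectors.
def independent(vecs):
    return np.linalg.matrix_rank(np.array(vecs, dtype=float)) == len(vecs)

assert not independent([(1, 0, 1, 0), (1, 0, 0, 1), (0, 1, 0, 1), (0, 1, 1, 0)])  # (a)
assert independent([(1, 0, 1, 0), (1, 0, 0, 1), (0, 1, 0, 1), (0, 0, 1, 0)])      # (b)
assert not independent([(1, 2, 3), (4, 5, 6), (7, 8, 9)])                         # (c)

# Problem 15(d): the dependence 7 u3 = 13 u1 + 6 u2.
assert np.allclose(7 * np.array([7, 8, 9]),
                   13 * np.array([1, 2, 3]) + 6 * np.array([6, 5, 4]))

# Problem 20: B conjugates the monomial-basis matrix to the Legendre-basis one.
A_mono = np.array([[0., 1, 0, 0], [0, 0, 2, 0], [0, 0, 0, 3], [0, 0, 0, 0]])
A_leg  = np.array([[0., 1, 0, 1], [0, 0, 3, 0], [0, 0, 0, 5], [0, 0, 0, 0]])
Bmat = np.array([[1., 0, -0.5,  0],
                 [0., 1,  0,   -1.5],
                 [0., 0,  1.5,  0],
                 [0., 0,  0,    2.5]])
assert np.allclose(np.linalg.inv(Bmat) @ A_mono @ Bmat, A_leg)
```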