
Chapter 3
Diagonalisation

Eigenvalues and eigenvectors; diagonalisation of a matrix; orthogonal diagonalisation of symmetric matrices.

Reading

As in the previous chapter, there is no specific essential reading for this chapter. It is essential that you do some reading, but the topics discussed in this chapter are adequately covered in many texts on linear algebra. The list below gives examples of relevant reading. (For full publication details, see Chapter 1.)

Ostaszewski, A. Mathematics in Economics. Chapter 7, Sections 7.1, 7.4, 7.6.
Ostaszewski, A. Advanced Mathematical Methods. Chapter 5, Sections 5.1 and 5.2.
Leon, S.J. Linear Algebra with Applications. Chapter 6, Sections 6.1 and 6.3.
Simon, C.P. and Blume, L. Mathematics for Economists. Chapter 23, Sections 23.1, 23.7.

Introduction

One of the most useful techniques in applications of matrices and linear algebra is diagonalisation. Before discussing this, we have to look at the topic of eigenvalues and eigenvectors. We shall explore a number of applications of diagonalisation in the next chapter of the guide.

Eigenvalues and eigenvectors

Definitions

Suppose that $A$ is a square matrix. The number $\lambda$ is said to be an eigenvalue of $A$ if for some non-zero vector $x$, $Ax = \lambda x$. Any non-zero vector $x$ for which this equation holds is called an eigenvector for eigenvalue $\lambda$, or an eigenvector of $A$ corresponding to the eigenvalue $\lambda$.

Finding eigenvalues and eigenvectors

To determine whether $\lambda$ is an eigenvalue of $A$, we need to determine whether there are any non-zero solutions to the matrix equation $Ax = \lambda x$. Note that the matrix equation $Ax = \lambda x$ is not of the standard form, since the right-hand side is not a fixed vector $b$, but depends explicitly on $x$. However, we can rewrite it in standard form. Note that $\lambda x = \lambda I x$, where $I$ is, as usual, the identity matrix. So the equation is equivalent to $Ax = \lambda I x$, or $Ax - \lambda I x = 0$, which is equivalent to $(A - \lambda I)x = 0$.

Now, a square linear system $Bx = 0$ has solutions other than $x = 0$ precisely when $|B| = 0$. Therefore, taking $B = A - \lambda I$, $\lambda$ is an eigenvalue of $A$ if and only if the determinant of the matrix $A - \lambda I$ is zero. This determinant, $p(\lambda) = |A - \lambda I|$, is known as the characteristic polynomial of $A$, since it is a polynomial in the variable $\lambda$. To find the eigenvalues, we solve the equation $|A - \lambda I| = 0$. Let us illustrate with a very simple example.

Example: Let
$$A = \begin{pmatrix} 1 & 1 \\ 2 & 2 \end{pmatrix}.$$
Then
$$A - \lambda I = \begin{pmatrix} 1-\lambda & 1 \\ 2 & 2-\lambda \end{pmatrix}$$
and the characteristic polynomial is
$$|A - \lambda I| = (1-\lambda)(2-\lambda) - 2 = \lambda^2 - 3\lambda + 2 - 2 = \lambda^2 - 3\lambda.$$
So the eigenvalues are the solutions of $\lambda^2 - 3\lambda = 0$. To solve this, one could use either the formula for the solutions to a quadratic, or simply observe that the equation is $\lambda(\lambda - 3) = 0$, with solutions $\lambda = 0$ and $\lambda = 3$. Hence the eigenvalues of $A$ are $0$ and $3$.
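Calculations like this are easy to confirm with a computer algebra system. The following is a minimal sketch (an illustration only, assuming Python with the sympy library; any computer algebra system would do) that reproduces the characteristic polynomial and its roots:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[1, 1],
               [2, 2]])
# The characteristic polynomial is det(A - lambda*I).
p = (A - lam * sp.eye(2)).det()
print(sp.expand(p))                 # lambda**2 - 3*lambda
print(sp.solve(sp.Eq(p, 0), lam))   # [0, 3]
```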

To find an eigenvector for eigenvalue $\lambda$, we have to find a solution to $(A - \lambda I)x = 0$ other than the zero vector. (I stress the fact that eigenvectors cannot be the zero vector, because this is a mistake many students make.) This is easy, since for a particular value of $\lambda$, all we need to do is solve a simple linear system. We illustrate by finding the eigenvectors for the matrix of the example just given.

Example: We find eigenvectors of
$$A = \begin{pmatrix} 1 & 1 \\ 2 & 2 \end{pmatrix}.$$
We have seen that the eigenvalues are $0$ and $3$. To find an eigenvector for eigenvalue $0$ we solve the system $(A - 0I)x = 0$: that is, $Ax = 0$, or
$$\begin{pmatrix} 1 & 1 \\ 2 & 2 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$
This could be solved using row operations. (Note that it cannot be solved by using inverse matrices, since $A$ is not invertible. In fact, inverse matrix techniques and Cramer's rule will never be of use here, since $\lambda$ being an eigenvalue means that $A - \lambda I$ is not invertible.) However, we can solve this fairly directly just by looking at the equations. We have to solve
$$x_1 + x_2 = 0, \qquad 2x_1 + 2x_2 = 0.$$
Clearly both equations are equivalent. From either one, we obtain $x_1 = -x_2$. We can choose $x_2$ to be any number we like. Let's take $x_2 = 1$; then we need $x_1 = -x_2 = -1$. It follows that an eigenvector for $0$ is
$$x = \begin{pmatrix} -1 \\ 1 \end{pmatrix}.$$
The choice $x_2 = 1$ was arbitrary; we could have chosen any non-zero number, so, for example, the following are eigenvectors for $0$:
$$\begin{pmatrix} -5 \\ 5 \end{pmatrix}, \quad \begin{pmatrix} -0.5 \\ 0.5 \end{pmatrix}.$$
There are infinitely many eigenvectors for $0$: for each $\alpha \neq 0$,
$$\begin{pmatrix} -\alpha \\ \alpha \end{pmatrix}$$
is an eigenvector for $0$. But be careful not to think that you can choose $\alpha = 0$; for then $x$ becomes the zero vector, and this is never an eigenvector, simply by definition.

To find an eigenvector for $3$, we solve $(A - 3I)x = 0$, which is
$$\begin{pmatrix} -2 & 1 \\ 2 & -1 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$
This is equivalent to the equations
$$-2x_1 + x_2 = 0, \qquad 2x_1 - x_2 = 0,$$
which are together equivalent to the single equation $x_2 = 2x_1$. If we choose $x_1 = 1$, we obtain the eigenvector
$$x = \begin{pmatrix} 1 \\ 2 \end{pmatrix}.$$
(Again, any non-zero scalar multiple of this vector is also an eigenvector for eigenvalue $3$.)

We illustrated with a $2 \times 2$ example just for simplicity, but you should be able to work with $3 \times 3$ matrices. We give three such examples.
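Since an eigenvector for $t$ is any non-zero vector in the null space of $A - tI$, the eigenvector calculations above can also be checked mechanically. A short sketch (again illustrative only, using sympy):

```python
import sympy as sp

A = sp.Matrix([[1, 1],
               [2, 2]])
# For each eigenvalue t, the eigenvectors are the non-zero vectors
# in the null space of A - t*I; nullspace() returns a basis.
for t in (0, 3):
    vecs = (A - t * sp.eye(2)).nullspace()
    print(t, [list(v) for v in vecs])
# 0 [[-1, 1]]
# 3 [[1/2, 1]]
```

Note that sympy's basis vector for eigenvalue 3 is $(1/2, 1)^T$, a (perfectly acceptable) scalar multiple of the eigenvector $(1, 2)^T$ found by hand.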

Example: Suppose that
$$A = \begin{pmatrix} 4 & 0 & 4 \\ 0 & 4 & 4 \\ 4 & 4 & 8 \end{pmatrix}.$$
Find the eigenvalues of $A$ and obtain one eigenvector for each eigenvalue.

To find the eigenvalues we solve $|A - \lambda I| = 0$. Now,
$$|A - \lambda I| = \begin{vmatrix} 4-\lambda & 0 & 4 \\ 0 & 4-\lambda & 4 \\ 4 & 4 & 8-\lambda \end{vmatrix} = (4-\lambda)\begin{vmatrix} 4-\lambda & 4 \\ 4 & 8-\lambda \end{vmatrix} + 4\begin{vmatrix} 0 & 4-\lambda \\ 4 & 4 \end{vmatrix}$$
$$= (4-\lambda)\big((4-\lambda)(8-\lambda) - 16\big) + 4\big({-4(4-\lambda)}\big) = (4-\lambda)\big((4-\lambda)(8-\lambda) - 16\big) - 16(4-\lambda).$$
Now, we notice that each of the two terms in this expression has $4-\lambda$ as a factor, so instead of expanding everything, we take $4-\lambda$ out as a common factor, obtaining
$$|A - \lambda I| = (4-\lambda)\big((4-\lambda)(8-\lambda) - 16 - 16\big) = (4-\lambda)(32 - 12\lambda + \lambda^2 - 32) = (4-\lambda)(\lambda^2 - 12\lambda) = (4-\lambda)\lambda(\lambda - 12).$$
It follows that the eigenvalues are $4$, $0$, $12$. (The characteristic polynomial will not always factorise so easily. Here it was simple because of the common factor $4-\lambda$. The next example is more difficult.)

To find an eigenvector for $4$, we have to solve the equation $(A - 4I)x = 0$, that is,
$$\begin{pmatrix} 0 & 0 & 4 \\ 0 & 0 & 4 \\ 4 & 4 & 4 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}.$$
Of course, we could use row operations, but the system is simple enough to solve straight away. The equations are
$$4x_3 = 0, \qquad 4x_3 = 0, \qquad 4x_1 + 4x_2 + 4x_3 = 0,$$
so $x_3 = 0$ and $x_2 = -x_1$. Choosing $x_1 = 1$, we get the eigenvector $(1, -1, 0)^T$. (Again, we can choose $x_1$ to be any non-zero number. So the eigenvectors for eigenvalue $4$ are all non-zero multiples of this vector.)

Activity 3.1 Determine eigenvectors for $0$ and $12$. You should find that for $\lambda = 0$ your eigenvector is a non-zero multiple of $(1, 1, -1)^T$, and that for $\lambda = 12$ your eigenvector is a non-zero multiple of $(1, 1, 2)^T$.
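For a quick numerical check of the eigenvalues just found, here is a minimal sketch using numpy (illustrative only; numpy works in floating point, unlike the exact hand calculation):

```python
import numpy as np

A = np.array([[4.0, 0.0, 4.0],
              [0.0, 4.0, 4.0],
              [4.0, 4.0, 8.0]])
# numpy returns the eigenvalues in no particular order, as floats;
# sorting makes the comparison with 0, 4, 12 immediate.
print(np.sort(np.linalg.eigvals(A)))   # approximately [ 0.  4. 12.]
```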

Example: Let
$$A = \begin{pmatrix} -3 & -1 & -2 \\ 1 & -1 & 1 \\ 1 & 1 & 0 \end{pmatrix}.$$
Given that $-1$ is an eigenvalue of $A$, find all the eigenvalues of $A$.

We calculate the characteristic polynomial of $A$:
$$|A - \lambda I| = \begin{vmatrix} -3-\lambda & -1 & -2 \\ 1 & -1-\lambda & 1 \\ 1 & 1 & -\lambda \end{vmatrix} = (-3-\lambda)\begin{vmatrix} -1-\lambda & 1 \\ 1 & -\lambda \end{vmatrix} + \begin{vmatrix} 1 & 1 \\ 1 & -\lambda \end{vmatrix} - 2\begin{vmatrix} 1 & -1-\lambda \\ 1 & 1 \end{vmatrix}$$
$$= (-3-\lambda)(\lambda^2 + \lambda - 1) + (-\lambda - 1) - 2(2 + \lambda) = -\lambda^3 - 4\lambda^2 - 5\lambda - 2.$$
Now, the fact that $-1$ is an eigenvalue means that $-1$ is a solution of the equation $|A - \lambda I| = 0$, which means that $(\lambda - (-1))$, that is, $(\lambda + 1)$, is a factor of the characteristic polynomial $|A - \lambda I|$. So this characteristic polynomial can be written in the form $(\lambda + 1)(a\lambda^2 + b\lambda + c)$. Clearly we must have $a = -1$ and $c = -2$ to obtain the correct $\lambda^3$ term and the correct constant. Given this, $b = -3$. In other words, the characteristic polynomial is
$$(\lambda + 1)(-\lambda^2 - 3\lambda - 2) = -(\lambda + 1)(\lambda^2 + 3\lambda + 2) = -(\lambda + 1)(\lambda + 1)(\lambda + 2).$$
That is, $|A - \lambda I| = -(\lambda + 1)^2(\lambda + 2)$. The eigenvalues are the solutions to $|A - \lambda I| = 0$, so they are $\lambda = -1$ and $\lambda = -2$. Note that in this case there are only two eigenvalues (or, the eigenvalue $-1$ is repeated, or has multiplicity $2$, as it is sometimes said).

Example: Let
$$A = \begin{pmatrix} 3 & -1 & 1 \\ 0 & 2 & 0 \\ 1 & -1 & 3 \end{pmatrix}.$$
Then (check this!) the characteristic polynomial is $-\lambda^3 + 8\lambda^2 - 20\lambda + 16$. This factorises (check!) as
$$-(\lambda - 2)(\lambda - 2)(\lambda - 4),$$
so the eigenvalues are $2$ and $4$. There are only two eigenvalues in this case. (We sometimes say that the eigenvalue $2$ is repeated, or has multiplicity $2$, because $(\lambda - 2)^2$ is a factor of the characteristic polynomial.)

To find an eigenvector for $\lambda = 4$, we have to solve the equation $(A - 4I)x = 0$, that is,
$$\begin{pmatrix} -1 & -1 & 1 \\ 0 & -2 & 0 \\ 1 & -1 & -1 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}.$$
The equations are
$$-x_1 - x_2 + x_3 = 0, \qquad -2x_2 = 0, \qquad x_1 - x_2 - x_3 = 0,$$
so $x_2 = 0$ and $x_1 = x_3$. Choosing $x_3 = 1$, we get the eigenvector $(1, 0, 1)^T$.

For $\lambda = 2$, we have to solve the equation $(A - 2I)x = 0$, that is,
$$\begin{pmatrix} 1 & -1 & 1 \\ 0 & 0 & 0 \\ 1 & -1 & 1 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}.$$
This system is equivalent to the single equation $x_1 - x_2 + x_3 = 0$. (Convince yourself!) Choosing $x_3 = 0$ and $x_2 = 1$ we have $x_1 = 1$, so we obtain the eigenvector $(1, 1, 0)^T$.
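The step of dividing out a factor corresponding to a known eigenvalue, as in the first example above, is also easy to check by machine. A minimal sympy sketch (illustrative only):

```python
import sympy as sp

lam = sp.symbols('lambda')
p = -lam**3 - 4*lam**2 - 5*lam - 2
# lambda = -1 is a known root, so (lambda + 1) divides p exactly:
# sp.div returns the quotient and the remainder (which must be 0 here).
q, r = sp.div(p, lam + 1)
print(q, r)           # -lambda**2 - 3*lambda - 2 , 0
print(sp.factor(p))   # -(lambda + 1)**2*(lambda + 2)
```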

Complex eigenvalues

Although we shall only deal in this subject with real matrices (that is, matrices whose entries are real numbers), it is possible for such real matrices to have complex eigenvalues. This is not something you have to spend much time on, but you have to be aware of it. We briefly describe complex numbers. The complex numbers are based on the complex number $i$, which is defined to be the square root of $-1$. (Of course, no such real number exists.) Any complex number $z$ can be written in the form $z = a + bi$, where $a, b$ are real numbers. We call $a$ the real part and $b$ the imaginary part of $z$. Of course, any real number is a complex number, since $a = a + 0i$. (For more discussion of complex numbers, see Appendix A3 of Simon and Blume.) The following example shows a matrix with complex eigenvalues, and it also demonstrates how to deal with complex numbers.

Example: Consider the matrix
$$A = \begin{pmatrix} 1 & 1 \\ -9 & 1 \end{pmatrix}.$$
We shall see that it has complex eigenvalues. First, the characteristic polynomial $|A - \lambda I|$ is $(1-\lambda)^2 + 9 = \lambda^2 - 2\lambda + 10$. (Check this!) Using the formula for the roots of a quadratic equation, the eigenvalues are
$$\frac{2 \pm \sqrt{-36}}{2}.$$
Now $\sqrt{-36} = \sqrt{(36)(-1)} = \sqrt{36}\,\sqrt{-1} = 6i$. So the eigenvalues are the complex numbers $1 + 3i$ and $1 - 3i$. Let's proceed with finding eigenvectors. To find an eigenvector for $1 + 3i$, we solve $(A - (1+3i)I)x = 0$, which is
$$\begin{pmatrix} -3i & 1 \\ -9 & -3i \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$
This is equivalent to the equations
$$-3ix_1 + x_2 = 0, \qquad -9x_1 - 3ix_2 = 0.$$
But the second equation is just $-3i$ times the first, so both are equivalent. Taking the second, we see that $x_1 = -(i/3)x_2$. So an eigenvector is (taking $x_2 = 3$) $(-i, 3)^T$. For $\lambda = 1 - 3i$, we end up solving the system
$$3ix_1 + x_2 = 0, \qquad -9x_1 + 3ix_2 = 0,$$
a solution of which is $(i, 3)^T$.

You should be aware, then, that even though we are not dealing with matrices that have complex numbers as their entries, the possibility still exists that eigenvalues (and eigenvectors) will involve complex numbers. However, if a matrix is symmetric (that is, it equals its transpose), then it certainly has real eigenvalues. This useful fact, which we shall prove later, is important when we consider quadratic forms in the next chapter.
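Numerical software handles complex eigenvalues of real matrices without any fuss. A minimal numpy sketch of the example above (illustrative only; numpy writes $i$ as `j`):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [-9.0, 1.0]])
# A real matrix can have complex eigenvalues; numpy returns them as
# complex numbers (the order of the pair may vary).
vals, vecs = np.linalg.eig(A)
print(vals)   # [1.+3.j 1.-3.j]
```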

Diagonalisation of a square matrix

Square matrices $A$ and $B$ are similar if there is an invertible matrix $P$ such that $P^{-1}AP = B$. The matrix $A$ is diagonalisable if it is similar to a diagonal matrix; in other words, if there is a diagonal matrix $D$ and an invertible matrix $P$ such that $P^{-1}AP = D$.

Suppose that the matrix $A$ is diagonalisable, and that $P^{-1}AP = D$, where $D$ is the diagonal matrix
$$D = \operatorname{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n) = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix}.$$
(Note the useful notation $\operatorname{diag}(\lambda_1, \ldots, \lambda_n)$ for describing the diagonal matrix $D$.) Then $AP = PD$. If the columns of $P$ are the vectors $v_1, v_2, \ldots, v_n$, then
$$AP = A(v_1 \; \ldots \; v_n) = (Av_1 \; \ldots \; Av_n)$$
and
$$PD = (v_1 \; \ldots \; v_n)\begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix} = (\lambda_1 v_1 \; \ldots \; \lambda_n v_n).$$
So this means that
$$Av_1 = \lambda_1 v_1, \quad Av_2 = \lambda_2 v_2, \quad \ldots, \quad Av_n = \lambda_n v_n.$$
The fact that $P^{-1}$ exists means that none of the vectors $v_i$ is the zero vector. So this means that (for $i = 1, 2, \ldots, n$) $\lambda_i$ is an eigenvalue of $A$ and $v_i$ is a corresponding eigenvector. Since $P$ has an inverse, these eigenvectors are linearly independent. Therefore $A$ has $n$ linearly independent eigenvectors. Conversely, if $A$ has $n$ linearly independent eigenvectors, then the matrix $P$ whose columns are these eigenvectors will be invertible, and we will have $P^{-1}AP = D$, where $D$ is the diagonal matrix with entries equal to the eigenvalues of $A$. We have therefore established the following result.

Theorem 3.1 An $n \times n$ matrix $A$ is diagonalisable if and only if it has $n$ linearly independent eigenvectors. Suppose that this is the case, and let $v_1, \ldots, v_n$ be $n$ linearly independent eigenvectors, where $v_i$ is an eigenvector for eigenvalue $\lambda_i$. Then the matrix $P = (v_1 \; \ldots \; v_n)$ is such that $P^{-1}$ exists, and $P^{-1}AP = D$, where $D = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$.

There is a more sophisticated way to think about this result, in terms of change of basis and matrix representations of linear transformations. Suppose that $T$ is the linear transformation corresponding to $A$, so that $T(x) = Ax$ for all $x$. Suppose that $A$ has a set of $n$ linearly independent eigenvectors $B = \{x_1, x_2, \ldots, x_n\}$, corresponding (respectively) to the eigenvalues $\lambda_1, \ldots, \lambda_n$. Since this is a linearly independent set of size $n$ in $\mathbb{R}^n$, $B$ is a basis for $\mathbb{R}^n$. By the result of Chapter 2 on matrix representations, the matrix of $T$ with respect to $B$ is
$$A_T[B, B] = \big([T(x_1)]_B \; \ldots \; [T(x_n)]_B\big).$$
But $T(x_i) = Ax_i = \lambda_i x_i$, so the coordinate vector of $T(x_i)$ with respect to $B$ is
$$[T(x_i)]_B = (0, \ldots, 0, \lambda_i, 0, \ldots, 0)^T,$$
which has $\lambda_i$ in entry $i$ and all other entries zero. Therefore
$$A_T[B, B] = \operatorname{diag}(\lambda_1, \ldots, \lambda_n) = D.$$
But, by the change-of-basis result of Chapter 2, $A_T[B, B] = P^{-1} A_T P$, where $P = (x_1 \; \ldots \; x_n)$ and $A_T$ is the matrix representing $T$ with respect to the standard basis, which in this case is simply $A$ itself. We therefore see that $P^{-1}AP = A_T[B, B] = D$, and so the matrix $P$ diagonalises $A$.
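Theorem 3.1 translates directly into a computation: stack eigenvectors as the columns of $P$ and evaluate $P^{-1}AP$. Here is a minimal numpy sketch, using the $3 \times 3$ matrix of the earlier example (illustrative only):

```python
import numpy as np

A = np.array([[4.0, 0.0, 4.0],
              [0.0, 4.0, 4.0],
              [4.0, 4.0, 8.0]])
# Columns of P are the eigenvectors for eigenvalues 4, 0, 12, in that order.
P = np.array([[1.0, 1.0, 1.0],
              [-1.0, 1.0, 1.0],
              [0.0, -1.0, 2.0]])
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))   # diag(4, 0, 12), up to rounding
```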

Example: Consider again the matrix
$$A = \begin{pmatrix} 4 & 0 & 4 \\ 0 & 4 & 4 \\ 4 & 4 & 8 \end{pmatrix}.$$
We have seen that it has three distinct eigenvalues $4$, $0$, $12$, and that eigenvectors corresponding to eigenvalues $4$, $0$, $12$ are (in that order)
$$\begin{pmatrix} 1 \\ -1 \\ 0 \end{pmatrix}, \quad \begin{pmatrix} 1 \\ 1 \\ -1 \end{pmatrix}, \quad \begin{pmatrix} 1 \\ 1 \\ 2 \end{pmatrix}.$$
We now form the matrix $P$ whose columns are these eigenvectors:
$$P = \begin{pmatrix} 1 & 1 & 1 \\ -1 & 1 & 1 \\ 0 & -1 & 2 \end{pmatrix}.$$
Then, according to the theory, $P$ should have an inverse, and we should have $P^{-1}AP = D = \operatorname{diag}(4, 0, 12)$. To check that this is true, we could calculate $P^{-1}$ and evaluate the product. The inverse may be calculated using either elementary row operations or determinants. (Matrix inversion is not part of this subject; however, it is part of the prerequisite subject Mathematics for economists. You should therefore know how to invert a matrix.)

Activity 3.2 Calculate $P^{-1}$ and verify that $P^{-1}AP = D$.

Not all $n \times n$ matrices have $n$ linearly independent eigenvectors, as the following example shows.

Example: The matrix
$$A = \begin{pmatrix} 4 & 1 \\ -1 & 2 \end{pmatrix}$$
has characteristic polynomial $\lambda^2 - 6\lambda + 9 = (\lambda - 3)^2$, so there is only one eigenvalue, $\lambda = 3$. The eigenvectors are the non-zero solutions to $(A - 3I)x = 0$: that is,
$$\begin{pmatrix} 1 & 1 \\ -1 & -1 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$
This is equivalent to the single equation $x_1 + x_2 = 0$, with general solution $x_2 = -x_1$. Setting $x_1 = r$, we see that the solution set of the system consists of all vectors of the form
$$\begin{pmatrix} r \\ -r \end{pmatrix}$$
as $r$ runs through all non-zero real numbers. So the eigenvectors are precisely the non-zero scalar multiples of the fixed vector $(1, -1)^T$. Any two eigenvectors are therefore multiples of each other and hence form a linearly dependent set. In other words, there are not two linearly independent eigenvectors, and the matrix is not diagonalisable.
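This failure of diagonalisability can be seen mechanically: the eigenvalue has algebraic multiplicity 2, but the null space of $A - 3I$ is only one-dimensional. A minimal sympy sketch, using the matrix of the example above (illustrative only):

```python
import sympy as sp

A = sp.Matrix([[4, 1],
               [-1, 2]])
print(A.eigenvals())                   # {3: 2} -- eigenvalue 3, multiplicity 2
print((A - 3*sp.eye(2)).nullspace())   # only one basis vector
print(A.is_diagonalizable())           # False
```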

The following result is useful. It shows that if a matrix has $n$ different eigenvalues then it is diagonalisable.

Theorem 3.2 Eigenvectors corresponding to different eigenvalues are linearly independent. So if an $n \times n$ matrix has $n$ different eigenvalues, then it has a set of $n$ linearly independent eigenvectors and is therefore diagonalisable.

For a proof, see Ostaszewski, Mathematics in Economics, Section 7.4.

It is not, however, necessary for the eigenvalues to be distinct. What is needed for diagonalisation is a set of $n$ linearly independent eigenvectors, and this can happen even when there is a repeated eigenvalue (that is, when there are fewer than $n$ different eigenvalues). The following example illustrates this.

Example: We considered the matrix
$$A = \begin{pmatrix} 3 & -1 & 1 \\ 0 & 2 & 0 \\ 1 & -1 & 3 \end{pmatrix}$$
above, and we saw that it has only two eigenvalues, $4$ and $2$. If we want to diagonalise it, we need to find three linearly independent eigenvectors. We found that an eigenvector corresponding to $\lambda = 4$ is $(1, 0, 1)^T$, and that, for $\lambda = 2$, the eigenvectors are given by the non-zero-vector solutions to the system consisting of just the single equation
$$x_1 - x_2 + x_3 = 0.$$
Above, we simply wanted to find an eigenvector, but now we want to find two which, together with the eigenvector for $\lambda = 4$, form a linearly independent set. Now, the system for the eigenvectors corresponding to $\lambda = 2$ has just one equation and is therefore of rank $1$; it follows that the solution set is two-dimensional. Let's see exactly what the general solution looks like. We have $x_1 = x_2 - x_3$, and $x_2, x_3$ can be chosen independently of each other. Setting $x_3 = r$ and $x_2 = s$, we see that the general solution is
$$\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} s - r \\ s \\ r \end{pmatrix} = s\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix} + r\begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix} \qquad (r, s \in \mathbb{R}).$$
This shows that the solution space (the eigenspace, as it is called in this instance) is spanned by the two linearly independent vectors
$$\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}, \quad \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix}.$$
Now, each of these is an eigenvector corresponding to eigenvalue $2$ and, together with our eigenvector for $\lambda = 4$, the three form a linearly independent set. So there are three linearly independent eigenvectors, even though two of them correspond to the same eigenvalue. The matrix is therefore diagonalisable. We may take
$$P = \begin{pmatrix} 1 & 1 & -1 \\ 0 & 1 & 0 \\ 1 & 0 & 1 \end{pmatrix}.$$
Then (check!) $P^{-1}AP = D = \operatorname{diag}(4, 2, 2)$.
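The key fact in this example, that the eigenspace for the repeated eigenvalue is two-dimensional, can be confirmed with a null-space computation. A minimal sympy sketch (illustrative only):

```python
import sympy as sp

A = sp.Matrix([[3, -1, 1],
               [0, 2, 0],
               [1, -1, 3]])
# The eigenspace for the repeated eigenvalue 2 is the null space of
# A - 2I; it has two basis vectors, so A is diagonalisable.
basis = (A - 2*sp.eye(3)).nullspace()
print(len(basis))                 # 2
print([list(v) for v in basis])   # [[1, 1, 0], [-1, 0, 1]]
```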

Orthogonal diagonalisation of symmetric matrices

The matrix we considered above,
$$A = \begin{pmatrix} 4 & 0 & 4 \\ 0 & 4 & 4 \\ 4 & 4 & 8 \end{pmatrix},$$
is symmetric: that is, its transpose $A^T$ is equal to itself. It turns out that such matrices are always diagonalisable. They are, furthermore, diagonalisable in a special way. A matrix $P$ is orthogonal if $P^T P = P P^T = I$: that is, if $P$ has inverse $P^T$. A matrix $A$ is said to be orthogonally diagonalisable if there is an orthogonal matrix $P$ such that $P^T A P = D$, where $D$ is a diagonal matrix. Note that $P^T = P^{-1}$, so $P^T A P = P^{-1} A P$. The argument given above shows that the columns of $P$ must be $n$ linearly independent eigenvectors of $A$. But the condition that $P^T P = I$ means something else, as we now discuss.

Suppose that the columns of $P$ are $x_1, x_2, \ldots, x_n$, so that $P = (x_1 \; x_2 \; \ldots \; x_n)$. Then the rows of the transpose $P^T$ are $x_1^T, \ldots, x_n^T$, so
$$P^T = \begin{pmatrix} x_1^T \\ \vdots \\ x_n^T \end{pmatrix}.$$
Calculating the matrix product $P^T P$, we find that the $(i, j)$-entry of $P^T P$ is $x_i^T x_j$. But, since $P^T P = I$, we must have
$$x_i^T x_i = 1 \quad (i = 1, 2, \ldots, n), \qquad x_i^T x_j = 0 \quad (i \neq j).$$
We say that vectors $x, y$ are orthogonal if the matrix product $x^T y$ is $0$. (Orthogonality will be discussed in more detail later.) So any two of the eigenvectors $x_1, \ldots, x_n$ must be orthogonal. Furthermore, for $i = 1, 2, \ldots, n$, $x_i^T x_i = 1$. The length of a vector $x$ is
$$\|x\| = \sqrt{\sum_{i=1}^n x_i^2} = \sqrt{x^T x}.$$
So not only must any two of these eigenvectors be orthogonal, but each must have length $1$. We shall discuss orthogonality in more detail in the next chapter. For the moment, we have the following result.

Theorem 3.3 If the matrix $A$ is symmetric ($A^T = A$), then eigenvectors corresponding to different eigenvalues are orthogonal.
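A quick numerical illustration of the defining property $P^T P = I$ (not from the guide, using a rotation matrix as a standard example of an orthogonal matrix):

```python
import numpy as np

# Any 2x2 rotation matrix is orthogonal: its columns are unit length
# and mutually orthogonal, so P^T P = I and P^-1 = P^T.
t = 0.3
P = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t), np.cos(t)]])
print(np.round(P.T @ P, 12))               # the identity matrix
print(np.allclose(np.linalg.inv(P), P.T))  # True
```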

Proof. Suppose that $\lambda$ and $\mu$ are any two different eigenvalues of $A$ and that $x, y$ are corresponding eigenvectors. Then $Ax = \lambda x$ and $Ay = \mu y$. The trick in this proof is to find two different expressions for the product $x^T A y$ (which then must, of course, be equal to each other). Note that the matrix product $x^T A y$ is a $1 \times 1$ matrix or, equivalently, a number. First, since $Ay = \mu y$, we have
$$x^T A y = x^T(\mu y) = \mu\, x^T y.$$
But also, since $Ax = \lambda x$, we have $(Ax)^T = (\lambda x)^T = \lambda x^T$. Now, for any matrices $M, N$, $(MN)^T = N^T M^T$, so $(Ax)^T = x^T A^T$. But $A^T = A$ (because $A$ is symmetric), so $x^T A = \lambda x^T$ and hence
$$x^T A y = \lambda\, x^T y.$$
We therefore have two different expressions for $x^T A y$: it equals $\mu\, x^T y$ and $\lambda\, x^T y$. Hence $\mu\, x^T y = \lambda\, x^T y$, or $(\mu - \lambda)x^T y = 0$. But since $\lambda \neq \mu$ (they are different eigenvalues), we have $\mu - \lambda \neq 0$. We deduce, therefore, that $x^T y = 0$. But this says precisely that $x$ and $y$ are orthogonal, which is exactly what we wanted to prove. This is quite a sneaky proof: the trick is to remember to consider $x^T A y$.

The result just presented shows that if an $n \times n$ symmetric matrix has exactly $n$ different eigenvalues, then any $n$ corresponding eigenvectors are orthogonal to one another. Since we may take the eigenvectors to have length $1$, this shows that the matrix is orthogonally diagonalisable. The following result makes this precise.

Theorem 3.4 Suppose that the symmetric matrix $A$ has $n$ different eigenvalues. Take $n$ corresponding eigenvectors, each of length $1$. (Recall that the length of a vector $x$ is just $\|x\| = \sqrt{\sum_{i=1}^n x_i^2}$.) Form the matrix $P$ which has these eigenvectors as its columns. Then $P^{-1} = P^T$ (that is, $P$ is an orthogonal matrix) and $P^T A P = D$, the diagonal matrix whose entries are the eigenvalues of $A$.

(Note that we have only shown here that symmetric matrices with $n$ different eigenvalues are orthogonally diagonalisable, but it turns out that all symmetric matrices are orthogonally diagonalisable.)

Example: We work with the same matrix we used earlier,
$$A = \begin{pmatrix} 4 & 0 & 4 \\ 0 & 4 & 4 \\ 4 & 4 & 8 \end{pmatrix}.$$
As we have already observed, this is symmetric. We have seen that it has three distinct eigenvalues $4$, $0$, $12$. Earlier, we found that eigenvectors for eigenvalues $4$, $0$, $12$ are (in that order)
$$\begin{pmatrix} 1 \\ -1 \\ 0 \end{pmatrix}, \quad \begin{pmatrix} 1 \\ 1 \\ -1 \end{pmatrix}, \quad \begin{pmatrix} 1 \\ 1 \\ 2 \end{pmatrix}.$$

Activity 3.3 Convince yourself that any two of these three eigenvectors are orthogonal.
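(If you want to confirm your answer to the activity numerically, here is a tiny sketch, illustrative only:)

```python
import numpy as np

u = np.array([1.0, -1.0, 0.0])   # eigenvector for 4
v = np.array([1.0, 1.0, -1.0])   # eigenvector for 0
w = np.array([1.0, 1.0, 2.0])    # eigenvector for 12
# Each pairwise product x^T y is zero, as Theorem 3.3 predicts.
print(u @ v, u @ w, v @ w)       # 0.0 0.0 0.0
```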

Now, these eigenvectors are not of length $1$. For example, the first one has length
$$\sqrt{1^2 + (-1)^2 + 0^2} = \sqrt{2}.$$
If we divide each entry of it by $\sqrt{2}$, we will indeed obtain an eigenvector of length $1$:
$$\begin{pmatrix} 1/\sqrt{2} \\ -1/\sqrt{2} \\ 0 \end{pmatrix}.$$
We can similarly normalise the other two vectors, obtaining
$$\begin{pmatrix} 1/\sqrt{3} \\ 1/\sqrt{3} \\ -1/\sqrt{3} \end{pmatrix}, \quad \begin{pmatrix} 1/\sqrt{6} \\ 1/\sqrt{6} \\ 2/\sqrt{6} \end{pmatrix}.$$

Activity 3.4 Make sure you understand this normalisation.

We now form the matrix $P$ whose columns are these normalised eigenvectors:
$$P = \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{3} & 1/\sqrt{6} \\ -1/\sqrt{2} & 1/\sqrt{3} & 1/\sqrt{6} \\ 0 & -1/\sqrt{3} & 2/\sqrt{6} \end{pmatrix}.$$
Then $P$ is orthogonal and $P^T A P = D = \operatorname{diag}(4, 0, 12)$.

Activity 3.5 Check that $P$ is orthogonal by calculating $P^T P$.

Example: Let
$$A = \begin{pmatrix} 7 & 0 & 9 \\ 0 & 2 & 0 \\ 9 & 0 & 7 \end{pmatrix}.$$
Note that $A$ is symmetric. We find an orthogonal matrix $P$ such that $P^T A P$ is a diagonal matrix. The characteristic polynomial of $A$ is
$$|A - \lambda I| = \begin{vmatrix} 7-\lambda & 0 & 9 \\ 0 & 2-\lambda & 0 \\ 9 & 0 & 7-\lambda \end{vmatrix} = (2-\lambda)\big[(7-\lambda)(7-\lambda) - 81\big] = (2-\lambda)(\lambda^2 - 14\lambda - 32) = (2-\lambda)(\lambda - 16)(\lambda + 2),$$
where we have expanded the determinant using the middle row. So the eigenvalues are $2$, $16$, $-2$. An eigenvector for $\lambda = 2$ is given by
$$5x + 9z = 0, \qquad 9x + 5z = 0.$$
This means $x = z = 0$. So we may take $(0, 1, 0)^T$. This already has length $1$, so there is no need to normalise it. (Recall that we need three eigenvectors which are of length $1$.) For $\lambda = -2$ we find that an eigenvector is $(1, 0, -1)^T$ (or some multiple of this). To normalise it (that is, to make it of length $1$), we divide by its length, which is $\sqrt{2}$, obtaining $(1/\sqrt{2})(1, 0, -1)^T$. For $\lambda = 16$, we find a normalised eigenvector is $(1/\sqrt{2})(1, 0, 1)^T$. It follows that if we let
$$P = \begin{pmatrix} 0 & 1/\sqrt{2} & 1/\sqrt{2} \\ 1 & 0 & 0 \\ 0 & -1/\sqrt{2} & 1/\sqrt{2} \end{pmatrix},$$
then $P$ is orthogonal and $P^T A P = D = \operatorname{diag}(2, -2, 16)$. Check this!
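For symmetric matrices, numerical libraries do the whole orthogonal diagonalisation in one call. A minimal numpy sketch of the last example (illustrative only; `eigh` is numpy's routine for symmetric matrices and returns the eigenvalues in ascending order, with an orthogonal matrix of unit eigenvectors):

```python
import numpy as np

A = np.array([[7.0, 0.0, 9.0],
              [0.0, 2.0, 0.0],
              [9.0, 0.0, 7.0]])
vals, P = np.linalg.eigh(A)
print(vals)                        # [-2.  2. 16.]
print(np.round(P.T @ A @ P, 10))   # diag(-2, 2, 16)
```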

Learning outcomes

This chapter has discussed eigenvalues and eigenvectors and the very important technique of diagonalisation. We shall see in the next chapter how useful a technique diagonalisation is. At the end of this chapter and the relevant reading, you should be able to:

- explain what is meant by eigenvectors and eigenvalues, and by diagonalisation
- find eigenvalues and corresponding eigenvectors for a square matrix
- diagonalise a diagonalisable matrix
- recognise what diagonalisation says in terms of change of basis and matrix representation of linear transformations
- perform orthogonal diagonalisation on a symmetric matrix that has distinct eigenvalues.

Sample examination questions

The following are typical exam questions, or parts of questions.

Question 3.1 Find the eigenvalues of a given $3 \times 3$ matrix $A$, and find an eigenvector for each eigenvalue. Hence find an invertible matrix $P$ and a diagonal matrix $D$ such that $P^{-1}AP = D$.

Question 3.2 Prove that a given $2 \times 2$ matrix is not diagonalisable.

Question 3.3 Let $A$ be any (real) $n \times n$ matrix and suppose $\lambda$ is an eigenvalue of $A$. Show that $\{x : Ax = \lambda x\}$, the set of eigenvectors for eigenvalue $\lambda$ together with the zero vector, is a subspace of $\mathbb{R}^n$.

Question 3.4 A $3 \times 3$ matrix $A$ and a vector $x$ are given. Show that the vector $x$ is an eigenvector of $A$. What is the corresponding eigenvalue? Find the other eigenvalues of $A$, and an eigenvector for each of them. Find an invertible matrix $P$ and a diagonal matrix $D$ such that $P^{-1}AP = D$.

Question 3.5 For a given $3 \times 3$ matrix $A$, find an invertible matrix $P$ and a diagonal matrix $D$ such that $P^{-1}AP = D$.

Question 3.6 For a given symmetric $3 \times 3$ matrix $A$, find the eigenvalues of $A$, and an eigenvector for each of them. Find an orthogonal matrix $P$ and a diagonal matrix $D$ such that $P^T A P = D$.

Question 3.7 Suppose that $A$ is a real diagonalisable matrix and that all the eigenvalues of $A$ are non-negative. Prove that there is a matrix $B$ such that $B^2 = A$.

Sketch answers or comments on selected questions

Question 3.1 The characteristic polynomial is $-\lambda^3 + 14\lambda^2 - 48\lambda$, which is easily factorised as $-\lambda(\lambda - 6)(\lambda - 8)$. So the eigenvalues are $0$, $6$, $8$. Corresponding eigenvectors (and non-zero multiples of them) are calculated in the usual way; taking them as the columns of $P$, we may take $D = \operatorname{diag}(0, 6, 8)$.

Question 3.2 You can check that the matrix has only one eigenvalue, and that the corresponding eigenvectors are all the scalar multiples of a single fixed vector. So there cannot be two linearly independent eigenvectors, and hence the matrix is not diagonalisable.

Question 3.3 Denote the set described by $W$. First, $0 \in W$. Suppose now that $x, y$ are in $W$ and that $\alpha \in \mathbb{R}$. We need to show that $x + y$ and $\alpha x$ are also in $W$. We know that $Ax = \lambda x$ and $Ay = \lambda y$, so
$$A(x + y) = Ax + Ay = \lambda x + \lambda y = \lambda(x + y)$$
and
$$A(\alpha x) = \alpha(Ax) = \alpha(\lambda x) = \lambda(\alpha x),$$
and so $x + y$ and $\alpha x$ are indeed in $W$.

Question 3.4 It is given that $x$ is an eigenvector. To determine the corresponding eigenvalue, we work out $Ax$. This should be $\lambda x$, where $\lambda$ is the required eigenvalue. Performing the calculation, we see that $Ax = x$, and so $\lambda = 1$. The characteristic polynomial of $A$ is $p(\lambda) = -\lambda^3 + 2\lambda^2 + \lambda - 2$. Since $\lambda = 1$ is a root, we know that $(\lambda - 1)$ is a factor. Factorising, we obtain
$$p(\lambda) = (\lambda - 1)(-\lambda^2 + \lambda + 2) = -(\lambda - 1)(\lambda - 2)(\lambda + 1),$$
so the other eigenvalues are $\lambda = 2, -1$. Corresponding eigenvectors are found in the usual way; taking the three eigenvectors as the columns of $P$, we may take $D = \operatorname{diag}(1, 2, -1)$.

Question 3.5 This is slightly more complicated, since there are not 3 distinct eigenvalues. The eigenvalues turn out to be $1$ and $2$, with $2$ occurring twice. We need to find a set of 3 linearly independent eigenvectors, so we need another two coming from the eigenspace corresponding to $2$. You should find that the eigenspace for $\lambda = 2$ is two-dimensional; its two basis vectors, together with an eigenvector for $1$, do indeed form a linearly independent set. Therefore, taking these as the columns of $P$, we may take $D = \operatorname{diag}(1, 2, 2)$.

Question 3.6 The characteristic polynomial turns out to be
$$p(\lambda) = -\lambda^3 + 4\lambda^2 - 3\lambda = -\lambda(\lambda - 3)(\lambda - 1),$$
so the eigenvalues are $0$, $1$, $3$. To perform orthogonal diagonalisation (rather than simply diagonalisation) we need to normalise the corresponding eigenvectors (that is, make them of length $1$ by dividing each by its length). The lengths of the eigenvectors are (respectively) $\sqrt{3}$, $\sqrt{2}$, $\sqrt{6}$. If we take $P$ to be the matrix with the normalised eigenvectors as its columns, then $P$ is orthogonal and
$$P^T A P = \operatorname{diag}(0, 1, 3) = D.$$

Question 3.7 Since $A$ can be diagonalised, we have $P^{-1}AP = D$ for some $P$, where $D = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$, these entries being the eigenvalues of $A$. It is given that all $\lambda_i \geq 0$. We have $A = PDP^{-1}$. Let
$$B = P \operatorname{diag}\!\left(\sqrt{\lambda_1}, \sqrt{\lambda_2}, \ldots, \sqrt{\lambda_n}\right) P^{-1}.$$
Then
$$B^2 = P \operatorname{diag}\!\left(\sqrt{\lambda_1}, \ldots, \sqrt{\lambda_n}\right) P^{-1} P \operatorname{diag}\!\left(\sqrt{\lambda_1}, \ldots, \sqrt{\lambda_n}\right) P^{-1} = P \operatorname{diag}(\lambda_1, \ldots, \lambda_n) P^{-1} = P D P^{-1} = A,$$
and we are done.
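The construction in the answer to Question 3.7 is easy to carry out numerically. A minimal numpy sketch (illustrative only), using the symmetric matrix from the chapter, which has the non-negative eigenvalues $0$, $4$, $12$:

```python
import numpy as np

A = np.array([[4.0, 0.0, 4.0],
              [0.0, 4.0, 4.0],
              [4.0, 4.0, 8.0]])
# Diagonalise, take square roots of the (non-negative) eigenvalues,
# and reassemble: B = P diag(sqrt(lambda_i)) P^-1, so that B^2 = A.
vals, P = np.linalg.eigh(A)
# clip guards against tiny negative rounding errors before sqrt
B = P @ np.diag(np.sqrt(np.clip(vals, 0.0, None))) @ np.linalg.inv(P)
print(np.allclose(B @ B, A))   # True
```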
