EIGENVALUES AND EIGENVECTORS


Chapter 6

EIGENVALUES AND EIGENVECTORS

6.1 Motivation

We motivate the chapter on eigenvalues by discussing the equation

\[ ax^2 + 2hxy + by^2 = c, \]

where not all of $a, h, b$ are zero. The expression $ax^2 + 2hxy + by^2$ is called a quadratic form in $x$ and $y$, and we have the identity

\[ ax^2 + 2hxy + by^2 = \begin{bmatrix} x & y \end{bmatrix} \begin{bmatrix} a & h \\ h & b \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = X^t A X, \]

where $X = \begin{bmatrix} x \\ y \end{bmatrix}$ and $A = \begin{bmatrix} a & h \\ h & b \end{bmatrix}$. $A$ is called the matrix of the quadratic form.

We now rotate the $x, y$ axes anticlockwise through $\theta$ radians to new $x_1, y_1$ axes. The equations describing the rotation of axes are derived as follows: let $P$ have coordinates $(x, y)$ relative to the $x, y$ axes and coordinates $(x_1, y_1)$ relative to the $x_1, y_1$ axes. Then, referring to Figure 6.1:

[Figure 6.1: Rotating the axes]

\begin{align*}
x = OQ &= OP\cos(\theta + \alpha) \\
 &= OP(\cos\theta\cos\alpha - \sin\theta\sin\alpha) \\
 &= (OP\cos\alpha)\cos\theta - (OP\sin\alpha)\sin\theta \\
 &= OR\cos\theta - PR\sin\theta \\
 &= x_1\cos\theta - y_1\sin\theta.
\end{align*}

Similarly $y = x_1\sin\theta + y_1\cos\theta$.

We can combine these transformation equations into the single matrix equation:

\[ \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x_1 \\ y_1 \end{bmatrix}, \]

or $X = PY$, where $X = \begin{bmatrix} x \\ y \end{bmatrix}$, $Y = \begin{bmatrix} x_1 \\ y_1 \end{bmatrix}$ and $P = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$.

We note that the columns of $P$ give the directions of the positive $x_1$ and $y_1$ axes. Also $P$ is an orthogonal matrix: we have $PP^t = I_2$ and so $P^{-1} = P^t$. The matrix $P$ has the special property that $\det P = 1$.

A matrix of the type $P = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$ is called a rotation matrix. We shall show soon that any real orthogonal matrix with determinant
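The two properties of $P$ just noted (orthogonality and $\det P = 1$) are easy to confirm numerically. A minimal sketch using NumPy, with an arbitrarily chosen angle:

```python
import numpy as np

def rotation(theta):
    """2x2 anticlockwise rotation matrix P = [[cos t, -sin t], [sin t, cos t]]."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

P = rotation(0.7)  # arbitrary angle, in radians

# P is orthogonal: P P^t = I_2, so P^{-1} = P^t
assert np.allclose(P @ P.T, np.eye(2))

# and det P = 1, the defining property of a rotation matrix
assert np.isclose(np.linalg.det(P), 1.0)
```

Any angle works here, since both identities hold for all $\theta$.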

equal to 1 is a rotation matrix. We can also solve for the new coordinates in terms of the old ones:

\[ \begin{bmatrix} x_1 \\ y_1 \end{bmatrix} = Y = P^t X = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}, \]

so $x_1 = x\cos\theta + y\sin\theta$ and $y_1 = -x\sin\theta + y\cos\theta$. Then

\[ X^t A X = (PY)^t A (PY) = Y^t (P^t A P) Y. \]

Now suppose, as we later show, that it is possible to choose an angle $\theta$ so that $P^t A P$ is a diagonal matrix, say $\mathrm{diag}(\lambda_1, \lambda_2)$. Then

\[ X^t A X = \begin{bmatrix} x_1 & y_1 \end{bmatrix} \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix} \begin{bmatrix} x_1 \\ y_1 \end{bmatrix} = \lambda_1 x_1^2 + \lambda_2 y_1^2 \tag{6.1} \]

and relative to the new axes, the equation $ax^2 + 2hxy + by^2 = c$ becomes $\lambda_1 x_1^2 + \lambda_2 y_1^2 = c$, which is quite easy to sketch. This curve is symmetrical about the $x_1$ and $y_1$ axes, with $P_1$ and $P_2$, the respective columns of $P$, giving the directions of the axes of symmetry.

Also it can be verified that $P_1$ and $P_2$ satisfy the equations

\[ AP_1 = \lambda_1 P_1 \quad\text{and}\quad AP_2 = \lambda_2 P_2. \]

These equations force a restriction on $\lambda_1$ and $\lambda_2$. For if $P_1 = \begin{bmatrix} u_1 \\ v_1 \end{bmatrix}$, the first equation becomes

\[ \begin{bmatrix} a & h \\ h & b \end{bmatrix} \begin{bmatrix} u_1 \\ v_1 \end{bmatrix} = \lambda_1 \begin{bmatrix} u_1 \\ v_1 \end{bmatrix} \quad\text{or}\quad \begin{bmatrix} a - \lambda_1 & h \\ h & b - \lambda_1 \end{bmatrix} \begin{bmatrix} u_1 \\ v_1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}. \]

Hence we are dealing with a homogeneous system of two linear equations in two unknowns, having a non-trivial solution $(u_1, v_1)$. Hence

\[ \begin{vmatrix} a - \lambda_1 & h \\ h & b - \lambda_1 \end{vmatrix} = 0. \]

Similarly, $\lambda_2$ satisfies the same equation. In expanded form, $\lambda_1$ and $\lambda_2$ satisfy

\[ \lambda^2 - (a + b)\lambda + ab - h^2 = 0. \]

This equation has real roots

\[ \lambda = \frac{a + b \pm \sqrt{(a + b)^2 - 4(ab - h^2)}}{2} = \frac{a + b \pm \sqrt{(a - b)^2 + 4h^2}}{2}. \tag{6.2} \]

(The roots are distinct if $a \ne b$ or $h \ne 0$. The case $a = b$ and $h = 0$ needs no investigation, as it gives an equation of a circle.)

The equation $\lambda^2 - (a + b)\lambda + ab - h^2 = 0$ is called the eigenvalue equation of the matrix $A$.
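Formula (6.2) can be sanity-checked against a direct eigenvalue computation; the sample coefficients below are arbitrary, chosen only for illustration:

```python
import numpy as np

a, h, b = 3.0, 2.0, 1.0          # arbitrary coefficients of the quadratic form
A = np.array([[a, h], [h, b]])   # the matrix of the quadratic form

# roots of lambda^2 - (a+b) lambda + (ab - h^2) = 0, via formula (6.2)
disc = np.sqrt((a - b) ** 2 + 4 * h ** 2)
roots = sorted([(a + b - disc) / 2, (a + b + disc) / 2])

# they agree with the eigenvalues of A computed directly
assert np.allclose(roots, sorted(np.linalg.eigvalsh(A)))
```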

6.2 Definitions and examples

DEFINITION 6.1 (Eigenvalue, eigenvector) Let $A$ be a complex square matrix. Then if $\lambda$ is a complex number and $X$ a non-zero complex column vector satisfying $AX = \lambda X$, we call $X$ an eigenvector of $A$, while $\lambda$ is called an eigenvalue of $A$. We also say that $X$ is an eigenvector corresponding to the eigenvalue $\lambda$.

So in the above example $P_1$ and $P_2$ are eigenvectors corresponding to $\lambda_1$ and $\lambda_2$, respectively. We shall give an algorithm which starts from the eigenvalues of $A = \begin{bmatrix} a & h \\ h & b \end{bmatrix}$ and constructs a rotation matrix $P$ such that $P^t A P$ is diagonal.

As noted above, if $\lambda$ is an eigenvalue of an $n \times n$ matrix $A$, with corresponding eigenvector $X$, then $(A - \lambda I_n)X = 0$, with $X \ne 0$, so $\det(A - \lambda I_n) = 0$ and there are at most $n$ distinct eigenvalues of $A$. Conversely if $\det(A - \lambda I_n) = 0$, then $(A - \lambda I_n)X = 0$ has a non-trivial solution $X$ and so $\lambda$ is an eigenvalue of $A$ with $X$ a corresponding eigenvector.

DEFINITION 6.2 (Characteristic polynomial, equation) The polynomial $\det(A - \lambda I_n)$ is called the characteristic polynomial of $A$ and is often denoted by $\mathrm{ch}_A(\lambda)$. The equation $\det(A - \lambda I_n) = 0$ is called the characteristic equation of $A$. Hence the eigenvalues of $A$ are the roots of the characteristic polynomial of $A$.

For a $2 \times 2$ matrix $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$, it is easily verified that the characteristic polynomial is $\lambda^2 - (\mathrm{trace}\,A)\lambda + \det A$, where $\mathrm{trace}\,A = a + d$ is the sum of the diagonal elements of $A$.

EXAMPLE 6.1 Find the eigenvalues of $A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$ and find all eigenvectors.

Solution. The characteristic equation of $A$ is $\lambda^2 - 4\lambda + 3 = 0$, or $(\lambda - 1)(\lambda - 3) = 0$. Hence $\lambda = 1$ or $3$. The eigenvector equation $(A - \lambda I_n)X = 0$ reduces to

\[ \begin{bmatrix} 2 - \lambda & 1 \\ 1 & 2 - \lambda \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \]
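For a $2 \times 2$ matrix the characteristic polynomial $\lambda^2 - (\mathrm{trace}\,A)\lambda + \det A$ can be formed and solved mechanically; a short sketch for the matrix of Example 6.1:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

# characteristic polynomial of a 2x2 matrix: lambda^2 - (trace A) lambda + det A
tr, det = np.trace(A), np.linalg.det(A)
coeffs = [1.0, -tr, det]                 # here: lambda^2 - 4 lambda + 3
eigenvalues = sorted(np.roots(coeffs).real)
assert np.allclose(eigenvalues, [1.0, 3.0])
```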

or

\begin{align*}
(2 - \lambda)x + y &= 0 \\
x + (2 - \lambda)y &= 0.
\end{align*}

Taking $\lambda = 1$ gives

\begin{align*}
x + y &= 0 \\
x + y &= 0,
\end{align*}

which has solution $x = -y$, $y$ arbitrary. Consequently the eigenvectors corresponding to $\lambda = 1$ are the vectors $\begin{bmatrix} -y \\ y \end{bmatrix}$, with $y \ne 0$.

Taking $\lambda = 3$ gives

\begin{align*}
-x + y &= 0 \\
x - y &= 0,
\end{align*}

which has solution $x = y$, $y$ arbitrary. Consequently the eigenvectors corresponding to $\lambda = 3$ are the vectors $\begin{bmatrix} y \\ y \end{bmatrix}$, with $y \ne 0$.

Our next result has wide applicability:

THEOREM 6.1 Let $A$ be a $2 \times 2$ matrix having distinct eigenvalues $\lambda_1$ and $\lambda_2$ and corresponding eigenvectors $X_1$ and $X_2$. Let $P$ be the matrix whose columns are $X_1$ and $X_2$, respectively. Then $P$ is non-singular and

\[ P^{-1} A P = \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix}. \]

Proof. Suppose $AX_1 = \lambda_1 X_1$ and $AX_2 = \lambda_2 X_2$. We show that the system of homogeneous equations $xX_1 + yX_2 = 0$ has only the trivial solution. Then by Theorem 5.10 the matrix $P = [X_1 | X_2]$ is non-singular. So assume

\[ xX_1 + yX_2 = 0. \tag{6.3} \]

Then $A(xX_1 + yX_2) = A0 = 0$, so $x(AX_1) + y(AX_2) = 0$. Hence

\[ x\lambda_1 X_1 + y\lambda_2 X_2 = 0. \tag{6.4} \]

Multiplying equation (6.3) by $\lambda_1$ and subtracting from equation (6.4) gives

\[ (\lambda_2 - \lambda_1) y X_2 = 0. \]

Hence $y = 0$, as $\lambda_2 - \lambda_1 \ne 0$ and $X_2 \ne 0$. Then from equation (6.3), $xX_1 = 0$ and hence $x = 0$.

Then the equations $AX_1 = \lambda_1 X_1$ and $AX_2 = \lambda_2 X_2$ give

\[ AP = A[X_1 | X_2] = [AX_1 | AX_2] = [\lambda_1 X_1 | \lambda_2 X_2] = [X_1 | X_2] \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix} = P \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix}, \]

so

\[ P^{-1} A P = \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix}. \]

EXAMPLE 6.2 Let $A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$ be the matrix of Example 6.1. Then $X_1 = \begin{bmatrix} -1 \\ 1 \end{bmatrix}$ and $X_2 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$ are eigenvectors corresponding to eigenvalues $1$ and $3$, respectively. Hence if $P = \begin{bmatrix} -1 & 1 \\ 1 & 1 \end{bmatrix}$, we have

\[ P^{-1} A P = \begin{bmatrix} 1 & 0 \\ 0 & 3 \end{bmatrix}. \]

There are two immediate applications of Theorem 6.1. The first is to the calculation of $A^n$: if $P^{-1} A P = \mathrm{diag}(\lambda_1, \lambda_2)$, then $A = P\,\mathrm{diag}(\lambda_1, \lambda_2)\,P^{-1}$ and

\[ A^n = \left( P \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix} P^{-1} \right)^n = P \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix}^n P^{-1} = P \begin{bmatrix} \lambda_1^n & 0 \\ 0 & \lambda_2^n \end{bmatrix} P^{-1}. \]

The second application is to solving a system of linear differential equations

\begin{align*}
\frac{dx}{dt} &= ax + by \\
\frac{dy}{dt} &= cx + dy,
\end{align*}

where $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is a matrix of real or complex numbers and $x$ and $y$ are functions of $t$. The system can be written in matrix form as $\dot{X} = AX$, where

\[ X = \begin{bmatrix} x \\ y \end{bmatrix} \quad\text{and}\quad \dot{X} = \begin{bmatrix} \dot{x} \\ \dot{y} \end{bmatrix} = \begin{bmatrix} dx/dt \\ dy/dt \end{bmatrix}. \]
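Theorem 6.1 and the $A^n$ formula can both be checked numerically for the matrix of Examples 6.1 and 6.2; a small sketch (the exponent $n = 5$ is arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
X1 = np.array([-1.0, 1.0])   # eigenvector for lambda = 1
X2 = np.array([1.0, 1.0])    # eigenvector for lambda = 3
P = np.column_stack([X1, X2])

# Theorem 6.1: P^{-1} A P is the diagonal matrix of eigenvalues
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag([1.0, 3.0]))

# A^n = P diag(lambda_1^n, lambda_2^n) P^{-1}; check for n = 5
n = 5
An = P @ np.diag([1.0 ** n, 3.0 ** n]) @ np.linalg.inv(P)
assert np.allclose(An, np.linalg.matrix_power(A, n))
```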

We make the substitution $X = PY$, where $Y = \begin{bmatrix} x_1 \\ y_1 \end{bmatrix}$. Then $x_1$ and $y_1$ are also functions of $t$ and

\[ \dot{X} = P\dot{Y} = AX = A(PY), \quad\text{so}\quad \dot{Y} = (P^{-1}AP)Y = \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix} Y. \]

Hence $\dot{x}_1 = \lambda_1 x_1$ and $\dot{y}_1 = \lambda_2 y_1$.

These differential equations are well known to have the solutions $x_1 = x_1(0)e^{\lambda_1 t}$ and $y_1 = y_1(0)e^{\lambda_2 t}$, where $x_1(0)$ is the value of $x_1$ when $t = 0$. For if $\dot{x} = kx$, where $k$ is a constant, then

\[ \frac{d}{dt}\left( e^{-kt} x \right) = -ke^{-kt}x + e^{-kt}\dot{x} = -ke^{-kt}x + e^{-kt}kx = 0. \]

Hence $e^{-kt}x$ is constant, so $e^{-kt}x = e^{-k0}x(0) = x(0)$. Hence $x = x(0)e^{kt}$.

However $\begin{bmatrix} x_1(0) \\ y_1(0) \end{bmatrix} = P^{-1} \begin{bmatrix} x(0) \\ y(0) \end{bmatrix}$, so this determines $x_1(0)$ and $y_1(0)$ in terms of $x(0)$ and $y(0)$. Hence ultimately $x$ and $y$ are determined as explicit functions of $t$, using the equation $X = PY$.

EXAMPLE 6.3 Let $A = \begin{bmatrix} 2 & -3 \\ 4 & -5 \end{bmatrix}$. Use the eigenvalue method to derive an explicit formula for $A^n$ and also solve the system of differential equations

\begin{align*}
\frac{dx}{dt} &= 2x - 3y \\
\frac{dy}{dt} &= 4x - 5y,
\end{align*}

given $x = 7$ and $y = 13$ when $t = 0$.

Solution. The characteristic polynomial of $A$ is $\lambda^2 + 3\lambda + 2$, which has distinct roots $\lambda_1 = -1$ and $\lambda_2 = -2$. We find corresponding eigenvectors $X_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$ and $X_2 = \begin{bmatrix} 3 \\ 4 \end{bmatrix}$. Hence if $P = \begin{bmatrix} 1 & 3 \\ 1 & 4 \end{bmatrix}$, we have $P^{-1}AP = \mathrm{diag}(-1, -2)$. Hence

\[ A^n = \left( P\,\mathrm{diag}(-1, -2)\,P^{-1} \right)^n = P\,\mathrm{diag}\left((-1)^n, (-2)^n\right)P^{-1} \]

\begin{align*}
&= (-1)^n \begin{bmatrix} 1 & 3 \\ 1 & 4 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & 2^n \end{bmatrix} \begin{bmatrix} 4 & -3 \\ -1 & 1 \end{bmatrix}
= (-1)^n \begin{bmatrix} 1 & 3 \cdot 2^n \\ 1 & 4 \cdot 2^n \end{bmatrix} \begin{bmatrix} 4 & -3 \\ -1 & 1 \end{bmatrix} \\
&= (-1)^n \begin{bmatrix} 4 - 3 \cdot 2^n & -3 + 3 \cdot 2^n \\ 4 - 4 \cdot 2^n & -3 + 4 \cdot 2^n \end{bmatrix}.
\end{align*}

To solve the differential equation system, make the substitution $X = PY$. Then $x = x_1 + 3y_1$, $y = x_1 + 4y_1$. The system then becomes $\dot{x}_1 = -x_1$, $\dot{y}_1 = -2y_1$, so $x_1 = x_1(0)e^{-t}$, $y_1 = y_1(0)e^{-2t}$. Now

\[ \begin{bmatrix} x_1(0) \\ y_1(0) \end{bmatrix} = P^{-1} \begin{bmatrix} x(0) \\ y(0) \end{bmatrix} = \begin{bmatrix} 4 & -3 \\ -1 & 1 \end{bmatrix} \begin{bmatrix} 7 \\ 13 \end{bmatrix} = \begin{bmatrix} -11 \\ 6 \end{bmatrix}, \]

so $x_1 = -11e^{-t}$ and $y_1 = 6e^{-2t}$. Hence

\begin{align*}
x &= -11e^{-t} + 3(6e^{-2t}) = -11e^{-t} + 18e^{-2t}, \\
y &= -11e^{-t} + 4(6e^{-2t}) = -11e^{-t} + 24e^{-2t}.
\end{align*}

For a more complicated example we solve a system of inhomogeneous recurrence relations.

EXAMPLE 6.4 Solve the system of recurrence relations

\begin{align*}
x_{n+1} &= 2x_n - y_n - 1 \\
y_{n+1} &= -x_n + 2y_n + 2,
\end{align*}

given that $x_0 = 0$ and $y_0 = -1$.

Solution. The system can be written in matrix form as

\[ X_{n+1} = AX_n + B, \]

where $A = \begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix}$ and $B = \begin{bmatrix} -1 \\ 2 \end{bmatrix}$. It is then an easy induction to prove that

\[ X_n = A^n X_0 + (A^{n-1} + \cdots + A + I_2)B. \tag{6.5} \]
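Both results of Example 6.3, the closed form for $A^n$ and the solution of the differential system, can be verified numerically; a sketch (the test value $t = 0.3$ is arbitrary):

```python
import numpy as np

A = np.array([[2.0, -3.0], [4.0, -5.0]])

def A_power(n):
    """Closed form derived in Example 6.3."""
    return (-1.0) ** n * np.array([[4 - 3 * 2 ** n, -3 + 3 * 2 ** n],
                                   [4 - 4 * 2 ** n, -3 + 4 * 2 ** n]])

for n in range(6):
    assert np.allclose(A_power(n), np.linalg.matrix_power(A, n))

# the solution x = -11 e^{-t} + 18 e^{-2t}, y = -11 e^{-t} + 24 e^{-2t}
# satisfies X' = AX, and x(0) = 7, y(0) = 13
t = 0.3
X = np.array([-11 * np.exp(-t) + 18 * np.exp(-2 * t),
              -11 * np.exp(-t) + 24 * np.exp(-2 * t)])
Xdot = np.array([11 * np.exp(-t) - 36 * np.exp(-2 * t),
                 11 * np.exp(-t) - 48 * np.exp(-2 * t)])
assert np.allclose(Xdot, A @ X)
```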

Also it is easy to verify by the eigenvalue method that

\[ A^n = \frac{1}{2} \begin{bmatrix} 1 + 3^n & 1 - 3^n \\ 1 - 3^n & 1 + 3^n \end{bmatrix} = \frac{1}{2}U + \frac{3^n}{2}V, \]

where $U = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}$ and $V = \begin{bmatrix} 1 & -1 \\ -1 & 1 \end{bmatrix}$. Hence

\[ A^{n-1} + \cdots + A + I_2 = \frac{n}{2}U + \frac{1 + 3 + \cdots + 3^{n-1}}{2}V = \frac{n}{2}U + \frac{3^n - 1}{4}V. \]

Then equation (6.5) gives

\[ X_n = \left( \frac{1}{2}U + \frac{3^n}{2}V \right) \begin{bmatrix} 0 \\ -1 \end{bmatrix} + \left( \frac{n}{2}U + \frac{3^n - 1}{4}V \right) \begin{bmatrix} -1 \\ 2 \end{bmatrix}, \]

which simplifies to

\[ \begin{bmatrix} x_n \\ y_n \end{bmatrix} = \begin{bmatrix} (2n + 1 - 3^n)/4 \\ (2n - 5 + 3^n)/4 \end{bmatrix}. \]

Hence $x_n = (2n + 1 - 3^n)/4$ and $y_n = (2n - 5 + 3^n)/4$.

REMARK 6.1 If $(A - I_2)^{-1}$ existed (that is, if $\det(A - I_2) \ne 0$, or equivalently, if $1$ is not an eigenvalue of $A$), then we could have used the formula

\[ A^{n-1} + \cdots + A + I_2 = (A^n - I_2)(A - I_2)^{-1}. \tag{6.6} \]

However the eigenvalues of $A$ are $1$ and $3$ in the above problem, so formula (6.6) cannot be used there.

Our discussion of eigenvalues and eigenvectors has been limited to $2 \times 2$ matrices. The discussion is more complicated for matrices of size greater than two and is best left to a second course in linear algebra. Nevertheless the following result is a useful generalization of Theorem 6.1. The reader is referred to [8, page 350] for a proof.

THEOREM 6.2 Let $A$ be an $n \times n$ matrix having distinct eigenvalues $\lambda_1, \ldots, \lambda_n$ and corresponding eigenvectors $X_1, \ldots, X_n$. Let $P$ be the matrix whose columns are respectively $X_1, \ldots, X_n$. Then $P$ is non-singular and

\[ P^{-1} A P = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix}. \]
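The closed form of Example 6.4 can be checked by iterating the recurrence directly, in exact integer arithmetic:

```python
# Iterate x_{n+1} = 2x_n - y_n - 1, y_{n+1} = -x_n + 2y_n + 2
# from x_0 = 0, y_0 = -1, and compare with the closed form.
x, y = 0, -1
for n in range(1, 11):
    x, y = 2 * x - y - 1, -x + 2 * y + 2
    assert x == (2 * n + 1 - 3 ** n) // 4   # numerator is always divisible by 4
    assert y == (2 * n - 5 + 3 ** n) // 4
```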

Another useful result, which covers the case where there are multiple eigenvalues, is the following. (The reader is referred to [8] for a proof.)

THEOREM 6.3 Suppose the characteristic polynomial of $A$ has the factorization

\[ \det(\lambda I_n - A) = (\lambda - c_1)^{n_1} \cdots (\lambda - c_t)^{n_t}, \]

where $c_1, \ldots, c_t$ are the distinct eigenvalues of $A$. Suppose that for $i = 1, \ldots, t$, we have $\mathrm{nullity}(c_i I_n - A) = n_i$. For each such $i$, choose a basis $X_{i1}, \ldots, X_{in_i}$ for the eigenspace $N(c_i I_n - A)$. Then the matrix

\[ P = [X_{11} | \cdots | X_{1n_1} | \cdots | X_{t1} | \cdots | X_{tn_t}] \]

is non-singular and $P^{-1}AP$ is the following diagonal matrix:

\[ P^{-1} A P = \begin{bmatrix} c_1 I_{n_1} & & & \\ & c_2 I_{n_2} & & \\ & & \ddots & \\ & & & c_t I_{n_t} \end{bmatrix}. \]

(The notation means that on the diagonal there are $n_1$ elements $c_1$, followed by $n_2$ elements $c_2, \ldots,$ $n_t$ elements $c_t$.)

6.3 PROBLEMS

1. Let $A = \begin{bmatrix} \cdot & \cdot \\ \cdot & \cdot \end{bmatrix}$ (the entries are illegible here). Find an invertible matrix $P$ such that $P^{-1}AP = \mathrm{diag}(1, 3)$ and hence prove that

\[ A^n = \frac{3^n - 1}{2}A + \frac{3 - 3^n}{2}I_2. \]

2. If $A = \begin{bmatrix} \cdot & \cdot \\ \cdot & \cdot \end{bmatrix}$ (the entries are illegible here), prove that $A^n$ tends to a limiting matrix

\[ \begin{bmatrix} \cdot/3 & \cdot/3 \\ 1/3 & 1/3 \end{bmatrix} \]

as $n \to \infty$.
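Theorem 6.3 can be illustrated numerically. The matrix below is not from the text; it is chosen for illustration because its eigenvalue $2$ has multiplicity $2$ and $\mathrm{nullity}(2I_3 - A) = 2$, so the theorem applies:

```python
import numpy as np

A = np.array([[3.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 1.0, 3.0]])   # eigenvalues 2, 2, 5

# A - 2I has rank 1, so nullity(2I - A) = 2 = multiplicity of the eigenvalue 2.
# Take a basis of each eigenspace as the columns of P:
X11 = np.array([1.0, -1.0, 0.0])  # basis of N(2 I_3 - A)
X12 = np.array([1.0, 0.0, -1.0])
X21 = np.array([1.0, 1.0, 1.0])   # basis of N(5 I_3 - A)
P = np.column_stack([X11, X12, X21])

D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag([2.0, 2.0, 5.0]))
```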

3. Solve the system of differential equations

\begin{align*}
\frac{dx}{dt} &= 3x - 2y \\
\frac{dy}{dt} &= 5x - 4y,
\end{align*}

given $x = 13$ and $y = 22$ when $t = 0$.

[Answer: $x = 7e^t + 6e^{-2t}$, $y = 7e^t + 15e^{-2t}$.]

4. Solve the system of recurrence relations

\begin{align*}
x_{n+1} &= 3x_n - y_n \\
y_{n+1} &= -x_n + 3y_n,
\end{align*}

given that $x_0 = 1$ and $y_0 = 2$.

[Answer: $x_n = 2^{n-1}(3 - 2^n)$, $y_n = 2^{n-1}(3 + 2^n)$.]

5. Let $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$ be a real or complex matrix with distinct eigenvalues $\lambda_1, \lambda_2$ and corresponding eigenvectors $X_1, X_2$. Also let $P = [X_1 | X_2]$.

(a) Prove that the system of recurrence relations

\begin{align*}
x_{n+1} &= ax_n + by_n \\
y_{n+1} &= cx_n + dy_n
\end{align*}

has the solution

\[ \begin{bmatrix} x_n \\ y_n \end{bmatrix} = \alpha\lambda_1^n X_1 + \beta\lambda_2^n X_2, \]

where $\alpha$ and $\beta$ are determined by the equation

\[ \begin{bmatrix} \alpha \\ \beta \end{bmatrix} = P^{-1} \begin{bmatrix} x_0 \\ y_0 \end{bmatrix}. \]

(b) Prove that the system of differential equations

\begin{align*}
\frac{dx}{dt} &= ax + by \\
\frac{dy}{dt} &= cx + dy
\end{align*}

has the solution

\[ \begin{bmatrix} x \\ y \end{bmatrix} = \alpha e^{\lambda_1 t} X_1 + \beta e^{\lambda_2 t} X_2, \]

where $\alpha$ and $\beta$ are determined by the equation

\[ \begin{bmatrix} \alpha \\ \beta \end{bmatrix} = P^{-1} \begin{bmatrix} x(0) \\ y(0) \end{bmatrix}. \]

6. Let $A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}$ be a real matrix with non-real eigenvalues $\lambda = a + ib$ and $\overline{\lambda} = a - ib$, with corresponding eigenvectors $X = U + iV$ and $\overline{X} = U - iV$, where $U$ and $V$ are real vectors. Also let $P$ be the real matrix defined by $P = [U | V]$. Finally let $a + ib = re^{i\theta}$, where $r > 0$ and $\theta$ is real.

(a) Prove that

\begin{align*}
AU &= aU - bV \\
AV &= bU + aV.
\end{align*}

(b) Deduce that

\[ P^{-1} A P = \begin{bmatrix} a & b \\ -b & a \end{bmatrix}. \]

(c) Prove that the system of recurrence relations

\begin{align*}
x_{n+1} &= a_{11}x_n + a_{12}y_n \\
y_{n+1} &= a_{21}x_n + a_{22}y_n
\end{align*}

has the solution

\[ \begin{bmatrix} x_n \\ y_n \end{bmatrix} = r^n \left\{ (\alpha U + \beta V)\cos n\theta + (\beta U - \alpha V)\sin n\theta \right\}, \]

where $\alpha$ and $\beta$ are determined by the equation

\[ \begin{bmatrix} \alpha \\ \beta \end{bmatrix} = P^{-1} \begin{bmatrix} x_0 \\ y_0 \end{bmatrix}. \]

(d) Prove that the system of differential equations

\begin{align*}
\frac{dx}{dt} &= a_{11}x + a_{12}y \\
\frac{dy}{dt} &= a_{21}x + a_{22}y
\end{align*}
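Parts (a) and (b) of Problem 6 can be checked numerically. The matrix below is not from the text; it is an arbitrary real matrix chosen for illustration, with non-real eigenvalues $1 \pm 2i$:

```python
import numpy as np

A = np.array([[1.0, -2.0], [2.0, 1.0]])   # eigenvalues 1 +/- 2i

lam, vecs = np.linalg.eig(A)
i = int(np.argmax(lam.imag))              # pick lambda = a + ib with b > 0
a, b = lam[i].real, lam[i].imag
X = vecs[:, i]
U, V = X.real, X.imag                     # X = U + iV
P = np.column_stack([U, V])

# Problem 6(b): P^{-1} A P = [[a, b], [-b, a]]
C = np.linalg.inv(P) @ A @ P
assert np.allclose(C, np.array([[a, b], [-b, a]]))
```

The identity in (b) holds for any choice of eigenvector $X$, since (a) follows from $AX = \lambda X$ regardless of scaling.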

has the solution

\[ \begin{bmatrix} x \\ y \end{bmatrix} = e^{at} \left\{ (\alpha U + \beta V)\cos bt + (\beta U - \alpha V)\sin bt \right\}, \]

where $\alpha$ and $\beta$ are determined by the equation

\[ \begin{bmatrix} \alpha \\ \beta \end{bmatrix} = P^{-1} \begin{bmatrix} x(0) \\ y(0) \end{bmatrix}. \]

[Hint: Let $\begin{bmatrix} x \\ y \end{bmatrix} = P \begin{bmatrix} x_1 \\ y_1 \end{bmatrix}$. Also let $z = x_1 + iy_1$. Prove that

\[ \dot{z} = (a - ib)z \]

and deduce that

\[ x_1 + iy_1 = e^{at}(\alpha + i\beta)(\cos bt + i\sin bt). \]

Then equate real and imaginary parts to solve for $x_1$, $y_1$ and hence $x$, $y$.]

7. (The case of repeated eigenvalues.) Let $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$ and suppose that the characteristic polynomial of $A$, $\lambda^2 - (a + d)\lambda + (ad - bc)$, has a repeated root $\alpha$. Also assume that $A \ne \alpha I_2$. Let $B = A - \alpha I_2$.

(i) Prove that $(a - d)^2 + 4bc = 0$.

(ii) Prove that $B^2 = 0$.

(iii) Prove that $BX_2 \ne 0$ for some vector $X_2$; indeed, show that $X_2$ can be taken to be $\begin{bmatrix} 1 \\ 0 \end{bmatrix}$ or $\begin{bmatrix} 0 \\ 1 \end{bmatrix}$.

(iv) Let $X_1 = BX_2$. Prove that $P = [X_1 | X_2]$ is non-singular, $AX_1 = \alpha X_1$ and $AX_2 = \alpha X_2 + X_1$, and deduce that

\[ P^{-1} A P = \begin{bmatrix} \alpha & 1 \\ 0 & \alpha \end{bmatrix}. \]

8. Use the previous result to solve the system of differential equations

\begin{align*}
\frac{dx}{dt} &= 4x - y \\
\frac{dy}{dt} &= 4x + 8y,
\end{align*}
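The construction of Problem 7 can be checked on the matrix of Problem 8, whose eigenvalue $\alpha = 6$ is repeated:

```python
import numpy as np

A = np.array([[4.0, -1.0], [4.0, 8.0]])   # characteristic polynomial (lambda - 6)^2
alpha = 6.0
B = A - alpha * np.eye(2)

assert np.allclose(B @ B, np.zeros((2, 2)))   # (ii): B^2 = 0

X2 = np.array([1.0, 0.0])
X1 = B @ X2                                   # (iv): X1 = B X2 is non-zero
assert not np.allclose(X1, 0)

P = np.column_stack([X1, X2])
J = np.linalg.inv(P) @ A @ P                  # the matrix [[alpha, 1], [0, alpha]]
assert np.allclose(J, np.array([[alpha, 1.0], [0.0, alpha]]))
```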

given that $x = 1 = y$ when $t = 0$.

[To solve the differential equation $\dot{x} - kx = f(t)$, $k$ a constant, multiply throughout by $e^{-kt}$, thereby converting the left-hand side to $\frac{d}{dt}(e^{-kt}x)$.]

[Answer: $x = (1 - 3t)e^{6t}$, $y = (1 + 6t)e^{6t}$.]

9. Let

\[ A = \begin{bmatrix} 1/2 & 1/2 & 0 \\ 1/4 & 1/4 & 1/2 \\ 1/4 & 1/4 & 1/2 \end{bmatrix}. \]

(a) Verify that $\det(\lambda I_3 - A)$, the characteristic polynomial of $A$, is given by $(\lambda - 1)\lambda(\lambda - \tfrac{1}{4})$.

(b) Find a non-singular matrix $P$ such that $P^{-1}AP = \mathrm{diag}(1, 0, \tfrac{1}{4})$.

(c) Prove that if $n \ge 1$,

\[ A^n = \frac{1}{3}\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix} + \frac{1}{3}\left(\frac{1}{4}\right)^n \begin{bmatrix} 2 & 2 & -4 \\ -1 & -1 & 2 \\ -1 & -1 & 2 \end{bmatrix}. \]

10. Let $A = \begin{bmatrix} \cdot & \cdot & \cdot \\ \cdot & \cdot & \cdot \\ \cdot & \cdot & \cdot \end{bmatrix}$ (the entries are illegible here).

(a) Verify that $\det(\lambda I_3 - A)$, the characteristic polynomial of $A$, is given by $(\lambda - 3)^2(\lambda - 9)$.

(b) Find a non-singular matrix $P$ such that $P^{-1}AP = \mathrm{diag}(3, 3, 9)$.
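Parts (a) and (b) of Problem 9 can be checked numerically; a sketch using NumPy's `eig`, whose eigenvector matrix serves as one possible choice of $P$:

```python
import numpy as np

A = np.array([[1/2, 1/2, 0],
              [1/4, 1/4, 1/2],
              [1/4, 1/4, 1/2]])

# (a) the roots of (lambda - 1) lambda (lambda - 1/4) are the eigenvalues of A
w, vecs = np.linalg.eig(A)
assert np.allclose(sorted(w.real), [0.0, 0.25, 1.0])
assert np.allclose(w.imag, 0.0)

# (b) the eigenvector matrix returned by eig is one valid choice of P
D = np.linalg.inv(vecs) @ A @ vecs
assert np.allclose(D, np.diag(w))
```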


More information

7.4. The Inverse of a Matrix. Introduction. Prerequisites. Learning Style. Learning Outcomes

7.4. The Inverse of a Matrix. Introduction. Prerequisites. Learning Style. Learning Outcomes The Inverse of a Matrix 7.4 Introduction In number arithmetic every number a 0 has a reciprocal b written as a or such that a ba = ab =. Similarly a square matrix A may have an inverse B = A where AB =

More information

[1] Diagonal factorization

[1] Diagonal factorization 8.03 LA.6: Diagonalization and Orthogonal Matrices [ Diagonal factorization [2 Solving systems of first order differential equations [3 Symmetric and Orthonormal Matrices [ Diagonal factorization Recall:

More information

Linear Algebra Done Wrong. Sergei Treil. Department of Mathematics, Brown University

Linear Algebra Done Wrong. Sergei Treil. Department of Mathematics, Brown University Linear Algebra Done Wrong Sergei Treil Department of Mathematics, Brown University Copyright c Sergei Treil, 2004, 2009, 2011, 2014 Preface The title of the book sounds a bit mysterious. Why should anyone

More information

MATRIX ALGEBRA AND SYSTEMS OF EQUATIONS. + + x 2. x n. a 11 a 12 a 1n b 1 a 21 a 22 a 2n b 2 a 31 a 32 a 3n b 3. a m1 a m2 a mn b m

MATRIX ALGEBRA AND SYSTEMS OF EQUATIONS. + + x 2. x n. a 11 a 12 a 1n b 1 a 21 a 22 a 2n b 2 a 31 a 32 a 3n b 3. a m1 a m2 a mn b m MATRIX ALGEBRA AND SYSTEMS OF EQUATIONS 1. SYSTEMS OF EQUATIONS AND MATRICES 1.1. Representation of a linear system. The general system of m equations in n unknowns can be written a 11 x 1 + a 12 x 2 +

More information

Linear algebra and the geometry of quadratic equations. Similarity transformations and orthogonal matrices

Linear algebra and the geometry of quadratic equations. Similarity transformations and orthogonal matrices MATH 30 Differential Equations Spring 006 Linear algebra and the geometry of quadratic equations Similarity transformations and orthogonal matrices First, some things to recall from linear algebra Two

More information

SF2940: Probability theory Lecture 8: Multivariate Normal Distribution

SF2940: Probability theory Lecture 8: Multivariate Normal Distribution SF2940: Probability theory Lecture 8: Multivariate Normal Distribution Timo Koski 24.09.2015 Timo Koski Matematisk statistik 24.09.2015 1 / 1 Learning outcomes Random vectors, mean vector, covariance matrix,

More information

Matrix Algebra. Some Basic Matrix Laws. Before reading the text or the following notes glance at the following list of basic matrix algebra laws.

Matrix Algebra. Some Basic Matrix Laws. Before reading the text or the following notes glance at the following list of basic matrix algebra laws. Matrix Algebra A. Doerr Before reading the text or the following notes glance at the following list of basic matrix algebra laws. Some Basic Matrix Laws Assume the orders of the matrices are such that

More information

MATRIX ALGEBRA AND SYSTEMS OF EQUATIONS

MATRIX ALGEBRA AND SYSTEMS OF EQUATIONS MATRIX ALGEBRA AND SYSTEMS OF EQUATIONS Systems of Equations and Matrices Representation of a linear system The general system of m equations in n unknowns can be written a x + a 2 x 2 + + a n x n b a

More information

= 2 + 1 2 2 = 3 4, Now assume that P (k) is true for some fixed k 2. This means that

= 2 + 1 2 2 = 3 4, Now assume that P (k) is true for some fixed k 2. This means that Instructions. Answer each of the questions on your own paper, and be sure to show your work so that partial credit can be adequately assessed. Credit will not be given for answers (even correct ones) without

More information

1. a. standard form of a parabola with. 2 b 1 2 horizontal axis of symmetry 2. x 2 y 2 r 2 o. standard form of an ellipse centered

1. a. standard form of a parabola with. 2 b 1 2 horizontal axis of symmetry 2. x 2 y 2 r 2 o. standard form of an ellipse centered Conic Sections. Distance Formula and Circles. More on the Parabola. The Ellipse and Hperbola. Nonlinear Sstems of Equations in Two Variables. Nonlinear Inequalities and Sstems of Inequalities In Chapter,

More information

Numerical Analysis Lecture Notes

Numerical Analysis Lecture Notes Numerical Analysis Lecture Notes Peter J. Olver 6. Eigenvalues and Singular Values In this section, we collect together the basic facts about eigenvalues and eigenvectors. From a geometrical viewpoint,

More information

The Singular Value Decomposition in Symmetric (Löwdin) Orthogonalization and Data Compression

The Singular Value Decomposition in Symmetric (Löwdin) Orthogonalization and Data Compression The Singular Value Decomposition in Symmetric (Löwdin) Orthogonalization and Data Compression The SVD is the most generally applicable of the orthogonal-diagonal-orthogonal type matrix decompositions Every

More information

Linear Algebra Notes for Marsden and Tromba Vector Calculus

Linear Algebra Notes for Marsden and Tromba Vector Calculus Linear Algebra Notes for Marsden and Tromba Vector Calculus n-dimensional Euclidean Space and Matrices Definition of n space As was learned in Math b, a point in Euclidean three space can be thought of

More information

Torgerson s Classical MDS derivation: 1: Determining Coordinates from Euclidean Distances

Torgerson s Classical MDS derivation: 1: Determining Coordinates from Euclidean Distances Torgerson s Classical MDS derivation: 1: Determining Coordinates from Euclidean Distances It is possible to construct a matrix X of Cartesian coordinates of points in Euclidean space when we know the Euclidean

More information

Click here for answers.

Click here for answers. CHALLENGE PROBLEMS: CHALLENGE PROBLEMS 1 CHAPTER A Click here for answers S Click here for solutions A 1 Find points P and Q on the parabola 1 so that the triangle ABC formed b the -ais and the tangent

More information

Equations, Inequalities & Partial Fractions

Equations, Inequalities & Partial Fractions Contents Equations, Inequalities & Partial Fractions.1 Solving Linear Equations 2.2 Solving Quadratic Equations 1. Solving Polynomial Equations 1.4 Solving Simultaneous Linear Equations 42.5 Solving Inequalities

More information

Lecture 5: Singular Value Decomposition SVD (1)

Lecture 5: Singular Value Decomposition SVD (1) EEM3L1: Numerical and Analytical Techniques Lecture 5: Singular Value Decomposition SVD (1) EE3L1, slide 1, Version 4: 25-Sep-02 Motivation for SVD (1) SVD = Singular Value Decomposition Consider the system

More information

Examination paper for TMA4115 Matematikk 3

Examination paper for TMA4115 Matematikk 3 Department of Mathematical Sciences Examination paper for TMA45 Matematikk 3 Academic contact during examination: Antoine Julien a, Alexander Schmeding b, Gereon Quick c Phone: a 73 59 77 82, b 40 53 99

More information

6 EXTENDING ALGEBRA. 6.0 Introduction. 6.1 The cubic equation. Objectives

6 EXTENDING ALGEBRA. 6.0 Introduction. 6.1 The cubic equation. Objectives 6 EXTENDING ALGEBRA Chapter 6 Extending Algebra Objectives After studying this chapter you should understand techniques whereby equations of cubic degree and higher can be solved; be able to factorise

More information

Matrix Calculations: Applications of Eigenvalues and Eigenvectors; Inner Products

Matrix Calculations: Applications of Eigenvalues and Eigenvectors; Inner Products Matrix Calculations: Applications of Eigenvalues and Eigenvectors; Inner Products H. Geuvers Institute for Computing and Information Sciences Intelligent Systems Version: spring 2015 H. Geuvers Version:

More information

PUTNAM TRAINING POLYNOMIALS. Exercises 1. Find a polynomial with integral coefficients whose zeros include 2 + 5.

PUTNAM TRAINING POLYNOMIALS. Exercises 1. Find a polynomial with integral coefficients whose zeros include 2 + 5. PUTNAM TRAINING POLYNOMIALS (Last updated: November 17, 2015) Remark. This is a list of exercises on polynomials. Miguel A. Lerma Exercises 1. Find a polynomial with integral coefficients whose zeros include

More information

Mathematics Course 111: Algebra I Part IV: Vector Spaces

Mathematics Course 111: Algebra I Part IV: Vector Spaces Mathematics Course 111: Algebra I Part IV: Vector Spaces D. R. Wilkins Academic Year 1996-7 9 Vector Spaces A vector space over some field K is an algebraic structure consisting of a set V on which are

More information

13 MATH FACTS 101. 2 a = 1. 7. The elements of a vector have a graphical interpretation, which is particularly easy to see in two or three dimensions.

13 MATH FACTS 101. 2 a = 1. 7. The elements of a vector have a graphical interpretation, which is particularly easy to see in two or three dimensions. 3 MATH FACTS 0 3 MATH FACTS 3. Vectors 3.. Definition We use the overhead arrow to denote a column vector, i.e., a linear segment with a direction. For example, in three-space, we write a vector in terms

More information

BX in ( u, v) basis in two ways. On the one hand, AN = u+

BX in ( u, v) basis in two ways. On the one hand, AN = u+ 1. Let f(x) = 1 x +1. Find f (6) () (the value of the sixth derivative of the function f(x) at zero). Answer: 7. We expand the given function into a Taylor series at the point x = : f(x) = 1 x + x 4 x

More information

CURVE FITTING LEAST SQUARES APPROXIMATION

CURVE FITTING LEAST SQUARES APPROXIMATION CURVE FITTING LEAST SQUARES APPROXIMATION Data analysis and curve fitting: Imagine that we are studying a physical system involving two quantities: x and y Also suppose that we expect a linear relationship

More information

a 11 x 1 + a 12 x 2 + + a 1n x n = b 1 a 21 x 1 + a 22 x 2 + + a 2n x n = b 2.

a 11 x 1 + a 12 x 2 + + a 1n x n = b 1 a 21 x 1 + a 22 x 2 + + a 2n x n = b 2. Chapter 1 LINEAR EQUATIONS 1.1 Introduction to linear equations A linear equation in n unknowns x 1, x,, x n is an equation of the form a 1 x 1 + a x + + a n x n = b, where a 1, a,..., a n, b are given

More information

88 CHAPTER 2. VECTOR FUNCTIONS. . First, we need to compute T (s). a By definition, r (s) T (s) = 1 a sin s a. sin s a, cos s a

88 CHAPTER 2. VECTOR FUNCTIONS. . First, we need to compute T (s). a By definition, r (s) T (s) = 1 a sin s a. sin s a, cos s a 88 CHAPTER. VECTOR FUNCTIONS.4 Curvature.4.1 Definitions and Examples The notion of curvature measures how sharply a curve bends. We would expect the curvature to be 0 for a straight line, to be very small

More information

MAT 200, Midterm Exam Solution. a. (5 points) Compute the determinant of the matrix A =

MAT 200, Midterm Exam Solution. a. (5 points) Compute the determinant of the matrix A = MAT 200, Midterm Exam Solution. (0 points total) a. (5 points) Compute the determinant of the matrix 2 2 0 A = 0 3 0 3 0 Answer: det A = 3. The most efficient way is to develop the determinant along the

More information

Lecture 14: Section 3.3

Lecture 14: Section 3.3 Lecture 14: Section 3.3 Shuanglin Shao October 23, 2013 Definition. Two nonzero vectors u and v in R n are said to be orthogonal (or perpendicular) if u v = 0. We will also agree that the zero vector in

More information

Lecture 5 Rational functions and partial fraction expansion

Lecture 5 Rational functions and partial fraction expansion S. Boyd EE102 Lecture 5 Rational functions and partial fraction expansion (review of) polynomials rational functions pole-zero plots partial fraction expansion repeated poles nonproper rational functions

More information

Inner products on R n, and more

Inner products on R n, and more Inner products on R n, and more Peyam Ryan Tabrizian Friday, April 12th, 2013 1 Introduction You might be wondering: Are there inner products on R n that are not the usual dot product x y = x 1 y 1 + +

More information

Section 8.8. 1. The given line has equations. x = 3 + t(13 3) = 3 + 10t, y = 2 + t(3 + 2) = 2 + 5t, z = 7 + t( 8 7) = 7 15t.

Section 8.8. 1. The given line has equations. x = 3 + t(13 3) = 3 + 10t, y = 2 + t(3 + 2) = 2 + 5t, z = 7 + t( 8 7) = 7 15t. . The given line has equations Section 8.8 x + t( ) + 0t, y + t( + ) + t, z 7 + t( 8 7) 7 t. The line meets the plane y 0 in the point (x, 0, z), where 0 + t, or t /. The corresponding values for x and

More information

Plane Stress Transformations

Plane Stress Transformations 6 Plane Stress Transformations ASEN 311 - Structures ASEN 311 Lecture 6 Slide 1 Plane Stress State ASEN 311 - Structures Recall that in a bod in plane stress, the general 3D stress state with 9 components

More information

The Method of Partial Fractions Math 121 Calculus II Spring 2015

The Method of Partial Fractions Math 121 Calculus II Spring 2015 Rational functions. as The Method of Partial Fractions Math 11 Calculus II Spring 015 Recall that a rational function is a quotient of two polynomials such f(x) g(x) = 3x5 + x 3 + 16x x 60. The method

More information

Lecture 1: Schur s Unitary Triangularization Theorem

Lecture 1: Schur s Unitary Triangularization Theorem Lecture 1: Schur s Unitary Triangularization Theorem This lecture introduces the notion of unitary equivalence and presents Schur s theorem and some of its consequences It roughly corresponds to Sections

More information

The Matrix Elements of a 3 3 Orthogonal Matrix Revisited

The Matrix Elements of a 3 3 Orthogonal Matrix Revisited Physics 116A Winter 2011 The Matrix Elements of a 3 3 Orthogonal Matrix Revisited 1. Introduction In a class handout entitled, Three-Dimensional Proper and Improper Rotation Matrices, I provided a derivation

More information

A note on companion matrices

A note on companion matrices Linear Algebra and its Applications 372 (2003) 325 33 www.elsevier.com/locate/laa A note on companion matrices Miroslav Fiedler Academy of Sciences of the Czech Republic Institute of Computer Science Pod

More information

Inner Product Spaces

Inner Product Spaces Math 571 Inner Product Spaces 1. Preliminaries An inner product space is a vector space V along with a function, called an inner product which associates each pair of vectors u, v with a scalar u, v, and

More information

Core Maths C2. Revision Notes

Core Maths C2. Revision Notes Core Maths C Revision Notes November 0 Core Maths C Algebra... Polnomials: +,,,.... Factorising... Long division... Remainder theorem... Factor theorem... 4 Choosing a suitable factor... 5 Cubic equations...

More information