# LECTURE NOTES FOR 416, INNER PRODUCTS AND SPECTRAL THEOREMS



CHARLES REZK

**Real inner product.** Let $V$ be a vector space over $\mathbb{R}$. A (real) **inner product** is a function $\langle -,-\rangle : V \times V \to \mathbb{R}$ such that

- $\langle x, y\rangle = \langle y, x\rangle$ for all $x, y \in V$,
- $\langle c_1 x_1 + c_2 x_2, y\rangle = c_1\langle x_1, y\rangle + c_2\langle x_2, y\rangle$ for all $x_1, x_2, y \in V$ and $c_1, c_2 \in \mathbb{R}$,
- $\langle x, x\rangle \ge 0$, with $\langle x, x\rangle = 0$ iff $x = 0$.

That is, the pairing is symmetric, linear in the first variable (and therefore bilinear, by symmetry), and positive definite.

**Example** (Standard dot product). Given a column vector $v \in \mathbb{R}^{n\times 1}$, let $v^t \in \mathbb{R}^{1\times n}$ be the transpose. Then define $\langle x, y\rangle = y^t x$. Check that this is just the usual dot product, so that $\langle x, y\rangle = \sum x_i y_i$.

**Example.** Let $P \in \mathbb{R}^{n\times n}$ be an invertible matrix. Define $\langle x, y\rangle_P = y^t P^t P x$. This is an inner product, usually different from the dot product. Fact: every inner product on $\mathbb{R}^{n\times 1}$ is of this form for some $P$. (See the discussion of isometries below.)

**Example.** For $f, g$ in a space $P$ of polynomials, define $\langle f, g\rangle = \int_{-1}^{1} f(t)g(t)\,dt$. (Exercise.) This is an inner product. Symmetry and bilinearity are clear. It is also clear that $\langle f, f\rangle = \int_{-1}^{1} f(t)^2\,dt \ge 0$; if the integral is equal to $0$, then since $f(t)^2$ is continuous and nonnegative it must be the case that $f(t)^2 = 0$ for all $-1 \le t \le 1$, and therefore $f = 0$ since it is a polynomial.

Date: May 2,
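The axioms for $\langle x, y\rangle_P = y^t P^t P x$ can be checked numerically. A minimal sketch (NumPy; the particular matrix `P` and the function name `ip_P` are my own choices, not from the notes):

```python
import numpy as np

# An arbitrarily chosen invertible P defining <x, y>_P = y^t P^t P x.
P = np.array([[2.0, 1.0],
              [0.0, 1.0]])

def ip_P(x, y):
    # <x, y>_P = y^t P^t P x; equivalently the dot product of Px with Py.
    return y @ P.T @ P @ x

x = np.array([1.0, -1.0])
y = np.array([3.0, 2.0])

print(ip_P(x, y), ip_P(y, x))    # symmetric: both values agree
print(ip_P(x, x))                # positive for x != 0
print(np.dot(P @ x, P @ y))      # same value as <x, y>_P
```

Rewriting $y^t P^t P x = (Py)^t(Px)$ shows why this is an inner product: it is the standard dot product applied after the invertible change of coordinates $x \mapsto Px$.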

**Complex inner product.** Let $V$ be a vector space over $\mathbb{C}$. A (complex, or Hermitian) **inner product** is a function $\langle -,-\rangle : V \times V \to \mathbb{C}$ satisfying the same axioms, except that the first axiom is replaced by *skew-symmetry*: $\langle x, y\rangle = \overline{\langle y, x\rangle}$. Observe that this implies that although a complex inner product is $\mathbb{C}$-linear in the first variable, it is *conjugate linear* in the second variable: $\langle x, c_1 y_1 + c_2 y_2\rangle = \bar c_1\langle x, y_1\rangle + \bar c_2\langle x, y_2\rangle$. Note also that if we take $x = y$, then skew-symmetry gives $\langle x, x\rangle = \overline{\langle x, x\rangle}$, and so $\langle x, x\rangle \in \mathbb{R}$ for all $x \in V$, so the positive definite axiom still makes sense.

**Example** (standard Hermitian inner product). Given $A \in \mathbb{C}^{m\times n}$, let $A^* = \bar A^t$, the conjugate transpose of $A$, also called the *adjoint* of $A$; if $A = (a_{ij})$, then the $ij$-entry of $A^*$ is $\overline{a_{ji}}$. Define $\langle x, y\rangle = y^* x = \sum x_k \overline{y_k}$.

We can think of the real inner product as a kind of special case of the complex inner product; the formulas for $\mathbb{C}$ also apply in the real case. Below, when I speak of an inner product space, it could be either a real or a complex inner product space.

Over either $\mathbb{R}$ or $\mathbb{C}$, we can define $\|x\| = \sqrt{\langle x, x\rangle}$, called the **length**. In the case of $V = \mathbb{R}^{n\times 1}$, the standard inner product has a geometric interpretation familiar from vector calculus: the inner product knows about lengths and angles. (Note that $\|x\| = \sqrt{\sum x_i^2}$ in this case.) This is added structure, coming from the inner product; vector spaces without an inner product don't know anything about lengths or angles.

**Exercise.** If $V$ is a complex inner product space, then the function $\langle x, y\rangle_{\mathbb{R}} \mathrel{:=} \operatorname{Re}\langle x, y\rangle$ is a real inner product on $V$ viewed as a real vector space. (We will not use this real inner product.)

**Subspaces of inner product spaces.**

**Proposition.** If $V$ has inner product $\langle -,-\rangle$ and $W \subseteq V$ is a subspace, then the restriction of $\langle -,-\rangle$ to $W$ is an inner product on $W$.

*Proof.* (Exercise.) Straightforward using the definitions.

**Orthogonality.** Say that $x, y \in V$ are **orthogonal** if $\langle x, y\rangle = 0$. Given $S \subseteq V$, the **orthogonal complement** of $S$ is
$$S^\perp = \{\, v \in V \mid \langle v, s\rangle = 0 \text{ for all } s \in S \,\}.$$

**Proposition.** $S^\perp$ is a subspace of $V$.

*Proof.* (Exercise.) If $x_1, x_2 \in S^\perp$, then $\langle c_1 x_1 + c_2 x_2, s\rangle = c_1\langle x_1, s\rangle + c_2\langle x_2, s\rangle$, which is $0$ for all $s \in S$ by definition.

A set $S$ is **orthonormal** if for all $x, y \in S$, $\langle x, x\rangle = 1$ and $\langle x, y\rangle = 0$ if $x \ne y$.

**Proposition.** Any orthonormal subset $S \subseteq V$ is linearly independent.

*Proof.* (Exercise.) If $\{x_1, \dots, x_n\}$ forms an orthonormal set, consider $v = c_1 x_1 + \cdots + c_n x_n$ for scalars $c_i$. Compute $\langle v, x_k\rangle = c_k$, since $\langle x_k, x_k\rangle = 1$ and $\langle x_i, x_k\rangle = 0$ if $i \ne k$. Thus $v = 0$ implies $c_k = 0$ for all $k$.

If $u_1, \dots, u_n$ is an orthonormal basis of $V$, then
$$x = \sum_j \langle x, u_j\rangle\, u_j.$$

**Gram–Schmidt.**

**Proposition.** Every finite dimensional inner product space has an orthonormal basis.

The proof is the Gram–Schmidt algorithm, which takes a basis $v_1, \dots, v_n$ and inductively produces an orthonormal basis $u_1, \dots, u_n$. The process is:
$$u_1 = \frac{v_1}{\|v_1\|}, \qquad u_2 = \frac{v_2 - \langle v_2, u_1\rangle u_1}{\|v_2 - \langle v_2, u_1\rangle u_1\|}, \qquad u_3 = \frac{v_3 - \langle v_3, u_1\rangle u_1 - \langle v_3, u_2\rangle u_2}{\|v_3 - \langle v_3, u_1\rangle u_1 - \langle v_3, u_2\rangle u_2\|}, \quad \dots$$

To prove that this works, we first have to show that the construction is well-defined, which amounts to showing that we never divide by zero in the above formulas. We prove this by induction on $k$, by showing (inductively) that $\operatorname{Span}(u_1, \dots, u_{k-1}) = \operatorname{Span}(v_1, \dots, v_{k-1})$. This implies that $v_k \notin \operatorname{Span}(u_1, \dots, u_{k-1})$, and therefore $v_k \ne \sum_{j=1}^{k-1} \langle v_k, u_j\rangle u_j$, so the vector we normalize is nonzero. Now that we know the vectors $u_1, \dots, u_n$ are defined, it is clear that $\|u_k\| = 1$, and it is straightforward to show (again by induction on $k$) that $\langle u_i, u_k\rangle = 0$ for $i < k$, so they form an orthonormal basis.

Later we will need the following:

**Proposition.** If $V$ is a finite dimensional inner product space and $v \ne 0$, then $v^\perp$ is complementary to $\operatorname{Span}(v)$.

*Proof.* Assume $v \ne 0$. Choose a basis $v_1, \dots, v_n$ of $V$ with $v_1 = v$. Use Gram–Schmidt to replace it with an orthonormal basis $u_1, \dots, u_n$, with $u_1 = v/\|v\|$. Verify that $v^\perp \supseteq \operatorname{Span}(u_2, \dots, u_n)$, and therefore for dimension reasons $v^\perp = \operatorname{Span}(u_2, \dots, u_n)$. Thus $V$ is a direct sum of $\operatorname{Span}(v) = \operatorname{Span}(u_1)$ and $v^\perp = \operatorname{Span}(u_2, \dots, u_n)$.

**Exercise.** If $W$ is a subspace of a finite dimensional inner product space $V$, then $(W^\perp)^\perp = W$.

**Isometry.** If $V$ and $W$ are inner product spaces, an **isometry** is an isomorphism $T : V \to W$ of vector spaces such that $\langle Tx, Ty\rangle_W = \langle x, y\rangle_V$.

**Proposition.** For any (real or complex) $n$-dimensional inner product space $V$, there exists an isometry between $V$ and $\mathbb{R}^{n\times 1}$ (if real) or $\mathbb{C}^{n\times 1}$ (if complex) with the standard inner product.

*Proof.* (Exercise.) Choose an orthonormal basis $u_1, \dots, u_n$, and define $T(x_1, \dots, x_n) = \sum x_k u_k$.
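The Gram–Schmidt process above translates directly into code. A sketch for the real case (NumPy; the function name is mine, and the complex case would use $\langle v, u\rangle = u^* v$, i.e. `np.vdot(u, v)`, in place of `np.dot`):

```python
import numpy as np

def gram_schmidt(vs):
    """Turn a list of linearly independent vectors into an orthonormal list."""
    us = []
    for v in vs:
        # Subtract the projections <v, u_j> u_j onto the previously built u_j.
        w = v - sum(np.dot(v, u) * u for u in us)
        # w != 0 because v is not in the span of the earlier vectors.
        us.append(w / np.linalg.norm(w))
    return us

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
us = gram_schmidt(vs)

# Packing the u_k as columns of U, orthonormality reads U^t U = I.
U = np.column_stack(us)
print(np.round(U.T @ U, 10))
```

The final check $U^t U = I$ is the column criterion for orthogonal matrices discussed at the end of these notes.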

**Spectral theorems.** Recall that for $A \in \mathbb{C}^{m\times n}$, we define the adjoint $A^* = \bar A^t \in \mathbb{C}^{n\times m}$. Note that $(AB)^* = B^* A^*$, and that $(A^*)^* = A$.

**Theorem.** Let $A \in \mathbb{C}^{n\times n}$.

1. If $AA^* = A^*A$, there exists an orthonormal basis of eigenvectors of $A$, which is also an orthonormal basis of eigenvectors for $A^*$.
2. If $A^* = A$ ("self-adjoint", or "Hermitian"), then there exists an orthonormal basis of eigenvectors of $A$, and all the eigenvalues are real (i.e., $\lambda_k \in \mathbb{R}$).
3. If $A^* = -A$ ("skew-adjoint", or "skew-Hermitian"), then there exists an orthonormal basis of eigenvectors of $A$, and all the eigenvalues are imaginary (i.e., $\lambda_k \in i\mathbb{R}$).
4. If $A^* = A^{-1}$ ("unitary"), then there exists an orthonormal basis of eigenvectors of $A$, and all eigenvalues have norm one (i.e., $|\lambda_k|^2 = \lambda_k \bar\lambda_k = 1$).

Statements (2)–(4) are special cases of (1), since in each case $A$ commutes with $A^*$. To derive the statements about eigenvalues from (1), note in general that since $(A^*w)^* = w^* (A^*)^* = w^* A$, we have
$$\langle Av, w\rangle = w^* A v = (A^* w)^* v = \langle v, A^* w\rangle.$$
Suppose $v$ is a common eigenvector of $A$ and $A^*$, e.g., $Av = \lambda v$ and $A^* v = \mu v$, with $v \ne 0$. Then $\langle Av, v\rangle = \langle \lambda v, v\rangle = \lambda\langle v, v\rangle$ is equal to $\langle v, A^* v\rangle = \langle v, \mu v\rangle = \bar\mu\langle v, v\rangle$, and so $\lambda = \bar\mu$. In case (2), this gives $\lambda = \bar\lambda$. In case (3), this gives $\lambda = -\bar\lambda$. In case (4), this gives $\lambda^{-1} = \bar\lambda$.

**Theorem.** Let $A \in \mathbb{R}^{n\times n}$.

2. If $A^t = A$ ("self-adjoint" or "symmetric"), then there exists an orthonormal basis of eigenvectors of $A$ in $\mathbb{R}^{n\times 1}$, and all the eigenvalues are real.
3. If $A^t = -A$ ("skew-adjoint" or "skew-symmetric"), then all eigenvalues of $A$ are imaginary (i.e., $\lambda_k \in i\mathbb{R}$), and come in conjugate pairs $\pm ai$ when they are not $0$. There are no real eigenvectors (except those with eigenvalue $0$); however, in $\mathbb{C}^{n\times 1}$, there is an orthonormal basis of eigenvectors.
4. If $A^t = A^{-1}$ ("orthogonal"), then all eigenvalues of $A$ are complex of norm one (i.e., $|\lambda_k| = 1$), and come in conjugate pairs $\lambda, \bar\lambda$ when they are not $1$ or $-1$.

The real forms are easily derived from the complex ones. Real (2) is a special case of complex (2), etc.

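The eigenvalue statements in cases (2)–(4) can be observed numerically. A sketch (NumPy; the random seed and sample matrices are my own choices: any $B$ yields a Hermitian $B + B^*$, a skew-Hermitian $B - B^*$, and, via QR, a unitary $Q$):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

H = B + B.conj().T       # Hermitian: H* = H
S = B - B.conj().T       # skew-Hermitian: S* = -S
Q, _ = np.linalg.qr(B)   # unitary: Q* = Q^{-1}

# Case (2): eigenvalues real; (3): purely imaginary; (4): unit norm.
assert np.allclose(np.linalg.eigvals(H).imag, 0)
assert np.allclose(np.linalg.eigvals(S).real, 0)
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1)
```

Each assertion holds only up to floating-point roundoff, which is what `np.allclose` accounts for.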
When the eigenvalues are not real, we do not get real eigenvectors; however, the non-real eigenvalues must always come in conjugate pairs.

**Examples.** *Symmetric.* The real matrix $A = \begin{bmatrix} 2 & 2 \\ 2 & -1 \end{bmatrix}$ has eigenvalues $-2, 3$, with corresponding eigenvectors $(1, -2)$ and $(2, 1)$, which are orthogonal, and so can be normalized to an orthonormal basis of eigenvectors $\frac{1}{\sqrt 5}(1, -2)$ and $\frac{1}{\sqrt 5}(2, 1)$.

**Exercise.** If $A$ is a real symmetric $n\times n$ matrix, consider $f : \mathbb{R}^{n\times 1} \to \mathbb{R}$ defined by $f(x) = x^t A x$. Show that the maximum and minimum values attained by $f$ on the unit sphere $\{\, x \in \mathbb{R}^{n\times 1} \mid \|x\| = 1 \,\}$ are exactly the maximal and minimal eigenvalues of $A$. (Hint: change to a coordinate system with axes parallel to the eigenvectors. A more fun proof is to use Lagrange multipliers; because $A$ is symmetric, the Lagrange multiplier equations will exactly become the eigenvector equation $(A - \lambda I)x = 0$.)

*Skew-symmetric.* If $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is a real skew-symmetric matrix, then $a = d = 0$ and $b = -c$. Thus, the eigenvalues are $\lambda = \pm bi$.

*Orthogonal.* The real matrix $R_\theta = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$ is orthogonal, since $R_\theta^t = R_{-\theta}$. The spectral theorem for $n\times n$ orthogonal matrices implies that there is an orthonormal basis of $\mathbb{R}^{n\times 1}$, with respect to which $A$ has block form consisting of $1\times 1$ and $2\times 2$ blocks on the diagonal. The $1\times 1$ blocks have the form $\pm 1$, corresponding to real eigenvalues, and the $2\times 2$ blocks have the form $R_\theta$, corresponding to eigenvalue pairs $e^{\pm i\theta}$. For instance, any $3\times 3$ orthogonal matrix has an orthonormal basis $u_1, u_2, u_3$ with block form $\begin{bmatrix} \pm 1 & 0 \\ 0 & R_\theta \end{bmatrix}$. If the real eigenvalue is $1$, then it is a rotation around the axis $u_1$; if the real eigenvalue is $-1$, then it is a combination of a rotation with a reflection through the plane of rotation.

*Proof.* We have sketched how to derive every form of the spectral theorem from complex case (1), so we only need this basic case. Recall the following facts:

- Any operator $T : V \to V$ on a (non-trivial) finite dimensional complex vector space has an eigenvalue.
- If operators $S, T : V \to V$ commute ($ST = TS$), then each $T$-eigenspace $E_T(\lambda)$ is invariant under $S$.
- If $V$ is a finite dimensional inner product space and $v \ne 0$, then $v^\perp$ and $\operatorname{Span}(v)$ are complementary subspaces.

The hypothesis is that $A, A^* \in \mathbb{C}^{n\times n}$ commute. Let $\lambda$ be an eigenvalue of $A$. Since $E_A(\lambda)$ is invariant under $A^*$, the operator $A^*|_{E_A(\lambda)}$ must therefore have an eigenvector $v$, which will be a common eigenvector of $A$ and $A^*$. Assuming $Av = \lambda v$ and $A^* v = \mu v$, compute $\lambda\langle v, v\rangle = \langle Av, v\rangle = \langle v, A^* v\rangle = \langle v, \mu v\rangle = \bar\mu\langle v, v\rangle$, so $\lambda = \bar\mu$, i.e., $\mu = \bar\lambda$.

Let $v^\perp = \{\, x \in \mathbb{C}^{n\times 1} \mid \langle x, v\rangle = 0 \,\}$, the orthogonal complement of $v$ in $\mathbb{C}^{n\times 1}$.

**Claim.** $v^\perp$ is invariant under both $A$ and $A^*$.

This is a verification: assuming $w \in v^\perp$, we show that $Aw, A^*w \in v^\perp$.
For instance,
$$\langle Aw, v\rangle = \langle w, A^* v\rangle = \langle w, \bar\lambda v\rangle = \lambda\langle w, v\rangle = 0,$$
so $Aw \in v^\perp$; likewise,
$$\langle A^* w, v\rangle = \overline{\langle v, A^* w\rangle} = \overline{\langle Av, w\rangle} = \overline{\langle \lambda v, w\rangle} = \bar\lambda\,\overline{\langle v, w\rangle} = 0,$$
so $A^* w \in v^\perp$.

I would like to inductively apply this to the restrictions of $A$ and $A^*$ to linear operators acting on $v^\perp$, which has dimension one less than $\mathbb{C}^{n\times 1}$. We need the following.

**Lemma.** Let $A, A^* \in \mathbb{C}^{n\times n}$ such that $AA^* = A^*A$, and let $W \subseteq \mathbb{C}^{n\times 1}$ be a non-trivial subspace invariant under both $A$ and $A^*$. Then there exists a common eigenvector $v \in W$ of $A$ and $A^*$; the subspace $W \cap v^\perp = \{\, w \in W \mid \langle w, v\rangle = 0 \,\}$ is itself invariant under $A$ and $A^*$; and $W$ is a direct sum of $\operatorname{Span}(v)$ and $W \cap v^\perp$.

*Proof.* The argument is the same. If $W$ is non-zero, then $A|_W$ (the restriction of $A$ to a map $W \to W$) has an eigenvalue $\lambda$. Since $A^*$ commutes with $A$, the space $E_A(\lambda)$ is $A^*$-invariant, and hence so is $E_{A|_W}(\lambda) = W \cap E_A(\lambda)$. Since $A|_W$ has $\lambda$ as an eigenvalue, the subspace $E_{A|_W}(\lambda)$ is non-zero; since it is $A^*$-invariant, it contains an $A^*$-eigenvector $v$. We have already seen that $v^\perp$ must be invariant under both $A$ and $A^*$. Therefore, $W \cap v^\perp$ is also invariant under both $A$ and $A^*$. It remains to show that $W$ is a direct sum of $\operatorname{Span}(v)$ and $W \cap v^\perp$. Since $W$ is itself a finite dimensional inner product space, $W \cap v^\perp$ is the orthogonal complement of $v$ relative to $W$, so the claim follows.

To prove (1), use the lemma inductively. Thus, use the lemma to find a common eigenvector $v_1 \in W_0 = \mathbb{C}^{n\times 1}$ of $A$ and $A^*$, and set $W_1 = v_1^\perp$, which is guaranteed to be invariant under $A$ and $A^*$. Then use the lemma to find a common eigenvector $v_2 \in W_1$ of $A$ and $A^*$, and set $W_2 = W_1 \cap v_2^\perp$, etc. By construction, each new eigenvector is orthogonal to the previous ones, and we can normalize them to length one by setting $u_k = v_k/\|v_k\|$.

**Orthogonal and unitary matrices.** Let $U \in \mathbb{C}^{n\times n}$ have columns $u_1, \dots, u_n$. Then $U$ is unitary ($U^*U = U^{-1}U = I$) if and only if $u_j^* u_i = \delta_{ij}$. That is, $U$ is unitary if and only if its columns are an orthonormal basis of $\mathbb{C}^{n\times 1}$. The analogous statement holds for $U \in \mathbb{R}^{n\times n}$, which is orthogonal ($U^tU = U^{-1}U = I$) if and only if $u_j^t u_i = \delta_{ij}$, i.e., $U$ is orthogonal if and only if its columns are an orthonormal basis of $\mathbb{R}^{n\times 1}$.

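The column criterion can be sanity-checked on the rotation matrix $R_\theta$ from the examples above (a sketch; the value of $\theta$ is an arbitrary choice):

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Columns u_1, u_2 satisfy u_j^t u_i = delta_ij ...
u1, u2 = R[:, 0], R[:, 1]
print(u1 @ u1, u2 @ u2, u1 @ u2)   # 1, 1, 0 up to rounding

# ... which is the same statement as R^t R = I, i.e. R^t = R^{-1}.
print(np.allclose(R.T @ R, np.eye(2)))
```

Here $u_1^t u_1 = \cos^2\theta + \sin^2\theta = 1$ and $u_1^t u_2 = -\cos\theta\sin\theta + \sin\theta\cos\theta = 0$, exactly as the criterion predicts.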
The spectral theorem implies, for $A \in \mathbb{C}^{n\times n}$, that:

- $A$ is Hermitian if and only if it can be written $A = UDU^*$ for some unitary matrix $U$ and diagonal matrix $D$ with real entries;
- $A$ is skew-Hermitian if and only if it can be written $A = UDU^*$ for some unitary matrix $U$ and diagonal matrix $D$ with imaginary entries;
- $A$ is unitary if and only if it can be written $A = UDU^*$ for some unitary matrix $U$ and diagonal matrix $D$ with diagonal entries in $\mathbb{C}$ of unit norm.

The spectral theorem implies, for $A \in \mathbb{R}^{n\times n}$, that $A$ is symmetric if and only if it can be written $A = UDU^t$ for some orthogonal matrix $U$ and diagonal matrix $D$ with real entries.

Department of Mathematics, University of Illinois at Urbana-Champaign, Urbana, IL
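The Hermitian factorization $A = UDU^*$ is exactly what a standard eigensolver computes. A sketch (NumPy's `eigh`, which returns the real eigenvalues and an orthonormal eigenbasis; the sample matrix is my own choice):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = B + B.conj().T                 # a Hermitian matrix

evals, U = np.linalg.eigh(A)       # diagonal of D, and the unitary U
D = np.diag(evals)

assert np.allclose(U.conj().T @ U, np.eye(3))   # U is unitary
assert np.allclose(U @ D @ U.conj().T, A)       # A = U D U*
```

The columns of `U` are the orthonormal eigenvectors promised by the theorem, and `evals` is real because $A$ is Hermitian.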


The cover SU(2) SO(3) and related topics Iordan Ganev December 2011 Abstract The subgroup U of unit quaternions is isomorphic to SU(2) and is a double cover of SO(3). This allows a simple computation of

More information

### Notes on Jordan Canonical Form

Notes on Jordan Canonical Form Eric Klavins University of Washington 8 Jordan blocks and Jordan form A Jordan Block of size m and value λ is a matrix J m (λ) having the value λ repeated along the main

More information

### 1.5 Elementary Matrices and a Method for Finding the Inverse

.5 Elementary Matrices and a Method for Finding the Inverse Definition A n n matrix is called an elementary matrix if it can be obtained from I n by performing a single elementary row operation Reminder:

More information

### Problem Set 5 Due: In class Thursday, Oct. 18 Late papers will be accepted until 1:00 PM Friday.

Math 312, Fall 2012 Jerry L. Kazdan Problem Set 5 Due: In class Thursday, Oct. 18 Late papers will be accepted until 1:00 PM Friday. In addition to the problems below, you should also know how to solve

More information

### 1 Sets and Set Notation.

LINEAR ALGEBRA MATH 27.6 SPRING 23 (COHEN) LECTURE NOTES Sets and Set Notation. Definition (Naive Definition of a Set). A set is any collection of objects, called the elements of that set. We will most

More information

### T ( a i x i ) = a i T (x i ).

Chapter 2 Defn 1. (p. 65) Let V and W be vector spaces (over F ). We call a function T : V W a linear transformation form V to W if, for all x, y V and c F, we have (a) T (x + y) = T (x) + T (y) and (b)

More information

### Mathematics Notes for Class 12 chapter 3. Matrices

1 P a g e Mathematics Notes for Class 12 chapter 3. Matrices A matrix is a rectangular arrangement of numbers (real or complex) which may be represented as matrix is enclosed by [ ] or ( ) or Compact form

More information

### Linear Algebra Notes for Marsden and Tromba Vector Calculus

Linear Algebra Notes for Marsden and Tromba Vector Calculus n-dimensional Euclidean Space and Matrices Definition of n space As was learned in Math b, a point in Euclidean three space can be thought of

More information

### On the general equation of the second degree

On the general equation of the second degree S Kesavan The Institute of Mathematical Sciences, CIT Campus, Taramani, Chennai - 600 113 e-mail:kesh@imscresin Abstract We give a unified treatment of the

More information

### Matrices, transposes, and inverses

Matrices, transposes, and inverses Math 40, Introduction to Linear Algebra Wednesday, February, 202 Matrix-vector multiplication: two views st perspective: A x is linear combination of columns of A 2 4

More information

### Numerical Analysis Lecture Notes

Numerical Analysis Lecture Notes Peter J. Olver 5. Inner Products and Norms The norm of a vector is a measure of its size. Besides the familiar Euclidean norm based on the dot product, there are a number

More information

### Section 4.4 Inner Product Spaces

Section 4.4 Inner Product Spaces In our discussion of vector spaces the specific nature of F as a field, other than the fact that it is a field, has played virtually no role. In this section we no longer

More information

### Eigenvectors. Chapter Motivation Statistics. = ( x i ˆv) ˆv since v v = v 2

Chapter 5 Eigenvectors We turn our attention now to a nonlinear problem about matrices: Finding their eigenvalues and eigenvectors. Eigenvectors x and their corresponding eigenvalues λ of a square matrix

More information

### CHARACTERISTIC ROOTS AND VECTORS

CHARACTERISTIC ROOTS AND VECTORS 1 DEFINITION OF CHARACTERISTIC ROOTS AND VECTORS 11 Statement of the characteristic root problem Find values of a scalar λ for which there exist vectors x 0 satisfying

More information

### Numerical Linear Algebra Chap. 4: Perturbation and Regularisation

Numerical Linear Algebra Chap. 4: Perturbation and Regularisation Heinrich Voss voss@tu-harburg.de Hamburg University of Technology Institute of Numerical Simulation TUHH Heinrich Voss Numerical Linear

More information

### 7 - Linear Transformations

7 - Linear Transformations Mathematics has as its objects of study sets with various structures. These sets include sets of numbers (such as the integers, rationals, reals, and complexes) whose structure

More information

### Unified Lecture # 4 Vectors

Fall 2005 Unified Lecture # 4 Vectors These notes were written by J. Peraire as a review of vectors for Dynamics 16.07. They have been adapted for Unified Engineering by R. Radovitzky. References [1] Feynmann,

More information

### (a) The transpose of a lower triangular matrix is upper triangular, and the transpose of an upper triangular matrix is lower triangular.

Theorem.7.: (Properties of Triangular Matrices) (a) The transpose of a lower triangular matrix is upper triangular, and the transpose of an upper triangular matrix is lower triangular. (b) The product

More information

### Vector and Matrix Norms

Chapter 1 Vector and Matrix Norms 11 Vector Spaces Let F be a field (such as the real numbers, R, or complex numbers, C) with elements called scalars A Vector Space, V, over the field F is a non-empty

More information

### Chapter 8. Matrices II: inverses. 8.1 What is an inverse?

Chapter 8 Matrices II: inverses We have learnt how to add subtract and multiply matrices but we have not defined division. The reason is that in general it cannot always be defined. In this chapter, we

More information

### We call this set an n-dimensional parallelogram (with one vertex 0). We also refer to the vectors x 1,..., x n as the edges of P.

Volumes of parallelograms 1 Chapter 8 Volumes of parallelograms In the present short chapter we are going to discuss the elementary geometrical objects which we call parallelograms. These are going to

More information

### MATH 511 ADVANCED LINEAR ALGEBRA SPRING 2006

MATH 511 ADVANCED LINEAR ALGEBRA SPRING 26 Sherod Eubanks HOMEWORK 1 1.1 : 3, 5 1.2 : 4 1.3 : 4, 6, 12, 13, 16 1.4 : 1, 5, 8 Section 1.1: The Eigenvalue-Eigenvector Equation Problem 3 Let A M n (R). If

More information

### 5.3 ORTHOGONAL TRANSFORMATIONS AND ORTHOGONAL MATRICES

5.3 ORTHOGONAL TRANSFORMATIONS AND ORTHOGONAL MATRICES Definition 5.3. Orthogonal transformations and orthogonal matrices A linear transformation T from R n to R n is called orthogonal if it preserves

More information

### 4.5 Linear Dependence and Linear Independence

4.5 Linear Dependence and Linear Independence 267 32. {v 1, v 2 }, where v 1, v 2 are collinear vectors in R 3. 33. Prove that if S and S are subsets of a vector space V such that S is a subset of S, then

More information

### Additional Topics in Linear Algebra Supplementary Material for Math 540. Joseph H. Silverman

Additional Topics in Linear Algebra Supplementary Material for Math 540 Joseph H Silverman E-mail address: jhs@mathbrownedu Mathematics Department, Box 1917 Brown University, Providence, RI 02912 USA Contents

More information

### Lecture 6: The Group Inverse

Lecture 6: The Group Inverse The matrix index Let A C n n, k positive integer. Then R(A k+1 ) R(A k ). The index of A, denoted IndA, is the smallest integer k such that R(A k ) = R(A k+1 ), or equivalently,

More information

### Problems for Advanced Linear Algebra Fall 2012

Problems for Advanced Linear Algebra Fall 2012 Class will be structured around students presenting complete solutions to the problems in this handout. Please only agree to come to the board when you are

More information

### Lecture L3 - Vectors, Matrices and Coordinate Transformations

S. Widnall 16.07 Dynamics Fall 2009 Lecture notes based on J. Peraire Version 2.0 Lecture L3 - Vectors, Matrices and Coordinate Transformations By using vectors and defining appropriate operations between

More information