Harvard College. Math 21b: Linear Algebra and Differential Equations Formula and Theorem Review


Tommy MacWilliam '13
May 5
Contents

1 Linear Equations: Standard Representation of a Vector; Reduced Row Echelon Form; Elementary Row Operations; Rank of a Matrix; Dot Product of Vectors; The Product Ax; Algebraic Rules for Ax

2 Linear Transformations: Linear Transformations; Scaling Matrix; Orthogonal Projection onto a Line; Reflection Matrix; Rotation Matrix; Shear Matrix; Matrix Multiplication; Invertibility; Finding the Inverse; Properties of Invertible Matrices

3 Subspaces of R^n and Their Dimensions: Image of a Function; Span; Kernel; Subspaces of R^n; Linear Independence; Dimension; Rank-Nullity Theorem; Coordinates; Linearity of Coordinates; Matrix of a Linear Transformation; Similar Matrices

4 Linear Spaces: Linear Spaces; Finding a Basis of a Linear Space V; Isomorphisms

5 Orthogonality and Least Squares: Orthogonality and Length; Orthonormal Vectors; Orthogonal Projection onto a Subspace; Orthogonal Complement; Pythagorean Theorem; Cauchy-Schwarz Inequality; Angle Between Two Vectors; Gram-Schmidt Process; QR Decomposition; Orthogonal Transformations; Transpose; Symmetric and Skew-Symmetric Matrices; Matrix of an Orthogonal Projection; Least-Squares Solution; Inner Product Spaces; Norm and Orthogonality; Trace of a Matrix; Orthogonal Projection in an Inner Product Space; Fourier Analysis

6 Determinants: Sarrus's Rule; Patterns; Determinants and Gauss-Jordan Elimination; Laplace Expansion; Properties of the Determinant

7 Eigenvalues and Eigenvectors: Eigenvalues; Characteristic Polynomial; Algebraic Multiplicity; Eigenvalues, Determinant, and Trace; Eigenspace; Eigenbasis; Geometric Multiplicity; Diagonalization; Powers of a Diagonalizable Matrix; Stable Equilibrium

9 Linear Differential Equations: Exponential Growth and Decay; Linear Dynamical Systems; Continuous Dynamical Systems with Real Eigenvalues; Continuous Dynamical Systems with Complex Eigenvalues; Strategy for Solving Linear Differential Equations; Eigenfunctions; Characteristic Polynomial of a Linear Differential Operator; Kernel of a Linear Differential Operator; Characteristic Polynomial with Complex Solutions; First-Order Linear Differential Equations; Linearization of a Nonlinear System; The Heat Equation; The Wave Equation
1 Linear Equations

1.1 Standard Representation of a Vector

$\vec{v} = \begin{bmatrix} x \\ y \end{bmatrix}$

1.2 Reduced Row Echelon Form

- If a column contains a leading 1, then all the other entries in that column are 0.
- If a row contains a leading 1, then each row above it contains a leading 1 further to the left.

1.3 Elementary Row Operations

- Divide a row by a nonzero scalar.
- Subtract a multiple of a row from another row.
- Swap two rows.

1.4 Rank of a Matrix

The rank of a matrix $A$ is the number of leading 1s in $\mathrm{rref}(A)$.

1.5 Dot Product of Vectors

$\vec{v} \cdot \vec{w} = v_1 w_1 + \dots + v_n w_n$

1.6 The Product $A\vec{x}$

If the rows of $A$ are $\vec{w}_1, \dots, \vec{w}_n$:

$A\vec{x} = \begin{bmatrix} \vec{w}_1 \\ \vdots \\ \vec{w}_n \end{bmatrix} \vec{x} = \begin{bmatrix} \vec{w}_1 \cdot \vec{x} \\ \vdots \\ \vec{w}_n \cdot \vec{x} \end{bmatrix}$

1.7 Algebraic Rules for $A\vec{x}$

$A(\vec{x} + \vec{y}) = A\vec{x} + A\vec{y}$
$A(k\vec{x}) = k(A\vec{x})$
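The rank definition above is easy to check numerically. A NumPy sketch (the matrix here is my own example, chosen so its third row is the sum of the first two, which forces the rank below 3):

```python
import numpy as np

# Example matrix: the third row equals the sum of the first two,
# so rref(A) has only two leading 1s and rank(A) = 2.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

rank = np.linalg.matrix_rank(A)
print(rank)  # 2
```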
2 Linear Transformations

2.1 Linear Transformations

A function $T$ from $\mathbb{R}^m$ to $\mathbb{R}^n$ is called a linear transformation if there exists a matrix $A$ such that $T(\vec{x}) = A\vec{x}$. A transformation is linear if and only if

$T(\vec{v} + \vec{w}) = T(\vec{v}) + T(\vec{w})$
$T(k\vec{v}) = kT(\vec{v})$

2.2 Scaling Matrix

A scaling matrix by $k$ has the form $\begin{bmatrix} k & 0 \\ 0 & k \end{bmatrix}$.

2.3 Orthogonal Projection onto a Line

$\mathrm{proj}_L(\vec{x}) = \left( \dfrac{\vec{x} \cdot \vec{w}}{\vec{w} \cdot \vec{w}} \right) \vec{w}$

where $\vec{w}$ is a vector parallel to the line $L$.

2.4 Reflection Matrix

A reflection matrix has the form $\begin{bmatrix} a & b \\ b & -a \end{bmatrix}$ with $a^2 + b^2 = 1$.

2.5 Rotation Matrix

A rotation matrix has the form $\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$.

2.6 Shear Matrix

A horizontal shear matrix has the form $\begin{bmatrix} 1 & k \\ 0 & 1 \end{bmatrix}$ and a vertical shear matrix has the form $\begin{bmatrix} 1 & 0 \\ k & 1 \end{bmatrix}$.
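The rotation matrix can be exercised directly; a small NumPy sketch (the angle is an arbitrary example):

```python
import numpy as np

theta = np.pi / 2  # example angle: rotate by 90 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Rotating e1 by 90 degrees counterclockwise lands on e2.
rotated = R @ np.array([1.0, 0.0])
assert np.allclose(rotated, [0.0, 1.0])
```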
2.7 Matrix Multiplication

The product of matrices $BA$ is defined as the matrix of the linear transformation $T(\vec{x}) = B(A\vec{x}) = (BA)\vec{x}$.

$(AB)C = A(BC)$
$A(B + C) = AB + AC$
$(kA)B = A(kB) = k(AB)$

2.8 Invertibility

An $n \times n$ matrix $A$ is invertible if and only if:
1. $\mathrm{rref}(A) = I_n$
2. $\mathrm{rank}(A) = n$

or, more simply, if and only if $\det(A) \neq 0$.

2.9 Finding the Inverse

1. Form the $n \times 2n$ matrix $[A \mid I_n]$.
2. Compute $\mathrm{rref}[A \mid I_n]$.
3. If $\mathrm{rref}[A \mid I_n]$ is of the form $[I_n \mid B]$, then $A^{-1} = B$.
4. If $\mathrm{rref}[A \mid I_n]$ is of another form, then $A$ is not invertible.

2.10 Properties of Invertible Matrices

If an $n \times n$ matrix $A$ is invertible:
- The linear system $A\vec{x} = \vec{b}$ has a unique solution $\vec{x}$ for all $\vec{b}$ in $\mathbb{R}^n$.
- $\mathrm{rref}(A) = I_n$
- $\mathrm{rank}(A) = n$
- $\mathrm{im}(A) = \mathbb{R}^n$
- $\ker(A) = \{\vec{0}\}$
- The column vectors of $A$ are linearly independent and form a basis of $\mathbb{R}^n$.
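The row-reduction procedure in 2.9 can be sketched in NumPy (a didactic implementation with partial pivoting; the 2×2 test matrix is my own example, chosen so its inverse has integer entries):

```python
import numpy as np

def inverse_via_rref(A):
    """Invert A by row-reducing [A | I] to [I | A^-1]."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        pivot = col + np.argmax(np.abs(M[col:, col]))  # pick largest pivot
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("matrix is not invertible")
        M[[col, pivot]] = M[[pivot, col]]              # swap two rows
        M[col] /= M[col, col]                          # divide row by pivot
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]             # clear rest of column
    return M[:, n:]

A = np.array([[2.0, 1.0], [5.0, 3.0]])  # det = 1, inverse is [[3, -1], [-5, 2]]
A_inv = inverse_via_rref(A)
assert np.allclose(A_inv, [[3.0, -1.0], [-5.0, 2.0]])
assert np.allclose(A @ A_inv, np.eye(2))
```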
3 Subspaces of R^n and Their Dimensions

3.1 Image of a Function

$\mathrm{image}(f) = \{f(x) : x \text{ in } X\} = \{b \text{ in } Y : b = f(x) \text{ for some } x \text{ in } X\}$

For a linear transformation $T$:
- The zero vector is in the image of $T$.
- If $\vec{v}_1$ and $\vec{v}_2$ are in the image of $T$, then so is $\vec{v}_1 + \vec{v}_2$.
- If $\vec{v}$ is in the image of $T$, then so is $k\vec{v}$.

3.2 Span

$\mathrm{span}(\vec{v}_1, \dots, \vec{v}_n) = \{c_1 \vec{v}_1 + \dots + c_n \vec{v}_n : c_1, \dots, c_n \text{ in } \mathbb{R}\}$

3.3 Kernel

The kernel of a linear transformation $T$ is the solution set of the linear system $A\vec{x} = \vec{0}$.
- The zero vector is in the kernel of $T$.
- If $\vec{v}_1$ and $\vec{v}_2$ are in the kernel of $T$, then so is $\vec{v}_1 + \vec{v}_2$.
- If $\vec{v}$ is in the kernel of $T$, then so is $k\vec{v}$.

3.4 Subspaces of R^n

A subset $W$ of the vector space $\mathbb{R}^n$ is a linear subspace if and only if:
1. $W$ contains the zero vector.
2. $W$ is closed under addition.
3. $W$ is closed under scalar multiplication.

The image and kernel of a linear transformation are linear subspaces.

3.5 Linear Independence

1. If a vector $\vec{v}_i$ in $\vec{v}_1, \dots, \vec{v}_n$ can be expressed as a linear combination of the vectors $\vec{v}_1, \dots, \vec{v}_{i-1}$, then $\vec{v}_i$ is redundant.
2. The vectors $\vec{v}_1, \dots, \vec{v}_n$ are linearly independent if no vector is redundant.
3. The vectors $\vec{v}_1, \dots, \vec{v}_n$ form a basis of a subspace $V$ if they span $V$ and are linearly independent.
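Kernels can be computed exactly with SymPy's `nullspace` (a sketch; the matrix is my own example, with its third column equal to the sum of the first two, so the kernel is one-dimensional):

```python
from sympy import Matrix

# Example matrix: column 3 = column 1 + column 2,
# so ker(A) is spanned by a single vector.
A = Matrix([[1, 2, 3],
            [4, 5, 9]])

basis = A.nullspace()                 # a basis of ker(A)
assert len(basis) == 1
assert A * basis[0] == Matrix([0, 0])  # A v = 0 for the basis vector
```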
3.6 Dimension

The number of vectors in a basis of a subspace $V$ is the dimension of $V$.

3.7 Rank-Nullity Theorem

For an $n \times m$ matrix $A$:

$\dim(\ker A) + \dim(\mathrm{im}\,A) = m$

3.8 Coordinates

For a basis $B = (\vec{v}_1, \dots, \vec{v}_n)$ of a subspace $V$, any vector $\vec{x}$ can be written as

$\vec{x} = c_1 \vec{v}_1 + \dots + c_n \vec{v}_n$

The scalars $c_1, \dots, c_n$ are called the $B$-coordinates of $\vec{x}$, with

$[\vec{x}]_B = \begin{bmatrix} c_1 \\ \vdots \\ c_n \end{bmatrix}$

With $S = [\vec{v}_1 \cdots \vec{v}_n]$:

$\vec{x} = S[\vec{x}]_B \quad \text{and} \quad [\vec{x}]_B = S^{-1}\vec{x}$

3.9 Linearity of Coordinates

$[\vec{x} + \vec{y}]_B = [\vec{x}]_B + [\vec{y}]_B$
$[k\vec{x}]_B = k[\vec{x}]_B$

3.10 Matrix of a Linear Transformation

The matrix $B$ that transforms $[\vec{x}]_B$ into $[T(\vec{x})]_B$ is called the $B$-matrix of $T$:

$[T(\vec{x})]_B = B[\vec{x}]_B, \quad \text{where } B = [\,[T(\vec{v}_1)]_B \cdots [T(\vec{v}_n)]_B\,]$

3.11 Similar Matrices

Two matrices $A$ and $B$ are similar if and only if there is an invertible matrix $S$ with

$AS = SB \quad \text{or} \quad B = S^{-1}AS$
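The coordinate formulas $\vec{x} = S[\vec{x}]_B$ and $[\vec{x}]_B = S^{-1}\vec{x}$ can be sketched numerically (the basis vectors here are arbitrary examples):

```python
import numpy as np

# Basis B = (v1, v2) of R^2 (example vectors); columns of S are v1, v2.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

x = np.array([3.0, 2.0])
x_B = np.linalg.solve(S, x)    # [x]_B = S^-1 x, without forming S^-1
print(x_B)                     # [1. 2.]  i.e. x = 1*v1 + 2*v2

# Converting back: x = S [x]_B
assert np.allclose(S @ x_B, x)
```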
4 Linear Spaces

4.1 Linear Spaces

A linear space $V$ is a set with rules for addition and scalar multiplication that satisfy the following properties:
1. $(f + g) + h = f + (g + h)$
2. $f + g = g + f$
3. There exists a neutral element $n$ in $V$ such that $f + n = f$.
4. For each $f$ in $V$ there exists a $g$ in $V$ such that $f + g = 0$.
5. $k(f + g) = kf + kg$
6. $(c + k)f = cf + kf$
7. $c(kf) = (ck)f$
8. $1f = f$

4.2 Finding a Basis of a Linear Space V

1. Find a typical element $w$ of $V$ in terms of arbitrary constants.
2. Express $w$ as a linear combination of elements of $V$.
3. If these elements are linearly independent, they form a basis of $V$.

4.3 Isomorphisms

A linear transformation $T$ is an isomorphism if $T$ is invertible. A linear transformation $T$ from $V$ to $W$ is an isomorphism if and only if $\ker(T) = \{0\}$ and $\mathrm{im}(T) = W$.
- Coordinate changes are isomorphisms.
- If $V$ is isomorphic to $W$, then $\dim(V) = \dim(W)$.
5 Orthogonality and Least Squares

5.1 Orthogonality and Length

Two vectors $\vec{v}$ and $\vec{w}$ are orthogonal if $\vec{v} \cdot \vec{w} = 0$. The length of a vector $\vec{v}$ is $\|\vec{v}\| = \sqrt{\vec{v} \cdot \vec{v}}$. A vector $\vec{u}$ is a unit vector if its length is 1.

5.2 Orthonormal Vectors

The vectors $\vec{u}_1, \dots, \vec{u}_n$ are orthonormal if they are unit vectors orthogonal to each other:

$\vec{u}_i \cdot \vec{u}_j = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases}$

5.3 Orthogonal Projection onto a Subspace

If $V$ is a subspace with an orthonormal basis $\vec{u}_1, \dots, \vec{u}_n$:

$\mathrm{proj}_V(\vec{x}) = (\vec{u}_1 \cdot \vec{x})\vec{u}_1 + \dots + (\vec{u}_n \cdot \vec{x})\vec{u}_n$

5.4 Orthogonal Complement

The orthogonal complement $V^\perp$ of a subspace $V$ is the set of vectors $\vec{x}$ orthogonal to all vectors in $V$:

$V^\perp = \{\vec{x} \text{ in } \mathbb{R}^n : \vec{v} \cdot \vec{x} = 0 \text{ for all } \vec{v} \text{ in } V\}$

- $V^\perp$ is a subspace of $\mathbb{R}^n$
- $V \cap V^\perp = \{\vec{0}\}$
- $\dim(V) + \dim(V^\perp) = n$
- $(V^\perp)^\perp = V$

5.5 Pythagorean Theorem

If $\vec{x}$ and $\vec{y}$ are orthogonal:

$\|\vec{x} + \vec{y}\|^2 = \|\vec{x}\|^2 + \|\vec{y}\|^2$

5.6 Cauchy-Schwarz Inequality

$|\vec{x} \cdot \vec{y}| \leq \|\vec{x}\|\,\|\vec{y}\|$
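The projection formula in 5.3 is easy to demonstrate when the orthonormal basis is simple; a NumPy sketch (the subspace is the xy-plane in $\mathbb{R}^3$, an example of my own choosing):

```python
import numpy as np

# Orthonormal basis of the xy-plane V in R^3 (example): u1 = e1, u2 = e2.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])

x = np.array([3.0, 4.0, 5.0])
proj = (u1 @ x) * u1 + (u2 @ x) * u2   # proj_V(x) = (u1.x)u1 + (u2.x)u2
print(proj)  # [3. 4. 0.]

# x - proj lies in the orthogonal complement of V
assert np.isclose((x - proj) @ u1, 0.0) and np.isclose((x - proj) @ u2, 0.0)
```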
5.7 Angle Between Two Vectors

$\theta = \arccos \dfrac{\vec{x} \cdot \vec{y}}{\|\vec{x}\|\,\|\vec{y}\|}$

5.8 Gram-Schmidt Process

For a basis $\vec{v}_1, \dots, \vec{v}_n$ of a subspace $V$:

$\vec{u}_1 = \dfrac{1}{\|\vec{v}_1\|}\vec{v}_1, \quad \dots, \quad \vec{u}_n = \dfrac{1}{\|\vec{v}_n^\perp\|}\vec{v}_n^\perp$

where

$\vec{v}_i^\perp = \vec{v}_i - (\vec{u}_1 \cdot \vec{v}_i)\vec{u}_1 - \dots - (\vec{u}_{i-1} \cdot \vec{v}_i)\vec{u}_{i-1}$

5.9 QR Decomposition

For an $n \times m$ matrix $M$ with linearly independent columns $\vec{v}_1, \dots, \vec{v}_m$, $M = QR$, where $Q$ is the $n \times m$ matrix whose columns $\vec{u}_1, \dots, \vec{u}_m$ are orthonormal and $R$ is upper triangular with entries satisfying:

$r_{11} = \|\vec{v}_1\|, \quad r_{jj} = \|\vec{v}_j^\perp\|, \quad r_{ij} = \vec{u}_i \cdot \vec{v}_j \text{ for } i < j$

5.10 Orthogonal Transformations

A linear transformation $T$ is orthogonal if it preserves the length of vectors: $\|T(\vec{x})\| = \|\vec{x}\|$.
- $T$ is orthogonal if the vectors $T(\vec{e}_1), \dots, T(\vec{e}_n)$ form an orthonormal basis of $\mathbb{R}^n$.
- The matrix $A$ is orthogonal if $A^T A = I_n$, or equivalently $A^{-1} = A^T$.
- A matrix $A$ is orthogonal if its columns form an orthonormal basis of $\mathbb{R}^n$.
- The product $AB$ of two orthogonal matrices $A$ and $B$ is orthogonal.
- The inverse $A^{-1}$ of an orthogonal matrix $A$ is orthogonal.

5.11 Transpose

The transpose $A^T$ of an $m \times n$ matrix $A$ is the $n \times m$ matrix whose $ij$th entry is the $ji$th entry of $A$.
- $(AB)^T = B^T A^T$
- $(A^T)^{-1} = (A^{-1})^T$
- $\mathrm{rank}(A) = \mathrm{rank}(A^T)$
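The Gram-Schmidt recursion above translates almost line for line into code. A NumPy sketch (classical Gram-Schmidt; it assumes the columns of the example matrix are linearly independent):

```python
import numpy as np

def gram_schmidt(M):
    """Orthonormalize the columns of M (assumed linearly independent)."""
    Q = np.zeros_like(M, dtype=float)
    for i in range(M.shape[1]):
        v = M[:, i].astype(float)
        for j in range(i):                  # subtract projections onto u_1..u_{i-1}
            v = v - (Q[:, j] @ M[:, i]) * Q[:, j]
        Q[:, i] = v / np.linalg.norm(v)     # normalize v_i^perp
    return Q

M = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q = gram_schmidt(M)
R = Q.T @ M                                 # entries r_ij = u_i . v_j
assert np.allclose(Q.T @ Q, np.eye(2))      # columns of Q are orthonormal
assert np.allclose(Q @ R, M)                # M = QR
```

In practice `np.linalg.qr` does the same factorization more stably; the loop above is for illustrating the recursion.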
5.12 Symmetric and Skew-Symmetric Matrices

An $n \times n$ matrix $A$ is symmetric if $A^T = A$. An $n \times n$ matrix $A$ is skew-symmetric if $A^T = -A$.

5.13 Matrix of an Orthogonal Projection

The orthogonal projection onto a subspace $V$ with an orthonormal basis $\vec{u}_1, \dots, \vec{u}_n$ is

$QQ^T, \quad \text{where } Q = [\vec{u}_1 \cdots \vec{u}_n]$

or equivalently

$A(A^T A)^{-1}A^T, \quad \text{where } A = [\vec{v}_1 \cdots \vec{v}_n]$

for any basis $\vec{v}_1, \dots, \vec{v}_n$ of $V$.

5.14 Least-Squares Solution

The unique least-squares solution of a linear system $A\vec{x} = \vec{b}$ where $\ker(A) = \{0\}$ is

$\vec{x}^* = (A^T A)^{-1} A^T \vec{b}$

5.15 Inner Product Spaces

An inner product on a linear space $V$, denoted $\langle f, g \rangle$, has the following properties:
- $\langle f, g \rangle = \langle g, f \rangle$
- $\langle f + h, g \rangle = \langle f, g \rangle + \langle h, g \rangle$
- $\langle cf, g \rangle = c\langle f, g \rangle$
- $\langle f, f \rangle > 0$ for $f \neq 0$

5.16 Norm and Orthogonality

The norm of an element $f$ of an inner product space is $\|f\| = \sqrt{\langle f, f \rangle}$. Two elements $f$ and $g$ of an inner product space are orthogonal if $\langle f, g \rangle = 0$.

5.17 Trace of a Matrix

The trace $\mathrm{tr}(A)$ of a matrix $A$ is the sum of its diagonal entries.
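The normal-equations formula in 5.14 can be checked against NumPy's built-in least-squares solver (a sketch; the overdetermined system is my own example):

```python
import numpy as np

# Overdetermined system Ax = b with no exact solution (example data).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([0.0, 1.0, 3.0])

# Normal equations: (A^T A) x* = A^T b
x_star = np.linalg.solve(A.T @ A, A.T @ b)
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_star, x_lstsq)

# The residual b - A x* is orthogonal to the columns of A.
assert np.allclose(A.T @ (b - A @ x_star), 0.0)
```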
5.18 Orthogonal Projection in an Inner Product Space

If $g_1, \dots, g_n$ is an orthonormal basis of a subspace $W$ of an inner product space $V$:

$\mathrm{proj}_W f = \langle g_1, f \rangle g_1 + \dots + \langle g_n, f \rangle g_n$

5.19 Fourier Analysis

With the inner product $\langle f, g \rangle = \frac{1}{\pi}\int_{-\pi}^{\pi} f(t)g(t)\,dt$:

$f_n(t) = \dfrac{a_0}{\sqrt{2}} + b_1 \sin(t) + c_1 \cos(t) + \dots + b_n \sin(nt) + c_n \cos(nt)$

where

$a_0 = \left\langle f, \tfrac{1}{\sqrt{2}} \right\rangle = \dfrac{1}{\sqrt{2}\,\pi} \int_{-\pi}^{\pi} f(t)\,dt$
$b_k = \langle f, \sin(kt) \rangle = \dfrac{1}{\pi} \int_{-\pi}^{\pi} f(t)\sin(kt)\,dt$
$c_k = \langle f, \cos(kt) \rangle = \dfrac{1}{\pi} \int_{-\pi}^{\pi} f(t)\cos(kt)\,dt$

6 Determinants

6.1 Sarrus's Rule

For a 3×3 matrix $A$, copy the first two columns to the right of $A$, then multiply along the six diagonals. Add the three products along the downward diagonals and subtract the three products along the upward diagonals to get the determinant.

6.2 Patterns

A pattern of an $n \times n$ matrix $A$ is a way to choose $n$ entries of $A$ such that each entry is in a unique row and column. The product of the entries of a pattern $P$ is denoted $\mathrm{prod}\,P$. Two entries in a pattern are an inversion if one is located above and to the right of the other. The signature of a pattern is defined as

$\mathrm{sgn}\,P = (-1)^{(\text{number of inversions in } P)}$

$\det(A) = \sum_P (\mathrm{sgn}\,P)(\mathrm{prod}\,P)$
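The pattern formula in 6.2 can be implemented directly by looping over permutations (a didactic sketch, exponential in $n$ and only sensible for tiny matrices; the test matrix is my own example):

```python
import numpy as np
from itertools import permutations

def det_by_patterns(A):
    """Determinant via the pattern/permutation formula (didactic only)."""
    n = A.shape[0]
    total = 0.0
    for perm in permutations(range(n)):
        # signature: (-1)^(number of inversions in the pattern)
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if perm[i] > perm[j])
        prod = 1.0
        for row, col in enumerate(perm):   # one entry per row and column
            prod *= A[row, col]
        total += (-1) ** inversions * prod
    return total

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
assert np.isclose(det_by_patterns(A), np.linalg.det(A))
```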
6.3 Determinants and Gauss-Jordan Elimination

If $B$ is an $n \times n$ matrix obtained by applying an elementary row operation to an $n \times n$ matrix $A$:
- If $B$ is obtained by dividing a row by $k$: $\det(B) = \frac{1}{k}\det(A)$
- If $B$ is obtained by a row swap: $\det(B) = -\det(A)$
- If $B$ is obtained by adding a multiple of one row to another: $\det(B) = \det(A)$

6.4 Laplace Expansion

Expansion down the $j$th column:

$\det(A) = \sum_{i=1}^{n} (-1)^{i+j} a_{ij} \det(A_{ij})$

Expansion along the $i$th row:

$\det(A) = \sum_{j=1}^{n} (-1)^{i+j} a_{ij} \det(A_{ij})$

6.5 Properties of the Determinant

- $\det(A^T) = \det(A)$
- $\det(AB) = \det(A)\det(B)$
- If $A$ and $B$ are similar, then $\det(A) = \det(B)$
- $\det(A^{-1}) = \dfrac{1}{\det(A)} = \det(A)^{-1}$

7 Eigenvalues and Eigenvectors

7.1 Eigenvalues

$\lambda$ is an eigenvalue of an $n \times n$ matrix $A$ if and only if

$\det(A - \lambda I_n) = 0$

7.2 Characteristic Polynomial

$\det(A - \lambda I_n) = (-1)^n \lambda^n + (-1)^{n-1}\,\mathrm{tr}(A)\,\lambda^{n-1} + \dots + \det(A)$
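For a 2×2 matrix the characteristic polynomial reduces to $\lambda^2 - \mathrm{tr}(A)\lambda + \det(A)$, which is easy to verify numerically (a NumPy sketch; the matrix is an arbitrary example with eigenvalues 5 and 2):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # example: tr = 7, det = 10, eigenvalues 5 and 2

eigvals = np.linalg.eigvals(A)   # roots of det(A - lambda I)
tr, det = np.trace(A), np.linalg.det(A)

# Each eigenvalue satisfies lambda^2 - tr(A) lambda + det(A) = 0
for lam in eigvals:
    assert np.isclose(lam**2 - tr * lam + det, 0.0)
assert np.isclose(sorted(eigvals)[0], 2.0) and np.isclose(sorted(eigvals)[1], 5.0)
```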
7.3 Algebraic Multiplicity

An eigenvalue $\lambda$ has algebraic multiplicity $k$ if it is a root of multiplicity $k$ of the characteristic polynomial.

7.4 Eigenvalues, Determinant, and Trace

For an $n \times n$ matrix $A$ with eigenvalues $\lambda_1, \dots, \lambda_n$ (listed with multiplicity):

$\det(A) = \lambda_1 \cdots \lambda_n = \prod_{k=1}^{n} \lambda_k$
$\mathrm{tr}(A) = \lambda_1 + \dots + \lambda_n = \sum_{k=1}^{n} \lambda_k$

7.5 Eigenspace

$E_\lambda = \ker(A - \lambda I_n) = \{\vec{v} \text{ in } \mathbb{R}^n : A\vec{v} = \lambda\vec{v}\}$

7.6 Eigenbasis

An eigenbasis for an $n \times n$ matrix $A$ consists of eigenvectors of $A$ and forms a basis of $\mathbb{R}^n$.

7.7 Geometric Multiplicity

The dimension of the eigenspace $E_\lambda$ is the geometric multiplicity of the eigenvalue $\lambda$.

7.8 Diagonalization

1. Find the eigenvalues and corresponding eigenvectors of the matrix $A$.
2. Let $S$ be the matrix whose columns form an eigenbasis for $A$, and let $D$ be the matrix with the eigenvalues of $A$ along the diagonal.
3. Then $D = S^{-1}AS$ and $A = SDS^{-1}$.

7.9 Powers of a Diagonalizable Matrix

If $A = SDS^{-1}$, then $A^t = SD^tS^{-1}$.
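The diagonalization recipe and the power formula can be sketched with NumPy's `eig` (the matrix is the same kind of arbitrary 2×2 example as above):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])    # example matrix, eigenvalues 5 and 2

eigvals, S = np.linalg.eig(A) # columns of S form an eigenbasis
D = np.diag(eigvals)

# A = S D S^-1
assert np.allclose(S @ D @ np.linalg.inv(S), A)

# A^3 = S D^3 S^-1, where D^3 just cubes the diagonal entries
A_cubed = S @ np.diag(eigvals**3) @ np.linalg.inv(S)
assert np.allclose(A_cubed, np.linalg.matrix_power(A, 3))
```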
7.10 Stable Equilibrium

$\vec{0}$ is an asymptotically stable equilibrium for the system $\vec{x}(t+1) = A\vec{x}(t)$ if

$\lim_{t \to \infty} \vec{x}(t) = \lim_{t \to \infty} A^t\vec{x}_0 = \vec{0}$

9 Linear Differential Equations

9.1 Exponential Growth and Decay

$\dfrac{dx}{dt} = kx, \quad x(t) = e^{kt}x_0$

9.2 Linear Dynamical Systems

Discrete model: $\vec{x}(t+1) = B\vec{x}(t)$
Continuous model: $\dfrac{d\vec{x}}{dt} = A\vec{x}$

9.3 Continuous Dynamical Systems with Real Eigenvalues

$\dfrac{d\vec{x}}{dt} = A\vec{x}, \quad \vec{x}(t) = c_1 e^{\lambda_1 t}\vec{v}_1 + \dots + c_n e^{\lambda_n t}\vec{v}_n$

where $\vec{v}_1, \dots, \vec{v}_n$ form a real eigenbasis of $A$ with eigenvalues $\lambda_1, \dots, \lambda_n$.

9.4 Continuous Dynamical Systems with Complex Eigenvalues

$\dfrac{d\vec{x}}{dt} = A\vec{x}, \quad \vec{x}(t) = e^{pt}\,S \begin{bmatrix} \cos(qt) & -\sin(qt) \\ \sin(qt) & \cos(qt) \end{bmatrix} S^{-1}\vec{x}_0$

where $p \pm iq$ are the eigenvalues with eigenvectors $\vec{v} \pm i\vec{w}$ and $S = [\vec{w}\ \ \vec{v}]$.

9.5 Strategy for Solving Linear Differential Equations

To solve an $n$th-order linear differential equation of the form $T(f) = g$:
1. Find a basis $f_1, \dots, f_n$ of $\ker(T)$.
2. Find a particular solution $f_p$.
3. The solutions $f$ are of the form $f = c_1 f_1 + \dots + c_n f_n + f_p$.
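The real-eigenbasis solution in 9.3 can be verified numerically: expand $\vec{x}_0$ in the eigenbasis, propagate each coordinate by $e^{\lambda_k t}$, and check that the result satisfies $d\vec{x}/dt = A\vec{x}$ by a finite difference. A NumPy sketch (the symmetric example matrix is my own, with eigenvalues $-1$ and $-3$):

```python
import numpy as np

# dx/dt = A x for an example matrix with a real eigenbasis.
A = np.array([[-2.0, 1.0],
              [1.0, -2.0]])    # eigenvalues -1 and -3
eigvals, V = np.linalg.eig(A)  # columns of V form an eigenbasis
x0 = np.array([1.0, 0.0])
c = np.linalg.solve(V, x0)     # write x0 = c1 v1 + c2 v2

def x(t):
    # x(t) = c1 e^{lambda1 t} v1 + c2 e^{lambda2 t} v2
    return V @ (c * np.exp(eigvals * t))

# Initial condition holds, and x(t) solves dx/dt = A x (central difference).
assert np.allclose(x(0.0), x0)
h = 1e-6
assert np.allclose((x(0.5 + h) - x(0.5 - h)) / (2 * h), A @ x(0.5), atol=1e-6)
```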
9.6 Eigenfunctions

A smooth function $f$ is an eigenfunction of $T$ if $T(f) = \lambda f$.

9.7 Characteristic Polynomial of a Linear Differential Operator

For $T(f) = f^{(n)} + a_{n-1}f^{(n-1)} + \dots + a_1 f' + a_0 f$, the characteristic polynomial is defined as

$p_T(\lambda) = \lambda^n + a_{n-1}\lambda^{n-1} + \dots + a_1\lambda + a_0$

The function $e^{\lambda t}$ is an eigenfunction of $T$ with eigenvalue $p_T(\lambda)$:

$T(e^{\lambda t}) = p_T(\lambda)e^{\lambda t}$

9.8 Kernel of a Linear Differential Operator

If $T$ is a linear differential operator whose characteristic polynomial $p_T(\lambda)$ has distinct roots $\lambda_1, \dots, \lambda_n$, then the kernel of $T$ is spanned by

$e^{\lambda_1 t}, e^{\lambda_2 t}, \dots, e^{\lambda_n t}$

9.9 Characteristic Polynomial with Complex Solutions

If the zeros of $p_T(\lambda)$ are $p \pm iq$, then the solutions of the differential equation $T(f) = 0$ are of the form

$f(t) = e^{pt}(c_1\cos(qt) + c_2\sin(qt))$

9.10 First-Order Linear Differential Equations

A differential equation of the form $f'(t) - af(t) = g(t)$ has solutions of the form

$f(t) = e^{at}\int e^{-at}g(t)\,dt$

9.11 Strategy for Solving Linear Differential Equations

To solve the $n$th-order linear differential equation $T(f) = g$:
1. Find $n$ linearly independent solutions of $T(f) = 0$:
   - Write the characteristic polynomial $p_T(\lambda)$ of $T$ by replacing $f^{(k)}$ with $\lambda^k$.
   - Find the solutions $\lambda_1, \dots, \lambda_n$ of the equation $p_T(\lambda) = 0$.
   - If $\lambda$ is a solution of $p_T(\lambda) = 0$, then $e^{\lambda t}$ is a solution of $T(f) = 0$.
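The integral formula in 9.10 can be spot-checked symbolically: differentiate the claimed solution and confirm $f' - af = g$. A SymPy sketch (the coefficient $a = 2$ and right-hand side $g(t) = e^{3t}$ are arbitrary examples):

```python
from sympy import symbols, exp, integrate, diff, simplify

t = symbols('t')
a = 2                 # example coefficient
g = exp(3 * t)        # example right-hand side

# f(t) = e^{a t} * integral of e^{-a t} g(t) dt
f = exp(a * t) * integrate(exp(-a * t) * g, t)

# Verify that f' - a f = g
assert simplify(diff(f, t) - a * f - g) == 0
```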
   - If $\lambda$ is a solution of $p_T(\lambda) = 0$ with multiplicity $m$, then $e^{\lambda t}, te^{\lambda t}, t^2 e^{\lambda t}, \dots, t^{m-1}e^{\lambda t}$ are solutions of $T(f) = 0$.
   - If $p \pm iq$ are complex solutions of $p_T(\lambda) = 0$, then $e^{pt}\cos(qt)$ and $e^{pt}\sin(qt)$ are real solutions of $T(f) = 0$.
2. If the equation $T(f) = g$ is inhomogeneous, find one particular solution $f_p$ of $T(f) = g$:
   - If $g$ is of the form $g(t) = A\cos(\omega t) + B\sin(\omega t)$, $g(t) = A\cos(\omega t)$, or $g(t) = A\sin(\omega t)$, look for a particular solution of the form $f_p(t) = P\cos(\omega t) + Q\sin(\omega t)$.
   - If $g$ is of the form $g(t) = a_0 + a_1 t + \dots + a_n t^n$, look for a particular solution of the form $f_p(t) = c_0 + c_1 t + \dots + c_n t^n$.
   - If $g$ is constant, look for a particular solution of the form $f_p(t) = c$.
   - If the equation is first order, of the form $f'(t) - af(t) = g(t)$, use the formula $f(t) = e^{at}\int e^{-at}g(t)\,dt$.
3. The solutions of $T(f) = g$ are of the form $f(t) = c_1 f_1(t) + c_2 f_2(t) + \dots + c_n f_n(t) + f_p(t)$, where $f_1, \dots, f_n$ are the solutions from Step 1 and $f_p$ is the solution from Step 2.

9.12 Linearization of a Nonlinear System

If $(a, b)$ is an equilibrium point of the system $\dfrac{dx}{dt} = f(x, y)$, $\dfrac{dy}{dt} = g(x, y)$, such that $f(a, b) = 0$ and $g(a, b) = 0$, then the system is approximated near $(a, b)$ by the Jacobian matrix:

$\begin{bmatrix} \frac{du}{dt} \\ \frac{dv}{dt} \end{bmatrix} = \begin{bmatrix} \frac{\partial f}{\partial x}(a, b) & \frac{\partial f}{\partial y}(a, b) \\ \frac{\partial g}{\partial x}(a, b) & \frac{\partial g}{\partial y}(a, b) \end{bmatrix} \begin{bmatrix} u \\ v \end{bmatrix}$

where $u = x - a$ and $v = y - b$.

9.13 The Heat Equation

$f_t(x, t) = \mu f_{xx}(x, t)$ has solutions of the form

$f(x, t) = \sum_{n=1}^{\infty} b_n \sin(nx)\,e^{-n^2\mu t}, \quad \text{where } b_n = \dfrac{2}{\pi}\int_0^\pi f(x, 0)\sin(nx)\,dx$

9.14 The Wave Equation

$f_{tt}(x, t) = c^2 f_{xx}(x, t)$ has solutions of the form

$f(x, t) = \sum_{n=1}^{\infty}\left( a_n \sin(nx)\cos(nct) + \dfrac{b_n}{nc}\sin(nx)\sin(nct) \right)$

where $a_n = \dfrac{2}{\pi}\int_0^\pi f(x, 0)\sin(nx)\,dx$ and $b_n = \dfrac{2}{\pi}\int_0^\pi f_t(x, 0)\sin(nx)\,dx$.
1. True/False: Circle the correct answer. No justifications are needed in this exercise. (1 point each)
Math 33 AH : Solution to the Final Exam Honors Linear Algebra and Applications 1. True/False: Circle the correct answer. No justifications are needed in this exercise. (1 point each) (1) If A is an invertible
More informationSection 5.3. Section 5.3. u m ] l jj. = l jj u j + + l mj u m. v j = [ u 1 u j. l mj
Section 5. l j v j = [ u u j u m ] l jj = l jj u j + + l mj u m. l mj Section 5. 5.. Not orthogonal, the column vectors fail to be perpendicular to each other. 5..2 his matrix is orthogonal. Check that
More informationSolutions to Linear Algebra Practice Problems
Solutions to Linear Algebra Practice Problems. Find all solutions to the following systems of linear equations. (a) x x + x 5 x x x + x + x 5 (b) x + x + x x + x + x x + x + 8x Answer: (a) We create the
More informationChapter 6. Orthogonality
6.3 Orthogonal Matrices 1 Chapter 6. Orthogonality 6.3 Orthogonal Matrices Definition 6.4. An n n matrix A is orthogonal if A T A = I. Note. We will see that the columns of an orthogonal matrix must be
More informationSimilarity and Diagonalization. Similar Matrices
MATH022 Linear Algebra Brief lecture notes 48 Similarity and Diagonalization Similar Matrices Let A and B be n n matrices. We say that A is similar to B if there is an invertible n n matrix P such that
More informationInner Product Spaces and Orthogonality
Inner Product Spaces and Orthogonality week 34 Fall 2006 Dot product of R n The inner product or dot product of R n is a function, defined by u, v a b + a 2 b 2 + + a n b n for u a, a 2,, a n T, v b,
More informationOrthogonal Diagonalization of Symmetric Matrices
MATH10212 Linear Algebra Brief lecture notes 57 Gram Schmidt Process enables us to find an orthogonal basis of a subspace. Let u 1,..., u k be a basis of a subspace V of R n. We begin the process of finding
More informationApplied Linear Algebra I Review page 1
Applied Linear Algebra Review 1 I. Determinants A. Definition of a determinant 1. Using sum a. Permutations i. Sign of a permutation ii. Cycle 2. Uniqueness of the determinant function in terms of properties
More informationLINEAR ALGEBRA. September 23, 2010
LINEAR ALGEBRA September 3, 00 Contents 0. LUdecomposition.................................... 0. Inverses and Transposes................................. 0.3 Column Spaces and NullSpaces.............................
More informationRecall the basic property of the transpose (for any A): v A t Aw = v w, v, w R n.
ORTHOGONAL MATRICES Informally, an orthogonal n n matrix is the ndimensional analogue of the rotation matrices R θ in R 2. When does a linear transformation of R 3 (or R n ) deserve to be called a rotation?
More informationLinear Algebra Review. Vectors
Linear Algebra Review By Tim K. Marks UCSD Borrows heavily from: Jana Kosecka kosecka@cs.gmu.edu http://cs.gmu.edu/~kosecka/cs682.html Virginia de Sa Cogsci 8F Linear Algebra review UCSD Vectors The length
More information13 MATH FACTS 101. 2 a = 1. 7. The elements of a vector have a graphical interpretation, which is particularly easy to see in two or three dimensions.
3 MATH FACTS 0 3 MATH FACTS 3. Vectors 3.. Definition We use the overhead arrow to denote a column vector, i.e., a linear segment with a direction. For example, in threespace, we write a vector in terms
More information1 Eigenvalues and Eigenvectors
Math 20 Chapter 5 Eigenvalues and Eigenvectors Eigenvalues and Eigenvectors. Definition: A scalar λ is called an eigenvalue of the n n matrix A is there is a nontrivial solution x of Ax = λx. Such an x
More informationC 1 x(t) = e ta C = e C n. 2! A2 + t3
Matrix Exponential Fundamental Matrix Solution Objective: Solve dt A x with an n n constant coefficient matrix A x (t) Here the unknown is the vector function x(t) x n (t) General Solution Formula in Matrix
More informationLinear Algebra Notes for Marsden and Tromba Vector Calculus
Linear Algebra Notes for Marsden and Tromba Vector Calculus ndimensional Euclidean Space and Matrices Definition of n space As was learned in Math b, a point in Euclidean three space can be thought of
More informationInner Product Spaces
Math 571 Inner Product Spaces 1. Preliminaries An inner product space is a vector space V along with a function, called an inner product which associates each pair of vectors u, v with a scalar u, v, and
More informationLecture Notes 2: Matrices as Systems of Linear Equations
2: Matrices as Systems of Linear Equations 33A Linear Algebra, Puck Rombach Last updated: April 13, 2016 Systems of Linear Equations Systems of linear equations can represent many things You have probably
More informationINTRODUCTORY LINEAR ALGEBRA WITH APPLICATIONS B. KOLMAN, D. R. HILL
SOLUTIONS OF THEORETICAL EXERCISES selected from INTRODUCTORY LINEAR ALGEBRA WITH APPLICATIONS B. KOLMAN, D. R. HILL Eighth Edition, Prentice Hall, 2005. Dr. Grigore CĂLUGĂREANU Department of Mathematics
More informationMATH1231 Algebra, 2015 Chapter 7: Linear maps
MATH1231 Algebra, 2015 Chapter 7: Linear maps A/Prof. Daniel Chan School of Mathematics and Statistics University of New South Wales danielc@unsw.edu.au Daniel Chan (UNSW) MATH1231 Algebra 1 / 43 Chapter
More informationSection 6.1  Inner Products and Norms
Section 6.1  Inner Products and Norms Definition. Let V be a vector space over F {R, C}. An inner product on V is a function that assigns, to every ordered pair of vectors x and y in V, a scalar in F,
More informationCITY UNIVERSITY LONDON. BEng Degree in Computer Systems Engineering Part II BSc Degree in Computer Systems Engineering Part III PART 2 EXAMINATION
No: CITY UNIVERSITY LONDON BEng Degree in Computer Systems Engineering Part II BSc Degree in Computer Systems Engineering Part III PART 2 EXAMINATION ENGINEERING MATHEMATICS 2 (resit) EX2005 Date: August
More informationOrthogonal Projections and Orthonormal Bases
CS 3, HANDOUT A, 3 November 04 (adjusted on 7 November 04) Orthogonal Projections and Orthonormal Bases (continuation of Handout 07 of 6 September 04) Definition (Orthogonality, length, unit vectors).
More informationInner product. Definition of inner product
Math 20F Linear Algebra Lecture 25 1 Inner product Review: Definition of inner product. Slide 1 Norm and distance. Orthogonal vectors. Orthogonal complement. Orthogonal basis. Definition of inner product
More information1 Introduction to Matrices
1 Introduction to Matrices In this section, important definitions and results from matrix algebra that are useful in regression analysis are introduced. While all statements below regarding the columns
More informationMathematics Course 111: Algebra I Part IV: Vector Spaces
Mathematics Course 111: Algebra I Part IV: Vector Spaces D. R. Wilkins Academic Year 19967 9 Vector Spaces A vector space over some field K is an algebraic structure consisting of a set V on which are
More informationApplied Linear Algebra
Applied Linear Algebra OTTO BRETSCHER http://www.prenhall.com/bretscher Chapter 7 Eigenvalues and Eigenvectors ChiaHui Chang Email: chia@csie.ncu.edu.tw National Central University, Taiwan 7.1 DYNAMICAL
More informationMATH 551  APPLIED MATRIX THEORY
MATH 55  APPLIED MATRIX THEORY FINAL TEST: SAMPLE with SOLUTIONS (25 points NAME: PROBLEM (3 points A web of 5 pages is described by a directed graph whose matrix is given by A Do the following ( points
More informationImages and Kernels in Linear Algebra By Kristi Hoshibata Mathematics 232
Images and Kernels in Linear Algebra By Kristi Hoshibata Mathematics 232 In mathematics, there are many different fields of study, including calculus, geometry, algebra and others. Mathematics has been
More informationα = u v. In other words, Orthogonal Projection
Orthogonal Projection Given any nonzero vector v, it is possible to decompose an arbitrary vector u into a component that points in the direction of v and one that points in a direction orthogonal to v
More informationLINEAR ALGEBRA W W L CHEN
LINEAR ALGEBRA W W L CHEN c W W L Chen, 1997, 2008 This chapter is available free to all individuals, on understanding that it is not to be used for financial gain, and may be downloaded and/or photocopied,
More informationRecall that two vectors in are perpendicular or orthogonal provided that their dot
Orthogonal Complements and Projections Recall that two vectors in are perpendicular or orthogonal provided that their dot product vanishes That is, if and only if Example 1 The vectors in are orthogonal
More information1 0 5 3 3 A = 0 0 0 1 3 0 0 0 0 0 0 0 0 0 0
Solutions: Assignment 4.. Find the redundant column vectors of the given matrix A by inspection. Then find a basis of the image of A and a basis of the kernel of A. 5 A The second and third columns are
More informationNotes on Symmetric Matrices
CPSC 536N: Randomized Algorithms 201112 Term 2 Notes on Symmetric Matrices Prof. Nick Harvey University of British Columbia 1 Symmetric Matrices We review some basic results concerning symmetric matrices.
More information5. Orthogonal matrices
L Vandenberghe EE133A (Spring 2016) 5 Orthogonal matrices matrices with orthonormal columns orthogonal matrices tall matrices with orthonormal columns complex matrices with orthonormal columns 51 Orthonormal
More information1 Sets and Set Notation.
LINEAR ALGEBRA MATH 27.6 SPRING 23 (COHEN) LECTURE NOTES Sets and Set Notation. Definition (Naive Definition of a Set). A set is any collection of objects, called the elements of that set. We will most
More informationMATH 304 Linear Algebra Lecture 20: Inner product spaces. Orthogonal sets.
MATH 304 Linear Algebra Lecture 20: Inner product spaces. Orthogonal sets. Norm The notion of norm generalizes the notion of length of a vector in R n. Definition. Let V be a vector space. A function α
More informationProblem Set 5 Due: In class Thursday, Oct. 18 Late papers will be accepted until 1:00 PM Friday.
Math 312, Fall 2012 Jerry L. Kazdan Problem Set 5 Due: In class Thursday, Oct. 18 Late papers will be accepted until 1:00 PM Friday. In addition to the problems below, you should also know how to solve
More informationSection 2.1. Section 2.2. Exercise 6: We have to compute the product AB in two ways, where , B =. 2 1 3 5 A =
Section 2.1 Exercise 6: We have to compute the product AB in two ways, where 4 2 A = 3 0 1 3, B =. 2 1 3 5 Solution 1. Let b 1 = (1, 2) and b 2 = (3, 1) be the columns of B. Then Ab 1 = (0, 3, 13) and
More information(January 14, 2009) End k (V ) End k (V/W )
(January 14, 29) [16.1] Let p be the smallest prime dividing the order of a finite group G. Show that a subgroup H of G of index p is necessarily normal. Let G act on cosets gh of H by left multiplication.
More informationUniversity of Lille I PC first year list of exercises n 7. Review
University of Lille I PC first year list of exercises n 7 Review Exercise Solve the following systems in 4 different ways (by substitution, by the Gauss method, by inverting the matrix of coefficients
More information2.1: MATRIX OPERATIONS
.: MATRIX OPERATIONS What are diagonal entries and the main diagonal of a matrix? What is a diagonal matrix? When are matrices equal? Scalar Multiplication 45 Matrix Addition Theorem (pg 0) Let A, B, and
More information1 VECTOR SPACES AND SUBSPACES
1 VECTOR SPACES AND SUBSPACES What is a vector? Many are familiar with the concept of a vector as: Something which has magnitude and direction. an ordered pair or triple. a description for quantities such
More information1 2 3 1 1 2 x = + x 2 + x 4 1 0 1
(d) If the vector b is the sum of the four columns of A, write down the complete solution to Ax = b. 1 2 3 1 1 2 x = + x 2 + x 4 1 0 0 1 0 1 2. (11 points) This problem finds the curve y = C + D 2 t which
More informationMATH 423 Linear Algebra II Lecture 38: Generalized eigenvectors. Jordan canonical form (continued).
MATH 423 Linear Algebra II Lecture 38: Generalized eigenvectors Jordan canonical form (continued) Jordan canonical form A Jordan block is a square matrix of the form λ 1 0 0 0 0 λ 1 0 0 0 0 λ 0 0 J = 0
More informationPresentation 3: Eigenvalues and Eigenvectors of a Matrix
Colleen Kirksey, Beth Van Schoyck, Dennis Bowers MATH 280: Problem Solving November 18, 2011 Presentation 3: Eigenvalues and Eigenvectors of a Matrix Order of Presentation: 1. Definitions of Eigenvalues
More informationx + y + z = 1 2x + 3y + 4z = 0 5x + 6y + 7z = 3
Math 24 FINAL EXAM (2/9/9  SOLUTIONS ( Find the general solution to the system of equations 2 4 5 6 7 ( r 2 2r r 2 r 5r r x + y + z 2x + y + 4z 5x + 6y + 7z 2 2 2 2 So x z + y 2z 2 and z is free. ( r
More informationNotes on Orthogonal and Symmetric Matrices MENU, Winter 2013
Notes on Orthogonal and Symmetric Matrices MENU, Winter 201 These notes summarize the main properties and uses of orthogonal and symmetric matrices. We covered quite a bit of material regarding these topics,
More informationMA 242 LINEAR ALGEBRA C1, Solutions to Second Midterm Exam
MA 4 LINEAR ALGEBRA C, Solutions to Second Midterm Exam Prof. Nikola Popovic, November 9, 6, 9:3am  :5am Problem (5 points). Let the matrix A be given by 5 6 5 4 5 (a) Find the inverse A of A, if it exists.
More informationBrief Introduction to Vectors and Matrices
CHAPTER 1 Brief Introduction to Vectors and Matrices In this chapter, we will discuss some needed concepts found in introductory course in linear algebra. We will introduce matrix, vector, vectorvalued
More informationNOTES on LINEAR ALGEBRA 1
School of Economics, Management and Statistics University of Bologna Academic Year 205/6 NOTES on LINEAR ALGEBRA for the students of Stats and Maths This is a modified version of the notes by Prof Laura
More informationIntroduction to Matrix Algebra
Psychology 7291: Multivariate Statistics (Carey) 8/27/98 Matrix Algebra  1 Introduction to Matrix Algebra Definitions: A matrix is a collection of numbers ordered by rows and columns. It is customary
More informationScientific Computing: An Introductory Survey
Scientific Computing: An Introductory Survey Chapter 3 Linear Least Squares Prof. Michael T. Heath Department of Computer Science University of Illinois at UrbanaChampaign Copyright c 2002. Reproduction
More informationMATRIX ALGEBRA AND SYSTEMS OF EQUATIONS. + + x 2. x n. a 11 a 12 a 1n b 1 a 21 a 22 a 2n b 2 a 31 a 32 a 3n b 3. a m1 a m2 a mn b m
MATRIX ALGEBRA AND SYSTEMS OF EQUATIONS 1. SYSTEMS OF EQUATIONS AND MATRICES 1.1. Representation of a linear system. The general system of m equations in n unknowns can be written a 11 x 1 + a 12 x 2 +
More informationLinear algebra and the geometry of quadratic equations. Similarity transformations and orthogonal matrices
MATH 30 Differential Equations Spring 006 Linear algebra and the geometry of quadratic equations Similarity transformations and orthogonal matrices First, some things to recall from linear algebra Two
More informationMatrices, Determinants and Linear Systems
September 21, 2014 Matrices A matrix A m n is an array of numbers in rows and columns a 11 a 12 a 1n r 1 a 21 a 22 a 2n r 2....... a m1 a m2 a mn r m c 1 c 2 c n We say that the dimension of A is m n (we
More informationExamination paper for TMA4115 Matematikk 3
Department of Mathematical Sciences Examination paper for TMA45 Matematikk 3 Academic contact during examination: Antoine Julien a, Alexander Schmeding b, Gereon Quick c Phone: a 73 59 77 82, b 40 53 99
Chapter 17. Orthogonal Matrices and Symmetries of Space
Chapter 17. Orthogonal Matrices and Symmetries of Space Take a random matrix, say A = [1 2 3; 4 5 6; 7 8 9], and compare the lengths of e_1 and Ae_1. The vector e_1 has length 1, while Ae_1 = (1, 4, 7) has length
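The length comparison in this snippet is easy to check numerically. A sketch (the matrix's garbled middle entry is assumed to be 2 here, consistent with the rows 4 5 6 and 7 8 9):

```python
import math

# Assumed reading of the snippet's matrix (middle entry of row 1 taken as 2):
A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

e1 = [1, 0, 0]
Ae1 = matvec(A, e1)  # first column of A: (1, 4, 7)
print(Ae1, math.sqrt(sum(x * x for x in Ae1)))  # length sqrt(66), about 8.124
```

Since the length changes, A is not orthogonal; an orthogonal matrix would preserve it.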
by the matrix A results in a vector which is a reflection of the given
Eigenvalues & Eigenvectors Example. Geometrically, multiplying a vector in the plane by the matrix A results in a vector which is a reflection of the given vector about the y-axis. We observe that
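The matrix itself was lost in extraction; reflection about the y-axis in the plane is A = [[-1, 0], [0, 1]], used here as the assumed example. The basis vectors are eigenvectors: e_1 is flipped (eigenvalue -1) and e_2 is fixed (eigenvalue +1).

```python
# Assumed matrix: reflection about the y-axis (the original was lost in extraction).
A = [[-1, 0],
     [0, 1]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

print(matvec(A, [1, 0]))  # -> [-1, 0]  (eigenvalue -1)
print(matvec(A, [0, 1]))  # -> [0, 1]   (eigenvalue +1)
```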
T(sum a_i x_i) = sum a_i T(x_i).
Chapter 2 Defn 1. (p. 65) Let V and W be vector spaces (over F). We call a function T : V -> W a linear transformation from V to W if, for all x, y in V and c in F, we have (a) T(x + y) = T(x) + T(y) and (b)
2.5 Elementary Row Operations and the Determinant
2.5 Elementary Row Operations and the Determinant Recall: Let A be a 2 × 2 matrix: A = [a b; c d]. The determinant of A, denoted by det(A) or |A|, is the number ad − bc. So for example if A = [2 4; ...], det(A) = 2(5)
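The 2 × 2 determinant formula ad − bc is a one-liner to implement; a quick sketch:

```python
def det2(a, b, c, d):
    """Determinant of the 2x2 matrix [[a, b], [c, d]]: ad - bc."""
    return a * d - b * c

print(det2(1, 2, 3, 4))  # 1*4 - 2*3 = -2
print(det2(2, 4, 1, 5))  # 2*5 - 4*1 = 6
```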
Math 115A HW4 Solutions University of California, Los Angeles. (5 − 2i)(7i) − (6 + 4i)(−3 + i) = 35i + 14 − (−22 − 6i) = 36 + 41i.
Math 115A HW4 Solutions September 5, 202 University of California, Los Angeles Problem 4..3b Calculate the determinant of [5 − 2i, 6 + 4i; −3 + i, 7i]. Solution: The textbook's instructions give us (5 − 2i)(7i) − (6 + 4i)(−3 + i) = 35i + 14 − (−22 − 6i) = 36 + 41i.
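The complex arithmetic above (with the minus signs restored from the final answer) can be verified directly with Python's built-in complex numbers:

```python
# Determinant of the 2x2 complex matrix [[5-2i, 6+4i], [-3+i, 7i]],
# expanded as ad - bc.
a, b = 5 - 2j, 6 + 4j
c, d = -3 + 1j, 7j

det = a * d - b * c
print(det)  # (36+41j), matching the worked solution
```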
Chapter 6. Linear Transformation. 6.1 Intro. to Linear Transformation
Chapter 6 Linear Transformation 6.1 Intro to Linear Transformation Homework: Textbook, 6.1 Ex. 5, 9, 5, 7, 9, 55, 57, 6(a,b), 6; page 7. In this section, we discuss linear transformations
PROPERTIES OF MATRICES
PROPERTIES OF MATRICES adjoint... 4, 5 algebraic multiplicity... augmented matrix... basis..., cofactor... 4 coordinate vector... 9 Cramer's rule... determinant..., 5 diagonal matrix... 6 diagonalizable... 8 dimension...
Matrix Algebra, Class Notes (part 1) by Hrishikesh D. Vinod Copyright 1998 by Prof. H. D. Vinod, Fordham University, New York. All rights reserved.
Matrix Algebra, Class Notes (part 1) by Hrishikesh D. Vinod Copyright 1998 by Prof. H. D. Vinod, Fordham University, New York. All rights reserved. 1 Sum, Product and Transpose of Matrices. If a ij with
MA106 Linear Algebra lecture notes
MA106 Linear Algebra lecture notes Lecturers: Martin Bright and Daan Krammer Warwick, January 2011 Contents 1 Number systems and fields 3 1.1 Axioms for number systems......................... 3 2 Vector
Solution. Area(OABC) = Area(OAB) + Area(OBC) = (1/2) det([5 2; 1 2]) + ... Question 2. Let A = (a) Calculate the nullspace of the matrix A.
Solutions to Math 30 take-home prelim Question. Find the area of the quadrilateral OABC in the figure below, coordinates given in brackets. [See pp. 60–63 of the book.] y C(, 4) B(, ) A(5, ) O x Area(OABC)
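The technique here is the half-determinant area formula: a triangle with one vertex at the origin and the other two at P and Q has area (1/2)|det([P; Q])|. The prelim's exact coordinates were partly lost, so the points below are illustrative only:

```python
# Area of a triangle O, P, Q (O at origin) as half the absolute determinant.
def tri_area(p, q):
    return abs(p[0] * q[1] - p[1] * q[0]) / 2

# Quadrilateral OABC split into triangles OAB and OBC.
# Illustrative coordinates (the originals were garbled in extraction):
A, B, C = (5, 1), (2, 2), (1, 4)
area = tri_area(A, B) + tri_area(B, C)
print(area)  # -> 7.0
```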
Linear Dependence Tests
Linear Dependence Tests The book omits a few key tests for checking the linear dependence of vectors. These short notes discuss these tests, as well as the reasoning behind them. Our first test checks
MATH 304 Linear Algebra Lecture 18: Rank and nullity of a matrix.
MATH 304 Linear Algebra Lecture 18: Rank and nullity of a matrix. Nullspace Let A = (a_ij) be an m × n matrix. Definition. The nullspace of the matrix A, denoted N(A), is the set of all n-dimensional column
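The rank-nullity theorem says rank(A) + dim N(A) = n for an m × n matrix. A small sketch (not from these lecture notes) that computes the rank by Gauss-Jordan elimination, from which the nullity follows as n minus the rank:

```python
# Rank via Gauss-Jordan elimination; small demo with a float tolerance.
def rank(M):
    M = [row[:] for row in M]
    r = 0  # number of pivots found so far
    for col in range(len(M[0]) if M else 0):
        # Find a row at or below position r with a nonzero entry in this column.
        pivot = next((i for i in range(r, len(M)) if abs(M[i][col]) > 1e-12), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(len(M)):
            if i != r:
                f = M[i][col] / M[r][col]
                M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 2, 3],
     [2, 4, 6],   # twice the first row
     [1, 0, 1]]
print(rank(A), 3 - rank(A))  # rank 2, nullity 1; together they give n = 3
```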
Lecture notes on linear algebra
Lecture notes on linear algebra David Lerner Department of Mathematics University of Kansas These are notes of a course given in Fall, 2007 and 2008 to the Honors sections of our elementary linear algebra
Similar matrices and Jordan form
Similar matrices and Jordan form We ve nearly covered the entire heart of linear algebra; once we ve finished singular value decompositions we ll have seen all the most central topics. A^T A is positive
Cross product
Cross product 1 Chapter 7 Cross product We are getting ready to study integration in several variables. Until now we have been doing only differential calculus. One outcome of this study will be our ability
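The cross product these notes build toward is the standard component formula for 3-vectors; a quick sketch:

```python
# Cross product of two 3-vectors via the component formula.
def cross(x, y):
    return [x[1] * y[2] - x[2] * y[1],
            x[2] * y[0] - x[0] * y[2],
            x[0] * y[1] - x[1] * y[0]]

print(cross([1, 0, 0], [0, 1, 0]))  # -> [0, 0, 1]
print(cross([0, 1, 0], [1, 0, 0]))  # -> [0, 0, -1] (anticommutative)
```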
Eigenvalues and Eigenvectors
Chapter 6 Eigenvalues and Eigenvectors 6.1 Introduction to Eigenvalues Linear equations Ax = b come from steady state problems. Eigenvalues have their greatest importance in dynamic problems. The solution
Mean value theorem, Taylor's Theorem, Maxima and Minima.
MA 001 Preparatory Mathematics I. Complex numbers as ordered pairs. Argand's diagram. Triangle inequality. De Moivre's Theorem. Algebra: Quadratic equations and expressions. Permutations and Combinations.
3 Orthogonal Vectors and Matrices
3 Orthogonal Vectors and Matrices The linear algebra portion of this course focuses on three matrix factorizations: QR factorization, singular value decomposition (SVD), and LU factorization. The first
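Of the three factorizations named, QR is the one built directly from orthogonal vectors. A minimal sketch (not from these notes) via classical Gram-Schmidt, with the matrix given as a list of columns:

```python
import math

def qr_gram_schmidt(cols):
    """QR of a matrix given as a list of columns, via classical Gram-Schmidt."""
    Q, R = [], [[0.0] * len(cols) for _ in cols]
    for j, v in enumerate(cols):
        w = v[:]
        for i, q in enumerate(Q):
            R[i][j] = sum(a * b for a, b in zip(q, v))   # projection coefficient
            w = [a - R[i][j] * b for a, b in zip(w, q)]  # subtract the projection
        R[j][j] = math.sqrt(sum(a * a for a in w))
        Q.append([a / R[j][j] for a in w])               # normalize
    return Q, R

Q, R = qr_gram_schmidt([[3.0, 4.0], [1.0, 2.0]])
print(Q)  # orthonormal columns
print(R)  # upper triangular
```

In practice Householder reflections are preferred for numerical stability; Gram-Schmidt is shown because it mirrors the orthogonalization idea in the text.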
3. INNER PRODUCT SPACES
3. INNER PRODUCT SPACES 3.1. Definition So far we have studied abstract vector spaces. These are a generalisation of the geometric spaces R^2 and R^3. But these have more structure than just that of a vector space.
Solutions to Review Problems
Chapter 1 Solutions to Review Problems Chapter 1 Exercise 42 Which of the following equations are not linear and why: (a) x_1^2 + 3x_2 − 2x_3 = 5. (b) x_1 + x_1 x_2 + 2x_3 = 1. (c) x_1 + 2/x_2 + x_3 = 5. (a
The Characteristic Polynomial
Physics 116A Winter 2011 The Characteristic Polynomial 1 Coefficients of the characteristic polynomial Consider the eigenvalue problem for an n × n matrix A, Av = λv, v ≠ 0 (1). The solution to this problem
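In the 2 × 2 case the coefficients are especially simple: det(A − λI) = λ^2 − tr(A)·λ + det(A). A quick sketch:

```python
# Coefficients of the characteristic polynomial of a 2x2 matrix:
# det(A - lambda*I) = lambda^2 - tr(A)*lambda + det(A).
def char_poly_2x2(A):
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return (1, -tr, det)  # coefficients of lambda^2, lambda^1, lambda^0

print(char_poly_2x2([[2, 1], [1, 2]]))  # (1, -4, 3) -> eigenvalues 1 and 3
```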
MAT188H1S Lec0101 Burbulla
Winter 2016 Linear Transformations A linear transformation T : R^m -> R^n is a function that takes vectors in R^m to vectors in R^n such that T(u + v) = T(u) + T(v) and T(kv) = kT(v), for all vectors u
[1] Diagonal factorization
18.03 LA.6: Diagonalization and Orthogonal Matrices [1] Diagonal factorization [2] Solving systems of first order differential equations [3] Symmetric and Orthonormal Matrices. [1] Diagonal factorization Recall:
December 4, 2013 MATH 171 BASIC LINEAR ALGEBRA B. KITCHENS
December 4, 2013 MATH 171 BASIC LINEAR ALGEBRA B. KITCHENS 1 Lines in two-dimensional space The equation (1) 2x − y = 3 describes a line in two-dimensional space. The coefficients of x and y in the equation
Au = 3u. Aw = 2w. So the action of A on u and w is very easy to picture: it simply amounts to a stretching by 3 and 2, respectively.
Chapter 7 Eigenvalues and Eigenvectors In this last chapter of our exploration of Linear Algebra we will revisit eigenvalues and eigenvectors of matrices, concepts that were already introduced in Geometry
MATH 110 Spring 2015 Homework 6 Solutions
MATH 110 Spring 2015 Homework 6 Solutions Section 2.6 2.6.4 Let α denote the standard basis for V = R^3. Let α* = {e_1*, e_2*, e_3*} denote the dual basis of α for V*. We would first like to show that β =
Lecture 2 Matrix Operations
Lecture 2 Matrix Operations: transpose, sum & difference, scalar multiplication, matrix multiplication, matrix-vector product, matrix inverse. 2.1 Matrix transpose The transpose of an m × n matrix A, denoted A^T or
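Two of the operations listed, the transpose and the matrix-vector product, can be sketched in a few lines of plain Python (illustrative only, not from the lecture):

```python
# Transpose: swap rows and columns.
def transpose(A):
    return [list(col) for col in zip(*A)]

# Matrix-vector product: each output entry is a row dotted with x.
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 2, 3],
     [4, 5, 6]]
print(transpose(A))          # [[1, 4], [2, 5], [3, 6]]
print(matvec(A, [1, 0, 1]))  # [4, 10]
```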
ISOMETRIES OF R^n KEITH CONRAD
ISOMETRIES OF R^n KEITH CONRAD 1. Introduction An isometry of R^n is a function h: R^n -> R^n that preserves the distance between vectors: ||h(v) − h(w)|| = ||v − w|| for all v and w in R^n, where ||(x_1, ..., x_n)|| =
4: EIGENVALUES, EIGENVECTORS, DIAGONALIZATION
4: EIGENVALUES, EIGENVECTORS, DIAGONALIZATION STEVEN HEILMAN Contents 1. Review 1 2. Diagonal Matrices 1 3. Eigenvectors and Eigenvalues 2 4. Characteristic Polynomial 4 5. Diagonalizability 6 6. Appendix:
3. Let A and B be two n × n orthogonal matrices. Then prove that AB and BA are both orthogonal matrices. Prove a similar result for unitary matrices.
Exercise 1 1. Let A be an n × n orthogonal matrix. Then prove that (a) the rows of A form an orthonormal basis of R^n. (b) the columns of A form an orthonormal basis of R^n. (c) for any two vectors x, y in R^n
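The closure property in the exercise above (products of orthogonal matrices are orthogonal) can be spot-checked numerically with rotation matrices, which are the easiest orthogonal examples. A sketch, not a proof:

```python
import math

def rot(t):
    """2x2 rotation matrix, a standard example of an orthogonal matrix."""
    return [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def is_orthogonal(A, tol=1e-12):
    # A is orthogonal iff A^T A = I.
    At = [list(c) for c in zip(*A)]
    P = matmul(At, A)
    return all(abs(P[i][j] - (1.0 if i == j else 0.0)) < tol
               for i in range(2) for j in range(2))

A, B = rot(0.3), rot(1.1)
print(is_orthogonal(matmul(A, B)), is_orthogonal(matmul(B, A)))  # True True
```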
Matrix Representations of Linear Transformations and Changes of Coordinates
Matrix Representations of Linear Transformations and Changes of Coordinates 0.1 Subspaces and Bases 0.1.1 Definitions A subspace V of R^n is a subset of R^n that contains the zero element and is closed under
Math 225 Problems for Review 1 0. Study your notes and the textbook (Sects. 1.1–1.5, 1.7, 2.1–2.3, 2.6). 1. Bring the augmented matrix [A | b] to a reduced row echelon form and solve two systems of linear
4.1 VECTOR SPACES AND SUBSPACES
4.1 VECTOR SPACES AND SUBSPACES What is a vector space? (pg 229) A vector space is a nonempty set, V, of vectors together with two operations, addition and scalar multiplication, which satisfy the following
Linear Algebra Problems
Math 504 505 Linear Algebra Problems Jerry L. Kazdan Note: New problems are often added to this collection so the problem numbers change. If you want to refer others to these problems by number, it is
MATH 304 Linear Algebra Lecture 24: Scalar product.
MATH 304 Linear Algebra Lecture 24: Scalar product. Vectors: geometric approach A vector is represented by a directed segment. A directed segment is drawn as an arrow. Different arrows represent
Vector and Matrix Norms
Chapter 1 Vector and Matrix Norms 1.1 Vector Spaces Let F be a field (such as the real numbers, R, or complex numbers, C) with elements called scalars. A Vector Space, V, over the field F is a nonempty
Inner products on R^n, and more
Inner products on R^n, and more Peyam Ryan Tabrizian Friday, April 12th, 2013 1 Introduction You might be wondering: Are there inner products on R^n that are not the usual dot product x · y = x_1 y_1 + ... +
Diagonal, Symmetric and Triangular Matrices
Contents 1 Diagonal, Symmetric and Triangular Matrices 2 Diagonal Matrices 2.1 Products, Powers and Inverses of Diagonal Matrices 2.1.1 Theorem (Powers of Matrices) 2.2 Multiplying Matrices on the Left and Right by
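The reason diagonal matrices get their own section on products, powers, and inverses is that all three act entrywise on the diagonal: diag(d)^k = diag(d^k). A quick sketch:

```python
# Powers (and inverses, via k = -1) of a diagonal matrix act entrywise.
def diag_power(d, k):
    """k-th power of diag(d), returned as its diagonal entries."""
    return [x ** k for x in d]

print(diag_power([2, 3, 5], 3))    # [8, 27, 125]
print(diag_power([2.0, 4.0], -1))  # [0.5, 0.25] -- the inverse
```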
4 MT210 Notebook 4. 4.1 Eigenvalues and Eigenvectors. 4.1.1 Definitions; Graphical Illustrations
MT Notebook, Fall, prepared by Professor Jenny Baglivo. © Copyright 9 by Jenny A. Baglivo. All Rights Reserved. Contents: MT Notebook. Eigenvalues and Eigenvectors. Definitions;
1 Introduction. 2 Matrices: Definition. Matrix Algebra. Hervé Abdi Lynne J. Williams
In Neil Salkind (Ed.), Encyclopedia of Research Design. Thousand Oaks, CA: Sage, 2010. Matrix Algebra Hervé Abdi Lynne J. Williams Introduction Sylvester developed the modern concept of matrices in the 19th
Notes on Determinant
ENGG2012B Advanced Engineering Mathematics Notes on Determinant Lecturer: Kenneth Shum Lecture 9, 18/02/2013 The determinant of a system of linear equations determines whether the solution is unique, without
MAT 242 Test 2 SOLUTIONS, FORM T
MAT 242 Test 2 SOLUTIONS, FORM T 1. Let v_1, v_2, v_3, and v_4 be the column vectors displayed. a. [points] The set {v_1, v_2, v_3, v_4} is linearly dependent. Find a nontrivial linear combination of these
Using row reduction to calculate the inverse and the determinant of a square matrix
Using row reduction to calculate the inverse and the determinant of a square matrix Notes for MATH 0290 Honors by Prof. Anna Vainchtein 1 Inverse of a square matrix An n n square matrix A is called invertible
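The inverse computation the notes describe, row-reducing the augmented matrix [A | I] until the left half becomes I, can be sketched as follows (illustrative only; no singularity checks):

```python
# Invert a square matrix by Gauss-Jordan row reduction of [A | I].
# Minimal sketch: assumes A is invertible.
def inverse(A):
    n = len(A)
    # Augment A with the identity matrix.
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting, then scale the pivot row to make the pivot 1.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        # Clear the column above and below the pivot.
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    # The right half is now A^(-1).
    return [row[n:] for row in M]

print(inverse([[2.0, 1.0], [1.0, 1.0]]))  # [[1.0, -1.0], [-1.0, 2.0]]
```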
MATRIX ALGEBRA AND SYSTEMS OF EQUATIONS
MATRIX ALGEBRA AND SYSTEMS OF EQUATIONS Systems of Equations and Matrices Representation of a linear system. The general system of m equations in n unknowns can be written a_11 x_1 + a_12 x_2 + ... + a_1n x_n = b_1, a