Lesson 14 The Gram-Schmidt Process and QR Factorizations
1 Lesson 14: The Gram-Schmidt Process and QR Factorizations. Math 21b, March 12, 2007.
Announcements:
- Get your midterm from me if you haven't yet.
- Homework for March 14: 5.2: 6, 14, 22, 34, 40, 44*
- Problem session: Wednesdays, 7-8 PM in SC 101b
- Office hours: Monday 2-4, Tuesday 3-5 in SC 323 (resuming today)
2-3 The problem: Given a basis $\mathcal{B}$ for a subspace $V$ of $\mathbb{R}^n$, replace it with an orthonormal basis $\mathcal{A}$ having the same span. Why? Because orthonormal bases are just plain better.
4 Why orthonormal bases are just plain better:
- No need to check for linear independence (it is automatic from orthogonality).
- Coordinates are easy, just dot:
$$[\vec v]_{\mathcal{A}} = \begin{bmatrix} \vec v \cdot \vec u_1 \\ \vec v \cdot \vec u_2 \\ \vdots \\ \vec v \cdot \vec u_m \end{bmatrix}$$
- Orthogonal projections are easy, just dot:
$$\operatorname{proj}_V(\vec x) = \vec x^{\parallel} = (\vec x \cdot \vec u_1)\,\vec u_1 + (\vec x \cdot \vec u_2)\,\vec u_2 + \cdots + (\vec x \cdot \vec u_m)\,\vec u_m$$
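A small numerical illustration of both points (not from the slides), using NumPy; the plane $V$ and the vectors here are made up for the example.

```python
import numpy as np

# An orthonormal basis u1, u2 for a plane V in R^3 (vectors chosen for the example).
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([1.0, -1.0, 1.0]) / np.sqrt(3)

x = np.array([2.0, 0.0, 3.0])            # an arbitrary vector

# Coordinates of the projection of x onto V, relative to (u1, u2): just dot products.
coords = np.array([x @ u1, x @ u2])

# Orthogonal projection onto V: sum of (x . u_i) u_i.
proj = coords[0] * u1 + coords[1] * u2

# The leftover part x - proj is perpendicular to V.
print(coords)
print(proj)
print(np.round([(x - proj) @ u1, (x - proj) @ u2], 12))
```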
5-6 The geometric idea: In fact, we will do a little better. Given a basis $\vec v_1, \vec v_2, \dots, \vec v_m$ for a subspace $V$ of $\mathbb{R}^n$, replace it with an orthonormal basis $\vec u_1, \vec u_2, \dots, \vec u_m$ such that
$$\operatorname{span}(\vec v_1, \vec v_2, \dots, \vec v_k) = \operatorname{span}(\vec u_1, \vec u_2, \dots, \vec u_k)$$
for each $k$ between 1 and $m$. The process works like this:
- Scale $\vec v_1$ to make a unit vector $\vec u_1$.
- Choose the rest of the $\vec u_k$ to be the (normalized) perpendicular part of the orthogonal decomposition of $\vec v_k$ relative to $\operatorname{span}(\vec v_1, \vec v_2, \dots, \vec v_{k-1})$.
7-12 The geometric idea, in pictures: [sequence of figures in the plane: $\vec v_1$ and $\vec v_2$; $\vec u_1$ obtained by normalizing $\vec v_1$; $\vec v_2^{\perp}$, the part of $\vec v_2$ perpendicular to $\vec u_1$; and $\vec u_2$ obtained by normalizing $\vec v_2^{\perp}$]
13-21 The geometric idea, in 3D: [sequence of figures repeating the construction with a third vector: $\vec v_3^{\perp}$, the part of $\vec v_3$ perpendicular to $\operatorname{span}(\vec u_1, \vec u_2)$, is normalized to give $\vec u_3$]
22-28 The algebraic idea:
- Scale $\vec v_1$ to make a unit vector $\vec u_1$.
- Take $\vec v_2^{\perp}$ to be the perpendicular part in the orthogonal decomposition of $\vec v_2$ relative to the subspace spanned by $\vec v_1$:
$$\vec v_2^{\perp} = \vec v_2 - \operatorname{proj}_{\vec v_1}\vec v_2 = \vec v_2 - (\vec u_1 \cdot \vec v_2)\,\vec u_1$$
- Scale $\vec v_2^{\perp}$ to make a unit vector $\vec u_2$.
- Take $\vec v_3^{\perp}$ to be the perpendicular part in the orthogonal decomposition of $\vec v_3$ relative to the subspace spanned by $\vec v_1$ and $\vec v_2$:
$$\vec v_3^{\perp} = \vec v_3 - \operatorname{proj}_{\vec v_1, \vec v_2}\vec v_3 = \vec v_3 - (\vec u_1 \cdot \vec v_3)\,\vec u_1 - (\vec u_2 \cdot \vec v_3)\,\vec u_2$$
- Scale $\vec v_3^{\perp}$ to make a unit vector $\vec u_3$, and so on.
This is known as the Gram-Schmidt process for orthonormalization.
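These steps translate directly into a short program. Here is a minimal NumPy sketch of the process (my own addition, not from the lecture); the name `gram_schmidt` is just a convenient label.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors v1, ..., vm.

    Returns u1, ..., um with span(v1,...,vk) = span(u1,...,uk) for every k.
    """
    us = []
    for v in vectors:
        v_perp = np.array(v, dtype=float)
        # Subtract the projection onto the span of the u's found so far.
        for u in us:
            v_perp = v_perp - (u @ v) * u
        # Scale the perpendicular part to a unit vector.
        us.append(v_perp / np.linalg.norm(v_perp))
    return us

u1, u2 = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                       np.array([1.0, 0.0, 1.0])])
print(u1, u2, round(u1 @ u2, 12))   # the dot product should be (numerically) zero
```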
29 Worksheet: Do Problems 1-3.
30 QR Factorizations
31-32 When are the spans of two sets the same? When is $\operatorname{span}(\vec v_1, \vec v_2, \dots, \vec v_m) = \operatorname{span}(\vec u_1, \vec u_2, \dots, \vec u_m)$? If each of the $\vec v_i$'s can be written as a linear combination of the $\vec u_i$'s and vice versa:
$$\begin{aligned}
\vec v_1 &= c_{11}\vec u_1 + c_{21}\vec u_2 + \cdots + c_{m1}\vec u_m \\
\vec v_2 &= c_{12}\vec u_1 + c_{22}\vec u_2 + \cdots + c_{m2}\vec u_m \\
&\;\;\vdots \\
\vec v_m &= c_{1m}\vec u_1 + c_{2m}\vec u_2 + \cdots + c_{mm}\vec u_m
\end{aligned}$$
33-34 Writing it all in terms of matrices, we have
$$\begin{bmatrix} \vec v_1 & \vec v_2 & \cdots & \vec v_m \end{bmatrix}
= \begin{bmatrix} \vec u_1 & \vec u_2 & \cdots & \vec u_m \end{bmatrix}
\underbrace{\begin{bmatrix}
c_{11} & c_{12} & \cdots & c_{1m} \\
c_{21} & c_{22} & \cdots & c_{2m} \\
\vdots & \vdots & & \vdots \\
c_{m1} & c_{m2} & \cdots & c_{mm}
\end{bmatrix}}_{S}$$
The matrix $S$ is the change of basis matrix from $(\vec v_1, \dots, \vec v_m)$ to $(\vec u_1, \dots, \vec u_m)$.
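As a tiny numeric illustration (not from the slides): when the columns of $U$ and $V$ span the same space, $S$ can be found column by column by solving $US = V$. The two bases below are made up for the example.

```python
import numpy as np

V = np.array([[1.0, 3.0],
              [2.0, 4.0]])   # columns are v1, v2
U = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # columns are u1, u2 (another basis of the same space, here R^2)

# Column j of S holds the coefficients writing v_j in terms of the u's, so U S = V.
S = np.linalg.solve(U, V)

print(S)
print(np.allclose(U @ S, V))   # True: [v1 v2] = [u1 u2] S
```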
35 From Gram-Schmidt to QR: Suppose an $n \times m$ matrix $M$ has rank $m$, so its columns $\vec v_1, \dots, \vec v_m$ are linearly independent. Let $\vec u_1, \dots, \vec u_m$ be the orthonormal basis of $\operatorname{span}(\vec v_1, \dots, \vec v_m) = \operatorname{image}(M)$ gotten by the Gram-Schmidt process, and let $Q = \begin{bmatrix} \vec u_1 & \cdots & \vec u_m \end{bmatrix}$. Then $M = QR$ for some invertible matrix $R$.
36-38 What is $R$?
$$\vec u_1 = \frac{\vec v_1}{\|\vec v_1\|} \quad\Longrightarrow\quad \vec v_1 = \|\vec v_1\|\,\vec u_1$$
$$\vec v_2^{\perp} = \vec v_2 - (\vec v_2 \cdot \vec u_1)\,\vec u_1 \quad\text{and}\quad \vec u_2 = \frac{\vec v_2^{\perp}}{\|\vec v_2^{\perp}\|}, \quad\text{so}\quad \vec v_2 = (\vec u_1 \cdot \vec v_2)\,\vec u_1 + \|\vec v_2^{\perp}\|\,\vec u_2$$
And so on:
$$\vec v_k = (\vec u_1 \cdot \vec v_k)\,\vec u_1 + (\vec u_2 \cdot \vec v_k)\,\vec u_2 + \cdots + (\vec u_{k-1} \cdot \vec v_k)\,\vec u_{k-1} + \|\vec v_k^{\perp}\|\,\vec u_k$$
39 So
$$R = \begin{bmatrix}
\|\vec v_1\| & \vec u_1 \cdot \vec v_2 & \vec u_1 \cdot \vec v_3 & \cdots & \vec u_1 \cdot \vec v_m \\
0 & \|\vec v_2^{\perp}\| & \vec u_2 \cdot \vec v_3 & \cdots & \vec u_2 \cdot \vec v_m \\
0 & 0 & \|\vec v_3^{\perp}\| & \cdots & \vec u_3 \cdot \vec v_m \\
\vdots & & & \ddots & \vdots \\
0 & 0 & 0 & \cdots & \|\vec v_m^{\perp}\|
\end{bmatrix}$$
That is, $R$ is upper triangular:
$$r_{ij} = \begin{cases} \vec u_i \cdot \vec v_j & \text{if } i < j \\ \|\vec v_i^{\perp}\| = \vec u_i \cdot \vec v_i & \text{if } i = j \\ 0 & \text{if } i > j \end{cases}$$
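Recording these dot products while Gram-Schmidt runs produces $Q$ and $R$ together. A minimal sketch (my own, not from the slides), applied to a small made-up matrix:

```python
import numpy as np

def qr_by_gram_schmidt(M):
    """QR factorization of a matrix M with linearly independent columns."""
    n, m = M.shape
    Q = np.zeros((n, m))
    R = np.zeros((m, m))
    for j in range(m):
        v_perp = np.array(M[:, j], dtype=float)
        for i in range(j):
            R[i, j] = Q[:, i] @ M[:, j]       # r_ij = u_i . v_j   (i < j)
            v_perp = v_perp - R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v_perp)      # r_jj = ||v_j perp||
        Q[:, j] = v_perp / R[j, j]
    return Q, R

M = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = qr_by_gram_schmidt(M)
print(np.round(R, 4))                   # upper triangular
print(np.allclose(Q @ R, M))            # True
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: orthonormal columns
```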
40-44 Example: Find the QR factorization of $M = \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix}$.
Solution: We have $\vec v_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$ and $r_{11} = \|\vec v_1\| = \sqrt{2}$, so $\vec u_1 = \begin{bmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \end{bmatrix}$. Also,
$$\vec v_2^{\perp} = \vec v_2 - (\underbrace{\vec u_1 \cdot \vec v_2}_{r_{12}})\,\vec u_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix} - \frac{1}{\sqrt{2}}\begin{bmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \end{bmatrix} = \begin{bmatrix} 1/2 \\ -1/2 \end{bmatrix}$$
Therefore $r_{22} = \|\vec v_2^{\perp}\| = \frac{1}{\sqrt{2}}$ and $\vec u_2 = \frac{1}{r_{22}}\vec v_2^{\perp} = \begin{bmatrix} 1/\sqrt{2} \\ -1/\sqrt{2} \end{bmatrix}$. Putting this all together,
$$M = \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix} = \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{bmatrix}\begin{bmatrix} \sqrt{2} & 1/\sqrt{2} \\ 0 & 1/\sqrt{2} \end{bmatrix} = QR$$
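A quick numeric check of this factorization (my own addition; it assumes the matrices above were read correctly from the slides):

```python
import numpy as np

s2 = np.sqrt(2)
M = np.array([[1.0, 1.0],
              [1.0, 0.0]])
Q = np.array([[1/s2,  1/s2],
              [1/s2, -1/s2]])
R = np.array([[ s2, 1/s2],
              [0.0, 1/s2]])

print(np.allclose(Q @ R, M))             # True: M = QR
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: Q has orthonormal columns
```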
45-46 Example (Worksheet, #4): Find the QR factorization of the worksheet matrix. Solution: $Q$ has orthonormal columns with entries $\pm 1/2$, and $R$ is upper triangular. [The matrices themselves did not survive transcription.]
47 Who cares? A lot of important things are easier to calculate once you have the QR factorization, especially if $M$ is square.
- It turns out $Q^{-1} = Q^T$, so $M^{-1} = R^{-1}Q^T$, and $R^{-1}$ is easier to calculate than $M^{-1}$.
- It turns out $\det Q = \pm 1$, so $\det M = \pm \det R$, and $\det R$ is very easy to calculate (more later).
- The eigenvalues of $M$ are easier to calculate when $M$ is factored as $QR$.
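To make the first two points concrete, here is a short sketch (not from the lecture) on a made-up matrix: since $Q^{-1} = Q^T$ and $R$ is triangular, solving $M\vec x = \vec b$ only needs a back-substitution, and $\det M$ is $\pm$ the product of $R$'s diagonal entries.

```python
import numpy as np

def back_substitute(R, c):
    """Solve R x = c for an upper triangular, invertible R."""
    m = len(c)
    x = np.zeros(m)
    for i in range(m - 1, -1, -1):
        x[i] = (c[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

Q, R = np.linalg.qr(M)   # library QR (Householder-based, signs may differ from Gram-Schmidt)

# M x = b  <=>  Q R x = b  <=>  R x = Q^T b, then back-substitute.
x = back_substitute(R, Q.T @ b)
print(np.allclose(M @ x, b))    # True

# det M = det(Q) det(R) = (+-1) * product of R's diagonal entries.
print(np.prod(np.diag(R)) * np.linalg.det(Q), np.linalg.det(M))
```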