Lesson 14: The Gram-Schmidt Process and QR Factorizations


Math 21b, March 12, 2007.

Announcements:
- Get your midterm from me if you haven't yet.
- Homework for March 14: Section 5.2, problems 6, 14, 22, 34, 40, 44*.
- Problem session: Wednesdays, 7-8 PM in SC 101b.
- Office hours: Monday 2-4 and Tuesday 3-5 in SC 323 (resuming today).
The problem. Given a basis $\mathcal{B}$ for a subspace $V$ of $\mathbb{R}^n$, replace it with an orthonormal basis $\mathcal{A}$ having the same span. Why? Because orthonormal bases are just plain better.
Why orthonormal bases are just plain better:
- No need to check for linear independence (automatic from orthogonality).
- Coordinates are easy, just dot:
$$[\vec{v}]_{\mathcal{A}} = \begin{bmatrix} \vec{v} \cdot \vec{u}_1 \\ \vec{v} \cdot \vec{u}_2 \\ \vdots \\ \vec{v} \cdot \vec{u}_m \end{bmatrix}$$
- Orthogonal projections are easy, just dot:
$$\operatorname{proj}_V(\vec{x}) = \vec{x}^{\parallel} = (\vec{x} \cdot \vec{u}_1)\,\vec{u}_1 + (\vec{x} \cdot \vec{u}_2)\,\vec{u}_2 + \cdots + (\vec{x} \cdot \vec{u}_m)\,\vec{u}_m$$
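To see both facts in action, here is a small numerical check (a sketch with vectors of my own choosing, not from the lecture), written in Python with NumPy:

```python
import numpy as np

# An orthonormal basis (u1, u2) of a 2-dimensional subspace V of R^3
# (the vectors are an example choice).
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([0.0, 0.0, 1.0])

x = np.array([3.0, 1.0, 4.0])

# Coordinates relative to the orthonormal basis: just dot products.
coords = np.array([x @ u1, x @ u2])

# Orthogonal projection onto V: sum of (x . u_i) u_i.
proj = (x @ u1) * u1 + (x @ u2) * u2

print(coords)  # [2.8284..., 4.0]
print(proj)    # [2. 2. 4.]
```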
The geometric idea. In fact, we will do a little better: given a basis $\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_m$ for a subspace $V$ of $\mathbb{R}^n$, replace it with an orthonormal basis $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_m$ such that
$$\operatorname{span}(\vec{v}_1, \ldots, \vec{v}_k) = \operatorname{span}(\vec{u}_1, \ldots, \vec{u}_k)$$
for each $k$ between $1$ and $m$.

The process works like this: scale $\vec{v}_1$ to make a unit vector $\vec{u}_1$. Choose the rest of the $\vec{u}_k$ to be the (normalized) perpendicular part of the orthogonal decomposition of $\vec{v}_k$ relative to $\operatorname{span}(\vec{v}_1, \ldots, \vec{v}_{k-1})$.
[Figures: the geometric idea, in pictures. A sequence of 2D diagrams: $\vec{v}_1$ is scaled to the unit vector $\vec{u}_1$; then $\vec{v}_2$ is decomposed into its projection onto $\vec{u}_1$ plus a perpendicular part $\vec{v}_2^{\perp}$, which is scaled to the unit vector $\vec{u}_2$.]
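Numerically, one step of the 2D picture looks like this (an illustrative sketch; the vectors are made up for the example):

```python
import numpy as np

v1 = np.array([3.0, 1.0])
v2 = np.array([1.0, 2.0])

u1 = v1 / np.linalg.norm(v1)           # scale v1 to a unit vector

# Perpendicular part of v2 relative to span(v1), then normalize it.
v2_perp = v2 - (u1 @ v2) * u1
u2 = v2_perp / np.linalg.norm(v2_perp)

print(u1 @ u2)                                  # ~0: u1 and u2 are orthogonal
print(np.linalg.norm(u1), np.linalg.norm(u2))   # 1.0 1.0
```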
[Figures: the geometric idea, in 3D. The same construction one dimension up: $\vec{v}_1$ is scaled to $\vec{u}_1$; the perpendicular part $\vec{v}_2^{\perp}$ of $\vec{v}_2$ is scaled to $\vec{u}_2$; finally $\vec{v}_3$ is decomposed relative to the plane spanned by $\vec{u}_1$ and $\vec{u}_2$, and its perpendicular part $\vec{v}_3^{\perp}$ is scaled to $\vec{u}_3$.]
The algebraic idea:
- Scale $\vec{v}_1$ to make a unit vector $\vec{u}_1$.
- Take $\vec{v}_2^{\perp}$ to be the perpendicular part in the orthogonal decomposition of $\vec{v}_2$ relative to the subspace spanned by $\vec{v}_1$:
$$\vec{v}_2^{\perp} = \vec{v}_2 - \operatorname{proj}_{\vec{v}_1} \vec{v}_2 = \vec{v}_2 - (\vec{u}_1 \cdot \vec{v}_2)\,\vec{u}_1$$
- Scale $\vec{v}_2^{\perp}$ to make a unit vector $\vec{u}_2$.
- Take $\vec{v}_3^{\perp}$ to be the perpendicular part in the orthogonal decomposition of $\vec{v}_3$ relative to the subspace spanned by $\vec{v}_1$ and $\vec{v}_2$:
$$\vec{v}_3^{\perp} = \vec{v}_3 - \operatorname{proj}_{\vec{v}_1, \vec{v}_2} \vec{v}_3 = \vec{v}_3 - (\vec{u}_1 \cdot \vec{v}_3)\,\vec{u}_1 - (\vec{u}_2 \cdot \vec{v}_3)\,\vec{u}_2$$
- Scale $\vec{v}_3^{\perp}$ to make a unit vector $\vec{u}_3$, and continue in this way.

This is known as the Gram-Schmidt process for orthonormalization.
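The whole process fits in a short loop. Below is a minimal NumPy sketch; `gram_schmidt` is a name introduced here for illustration, and the input vectors are assumed to be linearly independent so that no normalization divides by zero.

```python
import numpy as np

def gram_schmidt(vs):
    """Orthonormalize a list of linearly independent vectors (a sketch).

    Returns u_1, ..., u_m with span(v_1..v_k) = span(u_1..u_k) for each k.
    """
    us = []
    for v in vs:
        # Subtract the projection onto the span of the previous u's ...
        v_perp = v - sum((u @ v) * u for u in us)
        # ... and scale the perpendicular part to a unit vector.
        us.append(v_perp / np.linalg.norm(v_perp))
    return us

vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]
for u in gram_schmidt(vs):
    print(u)
```

In practice one would use a numerically more stable variant (modified Gram-Schmidt, or Householder reflections as in library QR routines), but the idea is exactly this loop.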
Worksheet: do Problems 1-3.
QR Factorizations
When are the spans of two sets the same? When is $\operatorname{span}(\vec{v}_1, \ldots, \vec{v}_m) = \operatorname{span}(\vec{u}_1, \ldots, \vec{u}_m)$? When each of the $\vec{v}_i$ can be written as a linear combination of the $\vec{u}_i$, and vice versa:
$$\begin{aligned}
\vec{v}_1 &= c_{11}\vec{u}_1 + c_{21}\vec{u}_2 + \cdots + c_{m1}\vec{u}_m \\
\vec{v}_2 &= c_{12}\vec{u}_1 + c_{22}\vec{u}_2 + \cdots + c_{m2}\vec{u}_m \\
&\;\;\vdots \\
\vec{v}_m &= c_{1m}\vec{u}_1 + c_{2m}\vec{u}_2 + \cdots + c_{mm}\vec{u}_m
\end{aligned}$$
Writing it all in terms of matrices, we have
$$\begin{bmatrix} \vec{v}_1 & \vec{v}_2 & \cdots & \vec{v}_m \end{bmatrix} = \begin{bmatrix} \vec{u}_1 & \vec{u}_2 & \cdots & \vec{u}_m \end{bmatrix} \underbrace{\begin{bmatrix} c_{11} & c_{12} & \cdots & c_{1m} \\ c_{21} & c_{22} & \cdots & c_{2m} \\ \vdots & \vdots & & \vdots \\ c_{m1} & c_{m2} & \cdots & c_{mm} \end{bmatrix}}_{S}$$
The matrix $S$ is the change of basis matrix from $(\vec{v}_1, \ldots, \vec{v}_m)$ to $(\vec{u}_1, \ldots, \vec{u}_m)$.
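As a hedged numerical aside (my own small example, not from the slides): if the columns of matrices $U$ and $V$ are two bases of the same subspace, the change of basis matrix $S$ satisfying $V = US$ can be recovered column by column, for instance by least squares.

```python
import numpy as np

# Two bases of the same plane in R^3 (an example choice).
U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
V = np.array([[1.0, 2.0],
              [1.0, 1.0],
              [2.0, 3.0]])

# Solve U S = V; since each column of V lies in the column space of U,
# least squares recovers S exactly (up to roundoff).
S, *_ = np.linalg.lstsq(U, V, rcond=None)
print(S)                      # [[1. 2.] [1. 1.]]
print(np.allclose(U @ S, V))  # True
```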
From Gram-Schmidt to QR. Suppose an $n \times m$ matrix $M$ has rank $m$, so its columns $\vec{v}_1, \ldots, \vec{v}_m$ are linearly independent. Let $\vec{u}_1, \ldots, \vec{u}_m$ be the orthonormal basis of $\operatorname{span}(\vec{v}_1, \ldots, \vec{v}_m) = \operatorname{image}(M)$ gotten by the Gram-Schmidt process, and let $Q = \begin{bmatrix} \vec{u}_1 & \cdots & \vec{u}_m \end{bmatrix}$. Then $M = QR$ for some invertible matrix $R$.
What is $R$?
$$\vec{u}_1 = \frac{\vec{v}_1}{\|\vec{v}_1\|} \implies \vec{v}_1 = \|\vec{v}_1\|\,\vec{u}_1$$
$$\vec{v}_2^{\perp} = \vec{v}_2 - (\vec{v}_2 \cdot \vec{u}_1)\,\vec{u}_1 \quad\text{and}\quad \vec{u}_2 = \frac{\vec{v}_2^{\perp}}{\|\vec{v}_2^{\perp}\|}, \quad\text{so}\quad \vec{v}_2 = (\vec{u}_1 \cdot \vec{v}_2)\,\vec{u}_1 + \|\vec{v}_2^{\perp}\|\,\vec{u}_2$$
And so on:
$$\vec{v}_k = (\vec{u}_1 \cdot \vec{v}_k)\,\vec{u}_1 + (\vec{u}_2 \cdot \vec{v}_k)\,\vec{u}_2 + \cdots + (\vec{u}_{k-1} \cdot \vec{v}_k)\,\vec{u}_{k-1} + \|\vec{v}_k^{\perp}\|\,\vec{u}_k$$
So
$$R = \begin{bmatrix}
\|\vec{v}_1\| & \vec{u}_1 \cdot \vec{v}_2 & \vec{u}_1 \cdot \vec{v}_3 & \cdots & \vec{u}_1 \cdot \vec{v}_m \\
0 & \|\vec{v}_2^{\perp}\| & \vec{u}_2 \cdot \vec{v}_3 & \cdots & \vec{u}_2 \cdot \vec{v}_m \\
0 & 0 & \|\vec{v}_3^{\perp}\| & \cdots & \vec{u}_3 \cdot \vec{v}_m \\
\vdots & & & \ddots & \vdots \\
0 & 0 & 0 & \cdots & \|\vec{v}_m^{\perp}\|
\end{bmatrix}$$
That is, $R$ is upper triangular, with entries
$$r_{ij} = \begin{cases} \vec{u}_i \cdot \vec{v}_j & \text{if } i < j \\ \|\vec{v}_i^{\perp}\| = \vec{u}_i \cdot \vec{v}_i & \text{if } i = j \\ 0 & \text{if } i > j \end{cases}$$
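Putting the last few slides together in code (a sketch that re-uses the hypothetical `gram_schmidt` helper from above on a small example matrix of my own): assembling $Q$ from the Gram-Schmidt vectors and computing $R = Q^T M$ gives an upper triangular $R$ with $M = QR$.

```python
import numpy as np

def gram_schmidt(vs):
    # Same sketch as above: orthonormalize linearly independent vectors.
    us = []
    for v in vs:
        w = v - sum((u @ v) * u for u in us)
        us.append(w / np.linalg.norm(w))
    return us

M = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

Q = np.column_stack(gram_schmidt(list(M.T)))  # columns u_1, u_2, u_3

# Because u_i . v_j = 0 whenever i > j, R = Q^T M is upper triangular,
# with r_ij = u_i . v_j above the diagonal and r_ii = ||v_i_perp||.
R = Q.T @ M
print(np.round(R, 10))         # zeros below the diagonal
print(np.allclose(Q @ R, M))   # True: M = QR
```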
Example. Find the QR factorization of
$$M = \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix}.$$

Solution. We have $\vec{v}_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$ and $r_{11} = \|\vec{v}_1\| = \sqrt{2}$, so $\vec{u}_1 = \begin{bmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \end{bmatrix}$. Also,
$$\vec{v}_2^{\perp} = \vec{v}_2 - \underbrace{(\vec{u}_1 \cdot \vec{v}_2)}_{r_{12}}\,\vec{u}_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix} - \frac{1}{\sqrt{2}}\begin{bmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \end{bmatrix} = \begin{bmatrix} 1/2 \\ -1/2 \end{bmatrix}.$$
Therefore
$$r_{22} = \|\vec{v}_2^{\perp}\| = \frac{1}{\sqrt{2}}, \qquad \vec{u}_2 = \frac{1}{r_{22}}\,\vec{v}_2^{\perp} = \sqrt{2}\begin{bmatrix} 1/2 \\ -1/2 \end{bmatrix} = \begin{bmatrix} 1/\sqrt{2} \\ -1/\sqrt{2} \end{bmatrix}.$$
Putting this all together,
$$M = \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix} = \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{bmatrix}\begin{bmatrix} \sqrt{2} & 1/\sqrt{2} \\ 0 & 1/\sqrt{2} \end{bmatrix} = QR.$$
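A quick numerical confirmation of this worked example (not part of the original slides); note that library routines such as `numpy.linalg.qr` may choose different sign conventions for $Q$ and $R$, so the check below compares the hand-computed factors directly:

```python
import numpy as np

M = np.array([[1.0, 1.0],
              [1.0, 0.0]])

s = np.sqrt(2)
Q = np.array([[1/s,  1/s],
              [1/s, -1/s]])
R = np.array([[s, 1/s],
              [0, 1/s]])

print(np.allclose(Q @ R, M))             # True: M = QR
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: Q has orthonormal columns
```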
Example (Worksheet, #4). Find the QR factorization of the matrix from Worksheet Problem 4.

Solution. [Slide shows the resulting matrices: $Q$, whose entries are all $\pm 1/2$, and the corresponding upper triangular $R$.]
Who cares? A lot of important things are easier to calculate once you have the QR factorization, especially if $M$ is square:
- It turns out $Q^{-1} = Q^T$, so $M^{-1} = R^{-1}Q^T$, and $R^{-1}$ is easier to calculate than $M^{-1}$.
- It turns out $\det Q = \pm 1$, so $\det M = \pm \det R$, and $\det R$ is very easy to calculate (more later).
- The eigenvalues of $M$ are easier to calculate when $M$ is factored as $QR$.
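To make these uses concrete, here is a hedged NumPy sketch on a small example matrix of my own: solving a square system via $R\vec{x} = Q^T\vec{b}$ and reading off $|\det M|$ from the diagonal of $R$.

```python
import numpy as np

M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

Q, R = np.linalg.qr(M)          # M = QR, Q orthogonal, R upper triangular

# Solve M x = b as R x = Q^T b; since R is triangular this is just
# back-substitution (np.linalg.solve is used here for brevity).
x = np.linalg.solve(R, Q.T @ b)
print(np.allclose(M @ x, b))    # True

# |det M| = |det R| = |product of R's diagonal entries|, since |det Q| = 1.
print(abs(np.prod(np.diag(R))), abs(np.linalg.det(M)))  # equal up to roundoff
```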