Inverses
Stephen Boyd
EE103, Stanford University
October 27, 2015


Outline
- Left and right inverses
- Inverse
- Solving linear equations
- Examples
- Pseudo-inverse

Left and right inverses

Left inverses
- a number $x$ that satisfies $xa = 1$ is called the inverse of $a$
- the inverse (i.e., $1/a$) exists if and only if $a \neq 0$, and is unique
- a matrix $X$ that satisfies $XA = I$ is called a left inverse of $A$ (and we say that $A$ is left-invertible)
- example: the matrix
$$A = \begin{bmatrix} -3 & -4 \\ 4 & 6 \\ 1 & 1 \end{bmatrix}$$
has two different left inverses:
$$B = \frac{1}{9}\begin{bmatrix} -11 & -10 & 16 \\ 7 & 8 & -11 \end{bmatrix}, \qquad C = \frac{1}{2}\begin{bmatrix} 0 & -1 & 6 \\ 0 & 1 & -4 \end{bmatrix}$$
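A quick numerical check of this example, sketched in Python with NumPy (the matrices are exactly those above):

```python
import numpy as np

A = np.array([[-3.0, -4.0],
              [ 4.0,  6.0],
              [ 1.0,  1.0]])
B = (1 / 9) * np.array([[-11.0, -10.0,  16.0],
                        [  7.0,   8.0, -11.0]])
C = (1 / 2) * np.array([[0.0, -1.0,  6.0],
                        [0.0,  1.0, -4.0]])

print(np.allclose(B @ A, np.eye(2)))  # True: B is a left inverse of A
print(np.allclose(C @ A, np.eye(2)))  # True: C is another left inverse
```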

Left inverse and column independence
- if $A$ has a left inverse $C$, then the columns of $A$ are linearly independent
- to see this: if $Ax = 0$ and $CA = I$, then $0 = C0 = C(Ax) = (CA)x = Ix = x$
- we'll see later that the converse is also true, so a matrix is left-invertible if and only if its columns are linearly independent
- hence left-invertible matrices are tall or square

Solving linear equations with a left inverse
- suppose $Ax = b$, and $A$ has a left inverse $C$
- then $Cb = C(Ax) = (CA)x = Ix = x$
- so multiplying the right-hand side by a left inverse yields the solution
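A minimal sketch of this, reusing $A$ and its left inverse $C$ from the example above; the right-hand side is constructed here so that $Ax = b$ is solvable:

```python
import numpy as np

A = np.array([[-3.0, -4.0], [4.0, 6.0], [1.0, 1.0]])
C = (1 / 2) * np.array([[0.0, -1.0, 6.0], [0.0, 1.0, -4.0]])

x_true = np.array([1.0, 2.0])
b = A @ x_true                  # b chosen so that Ax = b has a solution

x = C @ b                       # multiply the right-hand side by a left inverse
print(np.allclose(x, x_true))   # True
```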

Right inverses
- a matrix $X$ that satisfies $AX = I$ is a right inverse of $A$ (and we say that $A$ is right-invertible)
- $A$ is right-invertible if and only if $A^T$ is left-invertible:
$$AX = I \iff (AX)^T = I \iff X^T A^T = I$$
- so we conclude that $A$ is right-invertible if and only if its rows are linearly independent
- right-invertible matrices are wide or square

Solving linear equations with a right inverse
- suppose $A$ has a right inverse $B$
- consider the (square or underdetermined) equations $Ax = b$
- $x = Bb$ is a solution: $Ax = A(Bb) = (AB)b = Ib = b$
- so $Ax = b$ has a solution for any $b$
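A small illustration with NumPy: since $BA = I$ in the earlier example, transposing gives $A^T B^T = I$, so $B^T$ is a right inverse of the wide matrix $A^T$:

```python
import numpy as np

A = np.array([[-3.0, -4.0], [4.0, 6.0], [1.0, 1.0]])
B = (1 / 9) * np.array([[-11.0, -10.0, 16.0], [7.0, 8.0, -11.0]])

A_wide = A.T                     # 2x3, right-invertible
R = B.T                          # 3x2 right inverse of A_wide

b = np.array([0.3, -1.2])        # any right-hand side works
x = R @ b                        # x = Bb is a solution
print(np.allclose(A_wide @ x, b))  # True: Ax = b holds
```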

Inverse

Inverse
- if $A$ has both a left and a right inverse, they are unique and equal (and we say that $A$ is invertible), so $A$ must be square
- to see this: if $AX = I$ and $YA = I$, then $X = (YA)X = Y(AX) = Y$
- we denote them by $A^{-1}$: $A^{-1}A = AA^{-1} = I$
- inverse of inverse: $(A^{-1})^{-1} = A$

Solving square systems of linear equations
- suppose $A$ is invertible
- for any $b$, $Ax = b$ has the unique solution $x = A^{-1}b$
- matrix generalization of the simple scalar equation $ax = b$, which has solution $x = (1/a)b$ (for $a \neq 0$)
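A short sketch on a made-up matrix; in code one would normally call a solver rather than form $A^{-1}$ explicitly, but both give the same $x$ here:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 5.0]])   # invertible (det = -1)
b = np.array([1.0, 2.0])

x1 = np.linalg.inv(A) @ b    # x = A^{-1} b, as on the slide
x2 = np.linalg.solve(A, b)   # preferred in practice
print(np.allclose(x1, x2), np.allclose(A @ x1, b))  # True True
```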

Invertible matrices
the following are equivalent for a square matrix $A$:
- $A$ is invertible
- the columns of $A$ are linearly independent
- the rows of $A$ are linearly independent
- $A$ has a left inverse
- $A$ has a right inverse
if any of these holds, all the others do

Examples
- $I^{-1} = I$
- if $Q$ is orthogonal, i.e., $Q^T Q = I$, then $Q^{-1} = Q^T$
- a $2 \times 2$ matrix $A$ is invertible if and only if $A_{11}A_{22} \neq A_{12}A_{21}$, in which case
$$A^{-1} = \frac{1}{A_{11}A_{22} - A_{12}A_{21}} \begin{bmatrix} A_{22} & -A_{12} \\ -A_{21} & A_{11} \end{bmatrix}$$
- you need to know this formula
- there are similar but much more complicated formulas for larger matrices (and no, you do not need to know them)
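The formula is easy to check numerically; a sketch on a made-up $2 \times 2$ matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
assert det != 0  # invertible exactly when A11*A22 != A12*A21

A_inv = (1 / det) * np.array([[ A[1, 1], -A[0, 1]],
                              [-A[1, 0],  A[0, 0]]])
print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```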

Properties
- $(AB)^{-1} = B^{-1}A^{-1}$ (provided the inverses exist)
- $(A^T)^{-1} = (A^{-1})^T$ (sometimes denoted $A^{-T}$)
- negative matrix powers: $(A^{-1})^k$ is denoted $A^{-k}$
- with $A^0 = I$, the identity $A^k A^l = A^{k+l}$ holds for any integers $k$, $l$
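These identities can be spot-checked on random matrices (square random Gaussian matrices are invertible with probability one); a NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

inv = np.linalg.inv
print(np.allclose(inv(A @ B), inv(B) @ inv(A)))  # (AB)^{-1} = B^{-1} A^{-1}
print(np.allclose(inv(A.T), inv(A).T))           # (A^T)^{-1} = (A^{-1})^T
print(np.allclose(inv(A) @ inv(A), np.linalg.matrix_power(A, -2)))  # A^{-2}
```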

Triangular matrices
- a lower triangular matrix $L$ with nonzero diagonal entries is invertible
- to see this, write $Lx = 0$ as
$$\begin{aligned} L_{11}x_1 &= 0 \\ L_{21}x_1 + L_{22}x_2 &= 0 \\ &\ \vdots \\ L_{n1}x_1 + L_{n2}x_2 + \cdots + L_{n,n-1}x_{n-1} + L_{nn}x_n &= 0 \end{aligned}$$
- from the first equation, $x_1 = 0$ (since $L_{11} \neq 0$)
- the second equation then reduces to $L_{22}x_2 = 0$, so $x_2 = 0$ (since $L_{22} \neq 0$)
- and so on
- similarly, an upper triangular matrix $R$ with nonzero diagonal entries is invertible
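The same top-to-bottom sweep used in this argument also solves $Lx = b$; a minimal forward-substitution sketch, assuming NumPy (the helper name forward_substitution is ours):

```python
import numpy as np

def forward_substitution(L, b):
    """Solve Lx = b for lower triangular L with nonzero diagonal."""
    n = L.shape[0]
    x = np.zeros(n)
    for i in range(n):
        # subtract the already-known terms, then divide by the diagonal entry
        x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
    return x

L = np.array([[2.0, 0.0], [3.0, 4.0]])
b = np.array([2.0, 10.0])
print(forward_substitution(L, b))  # [1.   1.75]
```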

Solving linear equations

Back substitution
- suppose $R$ is upper triangular with nonzero diagonal entries
- write out $Rx = b$ as
$$\begin{aligned} R_{11}x_1 + R_{12}x_2 + \cdots + R_{1,n-1}x_{n-1} + R_{1n}x_n &= b_1 \\ &\ \vdots \\ R_{n-1,n-1}x_{n-1} + R_{n-1,n}x_n &= b_{n-1} \\ R_{nn}x_n &= b_n \end{aligned}$$
- from the last equation we get $x_n = b_n / R_{nn}$
- from the second-to-last equation we get $x_{n-1} = (b_{n-1} - R_{n-1,n}x_n)/R_{n-1,n-1}$
- continue to get $x_{n-2}, x_{n-3}, \ldots, x_1$

Back substitution
- called back substitution since we find the variables in reverse order, substituting the already known values of $x_i$
- computes $x = R^{-1}b$
- complexity: the first step requires 1 flop (the division); the 2nd step needs 3 flops; the $i$th step needs $2i - 1$ flops
- total is $1 + 3 + \cdots + (2n - 1) = n^2$ flops
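A direct implementation of this algorithm; a sketch in NumPy (the helper name back_substitution is ours, and it assumes $R$ is square upper triangular with nonzero diagonal):

```python
import numpy as np

def back_substitution(R, b):
    """Solve Rx = b for upper triangular R with nonzero diagonal."""
    n = R.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):   # i = n-1, ..., 0: reverse order
        x[i] = (b[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

R = np.array([[1.0, 2.0], [0.0, 3.0]])
b = np.array([5.0, 6.0])
x = back_substitution(R, b)
print(np.allclose(R @ x, b))  # True
```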

Solving linear equations via QR factorization
- assuming $A$ is invertible, let's solve $Ax = b$, i.e., compute $x = A^{-1}b$
- the columns of $A$ are linearly independent, so its QR factorization $A = QR$ exists
- $Q$ is orthogonal, so $Q^{-1} = Q^T$
- $R$ is upper triangular with positive diagonal entries, hence invertible
- so $A^{-1} = (QR)^{-1} = R^{-1}Q^{-1} = R^{-1}Q^T$
- compute $x = R^{-1}(Q^T b)$ by back substitution

Solving linear equations via QR factorization
given an $n \times n$ invertible matrix $A$ and an $n$-vector $b$:
1. QR factorization. Compute the QR factorization $A = QR$.
2. Compute $Q^T b$.
3. Back substitution. Solve the triangular equation $Rx = Q^T b$ using back substitution.

complexity: $2n^3$ (step 1) $+\ 2n^2$ (step 2) $+\ n^2$ (step 3); total is $2n^3 + 3n^2 \approx 2n^3$ flops
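The three steps translate directly into code; a sketch reusing the back_substitution helper defined above (np.linalg.qr may return an $R$ with negative diagonal entries, which back substitution handles just as well):

```python
import numpy as np

def solve_via_qr(A, b):
    """Solve Ax = b for invertible A via QR factorization."""
    Q, R = np.linalg.qr(A)           # step 1: A = QR
    y = Q.T @ b                      # step 2: compute Q^T b
    return back_substitution(R, y)   # step 3: solve Rx = Q^T b

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))      # invertible with probability one
b = rng.standard_normal(5)
x = solve_via_qr(A, b)
print(np.allclose(A @ x, b))         # True
```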

Multiple right-hand sides
- let's solve $Ax_i = b_i$, $i = 1, \ldots, k$, with $A$ invertible
- carry out the QR factorization once ($2n^3$ flops)
- for $i = 1, \ldots, k$, solve $Rx_i = Q^T b_i$ via back substitution ($3kn^2$ flops in total)
- total is $2n^3 + 3kn^2$ flops
- if $k$ is small compared to $n$, this is essentially the same cost as solving one set of equations
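A sketch of the factor-once, solve-many pattern, again reusing the back_substitution helper:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 5, 3
A = rng.standard_normal((n, n))
bs = [rng.standard_normal(n) for _ in range(k)]

Q, R = np.linalg.qr(A)                               # 2n^3 flops, done once
xs = [back_substitution(R, Q.T @ b) for b in bs]     # about 3n^2 flops each
print(all(np.allclose(A @ x, b) for x, b in zip(xs, bs)))  # True
```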

Examples

Polynomial interpolation
- let's find the coefficients of a cubic polynomial
$$p(x) = c_1 + c_2 x + c_3 x^2 + c_4 x^3$$
that satisfies
$$p(-1.1) = b_1, \quad p(-0.4) = b_2, \quad p(0.1) = b_3, \quad p(0.8) = b_4$$
- write as $Ac = b$, with
$$A = \begin{bmatrix} 1 & -1.1 & (-1.1)^2 & (-1.1)^3 \\ 1 & -0.4 & (-0.4)^2 & (-0.4)^3 \\ 1 & 0.1 & (0.1)^2 & (0.1)^3 \\ 1 & 0.8 & (0.8)^2 & (0.8)^3 \end{bmatrix}$$

Polynomial interpolation
- the (unique) coefficients are given by $c = A^{-1}b$, with
$$A^{-1} = \begin{bmatrix} -0.0201 & 0.2095 & 0.8381 & -0.0276 \\ 0.1754 & -2.1667 & 1.8095 & 0.1817 \\ 0.3133 & 0.4762 & -1.6667 & 0.8772 \\ -0.6266 & 2.381 & -2.381 & 0.6266 \end{bmatrix}$$
- so, e.g., $c_1$ is not very sensitive to $b_1$ or $b_4$
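A sketch reproducing this example numerically (the right-hand side $b$ here is made up):

```python
import numpy as np

points = np.array([-1.1, -0.4, 0.1, 0.8])
A = np.vander(points, 4, increasing=True)    # columns 1, x, x^2, x^3

A_inv = np.linalg.inv(A)
print(np.round(A_inv, 4))                    # matches the matrix above

b = np.array([1.0, 2.0, 0.5, -1.0])          # made-up interpolation values
c = A_inv @ b                                # cubic coefficients
print(np.allclose(A @ c, b))                 # True: p(x_i) = b_i holds
```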

Pseudo-inverse

Invertibility of Gram matrix
- $A$ has linearly independent columns if and only if $A^T A$ is invertible
- to see this, we'll show that $Ax = 0 \iff A^T Ax = 0$
- $\Rightarrow$: if $Ax = 0$, then $(A^T A)x = A^T(Ax) = A^T 0 = 0$
- $\Leftarrow$: if $(A^T A)x = 0$, then $0 = x^T(A^T A)x = (Ax)^T(Ax) = \|Ax\|^2$, so $Ax = 0$
- so the square matrix $A^T A$ has a zero nullspace, i.e., is invertible, exactly when the columns of $A$ are linearly independent

Pseudo-inverse of tall matrix
- the pseudo-inverse of $A$ with linearly independent columns is
$$A^\dagger = (A^T A)^{-1} A^T$$
- it is a left inverse of $A$: $A^\dagger A = (A^T A)^{-1}A^T A = (A^T A)^{-1}(A^T A) = I$
- (we'll soon see that it's a very important left inverse of $A$)
- reduces to $A^{-1}$ when $A$ is square:
$$A^\dagger = (A^T A)^{-1}A^T = A^{-1}A^{-T}A^T = A^{-1}I = A^{-1}$$
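A numerical sketch, comparing the formula against NumPy's built-in pseudo-inverse:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 3))               # tall, independent columns

A_dag = np.linalg.inv(A.T @ A) @ A.T          # (A^T A)^{-1} A^T
print(np.allclose(A_dag @ A, np.eye(3)))      # True: a left inverse of A
print(np.allclose(A_dag, np.linalg.pinv(A)))  # True: agrees with pinv
```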

Pseudo-inverse of wide matrix
- if $A$ is wide, with linearly independent rows, $AA^T$ is invertible
- the pseudo-inverse is defined as
$$A^\dagger = A^T(AA^T)^{-1}$$
- $A^\dagger$ is a right inverse of $A$: $AA^\dagger = AA^T(AA^T)^{-1} = I$
- (we'll see later that it is an important right inverse)
- reduces to $A^{-1}$ when $A$ is square: $A^T(AA^T)^{-1} = A^T A^{-T}A^{-1} = A^{-1}$
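The analogous check for a wide matrix:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 6))               # wide, independent rows

A_dag = A.T @ np.linalg.inv(A @ A.T)          # A^T (A A^T)^{-1}
print(np.allclose(A @ A_dag, np.eye(3)))      # True: a right inverse of A
print(np.allclose(A_dag, np.linalg.pinv(A)))  # True: agrees with pinv
```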

Pseudo-inverse via QR factorization
- suppose $A$ has linearly independent columns, with QR factorization $A = QR$
- then $A^T A = (QR)^T(QR) = R^T Q^T Q R = R^T R$
- so
$$A^\dagger = (A^T A)^{-1}A^T = (R^T R)^{-1}(QR)^T = R^{-1}R^{-T}R^T Q^T = R^{-1}Q^T$$
- we can compute $A^\dagger$ using back substitution on the columns of $Q^T$
- for $A$ with linearly independent rows, factor $A^T = QR$; then $A^\dagger = QR^{-T}$
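A sketch of this computation, reusing the back_substitution helper from earlier (note that the columns of $Q^T$ are the rows of $Q$):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((6, 3))               # tall, independent columns

Q, R = np.linalg.qr(A)                        # reduced QR: Q is 6x3, R is 3x3
# apply back substitution to each column of Q^T, i.e., each row of Q
A_dag = np.column_stack([back_substitution(R, q) for q in Q])
print(np.allclose(A_dag, np.linalg.pinv(A)))  # True: A_dag = R^{-1} Q^T
```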