Lecture 10: Invertible matrices. Finding the inverse of a matrix


Danny W. Crytser, April 11, 2014

Today's lecture

Today we will:
1. Single out a class of especially nice matrices, called the invertible matrices.
2. Discuss how to compute the inverse of an invertible matrix.
3. State a theorem that says when a square matrix possesses an inverse.

Multiplication of matrices

We saw in Wednesday's lecture that if A is an m × n matrix and B is an n × p matrix, then the product AB is defined and AB = [ Ab_1  Ab_2  ...  Ab_p ]. Unlike the situation with real numbers, you can find (without too much difficulty) matrices A, B, C such that AB = AC and yet B ≠ C. That is, you cannot divide out by a general matrix A. Today we will describe all the matrices that you can divide out by. This entails finding the "reciprocal" of a matrix A, which is only possible for some matrices.
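
Here is a small illustration of the failed cancellation, not from the original slides and assuming NumPy is available; the matrices A, B, C below are my own choices, picked so that AB = AC even though B ≠ C.

```python
import numpy as np

# A zeroes out the second row of any product, so different B and C
# can produce the same result.
A = np.array([[1, 0],
              [0, 0]])
B = np.array([[1, 1],
              [1, 1]])
C = np.array([[1, 1],
              [2, 2]])

print(A @ B)                 # [[1 1], [0 0]]
print(A @ C)                 # [[1 1], [0 0]]  -- same product
print(np.array_equal(B, C))  # False: AB = AC does not force B = C
```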

The Identity Matrix

We aim to define the inverse of a matrix in (very) rough analogy with the reciprocal of a real number. Since x^{-1} := 1/x, we will need to know what "1" means in a matrix context.

Definition. Let m ≥ 1 be an integer. Then I_m, the m × m identity matrix, is the m × m matrix given by

    I_m := [ 1 0 ... 0 ]
           [ 0 1 ... 0 ]
           [ . . ... . ]
           [ 0 0 ... 1 ]

That is, I_m is a diagonal matrix with 1s on the diagonal.

Examples of identity matrices

Example.

    I_2 = [ 1 0 ]      I_3 = [ 1 0 0 ]      I_4 = [ 1 0 0 0 ]
          [ 0 1 ]            [ 0 1 0 ]            [ 0 1 0 0 ]
                             [ 0 0 1 ]            [ 0 0 1 0 ]
                                                  [ 0 0 0 1 ]

The interesting thing about the identity matrices is that for any m × n matrix A, I_m A = A and A I_n = A. (Use the row-column rule.) Sometimes, when the context is abundantly clear, we'll write I instead of I_n, but we'll try to retain the subscript to avoid confusion.
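
A quick check of these identities (my own sketch, assuming NumPy; np.eye(m) builds I_m):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])       # a 2 x 3 matrix
I2, I3 = np.eye(2), np.eye(3)      # I_2 and I_3

print(np.array_equal(I2 @ A, A))   # True: I_m A = A
print(np.array_equal(A @ I3, A))   # True: A I_n = A
```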

Invertible matrices

Definition. An n × n matrix A is invertible if there exists an n × n matrix C such that AC = I_n = CA. If such a C exists, we denote it by A^{-1} and refer to it as the inverse of A.

(Don't sweat about it too much, but a matrix which is not invertible is sometimes called a singular matrix, and an invertible matrix is sometimes referred to as a non-singular matrix.)

Example. Let

    A = [ 7 2 ]    and    C = [  1 -2 ]
        [ 3 1 ]               [ -3  7 ]

Then you can check that AC = CA = I_2. Thus A is invertible and A^{-1} = C.
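
A minimal numerical check of this example (assuming NumPy; not part of the original slides):

```python
import numpy as np

A = np.array([[7, 2],
              [3, 1]])
C = np.array([[ 1, -2],
              [-3,  7]])

print(A @ C)   # [[1 0], [0 1]] = I_2
print(C @ A)   # [[1 0], [0 1]] = I_2, so C is indeed A^{-1}
```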

Examples

Example. If A = I_2 = [ 1 0 ; 0 1 ], then A is invertible and A = A^{-1}.

Example. The matrix A = [ 1 0 ; 0 0 ] is not invertible. Note that

    [ 1 0 ] [ a b ]   =   [ a b ]
    [ 0 0 ] [ c d ]       [ 0 0 ]

So no matter what you multiply A by, you can't get the identity matrix.

Inverses for 2 × 2 matrices

You probably noticed a relationship between the entries of the matrices in the previous example:

    A = [ 7 2 ]    and    A^{-1} = [  1 -2 ]
        [ 3 1 ]                    [ -3  7 ]

There is a rule for 2 × 2 matrices describing this precisely.

Theorem. Let A = [ a b ; c d ]. If ad - bc ≠ 0, then A is invertible and

    A^{-1} = 1/(ad - bc) [  d -b ]
                         [ -c  a ]

If ad - bc = 0, then A is not invertible.

The quantity ad - bc (diagonal product minus off-diagonal product) is called the determinant of A.
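
The theorem translates directly into a short function. This is a sketch of my own (the name inverse_2x2 is not from the lecture), assuming NumPy:

```python
import numpy as np

def inverse_2x2(A):
    """Invert a 2 x 2 matrix via the ad - bc formula, or raise if singular."""
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c                      # the determinant
    if det == 0:
        raise ValueError("determinant is 0, so A is not invertible")
    return (1.0 / det) * np.array([[ d, -b],
                                   [-c,  a]])

A = np.array([[7, 2],
              [3, 1]])
print(inverse_2x2(A))   # [[ 1. -2.], [-3.  7.]], matching the earlier example
```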

Inverses for 2 × 2 matrices, ctd.

    A^{-1} = 1/(ad - bc) [  d -b ]
                         [ -c  a ]

Example. Let B = [ 1 1 ; 2 2 ]. Is B invertible? We compute the determinant: (1)(2) - (2)(1) = 0. The determinant is 0, so B is not invertible.

Example. Let A = [ 5 2 ; 2 1 ]. Compute the determinant: (5)(1) - (2)(2) = 1. So A is invertible. We have

    A^{-1} = 1/1 [  1 -2 ]   =   [  1 -2 ]
                 [ -2  5 ]       [ -2  5 ]

Using inverse matrices to solve equations

You can solve equations with the matrix inverse.

Theorem. Let A be an invertible n × n matrix. Then for each b in R^n, the equation Ax = b is consistent and has the unique solution x = A^{-1} b.

Proof. First we show that A^{-1} b is a solution. Check: A(A^{-1} b) = (A A^{-1}) b, and A A^{-1} = I_n. So A(A^{-1} b) = I_n b = b. Thus x = A^{-1} b is a solution to Ax = b. Now suppose x is any solution to Ax = b; multiply both sides of the equation by A^{-1} to obtain A^{-1}(Ax) = A^{-1} b. Then rearrange and cancel on the left to obtain x = A^{-1} b.
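
As a numerical illustration of the theorem (my own sketch, assuming NumPy), here is the invertible matrix from the earlier example with an arbitrary right-hand side:

```python
import numpy as np

A = np.array([[5., 2.],
              [2., 1.]])        # det = 1, so A is invertible
b = np.array([3., 7.])          # any b in R^2 works

x = np.linalg.inv(A) @ b        # x = A^{-1} b, as in the theorem
print(np.allclose(A @ x, b))    # True: x really solves Ax = b
```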

Example: solving systems with inverses

Let's solve

    3x +  y = 11
    3x + 2y =  5

using inverses of matrices. This is equivalent to solving Ax = b, where

    A = [ 3 1 ]    and    b = [ 11 ]
        [ 3 2 ]               [  5 ]

The determinant of A is (3)(2) - (3)(1) = 3. Thus

    A^{-1} = 1/3 [  2 -1 ]   =   [  2/3 -1/3 ]
                 [ -3  3 ]       [ -1     1  ]

Thus the solution x to the above equation is

    x = A^{-1} b = [  2/3 -1/3 ] [ 11 ]   =   [ 22/3 - 5/3 ]   =   [ 17/3 ]
                   [ -1     1  ] [  5 ]       [  -11 + 5   ]       [  -6  ]

Thus the unique solution is x = 17/3 and y = -6.
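
The same system checked numerically (assuming NumPy; in practice np.linalg.solve is preferred over forming A^{-1} explicitly):

```python
import numpy as np

A = np.array([[3., 1.],
              [3., 2.]])
b = np.array([11., 5.])

print(np.linalg.inv(A) @ b)     # [ 5.6667 -6. ]  i.e. x = 17/3, y = -6
print(np.linalg.solve(A, b))    # same answer without computing the inverse
```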

Properties of invertible matrices

Theorem.
1. If A is an invertible matrix, then A^{-1} is also invertible, and the inverse of A^{-1} is A.
2. If A and B are invertible n × n matrices, then so is AB, and the inverse of AB is B^{-1} A^{-1}. [Similar to the formula for the transpose of a product.]
3. The transpose of an invertible matrix is invertible, and (A^T)^{-1} = (A^{-1})^T.

Proof. The proof in the book is uncomplicated, but you should read it. You really only need associativity of matrix multiplication, the fact that I_n A = A = A I_n for n × n matrices, and properties of the transpose.

In general, if A_1, A_2, ..., A_k are a bunch of invertible n × n matrices, then so is A_1 A_2 ... A_k, and

    (A_1 A_2 ... A_k)^{-1} = A_k^{-1} ... A_2^{-1} A_1^{-1}.
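
A quick sanity check of property 2 on random invertible matrices (my own sketch, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 3)) + 3 * np.eye(3)      # strictly diagonally dominant, hence invertible
B = rng.random((3, 3)) + 3 * np.eye(3)

lhs = np.linalg.inv(A @ B)                  # (AB)^{-1}
rhs = np.linalg.inv(B) @ np.linalg.inv(A)   # B^{-1} A^{-1}
print(np.allclose(lhs, rhs))                # True
```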

Elementary matrices and row operations

Recall the elementary row operations: interchange, replacement, scaling. We want to reformulate them so that performing an elementary row operation on the square matrix A is the same as taking the product EA for some invertible matrix E. This will allow us to compute matrix inverses effectively.

Definition. An elementary matrix is a matrix obtained by performing an elementary row operation on an identity matrix.

Example.

    [ 0 1 ]      [ 1 0 0 ]      [ 1 0 0 ]
    [ 1 0 ]      [ 2 1 0 ]      [ 0 1 0 ]
                 [ 0 0 1 ]      [ 0 0 5 ]

are all elementary matrices.
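
One way to build elementary matrices in code is to perform the corresponding row operation on an identity matrix, exactly as in the definition. A sketch of my own, assuming NumPy (the function names are mine):

```python
import numpy as np

def interchange(n, i, j):
    """Elementary matrix that swaps rows i and j of an n x n matrix."""
    E = np.eye(n)
    E[[i, j]] = E[[j, i]]
    return E

def scale(n, i, c):
    """Elementary matrix that scales row i by c."""
    E = np.eye(n)
    E[i, i] = c
    return E

def replace(n, i, j, c):
    """Elementary matrix that adds c times row j to row i."""
    E = np.eye(n)
    E[i, j] = c
    return E

print(interchange(2, 0, 1))   # the first example: [[0 1], [1 0]]
print(replace(3, 1, 0, 2))    # the middle example: add 2*(row 1) to row 2
print(scale(3, 2, 5))         # the last example: scale row 3 by 5
```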

Elementary matrices are invertible

Remember that the elementary row operations can be reversed:
1. Interchanging rows i_1 and i_2 is reversed by interchanging the rows again.
2. Scaling a row by c is reversed by scaling the same row by 1/c.
3. Replacing row i_1 with row i_1 plus c times row i_2 is reversed by subtracting c times row i_2 from row i_1.

This gives us that every elementary matrix E corresponding to a row operation is invertible, and E^{-1} is the elementary matrix corresponding to the reverse elementary row operation.

Elementary matrices are invertible, ctd.

Example. Let

    E = [ 1 0 ]
        [ 5 1 ]

Then E is the elementary matrix corresponding to the row operation "add 5 times the first row to the second row." Its inverse is the elementary matrix corresponding to the reverse row operation "subtract 5 times the first row from the second row." That is,

    E^{-1} = [  1 0 ]
             [ -5 1 ]

Elementary matrices in action

Example. Let E_1 be the elementary matrix obtained by adding the first row of I_2 to its second row:

    E_1 = [ 1 0 ]
          [ 1 1 ]

Then for any 2 × 2 matrix A = [ a b ; c d ] we have

    E_1 A = [ 1 0 ] [ a b ]   =   [   a       b   ]
            [ 1 1 ] [ c d ]       [ a + c   b + d ]
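
Checking the example numerically (my own sketch, assuming NumPy; the numbers stand in for a, b, c, d):

```python
import numpy as np

E1 = np.array([[1, 0],
               [1, 1]])       # adds row 1 of I_2 to row 2
A  = np.array([[10, 20],
               [30, 40]])     # a = 10, b = 20, c = 30, d = 40

print(E1 @ A)   # [[10 20], [40 60]]: the second row is (a + c, b + d)
```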

Invertibility and elementary row ops.

Theorem. Let A be an n × n matrix. Then A is invertible if and only if A is row equivalent to I_n, and any sequence of row operations that transforms A into I, applied in the same order to I, transforms I into A^{-1}.

Put another way, if E_1, E_2, ..., E_k are the elementary matrices corresponding to the row operations transforming A into I, so that

    E_1 E_2 ... E_k A = I,

then we necessarily have that

    E_1 E_2 ... E_k I = E_1 ... E_k = A^{-1}.

Matrix inversion algorithm

The theorem on the preceding slide tells us how to compute the inverse of any invertible matrix, and it encodes a way to check whether an n × n matrix is invertible, all through row reduction.

Fact. Let A be an n × n matrix and form the augmented matrix [A  I_n]. Row reduce this matrix as usual. If the reduced echelon form looks like [I_n  C], then A is invertible and C = A^{-1}. If the reduced echelon form looks any other way, then A is not invertible. (In fact, you can decide whether A is invertible just by transforming to echelon form and checking whether any row has n zeros in its left half.)
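
The fact above can be turned into a short Gauss-Jordan routine. This is a sketch of my own (with partial pivoting added for numerical stability), not code from the lecture:

```python
import numpy as np

def invert(A, tol=1e-12):
    """Row reduce [A | I] and return the right half, or None if A is singular."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])               # the augmented matrix [A | I_n]

    for col in range(n):
        pivot = col + np.argmax(np.abs(M[col:, col]))
        if abs(M[pivot, col]) < tol:
            return None                          # no pivot here: A is not invertible
        M[[col, pivot]] = M[[pivot, col]]        # interchange
        M[col] /= M[col, col]                    # scaling
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]       # replacement: clear the column

    return M[:, n:]                              # left half is now I_n; right half is A^{-1}

A = np.array([[1, 2, 3],
              [3, 1, 2],
              [2, 3, 1]])
print(invert(A) * 18)   # scaled by 18 to show [[-5 7 1], [1 -5 7], [7 1 -5]]
```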

Example of MIA

Let's determine whether the matrix

    A = [ 1 2 3 ]
        [ 3 1 2 ]
        [ 2 3 1 ]

is invertible and find its inverse, if possible. Form the augmented matrix

    [ 1 2 3 | 1 0 0 ]
    [ 3 1 2 | 0 1 0 ]
    [ 2 3 1 | 0 0 1 ]

Ex. of MIA, pt. II

Subtract 3 times row 1 from row 2 and 2 times row 1 from row 3:

    [ 1 2 3 | 1 0 0 ]        [ 1  2  3 |  1 0 0 ]
    [ 3 1 2 | 0 1 0 ]   ->   [ 0 -5 -7 | -3 1 0 ]
    [ 2 3 1 | 0 0 1 ]        [ 0 -1 -5 | -2 0 1 ]

Swap rows 2 and 3 and scale the new row 2 so its leading entry is 1, then add 5 times row 2 to row 3:

    [ 1  2  3 |  1 0  0 ]        [ 1  2  3 |  1 0  0 ]
    [ 0  1  5 |  2 0 -1 ]   ->   [ 0  1  5 |  2 0 -1 ]
    [ 0 -5 -7 | -3 1  0 ]        [ 0  0 18 |  7 1 -5 ]

Now we can pause at echelon form to see that there are no bad rows.

Row reduction details

We keep row reducing the entire augmented matrix. Scale row 3 by 1/18:

    [ 1  2  3 |  1 0  0 ]        [ 1 2 3 |  1      0      0    ]
    [ 0  1  5 |  2 0 -1 ]   ->   [ 0 1 5 |  2      0     -1    ]
    [ 0  0 18 |  7 1 -5 ]        [ 0 0 1 |  7/18   1/18  -5/18 ]

Subtract 5 times row 3 from row 2 and 3 times row 3 from row 1:

    [ 1 2 0 | -3/18  -3/18  15/18 ]
    [ 0 1 0 |  1/18  -5/18   7/18 ]
    [ 0 0 1 |  7/18   1/18  -5/18 ]

Finally, subtract 2 times row 2 from row 1:

    [ 1 0 0 | -5/18   7/18   1/18 ]
    [ 0 1 0 |  1/18  -5/18   7/18 ]
    [ 0 0 1 |  7/18   1/18  -5/18 ]

This is in reduced echelon form. The right half of the augmented matrix is the inverse of A.

Inverse

We computed that

    [ 1 2 3 ]^{-1}      [ -5/18   7/18   1/18 ]
    [ 3 1 2 ]       =   [  1/18  -5/18   7/18 ]
    [ 2 3 1 ]           [  7/18   1/18  -5/18 ]

If you multiply the two matrices, you get the identity:

    [ 1 2 3 ] [ -5/18   7/18   1/18 ]       [ 1 0 0 ]
    [ 3 1 2 ] [  1/18  -5/18   7/18 ]   =   [ 0 1 0 ]
    [ 2 3 1 ] [  7/18   1/18  -5/18 ]       [ 0 0 1 ]

When is a matrix invertible?

We have seen that a matrix is invertible if you can reduce it to the identity matrix via row operations. This description is not complete enough for our purposes: we don't always want to have to run the row reduction algorithm every time.

Theorem (The Invertible Matrix Theorem)

If A is an n × n matrix, then the following are equivalent:
1. A is invertible
2. A is row equivalent to I_n
3. A has n pivot columns
4. the equation Ax = 0 has only the trivial solution
5. the columns of A form a linearly independent set
6. the linear transformation x ↦ Ax is one-to-one
7. the equation Ax = b has at least one solution for each b in R^n
8. the columns of A span R^n
9. the linear transformation x ↦ Ax maps R^n onto R^n
10. there is an n × n matrix C such that CA = I_n
11. there is an n × n matrix D such that AD = I_n
12. A^T is invertible
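
Conditions like (3), (5), and (8) are easy to test numerically for a concrete matrix, since they all amount to A having full rank. A sketch of my own, assuming NumPy:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [3., 1., 2.],
              [2., 3., 1.]])                  # the matrix inverted above
B = np.array([[1., 1.],
              [2., 2.]])                      # the non-invertible matrix from earlier

print(np.linalg.matrix_rank(A) == 3)          # True: n pivot columns, so invertible
print(np.linalg.matrix_rank(B) == 2)          # False: dependent columns, not invertible
```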

Proving the IMT: (1) implies (2) implies (3) implies (4) implies (5)

If A is invertible, then A is row equivalent to I_n by the previous section. So (1) implies (2).

If A is row equivalent to I_n, then the row operations that transform A to I_n actually transform A into reduced echelon form. Since I_n has no non-pivot columns, A has no non-pivot columns. So (2) implies (3).

Suppose A has n pivot columns. If we augment A with 0 and row reduce, there are no free variables. Thus Ax = 0 has only the trivial solution. So (3) implies (4).

Any linear dependence relation among the columns of A gives a non-trivial solution to Ax = 0. Thus (4) implies (5).

Application of the IMT

We can use the IMT to check that things are non-invertible really easily.

Example. Let

    A = [ 1 0 1 2 ]
        [ 2 0 2 4 ]
        [ 3 0 2 6 ]
        [ 4 1 3 8 ]

Let's see if A is invertible without row reduction. There's a whole bunch of ways to check, but one that is nice is: if A has linearly dependent columns, then A is not invertible. We note that the fourth column of A is 2 times the first column of A. Thus the columns of A are linearly dependent, so A is not invertible by the IMT.
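
The same dependence can be spotted numerically (my own sketch, assuming NumPy):

```python
import numpy as np

A = np.array([[1, 0, 1, 2],
              [2, 0, 2, 4],
              [3, 0, 2, 6],
              [4, 1, 3, 8]])

print(np.array_equal(A[:, 3], 2 * A[:, 0]))   # True: column 4 is twice column 1
print(np.linalg.matrix_rank(A) < 4)           # True: rank < n, so not invertible (IMT)
```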

Invertible maps

Definition. A function f : X → Y is called invertible (or bijective) if there is a map g : Y → X such that f(g(y)) = y for all y in Y and g(f(x)) = x for all x in X. If such a g exists, it is called the inverse of f.

When is a linear transformation T given by x ↦ Ax invertible? Precisely when A is an invertible matrix.

Theorem. Let A be n × n and let T be the linear transformation x ↦ Ax. Then T is invertible if and only if A is invertible.

Proof. Suppose that A is invertible. Define S : R^n → R^n by S(x) = A^{-1} x. Then T(S(x)) = A A^{-1} x = x and S(T(x)) = x for all x in R^n, so S is an inverse for T.

Conversely, suppose that T is invertible with inverse S. We show that Ax = 0 has only the trivial solution. If T(x) = T(0), then S(T(x)) = S(T(0)). But S is the inverse of T, so x = 0. Thus the only solution to Ax = 0 is the trivial solution, and the IMT tells us that A is invertible.