L1-2. Special Matrix Operations: Permutations, Transpose, Inverse, Augmentation 12 Aug 2014




"Unfortunately, no one can be told what the Matrix is. You have to see it for yourself." -- Morpheus

Primary concepts: row and column operations; a matrix and its transpose; symmetric matrices; trace of a matrix; inverse of a matrix; singular vs. non-singular matrices.

Dawkins: http://tutorial.math.lamar.edu/classes/linalg/propsofmatrixarith.aspx

By now you are comfortable with basic matrix operations (if not, stop! Go back! Do more practice!). In this lab, there are a few special ops that we would like to investigate.

Permutations: swapping columns

Given a matrix

    A = [ 1  1  1 ]
        [ 2  4  8 ]
        [ 3  9 27 ]

suppose we needed to change the order of the columns to

    B = [ 1  1  1 ]
        [ 4  2  8 ]
        [ 9  3 27 ]

i.e., swap columns 1 and 2. How can this idea be represented as a simple operation in matrix terms?

The answer is a permutation matrix. Consider the matrix

    P12 = [ 0 1 0 ]
          [ 1 0 0 ]
          [ 0 0 1 ]

When we take the product of the matrices A and P12 (note the order), what do we get? For these permutation matrices, we will use the convention that the subscripts of P represent the columns that are being swapped, so that P12 represents a swap of columns

one and two. The permutation matrix comes from the identity matrix I: just swap the columns of I that correspond to the desired permutation of A. It should also be apparent that all these permutations are reversible: what is P12 P12?

Write P13 and P23 and verify that they perform the desired permutations of A. What would a permutation matrix that was designed to swap all three columns in just one step look like?

Write a P matrix to first swap columns 1 and 3, and then columns 2 and 3 of the resulting matrix, such that

    C = A P132 = [  1 1 1 ]
                 [  8 2 4 ]
                 [ 27 3 9 ]

Note this is easiest to visualize by doing it in two steps with two separate Pjk matrices; then just multiply the two P matrices together! Recall that matrix multiplication is associative, but NOT commutative.

Suppose we agree that I is also a permutation matrix (it just doesn't do anything: I = P11 or I = P22 or I = P33). How many total permutation matrices (including the two-column swappers) are there for the 3x3 matrix A? Do the same count for a 2x2 and a 4x4 matrix of your choice. What pattern do you see emerging?

Swapping rows: permutations again!

Suppose we want to obtain

    D = [ 2 4  8 ]
        [ 1 1  1 ]
        [ 3 9 27 ]

from the original matrix A. We have swapped the first two rows. Do we need a brand-new permutation matrix? Let's use the same P12 from the column swap, but this time put the permutation matrix to the left of A and multiply. We certainly do not expect the product P12 A to equal the product A P12. Does D = P12 A?

What is the result of P21 P12 A? What does this say about any product Pjk Pkj and the relationship between Pjk and Pkj? Try a double row swap; use the appropriate P matrix and verify that it does what you expect.
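Both swapping experiments can be sketched in a few lines of plain Python (the lab itself does not prescribe a language, and the `matmul` helper is ours, not part of the handout; the last line gives away the counting pattern, so skip it if you would rather discover it yourself):

```python
from math import factorial

# Minimal matrix multiply for matrices stored as lists of rows.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 1, 1], [2, 4, 8], [3, 9, 27]]

# Identity with columns 1 and 2 swapped:
P12 = [[0, 1, 0], [1, 0, 0], [0, 0, 1]]

B = matmul(A, P12)    # RIGHT-multiplication permutes columns
print(B)              # [[1, 1, 1], [4, 2, 8], [9, 3, 27]]

D = matmul(P12, A)    # LEFT-multiplication permutes rows
print(D)              # [[2, 4, 8], [1, 1, 1], [3, 9, 27]]

print(matmul(P12, P12))   # a swap undoes itself: back to the identity

# Every permutation matrix is a reordering of the n columns of I,
# so there are n! of them in total (including I itself).
print([factorial(n) for n in (2, 3, 4)])   # [2, 6, 24]
```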

Transpose of a matrix

Consider the rectangular 3x2 matrix

    R = [ 1 3 ]
        [ 3 1 ]
        [ 1 3 ]

We define the transpose of R to be the matrix obtained when the columns of R are written as rows:

    R^T = [ 1 3 1 ]
          [ 3 1 3 ]

Note that the transpose operation is written with a superscript capital T. This operation produces the following relationship between the elements of A and A^T:

    (A^T)_jk = A_kj

The transpose operation is immediately reversible: (A^T)^T = A.

Symmetric matrices (which have to be square) are their own transpose: S = S^T. This is obviously true for the identity matrix.

For any rectangular matrix R, the product R^T R is always symmetric (and therefore square). If R is m x n, then R^T is n x m and the product R^T R is n x n. What can you say about the product R R^T?

We can prove the symmetry of R^T R simply by taking its transpose: (R^T R)^T = R^T (R^T)^T = R^T R. Do the same for R R^T.

Go back and look at one of the permutation matrices and the permutation that undoes it. What is the relationship between inverse and transpose for permutation matrices? A matrix that undoes the result of a prior matrix operation is an inverse matrix (much more about this later).

Note the Properties of the Transpose listed in Dawkins' chapter on Properties of Matrix Arithmetic, including some proofs.

Trace

The trace of a square matrix is the sum of its diagonal elements, indicated tr(A). Interesting property of trace: tr(A) = tr(A^T). And that is the only reason it is mentioned here. More on trace later.
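The symmetry and trace facts above are easy to check numerically; a sketch in plain Python (the `matmul`, `transpose`, and `trace` helpers are our own, and any rectangular matrix works in place of this 3x2 example):

```python
# Helpers for matrices stored as lists of rows.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(M):
    return [list(row) for row in zip(*M)]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

R = [[1, 3], [3, 1], [1, 3]]       # a 3x2 example
G = matmul(transpose(R), R)        # R^T R is 2x2
H = matmul(R, transpose(R))        # R R^T is 3x3

print(G == transpose(G), H == transpose(H))   # True True -- both symmetric
print(trace(G) == trace(transpose(G)))        # True -- tr(A) = tr(A^T)
```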

Problems

Prove or disprove (by counterexample) the following generalizations about matrix transposition. For two conformable matrices A and B of any rectangular size and shape:

1. (A + B)^T = A^T + B^T

2. (AB)^T = B^T A^T

Why do the matrices being multiplied change places under the transpose operation? Start by noting that if A is mxp, a conformable B must be pxn. Then A^T is pxm and B^T must be nxp. For there to be a product A^T B^T, m = n. But A and B are rectangular!

3. Suppose A and B are both nxn symmetric matrices. Is matrix multiplication of A and B commutative? Why or why not?

Note: A proof is a complete chain of logical statements relating a claim or assertion to a verifiable conclusion. It is therefore never enough to just supply a single example when asked for a proof. However, a single counterexample is valid to show that a claim is not generally true. Many valid proofs begin by assuming that what is to be proved is false and arrive at a logical contradiction; so what was assumed to be false must in fact be true. And when we are done with a proof, we say QED (quite easily done).

Inverse of a matrix (for starters)

Definition (for you squares only): If A is a square matrix and its inverse matrix A^-1 exists, then A^-1 A = I = A A^-1. Multiplication of a square matrix by its inverse must produce the identity, whether on the left or on the right (commutative). What is the inverse of a row-swap matrix?

One of the problems that will haunt us throughout this course is the question of what is necessary for a matrix to have an inverse, and then how to find that inverse, of course! In the 2x2 case, there is a simple formula for finding the inverse. In case you don't remember the 2x2 formula:

    A = [ a b ],    A^-1 = 1/(ad - bc) [  d -b ]
        [ c d ]                        [ -c  a ]

The quantity ad - bc is known as the determinant of the 2x2 matrix, an old friend (ha!) from Algebra 2.
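The 2x2 formula translates directly into code; a sketch in Python using exact rational arithmetic (the function name `inv2x2` is our own):

```python
from fractions import Fraction

def inv2x2(A):
    """Invert a 2x2 matrix via the 1/(ad - bc) formula, exactly."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular: ad - bc = 0, no inverse exists")
    s = Fraction(1, det)
    return [[ s * d, -s * b],
            [-s * c,  s * a]]

Ainv = inv2x2([[1, 3], [2, 7]])   # det = 1
print(Ainv)                       # the entries are 7, -3, -2, 1 (as Fractions)
```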

For anything bigger than 3x3, you do not want to have to invert the matrix using a formula. In this course, calculating the determinant of a large matrix is a method of last resort.

Example

    A = [ 1 3 ]   has inverse   A^-1 = [  7 -3 ]
        [ 2 7 ]                        [ -2  1 ]

Conveniently, the determinant of A is 1, and our formula just flips the signs of b and c and swaps the places of a and d.

We can also obtain this inverse by solving the four equations:

    x1 + 3x3 = 1        x2 + 3x4 = 0
    2x1 + 7x3 = 0       2x2 + 7x4 = 1

or the two matrix equations

    [ 1 3 ] [ x1 ]   [ 1 ]          [ 1 3 ] [ x2 ]   [ 0 ]
    [ 2 7 ] [ x3 ] = [ 0 ]   and    [ 2 7 ] [ x4 ] = [ 1 ]

which come from the expression A A^-1 = I. Be sure you see how we arrived at those equations! Note that in this example the inverse is written as

    A^-1 = [ x1 x2 ]
           [ x3 x4 ]

and we just split the matrix into column vectors.

So you think that inverses apply only to square matrices? Not so fast! Rectangular matrices can have left and right inverses: read on, but slowly...

If A is mxn, then L (nxm) is a left inverse for A if and only if LA = I_n. Similarly, R (nxm) is a right inverse for A if and only if AR = I_m.

Claim: If A has both an L and an R, then L = R. Why? Start with LA = I and AR = I. Then L = LI = L(AR) = (LA)R = IR = R.
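The column-by-column approach in the example above can be sketched as follows (Python; `solve2` uses Cramer's rule for the 2x2 systems and is our own helper, not part of the handout):

```python
from fractions import Fraction

def solve2(A, b):
    """Solve a 2x2 system A x = b by Cramer's rule, exactly."""
    (a11, a12), (a21, a22) = A
    det = Fraction(a11 * a22 - a12 * a21)
    return [(b[0] * a22 - b[1] * a12) / det,
            (a11 * b[1] - a21 * b[0]) / det]

A = [[1, 3], [2, 7]]
col1 = solve2(A, [1, 0])   # first column of A^-1: should be 7, -2
col2 = solve2(A, [0, 1])   # second column of A^-1: should be -3, 1
print(col1, col2)
```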

Example

Find L and R (if they exist) for

    A = [ 1 1 ]
        [ 2 3 ]
        [ 4 1 ]

To begin, LA = I = AR. Thus L must be 2x3, as must R. Let

    L = [ a b c ]
        [ d e f ]

and generate some equations:

    LA = I:   a + 2b + 4c = 1      a + 3b + c = 0
              d + 2e + 4f = 0      d + 3e + f = 1

Oh no, only 4 equations, with 6 unknowns! But we just showed that if both L and R exist, L = R. So

    AR = I:   a + d = 1        b + e = 0        c + f = 0
              2a + 3d = 0      2b + 3e = 1      2c + 3f = 0
              4a + d = 0       4b + e = 0       4c + f = 1

9 more equations, for a total of 13, with 6 unknowns!!! We can only hope that some of the equations are redundant, so that a unique solution can be found. If not, no matrix can serve as both a left and a right inverse for A. Find out!

Questions: To have both an L and an R, must A be a square matrix? If L is a left inverse for rectangular matrix A, prove that L^T is a right inverse for A^T. If R is a right inverse for rectangular matrix A, prove that R^T is a left inverse for A^T.

An invertible matrix is also called a non-singular matrix. A matrix that has no inverse is said to be singular, or to have a singularity. What's important about the singular/non-singular question? For two matrices A and B and the appropriately sized 0 matrix, if AB = 0, then either A = 0, B = 0, or both A and B are singular matrices. If Ax = b and A has inverse A^-1, then A^-1 A x = A^-1 b, or x = A^-1 b.
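For checking your "find out" work on the example above: the 4-equation system LA = I is underdetermined, and setting c = f = 0 yields one particular left inverse. A Python sketch (the choice of free variables and the `matmul` helper are ours):

```python
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 1],
     [2, 3],
     [4, 1]]

# One solution of LA = I obtained by setting c = f = 0 and solving:
L = [[ 3, -1, 0],
     [-2,  1, 0]]
print(matmul(L, A))   # [[1, 0], [0, 1]] -- L is a left inverse

# The AR = I system, by contrast, is inconsistent: a + d = 1 and
# 2a + 3d = 0 force a = 3, d = -2, but then 4a + d = 10, not 0.
# So this A has left inverses but no right inverse.
```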

Thus if A has an inverse, there is at least one solution x for every b. Further, if A has an inverse and Ax = 0, the only solution is the trivial solution x = 0. Is x = A^-1 b a good way to solve Ax = b? Maybe yes, maybe no.

Some properties of matrix inverse

Assume that A and B are both square, nonsingular matrices.

1. (AB)^-1 = B^-1 A^-1

A proof of this important property has two distinct steps: (1) start with B^-1 A^-1 and multiply on the right by AB; (2) start with B^-1 A^-1 and multiply on the left by AB. Finish the proof. If these two products can be shown to be I, what have we shown? How does that satisfy the definition of the matrix inverse? Is that a sufficient proof?

2. (A^-1)^-1 = A

In the same manner as the first property, if the statement is true, (A^-1)^-1 A^-1 = I. Finish the proof.

3. (A^T)^-1 = (A^-1)^T

Using the same approach, if this property is true, (A^-1)^T (A^T) = I and (A^T)(A^-1)^T = I. Finish the proof.

4. If A, B and C are pxp nonsingular matrices, then both of the following must be true: (a) the product ABC is nonsingular, and (b) (ABC)^-1 = C^-1 B^-1 A^-1. Prove it!

Much more on inversion later!
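Before writing the proofs, it can help to watch the three properties hold on a concrete pair of matrices; a Python sketch with exact arithmetic (the `matmul`, `inv2`, and `transpose` helpers are our own):

```python
from fractions import Fraction

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def inv2(A):
    (a, b), (c, d) = A
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

def transpose(M):
    return [list(row) for row in zip(*M)]

A = [[1, 3], [2, 7]]
B = [[2, 1], [1, 1]]

print(inv2(matmul(A, B)) == matmul(inv2(B), inv2(A)))   # (AB)^-1 = B^-1 A^-1
print(inv2(inv2(A)) == A)                               # (A^-1)^-1 = A
print(inv2(transpose(A)) == transpose(inv2(A)))         # (A^T)^-1 = (A^-1)^T
```

All three lines print True; of course a numeric check on one pair of matrices is evidence, not a proof (see the note on proofs above).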

Loading the Matrix: Augmentation

It is often convenient to create an augmented matrix from a given matrix A and a vector b by tacking the values of b onto the right of A, creating a temporary rectangular matrix. The symbol for the augmented matrix is sometimes A|b:

    A|b = [ 1 2 2 | 4 ]
          [ 1 3 3 | 5 ]
          [ 2 6 5 | 6 ]

Any operation we do to the augmented matrix A|b is identical to doing the same operation to the right- and left-hand sides of the equation Ax = b. In some cases, it is also possible to augment a matrix with another matrix: A|B is only valid if A and B have the same number of rows.

See Dawkins' section on Special Matrices for some special matrices.

Practice Problems

a. If matrix A is 5x3 and the product AB is 5x7, what is the size of B?

b. Let

    A = [  2 5 ]    and    B = [ 4 -5 ]
        [ -3 1 ]               [ 3  k ]

How many values of k, if any, will make the matrix products AB = BA?

c. True/false (justify your answer). A, B and C are arbitrary 2x2 matrices, and the columns of A and B are represented as a1, a2 and b1, b2, respectively.

1. AB = [ a1b1  a2b2 ]
2. Each column of AB is a linear combination of the elements in a column of B, using weights from the corresponding row of A.
3. AB + AC = A(B + C)

Practice Answers:

a. 3x7
b. one (what is it?)
c. 1 = false: matrix multiplication is row by column, not column by column
   2 = true: follows from the correct statement of 1
   3 = true: suppose D = B + C; we can show that if AB + AC is not AD, a contradiction develops
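Practice problem (b) can be checked by brute force; a Python sketch (assuming the matrices as printed above, and scanning integer values of k only):

```python
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[2, 5], [-3, 1]]

def B(k):
    return [[4, -5], [3, k]]

# Scan a range of integer k and keep those with AB == BA.
hits = [k for k in range(-50, 51) if matmul(A, B(k)) == matmul(B(k), A)]
print(len(hits))   # 1 -- exactly one value, matching the answer key
```

Comparing AB and BA entry by entry shows why only one k can work: two of the four entries are linear equations in k that happen to have the same solution.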

Problem Set

1a. Construct a random 4x4 matrix A and test whether (A + I)(A - I) = A^2 - I. What is the meaning of A^2? Try at least three different matrices, including both symmetric and non-symmetric cases.

1b. Test (A + B)(A - B) = A^2 - B^2 in the same manner. Prepare a report of your findings, sufficient to document the claim that you have discovered binomial factorization of matrices. Be sure you address the commutativity issue!

2. Let

    A = [ 1  2 ]
        [ 5 12 ]

Find A^-1 by hand. Use A^-1 to solve Ax = b for the following values of b:

    b1 = [ 1 ],    b2 = [ 1 ],    b3 = [ 2 ]
         [ 3 ]          [ 5 ]          [ 6 ]

3. Suppose A, B and C are all invertible nxn matrices. Show that ABC is also invertible by producing a matrix D such that (ABC)D = I = D(ABC).

4. Solve the equation AB = BC for A, assuming A, B and C are square and B is invertible. Then find an equality for C in terms of A, B and B^-1.

5. Suppose P is invertible and A = P B P^-1. Solve for B in terms of A.

6. Suppose A, B and X are nxn matrices with A, X and A - AX invertible, and suppose (A - AX)^-1 = X^-1 B. Explain why B must be invertible, and use this information to solve for X.
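Problem 1 asks for a numerical experiment; a starting-point sketch in Python (plain lists and random integer entries; the helper names are our own, and drawing your own conclusions from the output is still your job):

```python
import random

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def madd(X, Y):
    return [[x + y for x, y in zip(r, s)] for r, s in zip(X, Y)]

def msub(X, Y):
    return [[x - y for x, y in zip(r, s)] for r, s in zip(X, Y)]

n = 4
I = [[int(i == j) for j in range(n)] for i in range(n)]
A = [[random.randint(-5, 5) for _ in range(n)] for _ in range(n)]
B = [[random.randint(-5, 5) for _ in range(n)] for _ in range(n)]

# 1a: (A + I)(A - I) vs A^2 - I
lhs1 = matmul(madd(A, I), msub(A, I))
rhs1 = msub(matmul(A, A), I)
print(lhs1 == rhs1)   # True for every A: A always commutes with I

# 1b: (A + B)(A - B) vs A^2 - B^2. Expanding the left side gives
# A^2 - AB + BA - B^2, so the two sides agree exactly when AB = BA.
lhs2 = matmul(madd(A, B), msub(A, B))
rhs2 = msub(matmul(A, A), matmul(B, B))
print(lhs2 == rhs2, matmul(A, B) == matmul(B, A))   # these two always match
```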