Contents

1 Diagonal, Symmetric and Triangular Matrices
2 Diagonal Matrices
  2.1 Products, Powers and Inverses of Diagonal Matrices
    2.1.1 Theorem (Powers of Matrices)
  2.2 Multiplying Matrices on the Left and Right by Diagonal Matrices
    2.2.1 Theorem (Multiplying on the left and right by a diagonal matrix)
3 Symmetric Matrices
  3.1 Example: AA^T and A^T A
  3.2 Example: Evaluating A^n
  3.3 Theorem (Properties of Symmetric Matrices)
4 Triangular Matrices
  4.1 Triangularity and the Transpose
    4.1.1 Theorem (Transpose of Triangular Matrices)
  4.2 Triangularity and Inverses
    4.2.1 Theorem (Products and Inverses of Upper Triangular Matrices)

Diagonal, Symmetric and Triangular Matrices

In this section we look at some matrices that play a special role. Each one has a visual property that is reflected in the values of the entries with special subscripts. All matrices considered are square.

Diagonal Matrices

A diagonal matrix is zero off the main diagonal. This means the matrix looks like

$$D = \begin{bmatrix} * & 0 & \cdots & 0 \\ 0 & * & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & * \end{bmatrix}$$

where the entries denoted by $*$ may be zero or nonzero. In terms of the subscripts, $D = [d_{i,j}]$ is a diagonal matrix if $d_{i,j} = 0$ whenever $i \neq j$. If $D = [d_{i,j}]$ is an $n \times n$ diagonal matrix, it is common to write

$$D = \operatorname{diag}(d_1, d_2, \ldots, d_n)$$

where $d_1, d_2, \ldots, d_n$ are the diagonal entries.
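The defining condition is easy to check numerically. Below is a minimal sketch using NumPy; the particular diagonal entries are made up for illustration, and `np.diag` is NumPy's standard constructor for $\operatorname{diag}(d_1,\ldots,d_n)$.

```python
import numpy as np

# Build D = diag(2, 5, 7) from its list of diagonal entries.
D = np.diag([2.0, 5.0, 7.0])

# Check the subscript condition: d_ij = 0 whenever i != j.
n = D.shape[0]
is_diagonal = all(D[i, j] == 0 for i in range(n) for j in range(n) if i != j)
print(is_diagonal)  # True
```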
Products, Powers and Inverses of Diagonal Matrices

It is very easy to verify the special rule for taking products of diagonal matrices. If $R = \operatorname{diag}(r_1, \ldots, r_n)$ and $S = \operatorname{diag}(s_1, \ldots, s_n)$ are diagonal matrices, then

$$RS = \operatorname{diag}(r_1 s_1, r_2 s_2, \ldots, r_n s_n).$$

Proof: The i-j entry of $RS$ is $\sum_{k=1}^{n} r_{i,k} s_{k,j}$. Clearly $r_{i,k} s_{k,j} = 0$ unless both $i = k$ and $k = j$. Hence the only way to get a nonzero sum is to have $i = j$, in which case its value is $r_{i,i} s_{i,i}$.

In particular, if $D = \operatorname{diag}(d_1, \ldots, d_n)$ is a diagonal matrix, then $D^2 = \operatorname{diag}(d_1^2, \ldots, d_n^2)$ and, more generally, $D^k = \operatorname{diag}(d_1^k, \ldots, d_n^k)$.

If $d_1, d_2, \ldots, d_n$ are all nonzero, then clearly the product of the matrix $D$ and $\operatorname{diag}(d_1^{-1}, \ldots, d_n^{-1})$ is $I$, so the inverse of $D$ is known: $D^{-1} = \operatorname{diag}(d_1^{-1}, \ldots, d_n^{-1})$. On the other hand, if some $d_i = 0$, then $D$ has an all-zero row, so the matrix is not invertible.

Theorem (Powers of Matrices)
If $D = \operatorname{diag}(d_1, \ldots, d_n)$ is a diagonal matrix, then $D^k = \operatorname{diag}(d_1^k, \ldots, d_n^k)$ for $k \geq 1$. $D$ is invertible if and only if $d_i \neq 0$ for $i = 1, \ldots, n$. When $D$ is invertible, $D^k = \operatorname{diag}(d_1^k, \ldots, d_n^k)$ for all integers $k$.

Multiplying Matrices on the Left and Right by Diagonal Matrices

Suppose that $A = [a_{i,j}]$ is an $m \times n$ matrix and $D = \operatorname{diag}(d_1, \ldots, d_m)$ is a diagonal matrix. Let us consider a typical element in the i-th row of $DA$. The i-j entry of $DA$ is $d_i a_{i,j}$. Considering the row as a whole, each element is multiplied by $d_i$; in other words, the row $R_i$ becomes $d_i R_i$. The same remains true, of course, for every row $i$. This can also be viewed as a sequence of elementary row operations $R_i \to d_i R_i$ for $i = 1, \ldots, m$. The corresponding elementary matrices are $\operatorname{diag}(d_1, 1, \ldots, 1)$, $\operatorname{diag}(1, d_2, 1, \ldots, 1)$, and so on, and it is easy to take the product of these diagonal matrices: it is $D$ itself. So the sequence of elementary row operations is the same as multiplying on the left by $D$.
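The product, power, and inverse rules above can be verified numerically. A minimal sketch with NumPy, using made-up diagonal entries:

```python
import numpy as np

d = np.array([2.0, 3.0, 5.0])
r = np.array([1.0, 4.0, 6.0])
D, R = np.diag(d), np.diag(r)

# Product rule: RD = diag(r_1 d_1, ..., r_n d_n).
assert np.allclose(R @ D, np.diag(r * d))

# Power rule: D^k = diag(d_1^k, ..., d_n^k), here with k = 3.
assert np.allclose(np.linalg.matrix_power(D, 3), np.diag(d ** 3))

# Inverse (all diagonal entries nonzero): D^-1 = diag(1/d_1, ..., 1/d_n).
assert np.allclose(np.linalg.inv(D), np.diag(1.0 / d))
```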
A similar analysis of multiplication of a matrix on the right by a diagonal matrix reveals that the columns are multiplied by the diagonal entries of the matrix.

Theorem (Multiplying on the left and right by a diagonal matrix)
If $A$ is an $m \times n$ matrix, and $R = \operatorname{diag}(r_1, \ldots, r_m)$ and $S = \operatorname{diag}(s_1, \ldots, s_n)$ are diagonal matrices, then:
- The effect of $RA$ is to multiply each entry of the i-th row of $A$ by $r_i$. Symbolically, $(RA)_{i,j} = r_i a_{i,j}$ for $i = 1, \ldots, m$.
- The effect of $AS$ is to multiply each entry of the j-th column of $A$ by $s_j$. Symbolically, $(AS)_{i,j} = a_{i,j} s_j$ for $j = 1, \ldots, n$.

Symmetric Matrices

A matrix $A = [a_{i,j}]$ is symmetric if $A = A^T$. In other words, $a_{i,j} = a_{j,i}$ for all $i$ and $j$. An example of a symmetric matrix is

$$\begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 5 \\ 3 & 5 & 6 \end{bmatrix}.$$

Notice the geometric relationship between a typical $a_{i,j}$ and $a_{j,i}$ within the matrix $A$: they sit in mirror-image positions across the main diagonal. This means that the part of the matrix above the diagonal is the mirror image of the part below the diagonal, and vice versa. Another way of saying this: if you flip the matrix over using the main diagonal as an axis, the matrix is unchanged.
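Both theorems in this stretch of the notes are easy to confirm numerically. The sketch below (entries are illustrative) checks the row/column scaling rule and the symmetry condition $A = A^T$:

```python
import numpy as np

M = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])          # a 2 x 3 matrix
R = np.diag([10.0, 20.0])                # scales the two rows
S = np.diag([1.0, 2.0, 3.0])             # scales the three columns

# (RM)_ij = r_i * m_ij : row i of M is multiplied by r_i.
assert np.allclose(R @ M, np.array([[10.0], [20.0]]) * M)

# (MS)_ij = m_ij * s_j : column j of M is multiplied by s_j.
assert np.allclose(M @ S, M * np.array([1.0, 2.0, 3.0]))

# A symmetric matrix equals its transpose: a_ij == a_ji.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 5.0],
              [3.0, 5.0, 6.0]])
assert np.allclose(A, A.T)
```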
Example: AA^T and A^T A

The following example gives a method of constructing symmetric matrices, using the basic properties of the transpose. For any matrix $A$, both $AA^T$ and $A^T A$ are symmetric.

Proof: Remember that $(A^T)^T = A$ and $(AB)^T = B^T A^T$. Hence

$$(AA^T)^T = (A^T)^T A^T = AA^T,$$

which is precisely the condition necessary for $AA^T$ to be symmetric. The proof for $A^T A$ is essentially identical.

Another way of seeing the symmetry is to observe that $(AA^T)_{i,j}$ and $(AA^T)_{j,i}$ are both equal to the inner product of $R_i$ and $R_j$, the i-th and j-th rows of $A$.

Suppose, for example,

$$A = \begin{bmatrix} 1 & 2 & 0 \\ 3 & 1 & 4 \end{bmatrix}.$$

Then

$$AA^T = \begin{bmatrix} 5 & 5 \\ 5 & 26 \end{bmatrix} \qquad \text{and} \qquad A^T A = \begin{bmatrix} 10 & 5 & 12 \\ 5 & 5 & 4 \\ 12 & 4 & 16 \end{bmatrix},$$

both of which are symmetric.

Example: Evaluating A^n

Let

$$A = U = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}.$$

We wish to evaluate $A^n$. Notice that $U^2 = 2I$, so for even exponents $U^{2k} = (U^2)^k = 2^k I$. Notice also that for odd exponents $U^{2k+1} = U^{2k} U = 2^k U$. Hence

$$A^n = \begin{cases} 2^{n/2}\, I & \text{if } n \text{ is even,} \\ 2^{(n-1)/2}\, U & \text{if } n \text{ is odd.} \end{cases}$$

Theorem (Properties of Symmetric Matrices)
If $A$ and $B$ are symmetric matrices of order $n$, then:
- $A^T$ is symmetric
- $A + B$ and $A - B$ are symmetric
- $rA$ is symmetric for any real number $r$
- If $A$ is invertible, then $A^{-1}$ is symmetric

Proofs: Remember that $(A^{-1})^T = (A^T)^{-1}$. Since $A = A^T$ and $B = B^T$, it follows that:
- $(A^T)^T = A = A^T$, so $A^T$ is symmetric
- $(A + B)^T = A^T + B^T = A + B$ and $(A - B)^T = A^T - B^T = A - B$
- $(rA)^T = r(A^T) = rA$
- $(A^{-1})^T = (A^T)^{-1} = A^{-1}$, so $A^{-1}$ is symmetric

Triangular Matrices

Triangular matrices come in two varieties: upper triangular and lower triangular. An upper triangular matrix is zero below the diagonal. This means the matrix looks like

$$\begin{bmatrix} * & * & \cdots & * \\ 0 & * & \cdots & * \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & * \end{bmatrix}$$

where $*$ may be zero or nonzero. Viewed another way, if $A = [a_{i,j}]$, then $A$ is upper triangular if $a_{i,j} = 0$ whenever $i > j$. Similarly, a lower triangular matrix is zero above the diagonal. This means the matrix looks like

$$\begin{bmatrix} * & 0 & \cdots & 0 \\ * & * & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ * & * & \cdots & * \end{bmatrix}$$

where $*$ may be zero or nonzero. In this case $A$ is lower triangular if $a_{i,j} = 0$ whenever $i < j$. A matrix is triangular if it is either upper triangular or lower triangular.

Triangularity and the Transpose

Theorem (Transpose of Triangular Matrices)
- The transpose of an upper triangular matrix is lower triangular.
- The transpose of a lower triangular matrix is upper triangular.
- The transpose of a triangular matrix is triangular.

Proofs: The i-j entry of $A$ is the j-i entry of $A^T$. If $A$ is lower triangular, then the i-j entry of $A$ is zero if $i < j$, hence so is the j-i entry of $A^T$. This means that $A^T$ is upper triangular. A similar argument holds if $A$ is upper triangular or if $A$ is triangular. Informally: transposing flips the triangle of nonzero entries across the main diagonal, carrying the lower triangle to the upper triangle and vice versa.

Triangularity and Inverses

Theorem (Products and Inverses of Upper Triangular Matrices)
Informally: the product of upper triangular matrices is upper triangular, and the inverse of an invertible upper triangular matrix is upper triangular.
Formally: Suppose $A$ and $B$ are two square matrices of order $n$. Then:
- If $A$ and $B$ are upper triangular, then so is $AB$
- An upper triangular matrix $A$ is invertible if and only if each diagonal entry is nonzero
- If $A$ is upper triangular and invertible, then $A^{-1}$ is upper triangular

Proofs: If $A$ and $B$ are both upper triangular, then $a_{i,j} = b_{i,j} = 0$ whenever $i > j$. Now consider the i-j entry in the product $AB$:

$$(AB)_{i,j} = \sum_{k=1}^{n} a_{i,k} b_{k,j}.$$

The upper triangularity of $A$ and $B$ implies $a_{i,k} = 0$ unless $i \leq k$, and $b_{k,j} = 0$ unless $k \leq j$. For this sum to be nonzero we need $i \leq k \leq j$, and in particular $i \leq j$. This means that $(AB)_{i,j} = 0$ whenever $i > j$, so $AB$ is upper triangular.

Suppose $A$ is upper triangular and we want to solve $AX = I$. The matrix itself is already in the form for Gaussian elimination! If all the diagonal entries are nonzero, we can back substitute to solve $AX = I$. Alternatively, we can change each diagonal entry to 1 with an elementary row operation and change all entries above the diagonal to zero to get $I_n$ as the reduced row echelon form. In either case, $A$ is invertible. On the other hand, if there is a zero on the diagonal, then there is a free variable, and $A$ is not invertible.

Finally, suppose that $A$ is upper triangular and invertible. This means we can start with the matrix $[A \mid I]$ and, by elementary row operations, reduce it to $[I \mid A^{-1}]$. For each diagonal element $d_i = a_{i,i}$ we have $d_i \neq 0$, hence the elementary row operations $R_i \to d_i^{-1} R_i$ for $i = 1, \ldots, n$ change the diagonal entries to 1. To change an entry above the diagonal, say $a_{i,j}$, to zero we use $R_i \to R_i - a_{i,j} R_j$. Note that $i < j$ since the entry is above the diagonal. The entries below the diagonal are already zero. What is the effect on the right side of $[A \mid I]$ when we follow this process to reduce $A$ to $I$? The first part just changes the diagonal entries to $d_i^{-1}$. The second part just changes entries that are above the diagonal. Hence the right side of $[I \mid A^{-1}]$ must be upper triangular, and the proof of the theorem is complete.
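The theorem can be illustrated numerically. The sketch below (the matrices and the helper `back_substitute` are illustrative, not from the notes) checks that the product of two upper triangular matrices is upper triangular, then solves $AX = I$ column by column using the back substitution described above, and confirms that the resulting inverse is itself upper triangular:

```python
import numpy as np

A = np.array([[2.0, 1.0, 3.0],
              [0.0, 4.0, 5.0],
              [0.0, 0.0, 6.0]])
B = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, 4.0],
              [0.0, 0.0, 3.0]])

# The product of upper triangular matrices is upper triangular.
assert np.allclose(A @ B, np.triu(A @ B))

def back_substitute(U, b):
    """Solve Ux = b for upper triangular U with nonzero diagonal entries."""
    n = U.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

# Solve AX = I one column at a time; the result is the inverse of A.
X = np.column_stack([back_substitute(A, e) for e in np.eye(3)])
assert np.allclose(A @ X, np.eye(3))

# The inverse of an upper triangular matrix is upper triangular.
assert np.allclose(X, np.triu(X))
```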