6. Cholesky factorization




6. Cholesky factorization

EE103 (Fall 2011-12)

- triangular matrices
- forward and backward substitution
- the Cholesky factorization
- solving Ax = b with A positive definite
- inverse of a positive definite matrix
- permutation matrices
- sparse Cholesky factorization

Cholesky factorization 6-1

Triangular matrix

a square matrix A is lower triangular if a_ij = 0 for j > i:

        [ a_11       0         ...  0            0    ]
        [ a_21       a_22      ...  0            0    ]
    A = [   .          .              .          .    ]
        [ a_{n-1,1}  a_{n-1,2} ...  a_{n-1,n-1}  0    ]
        [ a_{n,1}    a_{n,2}   ...  a_{n,n-1}    a_nn ]

A is upper triangular if a_ij = 0 for j < i (equivalently, A^T is lower triangular)

a triangular matrix is unit upper/lower triangular if a_ii = 1 for all i

Forward substitution

solve Ax = b when A is lower triangular with nonzero diagonal elements:

    [ a_11  0     ...  0    ] [ x_1 ]   [ b_1 ]
    [ a_21  a_22  ...  0    ] [ x_2 ] = [ b_2 ]
    [   .     .         .   ] [  .  ]   [  .  ]
    [ a_n1  a_n2  ...  a_nn ] [ x_n ]   [ b_n ]

algorithm:

    x_1 := b_1 / a_11
    x_2 := (b_2 - a_21 x_1) / a_22
    x_3 := (b_3 - a_31 x_1 - a_32 x_2) / a_33
    ...
    x_n := (b_n - a_n1 x_1 - a_n2 x_2 - ... - a_{n,n-1} x_{n-1}) / a_nn

cost: 1 + 3 + 5 + ... + (2n-1) = n^2 flops
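The recurrence above translates directly into code. A minimal sketch in Python (plain lists, not from the slides):

```python
def forward_subst(A, b):
    """Solve Ax = b for lower triangular A with nonzero diagonal (n^2 flops)."""
    n = len(b)
    x = [0.0] * n
    for i in range(n):
        # subtract a_i1 x_1 + ... + a_{i,i-1} x_{i-1}, then divide by a_ii
        s = sum(A[i][j] * x[j] for j in range(i))
        x[i] = (b[i] - s) / A[i][i]
    return x
```

For example, `forward_subst([[2.0, 0.0], [1.0, 4.0]], [6.0, 11.0])` returns `[3.0, 2.0]`.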

Back substitution

solve Ax = b when A is upper triangular with nonzero diagonal elements:

    [ a_11  ...  a_{1,n-1}    a_1n     ] [ x_1     ]   [ b_1     ]
    [  .           .            .      ] [  .      ]   [  .      ]
    [ 0     ...  a_{n-1,n-1}  a_{n-1,n}] [ x_{n-1} ] = [ b_{n-1} ]
    [ 0     ...  0            a_nn     ] [ x_n     ]   [ b_n     ]

algorithm:

    x_n := b_n / a_nn
    x_{n-1} := (b_{n-1} - a_{n-1,n} x_n) / a_{n-1,n-1}
    x_{n-2} := (b_{n-2} - a_{n-2,n-1} x_{n-1} - a_{n-2,n} x_n) / a_{n-2,n-2}
    ...
    x_1 := (b_1 - a_12 x_2 - a_13 x_3 - ... - a_1n x_n) / a_11

cost: n^2 flops
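The same idea run in reverse order, again as a minimal Python sketch (not from the slides):

```python
def back_subst(A, b):
    """Solve Ax = b for upper triangular A with nonzero diagonal (n^2 flops)."""
    n = len(b)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        # subtract a_{i,i+1} x_{i+1} + ... + a_in x_n, then divide by a_ii
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x
```

For example, `back_subst([[2.0, 1.0], [0.0, 4.0]], [8.0, 8.0])` returns `[3.0, 2.0]`.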

Inverse of a triangular matrix

a triangular matrix A with nonzero diagonal elements is nonsingular:

- Ax = b is solvable via forward/back substitution; hence A has full range
- therefore A has a zero nullspace, is invertible, etc. (see p. 4-8)

the inverse can be computed by solving AX = I column by column:

    A [ X_1  X_2  ...  X_n ] = [ e_1  e_2  ...  e_n ]

the inverse of a lower triangular matrix is lower triangular; the inverse of an upper triangular matrix is upper triangular
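Solving AX = I column by column can be sketched as follows (the function names are illustrative; `forward_subst` is the forward substitution routine described earlier):

```python
def forward_subst(A, b):
    # Solve Ax = b for lower triangular A with nonzero diagonal.
    n = len(b)
    x = [0.0] * n
    for i in range(n):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i))) / A[i][i]
    return x

def lower_tri_inverse(A):
    # Column k of X = A^{-1} solves A x = e_k (forward substitution per column).
    n = len(A)
    cols = [forward_subst(A, [1.0 if i == k else 0.0 for i in range(n)])
            for k in range(n)]
    # assemble X from its columns (cols[k] is column k)
    return [[cols[j][i] for j in range(n)] for i in range(n)]
```

Note that the result is itself lower triangular, as claimed above.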

Cholesky factorization

every positive definite matrix A can be factored as

    A = L L^T

where L is lower triangular with positive diagonal elements

- cost: (1/3) n^3 flops if A is of order n
- L is called the Cholesky factor of A
- L can be interpreted as a square root of a positive definite matrix

Cholesky factorization algorithm

partition the matrices in A = L L^T as

    [ a_11  A_21^T ]   [ l_11  0    ] [ l_11  L_21^T ]
    [ A_21  A_22   ] = [ L_21  L_22 ] [ 0     L_22^T ]

                       [ l_11^2     l_11 L_21^T               ]
                     = [ l_11 L_21  L_21 L_21^T + L_22 L_22^T ]

algorithm:

1. determine l_11 and L_21:

       l_11 = sqrt(a_11),   L_21 = (1/l_11) A_21

2. compute L_22 from

       A_22 - L_21 L_21^T = L_22 L_22^T

   this is a Cholesky factorization of order n-1
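The recursive algorithm above can be written out directly. A sketch in Python using plain lists (illustrative, not a production routine):

```python
import math

def cholesky(A):
    """Recursive Cholesky factorization A = L L^T for positive definite A.

    Step 1 computes the first column of L; step 2 recursively factors
    the (n-1) x (n-1) matrix A22 - L21 L21^T.
    """
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    l11 = math.sqrt(A[0][0])            # l_11 = sqrt(a_11)
    L[0][0] = l11
    if n == 1:
        return L
    L21 = [A[i][0] / l11 for i in range(1, n)]   # L_21 = (1/l_11) A_21
    for i in range(1, n):
        L[i][0] = L21[i - 1]
    # S = A_22 - L_21 L_21^T, then S = L_22 L_22^T
    S = [[A[i][j] - L21[i - 1] * L21[j - 1] for j in range(1, n)]
         for i in range(1, n)]
    L22 = cholesky(S)
    for i in range(1, n):
        for j in range(1, n):
            L[i][j] = L22[i - 1][j - 1]
    return L
```

On the 3 x 3 example worked out below, this produces the factor with columns (5, 3, 1), (0, 3, -1), (0, 0, 3).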

proof that the algorithm works for positive definite A of order n:

- step 1: if A is positive definite then a_11 > 0
- step 2: if A is positive definite, then

      A_22 - L_21 L_21^T = A_22 - (1/a_11) A_21 A_21^T

  is positive definite (see page 4-23)

hence the algorithm works for n = m if it works for n = m-1; it obviously works for n = 1; therefore it works for all n

Example

    [ 25  15   5 ]   [ l_11  0     0    ] [ l_11  l_21  l_31 ]
    [ 15  18   0 ] = [ l_21  l_22  0    ] [ 0     l_22  l_32 ]
    [  5   0  11 ]   [ l_31  l_32  l_33 ] [ 0     0     l_33 ]

first column of L: l_11 = sqrt(25) = 5, l_21 = 15/5 = 3, l_31 = 5/5 = 1

    [ 25  15   5 ]   [ 5  0     0    ] [ 5  3     1    ]
    [ 15  18   0 ] = [ 3  l_22  0    ] [ 0  l_22  l_32 ]
    [  5   0  11 ]   [ 1  l_32  l_33 ] [ 0  0     l_33 ]

second column of L:

    [ 18   0 ]   [ 3 ]           [  9  -3 ]   [ l_22  0    ] [ l_22  l_32 ]
    [  0  11 ] - [ 1 ] [ 3 1 ] = [ -3  10 ] = [ l_32  l_33 ] [ 0     l_33 ]

so l_22 = sqrt(9) = 3 and l_32 = -3/3 = -1

third column of L: 10 - 1 = l_33^2, i.e., l_33 = 3

conclusion:

    [ 25  15   5 ]   [ 5   0  0 ] [ 5  3   1 ]
    [ 15  18   0 ] = [ 3   3  0 ] [ 0  3  -1 ]
    [  5   0  11 ]   [ 1  -1  3 ] [ 0  0   3 ]

Solving equations with positive definite A

Ax = b (A positive definite of order n)

algorithm:

1. factor A as A = L L^T
2. solve L L^T x = b:
   - forward substitution: L z = b
   - back substitution: L^T x = z

cost: (1/3) n^3 flops
- factorization: (1/3) n^3
- forward and backward substitution: 2 n^2
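The factor-solve method above, sketched end to end in Python (plain lists; this version uses a compact column-wise loop for the factorization rather than the recursive formulation):

```python
import math

def cholesky(A):
    """Column-wise Cholesky factorization A = L L^T (A positive definite)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for j in range(n):
        d = A[j][j] - sum(L[j][k] ** 2 for k in range(j))
        L[j][j] = math.sqrt(d)
        for i in range(j + 1, n):
            L[i][j] = (A[i][j] - sum(L[i][k] * L[j][k] for k in range(j))) / L[j][j]
    return L

def solve_pd(A, b):
    """Solve Ax = b for positive definite A: factor, then two triangular solves."""
    n = len(b)
    L = cholesky(A)
    z = [0.0] * n                      # forward substitution: L z = b
    for i in range(n):
        z[i] = (b[i] - sum(L[i][j] * z[j] for j in range(i))) / L[i][i]
    x = [0.0] * n                      # back substitution: L^T x = z
    for i in range(n - 1, -1, -1):
        x[i] = (z[i] - sum(L[j][i] * x[j] for j in range(i + 1, n))) / L[i][i]
    return x
```

With the example matrix above and b = (45, 33, 16), `solve_pd` returns x = (1, 1, 1).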

Inverse of a positive definite matrix

suppose A is positive definite with Cholesky factorization A = L L^T

- L is invertible (its diagonal elements are nonzero)
- X = L^{-T} L^{-1} is a right inverse of A:

      A X = L L^T L^{-T} L^{-1} = L L^{-1} = I

- X = L^{-T} L^{-1} is a left inverse of A:

      X A = L^{-T} L^{-1} L L^T = L^{-T} L^T = I

hence, A is invertible and A^{-1} = L^{-T} L^{-1}

Summary

if A is positive definite of order n:

- A can be factored as L L^T
- the cost of the factorization is (1/3) n^3 flops
- Ax = b can be solved in (1/3) n^3 flops
- A is invertible with inverse A^{-1} = L^{-T} L^{-1}

Sparse positive definite matrices

a matrix is sparse if most of its elements are zero; a matrix is dense if it is not sparse

Cholesky factorization of dense matrices:
- (1/3) n^3 flops
- on a current PC: a few seconds or less, for n up to a few thousand

Cholesky factorization of sparse matrices:
- if A is very sparse, then L is often (but not always) sparse
- if L is sparse, the cost of the factorization is much less than (1/3) n^3
- the exact cost depends on n, the number of nonzero elements, and the sparsity pattern
- very large sets of equations (n ~ 10^6) are solved by exploiting sparsity

Effect of ordering

sparse equation (a is an (n-1)-vector with ||a|| < 1):

    [ 1    a^T ] [ u ]   [ b ]
    [ a    I   ] [ v ] = [ c ]

factorization:

    [ 1    a^T ]   [ 1    0    ] [ 1    a^T    ]
    [ a    I   ] = [ a    L_22 ] [ 0    L_22^T ]

where I - a a^T = L_22 L_22^T: a factorization with 100% fill-in

reordered equation:

    [ I    a ] [ v ]   [ c ]
    [ a^T  1 ] [ u ] = [ b ]

factorization:

    [ I    a ]   [ I    0               ] [ I    a               ]
    [ a^T  1 ] = [ a^T  sqrt(1 - a^T a) ] [ 0    sqrt(1 - a^T a) ]

a factorization with zero fill-in

Permutation matrices

a permutation matrix is the identity matrix with its rows reordered, e.g.,

    [ 0  1  0 ]      [ 0  1  0 ]
    [ 1  0  0 ],     [ 0  0  1 ]
    [ 0  0  1 ]      [ 1  0  0 ]

the vector Ax is a permutation of x:

    [ 0  1  0 ] [ x_1 ]   [ x_2 ]
    [ 0  0  1 ] [ x_2 ] = [ x_3 ]
    [ 1  0  0 ] [ x_3 ]   [ x_1 ]

A^T x is the inverse permutation applied to x:

    [ 0  0  1 ] [ x_1 ]   [ x_3 ]
    [ 1  0  0 ] [ x_2 ] = [ x_1 ]
    [ 0  1  0 ] [ x_3 ]   [ x_2 ]

A^T A = A A^T = I, so A is invertible and A^{-1} = A^T

Solving Ax = b when A is a permutation matrix

the solution of Ax = b is x = A^T b

example:

    [ 0  1  0 ] [ x_1 ]   [ 1.5  ]
    [ 0  0  1 ] [ x_2 ] = [ 10.0 ]
    [ 1  0  0 ] [ x_3 ]   [ 2.1  ]

solution: x = (2.1, 1.5, 10.0)

cost: zero flops
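In code, a permutation matrix never needs to be stored as a matrix: a list p with p[i] giving the column of the 1 in row i is enough. A sketch (the representation and function names are illustrative):

```python
def perm_apply(p, x):
    # Row i of the permutation matrix A has its 1 in column p[i],
    # so (Ax)_i = x_{p[i]}.
    return [x[j] for j in p]

def perm_solve(p, b):
    # Solve Ax = b: x = A^T b, i.e. undo the permutation (zero flops).
    x = [None] * len(b)
    for i, j in enumerate(p):
        x[j] = b[i]
    return x
```

With p = [1, 2, 0] (the matrix in the example above), `perm_solve([1, 2, 0], [1.5, 10.0, 2.1])` returns `[2.1, 1.5, 10.0]`.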

Sparse Cholesky factorization

if A is sparse and positive definite, it is usually factored as

    A = P L L^T P^T

with P a permutation matrix and L lower triangular with positive diagonal elements

interpretation: we permute the rows and columns of A and factor

    P^T A P = L L^T

- the choice of P greatly affects the sparsity of L
- many heuristic methods (that we don't cover) exist for selecting good permutation matrices P

Example

(figures, omitted: sparsity pattern of A; Cholesky factor of A; sparsity pattern of P^T A P; Cholesky factor of P^T A P)

Solving sparse positive definite equations

solve Ax = b via the factorization A = P L L^T P^T

algorithm:

1. b := P^T b
2. solve L z = b by forward substitution
3. solve L^T y = z by back substitution
4. x := P y

interpretation: we solve (P^T A P) y = P^T b using the Cholesky factorization of P^T A P, then recover x = P y
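The four steps above can be sketched end to end. In this illustration the permutation is given as an index list q (so that the reordered matrix is A[q][q], standing in for P^T A P); the ordering is supplied by the caller, since fill-reducing heuristics are not covered here. Function names are illustrative:

```python
import math

def cholesky(A):
    # Column-wise Cholesky A = L L^T (A positive definite).
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for j in range(n):
        L[j][j] = math.sqrt(A[j][j] - sum(L[j][k] ** 2 for k in range(j)))
        for i in range(j + 1, n):
            L[i][j] = (A[i][j] - sum(L[i][k] * L[j][k] for k in range(j))) / L[j][j]
    return L

def solve_with_ordering(A, b, q):
    # Solve Ax = b after symmetrically reordering A with index list q.
    n = len(b)
    M = [[A[q[i]][q[j]] for j in range(n)] for i in range(n)]  # P^T A P
    L = cholesky(M)
    bq = [b[q[i]] for i in range(n)]            # step 1: b := P^T b
    z = [0.0] * n                               # step 2: forward substitution
    for i in range(n):
        z[i] = (bq[i] - sum(L[i][j] * z[j] for j in range(i))) / L[i][i]
    y = [0.0] * n                               # step 3: back substitution
    for i in range(n - 1, -1, -1):
        y[i] = (z[i] - sum(L[j][i] * y[j] for j in range(i + 1, n))) / L[i][i]
    x = [0.0] * n                               # step 4: x := P y
    for i in range(n):
        x[q[i]] = y[i]
    return x
```

On the arrow-shaped example from the ordering discussion, q = [1, 2, 0] moves the dense row and column to the end, giving a factor with zero fill-in.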