REVIEW Lecture 6 (Spring 2015):
Direct methods for solving linear algebraic equations
- LU decomposition/factorization: separates the time-consuming elimination of A from the manipulations of b / B
- Derivation, assuming no pivoting is needed
- Number of operations: same as for Gauss elimination
- Pivoting: use a pivot-element pointer vector
- Variations: Doolittle and Crout decompositions; matrix inverse
Error analysis for linear systems
- Matrix norms
- Condition number, for perturbed RHS and LHS
Special matrices: introduction
PFJL Lecture 7, 1
TODAY (Lecture 7): Systems of Linear Equations III
Direct methods
- Gauss elimination; LU decomposition/factorization
- Error analysis for linear systems
Special matrices: LU decompositions
- Tri-diagonal systems: Thomas algorithm
- General banded matrices: algorithm, pivoting and modes of storage
- Sparse and banded matrices
- Symmetric, positive-definite matrices: definitions and properties, Cholesky decomposition
Iterative methods
- Jacobi's method
- Gauss-Seidel iteration
- Convergence
Reading Assignment
- Chapter 11 of Chapra and Canale, Numerical Methods for Engineers (2006/2010/2014 editions).
- Any chapter on solving linear systems of equations in the CFD references provided; for example, chapter 5 of J. H. Ferziger and M. Peric, Computational Methods for Fluid Dynamics, Springer, NY, 3rd edition, 2002.
Special Matrices
Certain matrices have particular structures that can be exploited, i.e. to reduce the number of operations and the memory needs.
Banded matrices: square matrices with all elements equal to zero except for a band around the main diagonal. Frequent in engineering and in discretized differential equations:
- Tri-diagonal matrices
- Wider bands for higher-order schemes
Standard Gauss elimination or LU decomposition is inefficient here: if pivoting is not necessary, all elements outside of the band remain zero, but direct GE/LU would manipulate these zero elements anyway.
Symmetric matrices
Iterative methods: employ initial guesses, then iterate to refine the solution. Can be subject to round-off errors.
Special Matrices: Tri-diagonal Systems
Example: forced vibration of a string, with forcing f(x,t) and displacement Y(x,t) (e.g. a traveling pulse along x). Consider the case of a harmonic excitation f(x,t) = -f(x) cos(wt).
Applying Newton's law leads to the wave equation
  Y_tt = c^2 Y_xx + f(x,t)
With separation of variables, Y(x,t) = phi(t) y(x) = y(x) cos(wt), one obtains the differential equation for the modal amplitude:
  (1)  y'' + k^2 y = f(x)/c^2,  with k = w/c
Boundary conditions: y(0) = y(L) = 0 (fixed ends).
Special Matrices: Tri-diagonal Systems
Forced vibration of a string, harmonic excitation f(x,t) = -f(x) cos(wt).
Differential equation (1): y'' + k^2 y = f(x)/c^2, with boundary conditions y(0) = y(L) = 0.
Finite differences, accurate to O(h^2), give the discrete difference equations at interior nodes j = 1, ..., n:
  (y_{j-1} - 2 y_j + y_{j+1})/h^2 + k^2 y_j = f_j / c^2
i.e., in matrix form,
  y_{j-1} + (k^2 h^2 - 2) y_j + y_{j+1} = h^2 f_j / c^2
which is a tridiagonal matrix system.
If kh <= 1 or kh >= 3, the matrix is symmetric and negative or positive definite: no pivoting is needed.
Note: for 0 < kh < 1 the matrix is negative definite => write A' = -A and y' = -y to render the matrix positive definite.
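As an illustration, the difference equations above can be assembled into the three diagonals of the tridiagonal system. This is a minimal sketch: the function name is hypothetical, and the forcing is assumed to be passed already scaled as f(x)/c^2 sampled at the interior nodes.

```python
def string_system(n, h, k, f):
    """Assemble the tridiagonal system for y'' + k^2 y = f(x)/c^2,
    discretized by second-order central differences on n interior nodes
    with spacing h and y = 0 at both ends.

    f: list of f(x_j)/c^2 at the n interior nodes.
    Returns (sub, diag, sup, rhs): sub- and super-diagonals are 1,
    the diagonal is k^2 h^2 - 2, and the RHS is h^2 f_j / c^2.
    """
    sub = [1.0] * n                       # coefficient of y_{j-1} (sub[0] unused)
    diag = [k * k * h * h - 2.0] * n      # coefficient of y_j
    sup = [1.0] * n                       # coefficient of y_{j+1} (sup[-1] unused)
    rhs = [h * h * fj for fj in f]
    return sub, diag, sup, rhs
```

Feeding these diagonals to a tridiagonal solver such as the Thomas algorithm then yields the modal amplitudes y_j in O(n) operations.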
Special Matrices: Tri-diagonal Systems
General tri-diagonal systems have bandwidth 3: a_i x_{i-1} + b_i x_i + c_i x_{i+1} = d_i, i = 1, ..., n.
LU decomposition: L is unit lower bi-diagonal (holding the multipliers) and U is upper bi-diagonal.
Three steps for the LU scheme:
1. Decomposition (GE)
2. Forward substitution
3. Backward substitution
Special Matrices: Tri-diagonal Systems: Thomas Algorithm
By identification with the general LU decomposition, one obtains (with a_i the sub-diagonal, b_i the diagonal, c_i the super-diagonal and d_i the RHS):
1. Factorization/decomposition: for i = 2, ..., n: m_i = a_i / b_{i-1}; b_i <- b_i - m_i c_{i-1}
2. Forward substitution: d_i <- d_i - m_i d_{i-1}, i = 2, ..., n
3. Back substitution: x_n = d_n / b_n; then x_i = (d_i - c_i x_{i+1}) / b_i, i = n-1, ..., 1
Number of operations for the Thomas algorithm:
- LU factorization: 3(n-1) operations
- Forward substitution: 2(n-1) operations
- Back substitution: 3(n-1)+1 operations
- Total: ~8(n-1), i.e. O(n) operations
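The three steps above can be sketched in a few lines. This is a minimal sketch assuming no pivoting is needed (e.g. a diagonally dominant or positive-definite matrix); the factorization and the forward substitution on the right-hand side are fused into one loop.

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system A x = d in O(n) operations.

    a: sub-diagonal  (a[0] unused), length n
    b: main diagonal, length n
    c: super-diagonal (c[n-1] unused), length n
    Inputs are copied so the caller's lists are not modified.
    Assumes no pivoting is needed.
    """
    n = len(b)
    b, d = list(b), list(d)
    # 1. Factorization + 2. forward substitution on the RHS (fused)
    for i in range(1, n):
        m = a[i] / b[i - 1]          # multiplier m_i of the LU factors
        b[i] -= m * c[i - 1]         # updated pivot of U
        d[i] -= m * d[i - 1]         # forward substitution
    # 3. Back substitution
    x = [0.0] * n
    x[n - 1] = d[n - 1] / b[n - 1]
    for i in range(n - 2, -1, -1):
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
    return x
```

For example, the classic tridiag(-1, 2, -1) system with d = [1, 0, 1] has solution x = [1, 1, 1].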
Special Matrices: General, Banded Matrix
General banded matrix (p != q): p super-diagonals and q sub-diagonals; w = p + q + 1 is the bandwidth.
Banded symmetric matrix (p = q = b): w = 2b + 1 is called the bandwidth; b is the half-bandwidth.
Special Matrices: General, Banded Matrix
LU decomposition via Gaussian elimination. If no pivoting is needed, the zeros outside the band are preserved:
  m_ij = a_ij^(j) / a_jj^(j)  (as in the general case),  with m_ij = 0 if i > j + q  (banded)
  u_ij = a_ij^(i),  a_ij^(i) = a_ij^(i-1) - m_{i,i-1} a_{i-1,j}^(i-1)  (as in the general case),  with u_ij = 0 if j > i + p  (banded)
So L keeps the lower bandwidth q of A, and U keeps the upper bandwidth p of A.
Special Matrices: General, Banded Matrix
LU decomposition via Gaussian elimination with partial pivoting (by rows). Consider pivoting two rows as below: row j can be swapped with one of the q rows beneath it.
Then the bandwidth of L remains unchanged: m_ij = 0 if i > j + q,
but the upper bandwidth of U grows beyond that of A: u_ij = 0 if j > i + p + q,
so the effective bandwidth becomes w = p + 2q + 1.
Special Matrices: General, Banded Matrix: Compact Storage
Instead of the full n x n matrix, store the band as an n x (p + 2q + 1) array: element a_ij is stored at column j' = j - i + q, so each diagonal of A becomes a column of the compact array (the main diagonal has length n).
The extra q columns are needed only when partial pivoting is used (fill-in).
Matrix size: n^2 (full) versus n (p + 2q + 1) (compact).
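The mapping j' = j - i + q can be sketched as follows. The function name and 0-based indexing are assumptions of this illustration; only p + q + 1 columns are allocated since no pivoting is assumed here.

```python
def to_banded(A, p, q):
    """Pack a dense n x n banded matrix (p super-, q sub-diagonals)
    into an n x (p + q + 1) array: A[i][j] is stored at B[i][j - i + q].
    (With partial pivoting one would allocate p + 2q + 1 columns instead.)
    """
    n = len(A)
    w = p + q + 1
    B = [[0.0] * w for _ in range(n)]
    for i in range(n):
        # only columns within the band of row i are visited
        for j in range(max(0, i - q), min(n, i + p + 1)):
            B[i][j - i + q] = A[i][j]
    return B
```

Each row of the compact array holds the band entries of the corresponding row of A, aligned so that the main diagonal sits in column q.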
Special Matrices: Sparse and Banded Matrix
Skyline systems (typically for symmetric matrices): store, for each column, the entries from the first nonzero down to the diagonal in one contiguous array, with a pointer array locating the diagonal elements (e.g. pointers 1, 4, 9, 11, 16, ...).
Skyline storage is applicable when no pivoting is needed, e.g. for banded, symmetric, positive-definite matrices, as arise in FEM and FD methods.
Skyline solvers are usually based on Cholesky factorization (which preserves the skyline).
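A minimal sketch of skyline packing for the lower triangle of a symmetric matrix; the exact layout here (row-wise packing, 0-based pointers) is a hypothetical choice for illustration, and production skyline solvers differ in detail.

```python
def to_skyline(A):
    """Pack the lower triangle of a symmetric matrix in skyline form.

    For each row i, keep the entries from the first structural nonzero
    up to (and including) the diagonal. Returns the packed values and a
    pointer array: dptr[i] is the index of A[i][i] within vals.
    """
    vals, dptr = [], []
    n = len(A)
    for i in range(n):
        first = 0
        while first < i and A[i][first] == 0.0:
            first += 1                      # skip leading zeros of row i
        vals.extend(A[i][first:i + 1])      # first nonzero .. diagonal
        dptr.append(len(vals) - 1)          # position of the diagonal
    return vals, dptr
```

Zeros inside the skyline are stored (they fill in during factorization); zeros above the skyline are never stored.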
Special Matrices: Symmetric (Positive-Definite) Matrix
Symmetric coefficient matrices: if no pivoting is done, the matrix remains symmetric after each step of Gauss elimination / LU decomposition.
Proof: show that if a_ij^(k) = a_ji^(k), then a_ij^(k+1) = a_ji^(k+1), using the GE update
  a_ij^(k+1) = a_ij^(k) - m_ik a_kj^(k),  m_ik = a_ki^(k) / a_kk^(k),  i = k+1, k+2, ..., n,  j = i, i+1, ..., n
Gauss elimination can thus be kept symmetric (use only the upper triangular portion of A): about half the total number of operations of full GE.
Special Matrices: Symmetric, Positive-Definite Matrix
1. Sylvester criterion: a symmetric matrix A is positive definite if and only if det(A_k) > 0 for k = 1, 2, ..., n, where A_k is the leading principal sub-matrix made of the first k rows and columns. Symmetric positive-definite matrices are frequent in engineering.
2. For a symmetric positive-definite A, one thus has the following properties:
a) The maximum elements of A are on the main diagonal.
b) No pivoting is needed.
c) The elimination is stable: a_ii^(k+1) <= a_ii^(k). To show this, use (a_kj^(k))^2 <= a_kk^(k) a_jj^(k) in the GE update
  a_ij^(k+1) = a_ij^(k) - m_ik a_kj^(k),  m_ik = a_ki^(k) / a_kk^(k),  i = k+1, k+2, ..., n,  j = i, i+1, ..., n
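The Sylvester criterion can be checked directly on small matrices. A sketch only: Laplace-expansion determinants cost O(k!) and are suitable purely for illustration, not for large systems.

```python
def det(M):
    """Determinant by Laplace expansion along the first row
    (fine for the small k x k leading minors used here)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def is_positive_definite(A):
    """Sylvester criterion: symmetric A is positive definite iff every
    leading principal minor det(A_k) is positive, k = 1, ..., n."""
    n = len(A)
    return all(det([row[:k] for row in A[:k]]) > 0 for k in range(1, n + 1))
```

For the tridiag(-1, 2, -1) matrix the leading minors are 2, 3, 4, so it is positive definite; [[1, 2], [2, 1]] fails at k = 2 (minor -3).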
Special Matrices: Symmetric, Positive-Definite Matrix
For a symmetric positive-definite A, the general GE update
  a_ij^(k+1) = a_ij^(k) - m_ik a_kj^(k),  m_ik = a_ki^(k) / a_kk^(k),  i = k+1, k+2, ..., n,  j = i, i+1, ..., n
becomes the Cholesky factorization A = L L^T, where
  l_jj = sqrt( a_jj - sum_{s<j} l_js^2 ),   l_ij = ( a_ij - sum_{s<j} l_is l_js ) / l_jj,  i > j
No pivoting is needed. For a Hermitian positive-definite A, this is A = L L^H, with ^H the complex-conjugate transpose.
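A minimal sketch of the (real) Cholesky factorization, looping column by column with the formulas above:

```python
import math

def cholesky(A):
    """Cholesky factorization A = L L^T of a real SPD matrix.

    Returns the lower-triangular factor L. Only the lower triangle of A
    is accessed; no pivoting is required for SPD matrices.
    """
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for j in range(n):
        s = A[j][j] - sum(L[j][k] ** 2 for k in range(j))
        L[j][j] = math.sqrt(s)              # positive for SPD input
        for i in range(j + 1, n):
            L[i][j] = (A[i][j] - sum(L[i][k] * L[j][k] for k in range(j))) / L[j][j]
    return L
```

In the Hermitian case the squares and products become |l_jk|^2 and l_ik conj(l_jk).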
Linear Systems of Equations: Iterative Methods
Sparse, large systems whose nonzeros are spread across the full bandwidth are frequent in practice; banded direct solvers then bring little benefit, and iterative methods are efficient.
They are analogous to the open iterative methods for roots of equations: fixed-point, Newton-Raphson, secant.
Example of an iteration equation, starting from A x = b:
  x = x - (A x - b)  =>  x^{k+1} = x^k - (A x^k - b) = (I - A) x^k + b
General stationary iteration formula:
  x^{k+1} = B x^k + c,  k = 0, 1, 2, ...
(ps: B and c could be functions of k: non-stationary methods.)
Compatibility condition for x = A^{-1} b to be the fixed-point solution: write c = C b; then A^{-1} b = B A^{-1} b + C b  =>  (I - B) A^{-1} = C  =>  B = I - C A.
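The general stationary formula x^{k+1} = B x^k + c can be sketched generically; the 2 x 2 example in the usage note below (with B = I - A and c = b) is a hypothetical illustration.

```python
def stationary_iteration(B, c, x0, iters):
    """Run the stationary scheme x^{k+1} = B x^k + c for a fixed
    number of iterations, starting from x0."""
    x = list(x0)
    n = len(B)
    for _ in range(iters):
        x = [sum(B[i][j] * x[j] for j in range(n)) + c[i] for i in range(n)]
    return x
```

For instance, with A = [[0.7, 0.2], [0.1, 0.8]] and b = [0.9, 0.9], choosing B = I - A = [[0.3, -0.2], [-0.1, 0.2]] and c = b converges to the solution x = [1, 1] of A x = b.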
Linear Systems of Equations: Iterative Methods: Convergence
Convergence analysis, in iteration-matrix form: define the error e^k = x^k - x. Subtracting the fixed-point relation x = B x + c from x^{k+1} = B x^k + c gives
  e^{k+1} = B e^k = B^{k+1} e^0
Sufficient condition for convergence: ||B|| < 1 for some matrix norm, since then ||e^{k+1}|| <= ||B||^{k+1} ||e^0|| -> 0.
||B|| < 1 for a chosen matrix norm; the infinity norm is often used in practice.
- Maximum column sum: ||B||_1 = max_j sum_i |b_ij|
- Maximum row sum: ||B||_inf = max_i sum_j |b_ij|
- The Frobenius norm (also called Euclidean norm), ||B||_F = sqrt( sum_ij b_ij^2 ), which for matrices differs from:
- The l-2 norm (also called spectral norm), ||B||_2 = sqrt( rho(B^T B) )
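The first three norms above, as a sketch in pure Python:

```python
import math

def norm_1(B):
    """Maximum absolute column sum, ||B||_1."""
    return max(sum(abs(row[j]) for row in B) for j in range(len(B[0])))

def norm_inf(B):
    """Maximum absolute row sum, ||B||_inf."""
    return max(sum(abs(v) for v in row) for row in B)

def norm_fro(B):
    """Frobenius norm: square root of the sum of squared entries."""
    return math.sqrt(sum(v * v for row in B for v in row))
```

For B = [[1, -2], [3, 4]]: ||B||_1 = 6, ||B||_inf = 7, ||B||_F = sqrt(30).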
Linear Systems of Equations: Iterative Methods
Convergence: necessary and sufficient condition
Convergence analysis, in iteration-matrix form. Necessary and sufficient condition for convergence: the spectral radius of B is smaller than one,
  rho(B) = max_{i=1..n} |lambda_i| < 1,  where lambda_i = eigenvalues of the n x n matrix B
(proof: use the eigendecomposition of B). Since rho(B) <= ||B|| for any matrix norm, ||B|| < 1 ensures rho(B) < 1; the spectral-radius condition is the sharper one.
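For a 2 x 2 iteration matrix the spectral radius has a closed form, which makes it easy to see that rho(B) < 1 can hold even when a norm exceeds one. A sketch for illustration only:

```python
import math

def spectral_radius_2x2(B):
    """Spectral radius of a 2x2 matrix, from the characteristic
    polynomial lambda^2 - tr(B) lambda + det(B) = 0."""
    tr = B[0][0] + B[1][1]
    det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
    disc = tr * tr - 4.0 * det
    if disc >= 0.0:
        r = math.sqrt(disc)
        return max(abs((tr + r) / 2.0), abs((tr - r) / 2.0))
    return math.sqrt(det)   # complex pair: |lambda| = sqrt(det)
```

E.g. B = [[0.5, 0.6], [0.0, 0.5]] has ||B||_inf = 1.1 > 1 yet rho(B) = 0.5 < 1, so the iteration x^{k+1} = B x^k + c still converges.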
MIT OpenCourseWare
http://ocw.mit.edu
Spring 2015
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms