Math 2331 Linear Algebra


1.4 The Matrix Equation Ax = b

Jiwen He
Department of Mathematics, University of Houston
jiwenhe@math.uh.edu
math.uh.edu/~jiwenhe/math2331

1.4 The Matrix Equation Ax = b: Outline

Matrix-Vector Multiplication
  - Linear Combination of the Columns
Matrix Equation
  - Three Equivalent Ways of Viewing a Linear System
  - Existence of Solution
  - Equivalent Theorem
Another Method for Computing Ax
  - Row-Vector Rule

Matrix-Vector Multiplication: Key Concepts to Master

Linear combinations can be viewed as a matrix-vector multiplication. If A is an m × n matrix, with columns a_1, a_2, ..., a_n, and if x is in R^n, then the product of A and x, denoted by Ax, is the linear combination of the columns of A using the corresponding entries in x as weights, i.e.,

$$Ax = \begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = x_1 a_1 + x_2 a_2 + \cdots + x_n a_n.$$

Matrix-Vector Multiplication: Examples

Example 1

$$\begin{bmatrix} 1 & 4 \\ -3 & 2 \\ 0 & 5 \end{bmatrix} \begin{bmatrix} 7 \\ 6 \end{bmatrix} = 7\begin{bmatrix} 1 \\ -3 \\ 0 \end{bmatrix} + 6\begin{bmatrix} 4 \\ 2 \\ 5 \end{bmatrix} = \begin{bmatrix} 7 + 24 \\ -21 + 12 \\ 0 + 30 \end{bmatrix} = \begin{bmatrix} 31 \\ -9 \\ 30 \end{bmatrix}$$
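The same computation can be checked numerically. The following is only a sketch using NumPy (an assumption; it is not part of the course materials): it forms Ax first as the weighted sum of the columns of A and then with the built-in matrix-vector product, and the two agree.

```python
import numpy as np

A = np.array([[1, 4],
              [-3, 2],
              [0, 5]])
x = np.array([7, 6])

# Ax as a linear combination of the columns: x1*a1 + x2*a2
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
print(combo)   # [ 31  -9  30]
print(A @ x)   # [ 31  -9  30], the same vector
```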

Matrix-Vector Multiplication: Examples

Example
Write down the system of equations corresponding to the augmented matrix below, then express the system of equations in vector form, and finally in the form Ax = b, where b is a 2 × 1 vector.

$$\left[\begin{array}{ccc|c} 2 & 3 & 4 & 9 \\ 3 & 1 & 0 & 2 \end{array}\right]$$

Solution: Corresponding system of equations:
$$\begin{aligned} 2x_1 + 3x_2 + 4x_3 &= 9 \\ 3x_1 + x_2 \phantom{{}+4x_3} &= 2 \end{aligned}$$

Vector equation:
$$x_1 \begin{bmatrix} 2 \\ 3 \end{bmatrix} + x_2 \begin{bmatrix} 3 \\ 1 \end{bmatrix} + x_3 \begin{bmatrix} 4 \\ 0 \end{bmatrix} = \begin{bmatrix} 9 \\ 2 \end{bmatrix}$$

Matrix equation:
$$\begin{bmatrix} 2 & 3 & 4 \\ 3 & 1 & 0 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 9 \\ 2 \end{bmatrix}$$

Matrix Equation

Three Equivalent Ways of Viewing a Linear System
1. as a system of linear equations;
2. as a vector equation x_1 a_1 + x_2 a_2 + ... + x_n a_n = b; or
3. as a matrix equation Ax = b.

Useful Fact
The equation Ax = b has a solution if and only if b is a linear combination of the columns of A.
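The Useful Fact can also be tested numerically. The sketch below is an assumption on my part: it uses NumPy and a hypothetical helper named is_consistent, neither of which appears in the slides, and it reads "b is a linear combination of the columns of A" as "the least-squares solution of Ax = b fits b exactly."

```python
import numpy as np

def is_consistent(A, b, tol=1e-10):
    """True exactly when Ax = b has a solution, i.e. when b lies in
    the span of the columns of A (checked via a least-squares fit)."""
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.allclose(A @ x, b, atol=tol)

# The system from the previous example: b = [9, 2] is a linear
# combination of the columns of A, so the equation is consistent.
A = np.array([[2.0, 3.0, 4.0],
              [3.0, 1.0, 0.0]])
print(is_consistent(A, np.array([9.0, 2.0])))   # True
```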

Matrix Equation: Theorem

Theorem
If A is an m × n matrix, with columns a_1, ..., a_n, and if b is in R^m, then the matrix equation Ax = b has the same solution set as the vector equation
$$x_1 a_1 + x_2 a_2 + \cdots + x_n a_n = b,$$
which, in turn, has the same solution set as the system of linear equations whose augmented matrix is [a_1 a_2 ... a_n b].

Matrix Equation: Example

Example
Let
$$A = \begin{bmatrix} 1 & 4 & 5 \\ -3 & -11 & -14 \\ 2 & 8 & 10 \end{bmatrix} \quad \text{and} \quad b = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}.$$
Is the equation Ax = b consistent for all b?

Solution: Augmented matrix corresponding to Ax = b:
$$\left[\begin{array}{ccc|c} 1 & 4 & 5 & b_1 \\ -3 & -11 & -14 & b_2 \\ 2 & 8 & 10 & b_3 \end{array}\right] \sim \left[\begin{array}{ccc|c} 1 & 4 & 5 & b_1 \\ 0 & 1 & 1 & 3b_1 + b_2 \\ 0 & 0 & 0 & -2b_1 + b_3 \end{array}\right]$$

Ax = b is not consistent for all b, since some choices of b make -2b_1 + b_3 nonzero.
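The same elimination can be reproduced symbolically. This is a sketch using SymPy (an assumption; the slides carry out the reduction by hand), keeping the right-hand side symbolic so that the consistency condition shows up in the last column.

```python
from sympy import Matrix, symbols

b1, b2, b3 = symbols('b1 b2 b3')
M = Matrix([[ 1,   4,   5, b1],
            [-3, -11, -14, b2],
            [ 2,   8,  10, b3]])

M[1, :] = M[1, :] + 3 * M[0, :]   # R2 <- R2 + 3*R1
M[2, :] = M[2, :] - 2 * M[0, :]   # R3 <- R3 - 2*R1
print(M)
# Matrix([[1, 4, 5, b1], [0, 1, 1, 3*b1 + b2], [0, 0, 0, -2*b1 + b3]])
# The last row says Ax = b is consistent exactly when -2*b1 + b3 = 0.
```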

Matrix Equation: Example (cont.)

$$A = \begin{bmatrix} 1 & 4 & 5 \\ -3 & -11 & -14 \\ 2 & 8 & 10 \end{bmatrix} = \begin{bmatrix} a_1 & a_2 & a_3 \end{bmatrix}$$

The equation Ax = b is consistent only when -2b_1 + b_3 = 0 (the equation of a plane in R^3): x_1 a_1 + x_2 a_2 + x_3 a_3 = b if and only if b_3 - 2b_1 = 0. The columns of A span a plane in R^3 through 0.

If, instead, any b in R^3 (not just those lying on a particular line or in a plane) can be expressed as a linear combination of the columns of A, then we say that the columns of A span R^3.
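A quick numerical illustration of the plane condition, again only a sketch assuming NumPy and reusing the least-squares consistency test sketched earlier: a right-hand side on the plane b_3 - 2b_1 = 0 is consistent, while one off the plane is not.

```python
import numpy as np

A = np.array([[ 1.0,   4.0,   5.0],
              [-3.0, -11.0, -14.0],
              [ 2.0,   8.0,  10.0]])

def is_consistent(A, b):
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.allclose(A @ x, b)

print(is_consistent(A, np.array([1.0, 0.0, 2.0])))   # True:  b3 - 2*b1 = 0
print(is_consistent(A, np.array([1.0, 0.0, 0.0])))   # False: b is off the plane
```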

Matrix Equation: Span R^m

Definition
We say that the columns of A = [a_1 a_2 ... a_p] span R^m if every vector b in R^m is a linear combination of a_1, ..., a_p (i.e., Span{a_1, ..., a_p} = R^m).

Theorem (4)
Let A be an m × n matrix. Then the following statements are logically equivalent:
(a) For each b in R^m, the equation Ax = b has a solution.
(b) Each b in R^m is a linear combination of the columns of A.
(c) The columns of A span R^m.
(d) A has a pivot position in every row.
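In computational terms, statement (d) says that the rank of A equals its number of rows m. The sketch below assumes NumPy, and the helper name columns_span_Rm is made up for illustration; it uses that reformulation rather than explicit row reduction.

```python
import numpy as np

def columns_span_Rm(A):
    m = A.shape[0]
    return np.linalg.matrix_rank(A) == m   # rank m <=> a pivot in every row

print(columns_span_Rm(np.array([[1, 0, 2],
                                [0, 1, 3]])))        # True:  columns span R^2
print(columns_span_Rm(np.array([[ 1,   4,   5],
                                [-3, -11, -14],
                                [ 2,   8,  10]])))   # False: columns span only a plane in R^3
```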

Matrix Equation: Proof of Theorem 4

Proof (outline): Statements (a), (b) and (c) are logically equivalent. To complete the proof, we need to show that (a) is true when (d) is true and (a) is false when (d) is false.

Suppose (d) is true, and row-reduce the augmented matrix [A b]:
$$[A \; b] \sim \cdots \sim [U \; d].$$
Each row of U has a pivot position, so there is no pivot in the last column of [U d], and the system is consistent. So (a) is true.

Matrix Equation: Proof of Theorem 4 (cont.)

Now suppose (d) is false. Then the last row of U contains all zeros. Let d be a vector with a 1 as its last entry. Then [U d] represents an inconsistent system. Row operations are reversible:
$$[U \; d] \sim \cdots \sim [A \; b],$$
so [A b] is inconsistent as well. So (a) is false.

Matrix Equation: Example

Example
Let
$$A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix} \quad \text{and} \quad b = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}.$$
Is the equation Ax = b consistent for all possible b?

Solution: A has only two columns and therefore has at most two pivots. Since A does not have a pivot in every row, Ax = b is not consistent for all possible b, according to Theorem 4.

Matrix Equation: Example

Example
Do the columns of
$$A = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \\ 0 & 3 & 9 \end{bmatrix}$$
span R^3?

Solution:
$$\begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \\ 0 & 3 & 9 \end{bmatrix} \sim \begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 0 \\ 0 & 3 & 9 \end{bmatrix} \quad \text{(no pivot in row 2)}$$
Since A does not have a pivot position in every row, by Theorem 4 the columns of A do not span R^3.
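The same conclusion, as a sketch in SymPy (an assumption, not part of the slides): the reduced row echelon form has only two pivot columns, so A cannot have a pivot in every one of its three rows.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [0, 3, 9]])
R, pivot_cols = A.rref()
print(R)            # the reduced form has a zero row
print(pivot_cols)   # (0, 1): only two pivots for three rows
```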

Another Method for Computing Ax: Row-Vector Rule

Read Example 4 on page 37 through Example 5 on page 38 to learn this rule for computing the product Ax: the ith entry of Ax is the sum of the products of the corresponding entries from row i of A and from the vector x.

Theorem
If A is an m × n matrix, u and v are vectors in R^n, and c is a scalar, then:
1. A(u + v) = Au + Av;
2. A(cu) = c(Au).
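Both the row-vector rule and the two properties above are easy to verify on a small example; the sketch below assumes NumPy and reuses the matrix from Example 1.

```python
import numpy as np

A = np.array([[1, 4],
              [-3, 2],
              [0, 5]])
u = np.array([7, 6])
v = np.array([-1, 2])
c = 3

# Row-vector rule: entry i of Au is the dot product of row i of A with u.
row_rule = np.array([A[i, :] @ u for i in range(A.shape[0])])
print(np.array_equal(row_rule, A @ u))              # True
print(np.array_equal(A @ (u + v), A @ u + A @ v))   # True: A(u + v) = Au + Av
print(np.array_equal(A @ (c * u), c * (A @ u)))     # True: A(cu) = c(Au)
```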