Linearly Independent Sets and Linearly Dependent Sets




These notes closely follow the presentation of the material given in David C. Lay's textbook Linear Algebra and its Applications (3rd edition). These notes are intended primarily for in-class presentation and should not be regarded as a substitute for thoroughly reading the textbook itself and working through the exercises therein.

Linearly Independent Sets and Linearly Dependent Sets

Definition An indexed set of vectors {v1, v2, ..., vk} in a vector space V is said to be linearly independent if the vector equation

    c1·v1 + c2·v2 + ... + ck·vk = 0

has only the trivial solution (c1 = c2 = ... = ck = 0). If the set of vectors {v1, v2, ..., vk} is not linearly independent, then it is said to be linearly dependent.

If the set of vectors {v1, v2, ..., vk} is linearly dependent, then there exist scalars c1, c2, ..., ck, at least one of which is not equal to 0, such that c1·v1 + c2·v2 + ... + ck·vk = 0. Such an equation (with not all of the scalar coefficients equal to 0) is called a linear dependence relation.

Example Let f : ℝ → ℝ be the function f(x) = sin(x) and let g : ℝ → ℝ be the function g(x) = cos(x). Then f and g are vectors in the vector space C(ℝ) of continuous real-valued functions on ℝ. Is the set of functions {f, g} linearly independent or linearly dependent? If this set is linearly dependent, then give a linear dependence relation for the set.
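The definition suggests a quick numerical sanity check for sets of functions: if some nontrivial combination c1·f + c2·g were the zero function, it would vanish at every sample point, so a matrix of sampled values would have rank less than 2. A minimal sketch with NumPy (the sample points are an arbitrary choice):

```python
import numpy as np

# Sample f(x) = sin(x) and g(x) = cos(x) at several points.
xs = np.linspace(0.1, 3.0, 50)
M = np.column_stack([np.sin(xs), np.cos(xs)])

# Full column rank (2) means no nontrivial c1*f + c2*g vanishes at all
# sample points -- numerical evidence that {f, g} is linearly independent.
print(np.linalg.matrix_rank(M))  # -> 2
```

This is evidence rather than proof; the algebraic argument (e.g. evaluating at x = 0 and x = π/2) settles it exactly.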

Example Let f : ℝ → ℝ be the function f(x) = 9 sin(2x) and let g : ℝ → ℝ be the function g(x) = 8 sin(x)cos(x). Then f and g are vectors in the vector space C(ℝ). Is the set of functions {f, g} linearly independent or linearly dependent? If this set is linearly dependent, then give a linear dependence relation for the set.
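A hint for this example: the double-angle identity sin(2x) = 2 sin(x)cos(x) suggests that f and g are proportional. A quick numerical check (a sketch, with arbitrary sample points):

```python
import numpy as np

xs = np.linspace(0.1, 3.0, 50)
f = 9 * np.sin(2 * xs)            # f(x) = 9 sin(2x) = 18 sin(x)cos(x)
g = 8 * np.sin(xs) * np.cos(xs)   # g(x) = 8 sin(x)cos(x)

# The sampled columns are proportional, so the rank drops to 1:
# the set {f, g} is linearly dependent.
print(np.linalg.matrix_rank(np.column_stack([f, g])))  # -> 1
```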

Example Let p1, p2, and p3 be the polynomial functions (with domain ℝ) defined by

    p1(t) = 3t² + 5t + 3
    p2(t) = 12t² − 4t + 18
    p3(t) = 6t² + 2t + 8.

These functions are vectors in the vector space P2. Is the set of vectors {p1, p2, p3} linearly independent or linearly dependent? If this set is linearly dependent, then give a linear dependence relation for the set.

Solution We need to consider the vector equation

    c1·p1 + c2·p2 + c3·p3 = z

where z is the zero vector of P2. (In other words, z : ℝ → ℝ is the function defined by z(t) = 0 for all t.) The above equation is equivalent to

    c1·p1(t) + c2·p2(t) + c3·p3(t) = z(t) for all t

or

    c1(3t² + 5t + 3) + c2(12t² − 4t + 18) + c3(6t² + 2t + 8) = 0 for all t.

Rearrangement of the above equation gives

    (3c1 + 12c2 + 6c3)t² + (5c1 − 4c2 + 2c3)t + (3c1 + 18c2 + 8c3) = 0 for all t.

The above equation states that a certain quadratic function is equal to the zero function. This can be true only if all of the coefficients of this quadratic function are equal to zero. Therefore, we must have

    3c1 + 12c2 + 6c3 = 0
    5c1 − 4c2 + 2c3 = 0
    3c1 + 18c2 + 8c3 = 0.

Solving this linear system in the usual way,

    [ 3  12  6 | 0 ]     [ 1  0  2/3 | 0 ]
    [ 5  −4  2 | 0 ]  ~  [ 0  1  1/3 | 0 ]
    [ 3  18  8 | 0 ]     [ 0  0   0  | 0 ],

we see that there are nontrivial solutions. One such solution is c1 = 2, c2 = 1, c3 = −3. Our conclusion is that the set of functions {p1, p2, p3} is linearly dependent and that a linear dependence relation for this set is

    2p1 + p2 − 3p3 = z.
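A dependence relation of this kind can be double-checked by matrix arithmetic: store the (t², t, constant) coefficients of the three polynomials as columns and verify that the candidate weights annihilate every row. A sketch with NumPy:

```python
import numpy as np

# Columns hold the (t^2, t, constant) coefficients of p1, p2, p3.
A = np.array([[3.0, 12.0, 6.0],
              [5.0, -4.0, 2.0],
              [3.0, 18.0, 8.0]])

c = np.array([2.0, 1.0, -3.0])   # candidate relation 2*p1 + p2 - 3*p3
print(A @ c)                      # -> [0. 0. 0.]
print(np.linalg.matrix_rank(A))   # -> 2, so the columns are dependent
```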

Theorem An indexed set {v1, v2, ..., vk} of two or more vectors, with v1 ≠ 0, is linearly dependent if and only if some vj (with j > 1) is a linear combination of the preceding vectors v1, v2, ..., vj−1.

Bases

Definition A basis for a vector space V is a set of vectors in V that is linearly independent and spans V.

Example The set of vectors {e1, e2}, where e1 = (1, 0) and e2 = (0, 1), is a basis for ℝ².

Example Is a given set of vectors {v1, v2, v3} in ℝ³ a basis for ℝ³?

Solution Form the matrix A = [v1 v2 v3] whose columns are the given vectors and row reduce it. The row reduction shows that A is row equivalent to the 3×3 identity matrix, so A has a pivot position in every row and in every column. By the Square Matrix Theorem, we conclude that the set of vectors {v1, v2, v3} is linearly independent and that this set of vectors also spans ℝ³. Thus, this set of vectors is a basis for ℝ³.
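For vectors in ℝⁿ, the Square Matrix Theorem turns the basis question into a mechanical check: put the vectors into the columns of a square matrix A and test whether A row reduces to the identity (equivalently, whether det A ≠ 0). A sketch with NumPy; the entries of v1, v2, v3 below are illustrative choices, not necessarily those of the example:

```python
import numpy as np

v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([2.0, 6.0, 3.0])
v3 = np.array([0.0, 1.0, 1.0])
A = np.column_stack([v1, v2, v3])

# det(A) != 0  <=>  A ~ I  <=>  {v1, v2, v3} is a basis for R^3.
print(np.linalg.det(A))  # nonzero (5 up to rounding), so a basis
```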

The above example suggests a theorem that follows immediately from the Square Matrix Theorem:

Theorem If {v1, v2, ..., vn} is a linearly independent set (consisting of exactly n vectors) in ℝⁿ, then this set of vectors is a basis for ℝⁿ. Also, if {v1, v2, ..., vn} is a set (consisting of exactly n vectors) in ℝⁿ and this set of vectors spans ℝⁿ, then this set of vectors is a basis for ℝⁿ.

What if we have a set of vectors in ℝⁿ that consists of fewer or more than n vectors? Can such a set possibly be a basis for ℝⁿ? The answer is no.

If we have a set {v1, v2, ..., vk} of k vectors in ℝⁿ where k < n, then the matrix A = [v1 v2 ... vk] has more rows than columns. Thus, not every row of A can be a pivot row, and it is not possible for the set {v1, v2, ..., vk} to span ℝⁿ.

If we have a set {v1, v2, ..., vk} of k vectors in ℝⁿ where k > n, then the matrix A = [v1 v2 ... vk] has more columns than rows. Thus, not every column of A can be a pivot column, and it is not possible for the set {v1, v2, ..., vk} to be linearly independent. (Another way of looking at this is that the set {v1, v2, ..., vk} contains more vectors than there are entries in each vector, so the set must be linearly dependent.)

From the above considerations, we conclude:

Theorem Every basis for ℝⁿ contains exactly n vectors.

Note that the above theorem does not say that any set of n vectors in ℝⁿ is a basis for ℝⁿ. It is certainly possible that a set of n vectors in ℝⁿ might be linearly dependent. However, any linearly independent set of n vectors in ℝⁿ must also span ℝⁿ and, conversely, any set of n vectors that spans ℝⁿ must also be linearly independent.
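The counting argument above is just a statement about ranks: k vectors in ℝⁿ form an n×k matrix, whose rank is at most min(n, k). A short illustration (the random vectors are an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)

tall = rng.standard_normal((4, 2))  # 2 vectors in R^4 (k < n)
wide = rng.standard_normal((3, 5))  # 5 vectors in R^3 (k > n)

# rank <= 2 < 4: two vectors can never span R^4.
print(np.linalg.matrix_rank(tall))  # -> 2
# rank <= 3 < 5: five vectors in R^3 are always linearly dependent.
print(np.linalg.matrix_rank(wide))  # -> 3
```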

Example Let p1, p2, and p3 be the polynomial functions (with domain ℝ) defined by

    p1(t) = 3t² + 5t + 3
    p2(t) = 12t² − 4t + 18
    p3(t) = 6t² + 2t.

Is the set of vectors {p1, p2, p3} a basis for P2? If so, write the function p ∈ P2 defined by p(t) = t² − 6t + 7 as a linear combination of the functions in the set {p1, p2, p3}.

Solution In order to solve the vector equation

    c1·p1 + c2·p2 + c3·p3 = p,

we need to consider the augmented matrix

    [ 3  12  6 |  1 ]
    [ 5  −4  2 | −6 ]
    [ 3  18  0 |  7 ],

which is equivalent to

    [ 1  0  0 | −29/48 ]
    [ 0  1  0 |  47/96 ]
    [ 0  0  1 | −49/96 ].

Since there is a pivot in each of the first three columns, we observe that the set of vectors {p1, p2, p3} is a basis for P2 and that

    p = −(29/48)p1 + (47/96)p2 − (49/96)p3.
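Finding the weights in an example like this amounts to solving a 3×3 linear system, which can be checked numerically. A sketch where the columns of A hold the (t², t, constant) coefficients of the three polynomials and b holds those of the target polynomial:

```python
import numpy as np

A = np.array([[3.0, 12.0, 6.0],
              [5.0, -4.0, 2.0],
              [3.0, 18.0, 0.0]])
b = np.array([1.0, -6.0, 7.0])

c = np.linalg.solve(A, b)  # coordinates of p relative to {p1, p2, p3}
print(c)                   # approximately [-29/48, 47/96, -49/96]
```

Because A is invertible (a pivot in every column), the solve succeeds and the coordinates are unique, which is exactly what makes the set a basis.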

Bases for the Nullspace and the Column Space of a Matrix

Example Find bases for the nullspace and the column space of a matrix A, using the given fact that A is row equivalent to a stated reduced echelon form. (Writing the general solution of Ax = 0 in terms of the free variables produces a basis for Nul A; the columns of A corresponding to the pivot columns of the echelon form produce a basis for Col A.)
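This recipe can be automated with SymPy: `rref` reports the pivot columns (take those columns of A itself for Col A), and `nullspace` returns the basis obtained by solving Ax = 0 for the free variables. A sketch with an illustrative matrix (not necessarily the one from the example):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 0, 4],
               [2, 4, 1, 3],
               [3, 6, 1, 7]])

R, pivots = A.rref()
print(pivots)                           # -> (0, 2): indices of the pivot columns

col_basis = [A[:, j] for j in pivots]   # pivot columns of A itself: basis of Col A
null_basis = A.nullspace()              # basis of Nul A, one vector per free variable
print(len(col_basis), len(null_basis))  # -> 2 2
```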

Theorem The pivot columns of a matrix A form a basis for the column space of A.

Note: It is the pivot columns of A itself, not the pivot columns of rref(A), that form a basis for the column space of A!