# 4.5 Linear Dependence and Linear Independence


As indicated in the previous section, in analyzing a vector space we will be interested in determining a spanning set. The reader has perhaps already noticed that a vector space V can have many such spanning sets.

Example 4.5.1: Observe that {(1, 0), (0, 1)}, {(1, 0), (1, 1)}, and {(1, 0), (0, 1), (1, 2)} are all spanning sets for R^2. As another illustration, two different spanning sets for V = M2(R) were given in an example in the previous section and the remark that followed it.

Given the abundance of spanning sets available for a given vector space V, we are faced with a natural question: Is there a best class of spanning sets to use? The answer, to a large degree, is yes. For instance, in Example 4.5.1, the spanning set {(1, 0), (0, 1), (1, 2)} contains an extra vector, (1, 2), which seems to be unnecessary for spanning R^2, since {(1, 0), (0, 1)} is already a spanning set. In some sense, {(1, 0), (0, 1)} is a more efficient spanning set. It is what we call a minimal spanning set, since it contains the minimum number of vectors needed to span the vector space (see Footnote 3). But how will we know if we have found a minimal spanning set (assuming one exists)? Returning to the example above, we have seen that

span{(1, 0), (0, 1)} = span{(1, 0), (0, 1), (1, 2)} = R^2.

Observe that the vector (1, 2) is already a linear combination of (1, 0) and (0, 1), and therefore it does not add any new vectors to the linear span of {(1, 0), (0, 1)}.

As a second example, consider the vectors v1 = (1, 1, 1), v2 = (3, -2, 1), and v3 = 4v1 + v2 = (7, 2, 5). It is easily verified that det([v1, v2, v3]) = 0.
Consequently, the three vectors lie in a plane (see Figure 4.5.1) and therefore, since they are not collinear, the linear span of these three vectors is the whole of this plane. Furthermore, the same plane is generated if we consider the linear span of v1 and v2 alone. As in the previous example, the reason that v3 does not add any new vectors to the linear span of {v1, v2} is that it is already a linear combination of v1 and v2. It is not possible, however, to generate all vectors in the plane by taking linear combinations of just one vector, as we could generate only a line lying in the plane in that case. Consequently, {v1, v2} is a minimal spanning set for the subspace of R^3 consisting of all points lying on the plane.

As a final example, recall from an earlier example that the solution space to the differential equation y'' + y = 0

Footnote 3: Since a single (nonzero) vector in R^2 spans only the line through the origin along which it points, it cannot span all of R^2; hence, the minimum number of vectors required to span R^2 is 2.
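The determinant and rank computations behind this example are easy to verify numerically. A minimal sketch with NumPy (the vectors are the ones from the example; the variable names are my own):

```python
import numpy as np

# Vectors from the example: v3 = 4*v1 + v2, so the three vectors are coplanar.
v1 = np.array([1.0, 1.0, 1.0])
v2 = np.array([3.0, -2.0, 1.0])
v3 = 4 * v1 + v2                      # (7, 2, 5)

A = np.column_stack([v1, v2, v3])

# det(A) = 0 signals linear dependence of the three columns...
det_A = np.linalg.det(A)

# ...while rank 2 confirms that v1 and v2 alone span the same plane.
rank_all = np.linalg.matrix_rank(A)
rank_two = np.linalg.matrix_rank(np.column_stack([v1, v2]))

print(det_A, rank_all, rank_two)
```

The rank comparison is exactly the statement span{v1, v2, v3} = span{v1, v2}: both column sets span a two-dimensional subspace.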

268 CHAPTER 4 Vector Spaces

Figure 4.5.1: v3 = 4v1 + v2 lies in the plane through the origin containing v1 and v2, and so, span{v1, v2, v3} = span{v1, v2}.

can be written as span{y1, y2}, where y1(x) = cos x and y2(x) = sin x. However, if we let y3(x) = 3 cos x - 2 sin x, for instance, then {y1, y2, y3} is also a spanning set for the solution space of the differential equation, since

span{y1, y2, y3} = {c1 cos x + c2 sin x + c3(3 cos x - 2 sin x) : c1, c2, c3 in R}
                 = {(c1 + 3c3) cos x + (c2 - 2c3) sin x : c1, c2, c3 in R}
                 = {d1 cos x + d2 sin x : d1, d2 in R}
                 = span{y1, y2}.

The reason that {y1, y2, y3} is not a minimal spanning set for the solution space is that y3 is a linear combination of y1 and y2, and therefore, as we have just shown, it does not add any new vectors to the linear span of {cos x, sin x}.

More generally, it is not too difficult to extend the argument used in the preceding examples to establish the following general result.

Theorem 4.5.2: Let {v1, v2, ..., vk} be a set of at least two vectors in a vector space V. If one of the vectors in the set is a linear combination of the other vectors in the set, then that vector can be deleted from the given set of vectors and the linear span of the resulting set of vectors will be the same as the linear span of {v1, v2, ..., vk}.

Proof: The proof of this result is left for the exercises (Problem 48).

For instance, if v1 is a linear combination of v2, v3, ..., vk, then Theorem 4.5.2 says that

span{v1, v2, ..., vk} = span{v2, v3, ..., vk}.

In this case, the set {v1, v2, ..., vk} is not a minimal spanning set. To determine a minimal spanning set, the problem we face in view of Theorem 4.5.2 is that of determining when a vector in {v1, v2, ..., vk} can be expressed as a linear combination of the remaining vectors in the set.
The correct formulation for solving this problem requires the concepts of linear dependence and linear independence, which we are now ready to introduce. First we consider linear dependence.

DEFINITION 4.5.3: A finite nonempty set of vectors {v1, v2, ..., vk} in a vector space V is said to be linearly dependent if there exist scalars c1, c2, ..., ck, not all zero, such that

c1 v1 + c2 v2 + ... + ck vk = 0.

Such a nontrivial linear combination of vectors is sometimes referred to as a linear dependency among the vectors v1, v2, ..., vk.

A set of vectors that is not linearly dependent is called linearly independent. This can be stated mathematically as follows:

DEFINITION 4.5.4: A finite, nonempty set of vectors {v1, v2, ..., vk} in a vector space V is said to be linearly independent if the only values of the scalars c1, c2, ..., ck for which

c1 v1 + c2 v2 + ... + ck vk = 0

are c1 = c2 = ... = ck = 0.

Remarks
1. It follows immediately from the preceding two definitions that a nonempty set of vectors in a vector space V is linearly independent if and only if it is not linearly dependent.
2. If {v1, v2, ..., vk} is a linearly independent set of vectors, we sometimes informally say that the vectors v1, v2, ..., vk are themselves linearly independent. The same remark applies to the linearly dependent condition as well.

Consider the simple case of a set containing a single vector v. If v = 0, then {v} is linearly dependent, since for any nonzero scalar c1, c1 * 0 = 0. On the other hand, if v != 0, then the only value of the scalar c1 for which c1 v = 0 is c1 = 0. Consequently, {v} is linearly independent. We can therefore state the next theorem.

Theorem 4.5.5: A set consisting of a single vector v in a vector space V is linearly dependent if and only if v = 0. Therefore, any set consisting of a single nonzero vector is linearly independent.

We next establish that linear dependence of a set containing at least two vectors is equivalent to the property that we are interested in, namely, that at least one vector in the set can be expressed as a linear combination of the remaining vectors in the set.

Theorem 4.5.6: Let {v1, v2, ..., vk} be a set of at least two vectors in a vector space V. Then {v1, v2, ..., vk} is linearly dependent if and only if at least one of the vectors in the set can be expressed as a linear combination of the others.

Proof: If {v1, v2, ..., vk} is linearly dependent, then according to Definition 4.5.3, there exist scalars c1, c2, ..., ck, not all zero, such that

c1 v1 + c2 v2 + ... + ck vk = 0.

Suppose that ci != 0. Then we can express vi as a linear combination of the other vectors as follows:

vi = -(1/ci)(c1 v1 + c2 v2 + ... + c_{i-1} v_{i-1} + c_{i+1} v_{i+1} + ... + ck vk).

Conversely, suppose that one of the vectors, say vj, can be expressed as a linear combination of the remaining vectors. That is,

vj = c1 v1 + c2 v2 + ... + c_{j-1} v_{j-1} + c_{j+1} v_{j+1} + ... + ck vk.

Adding (-1)vj to both sides of this equation yields

c1 v1 + c2 v2 + ... + c_{j-1} v_{j-1} - vj + c_{j+1} v_{j+1} + ... + ck vk = 0.

Since the coefficient of vj is -1 != 0, the set of vectors {v1, v2, ..., vk} is linearly dependent.

Figure 4.5.2: The set of vectors {v1, v2, v3} is linearly dependent in R^2, since v3 is a linear combination of v1 and v2.

As far as the minimal-spanning-set idea is concerned, Theorems 4.5.2 and 4.5.6 tell us that a linearly dependent spanning set for a (nontrivial) vector space V cannot be a minimal spanning set. On the other hand, we will see in the next section that a linearly independent spanning set for V must be a minimal spanning set for V. For the remainder of this section, however, we focus more on the mechanics of determining whether a given set of vectors is linearly independent or linearly dependent. Sometimes this can be done by inspection. For example, Figure 4.5.2 illustrates that any set of three vectors in R^2 is linearly dependent.
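The claim that any three vectors in R^2 are linearly dependent can be demonstrated computationally: the 2 x 3 system A c = 0 has more unknowns than equations, so a nontrivial solution always exists. A sketch using the spanning set from Example 4.5.1 (extracting a null vector via the SVD is my own choice of method):

```python
import numpy as np

# Columns of A are three vectors in R^2: (1,0), (0,1), (1,2).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 2.0]])

# More columns than rows => A c = 0 has a nontrivial solution.
# The right-singular vector for the smallest singular value spans the null space.
_, s, Vt = np.linalg.svd(A)
c = Vt[-1]                       # nontrivial (c1, c2, c3), proportional to (-1, -2, 1)

print(np.allclose(A @ c, 0))     # c is a linear dependency among the columns
```

Here c recovers the obvious relation (1, 2) = 1*(1, 0) + 2*(0, 1), up to a scalar multiple.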
As another example, let V be the vector space of all functions defined on an interval I. If

f1(x) = 1,  f2(x) = 2 sin^2 x,  f3(x) = 5 cos^2 x,

then {f1, f2, f3} is linearly dependent in V, since the identity sin^2 x + cos^2 x = 1 implies that for all x in I,

f1(x) = (1/2) f2(x) + (1/5) f3(x).

We can therefore conclude from Theorem 4.5.2 that

span{1, 2 sin^2 x, 5 cos^2 x} = span{2 sin^2 x, 5 cos^2 x}.

In relatively simple examples, the following general results can be applied. They are a direct consequence of the definition of linearly dependent vectors and are left for the exercises (Problem 49).

Proposition 4.5.7: Let V be a vector space.
1. Any set of two vectors in V is linearly dependent if and only if the vectors are proportional.

2. Any set of vectors in V containing the zero vector is linearly dependent.

Remark: We emphasize that the first result in Proposition 4.5.7 holds only for the case of two vectors. It cannot be applied to sets containing more than two vectors.

Example: If v1 = (1, 2, 9) and v2 = (-2, -4, -18), then {v1, v2} is linearly dependent in R^3, since v2 = -2v1. Geometrically, v1 and v2 lie on the same line.

Example: If

A1 = [2 1]    A2 = [0 0]    A3 = [2 5]
     [3 4],        [0 0],        [3 2],

then {A1, A2, A3} is linearly dependent in M2(R), since it contains the zero vector of M2(R).

For more complicated situations, we must resort to Definitions 4.5.3 and 4.5.4, although conceptually it is always helpful to keep in mind that the essence of the problem we are solving is to determine whether a vector in a given set can be expressed as a linear combination of the remaining vectors in the set. We now give some examples to illustrate the use of Definitions 4.5.3 and 4.5.4.

Example: If v1 = (1, 2, -1), v2 = (2, -1, 1), and v3 = (8, 1, 1), show that {v1, v2, v3} is linearly dependent in R^3, and determine the linear dependency relationship.

Solution: We must first establish that there are values of the scalars c1, c2, c3, not all zero, such that

c1 v1 + c2 v2 + c3 v3 = 0.    (4.5.1)

Substituting for the given vectors yields

c1 (1, 2, -1) + c2 (2, -1, 1) + c3 (8, 1, 1) = (0, 0, 0).

That is,

(c1 + 2c2 + 8c3, 2c1 - c2 + c3, -c1 + c2 + c3) = (0, 0, 0).

Equating corresponding components on either side of this equation yields

c1 + 2c2 + 8c3 = 0,
2c1 - c2 + c3 = 0,
-c1 + c2 + c3 = 0.

The reduced row-echelon form of the augmented matrix of this system is

[1 0 2 0]
[0 1 3 0]
[0 0 0 0]

Consequently, the system has an infinite number of solutions for c1, c2, c3, so the vectors are linearly dependent. In order to determine a specific linear dependency relationship, we proceed to find c1, c2, and c3. Setting c3 = t, we have c2 = -3t and c1 = -2t. Taking t = 1 and

substituting these values for c1, c2, c3 into (4.5.1), we obtain the linear dependency relationship

-2v1 - 3v2 + v3 = 0,

or equivalently,

v1 = -(3/2) v2 + (1/2) v3,

which can be easily verified using the given expressions for v1, v2, and v3. It follows from Theorem 4.5.2 that

span{v1, v2, v3} = span{v2, v3}.

Geometrically, we can conclude that v1 lies in the plane determined by the vectors v2 and v3.

Example: Determine whether the following matrices are linearly dependent or linearly independent in M2(R):

A1 = [1 -1]    A2 = [2 1]    A3 = [1 -1]
     [2  0],        [0 3],        [2  1].

Solution: The condition for determining whether these vectors are linearly dependent or linearly independent,

c1 A1 + c2 A2 + c3 A3 = 0_2,

is equivalent in this case to the component equations

c1 + 2c2 + c3 = 0,
-c1 + c2 - c3 = 0,
2c1 + 2c3 = 0,
3c2 + c3 = 0.

The reduced row-echelon form of the augmented matrix of this homogeneous system is

[1 0 0 0]
[0 1 0 0]
[0 0 1 0]
[0 0 0 0]

which implies that the system has only the trivial solution c1 = c2 = c3 = 0. It follows from Definition 4.5.4 that {A1, A2, A3} is linearly independent.

As a corollary to Theorem 4.5.2, we establish the following result.

Corollary: Any nontrivial, finite set of linearly dependent vectors in a vector space V contains a linearly independent subset that has the same linear span as the given set of vectors.
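Both of these worked examples reduce to a homogeneous system, so they can be double-checked mechanically. A sketch with NumPy, using the vectors and (reconstructed) matrices from the examples above:

```python
import numpy as np

# First example: columns are v1, v2, v3; the dependency found by hand is
# -2*v1 - 3*v2 + v3 = 0.
V = np.array([[1, 2, 8],
              [2, -1, 1],
              [-1, 1, 1]], dtype=float)
c = np.array([-2.0, -3.0, 1.0])
print(np.allclose(V @ c, 0))             # the hand-computed dependency checks out

# Second example: flatten each 2x2 matrix into a column of a 4x3 system.
A1 = np.array([[1, -1], [2, 0]], dtype=float)
A2 = np.array([[2, 1], [0, 3]], dtype=float)
A3 = np.array([[1, -1], [2, 1]], dtype=float)
M = np.column_stack([A1.ravel(), A2.ravel(), A3.ravel()])

# Rank 3 means c1 = c2 = c3 = 0 is the only solution: independent.
print(np.linalg.matrix_rank(M))
```

Flattening the matrices is legitimate because the linear combination c1 A1 + c2 A2 + c3 A3 = 0 holds entrywise, which is exactly the 4 x 3 system M c = 0.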

Proof: Since the given set is linearly dependent, at least one of the vectors in the set is a linear combination of the remaining vectors, by the preceding theorem. Thus, by Theorem 4.5.2, we can delete that vector from the set, and the resulting set of vectors will span the same subspace of V as the original set. If the resulting set is linearly independent, then we are done. If not, then we can repeat the procedure to eliminate another vector in the set. Continuing in this manner (with a finite number of iterations), we will obtain a linearly independent set that spans the same subspace of V as the subspace spanned by the original set of vectors.

Remark: The corollary above is actually true even if the set of vectors in question is infinite, but we shall not need to consider that case in this text. In the case of an infinite set of vectors, other techniques are required for the proof.

Note that the linearly independent set obtained using the procedure in the corollary is not unique, and therefore the question arises whether the number of vectors in any resulting linearly independent set is independent of the manner in which the procedure is applied. We will give an affirmative answer to this question in Section 4.6.

Example: Let v1 = (1, 2, 3), v2 = (-1, 1, 4), v3 = (3, 3, 2), and v4 = (-2, -4, -6). Determine a linearly independent set of vectors that spans the same subspace of R^3 as span{v1, v2, v3, v4}.

Solution: Setting

c1 v1 + c2 v2 + c3 v3 + c4 v4 = 0

requires that

c1 (1, 2, 3) + c2 (-1, 1, 4) + c3 (3, 3, 2) + c4 (-2, -4, -6) = (0, 0, 0),

leading to the linear system

c1 - c2 + 3c3 - 2c4 = 0,
2c1 + c2 + 3c3 - 4c4 = 0,
3c1 + 4c2 + 2c3 - 6c4 = 0.

The augmented matrix of this system is

[1 -1 3 -2 0]
[2  1 3 -4 0]
[3  4 2 -6 0]

and the reduced row-echelon form of this matrix is

[1 0  2 -2 0]
[0 1 -1  0 0]
[0 0  0  0 0]

The system has two free variables, c3 = s and c4 = t, and so {v1, v2, v3, v4} is linearly dependent. Then c2 = s and c1 = 2t - 2s.
So the general form of the solution is

(2t - 2s, s, s, t) = s(-2, 1, 1, 0) + t(2, 0, 0, 1).

Setting s = 1 and t = 0 yields the linear combination

-2v1 + v2 + v3 = 0,    (4.5.2)

and setting s = 0 and t = 1 yields the linear combination

2v1 + v4 = 0.    (4.5.3)

We can solve (4.5.2) for v3 in terms of v1 and v2, and we can solve (4.5.3) for v4 in terms of v1. Hence, according to Theorem 4.5.2, we have

span{v1, v2, v3, v4} = span{v1, v2}.

By Proposition 4.5.7, v1 and v2 are linearly independent, so {v1, v2} is the linearly independent set we are seeking. Geometrically, the subspace of R^3 spanned by v1 and v2 is a plane, and the vectors v3 and v4 lie in this plane.

Linear Dependence and Linear Independence in R^n

Let {v1, v2, ..., vk} be a set of vectors in R^n, and let A denote the matrix that has v1, v2, ..., vk as column vectors. Thus,

A = [v1, v2, ..., vk].    (4.5.4)

Since each of the given vectors is in R^n, it follows that A has n rows and is therefore an n x k matrix. The linear combination c1 v1 + c2 v2 + ... + ck vk = 0 can be written in matrix form as (see Theorem 2.2.9)

Ac = 0,    (4.5.5)

where A is given in Equation (4.5.4) and c = [c1 c2 ... ck]^T. Consequently, we can state the following theorem and corollary:

Theorem: Let v1, v2, ..., vk be vectors in R^n and A = [v1, v2, ..., vk]. Then {v1, v2, ..., vk} is linearly dependent if and only if the linear system Ac = 0 has a nontrivial solution.

Corollary: Let v1, v2, ..., vk be vectors in R^n and A = [v1, v2, ..., vk].
1. If k > n, then {v1, v2, ..., vk} is linearly dependent.
2. If k = n, then {v1, v2, ..., vk} is linearly dependent if and only if det(A) = 0.

Proof: If k > n, the system (4.5.5) has an infinite number of solutions (see the corresponding corollary on homogeneous systems in Chapter 2); hence the vectors are linearly dependent by the preceding theorem. On the other hand, if k = n, the system (4.5.5) is n x n, and hence, from Corollary 3.2.5, it has an infinite number of solutions if and only if det(A) = 0.
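The corollary translates directly into a small decision procedure. A minimal sketch (the helper name and the rank-based fallback for k < n are my own; the test vectors come from the earlier examples):

```python
import numpy as np

def dependent_in_Rn(vectors):
    """Decide linear dependence of vectors in R^n, per the corollary above."""
    A = np.column_stack([np.asarray(v, dtype=float) for v in vectors])
    n, k = A.shape
    if k > n:                                  # more vectors than dimensions
        return True
    if k == n:                                 # square case: determinant test
        return bool(np.isclose(np.linalg.det(A), 0.0))
    return np.linalg.matrix_rank(A) < k        # k < n: rank test

# Three vectors in R^2 are automatically dependent (k > n).
print(dependent_in_Rn([(1, 0), (0, 1), (1, 2)]))
# v3 = 4*v1 + v2 from earlier: square case, det = 0.
print(dependent_in_Rn([(1, 1, 1), (3, -2, 1), (7, 2, 5)]))
```

Note that floating-point determinants of nearly dependent vectors can be tiny but nonzero, which is why the sketch uses `np.isclose` rather than an exact comparison.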
Example: Determine whether the given vectors are linearly dependent or linearly independent in R^4.

1. v1 = (1, 3, 1, 0), v2 = (2, 9, 1, 3), v3 = (4, 5, 6, 11), v4 = (1, 1, 2, 5), v5 = (3, 2, 6, 7).
2. v1 = (1, 4, 1, 7), v2 = (3, 5, 2, 3), v3 = (2, 1, 6, 9), v4 = (2, 3, 1, 6).

Solution:
1. Since we have five vectors in R^4, part 1 of the corollary above implies that {v1, v2, v3, v4, v5} is necessarily linearly dependent.
2. In this case, we have four vectors in R^4, and therefore we can use the determinant:

det(A) = det[v1, v2, v3, v4] = 462.

Since the determinant is nonzero, it follows from part 2 of the corollary above that the given set of vectors is linearly independent.

Linear Independence of Functions

We now consider the general problem of determining whether a given set of functions is linearly dependent or linearly independent. We begin by specializing the general Definition 4.5.4 to the case of a set of functions defined on an interval I.

DEFINITION: The set of functions {f1, f2, ..., fk} is linearly independent on an interval I if and only if the only values of the scalars c1, c2, ..., ck such that

c1 f1(x) + c2 f2(x) + ... + ck fk(x) = 0,  for all x in I,    (4.5.6)

are c1 = c2 = ... = ck = 0.

The main point to notice is that the condition (4.5.6) must hold for all x in I. A key tool in deciding whether or not a collection of functions is linearly independent on an interval I is the Wronskian. As we will see in Chapter 6, we can draw particularly sharp conclusions from the Wronskian about the linear dependence or independence of a family of solutions to a linear homogeneous differential equation.

DEFINITION: Let f1, f2, ..., fk be functions in C^(k-1)(I). The Wronskian of these functions is the order-k determinant defined by

W[f1, f2, ..., fk](x) = | f1(x)          f2(x)          ...  fk(x)          |
                        | f1'(x)         f2'(x)         ...  fk'(x)         |
                        | ...            ...            ...  ...            |
                        | f1^(k-1)(x)    f2^(k-1)(x)    ...  fk^(k-1)(x)    |

Remark: Notice that the Wronskian is a function defined on I. Also note that this function depends on the order of the functions in the Wronskian. For example, using properties of determinants,

W[f2, f1, ..., fk](x) = -W[f1, f2, ..., fk](x).

Example: If f1(x) = sin x and f2(x) = cos x on (-∞, ∞), then

W[f1, f2](x) = | sin x    cos x |
               | cos x   -sin x |
             = (sin x)(-sin x) - (cos x)(cos x) = -(sin^2 x + cos^2 x) = -1.

Example: If f1(x) = x, f2(x) = x^2, and f3(x) = x^3 on (-∞, ∞), then

W[f1, f2, f3](x) = | x   x^2   x^3   |
                   | 1   2x    3x^2  |
                   | 0   2     6x    |
                 = x(12x^2 - 6x^2) - (6x^3 - 2x^3) = 2x^3.

We can now state and prove the main result about the Wronskian.

Theorem: Let f1, f2, ..., fk be functions in C^(k-1)(I). If W[f1, f2, ..., fk] is nonzero at some point x0 in I, then {f1, f2, ..., fk} is linearly independent on I.

Proof: To apply the definition above, assume that

c1 f1(x) + c2 f2(x) + ... + ck fk(x) = 0, for all x in I.

Then, differentiating k - 1 times yields the linear system

c1 f1(x) + c2 f2(x) + ... + ck fk(x) = 0,
c1 f1'(x) + c2 f2'(x) + ... + ck fk'(x) = 0,
...
c1 f1^(k-1)(x) + c2 f2^(k-1)(x) + ... + ck fk^(k-1)(x) = 0,

where the unknowns in the system are c1, c2, ..., ck. We wish to show that c1 = c2 = ... = ck = 0. The determinant of the matrix of coefficients of this system is just W[f1, f2, ..., fk](x). Consequently, if W[f1, f2, ..., fk](x0) != 0 for some x0 in I, then the determinant of the matrix of coefficients of the system is nonzero at that point, and therefore the only solution to the system is the trivial solution c1 = c2 = ... = ck = 0. That is, the given set of functions is linearly independent on I.

Remarks
1. Notice that it is only necessary for W[f1, f2, ..., fk](x) to be nonzero at one point in I for {f1, f2, ..., fk} to be linearly independent on I.
2. The theorem does not say that if W[f1, f2, ..., fk](x) = 0 for every x in I, then {f1, f2, ..., fk} is linearly dependent on I. As we will see in the next example, the Wronskian of a linearly independent set of functions on an interval I can be identically zero on I. Instead, the logical equivalent of the preceding theorem is: If {f1, f2, ..., fk} is linearly dependent on I, then W[f1, f2, ..., fk](x) = 0 at every point of I.
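These Wronskian computations are easy to reproduce symbolically. A sketch using SymPy's built-in `wronskian` helper:

```python
from sympy import symbols, sin, cos, simplify, wronskian

x = symbols('x')

# W[sin x, cos x] = -(sin^2 x + cos^2 x) = -1
w1 = simplify(wronskian([sin(x), cos(x)], x))
print(w1)

# W[x, x^2, x^3] = 2x^3
w2 = simplify(wronskian([x, x**2, x**3], x))
print(w2)
```

The helper simply builds the k x k matrix of derivatives shown in the definition above and takes its determinant.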

If W[f1, f2, ..., fk](x) = 0 for all x in I, the theorem gives no information as to the linear dependence or independence of {f1, f2, ..., fk} on I.

Example: Determine whether the following functions are linearly dependent or linearly independent on I = (-∞, ∞).

(a) f1(x) = e^x, f2(x) = x^2 e^x.
(b) f1(x) = x, f2(x) = x + x^2, f3(x) = 2x - x^2.
(c) f1(x) = x^2, f2(x) = { 2x^2, if x >= 0; -x^2, if x < 0 }.

Solution:
(a)

W[f1, f2](x) = | e^x    x^2 e^x         |
               | e^x    e^x (x^2 + 2x)  |
             = e^(2x)(x^2 + 2x) - x^2 e^(2x) = 2x e^(2x).

Since W[f1, f2](x) != 0 (except at x = 0), the functions are linearly independent on (-∞, ∞).

(b)

W[f1, f2, f3](x) = | x   x + x^2   2x - x^2 |
                   | 1   1 + 2x    2 - 2x   |
                   | 0   2         -2       |
                 = x[(-2)(1 + 2x) - 2(2 - 2x)] - [(-2)(x + x^2) - 2(2x - x^2)] = 0.

Thus, no conclusion can be drawn from the theorem above. However, a closer inspection of the functions reveals, for example, that f2 = 3f1 - f3. Consequently, the functions are linearly dependent on (-∞, ∞).

(c) If x >= 0, then

W[f1, f2](x) = | x^2   2x^2 |
               | 2x    4x   |  = 0,

whereas if x < 0, then

W[f1, f2](x) = | x^2   -x^2 |
               | 2x    -2x  |  = 0.

Thus, W[f1, f2](x) = 0 for all x in (-∞, ∞), so no conclusion can be drawn from the theorem above. Again we take a closer look at the given functions. They are sketched in Figure 4.5.3. In this case, we see that on the interval (-∞, 0), the functions are linearly dependent, since f1 + f2 = 0.

Figure 4.5.3: Two functions that are linearly independent on (-∞, ∞), but whose Wronskian is identically zero on that interval: f1(x) = x^2, and f2(x) = 2x^2 for x >= 0, f2(x) = -x^2 for x < 0, so that f2 = 2f1 on [0, ∞) and f2 = -f1 on (-∞, 0).

They are also linearly dependent on [0, ∞), since on this interval we have 2f1 - f2 = 0. The key point is to realize that there is no set of nonzero constants c1, c2 for which c1 f1 + c2 f2 = 0 holds for all x in (-∞, ∞). Hence, the given functions are linearly independent on (-∞, ∞). This illustrates our second remark following the theorem above, and it emphasizes the importance of the role played by the interval I when discussing linear dependence and linear independence of functions. A collection of functions may be linearly independent on an interval I1, but linearly dependent on another interval I2.

It might appear at this stage that the usefulness of the Wronskian is questionable, since if W[f1, f2, ..., fk] vanishes on an interval I, then no conclusion can be drawn as to the linear dependence or linear independence of the functions f1, f2, ..., fk on I. However, the real power of the Wronskian is in its application to solutions of linear differential equations of the form

y^(n) + a1(x) y^(n-1) + ... + a_{n-1}(x) y' + a_n(x) y = 0.    (4.5.7)

In Chapter 6, we will establish that if we have n functions that are solutions of an equation of the form (4.5.7) on an interval I, then if the Wronskian of these functions is identically zero on I, the functions are indeed linearly dependent on I. Thus, the Wronskian does completely characterize the linear dependence or linear independence of solutions of such equations. This is a fundamental result in the theory of linear differential equations.

Exercises for 4.5

Key Terms
Linearly dependent set, Linear dependency, Linearly independent set, Minimal spanning set, Wronskian of a set of functions.
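The "zero Wronskian yet independent" behavior in part (c) can be made concrete: sampling the identity c1 f1(x) + c2 f2(x) = 0 at one point on each side of the origin already forces c1 = c2 = 0. A minimal sketch:

```python
import numpy as np

def f1(x):
    return x**2

def f2(x):
    # Piecewise function from part (c): 2x^2 for x >= 0, -x^2 for x < 0.
    return 2 * x**2 if x >= 0 else -x**2

# Requiring c1*f1(x) + c2*f2(x) = 0 at x = 1 and x = -1 gives a 2x2 system.
samples = [1.0, -1.0]
M = np.array([[f1(x), f2(x)] for x in samples])   # [[1, 2], [1, -1]]

# Full rank => only the trivial solution => independent on all of R,
# even though the Wronskian vanishes identically.
print(np.linalg.matrix_rank(M))
```

Restricting the samples to one side of the origin drops the rank to 1, mirroring the dependence of the pair on (-∞, 0) and on [0, ∞) separately.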
Skills
- Be able to determine whether a given finite set of vectors is linearly dependent or linearly independent. For sets of one or two vectors, you should be able to do this at a glance.
- If the set is linearly dependent, be able to determine a linear dependency relationship among the vectors.
- Be able to take a linearly dependent set of vectors and remove vectors until it becomes a linearly independent set of vectors with the same span as the original set.

- Be able to produce a linearly independent set of vectors that spans a given subspace of a vector space V.
- Be able to conclude immediately that a set of k vectors in R^n is linearly dependent if k > n, and know what can be said in the case where k = n as well.
- Know what information the Wronskian does (and does not) give about the linear dependence or linear independence of a set of functions on an interval I.

True-False Review

For Questions 1-9, decide if the given statement is true or false, and give a brief justification for your answer. If true, you can quote a relevant definition or theorem from the text. If false, provide an example, illustration, or brief explanation of why the statement is false.

1. Every vector space V possesses a unique minimal spanning set.
2. The set of column vectors of a 5 x 7 matrix A must be linearly dependent.
3. The set of column vectors of a 7 x 5 matrix A must be linearly independent.
4. Any nonempty subset of a linearly independent set of vectors is linearly independent.
5. If the Wronskian of a set of functions is nonzero at some point x0 in an interval I, then the set of functions is linearly independent.
6. If it is possible to express one of the vectors in a set S as a linear combination of the others, then S is a linearly dependent set.
7. If a set of vectors S in a vector space V contains a linearly dependent subset, then S is itself a linearly dependent set.
8. A set of three vectors in a vector space V is linearly dependent if and only if all three vectors are proportional to one another.
9. If the Wronskian of a set of functions is identically zero at every point of an interval I, then the set of functions is linearly dependent.

Problems

For Problems 1-9, determine whether the given set of vectors is linearly independent or linearly dependent in R^n. In the case of linear dependence, find a dependency relationship.

1. {(1, 1), (1, 1)}.
2. {(2, 1), (3, 2), (0, 1)}.
3.
{(1, 1, 0), (0, 1, 1), (1, 1, 1)}.
4. {(1, 2, 3), (1, 1, 2), (1, 4, 1)}.
5. {( 2, 4, 6), (3, 6, 9)}.
6. {(1, 1, 2), (2, 1, 0)}.
7. {( 1, 1, 2), (0, 2, 1), (3, 1, 2), ( 1, 1, 1)}.
8. {(1, 1, 2, 3), (2, 1, 1, 1), ( 1, 1, 1, 1)}.
9. {(2, 1, 0, 1), (1, 0, 1, 2), (0, 3, 1, 2), ( 1, 1, 2, 1)}.

10. Let v1 = (1, 2, 3), v2 = (4, 5, 6), v3 = (7, 8, 9). Determine whether {v1, v2, v3} is linearly independent in R^3. Describe span{v1, v2, v3} geometrically.

11. Consider the vectors v1 = (2, 1, 5), v2 = (1, 3, 4), v3 = ( 3, 9, 12) in R^3.
(a) Show that {v1, v2, v3} is linearly dependent.
(b) Is v1 in span{v2, v3}? Draw a picture illustrating your answer.

12. Determine all values of the constant k for which the vectors (1, 1, k), (0, 2, k), and (1, k, 6) are linearly dependent in R^3.

For Problems 13-14, determine all values of the constant k for which the given set of vectors is linearly independent in R^4.

13. {(1, 0, 1, k), ( 1, 0, k, 1), (2, 0, 1, 3)}.
14. {(1, 1, 0, 1), (1, k, 1, 1), (2, 1, k, 1), ( 1, 1, 1, k)}.

For Problems 15-17, determine whether the given set of vectors is linearly independent in M2(R).

15. A 1 = [ ], A 01 2 = [ ], A 3 = [ 04 ].
16. A 1 = [ ], A 2 = [ 13 ].

17. A 1 = [ ], A 12 2 = [ 21 ], A 3 = [ ].

For Problems 18-19, determine whether the given set of vectors is linearly independent in P1.

18. p1(x) = 1 - x, p2(x) = 1 + x.
19. p1(x) = 2 + 3x, p2(x) = 4 + 6x.

20. Show that the vectors p1(x) = a + bx and p2(x) = c + dx are linearly independent in P1 if and only if the constants a, b, c, d satisfy ad - bc != 0.

21. If f1(x) = cos 2x, f2(x) = sin^2 x, f3(x) = cos^2 x, determine whether {f1, f2, f3} is linearly dependent or linearly independent in C^∞(-∞, ∞).

For Problems 22-28, determine a linearly independent set of vectors that spans the same subspace of V as that spanned by the original set of vectors.

22. V = R^3, {(1, 2, 3), ( 3, 4, 5), (1, 4/3, 5/3)}.
23. V = R^3, {(3, 1, 5), (0, 0, 0), (1, 2, 1), ( 1, 2, 3)}.
24. V = R^3, {(1, 1, 1), (1, 1, 1), (1, 3, 1), (3, 1, 2)}.
25. V = R^4, {(1, 1, 1, 1), (2, 1, 3, 1), (1, 1, 2, 1), (2, 1, 2, 1)}.
26. V = M2(R), { [ ], [ 12 ], [ ] }.
27. V = P1, {2 - 5x, 3 + 7x, 4 - x}.
28. V = P2, {2 + x^2, 4 - 2x + 3x^2, 1 + x}.

For Problems 29-33, use the Wronskian to show that the given functions are linearly independent on the given interval I.

29. f1(x) = 1, f2(x) = x, f3(x) = x^2, I = (-∞, ∞).
30. f1(x) = sin x, f2(x) = cos x, f3(x) = tan x, I = (-π/2, π/2).
31. f1(x) = 1, f2(x) = 3x, f3(x) = x^2 - 1, I = (-∞, ∞).
32. f1(x) = e^(2x), f2(x) = e^(3x), f3(x) = e^(-x), I = (-∞, ∞).
33. f1(x) = { x^2, if x >= 0; 3x^3, if x < 0 }, f2(x) = 7x^2, I = (-∞, ∞).

For Problems 34-36, show that the Wronskian of the given functions is identically zero on (-∞, ∞). Determine whether the functions are linearly independent or linearly dependent on that interval.

34. f1(x) = 1, f2(x) = x, f3(x) = 2x - 1.
35. f1(x) = e^x, f2(x) = e^(-x), f3(x) = cosh x.
36. f1(x) = 2x^3, f2(x) = { 5x^3, if x >= 0; 3x^3, if x < 0 }.

37. Consider the functions f1(x) = x, f2(x) = { x, if x >= 0; -x, if x < 0 }.
(a) Show that f2 is not in C^1(-∞, ∞).
(b) Show that {f1, f2} is linearly dependent on the intervals (-∞, 0) and [0, ∞), while it is linearly independent on the interval (-∞, ∞). Justify your results by making a sketch showing both of the functions.

38. Determine whether the functions

f1(x) = x,   f2(x) = { x, if x != 0; 1, if x = 0 }

are linearly dependent or linearly independent on I = (-∞, ∞).

39. Show that the functions

f1(x) = { x - 1, if x >= 1; 2(x - 1), if x < 1 },   f2(x) = 2x,   f3(x) = 3

form a linearly independent set on (-∞, ∞). Determine all intervals on which {f1, f2, f3} is linearly dependent.

40. (a) Show that {1, x, x^2, x^3} is linearly independent on every interval.
(b) If f_k(x) = x^k for k = 0, 1, ..., n, show that {f_0, f_1, ..., f_n} is linearly independent on every interval for all fixed n.

41. (a) Show that the functions f1(x) = e^(r1 x), f2(x) = e^(r2 x), f3(x) = e^(r3 x) have Wronskian

W[f1, f2, f3](x) = e^((r1 + r2 + r3)x) | 1     1     1    |
                                       | r1    r2    r3   |
                                       | r1^2  r2^2  r3^2 |
                 = e^((r1 + r2 + r3)x) (r3 - r1)(r3 - r2)(r2 - r1),

and hence determine the conditions on r1, r2, r3 such that {f1, f2, f3} is linearly independent on every interval.
(b) More generally, show that the set of functions {e^(r1 x), e^(r2 x), ..., e^(rn x)} is linearly independent on every interval if and only if all of the r_i are distinct. [Hint: Show that the Wronskian of the given functions is a multiple of the n x n Vandermonde determinant, and then use Problem 21 in Section 3.3.]

42. Let {v1, v2} be a linearly independent set in a vector space V, and let v = αv1 + v2, w = v1 + αv2, where α is a constant. Use Definition 4.5.4 to determine all values of α for which {v, w} is linearly independent.

43. If v1 and v2 are vectors in a vector space V, and u1, u2, u3 are each linear combinations of them, prove that {u1, u2, u3} is linearly dependent.

44. Let v1, v2, ..., vm be a set of linearly independent vectors in a vector space V and suppose that the vectors u1, u2, ..., un are each linear combinations of them. It follows that we can write

u_k = sum over i = 1 to m of a_ik v_i,   k = 1, 2, ..., n,

for appropriate constants a_ik.
(a) If n > m, prove that {u1, u2, ..., un} is linearly dependent on V.
(b) If n = m, prove that {u1, u2, ..., un} is linearly independent in V if and only if det[a_ij] != 0.
(c) If n < m, prove that {u1, u2, ..., un} is linearly independent in V if and only if rank(A) = n, where A = [a_ij].
(d) Which result from this section do these results generalize?

45.
45. Prove from the definition of linear independence that if {v_1, v_2, ..., v_n} is linearly independent and A is an invertible n × n matrix, then the set {Av_1, Av_2, ..., Av_n} is linearly independent.

46. Prove that if {v_1, v_2} is linearly independent and v_3 is not in span{v_1, v_2}, then {v_1, v_2, v_3} is linearly independent.

47. Generalizing the previous exercise, prove that if {v_1, v_2, ..., v_k} is linearly independent and v_{k+1} is not in span{v_1, v_2, ..., v_k}, then {v_1, v_2, ..., v_{k+1}} is linearly independent.

48. Prove Theorem.

49. Prove Proposition.

50. Prove that if {v_1, v_2, ..., v_k} spans a vector space V, then for every vector v in V, {v, v_1, v_2, ..., v_k} is linearly dependent.

51. Prove that if V = P_n and S = {p_1, p_2, ..., p_k} is a set of vectors in V, each of a different degree, then S is linearly independent. [Hint: Assume without loss of generality that the polynomials are ordered in descending degree: deg(p_1) > deg(p_2) > ··· > deg(p_k). Assuming that c_1 p_1 + c_2 p_2 + ··· + c_k p_k = 0, first show that c_1 is zero by examining the highest-degree term. Then repeat for lower degrees to show successively that c_2 = 0, c_3 = 0, and so on.]

4.6 Bases and Dimension

The results of the previous section show that if a minimal spanning set exists in a (nontrivial) vector space V, it cannot be linearly dependent. Therefore, if we are looking for minimal spanning sets for V, we should focus our attention on spanning sets that are linearly independent. One of the results of this section establishes that every spanning set for V that is linearly independent is indeed a minimal spanning set. Such a set will be
