5. Linear algebra I: dimension




5.1 Some simple results
5.2 Bases and dimension
5.3 Homomorphisms and dimension

1. Some simple results

Several observations should be made. Once stated explicitly, the proofs are easy. [1]

The intersection of a (non-empty) set of subspaces of a vector space V is a subspace.

Proof: Let {W_i : i ∈ I} be a set of subspaces of V. For w lying in every W_i, the additive inverse -w is in every W_i, so -w lies in the intersection. The same argument proves the other defining properties of a subspace.

The subspace spanned by a set X of vectors in a vector space V is the intersection of all subspaces containing X. From the above, this intersection is a subspace.

The subspace spanned by a set X in a vector space V is the collection of all linear combinations of vectors from X.

Proof: Certainly every linear combination of vectors taken from X is in any subspace containing X. On the other hand, we must show that any vector in the intersection of subspaces containing X is a linear combination of vectors in X. It is not hard to check that the collection of such linear combinations is itself a subspace of V, and contains X. Therefore, the intersection is no larger than this set of linear combinations.

[1] At the beginning of the abstract form of this and other topics, there are several results which have little informational content, but, rather, only serve to assure us that the definitions/axioms have not included phenomena too violently in opposition to our expectations. This is not surprising, considering that the definitions have endured several decades of revision exactly to address foundational and other potential problems.
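The characterization of the span as the set of all linear combinations gives a concrete membership test: a vector lies in the span of X exactly when a certain linear system is solvable. A minimal sketch in Python, using exact rational arithmetic (the helper name `in_span` is ours, not from the text):

```python
from fractions import Fraction

def in_span(vectors, target):
    """Decide whether `target` is a linear combination of `vectors`.

    Row-reduces the augmented matrix whose columns are the spanning
    vectors, with `target` as an extra last column; `target` lies in
    the span exactly when no row reduces to (0, ..., 0 | nonzero).
    """
    if not vectors:
        return all(x == 0 for x in target)
    m = len(target)        # dimension of the ambient space k^m
    n = len(vectors)       # number of spanning vectors
    A = [[Fraction(vectors[j][i]) for j in range(n)] + [Fraction(target[i])]
         for i in range(m)]
    row = 0
    for col in range(n):                       # Gaussian elimination
        pivot = next((r for r in range(row, m) if A[r][col] != 0), None)
        if pivot is None:
            continue
        A[row], A[pivot] = A[pivot], A[row]
        for r in range(m):
            if r != row and A[r][col] != 0:
                f = A[r][col] / A[row][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[row])]
        row += 1
    # consistent iff every row with all-zero coefficient part also has
    # a zero in the target column
    return all(any(A[r][c] != 0 for c in range(n)) or A[r][n] == 0
               for r in range(m))
```

For instance, (3, 5, 0) lies in the span of (1, 0, 0) and (0, 1, 0) in k^3, while (0, 0, 1) does not.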

A linearly independent set of vectors spanning a subspace W of V is a basis for W.

[1.0.1] Proposition: Given a basis e_1, ..., e_n for a vector space V, there is exactly one expression for an arbitrary vector v ∈ V as a linear combination of e_1, ..., e_n.

Proof: That there is at least one expression follows from the spanning property. On the other hand, if

v = a_1 e_1 + ... + a_n e_n  and  v = b_1 e_1 + ... + b_n e_n

are two expressions for v, then subtract to obtain

(a_1 - b_1) e_1 + ... + (a_n - b_n) e_n = 0

Since the e_i are linearly independent, a_i = b_i for all indices i.

2. Bases and dimension

The argument in the proof of the following fundamental theorem is the Lagrange replacement principle. This is the first non-trivial result in linear algebra.

[2.0.1] Theorem: Let v_1, ..., v_m be a linearly independent set of vectors in a vector space V, and let w_1, ..., w_n be a basis for V. Then m ≤ n, and (renumbering the vectors w_i if necessary) the vectors

v_1, ..., v_m, w_{m+1}, w_{m+2}, ..., w_n

are a basis for V.

Proof: Since the w_i are a basis, we may express v_1 as a linear combination

v_1 = c_1 w_1 + ... + c_n w_n

Not all coefficients can be 0, since v_1 is not 0. Renumbering the w_i if necessary, we can assume that c_1 ≠ 0. Since the scalars k form a field, we can express w_1 in terms of v_1 and w_2, ..., w_n:

w_1 = c_1^{-1} v_1 + (-c_1^{-1} c_2) w_2 + ... + (-c_1^{-1} c_n) w_n

Replacing w_1 by v_1, the vectors v_1, w_2, w_3, ..., w_n span V. They are still linearly independent, since if v_1 were a linear combination of w_2, ..., w_n then the expression for w_1 in terms of v_1, w_2, ..., w_n would show that w_1 was a linear combination of w_2, ..., w_n, contradicting the linear independence of w_1, ..., w_n.

Suppose inductively that v_1, ..., v_i, w_{i+1}, ..., w_n are a basis for V, with i < n. Express v_{i+1} as a linear combination

v_{i+1} = a_1 v_1 + ... + a_i v_i + b_{i+1} w_{i+1} + ... + b_n w_n

Some b_j is non-zero, or else v_{i+1} would be a linear combination of v_1, ..., v_i, contradicting the linear independence of the v_j.
By renumbering the w_j if necessary, assume that b_{i+1} ≠ 0. Rewrite this to express w_{i+1} as a linear combination of v_1, ..., v_{i+1}, w_{i+2}, ..., w_n:

w_{i+1} = (-b_{i+1}^{-1} a_1) v_1 + ... + (-b_{i+1}^{-1} a_i) v_i + (b_{i+1}^{-1}) v_{i+1} + (-b_{i+1}^{-1} b_{i+2}) w_{i+2} + ... + (-b_{i+1}^{-1} b_n) w_n

Thus, v_1, ..., v_{i+1}, w_{i+2}, ..., w_n span V. Claim that these vectors are linearly independent: if for some coefficients a_j, b_j

a_1 v_1 + ... + a_{i+1} v_{i+1} + b_{i+2} w_{i+2} + ... + b_n w_n = 0

then the coefficient a_{i+1} is non-zero, because of the linear independence of v_1, ..., v_i, w_{i+1}, ..., w_n. Thus, we can rearrange to express v_{i+1} as a linear combination of v_1, ..., v_i, w_{i+2}, ..., w_n. Substituting this into the expression above for w_{i+1} gives an expression for w_{i+1} as a linear combination of v_1, ..., v_i, w_{i+2}, ..., w_n. But this would contradict the (inductively assumed) linear independence of v_1, ..., v_i, w_{i+1}, w_{i+2}, ..., w_n.

Consider the possibility that m > n. Then, by the previous argument, v_1, ..., v_n is a basis for V. Thus, v_{n+1} is a linear combination of v_1, ..., v_n, contradicting their linear independence. Thus, m ≤ n, and v_1, ..., v_m, w_{m+1}, ..., w_n is a basis for V, as claimed.

Now define the (k-)dimension [2] of a vector space (over the field k) as the number of elements in a (k-)basis. The theorem says that this number is well-defined. Write

dim V = dimension of V

A vector space is finite-dimensional if it has a finite basis. [3]

[2.0.2] Corollary: A linearly independent set of vectors in a finite-dimensional vector space can be augmented to be a basis.

Proof: Let v_1, ..., v_m be a linearly independent set of vectors, let w_1, ..., w_n be a basis, and apply the theorem.

[2.0.3] Corollary: The dimension of a proper subspace of a finite-dimensional vector space is strictly less than the dimension of the whole space.

Proof: Let w_1, ..., w_m be a basis for the subspace. By the theorem, it can be extended to a basis w_1, ..., w_m, v_{m+1}, ..., v_n of the whole space. It must be that n > m, or else the subspace is the whole space.

[2.0.4] Corollary: The dimension of k^n is n. The vectors

e_1 = (1, 0, 0, ..., 0, 0)
e_2 = (0, 1, 0, ..., 0, 0)
e_3 = (0, 0, 1, ..., 0, 0)
...
e_n = (0, 0, 0, ..., 0, 1)

are a basis (the standard basis).

Proof: Those vectors span k^n, since

(c_1, ..., c_n) = c_1 e_1 + ... + c_n e_n

[2] This is an instance of terminology that is nearly too suggestive. That is, a naive person might all too easily accidentally assume that there is a connection to the colloquial sense of the word dimension, or that there is an appeal to physical or visual intuition. Or one might assume that it is somehow obvious that dimension is a well-defined invariant.

[3] We proved only the finite-dimensional case of the well-definedness of dimension. The infinite-dimensional case needs transfinite induction or an equivalent.

On the other hand, a linear dependence relation

0 = c_1 e_1 + ... + c_n e_n

gives (c_1, ..., c_n) = (0, ..., 0), from which each c_i is 0. Thus, these vectors are a basis for k^n.

3. Homomorphisms and dimension

Now we see how dimension behaves under homomorphisms. Again, a vector space homomorphism [4] f : V → W from a vector space V over a field k to a vector space W over the same field k is a function f such that

f(v_1 + v_2) = f(v_1) + f(v_2)  (for all v_1, v_2 ∈ V)
f(α · v) = α · f(v)  (for all α ∈ k, v ∈ V)

The kernel of f is

ker f = {v ∈ V : f(v) = 0}

and the image of f is

Im f = {f(v) : v ∈ V}

A homomorphism is an isomorphism if it has a two-sided inverse homomorphism. For vector spaces, a homomorphism that is a bijection is an isomorphism. [5]

A vector space homomorphism f : V → W sends 0 (in V) to 0 (in W), and, for v ∈ V, f(-v) = -f(v). [6]

[3.0.1] Proposition: The kernel and image of a vector space homomorphism f : V → W are vector subspaces of V and W, respectively.

Proof: Regarding the kernel, the previous remark shows that it contains 0, and that additive inverses of elements in the kernel are again in the kernel. For x, y ∈ ker f,

f(x + y) = f(x) + f(y) = 0 + 0 = 0

so ker f is closed under addition. For α ∈ k and v ∈ ker f,

f(α · v) = α · f(v) = α · 0 = 0

so ker f is closed under scalar multiplication. Thus, the kernel is a vector subspace.

Similarly, f(0) = 0 shows that 0 is in the image of f. For w = f(v) in the image of f and α ∈ k,

α · w = α · f(v) = f(αv) ∈ Im f

[4] Or linear map or linear operator.

[5] In most of the situations we will encounter, bijectivity of various sorts of homomorphisms is sufficient (and certainly necessary) to assure that there is an inverse map of the same sort, justifying this description of isomorphism.

[6] This follows from the analogous result for groups, since V with its additive structure is an abelian group.

For x = f(u) and y = f(v) both in the image of f,

x + y = f(u) + f(v) = f(u + v) ∈ Im f

And from above, f(-v) = -f(v), so the image is a vector subspace.

[3.0.2] Corollary: A linear map f : V → W is injective if and only if its kernel is the trivial subspace {0}.

Proof: This follows from the analogous assertion for groups.

[3.0.3] Corollary: Let f : V → W be a vector space homomorphism, with V finite-dimensional. Then

dim ker f + dim Im f = dim V

Proof: Let v_1, ..., v_m be a basis for ker f, and, invoking the theorem, let w_{m+1}, ..., w_n be vectors in V such that v_1, ..., v_m, w_{m+1}, ..., w_n form a basis for V. We claim that the images f(w_{m+1}), ..., f(w_n) are a basis for Im f.

First, we show that these vectors span. For w = f(v) in the image, express v as a linear combination

v = a_1 v_1 + ... + a_m v_m + b_{m+1} w_{m+1} + ... + b_n w_n

and apply f:

w = a_1 f(v_1) + ... + a_m f(v_m) + b_{m+1} f(w_{m+1}) + ... + b_n f(w_n)
  = a_1 · 0 + ... + a_m · 0 + b_{m+1} f(w_{m+1}) + ... + b_n f(w_n)
  = b_{m+1} f(w_{m+1}) + ... + b_n f(w_n)

since the v_i are in the kernel. Thus, the f(w_j) span the image.

For linear independence, suppose

0 = b_{m+1} f(w_{m+1}) + ... + b_n f(w_n)

Then

0 = f(b_{m+1} w_{m+1} + ... + b_n w_n)

so b_{m+1} w_{m+1} + ... + b_n w_n would be in the kernel of f, hence would be a linear combination of the v_i, contradicting the fact that v_1, ..., v_m, w_{m+1}, ..., w_n is a basis, unless all the b_j were 0. Thus, the f(w_j) are linearly independent, so are a basis for Im f.

Exercises

5.[3.0.1] For subspaces V, W of a vector space over a field k, show that

dim_k V + dim_k W = dim_k (V + W) + dim_k (V ∩ W)

5.[3.0.2] Given two bases e_1, ..., e_n and f_1, ..., f_n for a vector space V over a field k, show that there is a unique k-linear map T : V → V such that T(e_i) = f_i.
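The dimension formula of corollary [3.0.3] becomes concrete for a homomorphism f : k^n → k^m given by a matrix: the image is the span of the columns, so dim Im f is the rank of the matrix, and the corollary then yields dim ker f without solving any equations. A small self-contained check (the `rank` helper is ours, not from the text):

```python
from fractions import Fraction

def rank(matrix):
    """Rank of a matrix over the rationals, by Gaussian elimination."""
    A = [[Fraction(x) for x in row] for row in matrix]
    m, n = len(A), len(A[0])
    r = 0
    for col in range(n):
        pivot = next((i for i in range(r, m) if A[i][col] != 0), None)
        if pivot is None:
            continue
        A[r], A[pivot] = A[pivot], A[r]
        for i in range(m):
            if i != r and A[i][col] != 0:
                f = A[i][col] / A[r][col]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

# f : k^4 -> k^3 given by a 3 x 4 matrix; dim V = 4 (the number of columns)
A = [[1, 2, 0, 1],
     [0, 1, 1, 0],
     [1, 3, 1, 1]]      # third row = first row + second row, so rank is 2
dim_V = 4
dim_im = rank(A)             # dimension of the image = rank of the matrix
dim_ker = dim_V - dim_im     # the corollary forces dim ker f = 4 - 2 = 2
```

One can confirm `dim_ker` independently by counting the free variables when solving Ax = 0, which is exactly what the corollary predicts.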
5.[3.0.3] Given a basis e_1, ..., e_n of a k-vectorspace V, and given arbitrary vectors w_1, ..., w_n in a k-vectorspace W, show that there is a unique k-linear map T : V → W such that T(e_i) = w_i for all indices i.
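For V = k^n with the standard basis, the map of the preceding exercise can be written down explicitly: linearity forces T(c_1 e_1 + ... + c_n e_n) = c_1 w_1 + ... + c_n w_n, so T is the matrix whose columns are the w_i. A sketch (the function name is ours):

```python
def linear_map_from_basis_images(images):
    """Return the unique linear T : k^n -> k^m with T(e_i) = images[i].

    Linearity forces T(c_1 e_1 + ... + c_n e_n) = c_1 w_1 + ... + c_n w_n,
    i.e. T acts as the matrix whose columns are the images w_i.
    """
    m = len(images[0])     # dimension of the target space
    def T(v):
        return tuple(sum(c * w[i] for c, w in zip(v, images))
                     for i in range(m))
    return T

# with T(e_1) = (1, 1) and T(e_2) = (0, 2):
T = linear_map_from_basis_images([(1, 1), (0, 2)])
# then T(3, 4) = 3*(1, 1) + 4*(0, 2) = (3, 11)
```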

5.[3.0.4] The space Hom_k(V, W) of k-linear maps from one k-vectorspace V to another, W, is a k-vectorspace under the operation (α · T)(v) = α · (T(v)) for α ∈ k and T ∈ Hom_k(V, W). Show that

dim_k Hom_k(V, W) = dim_k V · dim_k W

5.[3.0.5] A flag V_1 ⊂ ... ⊂ V_l of subspaces of a k-vectorspace V is simply a collection of subspaces satisfying the indicated inclusions. The type of the flag is the list of dimensions of the subspaces V_i. Let W be a k-vectorspace, with a flag W_1 ⊂ ... ⊂ W_l of the same type as the flag in V. Show that there exists a k-linear map T : V → W such that T restricted to V_i is an isomorphism V_i → W_i.

5.[3.0.6] Let V_1 ⊂ ... ⊂ V_l be a flag of subspaces inside a finite-dimensional k-vectorspace V, and W_1 ⊂ ... ⊂ W_l a flag inside another finite-dimensional k-vectorspace W. We do not suppose that the two flags are of the same type. Compute the dimension of the space of k-linear homomorphisms T : V → W such that T(V_i) ⊆ W_i.
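Exercise 5.[3.0.4] can be made concrete for V = k^n and W = k^m: Hom_k(k^n, k^m) is the space of m × n matrices, and the matrix units (a single entry 1, the rest 0) form a basis, so the dimension is mn. A sketch enumerating that basis (the helper name is ours):

```python
def hom_basis(n, m):
    """Standard basis of Hom_k(k^n, k^m), viewed as m x n matrices:
    the matrix units E_{rc}, each with a single 1 in row r, column c."""
    return [[[1 if (i, j) == (r, c) else 0 for j in range(n)]
             for i in range(m)]
            for r in range(m) for c in range(n)]

basis = hom_basis(3, 2)    # Hom(k^3, k^2): expect dimension 3 * 2 = 6
```

The matrix units are visibly linearly independent and span, which is one way to attack the general statement after choosing bases of V and W.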