8. Kernel, Image, Dimension


Kernel and Image

The subspace of V consisting of the vectors that are mapped to 0 in W, namely ker(T) = {X ∈ V : T(X) = 0}, is called the kernel of the transformation T. We shall discover as we continue to flesh out the theory that, with respect to any linear transformation T: V → W, the most important associated objects are the kernel of T, a subspace of V, and the image of T, Im(T) = {T(X) ∈ W : X ∈ V}, which we have seen is a subspace of W.

Examples: Consider the map R: R³ → R³ given by the formula R(x, y, z) = (x, y, −z). Geometrically, this produces the effect of reflecting points in space across the xy-plane. The kernel of R consists of only the zero vector: ker(R) = {0}. The image is all of 3-space: Im(R) = R³.
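As a quick numerical check (a Python sketch using NumPy; representing R by its matrix in the standard basis is our own device here, not something the text has introduced):

```python
import numpy as np

# The reflection R(x, y, z) = (x, y, -z) across the xy-plane, written as
# the diagonal matrix that represents it in the standard basis.
R = np.diag([1., 1., -1.])

# R has full rank, so R @ X = 0 forces X = 0: ker(R) = {0} and Im(R) = R^3.
print(np.linalg.matrix_rank(R))    # 3

# R really reflects points across the xy-plane:
print(R @ np.array([2., 5., 7.]))  # [ 2.  5. -7.]
```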

The differential operator D: P₅(R) → P₅(R) can be composed with itself. The composite, D² = D ∘ D, is the second-order differential operator: it computes the second derivative of a polynomial in P₅(R). Notice that ker(D²) is the subspace P₁(R) of P₅(R), whereas Im(D²) is the subspace P₃(R).

Recall that, where a is any element chosen from a set S, the evaluation map eₐ: Fun(S, R) → R given by the formula eₐ(f) = f(a) is a linear transformation. Here, ker(eₐ) = {f: S → R : f(a) = 0} is the subset of Fun(S, R) consisting of those functions that send a to 0; that is, ker(eₐ) = Fun(S, R; {a}). Further, the characteristic function χₐ is an element of Fun(S, R), whence so is any scalar multiple rχₐ. This implies that Im(eₐ) = R, since eₐ(rχₐ) = (rχₐ)(a) = r(χₐ(a)) = r.

A most fundamental fact regarding a linear transformation defined on a finite-dimensional vector space is the following theorem; it describes how the kernel and image of a transformation are related.
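Before stating it, the differentiation example above can be checked concretely (a sketch with NumPy's polynomial module, storing a polynomial in P₅(R) by its list of coefficients, lowest degree first):

```python
import numpy as np
from numpy.polynomial import polynomial as P

# p(x) = 1 + 2x + 3x^2 + x^5, an element of P_5(R).
p = [1., 2., 3., 0., 0., 1.]

# D^2 takes the second derivative; the result has degree at most 3,
# so it lands in P_3(R).
d2p = P.polyder(p, 2)
print(d2p)                  # [ 6.  0.  0. 20.]  i.e. 6 + 20x^3

# Any a + bx in P_1(R) is killed by D^2, so P_1(R) lies in ker(D^2).
q = [4., 7.]                # 4 + 7x
print(P.polyder(q, 2))      # [0.]
```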

Theorem Let T: V → W be a linear transformation. If V is finite-dimensional, then so are Im(T) and ker(T), and

dim(Im(T)) + dim(ker(T)) = dim V.

(The dimension of the image space is sometimes called the rank of T, and the dimension of the kernel is sometimes called the nullity of T. As such, this theorem goes by the name of the Rank-Nullity Theorem.)

Proof Since V is finite-dimensional, so is its subspace ker(T), so we can find a basis for it, say {B₁, B₂, …, Bₖ}. Extend this set to a basis for all of V by including the vectors A₁, A₂, …, Aⱼ; that is, {A₁, A₂, …, Aⱼ, B₁, B₂, …, Bₖ} is a basis for V. Then

Im(T) = T(V) = T(L(A₁, A₂, …, Aⱼ, B₁, B₂, …, Bₖ))
= L(T(A₁), …, T(Aⱼ), T(B₁), …, T(Bₖ))
= L(T(A₁), …, T(Aⱼ), 0, …, 0)
= L(T(A₁), …, T(Aⱼ))

shows that Im(T) is finite-dimensional as well.

Indeed, it also shows that Im(T) is spanned by the j vectors T(A₁), T(A₂), …, T(Aⱼ). We claim that these j vectors are in fact linearly independent as well. For if there are scalars r₁, r₂, …, rⱼ for which

r₁T(A₁) + r₂T(A₂) + ⋯ + rⱼT(Aⱼ) = 0,

then by the linearity of T,

T(r₁A₁ + r₂A₂ + ⋯ + rⱼAⱼ) = 0,

implying that r₁A₁ + r₂A₂ + ⋯ + rⱼAⱼ lies in ker(T). Therefore, there must be scalars s₁, s₂, …, sₖ for which

r₁A₁ + r₂A₂ + ⋯ + rⱼAⱼ = s₁B₁ + s₂B₂ + ⋯ + sₖBₖ.

Writing this in the form

r₁A₁ + r₂A₂ + ⋯ + rⱼAⱼ − s₁B₁ − s₂B₂ − ⋯ − sₖBₖ = 0

gives us a linear combination of basis vectors of V equal to the zero vector, so all the scalars (in particular, all the r's) must equal 0. Thus, we have shown that dim(Im(T)) = j, dim(ker(T)) = k, and that dim V = j + k, which is what the theorem states. //
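The theorem is easy to test numerically. The sketch below (Python with NumPy; the particular matrix is just an illustrative choice) computes the rank and the nullity of a map T: R⁵ → R³ and confirms that they sum to dim R⁵ = 5:

```python
import numpy as np

# T: R^5 -> R^3, represented by a 3x5 matrix whose third row is the sum
# of the first two, so dim(Im(T)) = 2.
A = np.array([[1., 2., 0., 1., 3.],
              [0., 1., 1., 0., 1.],
              [1., 3., 1., 1., 4.]])

rank = np.linalg.matrix_rank(A)            # dim(Im(T))

# dim(ker(T)) = number of columns minus the number of nonzero singular values.
s = np.linalg.svd(A, compute_uv=False)
nullity = A.shape[1] - int(np.sum(s > 1e-10))

print(rank, nullity)                       # 2 3
assert rank + nullity == A.shape[1]        # dim(Im(T)) + dim(ker(T)) = dim(V)
```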

Fun with linear transformations

We have seen how, given an arbitrary set S and a vector space W, the function space Fun(S, W) is itself a vector space under natural definitions of addition and scalar multiplication of functions from S to W. (Recall Exercise 3.10, p. 31.) An important example of such a vector space arises when S is chosen to be a basis for another vector space V. Let S = {A₁, A₂, …, Aₘ} be a basis for V. Then choose some function f from Fun(S, W) and set f(A₁) = B₁, f(A₂) = B₂, …, f(Aₘ) = Bₘ. These B's are vectors in W; they need not even be distinct. Now, since every A ∈ V is uniquely a linear combination of vectors from S, we can define a new function F from V to W by setting

F(A) = F(r₁A₁ + r₂A₂ + ⋯ + rₘAₘ)

equal to

r₁f(A₁) + r₂f(A₂) + ⋯ + rₘf(Aₘ) = r₁B₁ + r₂B₂ + ⋯ + rₘBₘ.

In fact, this definition shows that F(Aᵢ) = f(Aᵢ) for each i = 1, 2, …, m; that is, F can be viewed as an extension of the function f: f has S as domain whereas F has all of V = L(S) as domain. What is more, F is a linear transformation between the vector spaces V and W since

F(r₁A₁ + r₂A₂ + ⋯ + rₘAₘ) = r₁f(A₁) + r₂f(A₂) + ⋯ + rₘf(Aₘ) = r₁F(A₁) + r₂F(A₂) + ⋯ + rₘF(Aₘ).

For this reason, we call F the linear extension of f. It is a useful construction, as we shall now see.

Theorem Let A₁, A₂, …, Aₘ ∈ V be any m vectors chosen from a vector space V. Where E = {E₁, E₂, …, Eₘ} is the standard basis in Rᵐ, define f ∈ Fun(E, V) by f(Eᵢ) = Aᵢ for each i and let F be its linear extension to all of Rᵐ. Then

(1) {A₁, A₂, …, Aₘ} is linearly independent in V if and only if ker(F) = {0}; and
(2) {A₁, A₂, …, Aₘ} spans V if and only if Im(F) = V.

Proof (1) {A₁, A₂, …, Aₘ} is linearly independent in V
⟺ whenever the linear combination r₁A₁ + r₂A₂ + ⋯ + rₘAₘ equals 0, r₁ = r₂ = ⋯ = rₘ = 0
⟺ whenever 0 = r₁f(E₁) + r₂f(E₂) + ⋯ + rₘf(Eₘ) = r₁F(E₁) + r₂F(E₂) + ⋯ + rₘF(Eₘ) = F(r₁E₁ + r₂E₂ + ⋯ + rₘEₘ) = F(r₁, r₂, …, rₘ), then r₁ = r₂ = ⋯ = rₘ = 0
⟺ if X = (r₁, r₂, …, rₘ) ∈ ker(F), then X = 0
⟺ ker(F) = {0}.

(2) {A₁, A₂, …, Aₘ} spans V
⟺ every A ∈ V has the form r₁A₁ + r₂A₂ + ⋯ + rₘAₘ = r₁f(E₁) + ⋯ + rₘf(Eₘ) = r₁F(E₁) + ⋯ + rₘF(Eₘ) = F(r₁E₁ + ⋯ + rₘEₘ) = F(r₁, r₂, …, rₘ)
⟺ Im(F) = V. //
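In coordinates, the theorem becomes a statement about the matrix whose columns are A₁, …, Aₘ, since the linear extension F sends (r₁, …, rₘ) to r₁A₁ + ⋯ + rₘAₘ. A small sketch (the three sample vectors are our own hypothetical choice):

```python
import numpy as np

# F: R^3 -> R^3 is the matrix with columns A1, A2, A3 (sample vectors).
A1 = np.array([1., 0., 1.])
A2 = np.array([0., 1., 1.])
A3 = np.array([1., 1., 2.])          # A3 = A1 + A2, a dependent choice
F = np.column_stack([A1, A2, A3])

# rank F = 3 would mean ker(F) = {0} (independence) and Im(F) = R^3 (spanning).
print(np.linalg.matrix_rank(F))      # 2: this set is neither independent nor spanning
```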

We know that Fun(S, W), the collection of functions f: S → W, has the structure of a vector space. When we choose this set S = {A₁, A₂, …, Aₘ} to be a basis for another vector space V, then the collection of linear extensions F: V → W of elements of Fun(S, W), which we denote L(V, W), is itself a vector space; in fact, it is a subspace of the vector space Fun(V, W). (Why?) Observe that every linear transformation T: V → W between V and W is the linear extension of some function in Fun(S, W), namely of the function whose value at each vector in S is the same as the value T has there. Thus, L(V, W) is the space of all linear transformations between V and W.

A particularly important example of a space of linear transformations arises when W is taken to be the 1-dimensional vector space consisting of the field of scalars R. In this instance, we call L(V, R) the dual space of V and use the special notation V* = L(V, R). We will study the properties of the dual space of a vector space somewhat later.
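For V = Rⁿ, an element of the dual space V* can be pictured as a row vector acting on column vectors; a tiny illustration (the functional φ is an arbitrary hypothetical example):

```python
import numpy as np

# phi in (R^3)*: the linear functional phi(x, y, z) = 2x - y + 3z.
phi = np.array([2., -1., 3.])
v = np.array([1., 1., 1.])
print(phi @ v)    # 4.0: phi(1, 1, 1) = 2 - 1 + 3
```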

Another important situation occurs when W = V. Here, L(V, V) represents the space of linear transformations from V to itself. Such functions are called endomorphisms on V. The simplest endomorphism on a vector space is the identity map I ∈ L(V, V), with the simple definition I(A) = A.

Yet another important phenomenon arises when a linear transformation is invertible: if T ∈ L(V, W) and S ∈ L(W, V), then these transformations can be composed in both directions: S ∘ T ∈ L(V, V) and T ∘ S ∈ L(W, W) are both endomorphisms (but of different spaces!). If it should happen that both S ∘ T and T ∘ S are identity maps on their respective spaces, then T is invertible and S is its inverse transformation. In addition, since every invertible function must be a one-to-one correspondence between its domain and codomain, every invertible transformation T ∈ L(V, W) is a one-to-one correspondence between the vector spaces V and W. That is, every vector A in V is associated by the transformation T to a unique vector T(A) in W, and vice versa. The linearity of T ensures, for instance, that a set of basis vectors in V is associated to a set of basis vectors in W. In particular, this means that if there is an invertible

linear transformation T ∈ L(V, W), then V and W must have the same dimension, indeed, the same structure. Consequently, we call the two vector spaces isomorphic and also refer to T as a vector space isomorphism.

Proposition Suppose the linear transformation T ∈ L(V, W) has trivial kernel (ker(T) = {0}). Then every vector B ∈ Im(T) is the image of a unique vector A ∈ V.

Proof Clearly, B is the image of at least one vector in V. But if both A and A′ have B as image, then T(A) = T(A′), so T(A − A′) = T(A) − T(A′) = 0, so A − A′ ∈ ker(T). So A − A′ = 0, whence A = A′. //

Theorem A linear transformation T ∈ L(V, W) is an isomorphism (i.e., is invertible) if and only if ker(T) = {0} and Im(T) = W (the kernel is trivial and the image is full).

Proof Suppose first that T ∈ L(V, W) is an isomorphism. Then it has an inverse linear transformation S ∈ L(W, V) for which the compositions S ∘ T and T ∘ S are identity maps. So if A ∈ ker(T), then A = S ∘ T(A) = S(0) = 0; that is,

ker(T) = {0}. Also, if B ∈ W, then B = T ∘ S(B) is the image under T of the vector S(B) ∈ V; that is, Im(T) = W.

Conversely, suppose that T ∈ L(V, W) is a linear transformation with trivial kernel and full image. Then, by the previous proposition, every vector B in Im(T) = W is the image of a unique vector A ∈ V. So we can define a function S: W → V that sends B to the unique vector A that is sent to B by T. This function is a linear transformation, for if S(B) = A and S(B′) = A′, then T(A) = B and T(A′) = B′, so T(A + A′) = B + B′, implying that S(B + B′) = A + A′ = S(B) + S(B′); also, for any scalar r, T(rA) = rB, so S(rB) = rA. Thus, S ∈ L(W, V), and S ∘ T and T ∘ S are the respective identity maps, so T is an isomorphism. //
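For a map of Rⁿ to itself given by a matrix, "trivial kernel and full image" both amount to full rank, which is exactly invertibility; a quick sketch (the matrix is an arbitrary invertible example):

```python
import numpy as np

T = np.array([[2., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])          # T: R^3 -> R^3

rank = np.linalg.matrix_rank(T)
print(rank)                            # 3: ker(T) = {0} and Im(T) = R^3
S = np.linalg.inv(T)                   # the inverse transformation S
print(np.allclose(S @ T, np.eye(3)))   # True: S o T is the identity on R^3
print(np.allclose(T @ S, np.eye(3)))   # True: T o S is the identity on R^3
```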

Corollary If V and W are finite-dimensional vector spaces, then they are isomorphic if and only if they have the same dimension.

Proof If V and W are isomorphic, then there is some isomorphism T: V → W between them. If V has dimension n, then it has a basis with n vectors, say {A₁, A₂, …, Aₙ}. Then

W = T(V) = T(L(A₁, A₂, …, Aₙ)) = L(T(A₁), T(A₂), …, T(Aₙ)),

so that {T(A₁), T(A₂), …, T(Aₙ)} is a spanning set for W. But these vectors are also linearly independent, for if r₁T(A₁) + r₂T(A₂) + ⋯ + rₙT(Aₙ) = 0, then T(r₁A₁ + r₂A₂ + ⋯ + rₙAₙ) = 0, which, since T has trivial kernel, means that r₁A₁ + r₂A₂ + ⋯ + rₙAₙ = 0, whence, since {A₁, A₂, …, Aₙ} is a basis, forces all the r's to equal 0. Therefore,

{T(A₁), T(A₂), …, T(Aₙ)} is a basis for W, from which we conclude that W also has dimension n.

Conversely, if dim V = dim W = n, then both spaces have bases of size n: say {A₁, A₂, …, Aₙ} is a basis for V and {B₁, B₂, …, Bₙ} is a basis for W. Let T be the linear extension of the function that sends Aᵢ to Bᵢ for i = 1, 2, …, n. Then Im(T) = L(B₁, B₂, …, Bₙ) = W, and also

n = dim(V) = dim(Im(T)) + dim(ker(T)) = dim(W) + dim(ker(T)) = n + dim(ker(T)),

implying that dim(ker(T)) = 0, so ker(T) = {0}. Since T has trivial kernel and full image, it must be an isomorphism. //

Corollary Every n-dimensional vector space is isomorphic to Rⁿ. //
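The final corollary is what lets us compute with coordinates: choosing the basis {1, x, x²} for P₂(R), the coordinate map sending a + bx + cx² to (a, b, c) is an isomorphism onto R³. A sketch of this identification:

```python
import numpy as np

# Coordinates of p(x) = 1 + 2x + 3x^2 and q(x) = x + x^2 in the basis {1, x, x^2}.
p = np.array([1., 2., 3.])
q = np.array([0., 1., 1.])

# Vector-space operations on polynomials match the operations in R^3.
print(p + q)        # [1. 3. 4.]: coordinates of 1 + 3x + 4x^2
print(2 * p)        # [2. 4. 6.]: coordinates of 2 + 4x + 6x^2
```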