8. Kernel, Image, Dimension

Kernel and Image

The subspace of V consisting of the vectors that are mapped to 0 in W, namely ker(T) = {X ∈ V : T(X) = 0}, is called the kernel of the transformation T. We shall discover as we continue to flesh out the theory that, with respect to any linear transformation T : V → W, the most important associated objects are the kernel of T, a subspace of V, and the image of T, Im(T) = {T(X) ∈ W : X ∈ V}, which we have seen is a subspace of W.

Examples: Consider the map R : R³ → R³ given by the formula R(x, y, z) = (x, y, −z). Geometrically, this produces the effect of reflecting points in space across the xy-plane. The kernel of R consists of only the zero vector: ker(R) = {0}. The image is all of 3-space: Im(R) = R³.
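
A quick numerical check of this example (a minimal sketch only, assuming NumPy is available; the matrix below is the standard matrix of the reflection R): the matrix has full rank, so its kernel is trivial and its image is all of R³.

    import numpy as np

    # Standard matrix of the reflection R(x, y, z) = (x, y, -z) across the xy-plane
    R = np.array([[1.0, 0.0,  0.0],
                  [0.0, 1.0,  0.0],
                  [0.0, 0.0, -1.0]])

    rank = np.linalg.matrix_rank(R)     # dim(Im(R))
    nullity = R.shape[1] - rank         # dim(ker(R))
    print(rank, nullity)                # 3 0: full image, trivial kernel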

The differential operator D : P₅(R) → P₅(R) can be composed with itself. The composite, D² = D ∘ D, is the second-order differential operator: it computes the second derivative of a polynomial in P₅(R). Notice that ker(D²) is the subspace P₁(R) of P₅(R), whereas Im(D²) is the subspace P₃(R).

Recall that, where a is any element chosen from a set S, the evaluation map eₐ : Fun(S,R) → R given by the formula eₐ(f) = f(a) is a linear transformation. Here, ker(eₐ) = {f : S → R : f(a) = 0} is the subset of Fun(S,R) consisting of those functions that send a to 0; that is, ker(eₐ) = Fun(S,R;{a}). Further, the characteristic function χₐ is an element of Fun(S,R), whence so is any scalar multiple r·χₐ. This implies that Im(eₐ) = R, since eₐ(r·χₐ) = (r·χₐ)(a) = r·(χₐ(a)) = r.

A most fundamental fact regarding a linear transformation defined on a (finite-dimensional) vector space is the following theorem; it describes how the kernel and image of a transformation are related.
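
The D² example can be made concrete with a short computation (a sketch assuming SymPy; the monomial basis below is a choice made here for illustration): applying the second derivative to the basis 1, x, …, x⁵ of P₅(R) shows that exactly the polynomials of degree at most 1 are killed, and that the images span P₃(R).

    import sympy as sp

    x = sp.symbols('x')
    basis = [x**k for k in range(6)]                  # monomial basis of P5(R)
    second_derivs = [sp.diff(p, x, 2) for p in basis]

    print([p for p, q in zip(basis, second_derivs) if q == 0])   # [1, x] -> ker(D^2) = P1(R)
    print([q for q in second_derivs if q != 0])                  # [2, 6*x, 12*x**2, 20*x**3] spans P3(R)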

Theorem Let T : V → W be a linear transformation. If V is finite-dimensional, then so are Im(T) and ker(T), and

dim(Im(T)) + dim(ker(T)) = dim V.

(The dimension of the image space is sometimes called the rank of T, and the dimension of the kernel is sometimes called the nullity of T. As such, this theorem goes by the name of the Rank-Nullity Theorem.)

Proof Since V is finite-dimensional, so is its subspace ker(T), so we can find a basis for it, say {B₁, B₂, …, Bₖ}. Extend this set to a basis for all of V by including the vectors A₁, A₂, …, Aⱼ; that is, {A₁, A₂, …, Aⱼ, B₁, B₂, …, Bₖ} is a basis for V. Then

Im(T) = T(V) = T(L(A₁, A₂, …, Aⱼ, B₁, B₂, …, Bₖ))
            = L(T(A₁), …, T(Aⱼ), T(B₁), …, T(Bₖ))
            = L(T(A₁), …, T(Aⱼ), 0, …, 0)
            = L(T(A₁), …, T(Aⱼ))

shows that Im(T) is finite-dimensional as well.

Indeed, it also shows that Im(T) is spanned by the j vectors T(A₁), T(A₂), …, T(Aⱼ). We claim that these j vectors are in fact linearly independent as well. For if there are scalars r₁, r₂, …, rⱼ for which r₁T(A₁) + r₂T(A₂) + ⋯ + rⱼT(Aⱼ) = 0, then by the linearity of T, T(r₁A₁ + r₂A₂ + ⋯ + rⱼAⱼ) = 0, implying that r₁A₁ + r₂A₂ + ⋯ + rⱼAⱼ lies in ker(T). Therefore, there must exist scalars s₁, s₂, …, sₖ for which

r₁A₁ + r₂A₂ + ⋯ + rⱼAⱼ = s₁B₁ + s₂B₂ + ⋯ + sₖBₖ.

Writing this in the form

r₁A₁ + r₂A₂ + ⋯ + rⱼAⱼ − s₁B₁ − s₂B₂ − ⋯ − sₖBₖ = 0

gives us a linear combination of basis vectors of V equal to the zero vector, so all the scalars (in particular, all the r's) must equal 0. Thus, we have shown that dim(Im(T)) = j, dim(ker(T)) = k, and that dim V = j + k, which is what the theorem states. //
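
The theorem is easy to test numerically for matrix transformations (a sketch assuming SymPy; the particular 4×6 matrix is an arbitrary example in which two rows are combinations of the others):

    import sympy as sp

    # A linear map R^6 -> R^4; rows 3 and 4 are combinations of rows 1 and 2
    A = sp.Matrix([[1, 2, 0, -1, 3, 0],
                   [0, 1, 1,  2, 0, 1],
                   [1, 3, 1,  1, 3, 1],
                   [2, 4, 0, -2, 6, 0]])

    rank = A.rank()                       # dim(Im(A)), the rank
    nullity = len(A.nullspace())          # dim(ker(A)), the nullity
    print(rank, nullity, rank + nullity)  # 2 4 6 = dim R^6, as the theorem predicts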

Fun with linear transformations

We have seen how, given an arbitrary set S and a vector space W, the function space Fun(S,W) is itself a vector space under natural definitions of addition and scalar multiplication of functions from S to W. (Recall Exercise 3.10, p. 31.) An important example of such a vector space arises when S is chosen to be a basis for another vector space V. Let S = {A₁, A₂, …, Aₘ} be a basis for V. Then choose some function f from Fun(S,W) and set f(A₁) = B₁, f(A₂) = B₂, …, f(Aₘ) = Bₘ. These B's are vectors in W; they need not even be distinct. Now, since every A ∈ V is uniquely a linear combination of vectors from S, we can define a new function F from V to W by setting

F(A) = F(r₁A₁ + r₂A₂ + ⋯ + rₘAₘ) = r₁f(A₁) + r₂f(A₂) + ⋯ + rₘf(Aₘ) = r₁B₁ + r₂B₂ + ⋯ + rₘBₘ.
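
The construction of F is easy to carry out in coordinates (a sketch assuming NumPy; the basis S and the values Bᵢ below are invented for illustration): to evaluate F at a vector, solve for its coordinates in the basis S and then form the corresponding combination of the Bᵢ.

    import numpy as np

    # Basis S = {A1, A2, A3} of V = R^3 (the columns of A) and chosen values Bi = f(Ai) in W = R^2
    A = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [0.0, 0.0, 1.0]])
    B = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, 1.0]])

    def F(v):
        r = np.linalg.solve(A, v)   # coordinates r1, r2, r3 of v in the basis S
        return B @ r                # r1*B1 + r2*B2 + r3*B3

    print(F(A[:, 0]))               # [1. 0.] = B1, so F extends f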

In fact, this definition shows that F(Aᵢ) = f(Aᵢ) for each i = 1, 2, …, m; that is, F can be viewed as an extension of the function f: f has S as domain, whereas F has all of V = L(S) as domain. What is more, F is a linear transformation between the vector spaces V and W, since

F(r₁A₁ + r₂A₂ + ⋯ + rₘAₘ) = r₁f(A₁) + r₂f(A₂) + ⋯ + rₘf(Aₘ) = r₁F(A₁) + r₂F(A₂) + ⋯ + rₘF(Aₘ).

For this reason, we call F the linear extension of f. It is a useful construction, as we shall now see.

Theorem Let A₁, A₂, …, Aₘ ∈ V be any m vectors chosen from a vector space V. Where E = {E₁, E₂, …, Eₘ} is the standard basis in Rᵐ, define f ∈ Fun(E,V) by f(Eᵢ) = Aᵢ for each i, and let F be its linear extension to all of Rᵐ. Then

(1) {A₁, A₂, …, Aₘ} is linearly independent in V if and only if ker(F) = {0}; and
(2) {A₁, A₂, …, Aₘ} spans V if and only if Im(F) = V.

Proof (1) {A₁, A₂, …, Aₘ} is linearly independent in V
⟺ whenever the linear combination r₁A₁ + r₂A₂ + ⋯ + rₘAₘ equals 0, then r₁ = r₂ = ⋯ = rₘ = 0
⟺ whenever 0 = r₁f(E₁) + r₂f(E₂) + ⋯ + rₘf(Eₘ) = r₁F(E₁) + r₂F(E₂) + ⋯ + rₘF(Eₘ) = F(r₁E₁ + r₂E₂ + ⋯ + rₘEₘ) = F(r₁, r₂, …, rₘ), then r₁ = r₂ = ⋯ = rₘ = 0
⟺ if X = (r₁, r₂, …, rₘ) ∈ ker(F), then X = 0
⟺ ker(F) = {0}.

(2) {A₁, A₂, …, Aₘ} spans V
⟺ every A ∈ V has the form A = r₁A₁ + r₂A₂ + ⋯ + rₘAₘ = r₁f(E₁) + r₂f(E₂) + ⋯ + rₘf(Eₘ) = r₁F(E₁) + r₂F(E₂) + ⋯ + rₘF(Eₘ) = F(r₁E₁ + r₂E₂ + ⋯ + rₘEₘ) = F(r₁, r₂, …, rₘ)
⟺ Im(F) = V. //
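
In Rⁿ this theorem is the usual rank test (a sketch assuming NumPy; F here is simply multiplication by the matrix whose columns are the Aᵢ):

    import numpy as np

    # Columns are A1, A2, A3 in V = R^3; F(X) = M X is their linear extension
    M = np.array([[1.0, 2.0, 3.0],
                  [0.0, 1.0, 4.0],
                  [5.0, 6.0, 0.0]])

    rank = np.linalg.matrix_rank(M)
    print(rank == M.shape[1])   # True -> ker(F) = {0}, so the columns are linearly independent
    print(rank == 3)            # True -> Im(F) = R^3, so the columns span R^3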

We know that Fun(S,W), the collection of functions f : S → W, has the structure of a vector space. When we choose this set S = {A₁, A₂, …, Aₘ} to be a basis for another vector space V, then the collection of linear extensions F : V → W of elements of Fun(S,W), which we denote L(V,W), is itself a vector space; in fact, it is a subspace of the vector space Fun(V,W). (Why?) Observe that every linear transformation T : V → W between V and W is the linear extension of some function in Fun(S,W), namely of the function whose value at each vector in S is the same as the value T has there. Thus, L(V,W) is the space of all linear transformations between V and W.

A particularly important example of a space of linear transformations arises when W is taken to be the 1-dimensional vector space consisting of the field of scalars R. In this instance, we call L(V,R) the dual space of V and use the special notation V* = L(V,R). We will study the properties of the dual space of a vector space somewhat later.
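
For V = Rⁿ these spaces are concrete (a sketch assuming NumPy; the identifications used below are the standard ones, not notation from these notes): L(Rⁿ,Rᵐ) can be identified with the m × n matrices, and an element of the dual space (Rⁿ)* with a single row acting by matrix multiplication.

    import numpy as np

    phi = np.array([[2.0, -1.0, 3.0]])   # a functional in (R^3)*, written as a 1x3 row
    v = np.array([1.0, 4.0, 0.0])
    print(phi @ v)                       # phi(v) = 2*1 - 1*4 + 3*0 = -2

    # L(V,W) is itself a vector space: sums and scalar multiples of maps
    # correspond to sums and scalar multiples of matrices
    M1 = np.array([[1.0, 0.0, 2.0],
                   [0.0, 1.0, 1.0]])
    M2 = np.array([[0.0, 3.0, 0.0],
                   [1.0, 0.0, 1.0]])
    print((M1 + 2 * M2) @ v)             # equals M1 @ v + 2 * (M2 @ v)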

Another important situation occurs when W = V. Here, L(V,V) represents the space of linear transformations from V to itself. Such functions are called endomorphisms on V. The simplest endomorphism on a vector space is the identity map I ∈ L(V,V), with the simple definition I(A) = A.

Yet another important phenomenon arises when a linear transformation is invertible: if T ∈ L(V,W) and S ∈ L(W,V), then these transformations can be composed in both directions: S ∘ T ∈ L(V,V) and T ∘ S ∈ L(W,W) are both endomorphisms (but of different spaces!). If it should happen that both S ∘ T and T ∘ S are identity maps on their respective spaces, then T is invertible and S is its inverse transformation. In addition, since every invertible function must be a one-to-one correspondence between its domain and codomain, every invertible transformation T ∈ L(V,W) is a one-to-one correspondence between the vector spaces V and W. That is, every vector A in V is associated by the transformation T to a unique vector T(A) in W, and vice versa. The linearity of T ensures, for instance, that a set of basis vectors in V is associated to a set of basis vectors in W. In particular, this means that if there is an invertible

linear transformation T ∈ L(V,W), then V and W must have the same dimension, indeed, the same structure. Consequently, we call the two vector spaces isomorphic and also refer to T as a vector space isomorphism.

Proposition Suppose the linear transformation T ∈ L(V,W) has trivial kernel (ker(T) = {0}). Then every vector B ∈ Im(T) is the image of a unique vector A ∈ V.

Proof Clearly, B is the image of at least one vector in V. But if both A and A′ have B as image, then T(A) = T(A′), so T(A − A′) = T(A) − T(A′) = 0, so A − A′ ∈ ker(T). So A − A′ = 0, whence A = A′. //

Theorem A linear transformation T ∈ L(V,W) is an isomorphism (i.e., is invertible) if and only if ker(T) = {0} and Im(T) = W (the kernel is trivial and the image is full).

Proof Suppose first that T ∈ L(V,W) is an isomorphism. Then it has an inverse linear transformation S ∈ L(W,V) for which the compositions S ∘ T and T ∘ S are identity maps. So if A ∈ ker(T), then A = (S ∘ T)(A) = S(T(A)) = S(0) = 0; that is,

ker(T) = {0}. Also, if B ∈ W, then B = (T ∘ S)(B) is the image under T of the vector S(B) ∈ V; that is, Im(T) = W.

Conversely, suppose that T ∈ L(V,W) is a linear transformation with trivial kernel and full image. Then by the previous proposition, every vector B in Im(T) = W is the image of a unique vector A ∈ V. So we can define a function S : W → V that sends B to the unique vector A that is sent to B by T. This function is a linear transformation, for if S(B) = A and S(B′) = A′, then T(A) = B and T(A′) = B′, so T(A + A′) = B + B′, implying that S(B + B′) = A + A′ = S(B) + S(B′); also, for any scalar r, T(rA) = rB, so S(rB) = rA. Thus, S ∈ L(W,V), and S ∘ T and T ∘ S are the respective identity maps, so T is an isomorphism. //
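
For a matrix transformation T : Rⁿ → Rⁿ, the theorem reduces to a familiar numerical test (a sketch assuming NumPy): the kernel is trivial and the image is full exactly when the matrix has rank n, and then the inverse matrix plays the role of S.

    import numpy as np

    T = np.array([[2.0, 1.0],
                  [1.0, 1.0]])                 # a map R^2 -> R^2

    n = T.shape[0]
    if np.linalg.matrix_rank(T) == n:          # trivial kernel and full image
        S = np.linalg.inv(T)                   # the inverse transformation
        print(np.allclose(S @ T, np.eye(n)))   # S o T is the identity: True
        print(np.allclose(T @ S, np.eye(n)))   # T o S is the identity: True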

Corollary If V and W are finite-dimensional vector spaces, then they are isomorphic if and only if they have the same dimension.

Proof If V and W are isomorphic, then there is some isomorphism T : V → W between them. If V has dimension n, then it has a basis with n vectors; say {A₁, A₂, …, Aₙ} is a basis for V. Then

W = T(V) = T(L(A₁, A₂, …, Aₙ)) = L(T(A₁), T(A₂), …, T(Aₙ)),

so that {T(A₁), T(A₂), …, T(Aₙ)} is a spanning set for W. But these vectors are also linearly independent, for if r₁T(A₁) + r₂T(A₂) + ⋯ + rₙT(Aₙ) = 0, then T(r₁A₁ + r₂A₂ + ⋯ + rₙAₙ) = 0, which, since T has trivial kernel, means that r₁A₁ + r₂A₂ + ⋯ + rₙAₙ = 0, whence, since {A₁, A₂, …, Aₙ} is a basis, all the r's are forced to equal 0. Therefore,

{T(A₁), T(A₂), …, T(Aₙ)} is a basis for W, from which we conclude that W also has dimension n.

Conversely, if dim V = dim W = n, then both spaces have bases of size n: say {A₁, A₂, …, Aₙ} is a basis for V and {B₁, B₂, …, Bₙ} is a basis for W. Let T be the linear extension of the function that sends Aᵢ to Bᵢ for i = 1, 2, …, n. Then Im(T) = L(B₁, B₂, …, Bₙ) = W, and also

n = dim(V) = dim(Im(T)) + dim(ker(T)) = dim(W) + dim(ker(T)) = n + dim(ker(T)),

implying that dim(ker(T)) = 0, so ker(T) = {0}. Since T has trivial kernel and full image, it must be an isomorphism. //

Corollary Every n-dimensional vector space is isomorphic to Rⁿ. //
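
As an illustration of the final corollary (a sketch assuming SymPy; the function names are mine, chosen for the example), the coordinate map that sends a polynomial in P₃(R) to its coefficient vector is an isomorphism onto R⁴:

    import sympy as sp

    x = sp.symbols('x')

    def to_coords(p):
        # P3(R) -> R^4 relative to the basis 1, x, x^2, x^3
        return [p.coeff(x, k) for k in range(4)]

    def from_coords(c):
        # R^4 -> P3(R), the inverse coordinate map
        return sum(c[k] * x**k for k in range(4))

    p = 1 - 2*x**2 + 5*x**3
    print(to_coords(p))                                    # [1, 0, -2, 5]
    print(sp.expand(from_coords(to_coords(p)) - p) == 0)   # True: the maps are mutually inverse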
