MATH 113 HOMEWORK 1: SOLUTIONS

Problems From the Book: Axler, Chapter 1, problems 1, 4, 8, 9, and 13.

Solution to 1. Multiplying the numerator and denominator by the conjugate of a + bi directly gives a simplified expression for the reciprocal:

1/(a + bi) = (a − bi)/((a + bi)(a − bi)) = (a − bi)/(a^2 + b^2).

So just take c = a/(a^2 + b^2) and d = −b/(a^2 + b^2).

Solution to 4. If a = 0 then we are done. If a ≠ 0, then multiplying both sides of the equation av = 0 by 1/a gives (1/a)(av) = (1/a)·0. The associativity of scalar multiplication in a vector space shows that the left-hand side of the equation is precisely 1·v = v. So we get v = 0, which ends the proof.

Solution to 8. Suppose U_1, U_2, ... is a collection of subspaces of V. We wish to show that the intersection ∩_{i=1}^∞ U_i is also a subspace of V. Since U_i is a subspace of V for each i, we have 0 ∈ U_i for each i, so 0 ∈ ∩_{i=1}^∞ U_i. Suppose u, v ∈ ∩_{i=1}^∞ U_i. Then for each i, u, v ∈ U_i. Since U_i is a subspace of V, we have u + v ∈ U_i. Thus u + v ∈ ∩_{i=1}^∞ U_i. Suppose u ∈ ∩_{i=1}^∞ U_i and a ∈ F. Then for each i, u ∈ U_i. Because U_i is a subspace, we have au ∈ U_i. Thus au ∈ ∩_{i=1}^∞ U_i. Because ∩_{i=1}^∞ U_i is a nonempty subset of V which is closed under addition and scalar multiplication, it is a linear subspace of V.

Solution to 9. Suppose U, W are subspaces of V such that U ∪ W is a subspace as well. Assume for the sake of contradiction that U ⊄ W and W ⊄ U. Then by definition there exist u ∈ U with u ∉ W and w ∈ W with w ∉ U. Now u, w ∈ U ∪ W, so (by closure under addition in subspaces) u + w ∈ U ∪ W. Thus either u + w ∈ U or u + w ∈ W. Without loss of generality we can suppose that u + w ∈ U (the argument for W is the same). Then u ∈ U and u + w ∈ U, so that w = (u + w) − u ∈ U, a contradiction. Hence our original assumption must have been false, so either U ⊆ W or W ⊆ U. Conversely, if U ⊆ W then U ∪ W = W, which is clearly a subspace, and if W ⊆ U then U ∪ W = U, which is also a subspace by assumption.

Solution to 13. False. Counterexample: let V be any nonzero vector space, let U_1 = {0}, U_2 = V, and W = V. Then U_1 + W = V = U_2 + W, but U_1 ≠ U_2. Of course there are many other examples.
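As an optional numerical sanity check of Solution 1 (not part of the assigned solution; a small Python sketch with arbitrary sample values of a and b), the snippet below verifies for one nonzero a + bi that the claimed values of c and d do give its reciprocal.

```python
# Illustrative check only: with c = a/(a^2 + b^2) and d = -b/(a^2 + b^2),
# the product (a + bi)(c + di) should equal 1 for any nonzero a + bi.
a, b = 3.0, -2.0
c = a / (a**2 + b**2)
d = -b / (a**2 + b**2)
assert abs(complex(a, b) * complex(c, d) - 1) < 1e-12
print(c, d)  # the real and imaginary parts of 1/(3 - 2i)
```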

2 MATH 3 HOMEWORK : SOLUTIONS Question. Let F be R or C and let X be any set. Let F X denote the set of all functions f : X F. Define addition and scalar multiplication on F X so that if f, g F X and a F then (f + g)(x) = f(x) + g(x), (af)(x) = a(f(x)). (a) Show that F X is a vector space under these operations. (b) Which of the following sets of functions from R to R are subspaces of R R? Justify your answers. {f : R R : f is continuous at all x R} {f : R R : f(x) 0 for all x R} {f : R R : f is differentiable at all x R} Proof. (a): We verify the axioms individually. First, note that F X is indeed closed under addition and scalar multiplication: addition of functions is obviously still a well-defined function, and same for scalar multiplication of functions. Commutativity: Let f, g F X, then for any x, (f + g)(x) = f(x) + g(x) = g(x) + f(x) = (g + f)(x); hence f + g = g + f for all x X, so they are identical functions in F X. Associativity: Let f, g, h F X. Similar to the above argument, ((f + g) + h)(x) = (f + g)(x) + h(x) = (f(x) + g(x)) + h(x) = f(x) + (g(x) + h(x)) = (f + (g + h))(x). Additive identity: Consider the zero function z : X F defined asz(x) = 0 for all x X. Then for any f F X, Then (f + z)(x) = f(x) + z(x) = f(x) + 0 = f(x). Additive inverse: For any function f F X, consider the function n f F X defined by Multiplicative identity: Let f F X, then n f (x) := (f(x)). (f + n f )(x) = f(x) + ( f(x)) = 0 = z(x). (f)(x) = f(x) = f(x). Multiplicative associativity: Let f F X and a, b F, then (a(bf))(x) = a((bf)(x)) = a(bf(x)) = (ab)f(x) = ((ab)f)(x). Distributive property : Let a F, f, g F X, (a(f + g))(x) = a(f + g)(x) = a((f(x) + g(x)) = af(x) + ag(x) = (af)(x) + (ag)(x) = ((af) + (ag))(x). Distributive property 2: Let a, b F, f F X : ((a + b)f)(x) = (a + b)f(x) = af(x) + bf(x) = (af)(x) + (bf)(x) = ((af) + (bf))(x). This finishes all the axioms and shows that F X is a vector space. (b): The first and third are subspaces; the second is not. The second is not, since it is not closed under scalar multiplication: the constant function I : I(x) = for all x is nonnegative for all x but its negative n I (x) fails to be. To show that the first and the third are subspaces, we verify the three axioms. Additive identity: the zero function z : z(x) = 0 for all x is clearly continuous and differentiable. Closed under addition: If f, g R R are continuous (alt. differentiable), then f + g is as well (facts from high-school calculus). Closed under scalar multiplication: If f R R and a R then af is (continuous/differentiable) if f is (continuous/differentiable), again by high-school calculus.

Question 2. Define E to be the set E = {[x] : x ∈ R, x > 0}. Define addition and scalar multiplication on E as follows: [x] + [y] = [x·y] for [x], [y] ∈ E, and a[x] = [x^a] for [x] ∈ E, a ∈ R. Show that E with the above operations is a vector space over R.

Proof. We verify each of the axioms as above. First, note that E is indeed closed under addition and scalar multiplication: if x, y > 0 then xy > 0 as well, and for any a ∈ R, x^a is well-defined and positive.

Commutativity: For any [x], [y] ∈ E, we have [x] + [y] = [x·y] = [y·x] = [y] + [x].

Associativity: For any [x], [y], [z] ∈ E we have ([x] + [y]) + [z] = [x·y] + [z] = [(xy)z] = [x(yz)] = [x] + [y·z] = [x] + ([y] + [z]).

Additive identity: For the element [1] ∈ E, we have [x] + [1] = [x·1] = [x].

Additive inverse: For any [x] ∈ E, the element [1/x] ∈ E satisfies [x] + [1/x] = [x·(1/x)] = [1].

Multiplicative identity: For any [x] ∈ E, we have 1[x] = [x^1] = [x].

Multiplicative associativity: For any [x] ∈ E and a, b ∈ R, we have a(b[x]) = a([x^b]) = [(x^b)^a] = [x^(ab)] = (ab)[x].

Distributive property 1: a([x] + [y]) = a[x·y] = [(xy)^a] = [x^a · y^a] = [x^a] + [y^a] = a[x] + a[y].

Distributive property 2: (a + b)[x] = [x^(a+b)] = [x^a · x^b] = [x^a] + [x^b] = a[x] + b[x].

Question 3. Let U_1, ..., U_m be subspaces of a vector space V. Let W = U_1 + ... + U_m. Show that W is the smallest subspace of V that contains all of the U_i. In particular, show that:
(a) W ⊇ U_i for all i.
(b) If W′ is a subspace of V such that W′ ⊇ U_i for all i, then W ⊆ W′.

Proof. (a): By definition, W = {u_1 + ... + u_m : u_i ∈ U_i for each i}. Fix any i with 1 ≤ i ≤ m and any u ∈ U_i. Since 0 ∈ U_j for all j ≠ i, it follows that u = u + 0 + ... + 0 ∈ W. This shows that u ∈ U_i implies u ∈ W, i.e. that U_i ⊆ W. Since this is true for any i, we are done.

(b): Suppose W′ is a subspace of V such that W′ ⊇ U_i for all i. Pick any vector w ∈ W; we want to show that w ∈ W′ as well. By definition w can be written as w = u_1 + ... + u_m with u_i ∈ U_i. Note that u_1 ∈ U_1 ⊆ W′ and u_2 ∈ U_2 ⊆ W′, so by closure under addition, u_1 + u_2 ∈ W′. Similarly u_3 ∈ W′, u_4 ∈ W′, etc., so by adding these vectors one by one, we eventually end up with u_1 + ... + u_m ∈ W′, i.e. w ∈ W′. This shows that for any w ∈ W, w ∈ W′, which implies by definition that W ⊆ W′, as desired.
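Returning to Question 2, here is an optional numerical spot-check of two of the axioms for E (a sketch, not a proof; the sample values of x, y, a are arbitrary). With [x] + [y] = [xy] and a[x] = [x^a], the first distributive property becomes the identity (xy)^a = x^a · y^a, and the additive identity axiom becomes x·1 = x.

```python
import math

# Spot-check of two vector-space axioms for E at one sample point.
x, y, a = 2.5, 0.7, -3.0
assert math.isclose((x * y) ** a, (x ** a) * (y ** a))  # a([x] + [y]) = a[x] + a[y]
assert math.isclose(x * 1.0, x)                         # [x] + [1] = [x]
print("spot-check passed")
```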

Question 4. Consider the following subspaces of R^3:

U_1 = {(x, 0, 0) : x ∈ R},  U_2 = {(0, y, 0) : y ∈ R},  U_3 = {(0, 0, z) : z ∈ R},  U_4 = {(w, 0, w) : w ∈ R}.

Describe the following subspaces:
U_2 + U_4
U_1 + U_3
U_1 + U_3 + U_4
U_1 + U_2 + U_3 + U_4
Which of the above are all of R^3? For which of the above are the sums also direct sums?

Solution. (a): By definition, the sums of the given subspaces are:

U_2 + U_4 = {(w, y, w) : w, y ∈ R}
U_1 + U_3 = {(x, 0, z) : x, z ∈ R}
U_1 + U_3 + U_4 = {(x + w, 0, z + w) : x, z, w ∈ R} = {(x, 0, z) : x, z ∈ R}
U_1 + U_2 + U_3 + U_4 = {(x + w, y, z + w) : x, y, z, w ∈ R} = {(x, y, z) : x, y, z ∈ R} = R^3

The first and second descriptions follow directly from the definition of addition of sets. For the third and fourth, we need to check that the two sets {(x + w, 0, z + w) : x, z, w ∈ R} and {(x, 0, z) : x, z ∈ R} are actually equal. Indeed, {(x, 0, z) : x, z ∈ R} ⊆ {(x + w, 0, z + w) : x, z, w ∈ R} follows by setting w = 0. Conversely, for any x, z, w ∈ R, define x′ = x + w and z′ = z + w, so that (x + w, 0, z + w) = (x′, 0, z′) ∈ {(x′, 0, z′) : x′, z′ ∈ R} = {(x, 0, z) : x, z ∈ R}. This gives {(x + w, 0, z + w) : x, z, w ∈ R} = {(x, 0, z) : x, z ∈ R}. The expression for U_1 + U_2 + U_3 + U_4 is obtained similarly. In particular, only U_1 + U_2 + U_3 + U_4 is all of R^3.

(b): We need to determine which of the sums are direct sums. Recall first that a sum of linear subspaces V = U_1 + U_2 + ... + U_k is a direct sum if each element v ∈ V can be written uniquely as a sum u_1 + u_2 + ... + u_k with u_j ∈ U_j. As a result, if two subspaces A and B of R^3 satisfy A ∩ B = {(0, 0, 0)}, then their sum A + B is a direct sum. To see this, suppose v = a + b = a′ + b′ are two ways to write v as a sum of an element of A and an element of B. Then a − a′ = b′ − b. Since a − a′ belongs to A and b′ − b belongs to B, these two equal elements must lie in A ∩ B = {(0, 0, 0)}, so a = a′ and b = b′.

This criterion immediately identifies two direct sums: because U_1 ∩ U_3 = U_2 ∩ U_4 = {(0, 0, 0)}, the sums U_2 + U_4 and U_1 + U_3 are direct sums. The remaining two sums are not direct. For example, we have

(1, 0, 1) = (1, 0, 0) + (0, 0, 1) + (0, 0, 0) = (0, 0, 0) + (0, 0, 0) + (1, 0, 1)

in U_1 + U_3 + U_4, and

(1, 0, 1) = (1, 0, 0) + (0, 0, 0) + (0, 0, 1) + (0, 0, 0) = (0, 0, 0) + (0, 0, 0) + (0, 0, 0) + (1, 0, 1)

in U_1 + U_2 + U_3 + U_4. There are, of course, infinitely many examples to choose from.
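As an optional illustration of part (b) of Question 4 (a Python sketch; the helper vec_sum is an ad hoc name introduced only for this check), the snippet below confirms that (1, 0, 1) admits two distinct decompositions as a sum of elements of U_1, U_3, and U_4, so that sum is not direct.

```python
# Componentwise sum of vectors given as tuples.
def vec_sum(*vs):
    return tuple(sum(coords) for coords in zip(*vs))

decomp_a = [(1, 0, 0), (0, 0, 1), (0, 0, 0)]  # elements of U_1, U_3, U_4
decomp_b = [(0, 0, 0), (0, 0, 0), (1, 0, 1)]  # elements of U_1, U_3, U_4
assert vec_sum(*decomp_a) == vec_sum(*decomp_b) == (1, 0, 1)
assert decomp_a != decomp_b
print("two distinct decompositions of (1, 0, 1)")
```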

Question 5. Prove or provide a counter-example to the following statement: for any three subspaces U_1, U_2, U_3 of a vector space V, we have U_1 ∩ (U_2 + U_3) = (U_1 ∩ U_2) + (U_1 ∩ U_3).

Solution. The statement is false in general. Here is a possible counter-example. Take V = R^3 and

U_1 = {(x, y, 0) : x, y ∈ R},  U_2 = {(x, y, y) : x, y ∈ R},  U_3 = {(0, 0, z) : z ∈ R}.

Clearly U_1, U_2, U_3 are subspaces of V. Now we have U_2 + U_3 = {(x, y, y + z) : x, y, z ∈ R} = {(x, y, z) : x, y, z ∈ R} = R^3, so the left-hand side is

U_1 ∩ (U_2 + U_3) = U_1 = {(x, y, 0) : x, y ∈ R},

a linear subspace of V. However, U_1 ∩ U_2 = {(x, 0, 0) : x ∈ R} and U_1 ∩ U_3 = {(0, 0, 0)}, so the right-hand side is

(U_1 ∩ U_2) + (U_1 ∩ U_3) = {(x, 0, 0) : x ∈ R} + {(0, 0, 0)} = {(x, 0, 0) : x ∈ R},

a one-dimensional subspace of V. Now (0, 1, 0) is in the former subspace but not in the latter one, so U_1 ∩ (U_2 + U_3) ≠ (U_1 ∩ U_2) + (U_1 ∩ U_3).

A little remark: in general we do have U_1 ∩ (U_2 + U_3) ⊇ (U_1 ∩ U_2) + (U_1 ∩ U_3), but the reverse inclusion need not hold.

Question 6. Approximately how much time did you spend working on this homework?
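Returning to the counter-example in Question 5, here is an optional membership check (a Python sketch; the membership tests below are hand-coded from the definitions of the subspaces above, not derived programmatically). It confirms that (0, 1, 0) lies in U_1 ∩ (U_2 + U_3) but not in (U_1 ∩ U_2) + (U_1 ∩ U_3).

```python
in_U1 = lambda v: v[2] == 0                 # U_1 = {(x, y, 0)}
in_U2_plus_U3 = lambda v: True              # U_2 + U_3 = R^3
in_rhs = lambda v: v[1] == 0 and v[2] == 0  # (U_1 ∩ U_2) + (U_1 ∩ U_3) = {(x, 0, 0)}

v = (0, 1, 0)
print(in_U1(v) and in_U2_plus_U3(v))  # True:  v is in the left-hand side
print(in_rhs(v))                      # False: v is not in the right-hand side
```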