Solutions to Homework Section 3.7 February 18th, 2005



Math 54W, Spring 2005

(Throughout, ∗ marks a matrix or vector entry that is illegible in the transcription.)

List the row vectors and the column vectors of the matrix.

The row vectors are (∗, ∗, ∗, ∗, 4) and (5, ∗, ∗, ∗, ∗); the column vectors are the five columns read off the same way, the first of which is (∗, 5).

The matrix

    ( 1  4  3  ∗ )
A = ( 0  1  ∗  7 )
    ( 0  0  4  ∗ )
    ( 0  0  0  5 )

is in row echelon form. Find a basis for its row space, find a basis for its column space, and determine its rank.

Since A is already in row echelon form, its nonzero rows form a basis for RS(A) by Theorem 3.7.2. Since all of the rows are nonzero, a basis for RS(A) is

(1, 4, 3, ∗), (0, 1, ∗, 7), (0, 0, 4, ∗), (0, 0, 0, 5).

For the column space, we use Theorem 3.7.3, which says that the column vectors containing pivots form a basis for CS(A). Since every column has a pivot, a basis for CS(A) is

( 1 )  ( 4 )  ( 3 )  ( ∗ )
( 0 ), ( 1 ), ( ∗ ), ( 7 )
( 0 )  ( 0 )  ( 4 )  ( ∗ )
( 0 )  ( 0 )  ( 0 )  ( 5 )

In particular, rank(A) = 4.

We have

    ( 3  ∗  8 )
A = ( 6  3  5 )
    ( ∗  ∗  6 )

This is row equivalent to

    ( 3  ∗  ∗ )
U = ( 0  ∗  7 )
    ( 0  0  0 )

A basis for RS(U) consists of the vectors (3, ∗, ∗) and (0, ∗, 7). Since RS(U) = RS(A), these also constitute a basis for RS(A). The first two columns of U constitute a basis for CS(U). Thus, the first two columns of A, namely (3, 6, ∗) and (∗, 3, ∗), constitute a basis for CS(A). Since all the bases here contain two elements, we see rank(A) = 2.

Note that V = Span{(∗, 4, ∗, 4), (4, ∗, 3, ∗), (∗, 6, 4, ∗)} = RS(A), where A is the matrix whose rows are these three vectors; A is row equivalent to a matrix U in row echelon form with two nonzero rows. Thus a basis for V = RS(A) consists of the vectors (∗, 4, ∗, ∗) and (∗, ∗, 5, 3).
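The pivot argument in the row-echelon-form problem above can be sanity-checked numerically. A minimal sketch with numpy, where the entries that are illegible in the transcription are filled in with 0 (an arbitrary assumption; the rank only depends on the pivots 1, 1, 4, 5):

```python
import numpy as np

# Row echelon form matrix from the problem above; the illegible entries
# are filled with 0, which does not change the rank because the four
# pivots 1, 1, 4, 5 already make the rows (and columns) independent.
A = np.array([[1, 4, 3, 0],
              [0, 1, 0, 7],
              [0, 0, 4, 0],
              [0, 0, 0, 5]], dtype=float)

# Every column has a pivot, so rank(A) = 4 ...
print(np.linalg.matrix_rank(A))  # 4

# ... and dim RS(A) = dim CS(A) = rank(A), so the 4 rows and the
# 4 columns each form a basis of a 4-dimensional space.
print(np.linalg.matrix_rank(A.T))  # 4
```

Because the matrix is upper triangular with nonzero diagonal, any fill-in values for the ∗ entries give the same answer.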

A is row equivalent to a row echelon form U with two pivots (most entries of A and U are illegible in the transcription). Solving Ax = 0 in terms of the free variables t and s, we get

NS(A) = { ((5t + s)/7, ∗, t, s) : t, s ∈ R },

with basis

B = { (5/7, ∗, 1, 0), (1/7, ∗, 0, 1) }.

This shows dim NS(A) = 2. Since U has two pivots, we see rank(A) = 2. Sure enough, 2 + 2 = 4 = n in this case.

In exercises 21-24, determine if b lies in the column space of A. If it does, express b as a linear combination of the columns of A.

A = ( 1   3 )      b = ( 6 )
    ( 4  12 ),         ( ∗ )

The second column of A is 3 times the first, so

CS(A) = Span{ (1, 4) } = { (x, 4x) : x ∈ R }.

Since b cannot be expressed in the form b = (x, 4x), it does not lie in the column space.

The vector b lies in the column space if and only if b can be written as a linear combination

b = a v1 + b v2 + c v3

of the columns v1, v2, v3 of A, for scalars a, b and c. So we have to solve the system of equations

a +  b - c = ∗
a +  b + c = ∗
a + 3b + c = ∗

A bit of row reduction tells us: nope, the system is inconsistent, so b is not in the column space.

Shortcut: For future reference, notice that the matrix associated to the system was just A itself, augmented by the vector b. So if you want to save time, skip the first two steps and jump right into the row reduction.
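The membership test in these exercises can also be run mechanically: b lies in CS(A) exactly when the least-squares residual of Ax = b is zero. A sketch under assumptions: it uses the 2 × 2 matrix with second column 3 times the first, and fills b's illegible entry with 1.

```python
import numpy as np

# A has second column 3 times the first, so CS(A) = Span{(1, 4)}.
# The second entry of b is an assumption (it is illegible in the source).
A = np.array([[1.0, 3.0],
              [4.0, 12.0]])
b = np.array([6.0, 1.0])

# Least squares finds the best x; a nonzero residual b - Ax means
# b is NOT in the column space of A.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
residual = np.linalg.norm(b - A @ x)
print(np.linalg.matrix_rank(A))  # 1: the two columns are parallel
print(residual > 1e-9)           # True: b is not of the form (t, 4t)
```

This is exactly the shortcut in the text: forming [A | b] and row reducing is the hand computation of which the least-squares residual is the numerical analogue.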

39. Let

    ( 1  3  2  4 )
A = ( 0  0  5  1 )
    ( 0  0  0  0 )

Find bases for RS(A), NS(A), CS(A) and LNS(A). Find the rank of A and verify that dim RS(A) + dim NS(A) = n and dim CS(A) + dim LNS(A) = m.

Since A is already in row echelon form, a basis for the row space is given by the nonzero rows of A: (1, 3, 2, 4), (0, 0, 5, 1). Since the row space has two basis vectors, A has rank 2.

For the null space, set up a system of equations and write everything in terms of the free variables y and w:

NS(A) = { x = (x, y, z, w) ∈ R^4 : Ax = 0 }
      = { (x, y, z, w) ∈ R^4 : x + 3y + 2z + 4w = 0, 5z + w = 0 }
      = { y(-3, 1, 0, 0) + w(-18/5, 0, -1/5, 1) : y, w ∈ R }.

This tells us that

(-3, 1, 0, 0), (-18/5, 0, -1/5, 1)

is a basis for NS(A), and we can now verify that dim RS(A) + dim NS(A) = 2 + 2 = 4 = n.

A basis for the column space consists of the columns of A which have pivots:

( 1 )  ( 2 )
( 0 ), ( 5 )
( 0 )  ( 0 )

Finally, a row vector x = (x, y, z) lies in the left null space LNS(A) if and only if

          ( 1  3  2  4 )
(x, y, z) ( 0  0  5  1 ) = (0, 0, 0, 0),
          ( 0  0  0  0 )
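A quick numerical check of the bookkeeping in problem 39, using the matrix as reconstructed above:

```python
import numpy as np

# The matrix of problem 39 (as reconstructed above).
A = np.array([[1, 3, 2, 4],
              [0, 0, 5, 1],
              [0, 0, 0, 0]], dtype=float)

rank = np.linalg.matrix_rank(A)
print(rank)  # 2

# The two null space basis vectors found above really satisfy Av = 0.
v1 = np.array([-3.0, 1.0, 0.0, 0.0])
v2 = np.array([-18 / 5, 0.0, -1 / 5, 1.0])
print(np.allclose(A @ v1, 0), np.allclose(A @ v2, 0))  # True True

# Rank-nullity on rows and on columns: rank + dim NS = n = 4,
# and rank + dim LNS = m = 3 (dim LNS = 1, spanned by (0, 0, 1)).
m, n = A.shape
print(rank + 2 == n, rank + 1 == m)  # True True
```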

and this happens if and only if

x = 0,   3x = 0,   2x + 5y = 0,   4x + y = 0.

In the solution to this system, z is a free variable and x = y = 0. Thus the left null space consists of all vectors of the form (0, 0, z). This is a one-dimensional space with basis (0, 0, 1). We can now verify dim CS(A) + dim LNS(A) = 2 + 1 = 3 = m.

43. True or false?

(a) If A is an n × n matrix, then the row space of A is equal to the column space of A.

False. The matrix

( 0  1 )
( 0  0 )

has row space spanned by (0, 1) and column space spanned by (1, 0). These are not the same.

(b) Even if A is square, the column space of A can never equal the null space of A.

False. The matrix

( 0  1 )
( 0  0 )

has CS(A) = NS(A) = { (x, 0) : x ∈ R }.

(c) If A is an m × n matrix and the columns of A are linearly independent, then Ax = b may or may not have a solution. But if it has a solution, that solution is unique.

True. Since the columns of A are linearly independent, the row echelon form of A must have a pivot in every column, so there are no free variables associated to the system Ax = b.

(d) A 3 × 4 matrix never has linearly independent columns.

True. Four vectors in R^3 can never be linearly independent.

(e) A 4 × 3 matrix must have linearly independent columns.

False. The zero matrix doesn't have linearly independent columns. As you can see, the zero matrix is very useful for producing counterexamples!

44. We consider A as a collection of n column vectors in R^m.

(a) If these vectors are linearly independent, then they form a basis of CS(A), in which case rank(A) = dim CS(A) = n. We must have n ≤ m, since you cannot have more than m linearly independent vectors in R^m.

(b) If these vectors span R^m, we have by definition CS(A) = R^m, and hence rank(A) = dim CS(A) = dim R^m = m. In this case, n ≥ m, since one cannot have fewer than m vectors spanning R^m.

(c) If these vectors form a basis of R^m, then both (a) and (b) hold, in which case n = m = rank(A), and we see that A is a square, invertible matrix.

In Ex. 48 we suppose A is an n × n matrix and has a right inverse B such that AB = I. [WARNING: We do not assume that A is invertible, as we are not told whether BA = I. In fact, this is exactly what we set out to prove!]
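The matrix used in 43(a) and 43(b) above (the standard nilpotent counterexample; its entries are a reconstruction, since they did not survive the transcription) can be checked directly:

```python
import numpy as np

# The counterexample for 43(a) and 43(b): N = [[0, 1], [0, 0]].
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# Column space: spanned by the nonzero column (1, 0).
col = N[:, 1]

# Null space: Nx = 0 forces the second entry of x to be 0,
# so NS(N) is spanned by (1, 0) as well.
v = np.array([1.0, 0.0])
print(np.allclose(N @ v, 0))  # True: v really lies in NS(N)

# CS(N) and NS(N) are the same line, but the row space, spanned by
# (0, 1), is a different line: stacking spanning vectors detects this.
print(np.linalg.matrix_rank(np.vstack([col, v])))   # 1: same line
print(np.linalg.matrix_rank(np.vstack([N[0], v])))  # 2: different lines
```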

48. (a) To show that CS(A) = R^n, it is enough to show that given any v ∈ R^n, we can find an x such that Ax = v (cf. Ex. 33). But notice that

v = Iv = ABv = A(Bv).

Thus, setting x = Bv, we see that Ax = v, and we are done.

(b) Since rank(A) = dim CS(A) = dim R^n = n, we see A is invertible by 3.8.3(d).

(c) Since A is invertible, there exists a matrix A^(-1) such that A^(-1)A = AA^(-1) = I. Take the equation AB = I. Multiplying both sides on the left by A^(-1), we get B = A^(-1)I = A^(-1), proving the claim.
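Exercise 48 says that for square matrices a right inverse is automatically two-sided. A numerical illustration, using an arbitrary invertible 2 × 2 matrix (not from the text):

```python
import numpy as np

# An arbitrary invertible example (det = 2*3 - 1*5 = 1), chosen for
# illustration; the argument in the text works for any square A
# that has a right inverse.
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])

# Build a right inverse B column by column: solve A b_j = e_j,
# which is exactly the condition AB = I.
B = np.linalg.solve(A, np.eye(2))

print(np.allclose(A @ B, np.eye(2)))  # True: B is a right inverse
print(np.allclose(B @ A, np.eye(2)))  # True: ... and a left inverse too
```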