Section 1.7 22 Continued




Section 1.5 23

A homogeneous equation is always consistent. TRUE. The trivial solution is always a solution.

The equation Ax = 0 gives an explicit description of its solution set. FALSE. The equation gives an implicit description of the solution set.

The homogeneous equation Ax = 0 has the trivial solution if and only if the equation has at least one free variable. FALSE. The trivial solution is always a solution to Ax = 0, whether or not there are free variables.

The equation x = p + tv describes a line through v parallel to p. FALSE. The line goes through p and is parallel to v.

The solution set of Ax = b is the set of all vectors of the form w = p + v_h, where v_h is any solution of the equation Ax = 0. FALSE. This is only true when there exists some vector p such that Ap = b, i.e. when the system is consistent.
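The last item can be spot-checked numerically. A minimal sketch, using a small hypothetical system (the matrix and vectors below are chosen for illustration, not taken from the text): when Ap = b and Av_h = 0, every vector p + t·v_h also solves Ax = b.

```python
import numpy as np

# Hypothetical 2x3 consistent system, chosen for illustration.
A = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, 1.0]])
b = np.array([3.0, 1.0])

p = np.array([1.0, 1.0, 0.0])      # one particular solution: Ap = b
v_h = np.array([1.0, -1.0, 1.0])   # a homogeneous solution: A v_h = 0

assert np.allclose(A @ p, b)
assert np.allclose(A @ v_h, 0)

# Every p + t*v_h also solves Ax = b: the solution set of Ax = b is the
# solution set of Ax = 0 translated by the particular solution p.
for t in (-2.0, 0.0, 5.0):
    assert np.allclose(A @ (p + t * v_h), b)
print("p + t*v_h solves Ax = b for every t tested")
```

This also illustrates why consistency matters: without some p satisfying Ap = b, there is nothing to translate the homogeneous solution set by.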

Section 1.5 24

If x is a nontrivial solution of Ax = 0, then every entry in x is nonzero. FALSE. Only at least one entry in x need be nonzero.

The equation x = x_2 u + x_3 v, with x_2 and x_3 free (and neither u nor v a multiple of the other), describes a plane through the origin. TRUE.

The equation Ax = b is homogeneous if the zero vector is a solution. TRUE. If the zero vector is a solution, then b = A0 = 0, so the equation is Ax = 0 and thus homogeneous.

The effect of adding p to a vector is to move the vector in the direction parallel to p. TRUE. We can also think of adding p as sliding the vector along p.

The solution set of Ax = b is obtained by translating the solution set of Ax = 0. FALSE. This only applies to a consistent system.

Section 1.7 21

The columns of the matrix A are linearly independent if the equation Ax = 0 has the trivial solution. FALSE. The trivial solution is always a solution; independence requires that it be the only solution.

If S is a linearly dependent set, then each vector is a linear combination of the other vectors in S. FALSE. For example, [1, 1], [2, 2], and [5, 4] are linearly dependent, but the last is not a linear combination of the first two.

The columns of any 4 x 5 matrix are linearly dependent. TRUE. There are five columns, each with four entries; thus by Theorem 8 they are linearly dependent.

If x and y are linearly independent, and if {x, y, z} is linearly dependent, then z is in Span{x, y}. TRUE. Since x and y are linearly independent and {x, y, z} is linearly dependent, z must be a linear combination of x and y, and hence z is in their span.

Section 1.7 22

Two vectors are linearly dependent if and only if they lie on a line through the origin. TRUE. If both vectors lie on a line through the origin, then one is a scalar multiple of the other, so they are linearly dependent; conversely, two dependent vectors satisfy such a multiple relation and therefore lie on a common line through the origin.

If a set contains fewer vectors than there are entries in the vectors, then the set is linearly independent. FALSE. For example, [1, 2, 3] and [2, 4, 6] are linearly dependent.

If x and y are linearly independent, and if z is in Span{x, y}, then {x, y, z} is linearly dependent. TRUE. If z is in Span{x, y}, then z is a linear combination of x and y, which can be rearranged into a nontrivial dependence relation.
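Both counterarguments above can be checked numerically. A sketch (the vectors x, y, and the coefficients 2 and 3 below are hypothetical choices for illustration):

```python
import numpy as np

# Fewer vectors than entries, yet dependent: [2,4,6] = 2*[1,2,3],
# the counterexample from the text.
V = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])
assert np.linalg.matrix_rank(V) == 1   # rank < 2 columns => dependent

# If z is in Span{x, y}, say z = 2x + 3y, then 2x + 3y - z = 0 is a
# nontrivial dependence relation among {x, y, z}.
x = np.array([1.0, 0.0, 0.0])
y = np.array([0.0, 1.0, 0.0])
z = 2 * x + 3 * y
assert np.allclose(2 * x + 3 * y - 1 * z, np.zeros(3))
print("counterexample and dependence relation verified")
```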

Section 1.7 22 Continued

If a set in R^n is linearly dependent, then the set contains more vectors than there are entries in each vector. FALSE. For example, in R^3 the vectors [1, 2, 3] and [3, 6, 9] are linearly dependent.

Section 1.8 21

A linear transformation is a special type of function. TRUE. The defining properties are (i) T(u + v) = T(u) + T(v) and (ii) T(cu) = cT(u).

If A is a 3 x 5 matrix and T is a transformation defined by T(x) = Ax, then the domain of T is R^3. FALSE. The domain is R^5.

If A is an m x n matrix, then the range of the transformation x -> Ax is R^m. FALSE. R^m is the codomain; the range is where we actually land.

Every linear transformation is a matrix transformation. FALSE. The converse (every matrix transformation is a linear transformation) is true, however. We will (probably) see examples later of when the original statement is false.
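The domain/codomain distinction is easy to see in code. A minimal sketch with a hypothetical 3 x 5 matrix (random entries, chosen only for shape): T(x) = Ax consumes vectors with 5 entries and produces vectors with 3 entries.

```python
import numpy as np

# A hypothetical 3x5 matrix: T(x) = A x accepts x in R^5 (the domain)
# and returns vectors in R^3 (the codomain).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))

x = np.ones(5)          # x needs 5 entries, one per column of A
y = A @ x
assert x.shape == (5,)  # the domain is R^5, not R^3
assert y.shape == (3,)  # outputs land in R^3
print("T maps R^5 into R^3")
```

Note that `A @ np.ones(3)` would raise a shape error: a vector in R^3 is not in the domain of this T at all.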

Section 1.8 21 Continued

A transformation T is linear if and only if T(c_1 v_1 + c_2 v_2) = c_1 T(v_1) + c_2 T(v_2) for all v_1 and v_2 in the domain of T and for all scalars c_1 and c_2. TRUE. The two parts of the definition of a linear transformation together imply this condition; conversely, taking c_1 = c_2 = 1 recovers additivity, and taking c_2 = 0 recovers homogeneity, so the condition implies both parts of the definition.
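For a matrix transformation the combined condition can be spot-checked on random inputs. A sketch (a numerical sanity check on sampled vectors, not a proof; the matrix below is a hypothetical example):

```python
import numpy as np

# For T(x) = A x, check T(c1 v1 + c2 v2) = c1 T(v1) + c2 T(v2)
# on randomly sampled inputs.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
T = lambda x: A @ x

for _ in range(5):
    v1, v2 = rng.standard_normal(3), rng.standard_normal(3)
    c1, c2 = rng.standard_normal(2)
    assert np.allclose(T(c1 * v1 + c2 * v2), c1 * T(v1) + c2 * T(v2))

# The two special cases named in the text:
assert np.allclose(T(v1 + v2), T(v1) + T(v2))   # c1 = c2 = 1: additivity
assert np.allclose(T(c1 * v1), c1 * T(v1))      # c2 = 0: homogeneity
print("combined condition holds for all sampled inputs")
```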

Section 1.8 22

Every matrix transformation is a linear transformation. TRUE. To actually show this, we would verify that every matrix transformation satisfies the two criteria of linear transformations: A(u + v) = Au + Av and A(cu) = c(Au).

The codomain of the transformation x -> Ax is the set of all linear combinations of the columns of A. FALSE. If A is m x n, the codomain is R^m; the original statement describes the range.

If T : R^n -> R^m is a linear transformation and if c is in R^m, then a uniqueness question is "Is c in the range of T?" FALSE. This is an existence question.

A linear transformation preserves the operations of vector addition and scalar multiplication. TRUE. This is part of the definition of a linear transformation.

The superposition principle is a physical description of a linear transformation. TRUE. The book says so (page 77).
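The existence question "Is c in the range of T?" amounts to asking whether Ax = c has at least one solution, which can be decided by comparing rank(A) with rank([A | c]). A sketch with a hypothetical matrix whose range is a plane in R^3:

```python
import numpy as np

# Range of T(x) = A x is the plane of vectors (a, b, a+b) in R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

c_in = np.array([1.0, 2.0, 3.0])    # 1 + 2 = 3: lies in the range
c_out = np.array([1.0, 2.0, 4.0])   # 1 + 2 != 4: does not

def in_range(A, c):
    # Ax = c is consistent exactly when appending c does not raise the rank.
    return np.linalg.matrix_rank(np.column_stack([A, c])) == np.linalg.matrix_rank(A)

assert in_range(A, c_in)
assert not in_range(A, c_out)
print("range membership decided by a rank comparison")
```

Whether that solution x is *unique* would be a separate question, answered by whether A has linearly independent columns.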