Solutions HW 5

5.. Consider the vectors u1 = (1/2)(1, 1, 1, 1)^T, u2 = (1/2)(1, 1, -1, -1)^T, u3 = (1/2)(1, -1, 1, -1)^T in R^4. Can you find a vector u4 such that u1, u2, u3, u4 are orthonormal? If so, how many such vectors are there?

Note that u1, u2, u3 are already orthonormal, so we just need to find u4, which must satisfy u1·u4 = u2·u4 = u3·u4 = 0 and u4·u4 = 1. If u4 = (a, b, c, d)^T, then the first three conditions give the linear equations

a + b + c + d = 0
a + b - c - d = 0
a - b + c - d = 0

Row reducing these equations leads to the system

a + b + c + d = 0
b + d = 0
c + d = 0

which has general solution u4 = (a, b, c, d)^T = (t, -t, -t, t)^T. Any such vector has dot product zero with u1, u2, u3. For an orthonormal basis, we also want u4·u4 = 1, which gives t = ±1/2, so there are 2 choices for u4, namely (1/2)(1, -1, -1, 1)^T or -(1/2)(1, -1, -1, 1)^T.

5.. (See the book for the full statement of the problem, which involves finding a least squares estimate for a line y = mx.) Let x be the vector of x-coordinates and y be the vector of y-coordinates. Geometrically, {mx} is a 1-dimensional vector space. We wish to minimize the distance from y to this space; this is accomplished when mx is the projection of y onto this space. Note that x/||x|| is a unit vector in this space, so the corresponding projection is (y·(x/||x||)) x/||x|| = ((y·x)/||x||^2) x. In particular, we should take m = (y·x)/||x||^2. With this choice of m, r = m ||x|| / ||y|| from our formula. (Note that ||y||/||x|| is the expected slope if all of the points did lie on a line, so r can be thought of as a ratio of two slopes.) If you draw the relevant line in the picture, most points will be near the line (even if none are on the line).
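A small numpy sketch that double-checks the first problem above: the four vectors u1, u2, u3, u4 are orthonormal exactly when the matrix U having them as columns satisfies U^T U = I. This is only an illustrative check of the hand computation.

import numpy as np

# The three given vectors and the candidate fourth vector found above.
u1 = np.array([1, 1, 1, 1]) / 2
u2 = np.array([1, 1, -1, -1]) / 2
u3 = np.array([1, -1, 1, -1]) / 2
u4 = np.array([1, -1, -1, 1]) / 2

U = np.column_stack([u1, u2, u3, u4])
# The columns are orthonormal exactly when U^T U is the 4 x 4 identity.
print(np.allclose(U.T @ U, np.eye(4)))  # True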

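To see the least-squares slope formula in action, here is a minimal numpy sketch. The data points below are made up for illustration (they are not the data from the book); the slope from the formula m = (y·x)/||x||^2 is cross-checked against numpy's least-squares solver, and r is computed as the ratio of slopes described above.

import numpy as np

# Made-up data points (x_i, y_i); not the data from the book.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.9])

# Best-fit slope for a line y = m x through the origin.
m = (y @ x) / (x @ x)

# Same slope via numpy's least-squares solver, as a cross-check.
m_lstsq = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)[0][0]
print(np.isclose(m, m_lstsq))  # True

# Ratio of the fitted slope to the "expected" slope ||y||/||x||.
r = m * np.linalg.norm(x) / np.linalg.norm(y)
print(r)  # a number between -1 and 1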
5.. Consider a basis v1, ..., vm of a subspace V of R^n. Show that a vector x in R^n is orthogonal to V if and only if it is orthogonal to all the vectors v1, ..., vm.

If x is orthogonal to all the vectors v1, ..., vm, then for any v in V we can write v = c1 v1 + ... + cm vm. Then x·v = x·(c1 v1 + ... + cm vm) = c1 (x·v1) + ... + cm (x·vm) = 0 (since each x·vi = 0). So x is orthogonal to every vector in V. Conversely, if x·v = 0 for all v in V, then this applies in particular to the vectors v1, ..., vm in V.

Complete the proof of Fact 5.1.4: orthogonal projections are linear transformations.

We must show that proj_V(x + y) = proj_V(x) + proj_V(y) and proj_V(cx) = c proj_V(x). Using the projection formula from the book: if u1, ..., um are an orthonormal basis for V, then proj_V(x + y) = (u1·(x + y))u1 + ... + (um·(x + y))um = ((u1·x) + (u1·y))u1 + ... + ((um·x) + (um·y))um = [(u1·x)u1 + ... + (um·x)um] + [(u1·y)u1 + ... + (um·y)um] = proj_V(x) + proj_V(y). Similarly, proj_V(cx) = (u1·(cx))u1 + ... + (um·(cx))um = c[(u1·x)u1 + ... + (um·x)um] = c proj_V(x).

5.. Consider a subspace V of R^n and a vector x in R^n. Let y = proj_V(x). What is the relationship between ||y||^2 and y·x?

The two quantities are equal. One can see this by writing x = y + w; then y is the component parallel to V and w is orthogonal to V, and in particular y·w = 0. Then x·y = (y + w)·y = y·y = ||y||^2, as desired.
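The projection formula used in the two proofs above is easy to experiment with numerically. The following sketch uses a made-up orthonormal basis and vector (purely for illustration) to check both the linearity property and the relation x·y = ||y||^2.

import numpy as np

def proj(Q, x):
    # Orthogonal projection of x onto V, where the columns of Q are assumed
    # to be an orthonormal basis u1, ..., um of V:
    # proj_V(x) = (u1.x) u1 + ... + (um.x) um, which equals Q Q^T x.
    return Q @ (Q.T @ x)

# Made-up orthonormal basis of a 2-dimensional subspace V of R^3.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0]) / np.sqrt(2)
Q = np.column_stack([u1, u2])

x = np.array([3.0, 1.0, 2.0])
c = 2.5
# Linearity, as in the proof above.
print(np.allclose(proj(Q, c * x), c * proj(Q, x)))  # True
# The relationship x . y = ||y||^2 for y = proj_V(x).
y = proj(Q, x)
print(np.isclose(x @ y, y @ y))  # True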

5.. Perform Gram-Schmidt on the two vectors v1, v2 given in the book.

We have u1 = v1/||v1|| and u2 = (v2 - (u1·v2)u1)/||v2 - (u1·v2)u1||; evaluating these expressions with the given vectors produces the orthonormal basis u1, u2.

5..4 Perform Gram-Schmidt on the three vectors v1, v2, v3 given in the book.

We have u1 = v1/||v1||, u2 = (v2 - (u1·v2)u1)/||v2 - (u1·v2)u1||, and u3 = (v3 - (u1·v3)u1 - (u2·v3)u2)/||v3 - (u1·v3)u1 - (u2·v3)u2||.
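Both Gram-Schmidt computations above follow the same recipe. Here is a short numpy sketch of that recipe, run on made-up vectors rather than the ones from the book; it is meant only to illustrate the steps, not to reproduce the book's numbers.

import numpy as np

def gram_schmidt(vectors):
    # Turn linearly independent v1, v2, ... into orthonormal u1, u2, ...
    # spanning the same subspace.
    ortho = []
    for v in vectors:
        # Subtract the components along the u's already built ...
        w = v - sum((u @ v) * u for u in ortho)
        # ... and normalize what remains.
        ortho.append(w / np.linalg.norm(w))
    return ortho

# Made-up input vectors (not the ones from the book).
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0])
v3 = np.array([0.0, 1.0, 1.0])
u1, u2, u3 = gram_schmidt([v1, v2, v3])

U = np.column_stack([u1, u2, u3])
print(np.allclose(U.T @ U, np.eye(3)))  # True: the u's are orthonormal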

5.. Find the QR decomposition of the matrix given in the book.

From the work in the corresponding Gram-Schmidt problem above, we can take Q to be the matrix whose columns are the orthonormal vectors found there. R is calculated by computing the various dot products of the column vectors of Q with the column vectors of the original matrix; that is, R = Q^T A.

5..8 Find the QR decomposition of the matrix given in the book.

Again, from the work in the corresponding Gram-Schmidt problem above, we can take Q to be the matrix whose columns are the orthonormal vectors found there, and R is calculated by computing the various dot products of the column vectors of Q with the column vectors of the original matrix.
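The relationship between Gram-Schmidt and the QR factorization can also be checked with numpy's built-in routine. The matrix A below is made up for illustration; note that np.linalg.qr may flip the signs of some columns of Q relative to a hand computation, but the identities checked below hold either way.

import numpy as np

# Made-up matrix with linearly independent columns (not the one from the book).
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

Q, R = np.linalg.qr(A)                   # reduced QR: Q is 3 x 2, R is 2 x 2
print(np.allclose(Q.T @ Q, np.eye(2)))   # columns of Q are orthonormal
print(np.allclose(Q @ R, A))             # A = QR
print(np.allclose(R, Q.T @ A))           # R is the matrix of dot products Q^T A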

5.. Find an orthonormal basis of the plane x1 + x2 + x3 = 0.

First we find a basis for the plane by backsolving the equation: x1 = -x2 - x3, with x2 and x3 free. For example, one such basis is v1 = (-1, 1, 0)^T, v2 = (-1, 0, 1)^T. Next we apply Gram-Schmidt to this basis to make it orthonormal: u1 = v1/||v1|| and u2 = (v2 - (u1·v2)u1)/||v2 - (u1·v2)u1||. Then u1, u2 form an orthonormal basis for the plane x1 + x2 + x3 = 0.

5.. Is the matrix given in the book orthogonal?

Yes. Either one can note that the columns are orthonormal vectors, or one can compute A^T A and see that you get the identity matrix.

5.. If A and B are orthogonal matrices, is B^(-1) A B orthogonal also?

Yes. By Fact 5.3.4a, B^(-1) is also orthogonal, and then applying Fact 5.3.4b several times shows that the product B^(-1) A B of three orthogonal matrices is an orthogonal matrix. (Alternatively, one can show (B^(-1) A B)^T = (B^(-1) A B)^(-1) directly.)

5.. If A and B are symmetric, is A B^(-1) A symmetric?

Yes. (A B^(-1) A)^T = A^T (B^(-1))^T A^T = A B^(-1) A, as required (using (B^(-1))^T = (B^T)^(-1) = B^(-1), since B is symmetric).

5.. If B is an n x n matrix, is B B^T symmetric?

Yes. (B B^T)^T = (B^T)^T B^T = B B^T, as desired.
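The last four answers are easy to spot-check numerically. The matrices below are made-up examples (a rotation, a reflection, and two symmetric matrices), not the matrices from the book.

import numpy as np

def is_orthogonal(M):
    # A square matrix is orthogonal exactly when M^T M is the identity.
    return np.allclose(M.T @ M, np.eye(M.shape[0]))

# Made-up orthogonal matrices: a rotation A and a reflection B.
t = 0.7
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
B = np.array([[0.6,  0.8],
              [0.8, -0.6]])
print(is_orthogonal(A), is_orthogonal(B))       # True True
print(is_orthogonal(np.linalg.inv(B) @ A @ B))  # True: B^(-1) A B is orthogonal

# Made-up symmetric matrices S and T (T invertible), for the A B^(-1) A claim.
S = np.array([[2.0, 1.0], [1.0, 3.0]])
T = np.array([[1.0, 2.0], [2.0, 1.0]])
M1 = S @ np.linalg.inv(T) @ S
print(np.allclose(M1, M1.T))                    # True: S T^(-1) S is symmetric

# Any matrix times its own transpose is symmetric.
C = np.array([[1.0, 2.0], [3.0, 4.0]])
print(np.allclose(C @ C.T, (C @ C.T).T))        # True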

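Returning to the plane problem above: assuming the plane really is x1 + x2 + x3 = 0, the backsolve-then-Gram-Schmidt computation can be sketched as follows (here Gram-Schmidt is carried out by the reduced QR factorization).

import numpy as np

# Assumed plane: x1 + x2 + x3 = 0 (see the plane problem above).
# Backsolving x1 = -x2 - x3 with free variables x2, x3 gives a basis:
v1 = np.array([-1.0, 1.0, 0.0])
v2 = np.array([-1.0, 0.0, 1.0])

# Gram-Schmidt via the reduced QR factorization gives an orthonormal basis.
Q, _ = np.linalg.qr(np.column_stack([v1, v2]))
u1, u2 = Q[:, 0], Q[:, 1]
print(u1, u2)  # +-(1/sqrt(2))(-1, 1, 0) and +-(1/sqrt(6))(-1, -1, 2), up to sign
print(np.allclose(Q.T @ Q, np.eye(2)))                    # orthonormal
print(np.isclose(u1.sum(), 0), np.isclose(u2.sum(), 0))   # both lie in the plane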
Consider the subspace W of R^4 spanned by the two vectors v1, v2 given in the book. Find the matrix of orthogonal projection onto W.

We start by finding an orthonormal basis of W using Gram-Schmidt: u1 = v1/||v1|| and u2 = (v2 - (u1·v2)u1)/||v2 - (u1·v2)u1||. So, setting Q to be the 4 x 2 matrix with columns u1 and u2, the matrix of orthogonal projection onto W is Q Q^T; multiplying this out gives the explicit 4 x 4 projection matrix.

Find the dimension of the space of all skew-symmetric n x n matrices.

The dimension is (n^2 - n)/2, since such a matrix must have all 0s on the diagonal, and then is determined by the values in the strictly upper triangular part of the matrix.
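Finally, a numerical sketch of the projection-matrix recipe, using made-up spanning vectors for W (not the ones from the book), together with the skew-symmetric dimension count for a small value of n.

import numpy as np

# Made-up spanning vectors for a 2-dimensional subspace W of R^4.
v1 = np.array([1.0, 1.0, 0.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0, 1.0])

# Orthonormal basis of W via reduced QR, then the projection matrix P = Q Q^T.
Q, _ = np.linalg.qr(np.column_stack([v1, v2]))
P = Q @ Q.T
print(np.allclose(P, P.T))      # P is symmetric
print(np.allclose(P @ P, P))    # projecting twice is the same as projecting once
print(np.allclose(P @ v1, v1))  # vectors already in W are fixed

# Dimension of the skew-symmetric n x n matrices: (n^2 - n)/2 free entries
# strictly above the diagonal.
n = 4
print((n * n - n) // 2)  # 6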
