Orthogonal Complements and Projections

Recall that two vectors u and v in R^n are perpendicular, or orthogonal, provided that their dot product vanishes. That is, u and v are orthogonal if and only if u · v = 0.

Examples

1. The vectors in are orthogonal, while are not.

2. We can define an inner product on the vector space P_3 of all polynomials of degree at most 3 by setting

    <p, q> = integral from 0 to 1 of p(x) q(x) dx.

(There is nothing special about integrating over [0, 1]; this interval was chosen arbitrarily.) Then, for example, . Hence, relative to this inner product, the two polynomials are orthogonal in P_3.

So, more generally, we say that two vectors u and v in a vector space V with inner product <·, ·> are orthogonal provided
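Under an integral inner product like the one above, orthogonality of two polynomials can be checked with exact rational arithmetic, since integral from 0 to 1 of x^(i+j) dx = 1/(i+j+1). The specific polynomials of the example are not recoverable from the transcription, so the pair below (p(x) = 1 and q(x) = 2x - 1) is a hypothetical illustration.

```python
from fractions import Fraction

def inner(p, q):
    """<p, q> = integral_0^1 p(x) q(x) dx for coefficient lists
    [a0, a1, ...] in ascending powers, using
    integral_0^1 x^(i+j) dx = 1/(i + j + 1)."""
    return sum(Fraction(a * b, i + j + 1)
               for i, a in enumerate(p)
               for j, b in enumerate(q))

p = [1]        # p(x) = 1
q = [-1, 2]    # q(x) = 2x - 1  (hypothetical pair, not the text's)

print(inner(p, q))   # 0 -> p and q are orthogonal under this inner product
print(inner(q, q))   # 1/3 -> q is, of course, not orthogonal to itself
```

Representing polynomials as coefficient lists keeps the sketch dependency-free; a symbolic integrator would work equally well.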

that <u, v> = 0.

Example

Consider the matrix . Then, by elementary row operations, we have that . As discussed in the previous sections, the row space of A coincides with the row space of its reduced form. In this case, we see that a basis for Row(A) is given by . By consideration of the reduced form, it follows that the null space of A, Null(A), has a basis given by . We note, as per the Fundamental Theorem of Linear Algebra, that dim Row(A) + dim Null(A) = n. Let's consider vectors in Row(A) and Null(A), say

a vector u in Row(A) and a vector v in Null(A). By direct computation we see that u · v = 0, and so u is orthogonal to v. So, is it an accident that an element of Row(A) is orthogonal to an element of Null(A)? To answer this, let's consider the dot product of arbitrary elements u of Row(A) and v of Null(A). Since is a basis for Row(A), there exist scalars so that every vector u in Row(A) can be written as

Similarly, since is a basis for Null(A), there exist scalars so that every vector v in Null(A) can be written as . Now, expanding u · v using these two expressions, every term vanishes, since each basis vector of Row(A) is orthogonal to each basis vector of Null(A). We conclude that if u is in Row(A) and v is in Null(A), then u · v = 0, and so Row(A) and Null(A) are orthogonal.

Definition

Suppose V is a vector space with inner product <·, ·>. (Think V = R^n and <u, v> = u · v.)

1. Two subspaces V_1 and V_2 of V are said to be orthogonal, denoted V_1 ⊥ V_2, if <v_1, v_2> = 0 for all v_1 in V_1 and all v_2 in V_2.

2. Let W be a subspace of V. Then we define W^⊥ (read "W perp") to be the set of vectors in V given by

    W^⊥ = { v in V : <v, w> = 0 for all w in W }.

The set W^⊥ is called the orthogonal complement of W.

Examples

1. From the above work, if A is a matrix, then Row(A) ⊥ Null(A).

2. Let A be any matrix. Now, the null space of A consists of those vectors x with Ax = 0. However, Ax = 0 if and only if x · r = 0 for each row r of the matrix A. Hence, the null space of A is the set of all vectors orthogonal to the rows of A and, hence, to the row space of A. (Why?) We conclude that

    Null(A) = Row(A)^⊥.

The above suggests the following method for finding W^⊥, given a subspace W of R^n:

1. Find a matrix A having as row vectors a generating set for W.
2. Find the null space of A. This null space is W^⊥.
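The two-step recipe above can be sketched numerically. The subspace used here (W spanned by two vectors in R^4) is a hypothetical stand-in, since the matrices of the text are not reproduced in the transcription; the null space is read off from the SVD.

```python
import numpy as np

# Step 1: rows of A form a generating set for W (a hypothetical
# two-dimensional subspace of R^4).
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0, 0.0]])

# Step 2: the null space of A is W-perp.  The right-singular vectors
# belonging to (numerically) zero singular values span Null(A).
_, s, vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
W_perp_basis = vt[rank:]       # rows form a basis for W-perp

print(W_perp_basis.shape[0])                 # 2 = dim R^4 - dim W
print(np.allclose(A @ W_perp_basis.T, 0.0))  # True: orthogonal to every row
```

The SVD route is numerically stable; for exact rational answers one would row reduce by hand, as the text does.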

3. Suppose that and . Then and are orthogonal subspaces of . To verify this, observe that . Thus, . Since and ,

it follows that . So, what is the set ? Let . Then, from part 2 above, . In fact, a basis for can be shown to be . Finally, we note that the set forms a basis for . In particular, every element of can be written as the sum of a vector in and a vector in .

4. Let W be the subspace of P_3 (= the vector space of all polynomials of degree at most 3) with basis . We take as our inner product on P_3 the function

    <p, q> = integral from 0 to 1 of p(x) q(x) dx.

Find a basis for W^⊥.

Solution. Let p be in W^⊥. Then <p, w> = 0
for all w in W. Hence, in particular, and . Solving the linear system, we find that we have pivot variables of a and b, with free variables of c and d. It follows that for some . Hence, the polynomials span W^⊥. Since these two polynomials are not multiples of each other, they are linearly independent, and so they form a basis for W^⊥.
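The linear system in Example 4 can be sketched concretely. The basis of W given in the text is not recoverable from the transcription, so assume W = span{1, x} in P_3 purely for illustration, with <p, q> = integral of p(x) q(x) over [0, 1], so that <x^i, x^j> = 1/(i + j + 1). Writing p = a + b x + c x^2 + d x^3, the conditions <p, 1> = 0 and <p, x> = 0 make a and b pivot variables with c and d free, mirroring the structure in the text.

```python
from fractions import Fraction as F

def solve_pivots(c, d):
    """Solve the 2x2 system in the pivot variables (a, b):
        a   + b/2 = -(c/3 + d/4)     (<p, 1> = 0)
        a/2 + b/3 = -(c/4 + d/5)     (<p, x> = 0)
    by Cramer's rule, with the free variables c, d given."""
    r1 = -(F(c, 3) + F(d, 4))
    r2 = -(F(c, 4) + F(d, 5))
    det = F(1) * F(1, 3) - F(1, 2) * F(1, 2)      # = 1/12
    a = (r1 * F(1, 3) - F(1, 2) * r2) / det
    b = (F(1) * r2 - r1 * F(1, 2)) / det
    return a, b

# One basis polynomial of W-perp per free variable:
print(solve_pivots(1, 0))   # a = 1/6, b = -1:    p1 = 1/6 - x + x^2
print(solve_pivots(0, 1))   # a = 1/5, b = -9/10: p2 = 1/5 - (9/10)x + x^3
```

As in the text, the two resulting polynomials are not multiples of one another, so they form a basis for W^⊥ under these assumptions.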

Theorem

Suppose that W is a subspace of R^n.

1. W^⊥ is a subspace of R^n.
2. (W^⊥)^⊥ = W.
3. dim W + dim W^⊥ = n.
4. Each vector b in R^n can be expressed uniquely in the form b = w + w' where w is in W and w' is in W^⊥.

Definition

Let V and W be two subspaces of R^n. If each vector u in R^n can be expressed uniquely in the form u = v + w where v is in V and w is in W, then we say R^n is the direct sum of V and W, and we write R^n = V ⊕ W.

Examples

1.

2. Fundamental Subspaces of a Matrix. Let A be an m × n matrix. Then the four fundamental subspaces of A are

    Row(A) = row space of A
    Null(A) = null space of A
    Col(A) = column space of A
    Null(A^T) = null space of A^T

Example. Let A = . Since , it follows that Row(A) has a basis of and that Null(A) has a basis of . Because , we have that Col(A) has a basis of and that Null(A^T) consists of all scalar multiples of the vector .

Fundamental Theorem of Linear Algebra, Part II

Let A be an m × n matrix.

1. Null(A) is the orthogonal complement of Row(A) in R^n.
2. Null(A^T) is the orthogonal complement of Col(A) in R^m.
3. R^n = Row(A) ⊕ Null(A).
4. R^m = Col(A) ⊕ Null(A^T).

Example. Let A = . Write b = uniquely as the sum of a vector in Row(A) and a vector in Null(A).

It is sufficient to find scalars a, b, c, and d so that . Reducing the associated augmented matrix, we see that a = 5, b = -6, c = 1 and d = 2. Set and . Then . Why is this the only way to represent b as a sum of a vector from Row(A) and a vector from Null(A)?

Definition

Let b be a vector in R^n and let W be a subspace of R^n. If b = w + w' where w is in W and w' is in W^⊥, then we call w the projection of b onto W and write w = proj_W(b).

Examples

1. Suppose b = and W is the subspace of with basis vectors . Then, by the previous example,
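The decomposition in the example above can be sketched as a linear solve. The actual matrix and vector of the text are not recoverable from the transcription, so the bases below (for Row(A) and Null(A) in R^4) are hypothetical; the coefficients play the role of the a, b, c, d found there.

```python
import numpy as np

row_basis = np.array([[1.0, 0.0, 1.0, 0.0],     # basis of Row(A)
                      [0.0, 1.0, 0.0, 1.0]])
null_basis = np.array([[-1.0, 0.0, 1.0, 0.0],   # basis of Null(A)
                       [0.0, -1.0, 0.0, 1.0]])
b = np.array([1.0, 2.0, 3.0, 4.0])

# Columns of M are the four basis vectors; solving M x = b gives the
# unique coefficients (the analogue of a, b, c, d in the text).
M = np.vstack([row_basis, null_basis]).T
x = np.linalg.solve(M, b)

b_row = x[:2] @ row_basis      # the Row(A) piece
b_null = x[2:] @ null_basis    # the Null(A) piece

print(np.allclose(b_row + b_null, b))           # True: the pieces recover b
print(np.allclose(row_basis @ b_null, 0.0))     # True: the pieces are orthogonal
```

Uniqueness is visible here: M is invertible precisely because the four basis vectors together span R^4, so the coefficient vector x is the only solution.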

2. Find proj_W(b) if b = and W = .

Solution. We note that are linearly independent and, hence, form a basis for W. So, we find a basis for W^⊥ by finding the null space of or, equivalently, of . We see that . We now seek scalars so that

    (*) 

(Of course, .) To solve the equation (*), it is sufficient to row reduce the augmented matrix , obtaining . Thus, proj_W(b) = .

We observe that there exists a matrix P given by
so that proj_W(b) = P b. We call P the projection matrix. The projection matrix given by

    P = A^T (A A^T)^{-1} A    (where the rows of A form a basis for W)

is expensive computationally, but if one is computing several projections onto W, it may very well be worth the effort, as the above formula is valid for all vectors b.

3. Find the projection of b = onto the plane in R^3 via the projection matrix.

Solution. We seek a set of basis vectors for the plane. We claim the two vectors and form a basis. (Any two vectors satisfying the plane's equation that are not multiples of one another will work.) Set A = . Then and .

Hence, P = . So, proj_W(b) = P b = . We note that , and hence . Why is this not surprising?

Properties of a Projection Matrix

(1) P^2 = P. (That is, P is idempotent.)
(2) P^T = P. (That is, P is symmetric.)
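The construction and both properties can be checked numerically. The subspace below (a hypothetical plane in R^3, spanned by the rows of A) stands in for the plane of Example 3, whose equation is not recoverable from the transcription.

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],      # rows: a basis for W
              [0.0, 1.0, 1.0]])

# Projection matrix onto W, rows of A forming a basis for W:
P = A.T @ np.linalg.inv(A @ A.T) @ A

print(np.allclose(P @ P, P))   # True: P is idempotent
print(np.allclose(P.T, P))     # True: P is symmetric

# Once built, P projects any b onto W with one matrix-vector product:
b = np.array([1.0, 2.0, 3.0])
proj_b = P @ b
print(np.allclose(A @ (b - proj_b), 0.0))   # True: the residual lies in W-perp
```

Idempotence reflects the geometry: a vector already in W projects to itself, so applying P twice changes nothing after the first application.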