Math 312, Fall 2012    Jerry L. Kazdan

Problem Set 5
Due: In class Thursday, Oct. 18. Late papers will be accepted until 1:00 PM Friday.

In addition to the problems below, you should also know how to solve the following problems from the text. Most are simple mental exercises. These are not to be handed in.
Sec. 5.1 #28, 29, 31; Sec. 5.2 #33

1. [Bretscher, Sec. 5.1 #16] Consider the three orthonormal vectors u1, u2, u3 in R⁴ given in the text. Can you find a vector u4 in R⁴ such that the vectors u1, u2, u3, u4 are orthonormal? If so, how many such vectors are there?

Solution: Since R⁴ is four dimensional, you can extend these three orthonormal vectors to an orthonormal basis. First find a vector orthogonal to these three; then normalize it to be a unit vector. In this case, looking at the three given vectors, another immediately comes to mind; call it u4, (1) with û := −u4 another possibility. These two are the only possibilities, since the orthogonal complement of the span of u1, u2, u3 is one dimensional, so a basis will have only one vector. After we have found one, which we call u4, any other, say w, must have the form w = c u4 for some constant c. Because we want a unit vector,
1 = ‖w‖² = c²‖u4‖² = c²,
so c = ±1.

But what if my u4 didn't immediately come to mind? Use the Gram-Schmidt process. Pick any vector not in the span of u1, u2, u3; almost any vector in R⁴ will do. I will try the simple w := (1, 0, 0, 0). We want to write w in the form
w = a u1 + b u2 + c u3 + z,

where z is orthogonal to u1, u2, and u3. Taking the inner product of both sides of this successively with u1, u2, and u3 (which are unit vectors), we find that
a = ⟨w, u1⟩,  b = ⟨w, u2⟩,  c = ⟨w, u3⟩.
Then
z = w − [⟨w, u1⟩ u1 + ⟨w, u2⟩ u2 + ⟨w, u3⟩ u3].
To get the desired unit vector we let u4 = z/‖z‖, which agrees with (1).

2. [Bretscher, Sec. 5.1 #17] Find a basis for W⊥, where
W = span{(1, 2, 3, 4), (5, 6, 7, 8)}.

Solution: The vectors x = (x1, x2, x3, x4) in W⊥ must satisfy
x1 + 2x2 + 3x3 + 4x4 = 0
5x1 + 6x2 + 7x3 + 8x4 = 0.
Solving these equations for x1 and x2 in terms of x3 and x4 we find
x1 = x3 + 2x4,  x2 = −2x3 − 3x4.
Thus
x = (x3 + 2x4, −2x3 − 3x4, x3, x4) = x3 (1, −2, 1, 0) + x4 (2, −3, 0, 1).
The two vectors at the end of the previous line are a basis for W⊥.
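
As a quick numerical sanity check of Problem 2, the short sketch below (Python with NumPy; the arrays are just the vectors from the problem) confirms that the two vectors found above are orthogonal to the spanning vectors of W and are linearly independent:

    import numpy as np

    # Spanning vectors of W and the claimed basis of its orthogonal complement W-perp.
    w1 = np.array([1, 2, 3, 4])
    w2 = np.array([5, 6, 7, 8])
    b1 = np.array([1, -2, 1, 0])
    b2 = np.array([2, -3, 0, 1])

    # Each basis vector of W-perp must be orthogonal to both spanning vectors of W.
    for b in (b1, b2):
        assert w1 @ b == 0 and w2 @ b == 0

    # b1 and b2 are linearly independent (the 4x2 matrix has rank 2), and
    # dim W-perp = 4 - dim W = 2, so they do form a basis of W-perp.
    assert np.linalg.matrix_rank(np.column_stack([b1, b2])) == 2
    print("basis of W-perp verified")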

3. [Bretscher, Sec. 5.1 #21] Find scalars a, b, c, d, e, f, and g so that the following vectors are orthonormal:
(a, d, f),  (b, 1, g),  (c, e, 1/2).

Solution: The orthogonality gives
ab + d + fg = 0,  ac + ed + f/2 = 0,  bc + e + g/2 = 0.
Because we want unit vectors, and so cannot rescale the second or third vector, we need b = g = 0 (the middle entry of the second vector is already 1), and we cannot simply let c = e = 0 (it took me a few minutes to grasp this). The orthogonality conditions are then
d = 0,  ac + f/2 = 0,  e = 0.
That these are unit vectors gives a² + f² = 1 and c² + 1/4 = 1. Therefore c = ±√3/2, so f = −2ac = ∓√3 a, which in turn implies a² + 3a² = 1, that is, a = ±1/2.

4. [Bretscher, Sec. 5.1 #26] Find the orthogonal projection P_S of x := (49, 49, 49) into the subspace S of R³ spanned by v1 := (2, 3, 6) and v2 := (3, −6, 2).

Solution: We are fortunate that the vectors v1 and v2 are orthogonal. We want to find constants a and b so that
x = a v1 + b v2 + w,   (2)
where w is orthogonal to S. Then the desired projection will be P_S x = a v1 + b v2. To find the scalars a and b, take the inner product of (2) with v1 and then with v2; we find
⟨x, v1⟩ = a ‖v1‖²  and  ⟨x, v2⟩ = b ‖v2‖².
Using the particular vectors in this problem, a = 11 and b = −1. Thus
P_S x = 11 v1 − v2 = (19, 39, 64).
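
The arithmetic in Problem 4 is easy to double-check numerically. A minimal sketch (Python with NumPy, using only the vectors stated in the problem):

    import numpy as np

    x = np.array([49.0, 49.0, 49.0])
    v1 = np.array([2.0, 3.0, 6.0])
    v2 = np.array([3.0, -6.0, 2.0])

    assert np.isclose(v1 @ v2, 0.0)   # the spanning vectors are orthogonal

    # Projection onto S = span{v1, v2} using the orthogonal-basis formula.
    a = (x @ v1) / (v1 @ v1)   # = 11
    b = (x @ v2) / (v2 @ v2)   # = -1
    P_S_x = a * v1 + b * v2

    assert np.allclose(P_S_x, [19.0, 39.0, 64.0])
    # The remainder w = x - P_S_x is orthogonal to S, as it should be.
    w = x - P_S_x
    assert np.allclose([w @ v1, w @ v2], 0.0)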

5. [Bretscher, Sec. 5.1 #37] Consider a plane V in R³ with orthonormal basis u1 and u2. Let x be a vector in R³. Find a formula for the reflection Rx of x across the plane V.

Solution: The key is a picture (first try it in R², where V is a line through the origin). Let Px be the orthogonal projection of x into the plane V. Then w := x − Px is the projection of x orthogonal to V. From the picture, to get the reflection, replace w by −w.

[Figure: x = Px + w and its reflection Rx = Px − w, on opposite sides of the plane V.]

Thus, since x = Px + w,
Rx = Px − w = Px − (x − Px) = 2Px − x.
In summary, orthogonal projections and reflections for a subspace are related by the simple formula R = 2P − I. Note that the orthogonal projection Px is easy to compute if you know an orthonormal basis. All of this is very general. In this problem u1 and u2 are an orthonormal basis for the subspace, so
Px = ⟨x, u1⟩ u1 + ⟨x, u2⟩ u2.
Consequently,
Rx = 2(⟨x, u1⟩ u1 + ⟨x, u2⟩ u2) − x.

6. [Bretscher, Sec. 5.2 #32] Find an orthonormal basis for the plane x1 + x2 + x3 = 0.

Solution: Pick any point in the plane, say v1 = (1, −1, 0). This will be the first vector in our orthogonal basis. We use the Gram-Schmidt process to extend this to an orthogonal basis for the plane. Pick any other point in the plane, say w1 := (1, 0, −1). Write it as w1 = a v1 + z, where z is perpendicular to v1. Note that, although unknown, z will also be in the plane since it will be a linear combination of v1 and w1, both of which are in the plane. As usual, by taking the inner product of both sides of w1 = a v1 + z with v1, we find
a = ⟨w1, v1⟩ / ‖v1‖² = 1/2.
Thus
z = w1 − (1/2) v1 = (1/2, 1/2, −1)
is in the plane and orthogonal to v1. The vectors v1 and z are an orthogonal basis for this plane. To get an orthonormal basis we just make these into unit vectors:
u1 := v1/‖v1‖ = (1/√2)(1, −1, 0)  and  u2 := z/‖z‖ = (1/√6)(1, 1, −2).
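
As a numerical check of Problems 5 and 6 together, the sketch below (Python with NumPy; the basis vectors are the ones just computed) builds the projection P and the reflection R = 2P − I for the plane x1 + x2 + x3 = 0 and confirms the expected properties:

    import numpy as np

    # Orthonormal basis of the plane x1 + x2 + x3 = 0 from Problem 6.
    u1 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2.0)
    u2 = np.array([1.0, 1.0, -2.0]) / np.sqrt(6.0)

    # Orthonormality and membership in the plane.
    assert np.isclose(u1 @ u1, 1.0) and np.isclose(u2 @ u2, 1.0)
    assert np.isclose(u1 @ u2, 0.0)
    assert np.isclose(u1.sum(), 0.0) and np.isclose(u2.sum(), 0.0)

    # Projection onto the plane and the reflection across it (Problem 5).
    P = np.outer(u1, u1) + np.outer(u2, u2)
    R = 2.0 * P - np.eye(3)

    assert np.allclose(P @ P, P)            # P is a projection
    assert np.allclose(R @ R, np.eye(3))    # reflecting twice is the identity
    n = np.array([1.0, 1.0, 1.0])           # normal to the plane
    assert np.allclose(R @ n, -n)           # the normal direction is flipped
    assert np.allclose(R @ u1, u1)          # vectors in the plane are fixed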

7. Let V be a linear space. A linear map P : V → V is called a projection if P² = P (this P is not necessarily an "orthogonal projection").

a) Show that the matrix
P = ( 0  1
      0  1 )
is a projection. Draw a sketch of R² showing the vectors (1, 2), (−1, 0), (3, 1), and (0, 3) and their images under the map P. Also indicate both the image and the kernel, W, of P.

Solution: The image of P is the line x1 = x2; the kernel is the x1-axis (the subspace W). See the figure below.

b) Repeat this for the complementary projection Q := I − P.

Solution: The image of Q is the x1-axis (the subspace W); its kernel is the line x1 = x2 (the image of P).

[Figure: two sketches showing the four vectors and their images, first under P and then under Q.]

c) If the image and kernel of a projection P are orthogonal, then P is called an orthogonal projection. [This of course now assumes that V has an inner product.] Let
M = ( 0  a
      0  c ).
For which real value(s) of a and c is this a projection? An orthogonal projection?

Solution: Since
M² = ( 0  ac
       0  c² ),
M² = M requires that ac = a and c² = c. The first requires that either a = 0 or c = 1. If a = 0, the second equation is satisfied if either c = 0 or c = 1. If a ≠ 0, then c = 1. Thus, the possibilities are
P1 := ( 0  0
        0  0 )
or
P2 := ( 0  a
        0  1 )   (for any a).
For an orthogonal projection P, its image and kernel must be orthogonal. Since the image of P1 is just 0, which is orthogonal to everything, P1 is an orthogonal projection. The kernel of P2 is the horizontal axis. Its image consists of the points of the form t(a, 1) for any scalar t. This straight line is perpendicular to the horizontal axis if (and only if) a = 0. Thus P2 is an orthogonal projection if and only if a = 0.

The remaining problems are from the Lecture Notes on Vectors,
http://www.math.upenn.edu/~kazdan/312f12/notes/vectors/vectors8.pdf
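
A small numerical illustration of Problem 7 (a Python/NumPy sketch; the helper is_projection is just for this check and not part of the problem):

    import numpy as np

    P = np.array([[0.0, 1.0],
                  [0.0, 1.0]])
    Q = np.eye(2) - P

    # Both are projections: P^2 = P and Q^2 = Q.
    assert np.allclose(P @ P, P) and np.allclose(Q @ Q, Q)

    # Part (c): M = (0 a; 0 c) is a projection exactly when ac = a and c^2 = c.
    def is_projection(a, c):
        M = np.array([[0.0, a], [0.0, c]])
        return np.allclose(M @ M, M)

    assert is_projection(0.0, 0.0)        # P1, the zero matrix
    assert is_projection(5.0, 1.0)        # P2 with a = 5
    assert not is_projection(5.0, 0.5)    # fails M^2 = M

    # P2 is an orthogonal projection only when a = 0: its image span{(a, 1)}
    # must be perpendicular to its kernel, the horizontal axis span{(1, 0)}.
    image_dir = np.array([5.0, 1.0])
    kernel_dir = np.array([1.0, 0.0])
    assert image_dir @ kernel_dir != 0    # a = 5: not an orthogonal projection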

8. [p. 8 #5] The origin and the vectors X, Y, and X + Y define a parallelogram whose diagonals have length ‖X + Y‖ and ‖X − Y‖. Prove the parallelogram law
‖X + Y‖² + ‖X − Y‖² = 2‖X‖² + 2‖Y‖².
This states that in a parallelogram, the sum of the squares of the lengths of the diagonals equals the sum of the squares of the lengths of the four sides.

Solution: The standard procedure is to express the norm in terms of the inner product and use the usual algebraic rules for the inner product. Thus
‖X + Y‖² = ⟨X + Y, X + Y⟩ = ⟨X, X⟩ + ⟨X, Y⟩ + ⟨Y, X⟩ + ⟨Y, Y⟩ = ⟨X, X⟩ + 2⟨X, Y⟩ + ⟨Y, Y⟩,
with a similar formula for ‖X − Y‖². After easy algebra, the result is clear.

9. [p. 8 #6] a) Find the distance D from the straight line 3x − 4y = 10 to the origin.

Solution: Note that the equation of the parallel line ℓ through the origin is 3x − 4y = 0, which we rewrite as ⟨N, X⟩ = 0, where N := (3, −4) and X = (x, y). Let X0 be some point on the original line, so ⟨N, X0⟩ = 10. Then the desired distance D is the same as the distance from X0 to the line ℓ : ⟨N, X⟩ = 0 through the origin. But the equation for ℓ says the vector N is perpendicular to the line ℓ. Thus the distance D is the length of the projection of X0 in the direction of N, that is,
D = |⟨N, X0⟩| / ‖N‖ = 10/5 = 2.

b) Find the distance D from the plane ax + by + cz = d to the origin (assume the vector N = (a, b, c) ≠ 0).

Solution: The solution presented in the above special case generalizes immediately to give
D = |⟨N, X0⟩| / ‖N‖ = |d| / ‖N‖.

10. [p. 8 #8] a) If X and Y are real vectors, show that
⟨X, Y⟩ = (1/4)( ‖X + Y‖² − ‖X − Y‖² ).   (3)
This formula is the simplest way to recover properties of the inner product from the norm.

Solution: The straightforward procedure is the same as in Problem 8: rewrite the norms on the right in terms of the inner product and expand using algebra.
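
Before proving them, both the parallelogram law of Problem 8 and formula (3) can be spot-checked numerically; here is a minimal sketch (Python with NumPy, random vectors chosen arbitrarily):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal(5)
    Y = rng.standard_normal(5)
    norm = np.linalg.norm

    # Parallelogram law (Problem 8).
    assert np.isclose(norm(X + Y)**2 + norm(X - Y)**2,
                      2 * norm(X)**2 + 2 * norm(Y)**2)

    # Formula (3): the inner product recovered from norms.
    assert np.isclose(X @ Y, 0.25 * (norm(X + Y)**2 - norm(X - Y)**2))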

b) As an application, show that if a square matrix R has the property that it preserves length, so ‖RX‖ = ‖X‖ for every vector X, then it preserves the inner product, that is, ⟨RX, RY⟩ = ⟨X, Y⟩ for all vectors X and Y.

Solution: We know that ‖RZ‖ = ‖Z‖ for any vector Z. This implies ‖R(X + Y)‖ = ‖X + Y‖ for any vectors X and Y and, similarly, ‖R(X − Y)‖ = ‖X − Y‖. Since R is linear, RX ± RY = R(X ± Y). Consequently, by equation (3) (used twice),
4⟨RX, RY⟩ = ‖R(X + Y)‖² − ‖R(X − Y)‖² = ‖X + Y‖² − ‖X − Y‖² = 4⟨X, Y⟩
for all vectors X and Y.

11. [p. 9 #10] a) If a certain matrix C satisfies ⟨X, CY⟩ = 0 for all vectors X and Y, show that C = 0.

Solution: Since X can be any vector, let X = CY to show that ‖CY‖² = ⟨CY, CY⟩ = 0. Thus CY = 0 for all Y, so C = 0.

b) If the matrices A and B satisfy ⟨X, AY⟩ = ⟨X, BY⟩ for all vectors X and Y, show that A = B.

Solution: We have
0 = ⟨X, AY⟩ − ⟨X, BY⟩ = ⟨X, AY − BY⟩ = ⟨X, (A − B)Y⟩
for all X and Y, so by part (a) with C := A − B we conclude that A = B.

12. [p. 9 #11-12] A matrix A is called anti-symmetric (or skew-symmetric) if Aᵀ = −A.

a) Give an example of a 3 × 3 anti-symmetric matrix.

Solution: The most general anti-symmetric 3 × 3 matrix has the form
(  0   a   b
  −a   0   c
  −b  −c   0 ).

b) If A is any anti-symmetric matrix, show that ⟨X, AX⟩ = 0 for all vectors X.

Solution: ⟨X, AX⟩ = ⟨AᵀX, X⟩ = −⟨AX, X⟩ = −⟨X, AX⟩. Thus 2⟨X, AX⟩ = 0, so ⟨X, AX⟩ = 0.
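
A numerical illustration of parts (a) and (b) of Problem 12 (a Python/NumPy sketch; the entries a, b, c are arbitrary):

    import numpy as np

    # A generic 3x3 anti-symmetric matrix, as in part (a).
    a, b, c = 1.0, -2.0, 3.0
    A = np.array([[ 0.0,   a,   b],
                  [  -a, 0.0,   c],
                  [  -b,  -c, 0.0]])
    assert np.allclose(A.T, -A)

    # Part (b): <X, AX> = 0 for every X; spot-check with random vectors.
    rng = np.random.default_rng(1)
    for _ in range(5):
        X = rng.standard_normal(3)
        assert np.isclose(X @ (A @ X), 0.0)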

c) Say X(t) is a solution of the differential equation dX/dt = AX, where A is an anti-symmetric matrix. Show that ‖X(t)‖ = constant. [Remark: A special case is that X(t) := (cos t, sin t) satisfies X′ = AX with A = ( 0 −1; 1 0 ), so this problem gives another proof that cos²t + sin²t = 1.]

Solution: Let E(t) := ‖X(t)‖². We show that dE/dt = 0. But, using part (b),
dE/dt = d/dt ⟨X(t), X(t)⟩ = 2⟨X(t), X′(t)⟩ = 2⟨X(t), AX(t)⟩ = 0.

1-B Bonus Problem [Please give this directly to Professor Kazdan]. This is a follow-up to Problem 7.

a) If a projection P is self-adjoint, so P* = P, show that P is an orthogonal projection.

b) Conversely, if P is an orthogonal projection, show that it is self-adjoint.

[Last revised: November 10, 2012]