
Exercise 1
1. Let A be an n × n orthogonal matrix. Then prove that
   (a) the rows of A form an orthonormal basis of ℝⁿ.
   (b) the columns of A form an orthonormal basis of ℝⁿ.
   (c) for any two vectors x, y ∈ ℝ^{n×1}, ⟨Ax, Ay⟩ = ⟨x, y⟩.
   (d) for any vector x ∈ ℝ^{n×1}, ‖Ax‖ = ‖x‖.
2. Let A be an n × n unitary matrix. Then prove that
   (a) the rows/columns of A form an orthonormal basis of the complex vector space ℂⁿ.
   (b) for any two vectors x, y ∈ ℂ^{n×1}, ⟨Ax, Ay⟩ = ⟨x, y⟩.
   (c) for any vector x ∈ ℂ^{n×1}, ‖Ax‖ = ‖x‖.
3. Let A and B be two n × n orthogonal matrices. Then prove that AB and BA are both orthogonal matrices. Prove a similar result for unitary matrices.
4. Prove the statements made in Remark ??.?? about orthogonal matrices. State and prove a similar result for unitary matrices.
5. Let A be an n × n upper triangular matrix. If A is also an orthogonal matrix, then prove that A = Iₙ.
6. Determine an orthonormal basis of ℝ⁴ containing the vectors (1, −2, 1, 3) and (2, 1, −3, 1).
7. Consider the real inner product space C[−1, 1] with ⟨f, g⟩ = ∫_{−1}^{1} f(t)g(t) dt. Prove that the polynomials 1, x, (3x² − 1)/2 and (5x³ − 3x)/2 form an orthogonal set in C[−1, 1]. Find the corresponding functions f(x) with ‖f(x)‖ = 1.
8. Consider the real inner product space C[−π, π] with ⟨f, g⟩ = ∫_{−π}^{π} f(t)g(t) dt. Find an orthonormal basis for L(x, sin x, sin(x + 1)).
9. Let M be a subspace of ℝⁿ with dim M = m. A vector x ∈ ℝⁿ is said to be orthogonal to M if ⟨x, y⟩ = 0 for every y ∈ M.
   (a) How many linearly independent vectors can be orthogonal to M?
   (b) If M = {(x₁, x₂, x₃) ∈ ℝ³ : x₁ + x₂ + x₃ = 0}, determine a maximal set of linearly independent vectors orthogonal to M in ℝ³.
10. Determine an orthogonal basis of L({(1, 1, 0, 1), (−1, 1, 1, −1), (0, 2, 1, 0), (1, 0, 0, 0)}) in ℝ⁴ (a numerical check appears in the sketch below).
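A quick numerical check of item 10 (a minimal sketch, assuming Python with NumPy; the QR factorization carries out the Gram-Schmidt process):

    import numpy as np

    # Spanning vectors from item 10, as columns of V.
    V = np.column_stack([(1, 1, 0, 1), (-1, 1, 1, -1), (0, 2, 1, 0), (1, 0, 0, 0)]).astype(float)
    print(np.linalg.matrix_rank(V))      # 3: the third vector is the sum of the first two

    # Reduced QR of an independent spanning subset gives an orthonormal basis of the span.
    Q, _ = np.linalg.qr(V[:, [0, 1, 3]])
    print(np.round(Q.T @ Q, 10))         # 3 x 3 identity: the columns of Q are orthonormal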

11. Let ℝⁿ be endowed with the standard inner product. Suppose we have a vector xᵗ = (x₁, x₂, …, xₙ) ∈ ℝⁿ with ‖x‖ = 1.
   (a) Then prove that the set {x} can always be extended to form an orthonormal basis of ℝⁿ.
   (b) Let this basis be {x, x₂, …, xₙ}. Suppose B = (e₁, e₂, …, eₙ) is the standard basis of ℝⁿ and let A = [[x]_B, [x₂]_B, …, [xₙ]_B]. Then prove that A is an orthogonal matrix.
12. Let v, w ∈ ℝⁿ, n ≥ 1, with ‖v‖ = ‖w‖ = 1. Prove that there exists an orthogonal matrix A such that Av = w. Prove also that A can be chosen such that det(A) = 1.

Example
1. Let V = ℝ³ and W = {(x, y, z) ∈ ℝ³ : x + y − z = 0}. Then it can be easily verified that {(1, 1, −1)} is a basis of W^⊥: for each (x, y, z) ∈ W we have x + y − z = 0, and hence ⟨(x, y, z), (1, 1, −1)⟩ = x + y − z = 0. Also, using Equation (??), for every xᵗ = (x, y, z) ∈ ℝ³, we have
u = ((x + y − z)/3)(1, 1, −1), w = ((2x − y + z)/3, (−x + 2y + z)/3, (x + y + 2z)/3) and x = w + u.
Let A = (1/3)·[[2, −1, 1], [−1, 2, 1], [1, 1, 2]] and B = (1/3)·[[1, 1, −1], [1, 1, −1], [−1, −1, 1]].
Then, by definition, P_W(x) = w = Ax and P_{W^⊥}(x) = u = Bx. Observe that A² = A, B² = B, Aᵗ = A, Bᵗ = B, A·B = 0, B·A = 0 and A + B = I₃, where 0 is the zero matrix of size 3 × 3 and I₃ is the identity matrix of size 3 × 3. Also, verify that rank(A) = 2 and rank(B) = 1.
2. Let W = L((1, 2, 1)) ⊆ ℝ³. Then, using Example ??.?? and Equation (??), we get
W^⊥ = L({(2, −1, 0), (−1, 0, 1)}) = L({(2, −1, 0), (1, 2, −5)}),
u = ((5x − 2y − z)/6, (−2x + 2y − 2z)/6, (−x − 2y + 5z)/6) and w = ((x + 2y + z)/6)(1, 2, 1), with (x, y, z) = w + u.
Hence, for A = (1/6)·[[1, 2, 1], [2, 4, 2], [1, 2, 1]] and B = (1/6)·[[5, −2, −1], [−2, 2, −2], [−1, −2, 5]],
we have P_W(x) = w = Ax and P_{W^⊥}(x) = u = Bx. Observe that A² = A, B² = B, Aᵗ = A, Bᵗ = B, A·B = 0, B·A = 0 and A + B = I₃. Also, verify that rank(A) = 1 and rank(B) = 2.
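The matrix identities claimed in the example are easy to confirm numerically. A minimal sketch (assuming Python with NumPy, using the matrices A and B of part 1):

    import numpy as np

    A = np.array([[2, -1, 1], [-1, 2, 1], [1, 1, 2]]) / 3    # projection onto W
    B = np.array([[1, 1, -1], [1, 1, -1], [-1, -1, 1]]) / 3  # projection onto W-perp

    for M in (A, B):
        assert np.allclose(M @ M, M) and np.allclose(M.T, M)  # idempotent and symmetric
    assert np.allclose(A @ B, 0) and np.allclose(B @ A, 0)
    assert np.allclose(A + B, np.eye(3))
    print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))  # 2 1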

Example
1. Let A = diag(d₁, d₂, …, dₙ) with dᵢ ∈ ℝ for 1 ≤ i ≤ n. Then p(λ) = ∏ᵢ₌₁ⁿ (λ − dᵢ) and the eigen-pairs are (d₁, e₁), (d₂, e₂), …, (dₙ, eₙ).
2. Let A = [[1, 1], [0, 1]]. Then p(λ) = (1 − λ)². Hence, the characteristic equation has roots 1, 1; that is, 1 is a repeated eigenvalue. But the system (A − I₂)x = 0 for x = (x₁, x₂)ᵗ implies that x₂ = 0. Thus, x = (x₁, 0)ᵗ is a solution of (A − I₂)x = 0, and hence, using Remark ??.??, (1, 0)ᵗ is an eigenvector. Therefore, note that 1 is a repeated eigenvalue whereas there is only one linearly independent eigenvector.
3. Let A = [[1, 0], [0, 1]]. Then p(λ) = (1 − λ)². Again, 1 is a repeated root of p(λ) = 0. But in this case, the system (A − I₂)x = 0 has a solution for every xᵗ ∈ ℝ². Hence, we can choose any two linearly independent vectors xᵗ, yᵗ from ℝ² to get (1, x) and (1, y) as the two eigen-pairs. In general, if x₁, x₂, …, xₙ ∈ ℝⁿ are linearly independent vectors, then (1, x₁), (1, x₂), …, (1, xₙ) are eigen-pairs of the identity matrix Iₙ.
4. Let A = [[1, 2], [2, 1]]. Then p(λ) = (λ − 3)(λ + 1) and its roots are 3, −1. Verify that the eigen-pairs are (3, (1, 1)ᵗ) and (−1, (1, −1)ᵗ). The readers are advised to prove the linear independence of the two eigenvectors.
5. Let A = [[1, −1], [1, 1]]. Then p(λ) = λ² − 2λ + 2 and its roots are 1 + i, 1 − i. Hence, over ℝ, the matrix A has no eigenvalue. Over ℂ, the reader is required to show that the eigen-pairs are (1 + i, (i, 1)ᵗ) and (1 − i, (1, i)ᵗ).

Exercise 4
1. Find the eigenvalues of a triangular matrix.
2. Find eigen-pairs over ℂ for each of the following matrices:
[[1, 1 + i], [1 − i, 1]], [[i, 1 + i], [−1 + i, i]], [[cos θ, −sin θ], [sin θ, cos θ]] and [[cos θ, sin θ], [sin θ, −cos θ]].
3. Let A and B be similar matrices.
   (a) Then prove that A and B have the same set of eigenvalues.
   (b) If B = PAP⁻¹ for some invertible matrix P, then prove that Px is an eigenvector of B if and only if x is an eigenvector of A.
4. Let A = (a_{ij}) be an n × n matrix. Suppose that for all i, 1 ≤ i ≤ n, ∑_{j=1}^{n} a_{ij} = a. Then prove that a is an eigenvalue of A. What is the corresponding eigenvector?
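The eigenvector asked for in item 4 is the all-ones vector: if every row of A sums to a, then A(1, 1, …, 1)ᵗ = a(1, 1, …, 1)ᵗ. A minimal numerical illustration (assuming NumPy; the matrix below is a hypothetical example whose rows all sum to 3):

    import numpy as np

    A = np.array([[4, -1, 0], [1, 2, 0], [2, 2, -1]])  # every row sums to 3
    print(A @ np.ones(3))        # [3. 3. 3.]: (3, (1, 1, 1)^t) is an eigen-pair
    print(np.linalg.eigvals(A))  # 3 appears among the eigenvalues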

5. Prove that the matrices A and Aᵗ have the same set of eigenvalues. Construct a 2 × 2 matrix A such that the eigenvectors of A and Aᵗ are different.
6. Let A be a matrix such that A² = A (A is called an idempotent matrix). Then prove that its eigenvalues are either 0 or 1 or both.
7. Let A be a matrix such that Aᵏ = 0 for some positive integer k ≥ 1 (A is called a nilpotent matrix). Then prove that its eigenvalues are all 0.
8. Compute the eigen-pairs of the matrices [[1, 2], [−1, 0]] and [[i, 2i], [−i, 0]].

Exercise 5
1. Let A be a skew-symmetric matrix of order 2n + 1. Then prove that 0 is an eigenvalue of A.
2. Let A be a 3 × 3 orthogonal matrix (AAᵗ = I). If det(A) = 1, then prove that there exists a non-zero vector v ∈ ℝ³ such that Av = v.

Exercise 6 Find the inverse of the following matrices by using the Cayley-Hamilton Theorem:
i) [[2, 3, 4], [5, 6, 7], [1, 1, 2]], ii) [[−1, −1, 1], [1, −1, 1], [1, 1, 1]], iii) [[1, −2, −1], [−2, 1, −1], [0, −1, 2]].

Exercise 7
1. Let A, B ∈ Mₙ(ℝ). Prove that
   (a) if λ is an eigenvalue of A, then λᵏ is an eigenvalue of Aᵏ for all k ∈ ℤ⁺.
   (b) if A is invertible and λ is an eigenvalue of A, then 1/λ is an eigenvalue of A⁻¹.
   (c) if A is nonsingular, then BA⁻¹ and A⁻¹B have the same set of eigenvalues.
   (d) AB and BA have the same non-zero eigenvalues.
   In each case, what can you say about the eigenvectors?
2. Prove that there does not exist an A ∈ Mₙ(ℂ) satisfying Aⁿ ≠ 0 but A^{n+1} = 0. That is, if A is an n × n nilpotent matrix, then the order of nilpotency is at most n.
3. Let A ∈ Mₙ(ℝ) be an invertible matrix and let xᵗ, yᵗ ∈ ℝⁿ. Define B = xyᵗA⁻¹. Then prove that
   (a) 0 is an eigenvalue of B of multiplicity n − 1 [Hint: use Exercise 7.1d].
   (b) λ₀ = yᵗA⁻¹x is an eigenvalue of multiplicity 1.
   (c) 1 + αλ₀ is an eigenvalue of I + αB of multiplicity 1 for any α ∈ ℝ.
   (d) 1 is an eigenvalue of I + αB of multiplicity n − 1 for any α ∈ ℝ.
   (e) det(A + αxyᵗ) equals (1 + αλ₀) det(A) for any α ∈ ℝ. This result is known as the Sherman-Morrison formula for determinants.
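Parts (a), (b) and (e) of the previous item can be checked numerically before they are proved. A sketch (assuming NumPy, with randomly chosen A, x, y and α):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 4
    A = rng.standard_normal((n, n)) + n * np.eye(n)  # generically invertible
    x, y = rng.standard_normal(n), rng.standard_normal(n)
    alpha = 0.7

    B = np.outer(x, y) @ np.linalg.inv(A)        # B = x y^t A^{-1} has rank one
    lam0 = y @ np.linalg.inv(A) @ x              # lambda_0 = y^t A^{-1} x
    print(np.sort(np.abs(np.linalg.eigvals(B)))) # n-1 zeros and |lambda_0|
    print(np.linalg.det(A + alpha * np.outer(x, y)),
          (1 + alpha * lam0) * np.linalg.det(A)) # the two sides of (e) agree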

(e) det(a+αxy t ) equals (1 + αλ 0 )det(a) for any α R. This result is known as the Shermon-Morrison formula for determinant. 4. Let A,B M (R) such that det(a) = det(b) and tr(a) = tr(b). (a) Do A and B have the same set of eigenvalues? (b) Give examples to show that the matrices A and B need not be similar. 5. Let A,B M n (R). Also, let (λ 1,u) be an eigen-pair for A and (λ,v) be an eigen-pair for B. (a) If u = αv for some α R then (λ 1 +λ,u) is an eigen-pair for A+B. (b) Give an example to show that if u and v are linearly independent then λ 1 +λ need not be an eigenvalue of A+B. 6. Let A M n (R)be aninvertible matrixwith eigen-pairs(λ 1,u 1 ),(λ,u ),...,(λ n,u n ). Thenprove thatb = {u 1,u,...,u n } formsabasisof R n (R). If[b] B = (c 1,c,...,c n ) t then the system Ax = b has the unique solution x = c 1 u 1 + c u + + c n u n. λ 1 λ λ n [ ] cosθ sinθ Exercise 8 1. Are the matrices A = and B = sinθ cosθ some θ,0 θ π, diagonalizable? [ ] cosθ sinθ sinθ cosθ for. Find the eigen-pairs of A = [a ij ] n n, where a ij = a if i = j and b, otherwise. [ ] A 0. Let A M n (R) and B M m (R). Suppose C =. Then prove that C is 0 B diagonalizable if and only if both A and B are diagonalizable. 4. Let T : R 5 R 5 be a linear operator with rank (T I) = and N(T) = {(x 1,x,x,x 4,x 5 ) R 5 x 1 +x 4 +x 5 = 0, x +x = 0}. (a) Determine the eigenvalues of T? (b) Find the number of linearly independent eigenvectors corresponding to each eigenvalue? (c) Is T diagonalizable? Justify your answer. 5. Let A be a non-zero square matrix such that A = 0. Prove that A cannot be diagonalized. [Hint: Use Remark??.] 5

6. Are the following matrices diagonalizable?
i) [[1, 3, 2, 1], [0, 2, 3, 1], [0, 0, −1, 1], [0, 0, 0, 4]], ii) [[1, 0, −1], [0, 0, 1], [0, 2, 0]], iii) [[5, 6], [−3, −4]] and iv) [[0, i], [−i, 0]].

Exercise 9
1. Let A be a square matrix such that UAU* is a diagonal matrix for some unitary matrix U. Prove that A is a normal matrix.
2. Let A ∈ Mₙ(ℂ). Then A = (1/2)(A + A*) + (1/2)(A − A*), where (1/2)(A + A*) is the Hermitian part of A and (1/2)(A − A*) is the skew-Hermitian part of A. Recall that a similar result was given in Exercise ??.??.
3. Every square matrix can be uniquely expressed as A = S + iT, where both S and T are Hermitian matrices.
4. Let A ∈ Mₙ(ℂ). Prove that A − A* is always skew-Hermitian.
5. Does there exist a unitary matrix U such that U⁻¹AU = B, where
A = [[1, 1, 4], [0, 2, 2], [0, 0, 3]] and B = [[2, −1, 3√2], [0, 1, √2], [0, 0, 3]]?

Exercise 10
1. Let A be a skew-Hermitian matrix. Then the eigenvalues of A are either zero or purely imaginary. Also, the eigenvectors corresponding to distinct eigenvalues are mutually orthogonal. [Hint: carefully see the proof of Theorem ??.]
2. Let A be a normal matrix with (λ, x) as an eigen-pair. Then
   (a) (A*)ᵏx for k ∈ ℤ⁺ is also an eigenvector corresponding to λ.
   (b) (λ̄, x) is an eigen-pair for A*. [Hint: verify ‖A*x − λ̄x‖ = ‖Ax − λx‖.]
3. Let A be an n × n unitary matrix. Then
   (a) the rows of A form an orthonormal basis of ℂⁿ.
   (b) the columns of A form an orthonormal basis of ℂⁿ.
   (c) for any two vectors x, y ∈ ℂ^{n×1}, ⟨Ax, Ay⟩ = ⟨x, y⟩.
   (d) for any vector x ∈ ℂ^{n×1}, ‖Ax‖ = ‖x‖.
   (e) |λ| = 1 for any eigenvalue λ of A.
   (f) the eigenvectors x, y corresponding to distinct eigenvalues λ and μ satisfy ⟨x, y⟩ = 0. That is, if (λ, x) and (μ, y) are eigen-pairs with λ ≠ μ, then x and y are mutually orthogonal.
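Properties (c) and (e) of item 3 can be observed numerically for a random unitary matrix. A sketch (assuming NumPy; the QR factorization of a complex Gaussian matrix produces a unitary Q):

    import numpy as np

    rng = np.random.default_rng(1)
    Z = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
    U, _ = np.linalg.qr(Z)                      # a random 4 x 4 unitary matrix

    print(np.abs(np.linalg.eigvals(U)))         # all entries are 1, as in (e)
    x, y = rng.standard_normal(4), rng.standard_normal(4)
    print(np.vdot(U @ x, U @ y), np.vdot(x, y)) # equal, as in (c)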

4. Show that the matrices A = [[4, 4], [0, 4]] and B = [[10, 9], [−4, −2]] are similar. Is it possible to find a unitary matrix U such that A = U*BU?
5. Let A be a 2 × 2 orthogonal matrix. Then prove the following:
   (a) if det(A) = 1, then A = [[cos θ, −sin θ], [sin θ, cos θ]] for some θ, 0 ≤ θ < 2π. That is, A counterclockwise rotates every point in ℝ² by an angle θ.
   (b) if det(A) = −1, then A = [[cos θ, sin θ], [sin θ, −cos θ]] for some θ, 0 ≤ θ < 2π. That is, A reflects every point in ℝ² about a line passing through the origin. Determine this line. Or equivalently, there exists a non-singular matrix P such that P⁻¹AP = [[1, 0], [0, −1]].
6. Let A be a 3 × 3 orthogonal matrix. Then prove the following:
   (a) if det(A) = 1, then A is a rotation about a fixed axis, in the sense that A has an eigen-pair (1, x) such that the restriction of A to the plane x^⊥ is a two-dimensional rotation of x^⊥.
   (b) if det(A) = −1, then A corresponds to a reflection through a plane P, followed by a rotation about the line through the origin that is orthogonal to P.
7. Let A = [[2, 1, 1], [1, 2, 1], [1, 1, 2]]. Find a non-singular matrix P such that P⁻¹AP = diag(4, 1, 1). Use this to compute A^{301}.
8. Let A be a Hermitian matrix. Then prove that rank(A) equals the number of non-zero eigenvalues of A.

Exercise 11
1. Let A ∈ Mₙ(ℝ) be an invertible matrix. Prove that AAᵗ = PDPᵗ, where P is an orthogonal matrix and D is a diagonal matrix with positive diagonal entries.
2. Let A = [[1, 1, 1], [0, 2, 1], [0, 0, 3]], B = [[2, −1, √2], [0, 1, 0], [0, 0, 3]] and U = (1/√2)·[[1, 1, 0], [1, −1, 0], [0, 0, √2]]. Prove that A and B are unitarily equivalent via the unitary matrix U (a numerical check appears in the sketch below). Hence, conclude that the upper triangular matrix obtained in Schur's Lemma need not be unique.
3. Prove Remark ??.
4. Let A be a normal matrix. If all the eigenvalues of A are 0, then prove that A = 0. What happens if all the eigenvalues of A are 1?
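The unitary equivalence claimed in item 2 can be verified directly. A sketch (assuming NumPy, with the matrices of item 2):

    import numpy as np

    A = np.array([[1, 1, 1], [0, 2, 1], [0, 0, 3]], dtype=float)
    B = np.array([[2, -1, np.sqrt(2)], [0, 1, 0], [0, 0, 3]])
    U = np.array([[1, 1, 0], [1, -1, 0], [0, 0, np.sqrt(2)]]) / np.sqrt(2)

    print(np.allclose(U.conj().T @ U, np.eye(3)))  # U is unitary (here real orthogonal)
    print(np.allclose(U.conj().T @ A @ U, B))      # U* A U = B: two distinct Schur forms of A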

5. Let A be an n × n matrix. Prove that if A is
   (a) Hermitian and x*Ax = 0 for all x ∈ ℂⁿ, then A = 0.
   (b) a real, symmetric matrix and xAxᵗ = 0 for all x ∈ ℝⁿ, then A = 0.
Do these results hold for arbitrary matrices?

We complete this chapter by understanding the graph of ax² + 2hxy + by² + 2gx + 2fy + c = 0 for a, b, c, f, g, h ∈ ℝ. We first look at the following example.

Example 12. Sketch the graph of 3x² + 4xy + 3y² = 5.
Solution: Note that 3x² + 4xy + 3y² = [x, y]·[[3, 2], [2, 3]]·[x, y]ᵗ and the eigen-pairs of the matrix [[3, 2], [2, 3]] are (5, (1, 1)ᵗ) and (1, (1, −1)ᵗ). Thus,
[[3, 2], [2, 3]] = (1/√2)·[[1, 1], [1, −1]] · [[5, 0], [0, 1]] · (1/√2)·[[1, 1], [1, −1]].
Now, let u = (x + y)/√2 and v = (x − y)/√2. Then
3x² + 4xy + 3y² = [x, y]·[[3, 2], [2, 3]]·[x, y]ᵗ = [u, v]·[[5, 0], [0, 1]]·[u, v]ᵗ = 5u² + v².
Thus, the given graph reduces to 5u² + v² = 5, or equivalently to u² + v²/5 = 1. Therefore, the given graph represents an ellipse with the principal axes u = 0 and v = 0 (corresponding to the lines x + y = 0 and x − y = 0, respectively). See Figure 1.

Figure 1: The ellipse 3x² + 4xy + 3y² = 5, with the lines y = x and y = −x as its principal axes.
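The diagonalization used in the solution can be reproduced numerically. A sketch (assuming NumPy; eigh returns the orthonormal eigenvectors that define the principal axes):

    import numpy as np

    M = np.array([[3.0, 2.0], [2.0, 3.0]])  # matrix of the form 3x^2 + 4xy + 3y^2
    vals, P = np.linalg.eigh(M)             # eigenvalues in increasing order: [1, 5]
    print(vals)
    print(np.round(P.T @ M @ P, 10))        # diag(1, 5): the rotated form is 5u^2 + v^2
    print(P)                                # columns along (1, -1) and (1, 1), up to sign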

We now consider the general conic. We obtain conditions on the eigenvalues of the associated quadratic form, defined below, to characterize conic sections in ℝ² (endowed with the standard inner product).

Definition 13 (Quadratic Form). Let ax² + 2hxy + by² + 2gx + 2fy + c = 0 be the equation of a general conic. The quadratic expression
ax² + 2hxy + by² = [x, y]·[[a, h], [h, b]]·[x, y]ᵗ
is called the quadratic form associated with the given conic.

Theorem 14. For fixed real numbers a, b, c, g, f and h, consider the general conic
ax² + 2hxy + by² + 2gx + 2fy + c = 0.
Then this conic represents
1. an ellipse if ab − h² > 0,
2. a parabola if ab − h² = 0, and
3. a hyperbola if ab − h² < 0.

Proof: Let A = [[a, h], [h, b]]. Then ax² + 2hxy + by² = [x, y]·A·[x, y]ᵗ is the associated quadratic form. As A is a symmetric matrix, by Corollary ??, the eigenvalues λ₁, λ₂ of A are both real, the corresponding eigenvectors u₁, u₂ are orthonormal, and A is unitarily diagonalizable with
A = [u₁ u₂]·[[λ₁, 0], [0, λ₂]]·[u₁ u₂]ᵗ.
Let [u, v]ᵗ = [u₁ u₂]ᵗ·[x, y]ᵗ. Then ax² + 2hxy + by² = λ₁u² + λ₂v², and the equation of the conic section in the (u, v)-plane reduces to
λ₁u² + λ₂v² + 2g₁u + 2f₁v + c = 0.   (1)
Now, depending on the eigenvalues λ₁, λ₂, we consider different cases:
1. λ₁ = 0 = λ₂. Substituting λ₁ = λ₂ = 0 in Equation (1) gives the straight line 2g₁u + 2f₁v + c = 0 in the (u, v)-plane.
2. λ₁ = 0, λ₂ > 0. As λ₁ = 0, we have ab − h² = det(A) = 0. Also, in this case, Equation (1) reduces to λ₂(v + d₁)² = d₂u + d₃ for some d₁, d₂, d₃ ∈ ℝ. To understand this case, we need to consider the following subcases:
(a) Let d₂ = d₃ = 0. Then v + d₁ = 0 is a pair of coincident lines.

(b) Let d₂ = 0, d₃ ≠ 0.
   i. If d₃ > 0, then we get a pair of parallel lines given by v = −d₁ ± √(d₃/λ₂).
   ii. If d₃ < 0, the solution set of the corresponding conic is an empty set.
(c) If d₂ ≠ 0, then the given equation is of the form Y² = 4aX for some translates X = u + α and Y = v + β, and thus represents a parabola.
3. λ₁ > 0 and λ₂ < 0. In this case, ab − h² = det(A) = λ₁λ₂ < 0. Let λ₂ = −α² with α > 0. Then Equation (1) can be rewritten as
λ₁(u + d₁)² − α²(v + d₂)² = d₃ for some d₁, d₂, d₃ ∈ ℝ,   (2)
whose understanding requires the following subcases:
(a) Let d₃ = 0. Then Equation (2) reduces to
(√λ₁(u + d₁) + α(v + d₂)) · (√λ₁(u + d₁) − α(v + d₂)) = 0,
or equivalently, a pair of intersecting straight lines in the (u, v)-plane.
(b) Let d₃ ≠ 0. In particular, let d₃ > 0. Then Equation (2) reduces to
λ₁(u + d₁)²/d₃ − α²(v + d₂)²/d₃ = 1,
or equivalently, a hyperbola in the (u, v)-plane with principal axes u + d₁ = 0 and v + d₂ = 0.
4. λ₁, λ₂ > 0. In this case, ab − h² = det(A) = λ₁λ₂ > 0 and Equation (1) can be rewritten as
λ₁(u + d₁)² + λ₂(v + d₂)² = d₃ for some d₁, d₂, d₃ ∈ ℝ.   (3)
We consider the following three subcases to understand this.
(a) Let d₃ = 0. Then Equation (3) reduces to the single point (−d₁, −d₂), the intersection of the two perpendicular lines u + d₁ = 0 and v + d₂ = 0 in the (u, v)-plane.
(b) Let d₃ < 0. Then the solution set of Equation (3) is an empty set.
(c) Let d₃ > 0. Then Equation (3) reduces to the ellipse
λ₁(u + d₁)²/d₃ + λ₂(v + d₂)²/d₃ = 1,
whose principal axes are u + d₁ = 0 and v + d₂ = 0.

Remark 15. Observe that the condition [x, y]ᵗ = [u₁ u₂]·[u, v]ᵗ implies that the principal axes of the conic are functions of the eigenvectors u₁ and u₂.
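The trichotomy of Theorem 14 depends only on the sign of ab − h² = det [[a, h], [h, b]]. A small helper for the generic, non-degenerate cases (a sketch in Python; the function name is ours):

    def conic_type(a, h, b):
        """Classify the quadratic part a x^2 + 2h xy + b y^2 by the sign of ab - h^2."""
        d = a * b - h * h  # determinant of [[a, h], [h, b]], the product of the eigenvalues
        if d > 0:
            return "ellipse"
        if d == 0:
            return "parabola"
        return "hyperbola"

    print(conic_type(3, 2, 3))  # ellipse, matching Example 12
    print(conic_type(1, 1, 1))  # parabola
    print(conic_type(1, 2, 1))  # hyperbola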

Exercise 16. Sketch the graphs of the following conics:
1. x² + 2xy + y² − 6x − 10y = 3.
2. 2x² + 6xy + 3y² − 12x − 6y = 5.
3. 4x² − 4xy + y² + 12x − 8y = 10.
4. 2x² − 6xy + 5y² − 10x + 4y = 7.

As a last application, we consider the following problem that helps us in understanding the quadrics. Let
ax² + by² + cz² + 2dxy + 2exz + 2fyz + lx + my + nz + q = 0   (4)
be a general quadric. Then, to get a geometrical idea of this quadric, do the following:
1. Define A = [[a, d, e], [d, b, f], [e, f, c]], b = (l, m, n)ᵗ and x = (x, y, z)ᵗ. Note that Equation (4) can be rewritten as xᵗAx + bᵗx + q = 0.
2. As A is symmetric, find an orthogonal matrix P such that PᵗAP = diag(λ₁, λ₂, λ₃).
3. Let y = Pᵗx = (y₁, y₂, y₃)ᵗ. Then Equation (4) reduces to
λ₁y₁² + λ₂y₂² + λ₃y₃² + l₁y₁ + l₂y₂ + l₃y₃ + q = 0.   (5)
4. Depending on which λᵢ ≠ 0, complete the squares in Equation (5). That is, if λ₁ ≠ 0, then rewrite λ₁y₁² + l₁y₁ as λ₁(y₁ + l₁/(2λ₁))² − l₁²/(4λ₁).
5. Use the condition x = Py to determine the centre and the planes of symmetry of the quadric in terms of the original system.

Example 17. Determine the following quadrics:
1. 2x² + 2y² + 2z² + 2xy + 2xz + 2yz + 4x + 2y + 4z + 2 = 0.
2. 3x² − y² + z² + 10 = 0.
Solution: For Part 1, observe that A = [[2, 1, 1], [1, 2, 1], [1, 1, 2]], b = (4, 2, 4)ᵗ and q = 2. Also, the orthonormal matrix P = [[1/√3, 1/√2, 1/√6], [1/√3, −1/√2, 1/√6], [1/√3, 0, −2/√6]] gives PᵗAP = diag(4, 1, 1). Hence, the quadric reduces to
4y₁² + y₂² + y₃² + (10/√3)y₁ + √2·y₂ − (2/√6)y₃ + 2 = 0,
or equivalently to
4(y₁ + 5/(4√3))² + (y₂ + 1/√2)² + (y₃ − 1/√6)² = 9/12.

So, the standard form of the quadric is 4z₁² + z₂² + z₃² = 9/12, an ellipsoid whose centre is given by
(x, y, z)ᵗ = P·(−5/(4√3), −1/√2, 1/√6)ᵗ = (−3/4, 1/4, −3/4)ᵗ.
For Part 2, observe that A = [[3, 0, 0], [0, −1, 0], [0, 0, 1]], b = 0 and q = 10. In this case, we can rewrite the quadric as
y²/10 − 3x²/10 − z²/10 = 1,
which is the equation of a hyperboloid consisting of two sheets. The calculation of the planes of symmetry is left as an exercise to the reader.
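The five-step reduction above mechanizes directly. A sketch (assuming NumPy) applied to Part 1 of Example 17, recovering the diagonal form and the centre computed there:

    import numpy as np

    # Part 1: x^t A x + b^t x + q = 0 with the data read off in the solution.
    A = np.array([[2.0, 1.0, 1.0], [1.0, 2.0, 1.0], [1.0, 1.0, 2.0]])
    b = np.array([4.0, 2.0, 4.0])
    q = 2.0

    lam, P = np.linalg.eigh(A)           # steps 2-3: P^t A P = diag(lam), here lam = [1, 1, 4]
    l = P.T @ b                          # linear terms in the rotated coordinates
    y0 = -l / (2 * lam)                  # step 4: completing the square in each y_i
    print(P @ y0)                        # step 5: the centre, [-0.75  0.25 -0.75]
    print(np.sum(l**2 / (4 * lam)) - q)  # right-hand side of the standard form: 0.75 = 9/12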