3 Orthogonal Vectors and Matrices




The linear algebra portion of this course focuses on three matrix factorizations: the QR factorization, the singular value decomposition (SVD), and the LU factorization. The first two of these factorizations involve orthogonal matrices, which play a fundamental role in many numerical methods. This lecture first considers orthogonal vectors and then defines orthogonal matrices.

3.1 Orthogonal Vectors

A pair of vectors u, v in R^m is said to be orthogonal if

    (u, v) = 0.

In view of the formula for the angle between two vectors from a previous lecture, orthogonal vectors meet at a right angle. The zero vector 0 is orthogonal to all vectors, but we are more interested in nonvanishing orthogonal vectors.

A set of vectors S_n = {v_j}_{j=1}^n in R^m is said to be orthonormal if each pair of distinct vectors in S_n is orthogonal and all vectors in S_n are of unit length, i.e., if

    (v_j, v_k) = 0 for j ≠ k,    (v_j, v_k) = 1 for j = k.    (1)

Here we have used that (v_k, v_k) = ||v_k||^2. It is not difficult to show that orthonormal vectors are linearly independent; see Exercise 3.1 below. It follows that the m vectors of an orthonormal set S_m in R^m form a basis for R^m.

Example 3.1. The set S_3 = {e_j}_{j=1}^3 in R^5 is orthonormal, where the e_j are the axis vectors introduced in a previous lecture.

Example 3.2. The set S_2 = {v_1, v_2} in R^2, with v_1 = (1/√2)[1, 1]^T and v_2 = (1/√2)[1, −1]^T, is orthonormal. Moreover, the set S_2 forms a basis for R^2.

An arbitrary vector v in R^m can be decomposed into orthogonal components. Consider the set S_n = {v_j}_{j=1}^n of orthonormal vectors in R^m, and regard the expression

    r = v − Σ_{j=1}^n (v_j, v) v_j.    (2)

The vector (v_j, v) v_j is referred to as the orthogonal component of v in the direction v_j. Moreover, the vector r is orthogonal to the vectors v_j. This can be seen by computing the inner products (v_k, r) for all k. We obtain

    (v_k, r) = (v_k, v − Σ_{j=1}^n (v_j, v) v_j) = (v_k, v) − Σ_{j=1}^n (v_j, v)(v_k, v_j).

Using (1), the sum on the right-hand side simplifies to

    Σ_{j=1}^n (v_j, v)(v_k, v_j) = (v_k, v),

which shows that (v_k, r) = 0. Thus, v can be expressed as a sum of the mutually orthogonal vectors r, v_1, ..., v_n:

    v = r + Σ_{j=1}^n (v_j, v) v_j.

Example 3.3. Let v = [1, 2, 3, 4, 5]^T and let the set S_3 be the same as in Example 3.1. Then r = [0, 0, 0, 4, 5]^T. Clearly, r is orthogonal to the axis vectors e_1, e_2, e_3.

Exercise 3.1. Let S_n = {v_j}_{j=1}^n be a set of orthonormal vectors in R^m. Show that the vectors v_1, v_2, ..., v_n are linearly independent. Hint: Assume this is not the case. For instance, assume that v_1 is a linear combination of the vectors v_2, v_3, ..., v_n, and apply (1).

3.2 Orthogonal Matrices

A square matrix Q = [q_1, q_2, ..., q_m] in R^{m×m} is said to be orthogonal if its columns {q_j}_{j=1}^m form an orthonormal basis of R^m. Since the columns q_1, q_2, ..., q_m are linearly independent, cf. Exercise 3.1, the matrix Q is nonsingular. Thus, Q has an inverse, which we denote by Q^{-1}. It follows from the orthonormality of the columns of Q that

    Q^T Q = [ (q_1, q_1)  (q_1, q_2)  ...  (q_1, q_m)
              (q_2, q_1)  (q_2, q_2)  ...  (q_2, q_m)
                 ...         ...      ...     ...
              (q_m, q_1)  (q_m, q_2)  ...  (q_m, q_m) ] = I,

where I denotes the identity matrix. Multiplying the above expression by the inverse Q^{-1} from the right-hand side shows that

    Q^T = Q^{-1}.

Thus, the transpose of an orthogonal matrix is its inverse.

Example 3.4. The identity matrix I is orthogonal.
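The two facts established so far, the orthogonal decomposition of Example 3.3 and the identity Q^T Q = I, are easy to check numerically. The following sketch uses NumPy (our choice of tool here, not part of the notes); the rotation matrix Q is merely one convenient example of an orthogonal matrix.

```python
import numpy as np

# Example 3.3: decompose v against the orthonormal set {e_1, e_2, e_3} in R^5.
V = np.eye(5)[:, :3]                  # columns are the axis vectors e_1, e_2, e_3
v = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
r = v - sum(V[:, j].dot(v) * V[:, j] for j in range(3))
assert np.allclose(r, [0, 0, 0, 4, 5])   # r as in Example 3.3
assert np.allclose(V.T.dot(r), 0)        # r is orthogonal to each e_j

# An orthogonal matrix: its columns are orthonormal, so Q^T Q = I and Q^{-1} = Q^T.
t = 0.7                                  # arbitrary rotation angle
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.allclose(np.linalg.inv(Q), Q.T)
```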

Example 3.5. The matrix

    Q = [ 1/√2   1/√2
          1/√2  −1/√2 ]

is orthogonal. Its inverse is its transpose,

    Q^{-1} = Q^T = [ 1/√2   1/√2
                     1/√2  −1/√2 ].

Geometrically, multiplying a vector by an orthogonal matrix reflects the vector in some plane and/or rotates it; in particular, the multiplication does not change the vector's length. Thus, the norm of a vector u is invariant under multiplication by an orthogonal matrix Q, i.e.,

    ||Qu|| = ||u||.    (3)

This can be seen by using basic properties of the transpose and of the inner product from a previous lecture. We have

    ||Qu||^2 = (Qu)^T (Qu) = u^T Q^T (Qu) = u^T (Q^T Q) u = u^T u = ||u||^2.

Taking square roots of the left-hand and right-hand sides, and using that the norm is nonnegative, yields equation (3).

3.3 Householder Matrices

Matrices of the form

    H = I − ρ u u^T ∈ R^{m×m},   u ≠ 0,   ρ = 2 / (u^T u),    (4)

are known as Householder matrices. They are used in numerical methods for least-squares approximation and eigenvalue computations. We will discuss the former application in the next lecture. Householder matrices are symmetric, i.e., H = H^T, and orthogonal. The latter property follows from

    H^T H = H^2 = (I − ρuu^T)(I − ρuu^T)
          = I − ρuu^T − ρuu^T + (ρuu^T)(ρuu^T)
          = I − ρuu^T − ρuu^T + ρu(ρu^T u)u^T
          = I − ρuu^T − ρuu^T + 2ρuu^T = I,

where we have used that ρ u^T u = 2; cf. (4).

Our interest in Householder matrices stems from the fact that they are orthogonal, and that the vector u in their definition can be chosen so that an arbitrary (but fixed) vector w ∈ R^m \ {0} is mapped by H onto a multiple of the axis vector e_1. We will now show how this can be done. Let w ≠ 0 be given. We would like

    Hw = σ e_1    (5)

for some scalar σ. It follows from (3) that

    ||w|| = ||Hw|| = ||σ e_1|| = |σ| ||e_1|| = |σ|.

Therefore,

    σ = ±||w||.    (6)

Moreover, using the definition (4) of H, we obtain

    σ e_1 = Hw = (I − ρuu^T)w = w − τu,   τ = ρ u^T w,

from which it follows that

    τu = w − σ e_1.

The matrix H is independent of the scaling factor τ in the sense that the entries of H do not change if we replace τu by u. We therefore may choose

    u = w − σ e_1.    (7)

This choice of u, together with either one of the choices (6) of σ, gives a Householder matrix that satisfies (5). Nevertheless, in finite precision arithmetic the choice of sign in (6) may be important. To see this, let w = [w_1, w_2, ..., w_m]^T and write the vector (7) in the form

    u = [ w_1 − σ
          w_2
          w_3
          ...
          w_m ].

If the components w_j for j ≥ 2 are of small magnitude compared to w_1, then ||w|| ≈ |w_1| and, therefore, the first component of u satisfies

    u_1 = w_1 − σ = w_1 ∓ ||w|| ≈ w_1 ∓ |w_1|.    (8)

We would like to avoid the situation that |w_1| is large and |u_1| is small, because then u_1 is determined with low relative accuracy; see Exercise 3.3 below. We therefore let

    σ = −sign(w_1) ||w||,    (9)

which yields

    u_1 = w_1 + sign(w_1) ||w||.    (10)

Thus, u_1 is computed by adding numbers of the same sign.

Example 3.6. Let w = [1, 1, 1, 1]^T. We are interested in determining the Householder matrix H that maps w onto a multiple of e_1. The parameter σ in (5) satisfies |σ| = ||w|| = 2, i.e., σ is 2 or −2. The vector u in (4) is given by

    u = w − σ e_1.

In accordance with (9), we choose σ negative, because the first entry of w is positive; the choice σ = −2 yields u = [3, 1, 1, 1]^T. Finally, we obtain ρ = 2/||u||^2 = 1/6 and

    H = I − ρuu^T = [ −1/2  −1/2  −1/2  −1/2
                      −1/2   5/6  −1/6  −1/6
                      −1/2  −1/6   5/6  −1/6
                      −1/2  −1/6  −1/6   5/6 ].
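The sign choice (9), (10) takes only a few lines to implement; the following NumPy sketch (the helper name householder_vector is ours, for illustration) reproduces the numbers of Example 3.6.

```python
import numpy as np

def householder_vector(w):
    """Return u, rho, sigma with (I - rho*u*u^T) w = sigma*e_1, sigma chosen as in (9)."""
    normw = np.linalg.norm(w)
    sign = 1.0 if w[0] >= 0 else -1.0    # treat w_1 = 0 as positive
    sigma = -sign * normw                # (9): sigma = -sign(w_1) ||w||
    u = w.astype(float).copy()
    u[0] = w[0] + sign * normw           # (10): u_1 = w_1 + sign(w_1) ||w||, i.e. u = w - sigma*e_1
    rho = 2.0 / u.dot(u)
    return u, rho, sigma

# Example 3.6: w = [1,1,1,1]^T gives sigma = -2, u = [3,1,1,1]^T, rho = 1/6.
w = np.array([1.0, 1.0, 1.0, 1.0])
u, rho, sigma = householder_vector(w)
H = np.eye(4) - rho * np.outer(u, u)
assert sigma == -2.0
assert np.allclose(u, [3, 1, 1, 1])
assert np.allclose(H @ w, [-2, 0, 0, 0])   # Hw = sigma * e_1
```

Note that u_1 is formed by adding two numbers of the same sign, so no cancellation can occur.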

It is easy to verify that H is orthogonal and that it maps w to −2e_1.

In most applications it is not necessary to form the Householder matrix H explicitly. A matrix-vector product with a vector v is computed by using the definition (4) of H, i.e.,

    Hv = (I − ρuu^T)v = v − (ρ u^T v) u.

The right-hand side is evaluated by first computing the scalar τ = ρ u^T v and then computing the vector scaling and addition v − τu. This way of evaluating Hv requires fewer arithmetic floating-point operations than straightforward computation of the matrix-vector product; see Exercise 3.6. Moreover, the entries of H do not have to be stored, only the vector u and the scalar ρ. The savings in arithmetic operations and storage are important for large problems.

Exercise 3.2. Let w = [1, 2, 3]^T. Determine the Householder matrix that maps w to a multiple of e_1. Only the vector u in (4) has to be computed.

Exercise 3.3. This exercise illustrates the importance of the choice of the sign of σ in (6). Let w = [1, 0.5·10^{−8}]^T and let u be the vector in the definition (4) of Householder matrices, chosen such that Hw = σe_1. MATLAB yields

>> w=[1; 0.5e-8]
w =
    1.0000
    0.0000
>> sigma=norm(w)
sigma =
    1
>> u=w(1)+sigma
u =
    2
>> u=w(1)-sigma
u =
    0

where the values of u shown denote the computed approximations of the first component, u_1, of the vector u. How large are the absolute and relative errors in these computed approximations of u_1?

Exercise 3.4. Show that the product U_1 U_2 of two orthogonal matrices is an orthogonal matrix. Is the product of k > 2 orthogonal matrices an orthogonal matrix?

Exercise 3.5. Let Q be an orthogonal matrix, i.e., Q^T Q = I. Show that QQ^T = I.

Exercise 3.6. What is the count of arithmetic floating-point operations for evaluating a matrix-vector product with an n × n Householder matrix H when the representation (4) of H is used? Only u and ρ are stored, not the entries of H. What is the count of arithmetic floating-point operations for evaluating a matrix-vector product with H when the entries of H (but not u and ρ) are available? The correct order of magnitude of the arithmetic work for large n suffices.

Exercise 3.7. Let the nonvanishing m-vectors w_1 and w_2 be given. Determine an orthogonal matrix H such that w_2 = Hw_1. Hint: Use Householder matrices.
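As a complement to Exercise 3.6, the implicit evaluation Hv = v − (ρ u^T v) u can be compared against the explicit matrix-vector product in a short NumPy sketch (the helper name apply_householder is ours): the two agree, but the implicit form needs only the vector u and the scalar ρ, with work and storage proportional to n rather than n^2.

```python
import numpy as np

def apply_householder(u, rho, v):
    """Evaluate Hv = v - (rho * u^T v) * u without forming H: O(n) work and storage."""
    tau = rho * u.dot(v)
    return v - tau * u

rng = np.random.default_rng(1)
u = rng.standard_normal(6)
rho = 2.0 / u.dot(u)
v = rng.standard_normal(6)

H = np.eye(6) - rho * np.outer(u, u)     # explicit H: O(n^2) storage and work
assert np.allclose(apply_householder(u, rho, v), H @ v)
```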