# Orthogonal Matrices (Part 2)



If $u$ and $v$ are nonzero vectors, then $u \cdot v = \|u\|\,\|v\|\cos(\theta)$ is $0$ if and only if $\cos(\theta) = 0$, i.e., $\theta = 90^\circ$. Hence, we say that two vectors $u$ and $v$ are perpendicular or orthogonal (in symbols $u \perp v$) if $u \cdot v = 0$. A vector $u$ is a unit vector if $\|u\| = 1$.

Let $T : \mathbb{R}^2 \to \mathbb{R}^2$ be rotation around the origin by $\theta$ degrees. This is a linear transformation. To see this, consider the parallelogram diagram for the addition of $u$ and $v$, and then rotate the vectors, i.e., apply $T$. The rotated diagram shows that $T(u) + T(v) = T(u + v)$. It's even easier to

check that $T(sv) = sT(v)$, so $T$ is linear. We need to find $T(e_1)$ and $T(e_2)$. If we rotate $e_1$ by $\theta$, we get $(\cos(\theta), \sin(\theta))$ by the unit-circle definition of $\sin$ and $\cos$. Recall the addition laws for $\sin$ and $\cos$:

$$\cos(A + B) = \cos(A)\cos(B) - \sin(A)\sin(B)$$
$$\cos(A - B) = \cos(A)\cos(B) + \sin(A)\sin(B)$$
$$\sin(A + B) = \sin(A)\cos(B) + \cos(A)\sin(B)$$
$$\sin(A - B) = \sin(A)\cos(B) - \cos(A)\sin(B)$$

Recall also that $\cos(-\theta) = \cos(\theta)$ and $\sin(-\theta) = -\sin(\theta)$.

The vector $w = (-\sin(\theta), \cos(\theta))$ is a unit vector and $w \cdot T(e_1) = 0$, so $T(e_2)$ must be either $w$ or $-w$. Considering the sign of the first component shows that $T(e_2) = w$. Thus, the matrix of $T$ is

$$R(\theta) = \begin{pmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{pmatrix}.$$

Exercise: Rotating by $\theta$ and then by $\varphi$ should be the same as rotating by $\theta + \varphi$. Thus, we have $R(\theta)R(\varphi) = R(\theta + \varphi) = R(\varphi)R(\theta)$. Use this and matrix multiplication to derive the addition laws for $\sin$ and $\cos$. What is $R(\theta)^{-1}$?

Exercise: What is the matrix of reflection through the $x$-axis?
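These properties of rotation matrices are easy to check numerically. Below is a minimal sketch in Python with NumPy (angles in radians rather than degrees; the helper `R` is mine, not from the notes):

```python
import numpy as np

def R(theta):
    """Rotation of the plane by theta radians, counterclockwise about the origin."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

a, b = 0.7, 1.9
# Composing rotations adds the angles: R(a) R(b) = R(a + b) = R(b) R(a).
assert np.allclose(R(a) @ R(b), R(a + b))
assert np.allclose(R(a) @ R(b), R(b) @ R(a))

# R(a)^{-1} is rotation the other way, R(-a), which is also the transpose of R(a).
assert np.allclose(np.linalg.inv(R(a)), R(-a))
assert np.allclose(R(-a), R(a).T)
```

Multiplying out $R(\theta)R(\varphi)$ symbolically and comparing entries with $R(\theta + \varphi)$ is exactly the exercise above.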

A matrix $A$ is orthogonal if $\|Av\| = \|v\|$ for all vectors $v$. If $A$ and $B$ are orthogonal, so is $AB$. If $A$ is orthogonal, so is $A^{-1}$. Clearly $I$ is orthogonal. Rotation matrices are orthogonal. The set of orthogonal $2 \times 2$ matrices is denoted by $O(2)$.

## Classification of Orthogonal Matrices

If $A$ is orthogonal, then $Au \cdot Av = u \cdot v$ for all vectors $u$ and $v$. (It follows that $A$ preserves the angles between vectors.) To see this, recall that

$$2u \cdot v = \|u\|^2 + \|v\|^2 - \|u - v\|^2.$$

Hence, if $A$ is orthogonal, we have

$$2Au \cdot Av = \|Au\|^2 + \|Av\|^2 - \|Au - Av\|^2 = \|Au\|^2 + \|Av\|^2 - \|A(u - v)\|^2 = \|u\|^2 + \|v\|^2 - \|u - v\|^2 = 2u \cdot v,$$

so $Au \cdot Av = u \cdot v$.

If $A$ is orthogonal, we must have

$$\|Ae_1\|^2 = Ae_1 \cdot Ae_1 = e_1 \cdot e_1 = 1$$
$$\|Ae_2\|^2 = Ae_2 \cdot Ae_2 = e_2 \cdot e_2 = 1 \qquad (*)$$
$$Ae_1 \cdot Ae_2 = e_1 \cdot e_2 = 0.$$

Exercise: If $A$ satisfies $(*)$, then $A$ is orthogonal.

Equations $(*)$ say that the columns of $A$ are orthogonal unit vectors. Thus, if the first column of $A$ is $\binom{a}{b}$ with $a^2 + b^2 = 1$,

the possibilities for the second column are $\binom{-b}{a}$ or $\binom{b}{-a}$. So $A$ must look like one of the matrices

$$\begin{pmatrix} a & -b \\ b & a \end{pmatrix}, \qquad \begin{pmatrix} a & b \\ b & -a \end{pmatrix}.$$

Since $a^2 + b^2 = 1$, we can find $\theta$ so that $a = \cos(\theta)$ and $b = \sin(\theta)$. Thus, in one case we get

$$A = \begin{pmatrix} a & -b \\ b & a \end{pmatrix} = \begin{pmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{pmatrix} = R(\theta),$$

so $A$ is rotation by $\theta$ degrees. In the other case we have

$$A = \begin{pmatrix} a & b \\ b & -a \end{pmatrix} = \begin{pmatrix} \cos(\theta) & \sin(\theta) \\ \sin(\theta) & -\cos(\theta) \end{pmatrix}.$$

Call this matrix $S(\theta)$. What does this represent geometrically? We claim that there is a line $L$ through the origin so that $A = S(\theta)$ leaves every point on $L$ fixed. To see this, it will suffice to find a unit vector $u = \binom{\cos(\varphi)}{\sin(\varphi)}$ so that $Au = u$. If we do this, any other vector along $L$ will have the form $tu$ for some scalar $t$, and we will have $A(tu) = tAu = tu$. Since we only need to consider each line through the origin once, we can suppose $0 \le \varphi < 180^\circ$. We may as well suppose that $0 \le \theta < 360^\circ$. Thus, we are trying to find $\varphi$ so that $Au = u$, i.e.,

$$\begin{pmatrix} \cos(\theta) & \sin(\theta) \\ \sin(\theta) & -\cos(\theta) \end{pmatrix}\begin{pmatrix} \cos(\varphi) \\ \sin(\varphi) \end{pmatrix} = \begin{pmatrix} \cos(\varphi) \\ \sin(\varphi) \end{pmatrix}.$$

If we carry out the multiplication, this becomes

$$\cos(\theta)\cos(\varphi) + \sin(\theta)\sin(\varphi) = \cos(\varphi)$$
$$\sin(\theta)\cos(\varphi) - \cos(\theta)\sin(\varphi) = \sin(\varphi).$$

Using the addition laws, this is the same as

$$\cos(\theta - \varphi) = \cos(\varphi)$$
$$\sin(\theta - \varphi) = \sin(\varphi).$$

Because of our angle restrictions, we must have $\theta - \varphi = \varphi$, so $\varphi = \theta/2$. Thus, $Au = u$ for

$$u = \begin{pmatrix} \cos(\theta/2) \\ \sin(\theta/2) \end{pmatrix}.$$

Consider the vector

$$v = \begin{pmatrix} -\sin(\theta/2) \\ \cos(\theta/2) \end{pmatrix},$$

which is a unit vector orthogonal to $u$. Calculate $Av$ as follows:

$$Av = \begin{pmatrix} \cos(\theta) & \sin(\theta) \\ \sin(\theta) & -\cos(\theta) \end{pmatrix}\begin{pmatrix} -\sin(\theta/2) \\ \cos(\theta/2) \end{pmatrix} = \begin{pmatrix} -\cos(\theta)\sin(\theta/2) + \sin(\theta)\cos(\theta/2) \\ -\sin(\theta)\sin(\theta/2) - \cos(\theta)\cos(\theta/2) \end{pmatrix} = \begin{pmatrix} \sin(\theta - \theta/2) \\ -\cos(\theta - \theta/2) \end{pmatrix} = \begin{pmatrix} \sin(\theta/2) \\ -\cos(\theta/2) \end{pmatrix} = -v,$$

so $Av = -v$. A little thought shows that $A$ is reflection through the line $L$ that passes through the origin and is parallel to $u$: any vector $w$ can be written as $w = su + tv$ for scalars $s$ and $t$, and $A(su + tv) = sAu + tAv = su - tv$.
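The classification just derived can be sanity-checked numerically. The sketch below (Python/NumPy, with my own helper names `R` and `S`, angles in radians) confirms that both families are orthogonal, and that $S(\theta)$ fixes the line spanned by $u = (\cos(\theta/2), \sin(\theta/2))$ while flipping the perpendicular direction $v$:

```python
import numpy as np

def R(t):
    """Rotation by t radians: [[cos, -sin], [sin, cos]]."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s], [s, c]])

def S(t):
    """The other family of orthogonal matrices: [[cos, sin], [sin, -cos]]."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, s], [s, -c]])

t = 0.9
for A in (R(t), S(t)):
    # Columns are orthogonal unit vectors, i.e. A^T A = I.
    assert np.allclose(A.T @ A, np.eye(2))

u = np.array([np.cos(t / 2), np.sin(t / 2)])   # spans the mirror line L
v = np.array([-np.sin(t / 2), np.cos(t / 2)])  # unit vector orthogonal to u
assert np.allclose(S(t) @ u, u)    # points on L are fixed
assert np.allclose(S(t) @ v, -v)   # the perpendicular component flips sign

# Hence S(t) sends s*u + c*v to s*u - c*v: reflection through L.
w = 2.0 * u - 3.0 * v
assert np.allclose(S(t) @ w, 2.0 * u + 3.0 * v)
```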

Exercise: Verify the following equations.

$$R(\theta)S(\varphi) = S(\theta + \varphi)$$
$$S(\varphi)R(\theta) = S(\varphi - \theta)$$
$$S(\theta)S(\varphi) = R(\theta - \varphi)$$

## Isometries of the Plane

If a point has coordinates $(x, y)$, the vector with components $(x, y)$ is the vector from the origin to the point. This is called the position vector of the point. Generally, we use the same notation for the point and its position vector. The distance between two points $p$ and $q$ in $\mathbb{R}^2$ is given by $d(p, q) = \|p - q\|$. A transformation $T : \mathbb{R}^2 \to \mathbb{R}^2$ is called an isometry if it is one-to-one and onto and $d(T(p), T(q)) = d(p, q)$ for all points $p$ and $q$. One kind of isometry we have discovered is $T(p) = Ap$, where $A$ is an orthogonal matrix. Another kind of isometry is translation. Let $v$ be a fixed vector and define $T_v$ by

$T_v(p) = p + v$. Translations are not linear (for $v \neq 0$, since $T_v(0) = v \neq 0$). The identity mapping $\mathrm{id} : \mathbb{R}^2 \to \mathbb{R}^2$ defined by $\mathrm{id}(p) = p$ is obviously an isometry. The composition of two isometries is an isometry. The inverse of an isometry is an isometry.

Let $x$, $y$ and $z$ be noncollinear points. Then a point $p$ is uniquely determined by the three numbers $d(x, p)$, $d(y, p)$ and $d(z, p)$. If $x$, $y$ and $z$ are three noncollinear points and $T$ is an isometry such that $T(x) = x$, $T(y) = y$ and $T(z) = z$, then $T = \mathrm{id}$. To see this, suppose that $p$ is any point. Then $p$ is determined by the numbers $d(x, p)$, $d(y, p)$ and $d(z, p)$. Since $T$ is an isometry, $d(x, p) = d(T(x), T(p)) = d(x, T(p))$. Similarly, $d(y, T(p)) = d(y, p)$ and $d(z, T(p)) = d(z, p)$. Thus, we must have $T(p) = p$.

Problem: If points $x$, $y$ and $z$ form a right triangle with the right angle at $x$, then $T(x)$, $T(y)$ and $T(z)$ form a right triangle with the right angle at $T(x)$.
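The claim that three distances to noncollinear points pin down a point can be made concrete: subtracting the squared-distance equations from one another leaves equations that are *linear* in $p$. A sketch in Python/NumPy, using the reference points $0$, $e_1$, $e_2$ (my choice for the demo):

```python
import numpy as np

p = np.array([2.5, -0.7])        # a point we pretend not to know
x = np.array([0.0, 0.0])         # three noncollinear reference points
y = np.array([1.0, 0.0])
z = np.array([0.0, 1.0])
dx, dy, dz = (np.linalg.norm(p - q) for q in (x, y, z))

# |p - x|^2 - |p - y|^2 = 2 p.y - |y|^2, and similarly for z, so the three
# distances determine the coordinates of p uniquely.
p1 = (dx**2 - dy**2 + 1.0) / 2.0
p2 = (dx**2 - dz**2 + 1.0) / 2.0
recovered = np.array([p1, p2])
assert np.allclose(recovered, p)
```

Noncollinearity is what makes the two linear equations independent; with collinear reference points, two distinct points (mirror images across the line) would have the same three distances.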

Theorem. If $T$ is an isometry, there is a vector $v$ and an orthogonal matrix $A$ so that $T(p) = Ap + v$ for all $p$; i.e., $T$ is the composition of a rotation or reflection at the origin and a translation.

Proof: The points $0$, $e_1$ and $e_2$ are noncollinear and form a right triangle, so the points $T(0)$, $T(e_1)$ and $T(e_2)$ form a right triangle. Let $v = T(0)$ and let $T_{-v}$ be translation by $-v$. Then $T' = T_{-v}T$ is an isometry and $T'(0) = 0$. We must have $d(0, T'(e_1)) = d(0, e_1) = 1$, so we can rotate $T'(e_1)$ around the origin so it coincides with $e_1$. Let $R$ be the rotation matrix for this rotation. Then $T'' = RT_{-v}T$ is an isometry with $T''(0) = 0$ and $T''(e_1) = e_1$. Since $d(T''(e_2), 0) = 1$ and the points $0$, $e_1$ and $T''(e_2)$ form a right triangle, $T''(e_2)$ must be either $e_2$ or $-e_2$. If it's $e_2$, let $M = I$; if it's $-e_2$, let $M$ be the matrix of reflection through the $x$-axis. In either case $M$ is an orthogonal matrix and $MT''(e_2) = e_2$. Let $T''' = MRT_{-v}T$. Then $T'''(0) = 0$, $T'''(e_1) = e_1$ and $T'''(e_2) = e_2$, so $T'''$ must be the identity. If we set $A = MR$, then $A$ is orthogonal and we have $AT_{-v}T = \mathrm{id}$. Thus, for any point $p$, $AT_{-v}T(p) = p$. Multiplying by $A^{-1}$ we get $T_{-v}T(p) = A^{-1}p$. Applying $T_v$ to both sides gives $T(p) = A^{-1}p + v$. Since $A^{-1}$ is orthogonal, the proof is complete.

Every isometry can be written in the form $T(x) = Ax + u$. To abbreviate this, we'll simply denote this isometry by $(A \mid u)$, so the rule is

$$(A \mid u)(x) = Ax + u.$$

In this notation, $(I \mid 0) = \mathrm{id}$, $(I \mid u)$ is translation by $u$, and $(A \mid 0)$ is

the transformation $x \mapsto Ax$. Given isometries $(A \mid u)$ and $(B \mid v)$, we have

$$(A \mid u)(B \mid v)(x) = (A \mid u)(Bx + v) = A(Bx + v) + u = ABx + Av + u = (AB \mid u + Av)(x),$$

so the rule for combining these symbols is

$$(A \mid u)(B \mid v) = (AB \mid u + Av).$$

Exercise: Verify that $(A \mid u)^{-1} = (A^{-1} \mid -A^{-1}u)$.

A glide reflection is reflection in some line followed by translation in a direction parallel to the line.

Theorem. Every isometry falls into one of the following mutually exclusive cases.

I. The identity.
II. A translation.
III. Rotation about some point (i.e., not necessarily the origin).
IV. Reflection about some line.
V. A glide reflection.

We already know that $(I \mid 0)$ is the identity (Case I), that $(I \mid u)$ with $u \neq 0$ is a translation (Case II), and that $(A \mid 0)$ is either a rotation about the origin or a reflection about a line through the origin. Consider $(A \mid w)$ where $A$ is a rotation matrix, $A = R(\theta)$. A rotation leaves its center fixed, so we expect $(A \mid w)$ to have a fixed point, i.e., a point $p$ so that $Ap + w = p$. Rearranging this equation gives $w = p - Ap = (I - A)p$, so $p$ is the

solution, if any, of the equation

$$w = (I - A)p. \qquad (**)$$

The matrix $I - A$ is given by

$$I - A = \begin{pmatrix} 1 - \cos(\theta) & \sin(\theta) \\ -\sin(\theta) & 1 - \cos(\theta) \end{pmatrix},$$

so

$$\det(I - A) = (1 - \cos(\theta))^2 + \sin^2(\theta) = 1 - 2\cos(\theta) + \cos^2(\theta) + \sin^2(\theta) = 2 - 2\cos(\theta).$$

This is nonzero as long as $\cos(\theta) \neq 1$. If $\cos(\theta) = 1$, then $A = I$, and we've already dealt with that case. So we can assume $\det(I - A) \neq 0$. This means that $I - A$ is invertible, so equation $(**)$ has the unique solution $p = (I - A)^{-1}w$. Now that we know the fixed point $p$ exists, we can write $(A \mid w) = (A \mid p - Ap)$. Then we have

$$(A \mid p - Ap)(x) = Ax + p - Ap = A(x - p) + p.$$

If we think about this geometrically, this shows that $(A \mid p - Ap)$ is rotation about the point $p$ (Case III): we take the vector $x - p$ that points from $p$ to $x$, place the arrow at the origin, rotate the arrow by $A$, and move the arrow back to $p$. This gives us the rotation of $x$ about $p$.

Consider the case $(A \mid w)$ where $A$ is a reflection matrix. Recall that for a reflection matrix, there are orthogonal unit vectors $u$ and $v$ so that $Au = u$ and $Av = -v$. Consider first the case where $w = sv$. Let $p = sv/2$. We claim that $p$ is

fixed:

$$(A \mid sv)(p) = Ap + sv = A(sv/2) + sv = (s/2)Av + sv = (s/2)(-v) + sv = (s - s/2)v = sv/2 = p.$$

Thus, $sv = 2p$ and we can write $(A \mid w) = (A \mid 2p)$. Note that $Ap = -p$. As $t$ runs over all real numbers, the points $p + tu$ run along the line, call it $L$, that passes through $p$ and is parallel to $u$. For points on $L$, we have

$$(A \mid 2p)(p + tu) = A(p + tu) + 2p = Ap + tAu + 2p = -p + tu + 2p = p + tu,$$

so all the points on $L$ are fixed. We can also write

$$(A \mid 2p)(x) = Ax + 2p = Ax + p + p = Ax - (-p) + p = Ax - Ap + p = A(x - p) + p.$$

By the same reasoning as before, this shows that $(A \mid 2p)$ is reflection through $L$ (Case IV).

Exercise: Show that $(A \mid w)$ has fixed points if and only if $w = sv$ for some scalar $s$.

Consider finally the case $(A \mid w)$ where $w$ is not a multiple of $v$. Then we can write $w = \alpha u + \beta v$ with $\alpha \neq 0$. We then have

$$(A \mid w) = (A \mid \alpha u + \beta v) = (I \mid \alpha u)(A \mid \beta v).$$

We know that $(A \mid \beta v)$ is reflection about some line parallel to $u$, and $(I \mid \alpha u)$ is translation in a direction parallel to $u$, so $(A \mid w)$ is a glide reflection (Case V). A glide reflection has no fixed points.
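The $(A \mid u)$ algebra above is straightforward to implement and test. A sketch in Python/NumPy (the pair-based representation and helper names are my own): it checks the composition rule, the inverse formula, and the fixed point $p = (I - A)^{-1}w$ in the rotation case.

```python
import numpy as np

def compose(F, G):
    """(A|u)(B|v) = (AB | u + Av); an isometry is a pair (matrix, vector)."""
    (A, u), (B, v) = F, G
    return (A @ B, u + A @ v)

def inverse(F):
    """(A|u)^{-1} = (A^{-1} | -A^{-1} u)."""
    A, u = F
    Ainv = A.T              # A is orthogonal, so its inverse is its transpose
    return (Ainv, -Ainv @ u)

theta = 2.0
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation with A != I
w = np.array([1.0, 4.0])
F = (A, w)

# Composing with the inverse gives (I | 0).
B, v = compose(F, inverse(F))
assert np.allclose(B, np.eye(2)) and np.allclose(v, np.zeros(2))

# Case III: det(I - A) = 2 - 2cos(theta) != 0, so (A|w) fixes the unique
# point p = (I - A)^{-1} w and is rotation about p.
assert not np.isclose(np.linalg.det(np.eye(2) - A), 0.0)
p = np.linalg.solve(np.eye(2) - A, w)
assert np.allclose(A @ p + w, p)
```

Representing isometries as (matrix, vector) pairs and composing with this rule is the same bookkeeping that homogeneous-coordinate matrices perform in computer graphics.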


### LINES AND PLANES CHRIS JOHNSON

LINES AND PLANES CHRIS JOHNSON Abstract. In this lecture we derive the equations for lines and planes living in 3-space, as well as define the angle between two non-parallel planes, and determine the distance

### Bending of Beams with Unsymmetrical Sections

Bending of Beams with Unsmmetrical Sections Assume that CZ is a neutral ais. C = centroid of section Hence, if > 0, da has negative stress. From the diagram below, we have: δ = α and s = αρ and δ ε = =

### geometric transforms

geometric transforms 1 linear algebra review 2 matrices matrix and vector notation use column for vectors m 11 =[ ] M = [ m ij ] m 21 m 12 m 22 =[ ] v v 1 v = [ ] T v 1 v 2 2 3 matrix operations addition

### Solutions to old Exam 1 problems

Solutions to old Exam 1 problems Hi students! I am putting this old version of my review for the first midterm review, place and time to be announced. Check for updates on the web site as to which sections

### Math 215 HW #6 Solutions

Math 5 HW #6 Solutions Problem 34 Show that x y is orthogonal to x + y if and only if x = y Proof First, suppose x y is orthogonal to x + y Then since x, y = y, x In other words, = x y, x + y = (x y) T

### Geometric description of the cross product of the vectors u and v. The cross product of two vectors is a vector! u x v is perpendicular to u and v

12.4 Cross Product Geometric description of the cross product of the vectors u and v The cross product of two vectors is a vector! u x v is perpendicular to u and v The length of u x v is uv u v sin The

### Sec 4.1 Vector Spaces and Subspaces

Sec 4. Vector Spaces and Subspaces Motivation Let S be the set of all solutions to the differential equation y + y =. Let T be the set of all 2 3 matrices with real entries. These two sets share many common

### v 1 v 3 u v = (( 1)4 (3)2, [1(4) ( 2)2], 1(3) ( 2)( 1)) = ( 10, 8, 1) (d) u (v w) = (u w)v (u v)w (Relationship between dot and cross product)

0.1 Cross Product The dot product of two vectors is a scalar, a number in R. Next we will define the cross product of two vectors in 3-space. This time the outcome will be a vector in 3-space. Definition

### The Inverse of a Matrix

The Inverse of a Matrix 7.4 Introduction In number arithmetic every number a ( 0) has a reciprocal b written as a or such that a ba = ab =. Some, but not all, square matrices have inverses. If a square

### B such that AB = I and BA = I. (We say B is an inverse of A.) Definition A square matrix A is invertible (or nonsingular) if matrix

Matrix inverses Recall... Definition A square matrix A is invertible (or nonsingular) if matrix B such that AB = and BA =. (We say B is an inverse of A.) Remark Not all square matrices are invertible.