# Math 54 Midterm 2, Fall 2015


## Transcription

Name (Last, First): Student ID: GSI/Section:

This is a closed book exam; no notes or calculators allowed. It consists of 7 problems, each worth 10 points. The lowest problem will be dropped, making the exam out of 60 points. Please avoid writing near the corner of the page where the exam is stapled; this area will be removed when the papers are scanned for grading. There are blank sheets attached at the back; feel free to use them for scratch work. If you want anything on those sheets graded, please indicate on the relevant problem which page your work is located on. DO NOT REMOVE OR ADD ANY PAGES!

1. (10 points) Consider the following matrix:

$$A = \begin{pmatrix} 1 & 1 & 0 \\ -1 & 3 & 0 \\ -6 & 6 & 0 \end{pmatrix}$$

(a) (3 points) Compute the eigenvalues of A.

Solution: The characteristic polynomial is:

$$
\begin{aligned}
\det(A - \lambda I) &= \det\begin{pmatrix} 1-\lambda & 1 & 0 \\ -1 & 3-\lambda & 0 \\ -6 & 6 & -\lambda \end{pmatrix}
= (-\lambda)\det\begin{pmatrix} 1-\lambda & 1 \\ -1 & 3-\lambda \end{pmatrix} \\
&= (-\lambda)\big((1-\lambda)(3-\lambda) + 1\big)
= (-\lambda)(\lambda^2 - 4\lambda + 4) = -\lambda(\lambda-2)^2.
\end{aligned}
$$

So the eigenvalues are 0 (with multiplicity 1) and 2 (with multiplicity 2).

(b) (4 points) Find a basis for the eigenspace corresponding to each of the eigenvalues.

Solution: Finding an eigenvector for λ = 0 is easy: we can already see that the third variable is free, so E₀ = Null(A) contains the eigenvector (0, 0, 1). (More generally, (0, 0, t) works, where we require t ≠ 0 to count it as an eigenvector.) Then, since the multiplicity of λ = 0 is one, we know that this is the whole eigenspace.

For λ = 2, we need to find a basis for E₂ = Null(A − 2I), so we row-reduce:

$$A - 2I = \begin{pmatrix} -1 & 1 & 0 \\ -1 & 1 & 0 \\ -6 & 6 & -2 \end{pmatrix} \sim \begin{pmatrix} 1 & -1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix}.$$

This time, the null space is 1-dimensional, spanned by (1, 1, 0), which we get from looking at the first and second rows.

(c) (3 points) Is A diagonalizable? Justify.

Solution: No: the λ = 2 eigenspace is 1-dimensional, but the multiplicity of λ = 2 is 2, and 1 < 2.
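As a quick sanity check of parts (a) and (b), here is a short NumPy sketch; the entries of A are as reconstructed above, which is an assumption of this transcription rather than a verbatim copy of the exam.

```python
import numpy as np

# A as reconstructed above (an assumption; the printed entries were garbled).
A = np.array([[ 1.0, 1.0, 0.0],
              [-1.0, 3.0, 0.0],
              [-6.0, 6.0, 0.0]])

# Eigenvalues should be 0 (multiplicity 1) and 2 (multiplicity 2).
print(np.round(np.linalg.eigvals(A), 6))

# Check the claimed eigenvectors: A v should equal lambda v.
print(A @ np.array([0.0, 0.0, 1.0]))  # -> (0, 0, 0), eigenvalue 0
print(A @ np.array([1.0, 1.0, 0.0]))  # -> (2, 2, 0), eigenvalue 2
```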

2. (10 points)

(a) (2 points) Check that the following set of vectors is a basis for R³: B = {b₁, b₂, b₃}. [The entries of the three vectors are illegible in this transcription.]

Solution: The determinant of the matrix whose columns are b₁, b₂, b₃ is nonzero, so that matrix is invertible and hence the vectors form a basis.

(b) (4 points) Compute the change of basis matrices P_{S←B} and P_{B←S}, where S is the standard basis of R³.

Solution: Computing P_{S←B} is easier: its columns are the vectors of B written in standard coordinates,

$$P_{S\leftarrow B} = \big(\,[b_1]_S \;\; [b_2]_S \;\; [b_3]_S\,\big).$$

Now, P_{B←S} = (P_{S←B})⁻¹, which can be computed by any method of inversion.

(c) (4 points) Consider the linear transformation T : R³ → R³ given by

$$T\begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} x - y \\ x - z \\ y - z \end{pmatrix}.$$

Using the results of the previous part, find the matrix of T with respect to B. Feel free to write your answer as a product of matrices.

Solution: From the formula, we see that the standard matrix of T is

$$T_S = \begin{pmatrix} 1 & -1 & 0 \\ 1 & 0 & -1 \\ 0 & 1 & -1 \end{pmatrix}.$$

Hence, the matrix with respect to B is

$$T_B = P_{B\leftarrow S}\, T_S\, P_{S\leftarrow B}.$$
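Since the exam's basis vectors are illegible here, the following NumPy sketch uses a placeholder basis (b1, b2, b3 below are illustrative choices, not the exam's) just to show the change-of-basis computation from part (c):

```python
import numpy as np

# Placeholder basis of R^3 (illustrative, not the exam's vectors).
b1 = np.array([1.0, 0.0, 1.0])
b2 = np.array([0.0, 1.0, 1.0])
b3 = np.array([1.0, 1.0, 0.0])

P_SB = np.column_stack([b1, b2, b3])  # P_{S<-B}: B-coordinates -> standard
P_BS = np.linalg.inv(P_SB)            # P_{B<-S} = (P_{S<-B})^(-1)

# Standard matrix of T(x, y, z) = (x - y, x - z, y - z).
T_S = np.array([[1.0, -1.0,  0.0],
                [1.0,  0.0, -1.0],
                [0.0,  1.0, -1.0]])

T_B = P_BS @ T_S @ P_SB               # matrix of T with respect to B
print(np.round(T_B, 6))
```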

3. (10 points) Label the following statements as true or false. The correct answer is worth 1 point. An additional point will be awarded for a brief justification.

(a) (2 points) The vector x is an eigenvector of the matrix A. [The entries of x and A are illegible in this transcription.]

Solution: False. Computing Ax directly, we see that Ax is not a scalar multiple of x.

(b) (2 points) If the n × n matrix A represents the linear transformation T : Rⁿ → Rⁿ with respect to one basis, and B represents the same transformation with respect to a different basis, then det A = det B.

Solution: True. In this situation, we have A = PBP⁻¹, where P is some change-of-basis matrix. Then det(A) = det(P) det(B) det(P)⁻¹ = det(B).

(c) (2 points) If A is a 4 × 4 matrix with characteristic polynomial χ_A(λ) = λ⁴ + 3λ³ − 11λ² + λ + 5, then A must be invertible.

Solution: True. The degree of the characteristic polynomial is the size of the matrix (consistent with A being 4 × 4), and A must be invertible because det(A − 0·I) = χ_A(0) = 5 ≠ 0; that is, 0 is not an eigenvalue.

(d) (2 points) If W ⊆ Rⁿ is a subspace, and y is in Rⁿ, then y is in W or y is in W⊥.

Solution: False. Take W = Span(e₁, e₂) in R³. Then y = (1, 2, 3) is in neither.

(e) (2 points) For vectors u, v ∈ Rⁿ, if ‖u‖² + ‖v‖² = ‖u + v‖², then u and v are orthogonal.

Solution: True. We have ‖u + v‖² = (u + v) · (u + v) = (u · u) + 2(u · v) + (v · v). If this is equal to (u · u) + (v · v), then we must have 2(u · v) = 0, so that u · v = 0, so u and v are orthogonal.
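Parts (b) and (e) are easy to check numerically; a minimal NumPy sketch (random matrices for (b), and an orthogonal pair for (e), both chosen here purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# (b) Similar matrices have equal determinants: A = P B P^(-1).
B = rng.standard_normal((4, 4))
P = rng.standard_normal((4, 4))
A = P @ B @ np.linalg.inv(P)
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))  # True

# (e) If ||u||^2 + ||v||^2 == ||u + v||^2, then u . v = 0.
u = np.array([ 1.0, 2.0, 0.0])
v = np.array([-2.0, 1.0, 3.0])
print(u @ u + v @ v, (u + v) @ (u + v))  # equal (both 19.0), and indeed:
print(u @ v)                             # 0.0
```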

4. (10 points) Provide an example of the following, or explain why no such example can exist. If you want to provide an example, you are allowed to choose a specific value for n. We say that a matrix A is diagonalizable if A = PDP⁻¹ where D is a diagonal matrix and P, D, and P⁻¹ are allowed to have complex entries. (This is just a reminder, not part of the question.)

(a) (4 points) Two n × n matrices A and B where A and B have the same eigenvalues with the same multiplicities but are not similar.

Solution: Let

$$A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \quad\text{and}\quad B = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}.$$

Both A and B have the same characteristic polynomial, λ², and thus the same eigenvalues with multiplicity. To see they are not similar, observe that the only matrix similar to B is B itself, since PBP⁻¹ must be the zero matrix.

(b) (3 points) Two n × n matrices A and B that are diagonalizable such that A − B is not diagonalizable.

Solution: Let

$$A = \begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix} \quad\text{and}\quad B = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}.$$

Then

$$A - B = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix},$$

which is not diagonalizable, even though A and B are, since they each have distinct eigenvalues.

(c) (3 points) An n × n invertible matrix A where A is diagonalizable but A⁻¹ is not.

Solution: No such example exists. Since A is invertible, 0 is not an eigenvalue, so D is invertible. Then given A = PDP⁻¹, we see

$$A^{-1} = (PDP^{-1})^{-1} = (P^{-1})^{-1} D^{-1} P^{-1} = P D^{-1} P^{-1},$$

so A⁻¹ is diagonalizable.
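A numerical illustration of parts (a) and (b), with the caveat that the matrices for (b) are a reconstruction consistent with the solution's reasoning, not necessarily the exam's originals:

```python
import numpy as np

# (a) Same characteristic polynomial lambda^2, but not similar:
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.zeros((2, 2))
print(np.linalg.eigvals(A), np.linalg.eigvals(B))          # both {0, 0}
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))  # 1 vs 0: not similar

# (b) A2 and B2 each have distinct eigenvalues (so are diagonalizable),
# but their difference is the non-diagonalizable nilpotent block above.
A2 = np.array([[1.0, 1.0],
               [0.0, 2.0]])
B2 = np.array([[1.0, 0.0],
               [0.0, 2.0]])
print(A2 - B2)   # [[0, 1], [0, 0]]
```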

5. (10 points)

(a) (3 points) Consider vectors

$$v_1 = \begin{pmatrix} 1 \\ 2 \\ 2 \end{pmatrix} \quad\text{and}\quad v_2 = \begin{pmatrix} 2 \\ 1 \\ -2 \end{pmatrix}.$$

Find an orthonormal basis B = {b₁, b₂} of the plane spanned by v₁ and v₂ in R³.

Solution: We see that v₁ and v₂ are already orthogonal, so we normalize both to get

$$b_1 = \begin{pmatrix} 1/3 \\ 2/3 \\ 2/3 \end{pmatrix} \quad\text{and}\quad b_2 = \begin{pmatrix} 2/3 \\ 1/3 \\ -2/3 \end{pmatrix},$$

which is the orthonormal basis we desire.

(b) (4 points) Find a third vector b₃ such that the matrix A with columns b₁, b₂, b₃ is orthogonal. (Hint: there are multiple possibilities for b₃.)

Solution: For A to be an orthogonal matrix, its columns must form an orthonormal basis. Thus b₃ must be a unit vector in the orthogonal complement of the span of b₁ and b₂, which is the null space of the matrix whose rows are b₁ and b₂:

$$\operatorname{Nul}\begin{pmatrix} 1/3 & 2/3 & 2/3 \\ 2/3 & 1/3 & -2/3 \end{pmatrix} = \operatorname{Nul}\begin{pmatrix} 1 & 2 & 2 \\ 2 & 1 & -2 \end{pmatrix}.$$

This null space is spanned by (2, −2, 1); normalizing gives (2/3, −2/3, 1/3). The two possibilities for b₃ are

$$b_3 = \pm\begin{pmatrix} 2/3 \\ -2/3 \\ 1/3 \end{pmatrix}.$$

(c) (3 points) What is det(A)?

Solution: Since A is orthogonal, AAᵀ = I, so det(A) det(Aᵀ) = det(A)² = 1. So det(A) = ±1, with the sign depending on which choice of b₃ is made: one choice gives +1, the other −1.
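To verify that the reconstructed columns really form an orthogonal matrix, a short NumPy check (again, the fractions are as reconstructed above, an assumption of this transcription):

```python
import numpy as np

b1 = np.array([1.0,  2.0, 2.0]) / 3.0
b2 = np.array([2.0,  1.0, -2.0]) / 3.0
b3 = np.array([2.0, -2.0, 1.0]) / 3.0   # the + choice for b3

A = np.column_stack([b1, b2, b3])
print(np.round(A.T @ A, 6))             # identity: A is orthogonal
print(np.round(np.linalg.det(A), 6))    # -1 here; the other sign of b3 gives +1
```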

6. (10 points) Consider the following vectors in R⁴: v₁, v₂, and y. [The entries of the three vectors are illegible in this transcription.] Let W = Span(v₁, v₂).

(a) (1 point) Show that {v₁, v₂} is an orthogonal set.

Solution: Computing the dot product entrywise gives v₁ · v₂ = 0.

(b) (6 points) Find the closest point to y in the subspace W.

Solution: We project y onto W to get

$$\hat{y} = \frac{v_1 \cdot y}{v_1 \cdot v_1}\, v_1 + \frac{v_2 \cdot y}{v_2 \cdot v_2}\, v_2 = 3v_1 + v_2,$$

which is the closest point to y in W.

(c) (3 points) Find the distance from y to the subspace W.

Solution: We need to compute ‖y − ŷ‖. Computing y − ŷ entrywise and summing the squares of its entries gives ‖y − ŷ‖² = 64, so the distance is ‖y − ŷ‖ = √64 = 8.
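Since the exam's vectors are illegible, the projection recipe itself can still be illustrated; the vectors below are placeholders chosen only so that {v₁, v₂} is orthogonal:

```python
import numpy as np

# Placeholder vectors in R^4 (not the exam's) with v1 . v2 = 0.
v1 = np.array([1.0,  2.0, 1.0, 0.0])
v2 = np.array([2.0, -1.0, 0.0, 1.0])
y  = np.array([1.0,  1.0, 1.0, 1.0])
assert np.isclose(v1 @ v2, 0.0)

# Orthogonal projection of y onto W = Span(v1, v2).
y_hat = (v1 @ y) / (v1 @ v1) * v1 + (v2 @ y) / (v2 @ v2) * v2
print(y_hat)                       # closest point to y in W
print(np.linalg.norm(y - y_hat))   # distance from y to W
```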

7. (10 points) An n × n square matrix A is called idempotent if A² = A. Recall that E_λ = Nul(A − λI) denotes the λ-eigenspace of A.

(a) (3 points) Show that the only possible eigenvalues of an idempotent matrix are 0 and 1.

Solution: Suppose v is an eigenvector with eigenvalue λ, so that Av = λv. Applying A a second time gives A²v = λ²v, but using A² = A gives A²v = Av = λv. Thus λ²v = λv. Because v ≠ 0, it follows that λ² = λ, hence λ = 0 or 1.

(b) (2 points) If A is an idempotent matrix, show that Col(A) is precisely the eigenspace E₁ of A.

Solution: If v ∈ E₁, then Av = v, so v ∈ Col(A). This shows E₁ ⊆ Col(A). Conversely, if v ∈ Col(A), then there is some w such that Aw = v, hence A²w = Av. Using idempotence of A, we get Aw = A²w = Av, so v = Aw = Av; that is, Av = v, so v ∈ E₁.

(c) (2 points) If A is an idempotent matrix, show that Col(I − A) is precisely the eigenspace E₀ of A. (Hint: A − A² = A(I − A) = 0.)

Solution: We proceed as above. If v ∈ E₀, then Av = 0, hence v − Av = (I − A)v = v, so v ∈ Col(I − A). This shows E₀ ⊆ Col(I − A). Conversely, if v ∈ Col(I − A), then there is some w such that (I − A)w = w − Aw = v, hence Av = A(w − Aw) = Aw − A²w = Aw − Aw = 0, so v ∈ E₀.

(d) (3 points) Show that an idempotent matrix is diagonalizable. (Hint: I = A + (I − A).)

Solution: Every vector v ∈ Rⁿ can be written in the form

$$v = Av + (I - A)v,$$

where Av ∈ Col(A) and (I − A)v ∈ Col(I − A). By the previous parts, these are the eigenspaces E₁ and E₀ respectively; this means that the eigenspaces of A span Rⁿ, so we can find a basis of Rⁿ consisting of eigenvectors of A (by combining a basis of E₁ and a basis of E₀), so A is diagonalizable.
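As a concrete instance of part (d), any orthogonal projection matrix is idempotent; the sketch below (an illustrative example, not from the exam) confirms the eigenvalues are 0 and 1:

```python
import numpy as np

# Projection onto Span((1, 1)) in R^2: A = u u^T / (u^T u).
u = np.array([[1.0],
              [1.0]])
A = (u @ u.T) / (u.T @ u)

print(np.allclose(A @ A, A))              # True: A is idempotent
print(np.round(np.linalg.eigvals(A), 6))  # eigenvalues 0 and 1
```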

