Linear Algebra Review. Vectors


Linear Algebra Review
By Tim K. Marks, UCSD
Borrows heavily from: Jana Kosecka (kosecka@cs.gmu.edu, http://cs.gmu.edu/~kosecka/cs682.html) and Virginia de Sa, Cogsci 8F Linear Algebra review, UCSD

Vectors
The length of x, a.k.a. the norm or 2-norm of x, is
||x|| = sqrt(x1^2 + x2^2 + ... + xn^2)
e.g., for x = (3, 2, 5): ||x|| = sqrt(3^2 + 2^2 + 5^2) = sqrt(38)
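The norm formula can be checked with a short pure-Python sketch (the vector (3, 2, 5) is the slide's example; the helper name `norm` is ours, not from the slides):

```python
import math

def norm(x):
    """2-norm: square root of the sum of squared elements."""
    return math.sqrt(sum(xi * xi for xi in x))

x = [3, 2, 5]
print(norm(x))  # sqrt(38), about 6.1644
```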

Good Review Materials
http://www.imageprocessingbook.com/dip2e/dip2e_downloads/review_material_downloads.htm (Gonzalez & Woods review materials)
Chapt. 1: Linear Algebra Review
Chapt. 2: Probability, Random Variables, Random Vectors
Online vector addition demo: http://www.pa.uky.edu/~phy2/vecarith/index.html

Vector Addition: diagram of u, v, and u+v.
Vector Subtraction: diagram of u, v, and u-v.

Example (on board): Inner product (dot product) of two vectors
a = (6, 2, 3), b = (4, 1, 5)
a . b = a^T b = 6*4 + 2*1 + 3*5 = 41
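A minimal sketch of the inner product in Python, assuming the slide's vectors are a = (6, 2, 3) and b = (4, 1, 5) (the middle entry of b is a reconstruction):

```python
def dot(a, b):
    """Inner (dot) product: sum of elementwise products."""
    assert len(a) == len(b)
    return sum(ai * bi for ai, bi in zip(a, b))

a = [6, 2, 3]
b = [4, 1, 5]
print(dot(a, b))  # 6*4 + 2*1 + 3*5 = 41
```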

Inner (dot) Product
For vectors u and v with angle α between them, u . v = ||u|| ||v|| cos α.
The inner product is a SCALAR.

Transpose of a Matrix
The transpose A^T swaps rows and columns: (A^T)_ij = A_ji.
If A = A^T, we say A is symmetric.


Matrix Product
To form the product AB, A and B must have compatible dimensions: the number of columns of A must equal the number of rows of B.
In Matlab: >> A*B
Matrix multiplication is not commutative: in general, AB is not equal to BA.
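Non-commutativity is easy to see on a small example; a pure-Python sketch (the matrices are chosen for illustration, not taken from the slides):

```python
def matmul(A, B):
    """Product of A (m x k) and B (k x n); dimensions must be compatible."""
    assert len(A[0]) == len(B)
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(matmul(A, B))  # [[2, 1], [4, 3]]
print(matmul(B, A))  # [[3, 4], [1, 2]]  -- AB != BA
```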

Matrix Sum
To form the sum A + B, A and B must have the same dimensions.
Determinant of a Matrix
The determinant det(A) is defined only for square A. For a 2x2 matrix, det [a b; c d] = ad - bc.
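The 2x2 determinant, det [a b; c d] = ad - bc, as a small Python helper (the function name is ours, not from the slides):

```python
def det2(A):
    """Determinant of a 2x2 matrix [[a, b], [c, d]]: ad - bc."""
    (a, b), (c, d) = A
    return a * d - b * c

print(det2([[3, 1], [4, 2]]))  # 3*2 - 1*4 = 2
```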

Determinant in Matlab: >> det(A)
Inverse of a Matrix
If A is a square matrix, the inverse of A, written A^-1, satisfies AA^-1 = I and A^-1 A = I, where I, the identity matrix, is a diagonal matrix with all 1s on the diagonal:
I_2 = [1 0; 0 1],  I_3 = [1 0 0; 0 1 0; 0 0 1]

Inverse of a 2x2 Matrix
For A = [a b; c d] with det(A) = ad - bc nonzero,
A^-1 = (1 / (ad - bc)) [d -b; -c a]
Inverses in Matlab: >> inv(A)
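The 2x2 inverse formula, (1/det) [d -b; -c a], sketched in Python (the helper and the example matrix are illustrative):

```python
def inv2(A):
    """Inverse of a 2x2 matrix [[a, b], [c, d]]: (1/det) [[d, -b], [-c, a]]."""
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det != 0, "singular matrix has no inverse"
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [5, 3]]  # det = 2*3 - 1*5 = 1
print(inv2(A))  # [[3.0, -1.0], [-5.0, 2.0]]
```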

Other Terms

Matrix Transformation: Scale
A square diagonal matrix scales each dimension by the corresponding diagonal element.
Example:
[2 0 0; 0 0.5 0; 0 0 3] [6; 8; 1] = [12; 4; 3]
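Since the matrix is diagonal, applying it just multiplies each component by its scale factor; a minimal sketch (the factors match the slide's reconstructed example):

```python
def scale(diag, v):
    """Apply a diagonal (scale) matrix: multiply each component by its factor."""
    assert len(diag) == len(v)
    return [d * vi for d, vi in zip(diag, v)]

print(scale([2, 0.5, 3], [6, 8, 1]))  # [12, 4.0, 3]
```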

Interactive eigenvector demo: http://www.math.ubc.ca/~cass/courses/m39-8a/java/m39gfx/eigen.html

Some Properties of Eigenvalues and Eigenvectors
If λ1, ..., λn are distinct eigenvalues of a matrix, then the corresponding eigenvectors e1, ..., en are linearly independent.
A real, symmetric square matrix has real eigenvalues, with eigenvectors that can be chosen to be orthonormal.

Linear Independence
A set of vectors is linearly dependent if one of the vectors can be expressed as a linear combination of the other vectors. Example: (1, 0, 0), (0, 1, 0), (2, 1, 0) -- the third is 2 times the first plus the second.
A set of vectors is linearly independent if none of the vectors can be expressed as a linear combination of the other vectors. Example: (1, 0, 0), (0, 1, 0), (2, 1, 3).
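The symmetric-matrix property can be checked on a small example: the matrix below (illustrative, not from the slides) is real and symmetric; its eigenvectors (1, 1) and (1, -1) are orthogonal, with real eigenvalues 3 and 1:

```python
def matvec(A, x):
    """Multiply matrix A by vector x."""
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

A = [[2, 1], [1, 2]]       # real, symmetric
print(matvec(A, [1, 1]))   # [3, 3]  = 3 * (1, 1)
print(matvec(A, [1, -1]))  # [1, -1] = 1 * (1, -1)
```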

Rank of a Matrix
The rank of a matrix is the number of linearly independent columns of the matrix.
Examples: [1 0 1; 0 1 1; 0 0 0] has rank 2; the 3x3 identity matrix has rank 3.
Note: the rank of a matrix is also the number of linearly independent rows of the matrix.

Singular Matrix
All of the following conditions are equivalent. We say a square (n x n) matrix is singular if any one of these conditions (and hence all of them) is satisfied:
The columns are linearly dependent
The rows are linearly dependent
The determinant = 0
The matrix is not invertible
The matrix is not full rank (i.e., rank < n)
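A quick check that linearly dependent columns force a zero determinant (the matrix below is illustrative: its third column is the sum of the first two):

```python
def det3(A):
    """3x3 determinant by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = A
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

S = [[1, 0, 1], [0, 1, 1], [2, 3, 5]]  # column 3 = column 1 + column 2
print(det3(S))  # 0, so S is singular
```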

Linear Spaces
A linear space is the set of all vectors that can be expressed as a linear combination of a set of basis vectors. We say this space is the span of the basis vectors.
Example: R^3, 3-dimensional Euclidean space, is spanned by each of the following two bases:
(1, 0, 0), (0, 1, 0), (0, 0, 1)   and   (1, 0, 0), (1, 1, 0), (1, 1, 2)

Linear Subspaces
A linear subspace is the space spanned by a subset of the vectors in a linear space.
The space spanned by (1, 0, 0) and (0, 1, 0) is a two-dimensional subspace of R^3. What does it look like?
The space spanned by (1, 0, 0) and (0, 1, 1) is also a two-dimensional subspace of R^3. What does it look like?
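Any linear combination of the basis vectors stays in the span; e.g., for the subspace spanned by (1, 0, 0) and (0, 1, 0) (a reconstructed example), every combination has third coordinate 0:

```python
def comb(c1, v1, c2, v2):
    """Linear combination c1*v1 + c2*v2 of two vectors."""
    return [c1 * a + c2 * b for a, b in zip(v1, v2)]

print(comb(3, [1, 0, 0], 5, [0, 1, 0]))  # [3, 5, 0] -- third coordinate is 0
```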

Orthogonal and Orthonormal Bases
n linearly independent real vectors span R^n, n-dimensional Euclidean space. They form a basis for the space.
An orthogonal basis a1, ..., an satisfies: ai . aj = 0 if i ≠ j.
An orthonormal basis a1, ..., an satisfies: ai . aj = 0 if i ≠ j, and ai . aj = 1 if i = j.

Orthonormal Matrices
A square matrix is orthonormal (also called unitary) if its columns are orthonormal vectors.
A matrix A is orthonormal iff AA^T = I.
If A is orthonormal, A^-1 = A^T, so AA^T = A^T A = I.
A rotation matrix is an orthonormal matrix with determinant = 1. It is also possible for an orthonormal matrix to have determinant = -1; this is a rotation plus a flip (reflection).
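A 2D rotation matrix makes a concrete check: R R^T should be the identity and det(R) should be 1 (a minimal sketch; the angle 0.7 is arbitrary):

```python
import math

def rot(theta):
    """2D rotation matrix [[cos, -sin], [sin, cos]]."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

R = rot(0.7)
RRt = matmul(R, transpose(R))
print(RRt)  # approximately the 2x2 identity
det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
print(det)  # approximately 1.0
```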

SVD: Singular Value Decomposition
Any matrix A (m x n) can be written as the product of three matrices: A = UDV^T, where
U is an m x m orthonormal matrix;
D is an m x n diagonal matrix, whose diagonal elements σ1, σ2, ... are called the singular values of A and satisfy σ1 ≥ σ2 ≥ ... ≥ 0;
V is an n x n orthonormal matrix.
Example (m > n): A = U D V^T with U = [u1 u2 u3 ... um], D holding σ1, ..., σn on its diagonal, and V^T stacking the rows v1^T, ..., vn^T.

SVD in Matlab
>> [u,s,v] = svd(x)
For a 5x3 matrix x, this returns a 5x5 orthonormal u, a 5x3 diagonal s whose diagonal entries are the singular values in decreasing order, and a 3x3 orthonormal v.

Some Properties of SVD
The rank of a matrix A is equal to the number of nonzero singular values σi.
A square (n x n) matrix A is singular iff at least one of its singular values σ1, ..., σn is zero.

Geometric Interpretation of SVD
If A is a square (n x n) matrix, A = UDV^T, where
U is a unitary matrix: a rotation (possibly plus a flip);
D is a scale matrix;
V (and thus V^T) is a unitary matrix.
Punchline: an arbitrary n-dimensional linear transformation is equivalent to a rotation (plus perhaps a flip), followed by a scale transformation, followed by a rotation.
Advanced: y = Ax = UDV^T x. V^T expresses x in terms of the basis V; D rescales each coordinate (each dimension); the new coordinates are the coordinates of y in terms of the basis U.
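The staged reading of y = U(D(V^T x)) can be sketched by composing a rotation, a diagonal scale, and another rotation (all names and angles here are illustrative, not from the slides). Note the first stage, V^T x, is a rotation and so preserves the norm of x:

```python
import math

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def rot(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def transpose(A):
    return [list(col) for col in zip(*A)]

U = rot(0.3)                # rotation
D = [[2, 0], [0, 0.5]]      # scale
Vt = transpose(rot(1.1))    # V^T, also a rotation

x = [1.0, 2.0]
# y = A x = U (D (V^T x)): rotate, scale each coordinate, rotate again
y = matvec(U, matvec(D, matvec(Vt, x)))
print(y)
```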