763313A QUANTUM MECHANICS II Solutions 2 Spring 2017


1. Let A be a diagonal and real matrix whose diagonal elements are not equal.

(a) Find the eigenvalues and eigenvectors of A.
(b) Find the Hermitian matrices that commute with A.
(c) Find the eigenvalues and eigenvectors of the matrices found in (b).

Solution:

(a) Let us use the notation

    A = \begin{pmatrix} a_1 & 0 \\ 0 & a_2 \end{pmatrix}, \qquad a_1 \neq a_2.

By using the definition of the matrix product, one can easily show that the eigenvectors of a diagonal n × n matrix are the n-component standard basis vectors. Furthermore, when one constructs that proof, one sees that the diagonal elements are the eigenvalues. Thus the eigenvalues and eigenvectors of A are

    \lambda_1 = a_1: \quad X_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \qquad
    \lambda_2 = a_2: \quad X_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.

(b) We call a matrix B Hermitian if it is equal to its conjugate transpose, i.e. B = B^\dagger. Consequently, a 2 × 2 Hermitian matrix is of the form

    B = \begin{pmatrix} b_1 & b_2 \\ b_2^* & b_3 \end{pmatrix},

where b_1, b_3 \in \mathbb{R}. That is, a matrix is Hermitian if its diagonal elements are real and its off-diagonal elements are complex conjugates of each other. Direct calculation yields

    AB = \begin{pmatrix} a_1 & 0 \\ 0 & a_2 \end{pmatrix}
         \begin{pmatrix} b_1 & b_2 \\ b_2^* & b_3 \end{pmatrix}
       = \begin{pmatrix} a_1 b_1 & a_1 b_2 \\ a_2 b_2^* & a_2 b_3 \end{pmatrix}

and

    BA = \begin{pmatrix} b_1 & b_2 \\ b_2^* & b_3 \end{pmatrix}
         \begin{pmatrix} a_1 & 0 \\ 0 & a_2 \end{pmatrix}
       = \begin{pmatrix} a_1 b_1 & a_2 b_2 \\ a_1 b_2^* & a_2 b_3 \end{pmatrix}.

Calculating the commutator results in

    AB - BA = \begin{pmatrix} 0 & (a_1 - a_2) b_2 \\ (a_2 - a_1) b_2^* & 0 \end{pmatrix}.
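The commutator just computed can be verified numerically. The following minimal Python sketch is not part of the original solution; the values of a_1, a_2 and the entries of B are assumed examples.

```python
# Sanity check of the commutator AB - BA for a diagonal real A and a
# Hermitian B. All numerical values below are assumed examples.

def matmul(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def commutator(X, Y):
    XY, YX = matmul(X, Y), matmul(Y, X)
    return [[XY[i][j] - YX[i][j] for j in range(2)] for i in range(2)]

a1, a2 = 1.0, 3.0                        # unequal diagonal elements (assumed)
A = [[a1, 0.0], [0.0, a2]]

b1, b2, b3 = 2.0, 1 + 2j, 5.0            # b1, b3 real, b2 complex (assumed)
B = [[b1, b2], [b2.conjugate(), b3]]     # Hermitian by construction

C = commutator(A, B)
print(C[0][1], C[1][0])   # (a1 - a2)*b2 and (a2 - a1)*conj(b2)
```

The diagonal entries of the commutator always vanish, while the off-diagonal entries vanish only when b_2 = 0, in agreement with the conclusion of the solution.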

Since we want the matrices A and B to commute, we demand that b_2 = b_2^* = 0 (recall that a_1 \neq a_2), i.e.

    B = \begin{pmatrix} b_1 & 0 \\ 0 & b_3 \end{pmatrix}.

Thus a Hermitian matrix B commutes with A if and only if it is diagonal.

(c) We recall that the matrices we found in (b) are diagonal Hermitian matrices, and we easily see that a diagonal Hermitian matrix is a diagonal real matrix. Thus, for the case of unequal diagonal elements, we can refer to part (a) of this problem. Similarly, for the case of equal diagonal elements, we can refer to part (a) of problem 2.

2. Let A be a real and diagonal matrix whose diagonal elements are equal. Let B be the real and symmetric matrix

    B = \begin{pmatrix} 0 & b \\ b & c \end{pmatrix}.

(a) Find the eigenvalues and eigenvectors of the matrix A.
(b) Show that A and B commute.
(c) Find the joint eigenvectors of the matrices.

Solution:

(a) Let us use the notation

    A = \begin{pmatrix} a & 0 \\ 0 & a \end{pmatrix} = aI.

Consequently, any X \in \mathbb{R}^2 is an eigenvector of A corresponding to the eigenvalue a. In problem 1(a), we stated that the eigenvectors of a diagonal n × n matrix are the n-component standard basis vectors. However, now any X \in \mathbb{R}^2 is an eigenvector of A; there are more eigenvectors than the statement claims. The explanation is that the statement is slightly imprecise: it does not take into account the fact that if the eigenvectors X_1, X_2, \dots, X_k all correspond to the same eigenvalue, say \lambda, then any linear combination c_1 X_1 + c_2 X_2 + \dots + c_k X_k also corresponds to \lambda. However, the statement does give us a complete orthonormal set of eigenvectors, and since a complete set of eigenvectors is usually just what we want to construct, the statement is perfectly usable.

(b) Because A is a constant times the identity matrix, A and B commute with each other.

(c) By solving the characteristic equation of B,

    \lambda^2 - c\lambda - b^2 = 0,

we obtain its eigenvalues

    \lambda_1 = \tfrac{1}{2}\left(c + \sqrt{c^2 + 4b^2}\right), \qquad
    \lambda_2 = \tfrac{1}{2}\left(c - \sqrt{c^2 + 4b^2}\right).
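As a quick numerical check (not part of the original solution), both roots of the eigenvalue formula can be substituted back into the characteristic equation; the values b = 2, c = 3 are assumed for illustration, for which c² + 4b² = 25.

```python
# Check that lambda = (c ± sqrt(c^2 + 4b^2))/2 solves the characteristic
# equation lambda^2 - c*lambda - b^2 = 0. Values b = 2, c = 3 are assumed.
import math

b, c = 2.0, 3.0
disc = math.sqrt(c**2 + 4 * b**2)        # sqrt(25) = 5 for these values
lam1 = 0.5 * (c + disc)                  # 4.0
lam2 = 0.5 * (c - disc)                  # -1.0

for lam in (lam1, lam2):
    residual = lam**2 - c * lam - b**2   # should vanish for an eigenvalue
    print(lam, residual)
```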

We note that the eigenvalues are unequal (we assume that B is not the zero matrix). We obtain the eigenvectors of B from the equation

    \begin{pmatrix} 0 & b \\ b & c \end{pmatrix}
    \begin{pmatrix} x \\ y \end{pmatrix}
    = \lambda \begin{pmatrix} x \\ y \end{pmatrix}
    \iff
    \begin{cases} -\lambda x + b y = 0 \\ b x + (c - \lambda) y = 0 \end{cases}

For the sake of convenience, we restrict ourselves to the case b \neq 0. Then we obtain y = (\lambda/b) x from the first equation. Consequently, the eigenvalues and eigenvectors of B are

    \lambda_1 = \tfrac{1}{2}\left(c + \sqrt{c^2 + 4b^2}\right): \quad
    X_1 = c_1 \begin{pmatrix} 1 \\ \lambda_1/b \end{pmatrix}, \qquad
    \lambda_2 = \tfrac{1}{2}\left(c - \sqrt{c^2 + 4b^2}\right): \quad
    X_2 = c_2 \begin{pmatrix} 1 \\ \lambda_2/b \end{pmatrix},

where c_1, c_2 \in \mathbb{C} \setminus \{0\}. Let us yet normalize the eigenvectors. We could determine the normalization factors by an easy mental calculation; however, for the sake of exercise, we carry out the formal calculation. Setting X_1^\dagger X_1 = 1, we obtain

    |c_1|^2 \begin{pmatrix} 1 & \lambda_1/b \end{pmatrix}
    \begin{pmatrix} 1 \\ \lambda_1/b \end{pmatrix}
    = |c_1|^2 \left(1 + \lambda_1^2/b^2\right) = 1.

We see that normalization determines c_1 only up to an arbitrary phase factor. Choosing c_1 to be real and positive, we obtain

    c_1 = \frac{1}{\sqrt{1 + \lambda_1^2/b^2}}.

Similarly, we obtain

    c_2 = \frac{1}{\sqrt{1 + \lambda_2^2/b^2}}.

Consequently, the eigenvalues and normalized eigenvectors of B are

    \lambda_1 = \tfrac{1}{2}\left(c + \sqrt{c^2 + 4b^2}\right): \quad
    X_1 = \frac{1}{\sqrt{1 + \lambda_1^2/b^2}} \begin{pmatrix} 1 \\ \lambda_1/b \end{pmatrix}, \qquad
    \lambda_2 = \tfrac{1}{2}\left(c - \sqrt{c^2 + 4b^2}\right): \quad
    X_2 = \frac{1}{\sqrt{1 + \lambda_2^2/b^2}} \begin{pmatrix} 1 \\ \lambda_2/b \end{pmatrix}.
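These normalized eigenvectors can likewise be checked numerically. A sketch (not part of the original solution) with the assumed values b = 2, c = 3, for which λ₁ = 4 and λ₂ = −1:

```python
# Verify B*X = lambda*X and |X| = 1 for X = (1, lambda/b)/sqrt(1 + lambda^2/b^2).
# The values b = 2, c = 3 are assumed examples.
import math

b, c = 2.0, 3.0
for sign in (+1, -1):
    lam = 0.5 * (c + sign * math.sqrt(c**2 + 4 * b**2))
    n = math.sqrt(1 + (lam / b)**2)      # normalization factor 1/c_i
    x, y = 1 / n, (lam / b) / n          # components of the eigenvector
    # B*(x, y) = (b*y, b*x + c*y) must equal lambda*(x, y)
    assert abs(b * y - lam * x) < 1e-12
    assert abs(b * x + c * y - lam * y) < 1e-12
    assert abs(x**2 + y**2 - 1) < 1e-12  # unit norm
print("both eigenvectors check out")
```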

We recall that any X \in \mathbb{R}^2 is an eigenvector of A. Therefore X_1 and X_2 are the simultaneous eigenvectors of A and B.

3. Diagonalize the matrix B of the previous problem. Give an interpretation of the matrix elements of the matrix obtained this way.

Solution: For every Hermitian matrix H there is a unitary matrix U such that D = UHU^{-1} is diagonal; the columns of U^{-1} are the normalized eigenvectors of H (cf. lecture notes, p. 9). The eigenvalues and normalized eigenvectors of B were found in the previous problem. Thus the inverse of the unitary matrix that diagonalizes B is

    U^{-1} = \begin{pmatrix}
        \frac{1}{\sqrt{1 + \lambda_1^2/b^2}} & \frac{1}{\sqrt{1 + \lambda_2^2/b^2}} \\[4pt]
        \frac{\lambda_1/b}{\sqrt{1 + \lambda_1^2/b^2}} & \frac{\lambda_2/b}{\sqrt{1 + \lambda_2^2/b^2}}
    \end{pmatrix}.

From matrix theory we know that D = UBU^{-1} is diagonal with the eigenvalues of B as the diagonal elements. However, for the sake of exercise, let us calculate D. We introduce the notation

    U^{-1} = \begin{pmatrix} U_1 & U_2 & \cdots & U_n \end{pmatrix},

that is, we denote the i-th column of U^{-1} by U_i. By using the definition of the matrix product, we can easily show that

    B U^{-1} = \begin{pmatrix} B U_1 & B U_2 & \cdots & B U_n \end{pmatrix}.

In our case we can write U^{-1} as

    U^{-1} = \begin{pmatrix} X_1 & X_2 \end{pmatrix}.

Thus,

    B U^{-1} = \begin{pmatrix} B X_1 & B X_2 \end{pmatrix}
             = \begin{pmatrix} \lambda_1 X_1 & \lambda_2 X_2 \end{pmatrix}.

Since U is unitary, U = (U^{-1})^\dagger, i.e. the rows of U are X_1^\dagger and X_2^\dagger. We can then diagonalize B as

    D = U B U^{-1}
      = \begin{pmatrix} X_1^\dagger \\ X_2^\dagger \end{pmatrix}
        \begin{pmatrix} \lambda_1 X_1 & \lambda_2 X_2 \end{pmatrix}
      = \begin{pmatrix} \lambda_1 X_1^\dagger X_1 & \lambda_2 X_1^\dagger X_2 \\
                        \lambda_1 X_2^\dagger X_1 & \lambda_2 X_2^\dagger X_2 \end{pmatrix}
      = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix},

where the last step uses the orthonormality of the eigenvectors X_1 and X_2.
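The diagonalization can also be confirmed numerically. A sketch (not part of the original solution), again with the assumed values b = 2, c = 3, hence λ₁ = 4 and λ₂ = −1; since B is real symmetric, U is real orthogonal and U = (U⁻¹)ᵀ:

```python
# Build U^{-1} from the normalized eigenvectors (as columns) and check that
# U B U^{-1} is diagonal with the eigenvalues of B on the diagonal.
# b = 2, c = 3 are assumed example values.
import math

b, c = 2.0, 3.0
B = [[0.0, b], [b, c]]

eigvals, eigvecs = [], []
for sign in (+1, -1):
    lam = 0.5 * (c + sign * math.sqrt(c**2 + 4 * b**2))
    n = math.sqrt(1 + (lam / b)**2)
    eigvals.append(lam)
    eigvecs.append([1 / n, (lam / b) / n])

# Columns of U^{-1} are the eigenvectors; U is its transpose (real orthogonal).
Uinv = [[eigvecs[j][i] for j in range(2)] for i in range(2)]
U = [[Uinv[j][i] for j in range(2)] for i in range(2)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

D = matmul(U, matmul(B, Uinv))
print(D)   # close to [[4, 0], [0, -1]]
```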

We got the expected result, namely that D is diagonal with the eigenvalues of B as its diagonal elements.

4. Show that there is no unitary matrix that diagonalizes both of the matrices \sigma_x and \sigma_y.

Solution: We recall the definitions of the Pauli matrices

    \sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad
    \sigma_y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}.

We easily see that \sigma_x and \sigma_y are Hermitian. On the other hand, two Hermitian matrices A and B can be diagonalized by the same unitary matrix U if and only if they commute (cf. lecture notes, p. 0). Therefore it suffices to show that \sigma_x and \sigma_y do not commute. Indeed, direct calculation gives

    \sigma_x \sigma_y = \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}, \qquad
    \sigma_y \sigma_x = \begin{pmatrix} -i & 0 \\ 0 & i \end{pmatrix},

so that \sigma_x \sigma_y - \sigma_y \sigma_x = 2i\sigma_z \neq 0.

5. Let A and B be two non-commuting operators. Show that

    e^A e^B = e^{A + B + \frac{1}{2}[A,B]}

holds up to second order in operator multiplication.

Solution: Let C be an operator. Furthermore, let f be a function that has a Maclaurin series f(x) = \sum_{k=0}^{\infty} a_k x^k. We define

    f(C) = \sum_{k=0}^{\infty} a_k C^k,

where C^0 is to be understood as the identity operator I. Thus, by definition,

    e^A = I + A + \tfrac{1}{2} A^2 + \dots, \qquad
    e^B = I + B + \tfrac{1}{2} B^2 + \dots

Using these we obtain

    e^A e^B = \left(I + A + \tfrac{1}{2} A^2 + \dots\right)
              \left(I + B + \tfrac{1}{2} B^2 + \dots\right)
            = I + A + B + AB + \tfrac{1}{2} A^2 + \tfrac{1}{2} B^2 + \dots,

where we have dropped terms of third and higher order.

On the other hand, by definition,

    e^{A + B + \frac{1}{2}[A,B]}
      = I + A + B + \tfrac{1}{2}(AB - BA)
        + \tfrac{1}{2}\left[A + B + \tfrac{1}{2}(AB - BA)\right]^2 + \dots
      = I + A + B + \tfrac{1}{2} AB - \tfrac{1}{2} BA
        + \tfrac{1}{2} A^2 + \tfrac{1}{2} B^2 + \tfrac{1}{2} AB + \tfrac{1}{2} BA + \dots
      = I + A + B + AB + \tfrac{1}{2} A^2 + \tfrac{1}{2} B^2 + \dots

Therefore e^A e^B = e^{A + B + \frac{1}{2}[A,B]} holds up to second order in operator multiplication.
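A concrete check of this identity (not part of the original solution): for strictly upper triangular 3×3 matrices, A³ = B³ = 0 and [A, B] commutes with both A and B, so all the series above terminate and the identity holds exactly, not merely to second order. The entries of A and B below are assumed example values.

```python
# Verify e^A e^B = e^{A + B + [A,B]/2} for nilpotent 3x3 matrices, where the
# exponential series terminates: exp(M) = I + M + M^2/2 since M^3 = 0.

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def madd(*Ms):
    n = len(Ms[0])
    return [[sum(M[i][j] for M in Ms) for j in range(n)] for i in range(n)]

def scale(s, M):
    return [[s * x for x in row] for row in M]

def expm_nilpotent(M):
    """exp(M) = I + M + M^2/2, exact for 3x3 strictly upper triangular M."""
    I = [[float(i == j) for j in range(3)] for i in range(3)]
    return madd(I, M, scale(0.5, matmul(M, M)))

A = [[0, 1, 0], [0, 0, 2], [0, 0, 0]]    # assumed example
B = [[0, 3, 0], [0, 0, 1], [0, 0, 0]]    # assumed example

comm = madd(matmul(A, B), scale(-1, matmul(B, A)))   # [A, B]
lhs = matmul(expm_nilpotent(A), expm_nilpotent(B))
rhs = expm_nilpotent(madd(A, B, scale(0.5, comm)))
print(lhs == rhs)   # True: the two sides agree exactly here
```

This is the standard Heisenberg-group example where the Baker–Campbell–Hausdorff series truncates after the first commutator.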