76333A QUANTUM MECHANICS II, Solutions, Spring 07

1. Let A be a diagonal and real matrix whose diagonal elements are not equal.
(a) Find the eigenvalues and eigenvectors of A.
(b) Find the Hermitian matrices that commute with A.
(c) Find the eigenvalues and eigenvectors of the matrices found in (b).

Solution:
(a) Let us use the notation
\[
A = \begin{pmatrix} a_1 & 0 \\ 0 & a_2 \end{pmatrix}, \qquad a_1 \neq a_2 .
\]
By using the definition of the matrix product, one can easily show that the eigenvectors of a diagonal $n \times n$ matrix are the $n$-component standard basis vectors. Furthermore, when one constructs that proof, one sees that the diagonal elements are the eigenvalues. Thus the eigenvalues and eigenvectors of A are
\[
\lambda_1 = a_1 : \quad |a_1\rangle = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \qquad
\lambda_2 = a_2 : \quad |a_2\rangle = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.
\]
(b) We call a matrix B Hermitian if it is equal to its conjugate transpose, i.e. $B^\dagger = B$. Consequently, a Hermitian matrix is of the form
\[
B = \begin{pmatrix} b_{11} & b_{12} \\ b_{12}^* & b_{22} \end{pmatrix},
\]
where $b_{11}, b_{22} \in \mathbb{R}$. That is, a matrix is Hermitian if its diagonal elements are real and its off-diagonal elements are complex conjugates of each other. Direct calculation yields
\[
AB = \begin{pmatrix} a_1 & 0 \\ 0 & a_2 \end{pmatrix}
\begin{pmatrix} b_{11} & b_{12} \\ b_{12}^* & b_{22} \end{pmatrix}
= \begin{pmatrix} a_1 b_{11} & a_1 b_{12} \\ a_2 b_{12}^* & a_2 b_{22} \end{pmatrix}
\]
and
\[
BA = \begin{pmatrix} b_{11} & b_{12} \\ b_{12}^* & b_{22} \end{pmatrix}
\begin{pmatrix} a_1 & 0 \\ 0 & a_2 \end{pmatrix}
= \begin{pmatrix} a_1 b_{11} & a_2 b_{12} \\ a_1 b_{12}^* & a_2 b_{22} \end{pmatrix}.
\]
Calculating the commutator results in
\[
AB - BA = \begin{pmatrix} 0 & (a_1 - a_2) b_{12} \\ (a_2 - a_1) b_{12}^* & 0 \end{pmatrix}.
\]
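The form of the commutator can be checked numerically; the following is a minimal sketch using NumPy, with the sample values $a_1 = 1$, $a_2 = 2$, and an arbitrary off-diagonal element $b_{12}$ chosen only for illustration:

```python
import numpy as np

# Sample values for illustration (the problem assumes a1 != a2).
a1, a2 = 1.0, 2.0
b12 = 0.5 + 0.3j  # arbitrary nonzero off-diagonal element

A = np.diag([a1, a2])
# Hermitian B: real diagonal, conjugate off-diagonal elements
B = np.array([[1.5, b12],
              [np.conj(b12), -0.7]])

# Commutator [A, B] = AB - BA
comm = A @ B - B @ A

# Expected form: zero diagonal, (a1 - a2) * b12 off the diagonal
expected = np.array([[0, (a1 - a2) * b12],
                     [(a2 - a1) * np.conj(b12), 0]])

print(np.allclose(comm, expected))  # True
print(np.allclose(comm, 0))         # False: b12 != 0, so A and B do not commute
```

Setting `b12 = 0` makes the commutator vanish, in agreement with the conclusion below that a Hermitian B commutes with A exactly when it is diagonal.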
Since we want the matrices A and B to commute, we demand that $b_{12} = b_{12}^* = 0$ (recall that $a_1 \neq a_2$), i.e.
\[
B = \begin{pmatrix} b_{11} & 0 \\ 0 & b_{22} \end{pmatrix}.
\]
Thus a Hermitian matrix B commutes with A if and only if it is diagonal.
(c) We recall that the matrices we found in (b) are diagonal Hermitian matrices. We easily see that a diagonal Hermitian matrix is a diagonal real matrix. Thus, for the case of unequal diagonal elements, we can refer to part (a) of this problem. Similarly, for the case of equal diagonal elements, we can refer to part (a) of problem 2.

2. Let A be a real and diagonal matrix whose diagonal elements are equal. Let B be a real and symmetric matrix
\[
B = \begin{pmatrix} 0 & b \\ b & c \end{pmatrix}.
\]
(a) Find the eigenvalues and eigenvectors of the matrix A.
(b) Show that A and B commute.
(c) Find the joint eigenvectors of the matrices.

Solution:
(a) Let us use the notation
\[
A = \begin{pmatrix} a & 0 \\ 0 & a \end{pmatrix} = aI.
\]
Consequently, any $X \in \mathbb{R}^2$ is an eigenvector of A corresponding to the eigenvalue $a$. In problem 1(a), we stated that the eigenvectors of a diagonal $n \times n$ matrix are the $n$-component standard basis vectors. However, now any $X \in \mathbb{R}^2$ is an eigenvector of A; there are more eigenvectors than the statement claims there to be. The explanation is that the statement is slightly imprecise: it does not take into account the fact that if the eigenvectors $X_1, X_2, \ldots, X_k$ all correspond to the same eigenvalue, say $\lambda$, then any linear combination $c_1 X_1 + c_2 X_2 + \cdots + c_k X_k$ also corresponds to $\lambda$. However, the statement gives us a complete orthonormal set of eigenvectors, and since a complete set of eigenvectors is just what we usually want to construct, the statement is completely usable.
(b) Because A is a constant times the identity matrix, A and B commute with each other.
(c) By solving the characteristic equation of B,
\[
\lambda^2 - c\lambda - b^2 = 0,
\]
we obtain its eigenvalues
\[
\lambda_1 = \tfrac{1}{2}\left(c + \sqrt{c^2 + 4b^2}\right), \qquad
\lambda_2 = \tfrac{1}{2}\left(c - \sqrt{c^2 + 4b^2}\right).
\]
We note that the eigenvalues are unequal (we assume that B is not the zero matrix). We obtain the eigenvectors of B from the equation
\[
\begin{pmatrix} 0 & b \\ b & c \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
= \lambda \begin{pmatrix} x \\ y \end{pmatrix}
\quad \Longleftrightarrow \quad
\begin{cases} -\lambda x + b y = 0 \\ b x + (c - \lambda) y = 0 \end{cases}
\]
For the sake of convenience, we restrict ourselves to the case $b \neq 0$. Then we obtain $y = (\lambda/b)x$ from the first equation. Consequently, the eigenvalues and eigenvectors of B are
\[
\lambda_1 = \tfrac{1}{2}\left(c + \sqrt{c^2 + 4b^2}\right) : \quad
|\lambda_1\rangle = c_1 \begin{pmatrix} 1 \\ \lambda_1/b \end{pmatrix},
\]
\[
\lambda_2 = \tfrac{1}{2}\left(c - \sqrt{c^2 + 4b^2}\right) : \quad
|\lambda_2\rangle = c_2 \begin{pmatrix} 1 \\ \lambda_2/b \end{pmatrix},
\]
where $c_1, c_2 \in \mathbb{C}\setminus\{0\}$. Let us still normalize the eigenvectors. We could determine the normalization factors by an easy mental calculation; however, for the sake of exercise, we carry out the formal calculation. Setting $\langle \lambda_1 | \lambda_1 \rangle = 1$, we obtain
\[
c_1^* \begin{pmatrix} 1 & \lambda_1/b \end{pmatrix}
c_1 \begin{pmatrix} 1 \\ \lambda_1/b \end{pmatrix}
= |c_1|^2 \left(1 + \lambda_1^2/b^2\right) = 1 .
\]
We see that normalization determines $c_1$ up to an arbitrary phase factor. Choosing $c_1$ to be real and positive, we obtain
\[
c_1 = \frac{1}{\sqrt{1 + \lambda_1^2/b^2}} .
\]
Similarly, we obtain $c_2 = 1/\sqrt{1 + \lambda_2^2/b^2}$. Consequently, the eigenvalues and normalized eigenvectors of B are
\[
\lambda_1 = \tfrac{1}{2}\left(c + \sqrt{c^2 + 4b^2}\right) : \quad
|\lambda_1\rangle = \frac{1}{\sqrt{1 + \lambda_1^2/b^2}} \begin{pmatrix} 1 \\ \lambda_1/b \end{pmatrix},
\]
\[
\lambda_2 = \tfrac{1}{2}\left(c - \sqrt{c^2 + 4b^2}\right) : \quad
|\lambda_2\rangle = \frac{1}{\sqrt{1 + \lambda_2^2/b^2}} \begin{pmatrix} 1 \\ \lambda_2/b \end{pmatrix}.
\]
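The closed-form eigenpairs above can be verified numerically; a minimal sketch with the sample values $b = 1$, $c = 2$ (chosen only for illustration):

```python
import numpy as np

b, c = 1.0, 2.0  # sample values; b != 0 as assumed in the text
B = np.array([[0.0, b],
              [b,   c]])

# Closed-form eigenvalues from lambda^2 - c*lambda - b^2 = 0
lam1 = 0.5 * (c + np.sqrt(c**2 + 4 * b**2))
lam2 = 0.5 * (c - np.sqrt(c**2 + 4 * b**2))

# Closed-form normalized eigenvectors (1, lambda/b)^T / sqrt(1 + lambda^2/b^2)
v1 = np.array([1.0, lam1 / b]) / np.sqrt(1 + (lam1 / b)**2)
v2 = np.array([1.0, lam2 / b]) / np.sqrt(1 + (lam2 / b)**2)

# They satisfy the eigenvalue equation and are orthonormal
print(np.allclose(B @ v1, lam1 * v1))  # True
print(np.allclose(B @ v2, lam2 * v2))  # True
print(np.isclose(v1 @ v2, 0.0))        # True: distinct eigenvalues, orthogonal vectors
```

The orthogonality also follows analytically: by Vieta's formulas $\lambda_1 \lambda_2 = -b^2$, so $1 + \lambda_1\lambda_2/b^2 = 0$.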
We recall that any $X \in \mathbb{R}^2$ is an eigenvector of A. Therefore $|\lambda_1\rangle$ and $|\lambda_2\rangle$ are the simultaneous eigenvectors of A and B.

3. Diagonalize the matrix B of the previous problem. Give an interpretation of the matrix elements of the matrix obtained this way.

Solution: For every Hermitian matrix H there is a unitary matrix U such that $D = U H U^{-1}$ is diagonal. The columns of $U^{-1} = U^\dagger$ are the normalized eigenvectors of H (cf. lecture notes p. 9). The eigenvalues and eigenvectors of B are
\[
\lambda_1 = \tfrac{1}{2}\left(c + \sqrt{c^2 + 4b^2}\right) : \quad
|\lambda_1\rangle = \frac{1}{\sqrt{1 + \lambda_1^2/b^2}} \begin{pmatrix} 1 \\ \lambda_1/b \end{pmatrix},
\]
\[
\lambda_2 = \tfrac{1}{2}\left(c - \sqrt{c^2 + 4b^2}\right) : \quad
|\lambda_2\rangle = \frac{1}{\sqrt{1 + \lambda_2^2/b^2}} \begin{pmatrix} 1 \\ \lambda_2/b \end{pmatrix}.
\]
Thus the inverse of the unitary matrix that diagonalizes B is
\[
U^{-1} = \begin{pmatrix}
\dfrac{1}{\sqrt{1 + \lambda_1^2/b^2}} & \dfrac{1}{\sqrt{1 + \lambda_2^2/b^2}} \\[2mm]
\dfrac{\lambda_1/b}{\sqrt{1 + \lambda_1^2/b^2}} & \dfrac{\lambda_2/b}{\sqrt{1 + \lambda_2^2/b^2}}
\end{pmatrix}.
\]
From matrix theory we know that $D = U B U^{-1}$ is diagonal with the eigenvalues of B as the diagonal elements. However, for the sake of exercise, let us calculate D. We denote the $i$th column of a matrix V by $V_i$, i.e.
\[
V = \begin{pmatrix} V_1 & V_2 & \cdots & V_n \end{pmatrix}.
\]
By using the definition of the matrix product, we can easily show that
\[
BV = \begin{pmatrix} BV_1 & BV_2 & \cdots & BV_n \end{pmatrix}.
\]
We can then write $U^{-1}$ as $U^{-1} = \begin{pmatrix} |\lambda_1\rangle & |\lambda_2\rangle \end{pmatrix}$. Thus
\[
B U^{-1} = \begin{pmatrix} B|\lambda_1\rangle & B|\lambda_2\rangle \end{pmatrix}
= \begin{pmatrix} \lambda_1 |\lambda_1\rangle & \lambda_2 |\lambda_2\rangle \end{pmatrix}.
\]
Since the rows of $U = (U^{-1})^\dagger$ are $\langle\lambda_1|$ and $\langle\lambda_2|$, we can then diagonalize B as
\[
D = U B U^{-1}
= \begin{pmatrix} \langle\lambda_1| \\ \langle\lambda_2| \end{pmatrix}
\begin{pmatrix} \lambda_1 |\lambda_1\rangle & \lambda_2 |\lambda_2\rangle \end{pmatrix}
= \begin{pmatrix}
\lambda_1 \langle\lambda_1|\lambda_1\rangle & \lambda_2 \langle\lambda_1|\lambda_2\rangle \\
\lambda_1 \langle\lambda_2|\lambda_1\rangle & \lambda_2 \langle\lambda_2|\lambda_2\rangle
\end{pmatrix}
= \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix},
\]
where we used the orthonormality of the eigenvectors.
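The diagonalization can be checked numerically; a sketch that builds $U^{-1}$ from the normalized eigenvectors, again with the illustrative sample values $b = 1$, $c = 2$:

```python
import numpy as np

b, c = 1.0, 2.0  # sample values, b != 0, chosen only for illustration
B = np.array([[0.0, b],
              [b,   c]])

lam1 = 0.5 * (c + np.sqrt(c**2 + 4 * b**2))
lam2 = 0.5 * (c - np.sqrt(c**2 + 4 * b**2))

# U^{-1} has the normalized eigenvectors as its columns
v1 = np.array([1.0, lam1 / b]) / np.sqrt(1 + (lam1 / b)**2)
v2 = np.array([1.0, lam2 / b]) / np.sqrt(1 + (lam2 / b)**2)
U_inv = np.column_stack([v1, v2])

# B is real symmetric, so U is real orthogonal: U = (U^{-1})^T
U = U_inv.T

D = U @ B @ U_inv
print(np.allclose(D, np.diag([lam1, lam2])))  # True: D is diagonal, eigenvalues on the diagonal
```
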
We got the expected result, namely that D is diagonal with the eigenvalues of B as its diagonal elements.

4. Show that there is no unitary matrix that diagonalizes both of the matrices $\sigma_x$ and $\sigma_y$.

Solution: We recall the definitions of the Pauli matrices
\[
\sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad
\sigma_y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}.
\]
We easily see that $\sigma_x$ and $\sigma_y$ are Hermitian. On the other hand, two Hermitian matrices A and B can be diagonalized by the same unitary matrix U if and only if they commute (cf. lectures p. 0). Therefore it suffices to show that $\sigma_x$ and $\sigma_y$ do not commute. Direct calculation gives
\[
\sigma_x \sigma_y = \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}, \qquad
\sigma_y \sigma_x = \begin{pmatrix} -i & 0 \\ 0 & i \end{pmatrix},
\]
so that
\[
[\sigma_x, \sigma_y] = \sigma_x \sigma_y - \sigma_y \sigma_x
= \begin{pmatrix} 2i & 0 \\ 0 & -2i \end{pmatrix} \neq 0 .
\]

5. Let A and B be two non-commuting operators. Show that
\[
e^A e^B = e^{A + B + \frac{1}{2}[A,B]}
\]
holds up to second order in operator multiplication.

Solution: Let C be an operator. Furthermore, let f be a function that has a Maclaurin series $f(x) = \sum_{k=0}^\infty a_k x^k$. We define
\[
f(C) = \sum_{k=0}^\infty a_k C^k,
\]
where $C^0$ is to be understood as the identity operator I. Thus, by definition,
\[
e^A = I + A + \tfrac{1}{2}A^2 + \ldots, \qquad
e^B = I + B + \tfrac{1}{2}B^2 + \ldots
\]
Using these we obtain
\begin{align*}
e^A e^B &= \left(I + A + \tfrac{1}{2}A^2 + \ldots\right)\left(I + B + \tfrac{1}{2}B^2 + \ldots\right) \\
&= I + A + \tfrac{1}{2}A^2 + B + AB + \tfrac{1}{2}A^2 B + \tfrac{1}{2}B^2 + \tfrac{1}{2}AB^2 + \tfrac{1}{4}A^2 B^2 + \ldots \\
&= I + A + B + AB + \tfrac{1}{2}A^2 + \tfrac{1}{2}B^2 + \ldots
\end{align*}
By definition,
\begin{align*}
e^{A + B + \frac{1}{2}[A,B]}
&= I + A + B + \tfrac{1}{2}(AB - BA) + \tfrac{1}{2}\left[A + B + \tfrac{1}{2}(AB - BA)\right]^2 + \ldots \\
&= I + A + B + \tfrac{1}{2}AB - \tfrac{1}{2}BA + \tfrac{1}{2}A^2 + \tfrac{1}{2}B^2 + \tfrac{1}{2}AB + \tfrac{1}{2}BA + \ldots \\
&= I + A + B + AB + \tfrac{1}{2}A^2 + \tfrac{1}{2}B^2 + \ldots
\end{align*}
Therefore
\[
e^A e^B = e^{A + B + \frac{1}{2}[A,B]}
\]
holds up to second order in operator multiplication.
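The term-by-term agreement of the two second-order expansions can be checked numerically; a sketch that truncates each side at second order for two arbitrary (non-commuting) sample matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))  # arbitrary sample matrices; generically non-commuting
B = rng.normal(size=(3, 3))
I = np.eye(3)

# Left side: e^A e^B = (I + A + A^2/2)(I + B + B^2/2) + ...,
# keeping only terms of total order <= 2 in A and B
lhs = I + A + B + A @ B + 0.5 * (A @ A) + 0.5 * (B @ B)

# Right side: e^{A + B + [A,B]/2} = I + C + C^2/2 + ... with C = A + B + [A,B]/2,
# again keeping only terms of total order <= 2
comm = A @ B - B @ A
S = A + B
rhs = I + S + 0.5 * comm + 0.5 * (S @ S)

print(np.allclose(A @ B, B @ A))  # False: the sample matrices do not commute
print(np.allclose(lhs, rhs))      # True: the expansions agree term by term
```

The agreement is an exact matrix identity, not an approximation for these particular matrices: $\tfrac{1}{2}(AB - BA) + \tfrac{1}{2}(A+B)^2 = AB + \tfrac{1}{2}A^2 + \tfrac{1}{2}B^2$, since the $BA$ terms cancel.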