763313a Quantum Mechanics II - Solution S
Solution
a) Let us use the notation
A = \begin{pmatrix} a_{11} & 0 \\ 0 & a_{22} \end{pmatrix}. (1.1)
By using the definition of the matrix product, one can easily show that the
eigenvectors of a diagonal n × n matrix are the n-component standard basis
vectors. Furthermore, when one constructs that proof, one sees that the diagonal
elements are the eigenvalues. Thus the eigenvalues and eigenvectors of A are
\lambda = a_{11} : |a_{11}\rangle = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \qquad \lambda = a_{22} : |a_{22}\rangle = \begin{pmatrix} 0 \\ 1 \end{pmatrix}. (1.2)
b) A general Hermitian 2 × 2 matrix can be written as
B = \begin{pmatrix} b_{11} & b_{12} \\ b_{12}^* & b_{22} \end{pmatrix}, (1.3)
where b_{11}, b_{22} ∈ R. That is, a 2 × 2 matrix is Hermitian if its diagonal elements are real and its off-diagonal elements are complex conjugates of each other.
Anyway, direct calculation yields
AB = \begin{pmatrix} a_{11} & 0 \\ 0 & a_{22} \end{pmatrix} \begin{pmatrix} b_{11} & b_{12} \\ b_{12}^* & b_{22} \end{pmatrix} = \begin{pmatrix} a_{11} b_{11} & a_{11} b_{12} \\ a_{22} b_{12}^* & a_{22} b_{22} \end{pmatrix} (1.4)
and
BA = \begin{pmatrix} b_{11} & b_{12} \\ b_{12}^* & b_{22} \end{pmatrix} \begin{pmatrix} a_{11} & 0 \\ 0 & a_{22} \end{pmatrix} = \begin{pmatrix} a_{11} b_{11} & a_{22} b_{12} \\ a_{11} b_{12}^* & a_{22} b_{22} \end{pmatrix}. (1.5)
Employing Eqs. (1.4) and (1.5), we obtain
AB - BA = \begin{pmatrix} 0 & b_{12}(a_{11} - a_{22}) \\ b_{12}^*(a_{22} - a_{11}) & 0 \end{pmatrix}. (1.6)
Setting this commutator equal to zero, we obtain b_{12} = 0 (because a_{11} ≠ a_{22}).
Thus a Hermitian matrix B commutes with A if and only if it is diagonal.
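The result can be illustrated numerically. The following sketch (with arbitrarily chosen values for a_{11}, a_{22}, and b_{12}, which are assumptions for the example) checks that the commutator of Eq. (1.6) is nonzero when b_{12} ≠ 0 and vanishes when B is diagonal.

```python
import numpy as np

# Example values (hypothetical choices, not from the exercise):
a11, a22 = 1.0, 2.0
A = np.diag([a11, a22])

b12 = 0.5 + 0.3j  # nonzero off-diagonal element of the Hermitian matrix B
B = np.array([[1.0, b12],
              [np.conj(b12), 3.0]])

comm = A @ B - B @ A
# Off-diagonal entries match Eq. (1.6): b12*(a11 - a22) and its conjugate.
print(comm[0, 1])  # equals b12 * (a11 - a22), nonzero here

# A diagonal B commutes with A:
B_diag = np.diag([1.0, 3.0])
print(np.allclose(A @ B_diag - B_diag @ A, 0))  # True
```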
2. Degenerate Eigenvalues
Let A be a real and diagonal 2 × 2 matrix whose diagonal elements are equal.
Let B be a real and symmetric 2 × 2 matrix
B = \begin{pmatrix} 0 & b \\ b & c \end{pmatrix}.
a) Find the eigenvalues and eigenvectors of matrix A.
b) Show that A and B commute.
c) Find the simultaneous eigenvectors of the matrices.
Solution
a) Let us use the notation
A = \begin{pmatrix} a & 0 \\ 0 & a \end{pmatrix}. (2.1)
We immediately see that
A = aI2×2 . (2.2)
Consequently, any X ∈ R2 is an eigenvector of A corresponding to eigenvalue a.
In problem 1 a), we stated that the eigenvectors of a diagonal n × n matrix
are the n-component standard basis vectors. However, now any X ∈ R2 is an
eigenvector of A. Indeed, there are now more eigenvectors than the statement
claims there to be. The explanation is that the statement is slightly incorrect.
To be more precise, it does not take into account the following fact: If eigen-
vectors X1 , X2 , · · · , Xk all correspond to the same eigenvalue, say λ, then any
linear combination c1 X1 + c2 X2 + · · · + ck Xk corresponds also to λ. However,
the statement gives us a complete orthonormal set of eigenvectors.¹ Since a
complete set of eigenvectors is just what we usually want to construct, the
statement is completely usable.
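A quick numerical illustration of Eq. (2.2): for A = aI, an arbitrary vector is an eigenvector with eigenvalue a (the value of a and the random vector below are hypothetical example choices).

```python
import numpy as np

a = 3.0              # example eigenvalue (assumption for illustration)
A = a * np.eye(2)    # A = a * I, cf. Eq. (2.2)

rng = np.random.default_rng(0)
x = rng.standard_normal(2)        # an arbitrary nonzero vector
print(np.allclose(A @ x, a * x))  # True: every vector is an eigenvector
```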
b) Since A = aI, we have AB = aB = BA for every 2 × 2 matrix B; in particular, A and B commute.
c) The characteristic equation of matrix B reads \lambda^2 - c\lambda - b^2 = 0. By solving it,
we obtain the eigenvalues \lambda_1 = \frac{1}{2}(c + \sqrt{c^2 + 4b^2}) and \lambda_2 = \frac{1}{2}(c - \sqrt{c^2 + 4b^2}). We
note that the eigenvalues are unequal (we assume that B is not a zero matrix).
We obtain the eigenvectors of B from the equation
\begin{pmatrix} 0 & b \\ b & c \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \lambda \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}. (2.3)
¹ We say that a set of eigenvectors is complete if all eigenvectors can be expressed as linear combinations of its members.
Written component-wise, Eq. (2.3) reads
-\lambda x_1 + b x_2 = 0, \qquad b x_1 + (c - \lambda) x_2 = 0. (2.4)
The first equation gives x_2 = (\lambda/b) x_1, so the eigenvectors of B are
|\lambda_1\rangle = c_1 \begin{pmatrix} 1 \\ \lambda_1/b \end{pmatrix}, \qquad |\lambda_2\rangle = c_2 \begin{pmatrix} 1 \\ \lambda_2/b \end{pmatrix}, (2.5)
where c_1, c_2 ∈ C\{0}.
Let us now normalize the eigenvectors. We could determine the normalization
factors by an easy mental calculation. However, for the sake of exercise, we carry
out the formal calculations. Setting hλ1 |λ1 i = 1, we obtain
c_1^* \begin{pmatrix} 1 & \lambda_1/b \end{pmatrix} \begin{pmatrix} 1 \\ \lambda_1/b \end{pmatrix} c_1 = 1 \quad \Leftrightarrow \quad |c_1|^2 \left( 1 + \frac{\lambda_1^2}{b^2} \right) = 1. (2.6)
Choosing c_1 real and positive, this yields
c_1 = \frac{1}{\sqrt{1 + \lambda_1^2/b^2}}. (2.7)
Similarly we obtain
c_2 = \frac{1}{\sqrt{1 + \lambda_2^2/b^2}}. (2.8)
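The eigenvalues and normalized eigenvectors found above can be cross-checked against NumPy's symmetric eigensolver. The values of b and c below are hypothetical example choices.

```python
import numpy as np

b, c = 1.0, 2.0              # example values (assumptions for illustration)
B = np.array([[0.0, b],
              [b,   c]])

lam1 = 0.5 * (c + np.sqrt(c**2 + 4*b**2))
lam2 = 0.5 * (c - np.sqrt(c**2 + 4*b**2))

# eigvalsh returns eigenvalues in ascending order, so [lam2, lam1]:
print(np.allclose(np.linalg.eigvalsh(B), [lam2, lam1]))  # True

# Normalized eigenvector for lam1, cf. Eqs. (2.5)-(2.7):
v1 = np.array([1.0, lam1 / b]) / np.sqrt(1 + lam1**2 / b**2)
print(np.allclose(B @ v1, lam1 * v1))  # True
print(np.isclose(v1 @ v1, 1.0))        # True: properly normalized
```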
3. Diagonalization of a Matrix
Diagonalize the matrix B of the previous problem. Give an interpretation of
the matrix elements of the matrix obtained this way.
Solution
For every Hermitian matrix H there is a unitary matrix U so that
D = U H U^{-1} (3.1)
is diagonal. The columns of U^{-1} are the normalized eigenvectors of H. (Cf.
lecture notes p. 9.)
We recall that B is a real and symmetric matrix
B = \begin{pmatrix} 0 & b \\ b & c \end{pmatrix}. (3.2)
The eigenvalues and eigenvectors of B are [cf. Eq. (2.9)]
\lambda_1 = \frac{1}{2}(c + \sqrt{c^2 + 4b^2}) : |\lambda_1\rangle = \frac{1}{\sqrt{1 + \lambda_1^2/b^2}} \begin{pmatrix} 1 \\ \lambda_1/b \end{pmatrix},
\lambda_2 = \frac{1}{2}(c - \sqrt{c^2 + 4b^2}) : |\lambda_2\rangle = \frac{1}{\sqrt{1 + \lambda_2^2/b^2}} \begin{pmatrix} 1 \\ \lambda_2/b \end{pmatrix}. (3.3)
Thus the inverse of the unitary matrix that diagonalizes B is
U^{-1} = \begin{pmatrix} \frac{1}{\sqrt{1 + \lambda_1^2/b^2}} & \frac{1}{\sqrt{1 + \lambda_2^2/b^2}} \\ \frac{\lambda_1/b}{\sqrt{1 + \lambda_1^2/b^2}} & \frac{\lambda_2/b}{\sqrt{1 + \lambda_2^2/b^2}} \end{pmatrix}. (3.4)
Employing Eqs. (3.7) and (3.8), we obtain
D = U B U^{-1}
  = \begin{pmatrix} \langle \lambda_1 | \\ \langle \lambda_2 | \end{pmatrix} \begin{pmatrix} \lambda_1 |\lambda_1\rangle & \lambda_2 |\lambda_2\rangle \end{pmatrix}
  = \begin{pmatrix} \lambda_1 \langle \lambda_1 | \lambda_1 \rangle & \lambda_2 \langle \lambda_1 | \lambda_2 \rangle \\ \lambda_1 \langle \lambda_2 | \lambda_1 \rangle & \lambda_2 \langle \lambda_2 | \lambda_2 \rangle \end{pmatrix}
  = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix}. (3.9)
We got the expected result, namely that D is diagonal with the eigenvalues of
B as its diagonal elements.
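The diagonalization can be verified numerically: building U^{-1} from the normalized eigenvectors as in Eq. (3.4) and forming U B U^{-1} should reproduce diag(λ_1, λ_2). The values of b and c are again hypothetical example choices; since B is real and symmetric, U is simply the transpose of U^{-1}.

```python
import numpy as np

b, c = 1.0, 2.0              # example values (assumptions for illustration)
B = np.array([[0.0, b],
              [b,   c]])

lam1 = 0.5 * (c + np.sqrt(c**2 + 4*b**2))
lam2 = 0.5 * (c - np.sqrt(c**2 + 4*b**2))

# Normalized eigenvectors, cf. Eq. (3.3):
v1 = np.array([1.0, lam1 / b]) / np.sqrt(1 + lam1**2 / b**2)
v2 = np.array([1.0, lam2 / b]) / np.sqrt(1 + lam2**2 / b**2)

U_inv = np.column_stack([v1, v2])  # columns are the eigenvectors, Eq. (3.4)
U = U_inv.T                        # orthogonal for a real symmetric B

D = U @ B @ U_inv
print(np.allclose(D, np.diag([lam1, lam2])))  # True, cf. Eq. (3.9)
```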
4. The Simultaneous Diagonalization of Pauli Matrices
Show that there is no unitary matrix that diagonalizes both matrices σ1 and
σ2 .
Solution
We recall the definitions (cf. Wikipedia, Pauli matrices)
\sigma_1 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, (4.1)
\sigma_2 = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}. (4.2)
We easily see that σ1 and σ2 are Hermitian. On the other hand, two Hermitian
matrices A and B can be diagonalized by the same unitary matrix U if and only
if they commute (cf. lectures p. 10). Therefore it suffices to show that σ1 and
σ2 do not commute. Direct calculation gives
\sigma_1 \sigma_2 = \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix} = i\sigma_3, \qquad \sigma_2 \sigma_1 = \begin{pmatrix} -i & 0 \\ 0 & i \end{pmatrix} = -i\sigma_3,
so [\sigma_1, \sigma_2] = 2i\sigma_3 \neq 0, and hence no single unitary matrix diagonalizes both.
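The non-commutativity is easy to confirm numerically; the commutator of σ1 and σ2 equals 2iσ3, which is manifestly nonzero:

```python
import numpy as np

# Pauli matrices, cf. Eqs. (4.1)-(4.2):
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]])
s3 = np.array([[1, 0], [0, -1]], dtype=complex)

comm = s1 @ s2 - s2 @ s1
print(np.allclose(comm, 2j * s3))  # True: [sigma_1, sigma_2] = 2i*sigma_3
print(np.allclose(comm, 0))        # False: they do not commute
```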
5. Application of the Campbell-Baker-Hausdorff Lemma
Let A and B be two non-commuting operators. Show that
e^A e^B = e^{A + B + \frac{1}{2}[A,B]} (i)
holds up to second order in operator multiplication.
Solution
Let C be an operator. Furthermore, let f be a function that has a Maclaurin
series
f(x) = \sum_{k=0}^{\infty} a_k x^k. (5.1)
We define
f(C) = \sum_{k=0}^{\infty} a_k C^k, (5.2)