Quantum Chemistry II: Math Introduction

Albeiro Restrepo

May 24, 2009
Contents

1 More about matrices
  1.1 General definitions
    1.1.1 Transpose matrix
    1.1.2 Adjoint matrix
  1.2 Square matrices
    1.2.1 Diagonal matrix
    1.2.2 Identity matrix
    1.2.3 Inverse matrix
    1.2.4 Unitary matrix
    1.2.5 Hermitian matrix
    1.2.6 Trace or character
  1.3 Useful matrix theorems

2 Operators
1 More about matrices

1.1 General definitions

1.1.1 Transpose matrix

The transpose of a matrix $\mathbf{A}$ is obtained by interchanging its rows and columns:
$$\left(\mathbf{A}^T\right)_{ij} = A_{ji} \tag{1}$$
Example:

$$\mathbf{A}_{2\times 3} = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}
\implies \mathbf{A}^T_{3\times 2} = \begin{pmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{pmatrix}$$
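Eq. (1) can be sketched in plain Python, with a matrix stored as a list of rows (the `transpose` helper is ours, not part of the text):

```python
# Minimal sketch of Eq. (1): entry (i, j) of A^T is entry (j, i) of A.

def transpose(A):
    """Return A^T for a matrix stored as a list of rows."""
    return [[A[j][i] for j in range(len(A))] for i in range(len(A[0]))]

A = [[1, 2, 3],
     [4, 5, 6]]        # the 2x3 matrix from the example

print(transpose(A))    # [[1, 4], [2, 5], [3, 6]]
```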
1.1.2 Adjoint matrix

$$\left(\mathbf{A}^\dagger\right)_{ij} = \left[\left(\mathbf{A}^T\right)_{ij}\right]^* = A^*_{ji} \tag{2}$$
Example:

$$\mathbf{A} = \begin{pmatrix} 1 & 2i & 3 \\ 4 & 5 & 6 \end{pmatrix}
\implies \mathbf{A}^\dagger = \begin{pmatrix} 1 & 4 \\ -2i & 5 \\ 3 & 6 \end{pmatrix}$$
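Eq. (2) translates directly to code using Python's built-in complex numbers (written with `j` instead of `i`); the `adjoint` helper name is ours:

```python
# Minimal sketch of Eq. (2): transpose A and conjugate every entry.

def adjoint(A):
    """Return the adjoint (conjugate transpose) of A."""
    return [[A[j][i].conjugate() for j in range(len(A))]
            for i in range(len(A[0]))]

A = [[1, 2j, 3],
     [4, 5, 6]]          # the matrix from the example, 2i written as 2j

print(adjoint(A))        # [[1, 4], [-2j, 5], [3, 6]]
```

For a real matrix, `adjoint` and a plain transpose give the same result, matching the remark below Eq. (2).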
If the matrix elements of A are real, then A† = AT . The matrix product between
the adjoint of a column matrix and the matrix itself is given by
$$\mathbf{a}^\dagger \mathbf{a} =
\begin{pmatrix} a^*_1 & a^*_2 & \cdots & a^*_n \end{pmatrix}
\begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix}
= \sum_{i=1}^{n} a^*_i a_i = \sum_{i=1}^{n} |a_i|^2$$
which is exactly the squared magnitude of the vector represented by $\vec{a}$. Therefore, from now on, our definition of the scalar product between vectors with complex components will be
$$\vec{a} \cdot \vec{b} = \mathbf{a}^\dagger \mathbf{b} = \sum_{i=1}^{n} a^*_i b_i \tag{3}$$
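Eq. (3) can be checked numerically; note that only the components of the first vector are conjugated (the `dot` helper is our own name):

```python
# Minimal sketch of Eq. (3): a . b = a† b = sum_i a_i* b_i.

def dot(a, b):
    """Scalar product for vectors with complex components."""
    return sum(ai.conjugate() * bi for ai, bi in zip(a, b))

a = [1, 2j]
print(dot(a, a))   # 1*1 + (-2j)*(2j) = (5+0j): |a|^2 comes out real
```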
1.2 Square matrices

1.2.1 Diagonal matrix

A square matrix is diagonal if all of its off-diagonal elements vanish, $A_{ij} = 0$ for $i \neq j$. Example:

$$\mathbf{A} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3 \end{pmatrix}$$
1.2.2 Identity matrix

The identity matrix $\mathbf{I}$ is the diagonal matrix whose diagonal elements are all equal to 1. There is an identity matrix for every dimension: $\mathbf{I}_{2\times 2}$, $\mathbf{I}_{5\times 5}$, etc. It follows from the definition that multiplying any matrix by $\mathbf{I}$, either on the left or on the right, leaves the matrix unchanged, hence the name identity matrix: $\mathbf{I}\mathbf{A} = \mathbf{A}\mathbf{I} = \mathbf{A}$.
1.2.3 Inverse matrix

$$\forall\, \mathbf{A}_{m\times m}: \text{ if } \mathbf{A}^{-1}_{m\times m} \text{ exists, then } \mathbf{A}_{m\times m}\mathbf{A}^{-1}_{m\times m} = \mathbf{A}^{-1}_{m\times m}\mathbf{A}_{m\times m} = \mathbf{I}_{m\times m}$$

Not all square matrices have inverses. The $-1$ in the notation denotes the inverse, not an exponent, much as $\sin^{-1} x$ stands for the inverse of the sine function rather than its reciprocal.
1.2.4 Unitary matrix

$$\mathbf{U} \text{ is unitary if } \mathbf{U}^{-1} = \mathbf{U}^\dagger$$

Unitary matrices with real components ($U_{ij} \in \mathbb{R}$) are said to be orthogonal.
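A quick numerical illustration: a 2x2 rotation matrix is real and unitary, hence orthogonal, so $\mathbf{U}^T\mathbf{U} = \mathbf{I}$ (the angle and helper names below are arbitrary choices of ours):

```python
import math

# A rotation matrix is orthogonal: U^T U should be the identity.

def matmul(A, B):
    """Product of two matrices stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

t = 0.3                                      # an arbitrary angle
U = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]
Ut = [[U[j][i] for j in range(2)] for i in range(2)]   # transpose

P = matmul(Ut, U)    # identity up to floating-point rounding
```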
1.2.5 Hermitian matrix

$$\mathbf{A} \text{ is Hermitian if } \mathbf{A} = \mathbf{A}^\dagger$$
Hermitian matrices are very important in Quantum Mechanics. We will show later
that Hermitian matrices have real eigenvalues and that their eigenvectors are or-
thogonal.
1.2.6 Trace or character

$$\operatorname{tr} \mathbf{A} = \chi(\mathbf{A}) = \sum_{i=1}^{n} A_{ii} \tag{6}$$
Example:

$$\mathbf{A} = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix}
\implies \operatorname{tr} \mathbf{A} = \sum_{i=1}^{3} A_{ii} = 1 + 5 + 9 = 15$$
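Eq. (6) is a one-liner in code (the `trace` helper is our own name):

```python
# Minimal sketch of Eq. (6): the trace is the sum of the diagonal elements.

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]]

print(trace(A))    # 1 + 5 + 9 = 15
```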
Traces have some very important properties to be demonstrated later. For instance,
traces are invariant under unitary transformations (rotations, diagonalizations, etc.);
traces are also independent of the order in which matrices are multiplied. Characters
are very important in group theory, which we will be exploring in the near future.
1.3 Useful matrix theorems

Theorem 1.1 The trace of the product of two matrices is invariant regardless of the order of multiplication.
$$\mathbf{A}\mathbf{B} = \mathbf{C}, \qquad \mathbf{B}\mathbf{A} = \mathbf{D}$$

therefore,

$$\operatorname{tr} \mathbf{C} = \sum_{m=1}^{n} C_{mm}, \qquad \operatorname{tr} \mathbf{D} = \sum_{p=1}^{n} D_{pp}$$

$$C_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj} \implies C_{mm} = \sum_{k=1}^{n} A_{mk} B_{km} \implies$$

$$\operatorname{tr} \mathbf{C} = \sum_{m=1}^{n} C_{mm} = \sum_{m=1}^{n} \sum_{k=1}^{n} A_{mk} B_{km} = \sum_{m=1}^{n} \sum_{k=1}^{n} B_{km} A_{mk}$$
The last step is possible because multiplication of numbers is commutative (matrix multiplication is not). The above sum contains all the possible products between elements of $\mathbf{A}$ and $\mathbf{B}$, a total of $n^2$ terms. Now, since the order of summation is irrelevant, it is legal to exchange the two sums, and since summation indices are dummy variables, we are allowed to relabel them; therefore,
$$\operatorname{tr} \mathbf{C} = \sum_{m=1}^{n} \sum_{k=1}^{n} B_{km} A_{mk} = \sum_{k=1}^{n} \left( \sum_{m=1}^{n} B_{km} A_{mk} \right) = \sum_{k=1}^{n} D_{kk} = \sum_{p=1}^{n} D_{pp} = \operatorname{tr} \mathbf{D}$$
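Theorem 1.1 can be checked numerically on a pair of matrices that do not commute (the matrices below are arbitrary choices of ours):

```python
# Numerical check of Theorem 1.1: tr(AB) = tr(BA) even when AB != BA.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, 6]]

print(matmul(A, B) == matmul(B, A))                   # False: AB != BA
print(trace(matmul(A, B)), trace(matmul(B, A)))       # 37 37
```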
Theorem 1.2 The inverse of a product of matrices equals the product of the inverses in reverse order.

Proof Take the product of the inverses of two matrices to be $\mathbf{B}^{-1}\mathbf{A}^{-1} = \mathbf{C}$. Multiplying on the right, first by $\mathbf{A}$ and then by $\mathbf{B}$,

$$\begin{aligned}
\mathbf{B}^{-1}\mathbf{A}^{-1}\mathbf{A} &= \mathbf{C}\mathbf{A} \\
\mathbf{B}^{-1} &= \mathbf{C}\mathbf{A} \\
\mathbf{B}^{-1}\mathbf{B} &= \mathbf{C}\mathbf{A}\mathbf{B} \\
\mathbf{I} &= \mathbf{C}\,(\mathbf{A}\mathbf{B})
\end{aligned}$$

so that $\mathbf{C} = (\mathbf{A}\mathbf{B})^{-1}$.
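Theorem 1.2 can also be verified numerically for 2x2 matrices, using the textbook closed-form 2x2 inverse (the matrices and helper names below are our own choices):

```python
# Numerical check of Theorem 1.2: (AB)^-1 = B^-1 A^-1.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2))
             for j in range(2)] for i in range(2)]

def inv2(M):
    """Closed-form inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [1, 1]]       # both chosen with determinant 1,
B = [[1, 2], [0, 1]]       # so all entries stay exact

lhs = inv2(matmul(A, B))           # (AB)^-1
rhs = matmul(inv2(B), inv2(A))     # B^-1 A^-1

print(lhs == rhs)   # True
```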
Theorem 1.2 is known as the switch-over rule; it also holds for the adjoint of a matrix product. As an application, if $\mathbf{A} = \mathbf{U}^\dagger \mathbf{B}\mathbf{U}$ with $\mathbf{U}$ unitary ($\mathbf{U}^{-1} = \mathbf{U}^\dagger$), then $\mathbf{B}$ can be recovered from $\mathbf{A}$:

$$\begin{aligned}
\mathbf{A} &= \mathbf{U}^\dagger \mathbf{B}\mathbf{U} \\
\mathbf{A}\mathbf{U}^{-1} &= \mathbf{U}^\dagger \mathbf{B}\mathbf{U}\mathbf{U}^{-1} \\
\mathbf{A}\mathbf{U}^{-1} &= \mathbf{U}^\dagger \mathbf{B} \\
\mathbf{A}\mathbf{U}^\dagger &= \mathbf{U}^{-1}\mathbf{B} \\
\mathbf{U}\mathbf{A}\mathbf{U}^\dagger &= \mathbf{U}\mathbf{U}^{-1}\mathbf{B} \\
\mathbf{U}\mathbf{A}\mathbf{U}^\dagger &= \mathbf{B}
\end{aligned}$$
2 Operators
An operator is a rule to transform mathematical objects. A classical example is the derivative operator: it takes a function and transforms it into its derivative. An operator is said to be linear if it is distributive with respect to addition of its arguments (numbers, functions, vectors, matrices, etc.) and with respect to scalar multiplication:
$$O\left(r\vec{a} + s\vec{b}\right) = rO\vec{a} + sO\vec{b} \quad \text{or} \quad O\left(r\,|a\rangle + s\,|b\rangle\right) = rO\,|a\rangle + sO\,|b\rangle \tag{7}$$
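Eq. (7) can be checked concretely for the derivative operator acting on polynomials, stored here as coefficient lists (all helper names below are our own):

```python
# Linearity of the derivative operator, Eq. (7), on polynomials
# represented as coefficient lists [c0, c1, c2, ...] for c0 + c1 x + c2 x^2 + ...

def D(p):
    """Derivative operator: d/dx of sum_i c_i x^i."""
    return [i * p[i] for i in range(1, len(p))]

def add(p, q):
    return [a + b for a, b in zip(p, q)]

def scale(r, p):
    return [r * c for c in p]

p = [1, 2, 3]   # 1 + 2x + 3x^2
q = [0, 1, 4]   # x + 4x^2

# D(2p + 5q) == 2 Dp + 5 Dq, as linearity requires
print(D(add(scale(2, p), scale(5, q))) == add(scale(2, D(p)), scale(5, D(q))))
```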
The derivative is a linear operator, while taking the square root is not. Linear operators
have some very nice and useful features. Consider for example a linear operator
acting on a given ket of its own solution space. Since the ket belongs to the solution
space of the operator, it can be written as a linear combination of the n kets of some
basis set {|ii}, and since it is linear, it can be distributed inside the sum:
$$O\,|\psi\rangle = O\left(\sum_{i=1}^{n} c_i\,|i\rangle\right) = \sum_{i=1}^{n} c_i\,O\,|i\rangle \tag{8}$$
This innocent-looking equation is indeed very powerful, for it tells us that we do not need to know what a linear operator does to every ket of its solution space, which may be infinite in number; it suffices to know what the operator does to the reduced number of kets in a given basis set, from which the entire space can be constructed (this is also a consequence of the solutions of an operator constituting a vector space).
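Eq. (8) can be illustrated with a matrix as the linear operator and the standard basis as the kets (the operator, coefficients, and helper names are arbitrary choices of ours):

```python
# Sketch of Eq. (8): knowing O on each basis ket |i> determines
# O acting on any |psi> = sum_i c_i |i>.

def matvec(O, v):
    """Apply a matrix operator to a vector (ket as a component list)."""
    return [sum(O[i][k] * v[k] for k in range(len(v))) for i in range(len(O))]

O = [[0, 1], [1, 0]]          # an arbitrary linear operator
basis = [[1, 0], [0, 1]]      # the standard basis kets |1>, |2>
c = [3, 4]                    # expansion coefficients of |psi>

psi = [c[0] * basis[0][k] + c[1] * basis[1][k] for k in range(2)]

direct = matvec(O, psi)       # act on |psi> directly
via_basis = [c[0] * a + c[1] * b
             for a, b in zip(matvec(O, basis[0]), matvec(O, basis[1]))]

print(direct == via_basis)    # True: O is determined by its action on the basis
```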