Boltzmann Distribution
The total energy

$$U = \sum_{i=1}^{s} n_i \varepsilon_i$$

is constant, as is the total number of particles

$$N = \sum_{i=1}^{s} n_i .$$
We have to find the number of particles in each state, that is, the numbers $n_1, n_2, \ldots, n_s$.
The equilibrium distribution is the most disordered one. That means it is the most probable
distribution, the one with the largest statistical weight.
Assumptions:
- the temperature is constant, as is the energy (strictly speaking the two
conditions are different, but the difference is very small)
- the particles are independent, in the sense that we may put any number of particles in any state, provided the two conditions

$$\sum_{i=1}^{s} n_i = N \qquad (1)$$

$$\sum_{i=1}^{s} n_i \varepsilon_i = U \qquad (2)$$

are satisfied.
Let's compute the statistical weight for a certain distribution of particles among the s states, i.e. the number of ways we obtain certain values of the $n_i$'s. It is clear that we obtain the same distribution if we permute the particles. The number of such permutations is $N!$. But this counts all the cases where particles within one cell are permuted, which does not give new states. So we have to divide by $n_1!\, n_2!\, n_3! \cdots n_s!$ to get the number of different ways to obtain a certain family of values $n_1, n_2, \ldots, n_s$:

$$\frac{N!}{n_1!\, n_2! \cdots n_s!}$$
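As a quick numerical illustration (a Python sketch; the occupation numbers are made up for the example), the multinomial coefficient above can be evaluated directly:

```python
from math import factorial

# Toy example: N = 4 particles over s = 3 states with
# hypothetical occupation numbers n = (2, 1, 1).
n = [2, 1, 1]
N = sum(n)

# Number of distinct ways to realize this family of values:
# N! / (n_1! n_2! ... n_s!)
ways = factorial(N)
for ni in n:
    ways //= factorial(ni)   # exact: multinomial coefficients are integers

print(ways)  # 4!/(2! 1! 1!) = 12
```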
To get the probability of such a distribution we have to multiply this number by the a priori
probability to have one of these distributions.
The probability that one particle is in the first state is $g_1$; the probability to have $n_1$ particles in this state is $g_1^{n_1}$.
The probability that one particle is in the second state is $g_2$; the probability to have $n_2$ particles in this state is $g_2^{n_2}$.
...
The probability that one particle is in the $s$-th state is $g_s$; the probability to have $n_s$ particles in this state is $g_s^{n_s}$.
The probability to have $n_1$ particles in the first state, $n_2$ in the second, ..., $n_s$ in the $s$-th is the product of the individual probabilities, i.e. $g_1^{n_1} g_2^{n_2} \cdots g_s^{n_s}$. The statistical weight is eventually:

$$\Omega = \frac{N!}{n_1!\, n_2! \cdots n_s!}\; g_1^{n_1} g_2^{n_2} \cdots g_s^{n_s} \qquad (3)$$
We want the maximum of this number. It is easier to compute the maximum of its
logarithm:
$$\log \Omega = \log\left(\frac{N!}{n_1!\, n_2! \cdots n_s!}\; g_1^{n_1} g_2^{n_2} \cdots g_s^{n_s}\right) = N(\log N - 1) - \sum_{i=1}^{s} n_i(\log n_i - 1) + \sum_{i=1}^{s} n_i \log g_i$$

where we used Stirling's approximation $\log n! \approx n(\log n - 1)$ for the factorials.
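Since the argument leans on Stirling's approximation, here is a small sketch checking how fast its relative error shrinks with $n$:

```python
from math import lgamma, log

# Stirling's approximation: log n! ~ n(log n - 1).
# lgamma(n + 1) gives the exact value of log n!.
for n in (10, 100, 1000, 10**6):
    exact = lgamma(n + 1)
    stirling = n * (log(n) - 1)
    print(n, exact, stirling, (exact - stirling) / exact)
# The relative error drops from ~14% at n = 10 to ~1e-6 at n = 10^6.
```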
To find the maximum we compute the differential and set it to zero:
$$d \log \Omega = \sum_{i=1}^{s} \log\!\left(\frac{g_i}{n_i}\right) dn_i = 0 \qquad (4)$$

(the term $N(\log N - 1)$ is constant by (1), and $d\,[\,n_i(\log n_i - 1)\,] = \log n_i \, dn_i$).
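One can check (4) numerically: take a hypothetical (non-equilibrium) distribution, perturb it along a direction that keeps $N$ fixed, and compare the finite difference of $\log \Omega$ with the predicted brackets. A sketch:

```python
from math import log

def log_omega_stirling(n, g):
    # Stirling form used in the text:
    # N(log N - 1) - sum n_i (log n_i - 1) + sum n_i log g_i
    N = sum(n)
    val = N * (log(N) - 1)
    for ni, gi in zip(n, g):
        val += ni * log(gi) - ni * (log(ni) - 1)
    return val

# Hypothetical occupations (not the equilibrium ones) and g_i
g = [0.5, 0.3, 0.2]
n = [40.0, 35.0, 25.0]

# Move h particles from state 2 to state 1: dn = (+h, -h, 0),
# so the total number N stays fixed.
h = 1e-6
n_pert = [n[0] + h, n[1] - h, n[2]]
numeric = (log_omega_stirling(n_pert, g) - log_omega_stirling(n, g)) / h

# Equation (4) predicts log(g_1/n_1) - log(g_2/n_2) for this direction
analytic = log(g[0] / n[0]) - log(g[1] / n[1])
print(numeric, analytic)  # the two values should agree closely
```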
Differentiation of the two conditions (1) and (2) gives two new conditions:
$$\sum_{i=1}^{s} dn_i = 0 \qquad (5)$$

$$\sum_{i=1}^{s} \varepsilon_i \, dn_i = 0 \qquad (6)$$
Multiply (5) by $-\alpha$ and (6) by $-\beta$ (the Lagrange multipliers), add everything to (4), and collect the condition:

$$\sum_{i=1}^{s} \left( \log\frac{g_i}{n_i} - \alpha - \beta \varepsilon_i \right) dn_i = 0 \qquad (7)$$
Here all the variations are independent so each bracket is zero and we get:
$$n_i = A\, g_i \exp[-\beta \varepsilon_i] \qquad (8)$$

which is the Boltzmann distribution (here $A = e^{-\alpha}$). One can demonstrate that $\beta = \dfrac{1}{k_B T}$. Note the exponential behavior.
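To see that (8) really is a maximum, a sketch (with made-up energies, $\beta$, and degeneracies $g_i = 1$) perturbs the Boltzmann occupations along a direction that conserves both $N$ and $U$, and watches $\log \Omega$ drop:

```python
import math

# Three levels with equally spaced energies: the perturbation
# dn = (+h, -2h, +h) conserves both N, condition (5), and U, condition (6).
eps = [0.0, 1.0, 2.0]   # hypothetical energies
g = [1.0, 1.0, 1.0]     # hypothetical a priori probabilities
beta = 1.0              # beta = 1/(k_B T); units chosen so k_B T = 1
N = 1000.0

# Boltzmann occupations, equation (8), normalized so they sum to N
A = N / sum(gi * math.exp(-beta * e) for gi, e in zip(g, eps))
n_eq = [A * gi * math.exp(-beta * e) for gi, e in zip(g, eps)]

def log_omega(n):
    # Stirling form of the log of equation (3)
    Ntot = sum(n)
    val = Ntot * (math.log(Ntot) - 1)
    for ni, gi in zip(n, g):
        val += ni * math.log(gi) - ni * (math.log(ni) - 1)
    return val

for h in (0.0, 1.0, 5.0, 20.0):
    n_pert = [n_eq[0] + h, n_eq[1] - 2 * h, n_eq[2] + h]
    print(h, log_omega(n_pert))  # maximal at h = 0, decreasing as h grows
```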
The constant A is obtained from the normalization condition

$$\sum_{i=1}^{s} n_i = N \quad \text{or} \quad \sum_{i=1}^{s} \frac{n_i}{N} = \sum_{i=1}^{s} P_i = 1 \qquad (9)$$
where $P_i = n_i/N$ is the probability for a particle to be in the state with energy $\varepsilon_i$. Using (8) we find:

$$\frac{1}{N} \sum_{i=1}^{s} A\, g_i \exp[-\beta \varepsilon_i] = 1, \quad \text{hence} \quad A = \frac{N}{\displaystyle\sum_{i=1}^{s} g_i \exp[-\beta \varepsilon_i]} \qquad (10)$$
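A minimal sketch of (8) and (10) together, with a hypothetical four-level scheme (energies in eV, room temperature):

```python
import math

# Hypothetical level scheme: energies eps_i in eV, degeneracies g_i
eps = [0.0, 0.1, 0.25, 0.5]   # eV
g = [2, 2, 4, 4]
kB = 8.617e-5                  # Boltzmann constant in eV/K
T = 300.0                      # kelvin
beta = 1.0 / (kB * T)
N = 1.0e6                      # total number of particles (made up)

# Equation (10): A = N / sum_i g_i exp(-beta eps_i)
Z = sum(gi * math.exp(-beta * e) for gi, e in zip(g, eps))
A = N / Z

# Populations from (8) and probabilities P_i = n_i / N
for e, gi in zip(eps, g):
    ni = A * gi * math.exp(-beta * e)
    print(f"eps = {e:5.2f} eV  n_i = {ni:12.1f}  P_i = {ni / N:.3e}")
```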
Remark 1. We have assumed that the energy has a discrete spectrum, i.e. it does not vary
continuously. This is a quantum picture, used because it is easier to think of numbers instead
of continuously varying functions. In any case, every measuring apparatus has a lowest value it
can measure, the so-called resolution. We cannot measure energy values with accuracy better
than the resolution, so we may assume we measure the energy in discrete portions.
Remark 2. It is easier to think in terms of numbers, but harder to do sums than integrals, so in
many cases we pass from sums to integrals. (9) becomes

$$\sum_{i=1}^{s} P_i = \int_{\text{all values}} P(E)\, G(E)\, dE = \int_{\text{all values}} \rho(E)\, dE = 1 \qquad (11)$$

$G(E)\,dE$ is the number of states with energy between $E$ and $E + dE$; $G(E)$ is the spectral number of states.
If you want the number of states per $dE$ and per $dV$, you introduce the spectral density
of states $g(E)$. $\rho(E)$ is the probability density (per unit of energy and of volume).
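A sketch of the sum-to-integral step: for levels spaced by a hypothetical $\Delta$ (one state per interval, so $G(E) \approx 1/\Delta$), the discrete sum approaches the corresponding integral as $\Delta \to 0$:

```python
import math

# Levels eps_i = i * delta, one state each, so G(E) ~ 1/delta.
# The discrete sum of exp(-beta*eps_i) should approach
# (1/delta) * int_0^inf exp(-beta E) dE = 1/(beta*delta).
beta = 1.0
for delta in (1.0, 0.1, 0.01):
    sum_discrete = sum(math.exp(-beta * i * delta) for i in range(100000))
    integral = 1.0 / (beta * delta)
    print(delta, sum_discrete, integral)
# Agreement improves as the level spacing delta shrinks.
```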