MTD 03 - Probabilities & Boltzmann Distribution

Molecular Thermodynamics (CH3141)

Probabilities and probability distributions


Boltzmann distribution
Boltzmann distribution for macroscopic systems
(Canonical) Partition Function as a bridge with thermodynamics

N.A.M. (Klaas) Besseling

Probabilities and probability distributions


Discrete probability distribution
e.g. the outcome of throwing a die:
all 6 possible states of the die are equally probable
e.g. the probability of throwing 5: $P_5 = \tfrac{1}{6}$

normalisation: $\sum_i P_i = 1$

(figure: bar chart of $P_i = \tfrac{1}{6}$ for each $i = 1, 2, \dots, 6$)

In Molecular Thermodynamics we will often deal with


Subsets of outcomes
The probability of any outcome belonging to a specific subset is
the sum of the probabilities of the outcomes belonging to that subset
mathematically: $P_I = \sum_{i \in I} P_i$
where $\sum_{i \in I}$ denotes the sum over all $i$ that belong to subset $I$

example with a die

the chance of an odd number: $P_{\text{odd}} = P_1 + P_3 + P_5 = \tfrac{1}{6} + \tfrac{1}{6} + \tfrac{1}{6} = \tfrac{3}{6}$
(the odd numbers are a subset of all numbers)
$$P_{\text{odd}} = \sum_{\text{odd } i} P_i$$

Subsets of outcomes
Euro crisis
A pile of euro coins
The number of coins from country c and year j is indicated as $n_{c,j}$.
The probability to draw a coin from country c and year j is
$$P_{c,j} = \frac{n_{c,j}}{\sum_{c,j} n_{c,j}} = \frac{n_{c,j}}{N}$$
where $N = \sum_{c,j} n_{c,j}$ is the total number of coins.
Write an expression for the probability to draw a coin from country c.
Write an expression for the probability to draw a coin from year j.
$$P_c = \sum_j P_{c,j} = \frac{\sum_j n_{c,j}}{N} \qquad\qquad P_j = \sum_c P_{c,j} = \frac{\sum_c n_{c,j}}{N}$$
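As a quick illustration, these marginal probabilities can be computed by summing the counts over the unused index; the coin counts below are made-up numbers, purely for illustration.

```python
# Marginal probabilities from a table of counts n[c][j]
# (hypothetical coin counts, for illustration only)
counts = {
    ("NL", 2010): 3, ("NL", 2011): 5,
    ("DE", 2010): 4, ("DE", 2011): 8,
}
N = sum(counts.values())  # total number of coins

# P_c = sum_j n_{c,j} / N   (probability of drawing a coin from country c)
P_country = {}
for (c, j), n in counts.items():
    P_country[c] = P_country.get(c, 0) + n / N

# P_j = sum_c n_{c,j} / N   (probability of drawing a coin from year j)
P_year = {}
for (c, j), n in counts.items():
    P_year[j] = P_year.get(j, 0) + n / N

print(P_country)  # {'NL': 0.4, 'DE': 0.6}
print(P_year)     # {2010: 0.35, 2011: 0.65}
```

Both marginals sum to 1, as any probability distribution over a full set of subsets must.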

Subsets of outcomes
Molecular Thermodynamics example
- j is the index of a quantum state, and $E_j$ the energy of that state
- often there is more than one quantum state with the same energy $E_a$
(degeneracy)
- $p_j$ is the probability that a system is in quantum state j
- the probability that the system has energy $E_a$ is
$$p(E_a) = \sum_{j:\,E_j = E_a} p_j$$
where the sum runs over all states j that have energy $E_a$

Subsets of outcomes
another Molecular Thermodynamics example

for an open system the probability
that there are N molecules, and that
the system is in quantum state j,
is denoted $p_j(N)$
write an expression in terms of the probabilities $p_j(N)$ for the probability
that there are N molecules in the system, irrespective of the quantum state:
$$p(N) = \sum_j p_j(N)$$

Mean value (expectation value), average outcome

for e.g. throwing a die many times (or throwing many dice):
$$\langle i \rangle = 1\cdot\tfrac{1}{6} + 2\cdot\tfrac{1}{6} + 3\cdot\tfrac{1}{6} + \dots = 3\tfrac{1}{2}$$
in general: $\langle i \rangle = \sum_i i P_i$
but also: $\langle i^2 \rangle = \sum_i i^2 P_i$
generally: $\langle f_i \rangle = \sum_i f_i P_i$

some important examples

- mean: $\langle i \rangle = \sum_i i P_i$
- variance: $\sigma^2 = \langle (i - \langle i \rangle)^2 \rangle = \langle i^2 \rangle - \langle i \rangle^2$
  ($\sigma$ is the standard deviation)
  (a measure of the width of a distribution)

(check this:)
$$\langle (i - \langle i \rangle)^2 \rangle
= \langle i^2 - 2 i \langle i \rangle + \langle i \rangle^2 \rangle
= \langle i^2 \rangle - 2\langle i \rangle^2 + \langle i \rangle^2
= \langle i^2 \rangle - \langle i \rangle^2$$
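As a sanity check, these sums can be evaluated directly for a fair die, using exact fractions:

```python
from fractions import Fraction

# fair die: P_i = 1/6 for i = 1..6
P = {i: Fraction(1, 6) for i in range(1, 7)}

mean = sum(i * p for i, p in P.items())         # <i>   = sum_i i P_i
mean_sq = sum(i * i * p for i, p in P.items())  # <i^2> = sum_i i^2 P_i
variance = mean_sq - mean ** 2                  # sigma^2 = <i^2> - <i>^2

print(mean)      # 7/2, i.e. the 3 1/2 quoted above
print(variance)  # 35/12
```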

Mean value
Molecular Thermodynamics example
$p_j$ is the probability that a system is in quantum state j with energy $E_j$.
There is an infinite number of quantum states: j = 1, 2, …
Write an expression for the mean energy of the system:
$$\langle E \rangle = \sum_{j=1}^{\infty} p_j E_j$$

Combination of independent events

E.g. throw a die twice.
What is the probability that the
first throw yields 5 and the second throw yields 2?
$$P_{5,2} = P_5 \cdot P_2 = \tfrac{1}{6}\cdot\tfrac{1}{6} = \tfrac{1}{36}$$
(product of independent chances)

if order does not matter: $P_5 P_2 + P_2 P_5 = 2 P_2 P_5 = 2\cdot\tfrac{1}{6}\cdot\tfrac{1}{6}$
2 = the number of permutations, the number of ways to get 2 and 5
- such numbers of ways are very important in Molecular Thermodynamics
- they are the molecular basis of Entropy

the 6 permutations of
three distinguishable balls
this number of permutations is calculated as 3×2×1 = 3!
- for the 1st position: 3 possible colors
- for the 2nd position: 2 possible colors
- for the 3rd position: 1 possible color
or, equivalently
- for the 1st color: 3 possible positions
- for the 2nd color: 2 possible positions
- for the 3rd color: 1 possible position

Flip 6 coins, and lay them out; e.g.
(figure: two of the possible layouts of heads and tails)
What is the number of permutations for 2 heads and 4 tails?
What would the number of permutations be if we had
6 different objects rather than just two kinds (heads and tails)? 6!
For 2 heads and 4 tails we need to divide by 2! and by 4!:
$$\frac{6!}{2!\,4!} = 15$$
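This count is simply a binomial coefficient, which can be checked directly:

```python
import math

# number of distinct arrangements of 2 heads and 4 tails among 6 coins
n_arrangements = math.factorial(6) // (math.factorial(2) * math.factorial(4))
print(n_arrangements)   # 15
print(math.comb(6, 2))  # 15, the same count as a binomial coefficient
```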

Continuous distributions


x is a continuous random variable

- One cannot say "P is the probability of x"!
- $P(x)\,dx$ is the probability of a value
  between x and x + dx

normalisation: $\int P(x)\,dx = 1$

$P(x)\,dx$ is dimensionless
(when x has dimension [x],
then P(x) has dimension 1/[x])

mean of f (f is some function of the random variable x):
$$\langle f(x) \rangle = \int f(x)\,P(x)\,dx$$
e.g.
$$\langle x \rangle = \int x\,P(x)\,dx \qquad\qquad \langle x^2 \rangle = \int x^2\,P(x)\,dx$$

An important example of a continuous probability distribution is the

Gaussian distribution (also called Normal distribution):
$$P(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$

the mean of x is $\mu = \langle x \rangle$
the variance is $\langle (x - \langle x \rangle)^2 \rangle = \sigma^2$, and the standard deviation $\sigma = \sqrt{\sigma^2}$;
these are measures for the width of the distribution, of the uncertainty of x

- The 1/e width: the width of the peak, where P equals 1/e times its
  maximum height $P_{\max} = 1/\sqrt{2\pi\sigma^2}$, is $2\sqrt{2}\,\sigma$
- Another measure for the width is $\Delta x$, defined such that
  $\Delta x\,P_{\max} = \int P\,dx = 1$, hence $\Delta x = \sqrt{2\pi}\,\sigma$
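The normalisation, the variance, and the width measure $\Delta x$ can all be checked numerically; the sketch below uses NumPy and an arbitrary $\sigma = 1.5$.

```python
import numpy as np

mu, sigma = 0.0, 1.5
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200001)
dx = x[1] - x[0]
P = np.exp(-((x - mu) ** 2) / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

norm = np.sum(P) * dx                  # should be 1 (normalisation)
var = np.sum((x - mu) ** 2 * P) * dx   # should equal sigma^2 = 2.25
peak = P.max()                         # maximum height 1/sqrt(2 pi sigma^2)

print(round(float(norm), 6))      # 1.0
print(round(float(var), 6))       # 2.25
print(round(float(1.0 / peak), 4))  # Δx = sqrt(2 pi) sigma ≈ 3.7599
```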

The Boltzmann distribution

If a (quantum) state has energy $\varepsilon$,
then the probability of that state
(how often it occurs) is
$$\propto \exp\!\left(-\frac{\varepsilon}{kT}\right)$$
the Boltzmann factor

where T = absolute temperature
k = Boltzmann constant ($= R/N_{Av} = 1.38\times10^{-23}\ \mathrm{J\,K^{-1}}$)

In Molecular Thermodynamics it is common to specify amounts of
matter in terms of numbers N of molecules, atoms, particles, etc.
rather than as numbers n of moles of molecules, atoms, particles, etc.

Boltzmann distribution
- The higher the energy $\varepsilon$, the lower the probability
- States with $\varepsilon < kT$ are well accessible
- States with $\varepsilon > kT$ are poorly accessible
- The higher T, the more accessible a high-energy state

(figure: $\exp(-\varepsilon/kT)$ versus $\varepsilon$, starting at 1 for $\varepsilon = 0$ and reaching $e^{-1}$ at $\varepsilon = kT$; the decay is slow at high T and steep at low T)

when the energy difference between states equals kT,
the corresponding Boltzmann factors differ by a factor e

Molecular state 0 has energy $\varepsilon_0$, state 1 has energy $\varepsilon_1$.

Express the ratio of the numbers of molecules in state 0 and
state 1 in terms of $\varepsilon_0$ and $\varepsilon_1$:
$$\frac{n_1}{n_0} = \frac{P_1}{P_0}
= \frac{\exp(-\varepsilon_1/kT)}{\exp(-\varepsilon_0/kT)}
= \exp\!\left(-\frac{\varepsilon_1 - \varepsilon_0}{kT}\right)$$
or
$$n_1 = n_0\,\exp\!\left(-\frac{\varepsilon_1 - \varepsilon_0}{kT}\right)$$
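For instance, for two states separated by exactly kT, the populations differ by a factor e; the temperature and energies below are hypothetical numbers chosen just to exercise the formula.

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def population_ratio(eps0, eps1, T):
    """n1/n0 for states with energies eps0, eps1 (in J) at temperature T (in K)."""
    return math.exp(-(eps1 - eps0) / (k * T))

T = 300.0
eps0 = 0.0
eps1 = k * T  # energy difference of exactly kT
print(population_ratio(eps0, eps1, T))  # 1/e ≈ 0.3679
```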

This course could now proceed in two directions:

1. where does this Boltzmann distribution come from?
   - why this expression for those probabilities?
   - can we derive it from some underlying principle?
2. what are the consequences?
   - what can we do with it?

- For now, we proceed with 2, discussing some examples that
  help you to get acquainted with the Boltzmann distribution.
- Then we discuss the relations with thermodynamic properties.
- Then we discuss in some detail an application that illustrates the
  machinery of statistical thermodynamics: the ideal gas.
- Later on we will see where the Boltzmann law comes from,
  and when it applies and when not.

Some well-known examples of the Boltzmann distribution

Example 1: the barometric height distribution

For molecules in a gravitational field: $\varepsilon(h) - \varepsilon(h_0) = mg(h - h_0)$
$$\rho(h) = \rho(h_0)\,\exp\!\left(-\frac{mg(h - h_0)}{kT}\right)$$

$\rho(h)$ = number density (number of molecules / particles per unit volume) at height h
$h_0$ = some arbitrary reference height
m = mass of a molecule
g = acceleration of gravity

This is indeed a special version of the Boltzmann distribution law:
mgh is the potential energy of a particle at height h, and
h characterises the state of a particle (molecule)

$$\rho(h) = \rho(h_0)\,\exp\!\left(-\frac{mg(h - h_0)}{kT}\right)$$

(figure: $\rho(h)/\rho(h_0)$ versus $h - h_0$, falling from 1 to $e^{-1}$ at $h - h_0 = l_g = kT/mg$; the decay is slow for high T or small m or g, and steep for low T or large m or g)

- the gravitational length $l_g = kT/mg$ is a measure of the height /
  thickness of an atmosphere
- it is the height increase over which $\rho$ decreases by a factor e
- $l_g$ is small for low T and/or large m, large for high T and/or small m

a proof for the barometric height distribution

Consider a horizontal slab of gas of area A between heights h and h + dh.
mass in the slab $= m\,\rho(h)\,A\,dh$
downward force per unit area $= g\,m\,\rho(h)\,dh$
$$dp = -g\,m\,\rho(h)\,dh$$
ideal gas: $p = \rho kT$
$$\frac{d\rho(h)}{dh} = -\frac{gm}{kT}\,\rho(h)$$
a differential equation of which the
barometric distribution $\rho(h) = \rho(h_0)\exp\!\left(-\frac{mg(h-h_0)}{kT}\right)$ is the solution
(check this)
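The claimed solution can be checked numerically by integrating the differential equation with a simple Euler step and comparing against the exponential; the sketch below assumes an N2-like molecule at 300 K, with an approximate mass.

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
m = 4.65e-26       # mass of an N2 molecule, kg (approximate)
g = 9.81           # gravitational acceleration, m/s^2
T = 300.0          # temperature, K
l_g = k * T / (m * g)  # gravitational length, roughly 9 km here

# integrate d(rho)/dh = -(m g / k T) rho with a simple Euler scheme
rho, h, dh = 1.0, 0.0, 1.0    # rho(h0) = 1, step of 1 m
while h < 10_000.0:           # climb to 10 km
    rho += -(m * g / (k * T)) * rho * dh
    h += dh

exact = math.exp(-m * g * h / (k * T))  # barometric distribution
print(rho, exact)  # both ≈ 0.33, i.e. about one gravitational length climbed
```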


the Maxwell velocity distribution

Sandler 3.9
(more advanced)

Molecules in a gas fly around with all kinds of
velocities (classical mechanics view).
The Maxwell distribution gives the distribution of
the velocities, and how it depends on temperature:
$$P(v) = \sqrt{\frac{m}{2\pi kT}}\,\exp\!\left(-\frac{mv^2}{2kT}\right)$$

(portrait: James Clerk Maxwell)

- v = a velocity component ($v_x$, $v_y$ and $v_z$ characterise the state of a molecule)
- $P(v)\,dv$ = the probability that a molecule has, at any particular time, a
  velocity between v and v + dv

This is another example of a Boltzmann distribution;
the energy associated with velocity component v is $\tfrac{1}{2}mv^2$
NB note (check) that P(v) is a Gaussian distribution (what are $\langle v \rangle$, $\sigma_v$?)

Compare
$$P(v) = \sqrt{\frac{m}{2\pi kT}}\,\exp\!\left(-\frac{mv^2}{2kT}\right)$$
and
$$P(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$

The Maxwell velocity distribution is an example of a Gaussian
probability distribution
(Gaussian distribution of velocity; exponential distribution of kinetic energy)

$\langle v \rangle = 0$ because positive and negative values are equally probable
$$\sigma_v = \sqrt{\langle (v - \langle v \rangle)^2 \rangle} = \sqrt{kT/m}$$

the higher T and/or the lower m
- the larger the width of the distribution
- the more probable large v
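The identification with a Gaussian of mean 0 and variance kT/m can be checked by sampling; the sketch below assumes an argon-like atom (approximate mass) at 300 K.

```python
import math
import random

k = 1.380649e-23  # Boltzmann constant, J/K
m = 6.63e-26      # mass of an argon atom, kg (approximate)
T = 300.0

sigma_v = math.sqrt(k * T / m)  # predicted standard deviation, ~250 m/s

# sampling one velocity component from the Maxwell distribution
# amounts to sampling a Gaussian with mean 0 and std dev sqrt(kT/m)
random.seed(0)
samples = [random.gauss(0.0, sigma_v) for _ in range(100_000)]

mean_v = sum(samples) / len(samples)
var_v = sum(v * v for v in samples) / len(samples)

print(mean_v / sigma_v)    # ≈ 0 (within sampling noise)
print(var_v / sigma_v**2)  # ≈ 1, i.e. variance ≈ kT/m
```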


The Boltzmann distribution law

The exponential dependence of probability on energy is related to
the product law for independent events!
Examine two independent degrees of freedom of a system,
e.g. $v_x$ and $v_y$ of a molecule in a gas.
- consider the simultaneous occurrence of $\varepsilon_x$ and $\varepsilon_y$ as one combined
  event with probability $P(\varepsilon_x + \varepsilon_y)$, where P is some unknown
  function
- because the occurrences are independent: $P(\varepsilon_x + \varepsilon_y) = P(\varepsilon_x)\,P(\varepsilon_y)$

The only possibility for P is an exponential function:
$$AB\exp\!\left(-\beta(\varepsilon_x + \varepsilon_y)\right) = A\exp(-\beta\varepsilon_x)\cdot B\exp(-\beta\varepsilon_y)$$

By comparing relations derived from $P_i \propto \exp(-\beta\varepsilon_i)$
with known relations in classical thermodynamics,
it turns out that $\beta = \dfrac{1}{kT}$.
A more thorough justification will be given later in this course.

The Boltzmann Distribution for Macroscopic Systems

In the previous examples the Boltzmann distribution law was
applied to states of single molecules (or atoms or particles) that do
not interact with others
(barometric distribution, velocity distribution).
This approach runs into problems with interacting particles,
molecules, …
The Boltzmann distribution law can also be applied to
microstates of complete macroscopic systems
(systems that are the subject matter of Classical Thermodynamics).
This point of view was developed by …


Important to distinguish between

Sandler 1.2

Macrostate (= thermodynamic state): described by a few macroscopic,
phenomenological variables, e.g. N, V, and T.
Microstate: described by a very large number of molecular variables:
- all molecular position and velocity (or momentum) coordinates
  (with a Classical Mechanical description of the motion of the particles)
or
- all quantum numbers
  (with a Quantum Mechanical description of the motion of the particles)

- Which microstates are possible depends on N and V.
- T determines the probability distribution over those states.
- There is always a very, very, very large number of different microstates
  possible for a given macrostate.

Always a very large number of different microstates
possible for a given macrostate.

Molecular Thermodynamics =
analysing the statistics of all the microstates
for a given macrostate
- We need the probability distribution of the microstates
  for a given macrostate
- From this we can calculate mean values (e.g. of the energy)
- These are identified with thermodynamic variables


Sandler 2.1

For a macroscopic system, values for
the number of molecules N, the volume V, and the temperature T
provide a complete description of its macrostate.
(for the moment we deal with homogeneous one-component systems)
The probability $P_i(N,V,T)$ that the system is in microstate i
is proportional to the Boltzmann factor:
$$P_i(N,V,T) \propto \exp\!\left(-\frac{E_i(N,V)}{kT}\right)$$
Normalized:
$$P_i(N,V,T) = \frac{\exp\!\left(-E_i(N,V)/kT\right)}{\sum_{\text{microstates } j}\exp\!\left(-E_j(N,V)/kT\right)}$$

This is the canonical probability distribution.
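As a toy illustration of the canonical distribution, take a hypothetical system with three microstates, with energies chosen in units of kT:

```python
import math

# hypothetical microstate energies for a toy system, in joules
k = 1.380649e-23
T = 300.0
E = [0.0, 1.0 * k * T, 2.0 * k * T]  # three microstates

boltz = [math.exp(-Ei / (k * T)) for Ei in E]  # Boltzmann factors
Q = sum(boltz)                                 # normalising denominator
P = [b / Q for b in boltz]                     # canonical probabilities

print([round(p, 4) for p in P])  # the lowest energy gets the highest probability
print(round(sum(P), 10))         # 1.0 (normalised)
```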

Obviously, $P_i$ is a normalised probability distribution:
$$\sum_{\text{microstates } i} P_i = 1$$
because of the denominator
$$Q(N,V,T) = \sum_{\text{microstates } i} \exp\!\left(-\frac{E_i(N,V)}{kT}\right)$$

$\sum_{\text{microstates } i}$ indicates a sum over all possible microstates
that are consistent with N and V.

Obviously all $P_i$ and Q depend on N, V and T.


$$Q(N,V,T) = \sum_{\text{microstates } i} \exp\!\left(-\frac{E_i}{kT}\right)$$

is called the (Canonical) Partition Function

in English: Partition Function
in German: Zustandssumme
in Dutch: toestandssom
in English sometimes: sum over states

We will see that Q is a very important function.
It plays an important role as a bridge with Thermodynamics.
What thermodynamic function has a special relation with the
variables N, V, and T?

Sandler 2.2, 3.3

The internal mechanical energy
(in classical mechanics: the total of all kinetic and potential
energies of all the molecules, atoms;
in quantum mechanics: the energy level of the quantum state)
of a closed system (N, V, T are fixed)
(NB the macroscopic / thermodynamic state is fixed)
fluctuates in time around a certain average,
as the system assumes different microstates over time.
(figure: E versus t, fluctuating around a mean value)
Also when the energies of thermodynamic replicas of the system
(same N, V, T) are measured at the same time, the outcome varies.


Using the Boltzmann distribution law, we can calculate the
mean mechanical internal energy of a molecular system:
$$\langle E \rangle = \sum_i E_i\,p_i = \frac{\sum_i E_i \exp\!\left(-E_i/kT\right)}{Q}$$

For macroscopic systems,
- fluctuations of E are very small
- compared to the mean value $\langle E \rangle$
(this will be demonstrated later)

The mean internal mechanical energy $\langle E \rangle$ can be identified with
the thermodynamic internal energy U:
$$\langle E \rangle = U$$
The states i and their energies $E_i$ depend on N and V;
$\langle E \rangle$ is a function of N, V and T.

$$U = \frac{\sum_i E_i \exp\!\left(-E_i/kT\right)}{Q}$$
can be rewritten as
$$U = -k\left(\frac{\partial \ln Q}{\partial(1/T)}\right)_{V,N}$$
check this (use $\frac{d\ln x}{dx} = \frac{1}{x}$ and the chain rule):
$$-k\left(\frac{\partial \ln Q}{\partial(1/T)}\right)_{V,N}
= -\frac{k}{Q}\left(\frac{\partial Q}{\partial(1/T)}\right)_{V,N}$$
with
$$Q(N,V,T) = \sum_{\text{microstates } i} \exp\!\left(-\frac{E_i}{kT}\right)$$
this becomes
$$-\frac{k}{Q}\sum_{\text{microstates } i} \exp\!\left(-\frac{E_i}{kT}\right)\left(-\frac{E_i}{k}\right)
= \frac{\sum_i E_i \exp\!\left(-E_i/kT\right)}{Q} = U$$
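This identity can also be checked numerically for a toy system, by taking a finite difference of ln Q with respect to 1/T; the three energy levels below are hypothetical.

```python
import math

k = 1.380649e-23
E = [0.0, 2.0e-21, 5.0e-21]  # hypothetical microstate energies, J

def Q(inv_T):
    """Partition function as a function of 1/T (inv_T in 1/K)."""
    return sum(math.exp(-Ei * inv_T / k) for Ei in E)

def U_direct(T):
    """Mean energy <E> = sum_i E_i exp(-E_i/kT) / Q."""
    w = [math.exp(-Ei / (k * T)) for Ei in E]
    return sum(Ei * wi for Ei, wi in zip(E, w)) / sum(w)

T = 300.0
x = 1.0 / T
h = 1e-9  # small step in 1/T for the central difference
dlnQ = (math.log(Q(x + h)) - math.log(Q(x - h))) / (2 * h)
U_from_Q = -k * dlnQ  # U = -k (d lnQ / d(1/T))_{V,N}

print(U_from_Q, U_direct(T))  # the two values agree
```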


$$U = \langle E \rangle = \frac{\sum_i E_i \exp\!\left(-E_i/kT\right)}{Q}$$
can be rewritten as
$$U = -k\left(\frac{\partial \ln Q}{\partial(1/T)}\right)_{V,N}$$

Known from classical thermodynamics
(the Gibbs-Helmholtz equation):
$$U = \left(\frac{\partial (A/T)}{\partial(1/T)}\right)_{V,N}$$
where A is the Helmholtz (free) energy.

Comparing the expressions for U suggests that
$$A = -kT\ln Q$$
This turns out to be true!

The Canonical Partition Function is directly related to the Helmholtz Energy.

The Gibbs-Helmholtz relation can be written as
$$\left(\frac{\partial(A/T)}{\partial(1/T)}\right)_{V,N} = U
\qquad\text{or as}\qquad
\left(\frac{\partial(A/T)}{\partial T}\right)_{V,N} = -\frac{U}{T^2}$$

first show that one expression follows from the other (hint: chain rule):
$$\left(\frac{\partial(A/T)}{\partial T}\right)_{V,N}
= \left(\frac{\partial(A/T)}{\partial(1/T)}\right)_{V,N}\frac{\partial(1/T)}{\partial T}
= -\frac{1}{T^2}\left(\frac{\partial(A/T)}{\partial(1/T)}\right)_{V,N}$$
then derive the second expression from $dA = -S\,dT - p\,dV + \mu\,dN$:
$$\left(\frac{\partial(A/T)}{\partial T}\right)_{V,N}
= \frac{1}{T}\left(\frac{\partial A}{\partial T}\right)_{V,N} - \frac{A}{T^2}
= -\frac{S}{T} - \frac{A}{T^2}
= -\frac{ST + A}{T^2} = -\frac{U}{T^2}$$
using $\left(\frac{\partial A}{\partial T}\right)_{V,N} = -S$ and $U = A + TS$.


OVERVIEW OF WHAT WE KNOW NOW ABOUT
STATISTICAL THERMODYNAMICS:

$$Q(N,V,T) = \sum_{\text{microstates } i} \exp\!\left(-\frac{E_i}{kT}\right)$$

the canonical partition function,
calculated from microscopic properties: from the energies of the microstates

$$A = -kT\ln Q$$

establishes the link with thermodynamics;
provides a microscopic, molecular expression for A

From A(N,V,T), microscopic expressions for other
thermodynamic quantities can be calculated using familiar
thermodynamic relations, e.g.
$$S = -\left(\frac{\partial A}{\partial T}\right)_{N,V}
\qquad U = A + TS
\qquad\text{or}\qquad
U = \left(\frac{\partial(A/T)}{\partial(1/T)}\right)_{V,N}$$
$$p = -\left(\frac{\partial A}{\partial V}\right)_{N,T}
\qquad
\mu = \left(\frac{\partial A}{\partial N}\right)_{V,T}$$
(from $dA = -S\,dT - p\,dV + \dots + \mu\,dN$)

Next: apply these ideas to relatively simple
molecular systems.
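The whole bridge can be exercised end to end on a toy system: build Q from hypothetical energy levels, get A = −kT ln Q, obtain S from a numerical temperature derivative, and check U = A + TS against the direct mean energy. This is a sketch, not a physical model.

```python
import math

k = 1.380649e-23
E = [0.0, 2.0e-21, 5.0e-21]  # hypothetical microstate energies, J

def A(T):
    """Helmholtz energy A = -kT ln Q for the toy level scheme."""
    Q = sum(math.exp(-Ei / (k * T)) for Ei in E)
    return -k * T * math.log(Q)

T, dT = 300.0, 1e-3
S = -(A(T + dT) - A(T - dT)) / (2 * dT)  # S = -(dA/dT)_{N,V}
U = A(T) + T * S                         # U = A + TS

# compare with the direct mean energy <E> = sum_i E_i P_i
w = [math.exp(-Ei / (k * T)) for Ei in E]
U_direct = sum(Ei * wi for Ei, wi in zip(E, w)) / sum(w)

print(U, U_direct)  # the two values agree
```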

