
Entropy in Statistical Mechanics

Parth G

1 Introduction
Hey there! In this document we will be answering questions on the definition of
entropy used in statistical mechanics. This will help us to really get a feel for
the concept and what it represents. We won’t be looking at all the far-reaching implications of entropy that you might have heard of (such as the heat death of the universe). Instead, we’ll be focussing on the basics. You
may have heard of entropy being described as “a measure of disorder”, and by
working through these questions we will hopefully get a better understanding
of what that means.
In order to get the most out of these questions, I highly recommend you first
watch my YouTube video discussing the topic. The questions outlined here are
meant to be an extension to everything that is discussed in the video, whilst
also providing a deeper insight by solving some problems.
Most of the questions outlined here are fairly tricky, so please don’t let that put
you off! Give them a good go, and once you’ve made as much progress as you
can, feel free to check the answers at the end of this document. I will also make
a video walking through each of the problems in detail, so keep an eye out for
that on my channel. Let’s get into it!

2 Questions
2.1 Microstates
Imagine we are studying a system consisting of 4 particles in a box. Each of
these particles (A-D) can occupy an energy level with energy nE, where n is a
positive integer, and E is an arbitrary (but constant) amount of energy. (For
now, assume that the particles can be distinguished from each other).
a) What is the total energy of the system when the four particles occupy the
energy levels shown in Figure 1?
b) If the entire system contains 7E worth of energy, all of which is distributed
across the four particles, what is the total number of microstates the system
can occupy?

Figure 1: Particles A-D shown occupying specific energy levels in the system.

2.2 Calculating Entropy


A system consists of 3 particles in a box. Each particle can occupy an energy
level with energy nE, as shown in Figure 2.

Figure 2: Particles A-C can occupy the energy levels shown in this diagram.

Once again, assume the particles can be distinguished (told apart) from each
other. If the total energy of the system is 4E, what is the entropy of the system?

2.3 Units!
In statistical mechanics, the (Boltzmann) entropy of a system is defined as

S = kB ln(Ω)

where S is the entropy of the system, kB is the Boltzmann constant, and Ω is the number of microstates the system can occupy.
The Boltzmann constant has (S.I.) units of J/K. What are the units of entropy?

2.4 Distinguishable and Indistinguishable Particles
In the weird and wacky world of quantum mechanics, there happen to be certain
kinds of particle that are indistinguishable from each other. If we take two
(or more) of these particles, not only are they identical in every way to each
other, but they also cannot be labelled and tracked over time. If we label them
particles A and B, then some time later we have no way of knowing which of
the particles is A and which is B. All we know is that there are two particles.
This idea of indistinguishability (wow what a long word) is discussed in one of
my YouTube videos in more detail.
When physicists applied the ideas behind thermodynamics to the world of quan-
tum mechanics, they found that it was important to know whether the particles
in the system were distinguishable from each other or not. This example should
help us see why that is the case.
A system consists of 3 particles in a box. Each particle can occupy an energy
level of energy nE, where n and E hold their usual meanings. The total energy
contained within the system is 4E.
a) What is the total number of microstates the system can occupy if we assume
that the particles are distinguishable from each other?
b) What is the total number of microstates the system can occupy if we assume
that the particles are indistinguishable from each other?
c) How do your findings affect the entropy of each system?

2.5 Arriving at the Entropy Equation


In this document so far, we have seen entropy being defined as S = kB ln(Ω),
where all quantities hold their usual meanings. However, this is not the most
general definition of entropy in statistical mechanics. The most general defini-
tion is dependent on the probability, pi, with which the system can occupy a
particular microstate.
This definition of entropy is actually given by


S = −kB ∑i pi ln(pi).

This looks like a complicated expression, so let’s break it down before attempting
to answer a question about it. As we’ve seen already, pi is the probability of
our system occupying a particular microstate. Even more specifically, this is
the probability of our system occupying the ith microstate out of all possible
microstates.
Now the expression pi ln(pi ) is simply equal to the probability of our system
occupying the ith microstate multiplied by the natural logarithm of this proba-
bility. The interesting part comes in when we sum over all i. Basically, we find pi ln(pi) for each possible microstate, and then add all of these values together.
Finally, we multiply this sum by a constant (−kB ) in order to find the entropy
of the system. This is shown more clearly in Figure 3.

Figure 3: How we can calculate the entropy of a system that can occupy three
different microstates.
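To make this recipe concrete, here’s a quick Python sketch of the calculation. (The three probability values are made up purely for illustration - they’re not taken from Figure 3.)

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

# Hypothetical probabilities of the system occupying each of its
# three microstates. They must add up to 1.
p = [0.5, 0.3, 0.2]

# Find p_i * ln(p_i) for every microstate, add them all up, and
# multiply the sum by -k_B to get the entropy.
S = -k_B * sum(p_i * math.log(p_i) for p_i in p)
print(S)  # entropy in J/K
```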

With all this in mind, show (and explain) how the expression S = −kB ∑i pi ln(pi)
reduces to the more commonly seen expression S = kB ln(Ω), where Ω is the
total number of possible microstates of a system. Discuss the assumption(s)
made during this process.
Hint: The main assumption to be made is briefly discussed in my video on this
topic.

3 Solutions
Although I have typed up some basic solutions to the questions you have (hope-
fully) attempted, I’ll soon produce a video going through each of the questions
in more detail.

3.1 Microstates
a) 14E
b) There are 20 possible microstates the system can occupy. These are shown
in Figure 4.

Figure 4: All the possible microstates the system can occupy.
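By the way, if you’d rather not draw all twenty diagrams, here’s a little Python sketch (my own check, not something from the video) that brute-forces the count:

```python
from itertools import product

# Four distinguishable particles, each in a level n = 1, 2, 3, ...
# With 7E in total, no particle can sit above level 4, since the
# other three need at least 1E each. So levels 1-4 cover everything.
microstates = [levels for levels in product(range(1, 5), repeat=4)
               if sum(levels) == 7]

print(len(microstates))  # prints 20
```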

3.2 Calculating Entropy

Figure 5: The three microstates our system can occupy.

Total number of microstates: Ω = 3


Entropy of the system: S = kB ln(3) ≈ 1.52 × 10^−23 J/K (2 d.p.)
We see here that the entropy of a system depends directly on the number of ways it can be arranged. This is where the description of entropy as a measure of disorder comes in. If a system can only be arranged in a small number of ways (i.e. Ω is small), it is considered an “ordered” system, and its entropy is relatively small. If it can be arranged in a large number of ways (Ω is large), it is considered a “disordered” system, and its entropy is large.
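As a quick numerical check of the value above, here’s a tiny calculation in Python (using the exact SI value of the Boltzmann constant):

```python
import math

k_B = 1.380649e-23     # Boltzmann constant in J/K (exact SI value)
S = k_B * math.log(3)  # three possible microstates
print(S)               # roughly 1.52e-23 J/K
```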

3.3 Units!
Ω, the number of possible microstates, is just a number. It is dimensionless.
Taking the natural logarithm of this number also keeps it dimensionless. There-
fore, the only units in the equation S = kB ln (Ω) are the units of kB .
We find, then, that entropy also has the units of J/K, or Joules per Kelvin.
Entropy can be thought of as some amount of energy per unit temperature, and this links back nicely to the classical thermodynamic definition of entropy: heat transferred to/from a system per unit temperature.

3.4 Distinguishable and Indistinguishable Particles


a) If the particles can be distinguished from each other, we can label them
particles A, B, and C. In order for 3 particles to share 4E worth of energy, two
of the particles must be in the 1E energy level, and the third must be in the
2E energy level. Because the particles are distinguishable from each other, we
can have three possible microstates: one where particle A is in the 2E level,
one where particle B is in the 2E level, and one where particle C is in the 2E
level. In other words, Ω = 3.

Figure 6: All possible microstates for this system when the particles are distin-
guishable.

b) If the particles are indistinguishable from each other, we have no way of knowing which of the particles is in the 2E level. All we can know is that
there is one particle in this higher level. In other words, however we draw
our system, all three diagrams shown below are equivalent! There is only one
possible microstate the system can occupy. Ω = 1.

Figure 7: There is only one possible microstate for this system when the particles
are indistinguishable. All three diagrams are equivalent!

This is a tricky idea to get your head round - after all, it’s very difficult to
draw indistinguishable particles, and to visualise them. For a more detailed
explanation of indistinguishable particles, check out this video I made a while
ago.
c) The systems being discussed in parts (a) and (b) are almost identical to each
other. The only difference between them is that one contains distinguishable
particles, while the other contains indistinguishable particles. We have seen that

this results in a different number of microstates for each system. Therefore, the entropies (S = kB ln (Ω)) of the two systems are different! In fact, since ln(1) = 0, the indistinguishable system has zero entropy, while the distinguishable one has entropy kB ln(3) > 0.
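If you fancy checking this counting by brute force, here’s a short Python sketch (again, my own construction). Ordered tuples play the role of distinguishable particles; sorting each tuple erases the labels, which models the indistinguishable case:

```python
from itertools import product

# Three particles sharing 4E, each in a level n >= 1. No particle
# can sit above level 2 (the other two need at least 1E each).
distinguishable = [levels for levels in product(range(1, 3), repeat=3)
                   if sum(levels) == 4]

# Sorting erases the particle labels: (2,1,1), (1,2,1) and (1,1,2)
# all collapse into the single arrangement (1,1,2).
indistinguishable = {tuple(sorted(levels)) for levels in distinguishable}

print(len(distinguishable))    # 3 microstates
print(len(indistinguishable))  # 1 microstate
```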

3.5 Arriving at the Entropy Equation


In statistical mechanics (the study of systems at the particle level, rather than systems as a whole), entropy was initially defined as S = −kB ∑i pi ln(pi).
In this equation, pi is the probability of our system occupying a particular
microstate. Specifically, the ith microstate.
For isolated systems (i.e. ones that do not interact with any external objects
or systems), if they have reached thermal equilibrium, we make the assump-
tion of equal a priori probability, or the fundamental assumption of statistical
thermodynamics. This is the assumption that the system is equally likely to
occupy any one of the possible microstates (assuming we only know how many
particles are in the system and the total energy of the system). Mathemati-
cally, we can write this as p1 = p2 = p3 = ... = pΩ (since there are Ω possible
microstates).
This means that each microstate has a probability of 1/Ω of being occupied.
Therefore, for each microstate, pi ln (pi ) = (1/Ω) ln (1/Ω). We add these up for
each microstate in order to satisfy the summation over i in the original equation.
This means we add the expression up Ω times.
We find S = (−kB) × Ω × (1/Ω) ln(1/Ω), and the factors of Ω cancel to leave

S = (−kB) × ln(1/Ω).

Let’s recall that 1/Ω can be written as Ω^(−1). We can also use the property of logarithms that ln(a^b) = b ln(a) to find that ln(Ω^(−1)) = (−1) ln(Ω). Substituting this into our expression for entropy thus far, we finally find

S = (−kB) × (−1) × ln(Ω)

S = kB ln(Ω).
We have arrived at the commonly discussed equation for the entropy of an
isolated system in thermal equilibrium!
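As a final sanity check, this little Python sketch confirms numerically that the two formulas agree whenever all Ω microstates are equally likely:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

for omega in (2, 3, 10, 100):
    p = 1 / omega  # equal a priori probability for each microstate
    gibbs = -k_B * sum(p * math.log(p) for _ in range(omega))
    boltzmann = k_B * math.log(omega)
    print(omega, math.isclose(gibbs, boltzmann))  # True every time
```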

4 Conclusion
That wraps up the five questions I have written for this topic. I hope you
enjoyed attempting them, and that they provided you with some insight into
the world of entropy.
I also hope that this document serves as a good supplement to the video I made
on the topic. I really enjoyed typing this stuff up, and I’ll be looking to make
something like this for future videos. I’m considering starting a Patreon and
chucking these written documents on there. Or maybe I’ll make the documents easily accessible and put the solution videos on Patreon - I don’t know yet. I’d
love to hear your thoughts about this, so drop me a message either on one of
my YouTube videos, or on Instagram @parthvlogs. It’s also possible (and fairly
likely) that there’s a mistake or two somewhere in this document. Please do let
me know if you find one - again, either on YouTube or on Instagram.
Lastly, I’d like to thank you for all your wonderful support, it really means a
lot to me. You rock! I look forward to seeing you in the next one of these
documents. :)
