Entropy in Statistical Mechanics
Parth G
1 Introduction
Hey there! In this document we will be answering questions on the definition of
entropy used in statistical mechanics. This will help us to really get a feel for
the concept and what it represents. We won’t be looking at all the far-reaching
implications of entropy that you might have heard of (such as the heat death
of the universe, for example). Instead, we’ll be focussing on the basics. You
may have heard of entropy being described as “a measure of disorder”, and by
working through these questions we will hopefully get a better understanding
of what that means.
In order to get the most out of these questions, I highly recommend you first
watch my YouTube video discussing the topic. The questions outlined here are
meant to be an extension to everything that is discussed in the video, whilst
also providing a deeper insight by solving some problems.
Most of the questions outlined here are fairly tricky, so please don’t let that put
you off! Give them a good go, and once you’ve made as much progress as you
can, feel free to check the answers at the end of this document. I will also make
a video walking through each of the problems in detail, so keep an eye out for
that on my channel. Let’s get into it!
2 Questions
2.1 Microstates
Imagine we are studying a system consisting of 4 particles in a box. Each of
these particles (A-D) can occupy an energy level with energy nE, where n is a
positive integer, and E is an arbitrary (but constant) amount of energy. (For
now, assume that the particles can be distinguished from each other).
a) What is the total energy of the system when the four particles occupy the
energy levels shown in Figure 1?
b) If the entire system contains 7E worth of energy, all of which is distributed
across the four particles, what is the total number of microstates the system
can occupy?
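If you’d like to check your count for part (b) after working it out by hand, a short brute-force script can enumerate the possibilities. This is just a sketch in Python (not part of the original problem), working in units of E:

```python
from itertools import product

# Each of the 4 distinguishable particles sits on a level n*E with n >= 1,
# so a microstate is an assignment (n_A, n_B, n_C, n_D) satisfying
# n_A + n_B + n_C + n_D = 7 (total energy 7E, measured in units of E).
total = 7
microstates = [levels
               for levels in product(range(1, total + 1), repeat=4)
               if sum(levels) == total]
print(len(microstates))
```

Running this prints the total number of microstates, which you can compare against your hand count.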
Figure 1: Particles A-D shown occupying specific energy levels in the system.
Figure 2: Particles A-C can occupy the energy levels shown in this diagram.
2.2 Entropy
Once again, assume the particles can be distinguished (told apart) from each
other. If the total energy of the system is 4E, what is the entropy of the system?
2.3 Units!
In statistical mechanics, the (Boltzmann) entropy of a system is defined as
S = kB ln(Ω)
where kB is the Boltzmann constant, and Ω is the total number of possible
microstates of the system. What are the units of entropy, S?
2.4 Distinguishable and Indistinguishable Particles
In the weird and wacky world of quantum mechanics, there happen to be certain
kinds of particle that are indistinguishable from each other. If we take two
(or more) of these particles, not only are they identical in every way to each
other, but they also cannot be labelled and tracked over time. If we label them
particles A and B, then some time later we have no way of knowing which of
the particles is A and which is B. All we know is that there are two particles.
This idea of indistinguishability (wow what a long word) is discussed in one of
my YouTube videos in more detail.
When physicists applied the ideas behind thermodynamics to the world of quan-
tum mechanics, they found that it was important to know whether the particles
in the system were distinguishable from each other or not. This example should
help us see why that is the case.
A system consists of 3 particles in a box. Each particle can occupy an energy
level of energy nE, where n and E hold their usual meanings. The total energy
contained within the system is 4E.
a) What is the total number of microstates the system can occupy if we assume
that the particles are distinguishable from each other?
b) What is the total number of microstates the system can occupy if we assume
that the particles are indistinguishable from each other?
c) How do your findings affect the entropy of each system?
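Once you’ve attempted parts (a) and (b), you can check your counts with a brute-force sketch in Python (again, not part of the original problem). Treating the particles as indistinguishable amounts to counting each multiset of occupied levels only once:

```python
from itertools import product

# Three particles, each on a level n*E (n >= 1), with total energy 4E:
# n_1 + n_2 + n_3 = 4, working in units of E.
total = 4

# Distinguishable particles: each ordered assignment (n_A, n_B, n_C)
# is a distinct microstate.
distinguishable = [s for s in product(range(1, total + 1), repeat=3)
                   if sum(s) == total]

# Indistinguishable particles: only which levels are occupied (and how
# many times) matters, so keep one representative per sorted tuple.
indistinguishable = {tuple(sorted(s)) for s in distinguishable}

print(len(distinguishable), len(indistinguishable))
```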
There is a more general expression for the entropy of a system, written in
terms of the probabilities of its microstates:
S = −kB Σ pi ln(pi)
This looks like a complicated expression, so let’s break it down before attempting
to answer a question about it. As we’ve seen already, pi is the probability of
our system occupying a particular microstate. Even more specifically, this is
the probability of our system occupying the ith microstate out of all possible
microstates.
Now the expression pi ln(pi) is simply equal to the probability of our system
occupying the ith microstate, multiplied by the natural logarithm of this
probability. The interesting part comes in when we sum over all i. Basically,
we find pi ln(pi) for each possible microstate, and then add all of these
values together. Finally, we multiply this sum by the constant −kB in order
to find the entropy of the system. This is shown more clearly in Figure 3.
Figure 3: How we can calculate the entropy of a system that can occupy three
different microstates.
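As a concrete illustration of the recipe shown in Figure 3, here is a short Python sketch. The three probabilities are made up purely for illustration; any set of probabilities summing to 1 would do:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, in J/K

def gibbs_entropy(probabilities):
    """S = -k_B * sum of p_i * ln(p_i) over all microstates i."""
    # Terms with p_i = 0 contribute nothing (p ln p -> 0 as p -> 0),
    # so they are skipped to avoid math.log(0).
    return -k_B * sum(p * math.log(p) for p in probabilities if p > 0)

# A system that can occupy three microstates with unequal probabilities:
print(gibbs_entropy([0.5, 0.25, 0.25]))  # entropy in J/K
```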
With all this in mind, show (and explain) how the expression S = −kB Σ pi ln(pi)
reduces to the more commonly seen expression S = kB ln(Ω), where Ω is the
total number of possible microstates of a system. Discuss the assumption(s)
made during this process.
Hint: The main assumption to be made is briefly discussed in my video on this
topic.
3 Solutions
Although I have typed up some basic solutions to the questions you have (hope-
fully) attempted, I’ll soon produce a video going through each of the questions
in more detail.
3.1 Microstates
a) 14E
b) There are 20 possible microstates the system can occupy. These are shown
in Figure 4.
Figure 4: All the possible microstates the system can occupy.
3.2 Entropy
The system can occupy Ω = 3 microstates: the 4E of energy can be distributed
across particles A-C as (2E, E, E), (E, 2E, E) or (E, E, 2E). Therefore
S = kB ln(3) ≈ 1.52 × 10⁻²³ J/K.
3.3 Units!
Ω, the number of possible microstates, is just a number. It is dimensionless.
Taking the natural logarithm of this number also keeps it dimensionless. There-
fore, the only units in the equation S = kB ln (Ω) are the units of kB .
Since kB has units of J/K, we find that entropy also has units of J/K, or
joules per kelvin.
Entropy can be thought of as some amount of energy per unit temperature,
and this links back nicely to the classical thermodynamic definition of entropy:
heat transferred to/from a system per unit temperature.
3.4 Distinguishable and Indistinguishable Particles
a) There are 3 possible microstates the system can occupy when the particles
are distinguishable from each other. These are shown in Figure 6.
b) There is only 1 possible microstate the system can occupy when the particles
are indistinguishable from each other, as shown in Figure 7.
Figure 6: All possible microstates for this system when the particles are distin-
guishable.
Figure 7: There is only one possible microstate for this system when the particles
are indistinguishable. All three diagrams are equivalent!
This is a tricky idea to get your head round - after all, it’s very difficult to
draw indistinguishable particles, and to visualise them. For a more detailed
explanation of indistinguishable particles, check out this video I made a while
ago.
c) The systems being discussed in parts (a) and (b) are almost identical to each
other. The only difference between them is that one contains distinguishable
particles, while the other contains indistinguishable particles. We have seen that
this results in a different number of microstates for each system. Therefore, the
entropies (S = kB ln (Ω)) of the two systems are different!
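For the final question, the key assumption is that every one of the Ω possible microstates is equally likely, so pi = 1/Ω for each i. Substituting this into S = −kB Σ pi ln(pi) gives S = −kB · Ω · (1/Ω) ln(1/Ω) = −kB ln(1/Ω) = kB ln(Ω). Here is a quick numerical sanity check of this reduction (a Python sketch, not part of the original solutions):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, in J/K

# With all Omega microstates equally likely (p_i = 1/Omega), the Gibbs
# entropy should collapse to the Boltzmann form k_B * ln(Omega).
for omega in (1, 3, 20, 1000):
    p = 1.0 / omega
    gibbs = -k_B * sum(p * math.log(p) for _ in range(omega))
    boltzmann = k_B * math.log(omega)
    assert math.isclose(gibbs, boltzmann)
```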
4 Conclusion
That rounds up the five questions I have written for this topic. I hope you
enjoyed attempting them, and that they provided you with some insight into
the world of entropy.
I also hope that this document serves as a good supplement to the video I made
on the topic. I really enjoyed typing this stuff up, and I’ll be looking to make
something like this for future videos. I’m considering starting a Patreon and
chucking these written documents on there. Or maybe I’ll make the documents
easily accessible and put the solution videos on Patreon - I don’t know yet. I’d
love to hear your thoughts about this, so drop me a message either on one of
my YouTube videos, or on Instagram @parthvlogs. It’s also possible (and fairly
likely) that there’s a mistake or two somewhere in this document. Please do let
me know if you find one - again, either on YouTube or on Instagram.
Lastly, I’d like to thank you for all your wonderful support, it really means a
lot to me. You rock! I look forward to seeing you in the next one of these
documents. :)