Boltzmann's entropy formula

In statistical mechanics, Boltzmann's equation (also known as the Boltzmann–Planck equation) is a probability
equation relating the entropy S of an ideal gas to the quantity W, the number of real microstates corresponding to the
gas's macrostate:

S = k_B \ln W        (1)

(Image: Boltzmann's equation, as carved on his gravestone.[1])

where k_B is the Boltzmann constant (also written as simply k), equal to 1.380649 × 10−23 J/K.

In short, the Boltzmann formula shows the relationship between entropy and the number of ways the atoms or
molecules of a thermodynamic system can be arranged.
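
As a concrete illustration, here is a minimal Python sketch of equation (1); the value of W is invented for the
example and is not from the article.

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def boltzmann_entropy(W: int) -> float:
        """Entropy S = k_B * ln(W) of a macrostate realized by W microstates."""
        return k_B * math.log(W)

    # Hypothetical macrostate with 10^20 microstates.
    print(boltzmann_entropy(10**20))  # about 6.36e-22 J/K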

Contents
History
Generalization
Boltzmann entropy excludes statistical dependencies
See also
References
External links

History
The equation was originally formulated by Ludwig Boltzmann between 1872 and 1875,
but later put into its current form by Max Planck in about 1900.[2][3] To quote Planck,
"the logarithmic connection between entropy and probability was first stated by L.
Boltzmann in his kinetic theory of gases".

The value of W was originally intended to be proportional to the Wahrscheinlichkeit (the German word for
probability) of a macroscopic state for some probability
distribution of possible microstates—the collection of (unobservable) "ways" the
(observable) thermodynamic state of a system can be realized by assigning different
positions and momenta to the various molecules. Interpreted in this way, Boltzmann's
formula is the most general formula for the thermodynamic entropy. However,
Boltzmann's paradigm was an ideal gas of N identical particles, of which Ni are in the
i-th microscopic condition (range) of position and momentum. For this case, the
probability of each microstate of the system is equal, so it was equivalent for
Boltzmann to calculate the number of microstates associated with a macrostate. W
was historically misinterpreted as literally meaning the number of microstates, and
that is what it usually means today. W can be counted using the formula for
permutations

W = \frac{N!}{\prod_i N_i!}        (2)

where i ranges over all possible molecular conditions and "!" denotes factorial. The "correction" in the denominator is
due to the fact that identical particles in the same condition are indistinguishable. W is sometimes called the
"thermodynamic probability" since it is an integer greater than one, while mathematical probabilities are always
numbers between zero and one.

(Image: Boltzmann's grave in the Zentralfriedhof, Vienna, with bust and entropy formula.)
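
The permutation count in equation (2) is easy to evaluate directly; the following minimal Python sketch uses
invented occupation numbers purely for illustration.

    import math

    # Hypothetical example: N = 10 particles distributed over three
    # molecular conditions as N_1 = 5, N_2 = 3, N_3 = 2.
    occupations = [5, 3, 2]
    N = sum(occupations)

    # Equation (2): W = N! / (N_1! * N_2! * ...)
    W = math.factorial(N)
    for n in occupations:
        W //= math.factorial(n)

    print(W)  # 2520 ways to realize this macrostate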

Generalization
Boltzmann's formula applies to microstates of the universe as a whole, each possible microstate of which is presumed
to be equally probable.
But in thermodynamics it is important to be able to make the approximation of dividing the universe into a system of
interest, plus its surroundings; and then to be able to identify the entropy of the system with the system entropy in
classical thermodynamics. The microstates of such a thermodynamic system are not equally probable—for example,
high energy microstates are less probable than low energy microstates for a thermodynamic system kept at a fixed
temperature by allowing contact with a heat bath. For thermodynamic systems where microstates of the system may
not have equal probabilities, the appropriate generalization, called the Gibbs entropy, is:

S = -k_B \sum_i p_i \ln p_i        (3)

This reduces to equation (1) if the probabilities pi are all equal.
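
That reduction is straightforward to check numerically. A small Python sketch (the value of W is arbitrary, chosen
only for the demonstration):

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    W = 2520                 # arbitrary number of equally probable microstates
    p = [1.0 / W] * W        # uniform distribution, p_i = 1/W

    gibbs = -k_B * sum(pi * math.log(pi) for pi in p)  # equation (3)
    boltzmann = k_B * math.log(W)                      # equation (1)

    print(math.isclose(gibbs, boltzmann))  # True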

Boltzmann used a ρ ln ρ formula as early as 1866.[4] He interpreted ρ as a density in phase space (without
mentioning probability), but since this satisfies the axiomatic definition of a probability measure we can
retrospectively interpret it as a probability anyway. Gibbs gave an explicitly probabilistic interpretation in 1878.

Boltzmann himself used an expression equivalent to (3) in his later work[5] and recognized it as more general than
equation (1). That is, equation (1) is a corollary of equation (3), not the other way around: in every situation where
equation (1) is valid, equation (3) is valid also, while the converse does not hold.

Boltzmann entropy excludes statistical dependencies
The term Boltzmann entropy is also sometimes used to indicate entropies calculated based on the approximation
that the overall probability can be factored into an identical separate term for each particle—i.e., assuming each
particle has an identical independent probability distribution, and ignoring interactions and correlations between
the particles. This is exact for an ideal gas of identical particles, and may or may not be a good approximation for
other systems.[6]

The Boltzmann entropy is obtained if one assumes one can treat all the component particles of a thermodynamic
system as statistically independent. The probability distribution of the system as a whole then factorises into the
product of N separate identical terms, one term for each particle; and the Gibbs entropy simplifies to the Boltzmann
entropy

S = -N k_B \sum_i p_i \ln p_i        (4)

where the summation is taken over each possible state in the 6-dimensional phase space of a single particle (rather
than the 6N-dimensional phase space of the system as a whole).
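
Under the stated independence assumption, the factorization can be verified numerically: the Gibbs entropy (3) of
the N-particle product distribution equals the single-particle expression (4). A Python sketch using an invented
single-particle distribution:

    import itertools
    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    p_single = [0.5, 0.3, 0.2]  # hypothetical single-particle state probabilities
    N = 4                       # kept small so the product space stays enumerable

    # Equation (4): S = -N k_B * sum_i p_i ln p_i over single-particle states.
    s_boltzmann = -N * k_B * sum(p * math.log(p) for p in p_single)

    # Gibbs entropy (3) over the full N-particle product distribution.
    s_gibbs = 0.0
    for joint in itertools.product(p_single, repeat=N):
        P = math.prod(joint)            # joint probability of this system state
        s_gibbs -= k_B * P * math.log(P)

    print(math.isclose(s_boltzmann, s_gibbs))  # True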

This reflects the original statistical entropy function introduced by Ludwig Boltzmann in 1872. For the special case of
an ideal gas it exactly corresponds to the proper thermodynamic entropy.

However, for anything but the most dilute of real gases, it leads to increasingly wrong predictions of entropies and
physical behaviours, by ignoring the interactions and correlations between different molecules. Instead one must
follow Gibbs, and consider the ensemble of states of the system as a whole, rather than single particle states.

See also
History of entropy
Gibbs entropy
nat (unit)
von Neumann entropy

References
1. See: photo of Boltzmann's grave in the Zentralfriedhof, Vienna, with bust and entropy formula.
2. Boltzmann equation (http://scienceworld.wolfram.com/physics/BoltzmannEquation.html). Eric Weisstein's World of
Physics (states the year was 1872).
3. Perrot, Pierre (1998). A to Z of Thermodynamics. Oxford University Press. ISBN 0-19-856552-6. (states the year was 1875)
4. Ludwig Boltzmann (1866). "Über die Mechanische Bedeutung des Zweiten Hauptsatzes der Wärmetheorie". Wiener
Berichte. 53: 195–220.
5. Ludwig Boltzmann (1896). Vorlesungen über Gastheorie, vol. I. J.A. Barth, Leipzig.; Ludwig Boltzmann (1898). Vorlesungen
über Gastheorie, vol. II. J.A. Barth, Leipzig.
6. Jaynes, E. T. (1965). Gibbs vs Boltzmann entropies (http://bayes.wustl.edu/etj/articles/gibbs.vs.boltzmann.pdf). American
Journal of Physics, 33, 391-8.
External links
Introduction to Boltzmann's Equation (https://web.archive.org/web/20021219005150/http://www.chemsoc.org/exemplarchem/entries/pkirby/exemchem/Boltzmann/Boltzmann.html)
Vorlesungen über Gastheorie, Ludwig Boltzmann (1896) vol. I, J.A. Barth, Leipzig (https://archive.org/details/vorlesungenberg01boltgoog)
Vorlesungen über Gastheorie, Ludwig Boltzmann (1898) vol. II, J.A. Barth, Leipzig (https://archive.org/details/vorlesungenberg02boltgoog)
