
Lecture 5

Entropy

James Chou BCMP201 Spring 2008

Some common definitions of entropy

A measure of the disorder of a system.

A measure of the amount of energy in a system that is available for doing work; entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity.

The differential of heat (Q), dQ, is not an exact differential and therefore cannot be integrated directly. We therefore introduce an integrating factor (1/T) such that dQ/T can be integrated; this quantity dQ/T is called entropy.

Strategies for understanding entropy

The concept of maximum entropy in statistics

Establish the link between statistical entropy and physical entropy

Use the principle of maximum entropy to explain the energetic properties of molecules in the nanoworld

An interesting observation

Random distribution of kinetic energy through random collisions

Consider the velocity in the x direction. At equilibrium it follows a Gaussian distribution:

f(v_x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\left(-\frac{v_x^2}{2\sigma^2}\right)

where \sigma^2 is the average of (v_x - \langle v_x\rangle)^2 and

\sigma = \sqrt{\langle v_x^2\rangle} = \sqrt{\frac{k_B T}{m}}

[Figure: bell-shaped distribution of v_x, centered at 0, plotted from -v_x to +v_x.]

Goal of this lecture:


Use the fundamental principle of maximum entropy to explain the physical properties of a complex system in equilibrium with the universe.

Maximum Entropy = Minimum Bias

Principle of Maximum-Entropy in Statistics


Given some information or constraints about a random variable, we should choose the probability distribution that is consistent with the given information but otherwise has the maximum uncertainty associated with it.

Example: Rolling a die with 6 possible outcomes. The only constraint we have is

P ( X = 1) + P ( X = 2) + ... + P ( X = 6) = 1
Without additional information about the die, the most unbiased distribution is such that all outcomes are equally probable.

P(X = 1) = P(X = 2) = ... = P(X = 6) = 1/6

Shannon's Measure of Uncertainty

Shannon [1948] suggested the following measure of uncertainty, which is commonly known as the statistical entropy.

H = -\sum_{i=1}^{n} p_i \ln p_i

1. H is a non-negative function of p_1, p_2, ..., p_n.

2. H = 0 if one outcome has probability of 1.

3. H is maximum when the outcomes are equally likely. In the case of the die, you will find the maximum entropy to be

H = -\sum_{i=1}^{6} p_i \ln p_i = \ln 6.
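As a quick numerical check of these three properties, here is a minimal Python sketch (not from the lecture; the biased distributions are made up for illustration). It computes H for a few die distributions and confirms that the uniform die gives the largest value, ln 6 ≈ 1.79.

```python
import math

def shannon_entropy(p):
    """H = -sum_i p_i ln p_i, with the convention 0 * ln(0) = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [1/6] * 6                        # unbiased die
loaded = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]    # a biased die (hypothetical)
certain = [1.0, 0, 0, 0, 0, 0]             # one outcome has probability 1

print(shannon_entropy(uniform), math.log(6))   # ~1.7918 = ln 6, the maximum
print(shannon_entropy(loaded))                 # smaller than ln 6
print(shannon_entropy(certain))                # 0.0
```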

A quick review on logarithm

\log_e x = \ln x, \qquad e \approx 2.718

\ln(AB) = \ln A + \ln B, \qquad \ln(A/B) = \ln A - \ln B, \qquad \frac{d}{dx}\ln x = \frac{1}{x}

Stirling approximation: \ln(N!) \approx N\ln N - N, for very large N
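A minimal numerical sketch (added here for illustration) shows how accurate the Stirling approximation becomes as N grows; math.lgamma(N + 1) gives the exact ln(N!).

```python
import math

for N in (10, 100, 1000, 10_000):
    exact = math.lgamma(N + 1)          # exact ln(N!)
    stirling = N * math.log(N) - N      # Stirling approximation
    print(N, exact, stirling, (exact - stirling) / exact)  # relative error shrinks with N
```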

Shannon's entropy in terms of the number of possible outcomes. Example: the number of outcomes \Omega from rolling the die N times:

\Omega = \frac{N!}{(Np_1)!\,(Np_2)!\cdots(Np_6)!}


\ln\Omega = \ln N! - \sum_{i=1}^{6}\ln(Np_i)!

(N! counts the permutations of the N rolls; the factorials in the denominator factor out the redundant outcomes, i.e. sequences that differ only in the order of the rolls.)

Using Stirling's approximation for very large N, \ln N! \approx N\ln N - N, \ln\Omega becomes

\ln\Omega = N\ln N - \sum_{i=1}^{6} Np_i\ln(Np_i) = N\ln N - \ln N\sum_{i=1}^{6} Np_i - N\sum_{i=1}^{6} p_i\ln p_i = -N\sum_{i=1}^{6} p_i\ln p_i

\ln\Omega = -N\sum_{i=1}^{6} p_i\ln p_i = NH

Conclusion: \ln\Omega is linearly proportional to H. Therefore, maximizing the total number of possible outcomes is equivalent to maximizing Shannon's statistical entropy.
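This claim can be checked numerically. The sketch below (illustrative only; the die probabilities are made up) computes the exact multinomial count \Omega for N rolls using log-factorials and shows that \ln\Omega / N approaches H as N grows.

```python
import math

def ln_multinomial(N, p):
    """ln[ N! / ((N p1)! (N p2)! ...) ] computed exactly with log-factorials."""
    counts = [round(N * pi) for pi in p]   # occupation numbers N * p_i (must sum to N)
    return math.lgamma(N + 1) - sum(math.lgamma(c + 1) for c in counts)

def shannon_entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.3, 0.2, 0.2, 0.1, 0.1, 0.1]        # a made-up die distribution
for N in (60, 600, 6000, 60_000):
    print(N, ln_multinomial(N, p) / N, shannon_entropy(p))  # ln(Omega)/N -> H as N grows
```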

Statistical Entropy

H = \mathrm{Const}\times\ln\Omega, \qquad \Omega = # of possible outcomes

Entropy in Statistical Physics


Definition of physical entropy:

S = \mathrm{const}\times\ln\Omega,

where \Omega = # of possible microstates of a closed system.

A microstate is the detailed state of a physical system. Example: In an ideal gas, a microstate consists of the position and velocity of every molecule in the system. So the number of microstates is just what Feynman said: the number of different ways the inside of the system can be changed without changing the outside.

Principle of maximum entropy (The second law of thermodynamics)


If a closed system is not in a state of statistical equilibrium, its macroscopic state will vary in time, until ultimately the system reaches a state of maximum entropy. Moreover, at equilibrium, all microstates are equally probable.

An example of maximizing entropy:

S = Const × ln(# of velocity states × # of position states). The number of velocity states does not change; the number of position states does.

\Delta S = S_2 - S_1 = \mathrm{Const}\times\left[\ln\Omega_2^{v} + \ln\Omega_2^{r} - \ln\Omega_1^{v} - \ln\Omega_1^{r}\right]

\Delta S = \mathrm{Const}\times\left[\ln(2V)^N - \ln V^N\right] = \mathrm{Const}\times N\ln 2
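As a small worked example (a sketch, assuming one mole of gas and taking the constant to be k_B, as the lecture defines a few slides later), the entropy gained when the gas expands to fill twice the volume is \Delta S = N k_B \ln 2 ≈ 5.8 J/K.

```python
import math

k_B = 1.38e-23    # J/K, Boltzmann's constant
N_0 = 6.02e23     # molecules per mole (Avogadro's number)

N = N_0                              # one mole of gas molecules (assumed)
delta_S = N * k_B * math.log(2)      # Const * N ln 2, with Const = k_B
print(delta_S)                       # ~5.8 J/K gained when the volume doubles
```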

What is temperature?
[Figure: two systems with energies E_1 and E_2 in thermal contact, shown before equilibrium and at equilibrium. Picture from hyperphysics.phy-astr.gsu.edu]

E = E_1 + E_2 = \mathrm{const.}, \qquad dE_1 = -dE_2

S = \mathrm{Const}\times\ln(\Omega_1\Omega_2) = S_1(E_1) + S_2(E_2)


Maximize S,

\frac{dS}{dE_1} = \frac{dS_1}{dE_1} + \frac{dS_2}{dE_2}\frac{dE_2}{dE_1} = \frac{dS_1}{dE_1} - \frac{dS_2}{dE_2} = 0

At equilibrium,

\frac{dS_1}{dE_1} = \frac{dS_2}{dE_2}

Temperature T is defined by 1/T = dS/dE. The temperatures of bodies in equilibrium with one another are equal.

What is the physical unit of T?


Since T is measured at a fixed number of particles N and volume V, a more stringent definition is T = (dE/dS)_{N,V}. Thus far, S is defined to be \mathrm{const.}\times\ln\Omega. If S is a dimensionless quantity, T has the dimensions of energy (e.g. in units of joules, J). But the joule is an inconveniently large unit for this purpose. Example: room temperature = 404.34 × 10^{-23} J!

It is more convenient to measure T in kelvin (K). The conversion factor between energy and degrees is Boltzmann's constant, k_B = 1.38 × 10^{-23} J/K. Hence we redefine S and T by incorporating the conversion factor.

S = k_B\ln\Omega \qquad\text{and}\qquad T \rightarrow T/k_B.

What does T = (dE/dS)_{N,V} mean?


Lower T system:

S_1 = k_B\ln\!\left(\frac{5!}{2!\,2!}\right), \qquad S_2 = k_B\ln\!\left(\frac{5!}{3!\,2!}\right), \qquad \frac{\Delta E}{\Delta S} = \frac{\epsilon}{k_B\ln 3}

Higher T system:

S_1 = k_B\ln\!\left(\frac{5!}{2!\,2!}\right), \qquad S_2 = k_B\ln\!\left(\frac{5!}{3!\,2!}\right), \qquad \frac{\Delta E}{\Delta S} = \frac{3\epsilon}{k_B\ln 3}

[Figure: energy-level diagrams (levels \epsilon, 2\epsilon, 3\epsilon) showing the rearrangement of particles in each system as it gives up energy \Delta E.]

Same change in entropy, but more energy is given away by the system initially with higher T. Hence temperature is a measure of the tendency of an object to spontaneously give up energy to its surroundings.

Can we derive the equation of state of a gas (PV = nRT) from the concept of entropy?

Step 1: Evaluate S = k_B\ln\Omega.

\Omega = \Omega_v \times \Omega_r

\Omega_v = # of velocity states, \qquad \Omega_r = # of position states

For one particle, \Omega_v \propto 4\pi v^2, where v is the speed in 3D. For N particles, \Omega_v \propto 4\pi v^{3N-1} \approx 4\pi v^{3N} for large N. Since v \propto E^{1/2},

S = k_B\ln\Omega = \frac{3 k_B N}{2}\ln E + \mathrm{Const.}

Step 2: Relate kinetic energy E to temperature.

From

S = k_B\ln\Omega = \frac{3 k_B N}{2}\ln E + \mathrm{Const.},

\frac{1}{T} = \frac{dS}{dE} = \frac{3 k_B N}{2E} \quad\Rightarrow\quad E = \frac{3}{2}k_B N T = \frac{3}{2}k_B (n N_0) T

where N_0 = 6.02 × 10^{23} mol^{-1} (Avogadro's number), n = # of moles, and R = k_B N_0 = 8.314 J mol^{-1} K^{-1} (the gas constant).

Energy of n moles of ideal gas: E = \frac{3}{2}nRT. \qquad Energy of one ideal gas molecule: E = \frac{3}{2}k_B T.
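Plugging in numbers (a sketch; room temperature is taken as 298 K, an assumption not stated on the slide):

```python
k_B = 1.38e-23        # J/K, Boltzmann's constant
N_0 = 6.02e23         # 1/mol, Avogadro's number
R = k_B * N_0         # gas constant, ~8.31 J mol^-1 K^-1
T = 298.0             # K, assumed room temperature

E_per_mole = 1.5 * R * T           # E = (3/2) nRT with n = 1 mol, ~3.7 kJ
E_per_molecule = 1.5 * k_B * T     # E = (3/2) k_B T, ~6.2e-21 J
print(R, E_per_mole, E_per_molecule)
```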

Step 3: Relate temperature to pressure.


For a particle in a box, each collision with a wall occurs in a time interval of 2L/v_x, and the change in momentum (e.g. in the x direction), \Delta p_x, is 2mv_x.

F = \frac{\Delta p_x}{\Delta t} = \frac{2mv_x}{2L/v_x} = \frac{m v_x^2}{L}

For N particles in a box,

F = \sum_{i=1}^{N}\frac{m v_{x,i}^2}{L} = \frac{N m \langle v_x^2\rangle}{L}.

Since P = F/L^2 = Nm\langle v_x^2\rangle/V and E = \frac{N}{2}m\langle v^2\rangle = \frac{3N}{2}m\langle v_x^2\rangle (because \langle v^2\rangle = \langle v_x^2\rangle + \langle v_y^2\rangle + \langle v_z^2\rangle = 3\langle v_x^2\rangle),

we obtain PV = \frac{2}{3}E. Finally, since E = \frac{3}{2}nRT, we obtain PV = nRT.

How do we deal with the enormous complexity of a biological system?

Boltzmann and Gibbs Distribution

Goal: Describe the probability distribution of a molecule of interest, a, with energy E_a, in equilibrium with a macroscopic thermal reservoir (the surrounding, B) with energy E_B.

The second law says that at equilibrium, or maximum entropy, all microstates are equally probable, with probability P_0.

In the joint system, the probability that the molecule of interest is in a particular state with energy E_a is

p(E_a) = \Omega_B(E_B) \times P_0 = \exp\!\left(\frac{S_B(E_B)}{k_B}\right)\times P_0,

where S_B(E_B) = k_B\ln\Omega_B(E_B). The total energy is fixed:

E_\mathrm{tot} = E_B + E_a \approx E_B.

We can use a first-order Taylor expansion to approximate S_B(E_B), because E_B is very near E_\mathrm{tot}:

S_B(E_B) \approx S_B(E_\mathrm{tot}) - \frac{dS_B(E_\mathrm{tot})}{dE_B}E_a = S_B(E_\mathrm{tot}) - \frac{E_a}{T}

Hence we obtain

p(E_a) = P_0\exp\!\left(\frac{S_B(E_\mathrm{tot})}{k_B}\right)\exp\!\left(\frac{-E_a}{k_B T}\right) = A\exp\!\left(\frac{-E_a}{k_B T}\right)

Boltzmann Distribution, also known as the Gibbs Distribution

IMPORTANT: The probability distribution of the molecule of interest in equilibrium with its surrounding depends only on the temperature of the surrounding.
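A minimal sketch of how this distribution is used in practice (the energy gap and temperature are illustrative values, not from the lecture): the relative population of two states of the molecule depends only on the energy difference and the temperature of the surroundings.

```python
import math

k_B = 1.38e-23      # J/K, Boltzmann's constant
T = 298.0           # K, assumed temperature of the surroundings
delta_E = 5.0e-21   # J, hypothetical energy gap between two states of the molecule

ratio = math.exp(-delta_E / (k_B * T))   # p(high)/p(low); the prefactor A cancels
p_low = 1.0 / (1.0 + ratio)              # normalized population of the lower state
p_high = ratio / (1.0 + ratio)           # normalized population of the higher state
print(ratio, p_low, p_high)
```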

Now we can explain the velocity distribution at equilibrium using the Boltzmann distribution

Random distribution of kinetic energy through random collisions:

P = A\exp\!\left(\frac{-E}{k_B T}\right) = A\exp\!\left(\frac{-\tfrac{1}{2}mv^2}{k_B T}\right) = A\exp\!\left(\frac{-v^2}{2(k_B T/m)}\right)

so the velocity component v_x follows

f(v_x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\left(-\frac{v_x^2}{2\sigma^2}\right), \qquad \sigma = \sqrt{\langle v_x^2\rangle} = \sqrt{\frac{k_B T}{m}}
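To connect the two forms numerically, the sketch below (the mass and temperature are assumed values, roughly an N2 molecule at 300 K) samples v_x from the Gaussian with \sigma = \sqrt{k_B T/m} and checks that the average kinetic energy per degree of freedom is (1/2) k_B T.

```python
import numpy as np

k_B = 1.38e-23              # J/K, Boltzmann's constant
T = 300.0                   # K, assumed temperature
m = 28 * 1.66e-27           # kg, mass of an N2-like molecule (assumption)

sigma = np.sqrt(k_B * T / m)                    # predicted width sqrt(k_B T / m), ~300 m/s
v_x = np.random.normal(0.0, sigma, 1_000_000)   # sample the Gaussian f(v_x)

print(sigma, v_x.std())                             # sample width matches the prediction
print(0.5 * m * (v_x**2).mean(), 0.5 * k_B * T)     # kinetic energy per degree of freedom ~ (1/2) k_B T
```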

Example on rate constant and transition state

A \xrightarrow{\;k\;} B

The reaction rate constant, k [s^{-1}], is proportional to \exp\!\left(\frac{-E_a}{RT}\right), where E_a is the activation energy, or energy barrier, in units of J mol^{-1}.

k = A\exp\!\left(\frac{-E_a}{RT}\right) \qquad \text{(Arrhenius equation)}

[Figure: reaction energy diagram showing reactant A, the transition state at the top of the barrier E_a, and product B lower in energy by \Delta E.]

Suppose Ea of a reaction is 100 kJ mol-1 and a catalyst lowers this to 80 kJ mol-1. Approximately how much faster will the reaction proceed with the catalyst?
\frac{k(\text{catalyzed})}{k(\text{uncatalyzed})} = \frac{\exp(-80/RT)}{\exp(-100/RT)} = e^{8} \approx 3000

(RT \approx 2.5 kJ mol^{-1} at room temperature.)
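The same arithmetic as a sketch in code (RT is evaluated at 298 K, an assumed room temperature):

```python
import math

R = 8.314             # J mol^-1 K^-1, gas constant
T = 298.0             # K, assumed room temperature
RT = R * T / 1000.0   # ~2.5 kJ/mol

# Ratio of catalyzed to uncatalyzed rate constants; the prefactor A cancels.
ratio = math.exp(-80.0 / RT) / math.exp(-100.0 / RT)   # = exp(20 / RT) = e^8
print(RT, ratio)      # ~2.48 kJ/mol, ~3.2e3
```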

High energy barriers result in high specificity in a cellular signaling pathway.

The role of prion conformational switch in neurodegenerative diseases

PrP^C \xrightarrow{\;k\;} PrP^{Sc}, catalyzed by either mutation or binding of PrP^{Sc}.

\Delta E^{\ddagger} \approx 40 kcal mol^{-1} = 167.4 kJ mol^{-1}

What about a low energy barrier? The temperature-gated vanilloid receptor VR1, a pain receptor, is activated by heat (T > 45 °C).

[Figure: reaction energy diagram with prefactor A, barrier E_a, and energy difference \Delta E between the closed and open states.]

It was found experimentally that the probability of VR1 being in the open state is 0.04 at 40 °C and 0.98 at 50 °C. What is the energy barrier?

Take home messages

Equilibrium = A system reaching a state of maximum entropy. Equilibrium = All microstates are equally probable.

S = k_B\ln\Omega

T = A measure of the tendency of an object to spontaneously give up energy to its surroundings.

T = \left(\frac{dE}{dS}\right)_{N,V}

The Boltzmann & Gibbs Distribution

p(E_a) = A\exp\!\left(\frac{-E_a}{k_B T}\right)
