Lecture 5
Entropy
A measure of the amount of energy in a system that is available for doing work; entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity.
The differential of heat, dQ, is not an exact differential and therefore cannot be integrated to give a state function. We therefore introduce an integrating factor (1/T) so that dQ/T can be integrated. This quantity dQ/T is called the entropy: dS = dQ/T.
Use the principle of maximum entropy to explain the energetic properties of molecules in the nanoworld
An interesting observation
[Figure: Gaussian distribution of the velocity component v_x, centered at 0, ranging from −V_x to +V_x.]
σ^2 = ⟨(v_x − ⟨v_x⟩)^2⟩ = ⟨v_x^2⟩ = k_B T / m
Example: Rolling a die with 6 possible outcomes. The only constraint we have is
P(X=1) + P(X=2) + ... + P(X=6) = 1
Without additional information about the die, the most unbiased distribution is such that all outcomes are equally probable.
P(X=1) = P(X=2) = ... = P(X=6) = 1/6
Shannon's Measure of Uncertainty
Shannon [1948] suggested the following measure of uncertainty, which is commonly known as the statistical entropy.
H = −Σ_i p_i ln p_i
3. H is maximum when the outcomes are equally likely. In the case of the die, you will find the maximum entropy to be
H = −Σ_{i=1}^{6} p_i ln p_i = ln 6.
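A quick numerical check of this maximum-entropy claim (a sketch in Python; the biased die is an illustrative comparison, not from the lecture):

```python
import math

def shannon_entropy(p):
    """H = -sum p_i ln p_i (in nats); terms with p_i = 0 contribute nothing."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [1 / 6] * 6
biased = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]  # a loaded die, for comparison

H_uniform = shannon_entropy(uniform)
H_biased = shannon_entropy(biased)

print(H_uniform)  # ln 6, about 1.79
print(H_biased)   # smaller: any bias lowers the uncertainty
```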
log_e x = ln x, where e ≈ 2.718
ln(AB) = ln A + ln B,
ln(A/B) = ln A − ln B,
(d/dx) ln x = 1/x
Shannon's entropy in terms of the number of possible outcomes. Example: the number of outcomes Ω from rolling the die N times:
ln Ω = −N Σ_{i=1}^{6} p_i ln p_i = NH
Conclusion: ln Ω is linearly proportional to H. Therefore, maximizing the total number of possible outcomes is equivalent to maximizing Shannon's statistical entropy.
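The relation ln Ω = NH can be checked numerically: count the arrangements of N rolls with equal face counts via the multinomial coefficient and compare with N ln 6 (a sketch; N = 600 is an arbitrary choice):

```python
import math

N = 600                  # number of die rolls
counts = [N // 6] * 6    # the typical outcome: each face appears N/6 times

# ln(Omega) = ln( N! / (n1! n2! ... n6!) ), computed via log-gamma
ln_omega = math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts)

ratio = ln_omega / (N * math.log(6))  # compare with N*H for the fair die
print(ratio)  # close to 1; the agreement becomes exact as N grows
```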
Statistical Entropy
Ω = # of possible outcomes
H = const × ln Ω
S = const × ln Ω
A microstate is the detailed state of a physical system. Example: In an ideal gas, a microstate consists of the position and velocity of every molecule in the system. So the number of microstates is just what Feynman said: the number of different ways the inside of the system can be changed without changing the outside.
S = const × ln(# of velocity states × # of position states)
The # of velocity states does not change; the # of position states does change.
ΔS = S2 − S1 = const × [ln Ω_v + ln Ω_r,2 − ln Ω_v − ln Ω_r,1] = const × ln(Ω_r,2 / Ω_r,1)
What is temperature?
[Figure: two bodies with energies E1 and E2 in thermal contact, before and after reaching equilibrium.]
E = E1 + E2 = const, so dE1 = −dE2.
At equilibrium, the total entropy is maximal: dS1/dE1 = dS2/dE2.
Temperature T is defined by 1/T = dS/dE. The temperatures of bodies in equilibrium with one another are equal.
It is more convenient to measure T in kelvin (K). The conversion factor between energy and temperature is Boltzmann's constant, k_B = 1.38 × 10^-23 J/K. Hence we redefine S and T by incorporating the conversion factor.
S = k_B ln Ω and T → T/k_B.
[Figure: two systems, each with energy levels ε, 2ε, 3ε; one at lower T, one at higher T.]
S1 = k_B ln(5!/(2!·2!)), S2 = k_B ln(5!/(3!·2!)), so ΔS = S1 − S2 = k_B ln 3.
Lower T: giving up energy ΔE = ε changes the entropy by ΔS = k_B ln 3.
Higher T: giving up energy ΔE = 3ε changes the entropy by the same ΔS = k_B ln 3.
Same change in entropy, but more energy is given away by the system initially with higher T. Hence temperature is a measure of the tendency of an object to spontaneously give up energy to its surroundings.
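The multiplicities in this example can be checked directly (a quick sketch in Python, using the factorial counts quoted above):

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

# Multiplicities before and after the energy transfer
omega1 = math.factorial(5) // (math.factorial(2) * math.factorial(2))  # 30
omega2 = math.factorial(5) // (math.factorial(3) * math.factorial(2))  # 10

dS = kB * (math.log(omega1) - math.log(omega2))
print(dS / (kB * math.log(3)))  # about 1: the entropy change is kB ln 3
```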
Can we derive the equation of state of a gas (PV = nRT) from the concept of entropy?
Step 1: Evaluate S = k_B ln Ω, with Ω = Ω_v × Ω_r:
S = k_B ln Ω = (3/2) k_B N ln E + const.
Then apply the definition of temperature:
1/T = dS/dE = 3 k_B N / (2E),
so that
E = (3/2) k_B N T = (3/2) k_B (n N0) T,
where n = # of moles and N0 = 6.02 × 10^23 mol^-1 is Avogadro's number. With R = k_B N0 = 8.314 J mol^-1 K^-1 (the gas constant):
Energy of n moles of ideal gas: E = (3/2) nRT.
Energy of one ideal gas molecule: E = (3/2) k_B T.
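Plugging in numbers at room temperature gives a feel for the scales involved (a minimal sketch; T = 300 K is an illustrative choice):

```python
kB = 1.380649e-23  # Boltzmann constant, J/K
R = 8.314          # gas constant, J/(mol K)
T = 300.0          # room temperature, K

E_molecule = 1.5 * kB * T  # energy of one ideal-gas molecule, ~6.2e-21 J
E_mole = 1.5 * R * T       # energy of one mole, ~3.7 kJ
print(E_molecule, E_mole)
```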
Now compute the pressure from molecular impacts on a wall. For one molecule bouncing between walls a distance L apart:
F = Δp_x/Δt = 2 m v_x / (2L/v_x) = m v_x^2 / L.
For N molecules, ⟨v^2⟩ = ⟨v_x^2⟩ + ⟨v_y^2⟩ + ⟨v_z^2⟩ = 3⟨v_x^2⟩.
Since P = F/L^2 = N m ⟨v_x^2⟩ / V and E = N (m/2) ⟨v^2⟩ = (3/2) N m ⟨v_x^2⟩, it follows that PV = (2/3) E = N k_B T = nRT.
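Substituting ⟨v_x^2⟩ = k_B T/m into P = N m ⟨v_x^2⟩/V recovers the ideal-gas law. A minimal numerical check (the molecular mass and volume are illustrative values, not from the lecture):

```python
kB = 1.380649e-23  # Boltzmann constant, J/K
N = 6.022e23       # number of molecules (one mole)
m = 4.65e-26       # mass of one N2 molecule, kg (illustrative)
T = 300.0          # K
V = 0.0246         # m^3, roughly one mole at 1 atm and 300 K

vx2 = kB * T / m       # <v_x^2> from the velocity distribution
P = N * m * vx2 / V    # kinetic-theory pressure
print(P)               # ~1e5 Pa, i.e. about 1 atm
```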
Boltzmann and Gibbs Distribution
Goal: describe the probability distribution of a molecule of interest, with energy E_a, in equilibrium with a macroscopic thermal reservoir with energy E_B.
[Figure: a molecule of interest, a, with energy E_a, surrounded by a macroscopic reservoir, B, with energy E_B; E_tot = E_B + E_a.]
The second law says that at equilibrium, or maximum entropy, all microstates are equally probable, with a probability P0.
In the joint system, the probability of the molecule of interest being in a particular state with energy E_a is therefore proportional to the number of microstates available to the surrounding:
p(E_a) = P0 × Ω_B(E_B), where S_B(E_B) = k_B ln(Ω_B(E_B))
and E_tot = E_B + E_a ≈ E_B.
We can use a first-order Taylor expansion to approximate S_B(E_B), because E_B is very near E_tot.
S_B(E_B) ≈ S_B(E_tot) − E_a (dS_B/dE)|_{E_tot} = S_B(E_tot) − E_a/T
Hence we obtain
" S (E )% " (E % " (E % p( E a ) = P0 ! exp$ B tot ' exp$ a ' = A ! exp$ a ' # k B & # kB T & # kB T &
IMPORTANT: The probability distribution of the molecule of interest in equilibrium with its surrounding depends only on the temperature of the surrounding.
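A small numerical illustration of the Boltzmann factor exp(−E_a/(k_B T)) (a sketch; the energies are illustrative multiples of k_B T):

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0          # K

def boltzmann_weight(dE):
    """Relative probability of a state dE joules above the ground state."""
    return math.exp(-dE / (kB * T))

print(boltzmann_weight(kB * T))       # 1/e: one kBT up is ~2.7x rarer
print(boltzmann_weight(10 * kB * T))  # ~4.5e-5: ten kBT up is very rare
```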
Now we can explain the velocity distribution at equilibrium using the Boltzmann distribution
" 1 2% ! mv " !E % $ 2 ' P = A exp $ = A exp $ kT ' # k BT ' & B $ ' # & " % $ !v2 ' = A exp $ ' k T " % B $ 2$ ' && # # m '
This is a Gaussian with variance σ^2 = ⟨v_x^2⟩ = k_B T / m.
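A numerical check that the Boltzmann weight exp(−m v^2/(2 k_B T)) really has variance k_B T/m (a sketch; the molecular mass is an illustrative value):

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0          # K
m = 4.65e-26       # molecular mass, kg (illustrative)

sigma2 = kB * T / m          # predicted variance of v_x
s = math.sqrt(sigma2)

# Numerical average of v_x^2 under the weight exp(-m v^2 / 2 kB T)
dv = s / 100.0
grid = [i * dv for i in range(-1000, 1001)]   # covers +/- 10 sigma
w = [math.exp(-m * v * v / (2 * kB * T)) for v in grid]
mean_vx2 = sum(wi * v * v for wi, v in zip(w, grid)) / sum(w)

print(mean_vx2 / sigma2)  # close to 1
```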
[Figure: reaction A → B over a transition state; E_a is the activation barrier above A, and ΔE is the energy difference between A and B.]
Arrhenius equation: k ∝ exp(−E_a / RT)
Suppose Ea of a reaction is 100 kJ mol-1 and a catalyst lowers this to 80 kJ mol-1. Approximately how much faster will the reaction proceed with the catalyst?
k(catalyzed) / k(uncatalyzed) = exp(−80/RT) / exp(−100/RT) = e^8 ≈ 3000
(RT ≈ 2.5 kJ mol^-1 at room temperature)
[Example: the prion protein conformations PrP^C and PrP^Sc.]
What about a low energy barrier? The temperature-gated vanilloid receptor VR1, a pain receptor, is activated by heat (T > 45 °C).
[Energy diagram: activation barrier E_a and energy difference ΔE between the closed and open states of VR1.]
It was found experimentally that the probability of VR1 being in the open state is 0.04 at 40 °C and 0.98 at 50 °C. What is the energy barrier?
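One way to estimate it, assuming a simple two-state (open/closed) equilibrium, which is an assumption not spelled out on the slide, is a van't Hoff analysis of the two open probabilities:

```python
import math

R = 8.314  # gas constant, J/(mol K)

# Open probabilities at the two temperatures (from the experiment quoted)
T1, p1 = 313.15, 0.04   # 40 C
T2, p2 = 323.15, 0.98   # 50 C

# Two-state model: ln[p/(1-p)] = -dE/(R T) + dS/R; subtracting the two
# temperatures eliminates the unknown entropy term.
lnK1 = math.log(p1 / (1 - p1))
lnK2 = math.log(p2 / (1 - p2))
dE = R * (lnK2 - lnK1) / (1 / T1 - 1 / T2)  # J/mol

print(dE / 1000)  # several hundred kJ/mol: an unusually large barrier
```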
Equilibrium = A system reaching a state of maximum entropy. Equilibrium = All microstates are equally probable.
S = k_B ln Ω
T = (dE/dS)_{N,V}