
Basic ideas of statistical physics

Dr. J P SINGH
Associate Professor in Physics
PGGC-11, Chandigarh
Basic ideas of statistical physics

Statistics is a branch of science which deals with the collection, classification and interpretation of numerical facts. When statistical concepts are applied to physics, the resulting branch of science is called Statistical Physics.
Trial → experiment, e.g. tossing of a coin

Event → outcome of an experiment

Exhaustive events: the total number of possible outcomes in any trial.
For tossing of a coin, exhaustive events = 2.

Favourable events: the number of outcomes favourable to an event in a trial.
The number of cases favourable to drawing a king from a pack of cards is 4.

Mutually exclusive events: no two of them can occur simultaneously.
Either head up or tail up in tossing of a coin.

Equally likely events: every event is equally preferred.
Head up or tail up.

Independent events: the occurrence of one event is independent of the other.
Tossing of two coins.

Probability

The probability of an event =
(number of cases in which the event occurs) / (total number of ways)

If m is the number of cases in which an event occurs and n the number of cases in which the event fails, then

Probability of occurrence of the event $= \dfrac{m}{m+n}$

Probability of failing of the event $= \dfrac{n}{m+n}$

$\dfrac{m}{m+n} + \dfrac{n}{m+n} = 1$

The sum of these two probabilities, i.e. the total probability, is always one, since the event must either occur or fail.
Tossing of two coins:
The following combinations of heads up (H) and tails up (T) are possible:

$P(H_1) = P(H_2) = \tfrac{1}{2}, \qquad P(T_1) = P(T_2) = \tfrac{1}{2}$

$P(H_1 H_2) = \tfrac{1}{2} \times \tfrac{1}{2} = \tfrac{1}{4}$
$P(T_1 T_2) = \tfrac{1}{2} \times \tfrac{1}{2} = \tfrac{1}{4}$
$P(H_1 T_2) = \tfrac{1}{2} \times \tfrac{1}{2} = \tfrac{1}{4}$
$P(T_1 H_2) = \tfrac{1}{2} \times \tfrac{1}{2} = \tfrac{1}{4}$

For independent events, $P = P_1 \times P_2 \times \cdots \times P_n$.
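These numbers can be checked by brute force. The following is a minimal sketch (assuming Python; not part of the original slides) that enumerates the four equally likely outcomes of tossing two coins and confirms that each combination has probability 1/4:

# Enumerate the exhaustive, equally likely events for two coins and
# compute the probability of each combination as favourable/total.
from itertools import product
from fractions import Fraction

outcomes = list(product("HT", repeat=2))   # ('H','H'), ('H','T'), ('T','H'), ('T','T')
total = len(outcomes)                      # 4 exhaustive events
for combo in outcomes:
    print("".join(combo), Fraction(1, total))   # each combination: probability 1/4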
Principle of equal a priori probability

The principle of assuming equal probability for events which are equally likely is known as the principle of equal a priori probability.

A priori really means something which exists in our mind prior to and independently of the observation we are going to make.
Distribution of 4 different Particles in
two Compartments of equal sizes
Particles must go in one of the compartments.

Both the compartments are exactly alike.

The particles are distinguishable. Let the four particles be called a, b, c and d.

The total number of particles in the two compartments is 4, i.e.

$\sum_{i=1}^{2} n_i = 4$
The meaningful ways in which these four particles can
be distributed among the two compartments is shown
in table.
Macrostate
The arrangement of the particles of a system without distinguishing them from one another is called a macrostate of the system.
In this example, if 4 particles are distributed in 2 compartments, then the number of possible macrostates = (4 + 1) = 5.

If n particles are to be distributed in 2 compartments, then the number of macrostates = n + 1.
Microstate
The distinct arrangement of the particles of a system is called its microstate.
For example, if four distinguishable particles are distributed in two compartments, then the number of possible microstates = 2^4 = 16.

If n particles are to be distributed in 2 compartments, the number of microstates = 2^n

= (number of compartments)^(number of particles)
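As a check, the microstates can be enumerated directly. A short sketch (assuming Python; an illustration, not part of the slides):

# Each of the 4 distinguishable particles a, b, c, d can go into either of the
# 2 compartments independently, so the count is (compartments)**(particles).
from itertools import product

particles = "abcd"
microstates = list(product((1, 2), repeat=len(particles)))  # compartment chosen per particle
print(len(microstates))   # 16 = 2**4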
Thermodynamic probability or frequency
The number of microstates in a given macrostate is called the thermodynamic probability or frequency of that macrostate.

For the distribution of 4 particles in 2 identical compartments:
W(4,0) = 1
W(3,1) = 4
W(2,2) = 6
W(1,3) = 4
W(0,4) = 1
W depends on the distinguishable or indistinguishable nature of the particles. For indistinguishable particles, W = 1.
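These frequencies are binomial coefficients: W(r, n − r) = n!/(r!(n − r)!) = C(n, r). A quick check (a Python sketch, using math.comb):

# W(r, n-r) counts the ways of choosing which r of the n distinguishable
# particles occupy compartment 1.
from math import comb

n = 4
for r in range(n, -1, -1):
    print(f"W({r},{n - r}) = {comb(n, r)}")   # 1, 4, 6, 4, 1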
Microstates (Comp 1 : Comp 2)               Macrostate   Frequency W   Probability (macrostates assumed equally likely)
abcd : –                                    (4,0)        1             1/5
abc:d, abd:c, acd:b, bcd:a                  (3,1)        4             1/5
ab:cd, ac:bd, ad:bc, bc:ad, bd:ac, cd:ab    (2,2)        6             1/5
a:bcd, b:acd, c:abd, d:abc                  (1,3)        4             1/5
– : abcd                                    (0,4)        1             1/5
All the microstates of a system have equal a priori probability.

Probability of a microstate $= \dfrac{1}{\text{total no. of microstates}} = \dfrac{1}{16} = \dfrac{1}{2^4} = \dfrac{1}{2^n}$

Probability of a macrostate = (no. of microstates in that macrostate) × (probability of one microstate)

$= W \times \dfrac{1}{16} = W \times \dfrac{1}{2^4} = W \times \dfrac{1}{2^n}$

= thermodynamic probability × probability of one microstate
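These probabilities can be verified by enumerating all 16 microstates and grouping them by macrostate (a Python sketch; the variable names are illustrative only):

# P(macrostate) = W / 2**n: count microstates per macrostate and divide by the total.
from itertools import product
from collections import Counter
from fractions import Fraction

n = 4
# 1 means "particle is in compartment 1"; summing gives n1 for that microstate.
counts = Counter(sum(state) for state in product((1, 0), repeat=n))
for n1 in range(n, -1, -1):
    print(f"P({n1},{n - n1}) = {Fraction(counts[n1], 2**n)}")   # 1/16, 1/4, 3/8, 1/4, 1/16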
Constraints
Restrictions imposed on a system are called constraints.

Example
Total no. of particles in the two compartments = 4

Only 5 macrostates (4,0), (3,1), (2,2), (1,3), (0,4) are possible.

Macrostates such as (1,2), (4,2), (0,1), (0,0) are not possible.
Accessible and inaccessible states

The macrostates/microstates which are allowed under the given constraints are called accessible states.

The macrostates/microstates which are not allowed under the given constraints are called inaccessible states.

The greater the number of constraints, the smaller the number of accessible microstates.
Distribution of n Particles in 2 Compartments

The (n + 1) macrostates are
(0, n), (1, n−1), (2, n−2), …, (n_1, n_2), …, (n, 0).

Out of these macrostates, let us consider a particular macrostate (n_1, n_2) such that

$n_1 + n_2 = n$

The n particles can be arranged among themselves in $^{n}P_{n} = n!$ ways.
These arrangements include meaningful as well as meaningless arrangements.

Total number of ways = (no. of meaningful ways) × (no. of meaningless ways)

n_1 particles in comp. 1 can be arranged among themselves in n_1! meaningless ways.

n_2 particles in comp. 2 can be arranged among themselves in n_2! meaningless ways.

n_1 particles in comp. 1 and n_2 particles in comp. 2 can therefore be arranged in n_1! × n_2! meaningless ways.
Since total number of ways = (no. of meaningful ways) × (no. of meaningless ways),

$W(n_1, n_2) = \dfrac{n!}{n_1!\, n_2!} = \dfrac{n!}{n_1!\,(n - n_1)!}$

The total no. of microstates = $2^n$

$P(n_1, n_2) = \dfrac{n!}{n_1!\, n_2!} \cdot \dfrac{1}{2^n}$

Probability of the distribution (r, n − r):

$P(r, n-r) = \dfrac{n!}{r!\,(n-r)!} \cdot \dfrac{1}{2^n}$
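Written out as a small helper (a Python sketch; the function name prob_macrostate is not from the slides):

# Probability of the macrostate (r, n-r) for n distinguishable particles
# in two equal compartments.
from math import comb
from fractions import Fraction

def prob_macrostate(r, n):
    return Fraction(comb(n, r), 2**n)

print(prob_macrostate(2, 4))    # 3/8  (most probable macrostate for n = 4)
print(prob_macrostate(5, 10))   # 63/256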
Deviation from the state of maximum probability

The probability of the macrostate (r, n − r) is

$P(r, n-r) = \dfrac{n!}{r!\,(n-r)!} \cdot \dfrac{1}{2^n}$

When n particles are distributed in two compartments, the number of macrostates = (n + 1).

The macrostate (r, n − r) has maximum probability when r = n/2, provided n is even.

The probability of the most probable macrostate (n/2, n/2) is

$P_{\max} = \dfrac{n!}{\left(\frac{n}{2}\right)!\,\left(\frac{n}{2}\right)!} \cdot \dfrac{1}{2^n}$

Let the macrostate deviate slightly from the most probable state by x (x ≪ n).
Then the new macrostate will be $\left(\frac{n}{2}+x,\; \frac{n}{2}-x\right)$ and

$P_x = \dfrac{n!}{\left(\frac{n}{2}+x\right)!\,\left(\frac{n}{2}-x\right)!} \cdot \dfrac{1}{2^n}$

$P_x = P_{\max}\, \dfrac{\left[\left(\frac{n}{2}\right)!\right]^{2}}{\left(\frac{n}{2}+x\right)!\,\left(\frac{n}{2}-x\right)!}$
Stirling's formula: $\ln n! \approx n \ln n - n$

Taylor's theorem: $\ln(1+y) = y - \dfrac{y^2}{2} + \dfrac{y^3}{3} - \dots$ provided $|y| < 1$

Do at home to get

$P_x = P_{\max} \exp\!\left(-\dfrac{f^{2} n}{2}\right)$

where $f = \dfrac{x}{n/2}$ is the fractional deviation from the most probable number of particles in a compartment.
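One route through the home exercise (a sketch of the algebra, using only the two results quoted above):

\begin{aligned}
\ln\frac{P_x}{P_{\max}}
&= 2\ln\left(\frac{n}{2}\right)! - \ln\left(\frac{n}{2}+x\right)! - \ln\left(\frac{n}{2}-x\right)! \\
&\approx -\left(\frac{n}{2}+x\right)\ln\left(1+\frac{2x}{n}\right)
         -\left(\frac{n}{2}-x\right)\ln\left(1-\frac{2x}{n}\right)
         \quad\text{(Stirling; the linear terms cancel)} \\
&\approx -\left(\frac{n}{2}+x\right)\left(\frac{2x}{n}-\frac{2x^{2}}{n^{2}}\right)
         +\left(\frac{n}{2}-x\right)\left(\frac{2x}{n}+\frac{2x^{2}}{n^{2}}\right)
         \quad\text{(Taylor, keeping terms up to } x^{2}\text{)} \\
&= -\frac{2x^{2}}{n} = -\frac{f^{2}n}{2},
\qquad\text{where } f=\frac{x}{n/2}=\frac{2x}{n},
\end{aligned}

so that $P_x = P_{\max}\exp\!\left(-\dfrac{f^{2}n}{2}\right)$.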
Discussion

For a fractional deviation f = 0.001 (i.e. 0.1 %):

n          P_x / P_max
10^3       0.999
10^6       0.607
10^8       1/e^50
10^10      1/e^5000
Thus we conclude that as n increases, the probability of a macrostate falls off more and more rapidly, even for small deviations from the most probable state.
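These numbers are easy to reproduce (a Python sketch, using the formula above with f = 0.001):

# P_x / P_max = exp(-f**2 * n / 2) for a fixed fractional deviation f = 0.001.
from math import exp

f = 0.001
for n in (1e3, 1e6, 1e8, 1e10):
    print(f"n = {n:.0e}   P_x/P_max = {exp(-f * f * n / 2):.4g}")
# n = 1e+03   P_x/P_max = 0.9995     (≈ 0.999)
# n = 1e+06   P_x/P_max = 0.6065     (= 1/e^0.5)
# n = 1e+08   P_x/P_max = 1.929e-22  (= 1/e^50)
# n = 1e+10   P_x/P_max = 0          (1/e^5000 underflows to zero in floating point)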
[Plot: P_x/P_max versus the fractional deviation 2x/n (from −0.2 to 0.2) for three particle numbers n1 > n2 > n3; the peak about the most probable state becomes sharper as n increases.]
Static and Dynamic systems

Static systems: if the particles of a system remain at rest in a particular microstate, it is called a static system.

Dynamic systems: if the particles of a system are in motion and can move from one microstate to another, it is called a dynamic system.
Equilibrium state of a dynamic system
A dynamic system continuously changes from one microstate to another. Since all microstates of a system have equal a priori probability, the system should spend the same amount of time in each microstate.

If t_obs is the total time of observation and N the total number of microstates, then the time spent by the system in a particular microstate is

$t_m = \dfrac{t_{\text{obs}}}{N}$
Let the macrostate (n_1, n_2) have frequency W(n_1, n_2). The time spent t(n_1, n_2) in the macrostate (n_1, n_2) is

= (average time spent in each microstate) × (no. of microstates in that macrostate)

$t(n_1, n_2) = \dfrac{t_{\text{obs}}}{N} \times W(n_1, n_2)$

$t(n_1, n_2) = t_{\text{obs}} \times P(n_1, n_2)$

$P(n_1, n_2) = \dfrac{t(n_1, n_2)}{t_{\text{obs}}}$

That is, the fraction of the time spent by a dynamic system in a macrostate is equal to the probability of that macrostate.
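This can be illustrated with a small simulation (a Python sketch, not from the slides): reshuffle every particle of a 4-particle system between the two compartments at each time step and record the fraction of time spent in each macrostate.

# A toy dynamic system: at each step all microstates are equally likely, so the
# fraction of time spent in a macrostate approaches its probability W / 2**n.
import random
from collections import Counter

n, steps = 4, 100_000
time_in = Counter()
for _ in range(steps):
    n1 = sum(random.randint(0, 1) for _ in range(n))   # particles found in compartment 1
    time_in[(n1, n - n1)] += 1

for macrostate in sorted(time_in):
    print(macrostate, round(time_in[macrostate] / steps, 3))   # ≈ 0.0625, 0.25, 0.375, 0.25, 0.0625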
Equilibrium state of a dynamic system

The macrostate having maximum probability is termed the most probable state. For a dynamic system consisting of a large number of particles, the probability of deviation from the most probable state decreases very rapidly. So for the majority of the time the system stays in the most probable state. If the system is disturbed, it again tends to return to the most probable state, because the probability of staying in the disturbed state is very small. Thus, the most probable state behaves as the equilibrium state to which the system returns again and again.
Distribution of n distinguishable particles in k compartments of unequal sizes

The thermodynamic probability for the macrostate (n_1, n_2, n_3, …, n_k) is

$W(n_1, n_2, \dots, n_k) = \dfrac{n!}{n_1!\, n_2! \cdots n_k!} = \dfrac{n!}{\prod_{i=1}^{k} n_i!}$

Let compartment 1 be divided into g_1 cells.
Particle 1 can be placed in comp. 1 in g_1 ways.
Particle 2 can be placed in comp. 1 in g_1 ways.
…
Particle n_1 can be placed in comp. 1 in g_1 ways.
So n_1 particles in comp. 1 can be placed in the cells in $g_1^{n_1}$ ways.
Similarly, n_2 particles in comp. 2 can be placed in $g_2^{n_2}$ ways,
… and n_k particles in comp. k can be placed in $g_k^{n_k}$ ways.

The total number of ways in which the n particles in the k compartments can be arranged in the cells of these compartments is

$g_1^{n_1}\, g_2^{n_2}\, g_3^{n_3} \cdots g_k^{n_k} = \prod_{i=1}^{k} g_i^{n_i}$
The thermodynamic probability for the macrostate is therefore

$W(n_1, n_2, \dots, n_k) = \dfrac{n!}{n_1!\, n_2! \cdots n_k!}\; g_1^{n_1} g_2^{n_2} \cdots g_k^{n_k}$

$W = n! \prod_{i=1}^{k} \dfrac{g_i^{n_i}}{n_i!}$
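A direct transcription of this formula (a Python sketch; the function name thermodynamic_probability is illustrative):

# W(n1, ..., nk) = n! * product over i of g_i**n_i / n_i!, for n distinguishable
# particles in k compartments, compartment i containing g_i cells.
from math import factorial
from fractions import Fraction

def thermodynamic_probability(ns, gs):
    n = sum(ns)
    W = Fraction(factorial(n))
    for ni, gi in zip(ns, gs):
        W *= Fraction(gi**ni, factorial(ni))
    return W

print(thermodynamic_probability([3, 1], [1, 1]))   # 4  (reduces to W(3,1) when each compartment is a single cell)
print(thermodynamic_probability([2, 2], [2, 1]))   # 24 (compartment 1 has 2 cells, compartment 2 has 1 cell)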
