
RISING STAR ACADEMY

28-A, Jia Sarai, Near Hauz Khas Metro Station, New Delhi, Mob : 07838699091
439/29,Chhotu Ram Nagar, Near Power House,Delhi Road, Rohtak , Mob : 09728862122
NET Markov Chain Page 1

Markov Chain
Def. A Markov chain is a stochastic model describing a sequence of possible events in which the
probability of each event depends only on the state attained in the previous event.
A Markov chain is a system that changes from state to state according to given probabilities. The
table of probabilities is called the transition matrix (P).
State space : The state space, denoted by S, is either finite or countably infinite, i.e.,
$S = \{1,2,3,4\}$ (finite) or $S = \{1,2,3,4,\ldots\}$ (countably infinite).

State transition diagram : For a given transition matrix P, we can draw a state transition diagram as
follows :
Let the state transition matrix P for state space $S = \{1,2,3\}$ be given by
$$P = \begin{pmatrix} 0 & 1 & 0 \\ 1/2 & 0 & 1/2 \\ 1/3 & 1/3 & 1/3 \end{pmatrix}$$
Then the corresponding state transition diagram is given by

Note : In a state transition probability matrix each row sum is always 1.


Def. $P_{ij}^{(n)} = P(X_n = j \mid X_0 = i)$ is the transition probability of the Markov chain from an initial state i to a final
state j in n steps.
1. 1-step transition probability : $P_{ij}^{(1)} = P(X_{n+1} = j \mid X_n = i)$ or $P(X_1 = j \mid X_0 = i)$
2. m-step transition probability : $P_{ij}^{(m)} = P(X_{n+m} = j \mid X_n = i)$ (probability of moving from i to j in m steps)


1 1
e.g., S  1,2 P  2 2
 
1 0

Then the corresponding state transition diagram is given by

To find $P_{11}^{(2)}$ : $P_{11}^{(2)} = \frac{1}{2}\cdot\frac{1}{2} + \frac{1}{2}\cdot 1 = \frac{3}{4}$
OR
Alternate method using matrix P :
$$P^2 = \begin{pmatrix} 1/2 & 1/2 \\ 1 & 0 \end{pmatrix}^2 = \begin{pmatrix} 3/4 & 1/4 \\ 1/2 & 1/2 \end{pmatrix}$$
From this matrix, $P_{11}^{(2)} = \frac{3}{4}$.
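The squaring step can be checked numerically. A minimal sketch (assuming numpy is available):

```python
import numpy as np

# Transition matrix of the two-state example: S = {1, 2}
P = np.array([[0.5, 0.5],
              [1.0, 0.0]])

# Two-step transition probabilities are the entries of P @ P
P2 = np.linalg.matrix_power(P, 2)
print(P2)        # [[0.75 0.25]
                 #  [0.5  0.5 ]]
print(P2[0, 0])  # P_11^(2) = 0.75
```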
 
3. Accessibility : A state 'j' is said to be accessible from the state 'i' if $\exists\, n \ge 0$ such that $P_{ij}^{(n)} > 0$.

4. Communicating states : Two states 'i' and 'j' are said to be communicating states if we can move
$i \to j$ and $j \to i$, i.e., $i \leftrightarrow j$, i.e., $\exists\, n, m \ge 0$ such that $P_{ij}^{(n)} > 0$ and $P_{ji}^{(m)} > 0$.

5. Class of a state : If $i \in S$ then $Cl(i) = \{\, j \mid i \leftrightarrow j \,\}$
Note that $i \in Cl(i)$, since $i \leftrightarrow i$.

Note : Classes corresponding to different states of S are either disjoint or the same.


For e.g., S  1, 2,3

0 1 0 
P  1 0 0 
0 1 0 

The corresponding state transition diagram is given by


Cl 1  1, 2
Cl  3  3

Here Cl 1 and Cl  3 are disjoint classes.

6. Irreducible Markov Chain : A Markov chain is said to be irreducible if we have only one closed
communicating class corresponding to all the states of the state space S. Otherwise the Markov chain is said to be
reducible, i.e., if we have more than one (at least two) disjoint classes then the Markov chain is always reducible.
0 1 0 
For e.g., S  1, 2,3 P  0 0 1 
1 0 0 

The corresponding state transition diagram is given by

Cl 1  1, 2,3


Cl  2   1, 2,3
Cl  3  1, 2,3

In this case all the states communicate with each other. So, we get only one closed communicating
class for the state space S. So, this Markov chain is irreducible. In the previous example, we have two
disjoint classes:
$Cl(1) = \{1,2\}$, $Cl(3) = \{3\}$ $\Rightarrow$ Reducible Markov Chain



7. Period of a state : The period of a state is denoted by $d(i)$ and given by $d(i) = \gcd\{\, n \ge 1 \mid P_{ii}^{(n)} > 0 \,\}$
e.g., (1) $d(1) = \gcd\{2,4,6,8,\ldots\} = 2$
e.g., (2) $d(1) = \gcd\{1,2,3,4,\ldots\} = 1$
$d(2) = \gcd\{2,3,4,\ldots\} = 1$

Note : A state with a self loop always has period 1.
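The gcd definition above can be mechanized by scanning powers of P; the finite horizon in the sketch below is an assumption of the implementation, not part of the definition (assuming numpy is available):

```python
from math import gcd
import numpy as np

def period(P, i, horizon=50):
    """gcd of all n <= horizon with P_ii^(n) > 0 (finite truncation)."""
    d, Q = 0, np.eye(len(P))
    for n in range(1, horizon + 1):
        Q = Q @ P
        if Q[i, i] > 1e-12:
            d = gcd(d, n)
    return d

# Two-state flip chain 1 -> 2 -> 1: returns only at even n, so d(1) = 2
flip = np.array([[0.0, 1.0],
                 [1.0, 0.0]])
print(period(flip, 0))  # 2

# A state with a self loop returns at n = 1, so its period is 1
loopy = np.array([[0.5, 0.5],
                  [1.0, 0.0]])
print(period(loopy, 0))  # 1
```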


8. Aperiodic Markov Chain : In a Markov chain, if the period of each state is 1, then the Markov chain is said to
be aperiodic; otherwise it is periodic. i.e., if for a state 'i', $d(i) = 1$ then 'i' is said to be an
aperiodic state, otherwise a periodic state. So, if $\forall\, i \in S$, $d(i) = 1$ then the Markov chain is aperiodic.
i.e., $d(i) = 1 \Rightarrow$ i is an aperiodic state
$d(i) > 1 \Rightarrow$ i is a periodic state

e.g., S  1,2

d 1  1 and d  2  1

 d 1  d  2  1  Markov chain is aperiodic.


Example : Consider a Markov chain with state space $S = \{1,2,3,4\}$ and transition probability matrix
$$P = \begin{pmatrix} 1/2 & 1/2 & 0 & 0 \\ 1/2 & 1/2 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 1/4 & 1/4 & 1/4 & 1/4 \end{pmatrix}$$
Then check (i) Markov Chain is irreducible or NOT ?
(ii) Markov Chain is aperiodic or NOT ?
Solution : The corresponding state transition diagram is given by

Cl 1  1,2  Cl  2

Cl  3  3

Cl  4  4

Here, we have exactly three disjoint classes, so the given Markov chain is reducible.
Now, each state has a self loop. So, the period of each state is 1.
$\therefore d(1) = d(2) = d(3) = d(4) = 1$
So, the given Markov chain is aperiodic.

Some observations :
1. If 'i' and 'j' are in the same class, i.e., $i \leftrightarrow j$, then the periods of i and j are the same, i.e., $d(i) = d(j)$.
2. If the given Markov chain is irreducible then the periods of all states are the same.

9. Absorbing state : A state 'i' is said to be an absorbing state if it has a self loop with probability 1, i.e.,
$P_{ii}^{(1)} = 1$.
e.g., (diagram omitted)

10. Stochastic matrix : The transition matrix P is said to be a stochastic matrix if each $p_{ij} \ge 0$ and each row
sum is 1.
Note : If each column sum is also 1 then the matrix is said to be doubly stochastic.
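As a quick sketch (assuming numpy is available), the row-sum and column-sum conditions can be checked directly:

```python
import numpy as np

def is_stochastic(P, tol=1e-9):
    """Each entry >= 0 and each row sums to 1."""
    P = np.asarray(P, dtype=float)
    return bool((P >= -tol).all() and np.allclose(P.sum(axis=1), 1.0, atol=tol))

def is_doubly_stochastic(P, tol=1e-9):
    """Stochastic, and each column also sums to 1."""
    P = np.asarray(P, dtype=float)
    return is_stochastic(P, tol) and bool(np.allclose(P.sum(axis=0), 1.0, atol=tol))

# The three-state matrix from the first example: stochastic, but its
# columns do not all sum to 1, so it is not doubly stochastic
P = [[0.0, 1.0, 0.0],
     [0.5, 0.0, 0.5],
     [1/3, 1/3, 1/3]]
print(is_stochastic(P), is_doubly_stochastic(P))  # True False
```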
 
11. First visit probability : $f_{ij}^{(m)} = P(X_{n+m} = j,\ X_k \ne j \text{ for } n < k < n+m \mid X_n = i)$
e.g., (diagram omitted)
$p_{11}^{(3)} = \frac{1}{2}$, but $f_{11}^{(3)} = 0$

 1 ; i  Recurrent state
12. Recurrent and transient states : fii   fii   
n

n 1  1 ; i  Transient state
Remark : ( fii  P (Chain ever returns to state ‘i’ starting from ‘i’))

Convergent  Transient state 
Note : If  P    Divergent  Recurrent state 
ii
n

n1  

Since if  P 
n 1
ii
n
is convergent.

 lim Pii   0  i is transient.


n
n


e.g., (diagram omitted)
$Cl(1) = \{1\}$ and $Cl(2) = \{2\}$
$f_{11}^{(1)} = \frac{1}{2}$, $f_{22}^{(1)} = 1$
$f_{11}^{(2)} = 0$, $f_{22}^{(2)} = 0$
$f_{11}^{(3)} = 0$, $f_{22}^{(3)} = 0$
...
$\therefore f_{11} = \sum_{n=1}^{\infty} f_{11}^{(n)} = \frac{1}{2} < 1$, $f_{22} = \sum_{n=1}^{\infty} f_{22}^{(n)} = 1$
$\Rightarrow$ (1) is a transient state, (2) is a recurrent state


Another Method to find Recurrent and Transient states :
Closed Communicating Class : A class 'C' is said to be closed if for every $i \in C$ and $j \in C^c$,
$P_{ij}^{(n)} = 0\ \forall\, n$, i.e., it is NOT possible to go from 'i' to 'j' in any number of steps.

In a finite Markov Chain :


1. All the states of a closed communicating class are recurrent.
2. All the states of a class which is not closed are transient.
e.g.,

Cl 1  1, 2


 Reducible
Cl  3  3 

Cl 1 is NOT closed as p131  0


Cl  3 is closed as p31
 n  n
 0, p32 0

So, we have states 1 and 2 are transient states and state 3 is recurrent state.
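The two rules above can be turned into a small procedure for a finite chain. A sketch (assuming numpy), using a hypothetical matrix chosen to mirror the example (1 and 2 communicate, with an exit to the absorbing state 3):

```python
import numpy as np

def classify_states(P):
    """Label each state of a finite chain 'recurrent' or 'transient':
    states of a closed communicating class are recurrent,
    states of a non-closed class are transient."""
    n = len(P)
    # reach[i][j]: j is accessible from i in >= 0 steps (transitive closure)
    reach = (np.asarray(P) > 0) | np.eye(n, dtype=bool)
    for k in range(n):
        reach |= reach[:, k:k+1] & reach[k:k+1, :]
    labels = []
    for i in range(n):
        # the class of i is closed iff everything accessible from i leads back
        closed = all((not reach[i, j]) or reach[j, i] for j in range(n))
        labels.append('recurrent' if closed else 'transient')
    return labels

# Hypothetical chain: Cl(1) = {1,2} not closed (exit to 3), Cl(3) = {3} closed
P = [[0.5, 0.25, 0.25],
     [0.5, 0.5,  0.0 ],
     [0.0, 0.0,  1.0 ]]
print(classify_states(P))  # ['transient', 'transient', 'recurrent']
```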
Results :
1. If i  j i.e., ‘i’ and ‘j’ are in same class then
‘i’ is recurrent state  ‘j’ is recurrent state and ‘i’ transient  ‘j’ is transient.
2. In a finite Markov chain all states cannot be transient states i.e.,  atleast one recurrent state.
3. If chain is finite and irreducible  all states are recurrent states.
4. Absorbing states are always recurrent states.
Example : Let  X n n1 be a Markov chain on a state space  N ,  N  1, ...  1,0,1,2,..., N 1, N

1 1
Such that Pi ,i 1  Pi ,i 1    N  1  i  N  1 and PN , N  PN , N 1  P N , N  P N , N 1 
2 2
Then
1. Chain is irreducible
2. Chain is aperiodic
3. State ‘0’ is recurrent
4. State ‘0’ is transient.
Solution :
Here, $Cl(0) = \{-N, \ldots, -2, -1, 0, 1, 2, \ldots, N-1, N\}$
We have only one closed communicating class. So, the given Markov chain is irreducible.
Also, $d(N) = d(-N) = 1$ (self loop).
So, $d(-N) = d(-N+1) = \cdots = d(0) = d(1) = \cdots = d(N) = 1$
So, the period of each state is 1. So, the Markov chain is aperiodic.
Since the chain is finite and irreducible, all states are recurrent.
$\therefore$ '0' is a recurrent state.


NET-June-2018
Que : Consider a Markov chain having state space $S = \{1,2,3,4\}$ with transition probability matrix $P = (P_{ij})$ given by
$$P = \begin{pmatrix} 1/2 & 1/2 & 0 & 0 \\ 1/4 & 1/4 & 1/4 & 1/4 \\ 0 & 1/3 & 1/3 & 1/3 \\ 0 & 0 & 1/2 & 1/2 \end{pmatrix}$$
Then
1. $\lim_{n\to\infty} P_{22}^{(n)} = 0$, $\sum_{n=0}^{\infty} P_{22}^{(n)} < \infty$
2. $\lim_{n\to\infty} P_{22}^{(n)} = 0$, $\sum_{n=0}^{\infty} P_{22}^{(n)} = \infty$
3. $\lim_{n\to\infty} P_{22}^{(n)} = 1$, $\sum_{n=0}^{\infty} P_{22}^{(n)} < \infty$
4. $\lim_{n\to\infty} P_{22}^{(n)} = 1$, $\sum_{n=0}^{\infty} P_{22}^{(n)} = \infty$

NET-Dec-2017 (Part-B)
Que : Consider a Markov chain $\{X_n \mid n \ge 0\}$ with state space $\{1,2,3\}$ and transition matrix
$$P = \begin{pmatrix} 0 & 1/2 & 1/2 \\ 1/2 & 0 & 1/2 \\ 1/2 & 1/2 & 0 \end{pmatrix}$$
Then $P(X_3 = 1 \mid X_0 = 1)$ equals
1. 0
2. $\frac{1}{4}$


3. $\frac{1}{2}$
4. $\frac{1}{8}$
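A quick numerical check of this question (a sketch assuming numpy, with the matrix as transcribed above and states 1, 2, 3 stored at indices 0, 1, 2):

```python
import numpy as np

# Transition matrix of the question (states 1, 2, 3 -> indices 0, 1, 2)
P = np.array([[0,   0.5, 0.5],
              [0.5, 0,   0.5],
              [0.5, 0.5, 0  ]])

P3 = np.linalg.matrix_power(P, 3)
print(P3[0, 0])  # P(X_3 = 1 | X_0 = 1) = 0.25
```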
NET-June-2015 (Part-C)
Que : Consider a Markov Chain with state space S  0,1,2,3 and with transition probability matrix

2 1 
3 0 0
3
 
1 0 0 0
P is given by P   1 1 
 0 0
2 2 
1 1 1 1
 
4 4 4 4
Then
1. ‘1’ is a recurrent state
2. ‘0’ is a recurrent state
3. ‘3’ is a recurrent state
4. ‘2’ is a recurrent state

Mean Recurrence Time : If 'i' is a recurrent state, then we have $f_{ii} = \sum_{n=1}^{\infty} f_{ii}^{(n)} = 1$.
So, the mean recurrence time $\mu_i$ (or expected time to return to state 'i' starting from state 'i') is
given by $\mu_i = \sum_{n=1}^{\infty} n f_{ii}^{(n)} \begin{cases} < \infty, & \text{if } i \text{ is positive recurrent} \\ = \infty, & \text{if } i \text{ is null recurrent} \end{cases}$


Note :
1. Recurrent states are of two types:
(i) positive recurrent ($\mu_i < \infty$)
(ii) null recurrent ($\mu_i = \infty$)
2. If a Markov chain is finite and irreducible, then all the states are positive recurrent.
3. If an infinite Markov chain is irreducible, then all the states can be transient.


Example : Let  X n  be a Markov chain on state space S  0,1,2,3,... with

 P0 P1 P2 ... ...
1 0 0 ... ...
  
P0 1 0

... ... such that

 Pi  1 and
i 0
 iP  
i 1
i

0 0 1 ... ...
 ... ... ... ... ...
Then
1. Chain is reducible
2. Chain is irreducible and transient
3. Chain is positive recurrent
4. Chain is Null recurrent
Solution :
$Cl(0) = \{0,1,2,3,4,5,\ldots\}$ $\Rightarrow$ irreducible.
To check whether '0' is a recurrent or transient state :
$f_{00}^{(1)} = P_0$
$f_{00}^{(2)} = P_1 \cdot 1 = P_1$ ($0 \to 1 \to 0$)
$f_{00}^{(3)} = P_2 \cdot 1 \cdot 1 = P_2$ ($0 \to 2 \to 1 \to 0$)
$f_{00}^{(4)} = P_3 \cdot 1 \cdot 1 \cdot 1 = P_3$ ($0 \to 3 \to 2 \to 1 \to 0$)
...
$\therefore f_{00} = \sum_{n=1}^{\infty} f_{00}^{(n)} = f_{00}^{(1)} + f_{00}^{(2)} + f_{00}^{(3)} + \cdots$
$= P_0 + P_1 + P_2 + \cdots$


$= \sum_{i \ge 0} P_i = 1$
$\Rightarrow f_{00} = 1 \Rightarrow$ '0' is a recurrent state.
So, all states are recurrent states.
Now, we have to check for positive recurrence and null recurrence.
$\mu_0 = \sum_{n=1}^{\infty} n\, f_{00}^{(n)} = 1\cdot P_0 + 2\cdot P_1 + 3\cdot P_2 + \cdots$
$= (P_0 + P_1 + P_2 + \cdots) + (1\cdot P_1 + 2\cdot P_2 + 3\cdot P_3 + \cdots)$
$= \sum_{i \ge 0} P_i + \sum_{i \ge 1} i\, P_i = 1 + \sum_{i \ge 1} i\, P_i < \infty$
$\Rightarrow \mu_0 < \infty \Rightarrow$ '0' is a positive recurrent state.
So, all states are positive recurrent, and the given Markov chain is positive recurrent. So, option (3) is
correct and all other options are wrong.
Stationary Distribution :
For a Markov chain, the vector $\pi = (\pi_i)_{i \in S}$ is a stationary distribution if it satisfies the following
conditions :
1. $\pi = \pi P$
2. $\pi_i \ge 0\ \forall\, i \in S$
3. $\sum_{i \in S} \pi_i = 1$
Note : For a finite Markov chain we always have a stationary distribution; for an infinite Markov
chain a stationary distribution may or may not exist.

Example : Let  X n  be a Markov chain on state space 0,1 such that

P  X n1  1/ X n  0  P0  1  P  X n1  0 / X n  0 and P  X n1  1 / X n  1  P1  1  P  X n1  0 / X n  1 .

Let    0 , 1  be a stationary distribution.

Then
1. 1  P1

2. 1  P0

1
3.  1 
2
P0
4.  1 
1  P1  P0


1  P0 P0 
Solution : P  
1  P1 P1 

Now,    0 , 1  is a stationary distribution.

    P  ,  i  0  i  0,1 and 
i 0,1
i 1

i.e.,  0  1  1 …(*)

1  P0 P0 
and  0 , 1    0 , 1  
1  P1 P1 

  0 , 1    0 1  P0   1 1  P1  , 0 P0  1P1 

  0   0 1  P0   1 1  P1 

and 1   0 P0  1P1 …(1)

(1)  P0 0   P1 1 1  0 …(2)

From (2)   0 
1  P1  1
P0

1  P1  1  
From (*)
P0
1 1 (*)  0  1  1

 1  P1  1  1P0  P0

 1 1  P1  P0   P0

P0
 1 
1  P1  P0
So, option (4) is correct.
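The formula in option (4) can be sanity-checked numerically; the values P0 = 0.3, P1 = 0.6 below are hypothetical samples (any values in (0, 1) should work), assuming numpy is available:

```python
import numpy as np

# Hypothetical sample values for P0 and P1
P0, P1 = 0.3, 0.6
P = np.array([[1 - P0, P0],
              [1 - P1, P1]])

# For this irreducible aperiodic chain the rows of P^n converge to pi,
# so a high matrix power recovers the stationary distribution
pi = np.linalg.matrix_power(P, 200)[0]
print(pi[1], P0 / (1 - P1 + P0))  # both ~ 0.42857...
```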


Types of stationary distribution :

Finite Markov Chain :
A solution always exists, i.e., a stationary distribution always exists (unique or infinitely many).
Irreducible and aperiodic $\Rightarrow$ unique solution.
Otherwise, solve; if we get more than one stationary distribution, then there are infinitely many solutions.

Infinite (countable) Markov Chain :
A solution may or may not exist, i.e., a stationary distribution may or may not exist (unique, infinitely many, or no solution).
Irreducible, aperiodic and all states positive recurrent $\Rightarrow$ unique solution.
(Irreducible and transient) or (irreducible and null recurrent) $\Rightarrow$ no solution.
Otherwise, solve; if we get more than one stationary distribution $\Rightarrow$ infinitely many solutions.

NET-June-2016 (Part-C)
Que : Let  X n  be a finite Markov Chain then number of stationary distribution can be ?

1. 0
2. 2
3. 1
4. 
NET-June-2017 (Part-C)
Que : Which of the following statements are correct ?
1. For a finite state Markov chain there is at least one transient state.
2. For a finite Markov chain there is at least one stationary distribution.
3. For a countable state Markov chain, every state can be transient.
4. For an aperiodic countable state Markov chain there is at least one stationary distribution.
NET-June-2016 (Part-C)
Que : Consider the Markov chain with state space $S = \{1,2,3,\ldots,n\}$ where $n \ge 10$. Suppose that the
transition probability matrix $P = (P_{ij})$ satisfies
$P_{ij} > 0$ if $i - j$ is even
$P_{ij} = 0$ if $i - j$ is odd
Then
1. The Markov chain is irreducible.
2. $\exists$ a state 'i' which is transient.

3. $\exists$ a state 'i' with period $d(i) > 1$
4. There are infinitely many stationary distributions.

Limiting distribution : We know that $\pi = (\pi_1, \pi_2, \ldots, \pi_i, \ldots)$ where $\pi_i$ = probability of being in state
i = average proportion of time the chain is in state 'i' $= \lim_{n\to\infty} P(X_n = i)$.

Results :
1. $\lim_{n\to\infty} P_{ij}^{(n)} = \lim_{n\to\infty} P_{jj}^{(n)} = \pi_j$
2. If 'j' is a transient state then $\lim_{n\to\infty} P_{jj}^{(n)} = 0$
3. If 'j' is periodic, $\lim_{n\to\infty} P_{jj}^{(n)}$ does not exist.
4. If the Markov chain is irreducible, aperiodic and positive recurrent then $\lim_{n\to\infty} P_{ij}^{(n)} = \pi_j$ (independent of 'i')
5. If 'j' is a null recurrent state then $\lim_{n\to\infty} P_{jj}^{(n)} = 0$
6. If the Markov chain is irreducible, aperiodic and positive recurrent then $\lim_{n\to\infty} P_{ij}^{(n)} = \dfrac{1}{\mu_j} = \pi_j$, where $\mu_j$ is the
mean recurrence time of state j.
7. If the Markov chain is an irreducible, aperiodic and doubly stochastic $n \times n$ Markov chain then
$\pi = \left(\dfrac{1}{n}, \dfrac{1}{n}, \ldots, \dfrac{1}{n}\right)$, i.e., $\pi_j = \dfrac{1}{n}$ $\forall\, 1 \le j \le n$
Note : If the Markov chain is irreducible, aperiodic and positive recurrent then the stationary distribution is
the limiting distribution for the given Markov chain.
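Result 7 can be illustrated numerically; the circulant matrix below is a made-up doubly stochastic example, not one from the text (assuming numpy is available):

```python
import numpy as np

# An irreducible, aperiodic, doubly stochastic chain on 3 states:
# every row AND every column sums to 1, so pi = (1/3, 1/3, 1/3)
P = np.array([[0.2, 0.3, 0.5],
              [0.5, 0.2, 0.3],
              [0.3, 0.5, 0.2]])

limit = np.linalg.matrix_power(P, 100)
print(limit[0])  # ~ [0.3333 0.3333 0.3333]
```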

NET-June-2019 (Part-B)
Que : Consider a Markov chain with state space $\{0,1,2,3,4\}$ and transition matrix
$$P = \begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 1/3 & 1/3 & 1/3 & 0 & 0 \\ 0 & 1/3 & 1/3 & 1/3 & 0 \\ 0 & 0 & 1/3 & 1/3 & 1/3 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix}$$
Then $\lim_{n\to\infty} P_{23}^{(n)}$ equals
1. $\frac{1}{3}$
2. $\frac{1}{2}$
3. 0
4. 1
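A numerical check (a sketch assuming numpy, with states 0-4 at indices 0-4): since 0 and 4 are absorbing and 1, 2, 3 are transient, the (2, 3) entry of $P^n$ should vanish as n grows.

```python
import numpy as np

# Question's chain: 0 and 4 are absorbing, 1, 2, 3 are transient
P = np.array([[1,   0,   0,   0,   0  ],
              [1/3, 1/3, 1/3, 0,   0  ],
              [0,   1/3, 1/3, 1/3, 0  ],
              [0,   0,   1/3, 1/3, 1/3],
              [0,   0,   0,   0,   1  ]])

Pn = np.linalg.matrix_power(P, 300)
print(Pn[2, 3])  # ~ 0  (j transient => P_2j^(n) -> 0)
```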
NET-June-2019 (Part-C)
Que : Consider a Markov chain with state space $\{0,1,2\}$ and transition matrix
$$P = \begin{pmatrix} 1/4 & 5/8 & 1/8 \\ 1/4 & 0 & 3/4 \\ 1/2 & 3/8 & 1/8 \end{pmatrix}$$
Then which of the following are true ?
1. $\lim_{n\to\infty} P_{12}^{(n)} = 0$
2. $\lim_{n\to\infty} P_{12}^{(n)} = \lim_{n\to\infty} P_{21}^{(n)}$
3. $\lim_{n\to\infty} P_{22}^{(n)} = \frac{1}{8}$
4. $\lim_{n\to\infty} P_{21}^{(n)} = \frac{1}{3}$
NET-Dec-2018 (Part-C)
Que : Consider a Markov chain with transition probability matrix P given by
$$P = \begin{pmatrix} 1/2 & 1/2 & 0 \\ 0 & 1/2 & 1/2 \\ 1/3 & 1/3 & 1/3 \end{pmatrix}$$
For any two states 'i' and 'j', let $P_{ij}^{(n)}$ denote the n-step transition probability of going from 'i' to 'j'.
Identify the correct statements.
1. $\lim_{n\to\infty} P_{11}^{(n)} = \frac{2}{9}$
2. $\lim_{n\to\infty} P_{21}^{(n)} = 0$
3. $\lim_{n\to\infty} P_{32}^{(n)} = \frac{1}{3}$
4. $\lim_{n\to\infty} P_{13}^{(n)} = \frac{1}{3}$


NET-Dec-2019 (Part-B)
Que : Let $\{X_n : n \ge 0\}$ be a two state Markov chain with state space $S = \{0,1\}$ and transition matrix
$$P = \begin{pmatrix} 1/2 & 1/2 \\ 1/3 & 2/3 \end{pmatrix}$$
Assume $X_0 = 0$. The expected return time to 0 is
1. $\frac{5}{2}$
2. $\frac{9}{4}$
3. $\frac{3}{2}$
4. 3
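A numerical check (a sketch assuming numpy): the expected return time is the mean recurrence time $\mu_0 = 1/\pi_0$.

```python
import numpy as np

P = np.array([[1/2, 1/2],
              [1/3, 2/3]])

# Mean recurrence time mu_0 = 1 / pi_0 for a positive recurrent state
pi = np.linalg.matrix_power(P, 200)[0]
print(1 / pi[0])  # 2.5
```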
NET-Dec-2019 (Part-C)
Que : Let $\{X_n : n \ge 0\}$ be a Markov chain with state space $\mathbb{N} \cup \{0\}$ such that the transition probabilities are
given by
$$P_{ij} = \begin{cases} q & \text{for } j = 0 \\ 1 - q & \text{for } j = i + 1 \\ 0 & \text{otherwise} \end{cases}$$
for $i = 0, 1, 2, \ldots$, where $0 < q < 1$. Then which of the following statements are correct ?
1. The Markov chain is irreducible.
2. The Markov chain is aperiodic.
3. $P_{00}^{(n)} = q$ for all $n \ge 1$
4. The Markov chain is positive recurrent.
NET-Dec-2015 (Part-C)
Que : Let  X n : n  0 be a Markov Chain on the state space S  1,2,3,...,23 with transition probability

1
given by Pi ,i1  Pi ,i1   2  i  22
2
1 1
P1,2  P1,23  , P23,1  P23,22  . Then which of the following statements are true ?
2 2
1.  X n : n  0 has a unique stationary distribution.
2.  X n : n  0 is irreducible.

3. $\lim_{n\to\infty} P(X_n = 1) = \frac{1}{23}$
4. $\{X_n : n \ge 0\}$ is recurrent.
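A numerical check (a sketch assuming numpy, with states 1..23 stored at indices 0..22): the chain is a random walk on a cycle of 23 states, which is doubly stochastic, irreducible, and aperiodic (the odd cycle length gives return times 2 and 23 with gcd 1), so each limiting probability should be 1/23.

```python
import numpy as np

# Random walk on a cycle of 23 states
N = 23
P = np.zeros((N, N))
for i in range(N):
    P[i, (i + 1) % N] = 0.5
    P[i, (i - 1) % N] = 0.5

Pn = np.linalg.matrix_power(P, 4001)
print(Pn[0, 0])  # ~ 1/23 = 0.04347...
```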

Assam Meghalaya Paper


NET-Dec-2019 (Part-B)
Que : Consider a Markov Chain with state space S  0,1,...,1000 and transition probabilities given

1
by pi ,i1  1 for 0  i  999 and p1000,1000   p1000,0 .
2
Then
1
1. lim pij  
n
for all i, j
n 1000
1
2. lim pij   for all i, j  999
n
n 1000
1
3. lim pij  
n
for all i, j
n 1001
1
4. lim pij   for all i, j  999
n
n 1002

NET-Nov-2020 (Part-B)
Que : Consider a Markov chain X 0 , X 1 , X 2 ,... with state space S. Suppose i, j  S are two states which
communicate with each other. Which of the following statements is NOT correct ?
1. Period of i  Period of j
2. ‘i’ is recurrent if and only if j is recurrent.
3. lim P  X n  i / X 0  k   lim P  X n  j / X 0  k  for all k  S .
n n

4. lim P  X n  j / X 0  i   lim P  X n  j / X 0  j 
n n

Part-C
Que : Consider a Markov chain with countable state space S. Identify the correct statements.
1. If the Markov chain is aperiodic and irreducible then there exists a stationary distribution.
2. If the Markov chain is aperiodic and irreducible then there is at most one stationary distribution.
3. If S is finite then there exists a stationary distribution.
4. If S is finite then there is exactly one stationary distribution.
