Markov Chain Final
RISING STAR ACADEMY
28-A, Jia Sarai, Near Hauz Khas Metro Station, New Delhi, Mob : 07838699091
439/29, Chhotu Ram Nagar, Near Power House, Delhi Road, Rohtak, Mob : 09728862122
NET Markov Chain Page 1
Markov Chain
Def. A Markov chain is a stochastic model describing a sequence of possible events in which the
probability of each event depends only on the state attained in the previous event.
A Markov chain is a system that changes from state to state according to given probabilities. The
table of these probabilities is called the transition matrix (P).
State space : The state space, denoted by S, is either finite or countably infinite, i.e.,
S = {1, 2, 3, 4}   (finite)
or S = {1, 2, 3, 4, ...}   (countably infinite)
State transition diagram : For a given transition matrix P, we can draw a state transition diagram as
follows :
Let the transition matrix P for the state space S = {1, 2, 3} be given by
P = [ 0    1    0
      1/2  0    1/2
      1/3  1/3  1/3 ]
Then the corresponding state transition diagram has one node per state and an arrow of weight P_ij
from state i to state j.
n-step transition probability : P_ij^(n) denotes the probability of reaching state j from state i in n steps.
1. 1-step transition probability : P_ij^(1) = P( X_{n+1} = j | X_n = i ) or P( X_1 = j | X_0 = i )
e.g., S = {1, 2},
P = [ 1/2  1/2
      1    0  ]
To find P_11^(2) :
P_11^(2) = (1/2)(1/2) + (1/2)(1) = 3/4   (paths 1 → 1 → 1 and 1 → 2 → 1)
OR
Alternate method using the matrix P :
P^2 = [ 1/2  1/2 ] [ 1/2  1/2 ]   [ 3/4  1/4 ]
      [ 1    0   ] [ 1    0   ] = [ 1/2  1/2 ]
From this matrix, P_11^(2) = 3/4.
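The 2-step probability above can be checked numerically; a minimal sketch (numpy is an assumption of this sketch, not of the notes):

```python
import numpy as np

# Transition matrix from the example above: S = {1, 2},
# row i gives the probabilities of moving out of state i.
P = np.array([[0.5, 0.5],
              [1.0, 0.0]])

# n-step transition probabilities are the entries of P^n.
P2 = np.linalg.matrix_power(P, 2)

print(P2[0, 0])  # P_11^(2) = 3/4
```

The same call with a larger exponent gives any P_ij^(n) directly.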
3. Accessibility : A state 'j' is said to be accessible from the state 'i' if ∃ n ≥ 0 such that P_ij^(n) > 0.
4. Communicating states : Two states 'i' and 'j' are said to be communicating states if we can move
i → j and j → i, i.e., i ↔ j, i.e., ∃ n, m ≥ 0 such that P_ij^(n) > 0 and P_ji^(m) > 0.
e.g.,
P = [ 0  1  0
      1  0  0
      0  1  0 ]
Cl(1) = {1, 2}
Cl(3) = {3}
6. Irreducible Markov chain : A Markov chain is said to be irreducible if we have only one closed
communicating class containing all the states of the state space S. Otherwise the Markov chain is said to be
reducible, i.e., if we have more than one (at least two) disjoint classes then the Markov chain is always reducible.
For e.g., S = {1, 2, 3},
P = [ 0  1  0
      0  0  1
      1  0  0 ]
In this case all the states communicate with each other. So, we get only one closed communicating
class for the state space S. So, this Markov chain is irreducible. In the previous example, we had two
disjoint classes :
Cl(1) = {1, 2}
Cl(3) = {3}     (reducible Markov chain)
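The communicating classes can also be found mechanically from P; a small sketch (the function name and the Warshall-closure approach are this sketch's choices, not the notes'):

```python
import numpy as np

def communicating_classes(P):
    """Partition states 0..n-1 into communicating classes.

    j is accessible from i when some power of P has a positive
    (i, j) entry; i and j communicate when each is accessible
    from the other."""
    n = len(P)
    # reach[i][j] = True when j is accessible from i in >= 0 steps.
    reach = np.eye(n, dtype=bool) | (np.asarray(P) > 0)
    for k in range(n):  # Warshall's transitive closure
        reach |= reach[:, k:k + 1] & reach[k:k + 1, :]
    classes = []
    for i in range(n):
        cls = frozenset(j for j in range(n) if reach[i, j] and reach[j, i])
        if cls not in classes:
            classes.append(cls)
    return classes

# Irreducible example from the text (states relabelled 0, 1, 2).
P_irr = [[0, 1, 0], [0, 0, 1], [1, 0, 0]]
print(communicating_classes(P_irr))   # a single class {0, 1, 2}
```

Running it on the reducible example [[0,1,0],[1,0,0],[0,1,0]] instead yields the two classes {0, 1} and {2}, matching Cl(1) = {1, 2}, Cl(3) = {3} above.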
7. Period of a state : The period of a state i is denoted by d(i) and given by d(i) = gcd{ n ≥ 1 | P_ii^(n) > 0 }.
e.g., (1) d(1) = gcd{2, 4, 6, 8, ...} = 2
e.g., (2) d(2) = gcd{2, 3, 4, ...} = 1
If d(i) = 1 then i is an aperiodic state, otherwise a periodic state. So, if d(i) = 1 for all i ∈ S, then the Markov chain is aperiodic.
d(i) > 1 ⇒ i is a periodic state.
e.g., S = {1, 2} : d(1) = 1 and d(2) = 1
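The gcd definition of the period translates directly into code; a minimal sketch (scanning powers of P up to a cutoff is this sketch's pragmatic shortcut, not part of the definition):

```python
import math
import numpy as np

def period(P, i, max_power=50):
    """d(i) = gcd of all n >= 1 with P_ii^(n) > 0,
    computed over powers up to max_power."""
    P = np.asarray(P, dtype=float)
    d = 0
    Q = np.eye(len(P))
    for n in range(1, max_power + 1):
        Q = Q @ P                  # Q is now P^n
        if Q[i, i] > 0:
            d = math.gcd(d, n)     # gcd(0, n) == n handles the first hit
    return d

# Deterministically alternating two-state chain: period 2.
P_alt = [[0, 1], [1, 0]]
print(period(P_alt, 0))   # 2

# A self-loop makes the state aperiodic: period 1.
P_lazy = [[0.5, 0.5], [1, 0]]
print(period(P_lazy, 0))  # 1
```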
Example : Consider a Markov chain with state space S = {1, 2, 3, 4} and transition probability matrix
P = [ 1/2  1/2  0    0
      1/2  1/2  0    0
      0    0    1    0
      1/4  1/4  1/4  1/4 ]
Then check (i) whether the Markov chain is irreducible or NOT,
(ii) whether the Markov chain is aperiodic or NOT.
Solution : From the corresponding state transition diagram,
Cl(1) = {1, 2} = Cl(2)
Cl(3) = {3}
Cl(4) = {4}
Here we have exactly three disjoint classes, so the given Markov chain is reducible.
Now, each state has a self loop. So, the period of each state is 1 :
d(1) = d(2) = d(3) = d(4) = 1
Some observations :
1. If 'i' and 'j' are in the same class, i.e., i ↔ j, then the periods of i and j are the same, i.e., d(i) = d(j).
2. If the given Markov chain is irreducible then the periods of all states are the same.
9. Absorbing state : A state 'i' is said to be an absorbing state if it has a self loop with probability 1, i.e.,
P_ii = 1 (e.g., state 3 in the example above).
10. Stochastic matrix : The transition matrix P is said to be a stochastic matrix if each p_ij ≥ 0 and each row
sum is 1.
Note : If each column sum is also 1 then the matrix is said to be doubly stochastic.
11. First visit probability : f_ij^(m) = P( X_{n+m} = j ; X_k ≠ j for n < k < n + m | X_n = i )
e.g., for S = {1, 2} with P = [ 1/2  1/2 ; 1  0 ] :
p_11^(3) = (1/2)(1/2)(1/2) + (1/2)(1/2)(1) + (1/2)(1)(1/2) = 5/8
Now f_11^(3) = 0, since every path of length 3 from 1 back to 1 already visits 1 at an earlier step.
12. Recurrent and transient states : Let f_ii = Σ_{n≥1} f_ii^(n). Then
f_ii = 1 ⇒ i is a recurrent state
f_ii < 1 ⇒ i is a transient state
Remark : f_ii = P( chain ever returns to state 'i' starting from 'i' ).
Note : If Σ_{n≥1} P_ii^(n) is convergent then i is a transient state, and if Σ_{n≥1} P_ii^(n) is divergent then i is a recurrent state.
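The first-return probabilities f_ii^(n) can be computed from the n-step probabilities via the renewal relation P_ii^(n) = Σ_{k=1}^{n} f_ii^(k) P_ii^(n−k); a sketch of that computation (the function name and the particular two-state test chain, with state 2 absorbing, are this sketch's choices):

```python
import numpy as np

def first_return_probs(P, i, N):
    """f_ii^(n) for n = 1..N, obtained by inverting the renewal
    relation P_ii^(n) = sum_{k=1}^{n} f_ii^(k) * P_ii^(n-k)."""
    P = np.asarray(P, dtype=float)
    powers = [np.eye(len(P))]
    for _ in range(N):
        powers.append(powers[-1] @ P)   # powers[n] = P^n
    f = [0.0] * (N + 1)
    for n in range(1, N + 1):
        f[n] = powers[n][i, i] - sum(f[k] * powers[n - k][i, i]
                                     for k in range(1, n))
    return f[1:]

# Two-state chain in which state 1 (index 1) is absorbing:
# state 0 is transient (f_00 = 1/2 < 1), state 1 is recurrent.
P = [[0.5, 0.5], [0.0, 1.0]]
f0 = first_return_probs(P, 0, 10)
print(sum(f0))   # f_00 = 0.5  -> transient
f1 = first_return_probs(P, 1, 10)
print(sum(f1))   # f_11 = 1.0  -> recurrent
```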
e.g., (for a two-state chain whose diagram is omitted; the values below correspond to P_11 = P_12 = 1/2, P_22 = 1)
f_11^(1) = 1/2        f_22^(1) = 1
f_11^(2) = 0          f_22^(2) = 0
f_11^(3) = 0          f_22^(3) = 0
f_11 = Σ_{n≥1} f_11^(n) = 1/2        f_22 = Σ_{n≥1} f_22^(n) = 1
Cl(3) = {3} is closed, since p_31^(n) = 0 and p_32^(n) = 0 for all n.
So, states 1 and 2 are transient states and state 3 is a recurrent state.
Results :
1. If i ↔ j, i.e., 'i' and 'j' are in the same class, then
'i' is a recurrent state ⇔ 'j' is a recurrent state, and 'i' is transient ⇔ 'j' is transient.
2. In a finite Markov chain all states cannot be transient, i.e., there is at least one recurrent state.
3. If the chain is finite and irreducible then all states are recurrent states.
4. Absorbing states are always recurrent states.
Example : Let {X_n}_{n≥1} be a Markov chain on the state space {−N, −(N−1), ..., −1, 0, 1, ..., N−1, N}
such that P_{i,i+1} = P_{i,i−1} = 1/2 for −(N−1) ≤ i ≤ N−1 and P_{N,N} = P_{N,N−1} = P_{−N,−N} = P_{−N,−N+1} = 1/2.
Then
1. The chain is irreducible
2. The chain is aperiodic
3. State '0' is recurrent
4. State '0' is transient.
Solution :
We have only one closed communicating class. So, the given Markov chain is irreducible.
Also, d(N) = d(−N) = 1 (self loops at N and −N).
So d(−N) = d(−N+1) = ... = d(0) = d(1) = ... = d(N) = 1, and the chain is aperiodic.
Since the chain is finite and irreducible, state '0' is recurrent.
NET-June-2018
Que : Consider a Markov chain having state space S = {1, 2, 3, 4} with transition probability matrix P = (P_ij)
given by
P = [ 1/2  1/2  0    0
      1/4  1/4  1/4  1/4
      0    1/3  1/3  1/3
      0    0    1/2  1/2 ]
Then
1. lim_{n→∞} P_22^(n) = 0 , Σ_{n≥0} P_22^(n) < ∞
2. lim_{n→∞} P_22^(n) = 0 , Σ_{n≥0} P_22^(n) = ∞
3. lim_{n→∞} P_22^(n) = 1 , Σ_{n≥0} P_22^(n) < ∞
4. lim_{n→∞} P_22^(n) = 1 , Σ_{n≥0} P_22^(n) = ∞
NET-Dec-2017 (Part-B)
Que : Consider a Markov chain {X_n | n ≥ 0} with state space {1, 2, 3} and transition matrix
P = [ 0    1/2  1/2
      1/2  0    1/2
      1/2  1/2  0  ]
Then P( X_3 = 1 | X_0 = 1 ) equals
1. 0
2. 1/4
3. 1/2
4. 1/8
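The required probability is just the (1,1) entry of P^3, which is easy to check numerically (states 1, 2, 3 are mapped to indices 0, 1, 2 in this sketch):

```python
import numpy as np

# Transition matrix from the NET-Dec-2017 question.
P = np.array([[0,   0.5, 0.5],
              [0.5, 0,   0.5],
              [0.5, 0.5, 0  ]])

P3 = np.linalg.matrix_power(P, 3)
print(P3[0, 0])   # P(X_3 = 1 | X_0 = 1) = 1/4
```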
NET-June-2015 (Part-C)
Que : Consider a Markov chain with state space S = {0, 1, 2, 3} and with transition probability matrix
P given by
P = [ 2/3  1/3  0    0
      1    0    0    0
      1/2  1/2  0    0
      1/4  1/4  1/4  1/4 ]
Then
1. '1' is a recurrent state
2. '0' is a recurrent state
3. '3' is a recurrent state
4. '2' is a recurrent state
Mean recurrence time : If 'i' is a recurrent state, then we have f_ii = Σ_{n≥1} f_ii^(n) = 1.
So, the mean recurrence time μ_i (or expected time to return to state 'i' starting from state 'i') is
given by μ_i = Σ_{n≥1} n f_ii^(n) ; if μ_i < ∞ then i is positive recurrent, and if μ_i = ∞ then i is null recurrent.
2. If a Markov chain is finite and irreducible, then all the states are positive recurrent
3. If an infinite Markov Chain is irreducible, then all the states can be transient.
Que : Consider a Markov chain with state space {0, 1, 2, ...} and transition matrix
P = [ P_0  P_1  P_2  ...
      1    0    0    ...
      0    1    0    ...
      0    0    1    ...
      ...  ...  ...  ... ]
such that Σ_{i≥0} P_i = 1 and Σ_{i≥1} i P_i = ∞.
Then
1. The chain is reducible
2. The chain is irreducible and transient
3. The chain is positive recurrent
4. The chain is null recurrent
Solution :
Cl(0) = {0, 1, 2, 3, 4, 5, ...}, so the chain is irreducible.
f_00^(1) = P_0                    ( 0 → 0 )
f_00^(2) = P_1 · 1 = P_1          ( 0 → 1 → 0 )
f_00^(3) = P_2 · 1 · 1 = P_2      ( 0 → 2 → 1 → 0 )
f_00^(4) = P_3 · 1 · 1 · 1 = P_3  ( 0 → 3 → 2 → 1 → 0 )
……………………………………………………….
f_00 = Σ_{n≥1} f_00^(n) = f_00^(1) + f_00^(2) + f_00^(3) + ...
     = P_0 + P_1 + P_2 + ... = 1, so state 0 is recurrent.
Stationary distribution : A probability vector π = (π_i)_{i∈S} is called a stationary distribution for the chain
if it satisfies the conditions :
1. πP = π
2. π_i ≥ 0 for all i ∈ S
3. Σ_{i∈S} π_i = 1
Note : For a finite Markov chain we always have a stationary distribution, and for an infinite Markov
chain a stationary distribution may or may not exist.
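The three conditions above amount to a linear system, so a stationary distribution can be computed directly; a minimal sketch (the least-squares formulation and the two-state test chain are this sketch's choices, assuming the stationary distribution is unique, e.g. a finite irreducible chain):

```python
import numpy as np

def stationary_distribution(P):
    """Solve pi P = pi together with sum(pi) = 1."""
    P = np.asarray(P, dtype=float)
    n = len(P)
    # (P^T - I) pi = 0, stacked with the normalisation row sum(pi) = 1.
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

P = [[0.5, 0.5], [1.0, 0.0]]
pi = stationary_distribution(P)
print(pi)        # [2/3, 1/3]
print(pi @ P)    # pi again, confirming pi P = pi
```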
Que : Consider a Markov chain with state space {0, 1} and transition matrix
P = [ 1−P_0  P_0
      1−P_1  P_1 ]
Then
1. π_1 = P_1
2. π_1 = P_0
3. π_1 = 1/2
4. π_1 = P_0 / (1 − P_1 + P_0)
Solution :
P = [ 1−P_0  P_0
      1−P_1  P_1 ]
πP = π, π_i ≥ 0 for i = 0, 1, and π_0 + π_1 = 1 …(*)
(π_0, π_1) [ 1−P_0  P_0 ; 1−P_1  P_1 ] = (π_0, π_1)
⇒ π_0(1 − P_0) + π_1(1 − P_1) = π_0 …(1) and π_0 P_0 + π_1 P_1 = π_1 …(2)
From (2) : π_0 P_0 = π_1(1 − P_1) ⇒ π_0 = π_1(1 − P_1)/P_0
Putting this in (*) :
π_1(1 − P_1)/P_0 + π_1 = 1 ⇒ π_1(1 − P_1 + P_0) = P_0
⇒ π_1 = P_0 / (1 − P_1 + P_0)
So, option (4) is correct.
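The closed form can be cross-checked against a direct numerical limit; a quick sketch (the values P0 = 0.3, P1 = 0.6 are arbitrary illustrative choices):

```python
import numpy as np

# Check pi_1 = P0 / (1 - P1 + P0) for one choice of P0, P1.
P0, P1 = 0.3, 0.6
P = np.array([[1 - P0, P0],
              [1 - P1, P1]])

# For an irreducible aperiodic chain the rows of P^n converge
# to the stationary distribution.
Pn = np.linalg.matrix_power(P, 200)
pi1_numeric = Pn[0, 1]
pi1_formula = P0 / (1 - P1 + P0)
print(pi1_numeric, pi1_formula)   # both equal 3/7
```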
NET-June-2016 (Part-C)
Que : Let {X_n} be a finite Markov chain. Then the number of stationary distributions can be ?
1. 0
2. 2
3. 1
4. ∞
NET-June-2017 (Part-C)
Que : Which of the following statements are correct ?
1. For a finite state Markov chain there is at least one transient state.
2. For a finite Markov chain there is at least one stationary distribution.
3. For a countable state Markov chain, every state can be transient.
4. For an aperiodic countable state Markov chain there is at least one stationary distribution.
NET-June-2016 (Part-C)
Que : Consider a Markov chain with state space S = {1, 2, 3, ..., n} where n ≥ 10. Suppose that
P_ij > 0 if i − j is even
P_ij = 0 if i − j is odd
Then
1. The Markov chain is irreducible.
2. ∃ a state 'i' which is transient.
Limiting distribution : We know that π = (π_1, π_2, ..., π_i, ...), where π_i = the long-run probability of being in state 'i'.
Results :
1. lim_{n→∞} P_ij^(n) = lim_{n→∞} P_jj^(n) = π_j
4. If the Markov chain is irreducible, aperiodic and positive recurrent then lim_{n→∞} P_ij^(n) = π_j (independent of 'i').
6. If the Markov chain is irreducible, aperiodic and positive recurrent then lim_{n→∞} P_ij^(n) = 1/μ_j, where μ_j is the
mean recurrence time of state j.
7. If the Markov chain is irreducible, aperiodic and doubly stochastic (n × n) then
π = ( 1/n, 1/n, ..., 1/n ), i.e., π_j = 1/n for all 1 ≤ j ≤ n.
Note : If the Markov chain is irreducible, aperiodic and positive recurrent then the stationary distribution is
the limiting distribution for the given Markov chain.
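The doubly stochastic result is easy to see numerically: for such a chain every row of P^n tends to the uniform vector. A sketch (the particular 3 × 3 matrix below is an illustrative choice, picked to be doubly stochastic, irreducible and aperiodic):

```python
import numpy as np

# Doubly stochastic: every row AND every column sums to 1.
P = np.array([[0.2, 0.3, 0.5],
              [0.5, 0.2, 0.3],
              [0.3, 0.5, 0.2]])

assert np.allclose(P.sum(axis=0), 1) and np.allclose(P.sum(axis=1), 1)

Pn = np.linalg.matrix_power(P, 100)
print(Pn)   # every row is (1/3, 1/3, 1/3) to numerical precision
```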
NET-June-2019 (Part-B)
Que : Consider a Markov chain with state space {0, 1, 2, 3, 4} and transition matrix
P = [ 1    0    0    0    0
      1/3  1/3  1/3  0    0
      0    1/3  1/3  1/3  0
      0    0    1/3  1/3  1/3
      0    0    0    0    1  ]
Then
1. 1/3
2. 1/2
3. 0
4. 1
NET-June-2019 (Part-C)
Que : Consider a Markov chain with state space {0, 1, 2} and transition matrix
P = [ 1/4  5/8  1/8
      1/4  0    3/4
      1/2  3/8  1/8 ]
Then which of the following are true ?
1. lim_{n→∞} P_12^(n) = 0
3. lim_{n→∞} P_22^(n) = 1/8
4. lim_{n→∞} P_21^(n) = 1/3
NET-Dec-2018 (Part-C)
Que : Consider a Markov chain with transition probability matrix P given by
P = [ 1/2  1/2  0
      0    1/2  1/2
      1/3  1/3  1/3 ]
For any two states 'i' and 'j', let P_ij^(n) denote the n-step transition probability of going from 'i' to 'j'.
3. lim_{n→∞} P_32^(n) = 1/3
4. lim_{n→∞} P_13^(n) = 1/3
NET-Dec-2019 (Part-B)
Que : Let {X_n : n ≥ 0} be a two-state Markov chain with state space S = {0, 1} and transition matrix
P = [ 1/2  1/2
      1/3  2/3 ]
Assume X_0 = 0. The expected return time to 0 is
1. 5/2
2. 9/4
3. 3/2
4. 3
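This question can be answered with the identity μ_0 = 1/π_0 from the mean-recurrence-time result above; a quick numerical sketch for the matrix in the question:

```python
import numpy as np

# NET-Dec-2019 chain on S = {0, 1}: the stationary probability of 0
# gives the mean return time mu_0 = 1 / pi_0.
P = np.array([[1/2, 1/2],
              [1/3, 2/3]])

Pn = np.linalg.matrix_power(P, 200)   # rows converge to pi
pi0 = Pn[0, 0]
print(1 / pi0)    # expected return time to 0 = 5/2
```

Solving πP = π by hand gives π = (2/5, 3/5), hence μ_0 = 5/2 and option 1.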
NET-Dec-2019 (Part-C)
Que : Let {X_n : n ≥ 0} be a Markov chain with state space N ∪ {0} such that the transition probabilities are
given by
P_ij = q        for j = 0
P_ij = 1 − q    for j = i + 1
P_ij = 0        otherwise
for i = 0, 1, 2, ..., where 0 < q < 1. Then which of the following statements are correct ?
1. The Markov chain is irreducible.
2. The Markov chain is aperiodic.
3. P_00^(n) = q for all n ≥ 1
4. The Markov chain is positive recurrent.
NET-Dec-2015 (Part-C)
Que : Let {X_n : n ≥ 0} be a Markov chain on the state space S = {1, 2, 3, ..., 23} with transition probabilities
given by
P_{i,i+1} = P_{i,i−1} = 1/2 for 2 ≤ i ≤ 22,
P_{1,2} = P_{1,23} = 1/2, P_{23,1} = P_{23,22} = 1/2. Then which of the following statements are true ?
1. {X_n : n ≥ 0} has a unique stationary distribution.
2. {X_n : n ≥ 0} is irreducible.
3. lim_{n→∞} P( X_n = 1 ) = 1/23
4. {X_n : n ≥ 0} is recurrent.
Que : Consider a Markov chain with state space {0, 1, ..., 1000} and transition probabilities given
by p_{i,i+1} = 1 for 0 ≤ i ≤ 999 and p_{1000,1000} = p_{1000,0} = 1/2.
Then
1. lim_{n→∞} p_ij^(n) = 1/1000 for all i, j
2. lim_{n→∞} p_ij^(n) = 1/1000 for all i, j ≤ 999
3. lim_{n→∞} p_ij^(n) = 1/1001 for all i, j
4. lim_{n→∞} p_ij^(n) = 1/1002 for all i, j ≤ 999
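The limit here can be seen on a scaled-down analogue of the same chain; a sketch using N = 10 in place of 1000 (purely to keep the computation small — the assumption is that the structure, a deterministic cycle with a lazy last state, is otherwise identical):

```python
import numpy as np

# States 0..N: p_{i,i+1} = 1 for i < N, and p_{N,N} = p_{N,0} = 1/2.
N = 10
P = np.zeros((N + 1, N + 1))
for i in range(N):
    P[i, i + 1] = 1.0
P[N, N] = P[N, 0] = 0.5

Pn = np.linalg.matrix_power(P, 5000)

# Balance equations give pi_j = 1/(N + 2) for j <= N-1 and
# pi_N = 2/(N + 2); with N = 1000 this is the 1/1002 of option 4.
print(Pn[0, 0], Pn[0, N])   # 1/12 and 2/12
```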
NET-Nov-2020 (Part-B)
Que : Consider a Markov chain X_0, X_1, X_2, ... with state space S. Suppose i, j ∈ S are two states which
communicate with each other. Which of the following statements is NOT correct ?
1. Period of i = Period of j
2. 'i' is recurrent if and only if 'j' is recurrent.
3. lim_{n→∞} P( X_n = i | X_0 = k ) = lim_{n→∞} P( X_n = j | X_0 = k ) for all k ∈ S.
4. lim_{n→∞} P( X_n = j | X_0 = i ) = lim_{n→∞} P( X_n = j | X_0 = j )
Part-C
Que : Consider a Markov chain with countable state space S. Identify the correct statements.
1. If the Markov chain is aperiodic and irreducible then there exists a stationary distribution.
2. If the Markov chain is aperiodic and irreducible then there is at most one stationary distribution.
3. If S is finite then there exists a stationary distribution.
4. If S is finite then there is exactly one stationary distribution.