Markov Processes
MARKOV CHAIN
The probability of being in a particular state in any time period depends only on the state in the immediately preceding time period.
MARKOV PROCESSES
A. MARKET SHARE:
MARKET RESEARCH:
Data - 100 Shoppers over 30 days.
Probability of selecting a store (State) in a given
period in terms of the store (state) that was selected
during the previous period.
Of all customers who shopped at ABC in a day,
90% shopped at ABC the following day while 10%
switched to XYZ.
Similarly, of all customers who shopped at XYZ in a day, 80% shopped at XYZ the following day while 20% switched to ABC.
Transition from a state in a given period to another
state in the following period.
TRANSITION PROBABILITIES:

                     Next Period
    Current Period   ABC    XYZ
    ABC              0.9    0.1
    XYZ              0.2    0.8

pij = probability of making a transition from state i in a given period to state j in the next period.
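As an illustration, the transition table can be encoded and applied to a state probability vector. A minimal Python sketch (store names and probabilities taken from the example above):

```python
# Transition matrix for the two-store example: row i gives the
# probabilities of moving from state i (ABC=0, XYZ=1) to each state.
P = [[0.9, 0.1],   # from ABC -> (ABC, XYZ)
     [0.2, 0.8]]   # from XYZ -> (ABC, XYZ)

def next_period(pi, P):
    """One Markov step: pi(n+1) = pi(n) . P (row vector times matrix)."""
    return [sum(pi[i] * P[i][j] for i in range(len(P)))
            for j in range(len(P[0]))]

# Starting with every shopper at ABC:
print(next_period([1.0, 0.0], P))   # [0.9, 0.1]
```

Each row of P sums to 1, so the state vector stays a valid probability distribution after every step.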
State transition diagram: ABC loops back to itself with probability 0.9 and moves to XYZ with probability 0.1; XYZ loops back with probability 0.8 and moves to ABC with probability 0.2.
Initial state probabilities (period 0):

[π1(0)  π2(0)] = [1  0]  (start at ABC)   or   [0  1]  (start at XYZ)
[π1(1)  π2(1)] = [1  0] [0.9  0.1]  = [0.9  0.1]
                        [0.2  0.8]

[π1(2)  π2(2)] = [0.9  0.1] [0.9  0.1]  = [0.83  0.17]
                            [0.2  0.8]

After ten periods:
π1(10) = 0.676
π2(10) = 0.324

Starting from XYZ instead: π1(n) = 0, 0.2, 0.34, …, 0.648 at n = 10.
At steady state (π1 = 2/3, π2 = 1/3), a base of 1000 customers splits as:
ABC ≈ 667
XYZ ≈ 333
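The period-by-period numbers above can be reproduced by iterating the one-step update. A short plain-Python sketch:

```python
# Iterate the two-store chain for 10 periods starting from ABC.
# P is the ABC/XYZ transition matrix from the example above.
P = [[0.9, 0.1], [0.2, 0.8]]

def step(pi):
    # pi(n+1) = pi(n) . P
    return [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

pi = [1.0, 0.0]            # all shoppers start at ABC
for _ in range(10):
    pi = step(pi)
print(round(pi[0], 3), round(pi[1], 3))   # 0.676 0.324
```

Running the same loop from [0, 1] (start at XYZ) gives π1(10) ≈ 0.648, matching the XYZ sequence above; both starting points converge toward the same steady state.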
MARKOV PROCESSES
XYZ -- Advertisements 0.85 0.15
0.20 0.80
1 = 0.57
2 = 0.43
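For a two-state chain the steady state has a closed form: at equilibrium π1·a = π2·b, where a is the probability of leaving state 1 and b the probability of leaving state 2. A Python sketch checking both matrices above:

```python
# Steady state of a 2-state chain [[1-a, a], [b, 1-b]]:
# balance condition pi1 * a = pi2 * b gives pi = (b, a) / (a + b).
def steady_state(a, b):
    return b / (a + b), a / (a + b)

print(steady_state(0.1, 0.2))    # original matrix: (2/3, 1/3)
print(steady_state(0.15, 0.2))   # after XYZ's ads: ~ (0.571, 0.429)
```

The advertisement campaign lifts XYZ's long-run share from 1/3 to about 0.43, as stated above.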
B. ACCOUNTS RECEIVABLE:
State categories:
State 1. Paid (absorbing)
State 2. Bad Debt (absorbing)
State 3. 0–3 months old
State 4. 4–6 months old
Transition probabilities:
pij = probability of a rupee (Rs.) in state i in one month moving to state j in the next month.

N = fundamental matrix = (I - Q)^-1, where Q is the submatrix of transitions among the non-absorbing states.
MATRIX INVERSE (2 × 2):

A = [ 0.7  -0.3]
    [-0.3   0.9]

d = (0.7)(0.9) - (-0.3)(-0.3) = 0.63 - 0.09 = 0.54

A^-1 = [0.9/0.54  0.3/0.54] = [1.67  0.56]
       [0.3/0.54  0.7/0.54]   [0.56  1.30]

In general, if A = [a11  a12]
                   [a21  a22]   with d = a11·a22 - a21·a12,

then A^-1 = [ a22/d  -a12/d]
            [-a21/d   a11/d]
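The 2 × 2 inverse formula can be checked numerically. A Python sketch applying it to the I - Q matrix above:

```python
# Inverse of a 2x2 matrix via the adjugate formula shown above.
def inv2(a11, a12, a21, a22):
    d = a11 * a22 - a21 * a12          # determinant
    return [[a22 / d, -a12 / d],
            [-a21 / d, a11 / d]]

# Fundamental matrix N = (I - Q)^-1 for the accounts receivable example.
N = inv2(0.7, -0.3, -0.3, 0.9)
print([[round(x, 2) for x in row] for row in N])   # [[1.67, 0.56], [0.56, 1.3]]
```

Entry N[i][j] is the expected number of months a rupee starting in transient state i spends in transient state j before being absorbed.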
Here

I - Q = [ 0.7  -0.3]
        [-0.3   0.9]

N = (I - Q)^-1 = [1.67  0.56]
                 [0.56  1.30]
With the new credit policy, the transition matrix becomes (rows/columns ordered Paid, Bad Debt, 0–3 months, 4–6 months):

New P = [1    0    0    0  ]
        [0    1    0    0  ]
        [0.6  0    0.3  0.1]
        [0.4  0.2  0.3  0.1]

= [4450  550]
  Collected  Bad Debt

(Costs + discounts) → 6% reduction in bad debt.
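The absorbing-chain computation above can be sketched in Python: Q (transient-to-transient) and R (transient-to-absorbing) are read off the bottom two rows of the new P, then N = (I - Q)^-1 and N·R give the eventual Paid / Bad-Debt probabilities for each aging category. (The slide's [4450 550] split would come from multiplying the vector of outstanding balances, not shown here, by N·R.)

```python
# Submatrices of the new P: transient states are 0-3 and 4-6 months.
Q = [[0.3, 0.1], [0.3, 0.1]]   # transient -> transient
R = [[0.6, 0.0], [0.4, 0.2]]   # transient -> (Paid, Bad Debt)

# N = (I - Q)^-1 via the 2x2 adjugate formula.
d = (1 - Q[0][0]) * (1 - Q[1][1]) - Q[0][1] * Q[1][0]   # det(I - Q)
N = [[(1 - Q[1][1]) / d, Q[0][1] / d],
     [Q[1][0] / d, (1 - Q[0][0]) / d]]

# N * R: row i gives P(eventually Paid), P(eventually Bad Debt)
# for a rupee currently in transient state i.
NR = [[sum(N[i][k] * R[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
print([[round(x, 3) for x in row] for row in NR])
```

Each row of N·R sums to 1: every rupee is eventually either collected or written off as bad debt.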
Markov Processes
• Trials of the Process: The events that trigger transitions of the system from one
state to another. In many applications successive time periods represent the trials
of the process.
• State of the System: The condition of the system at any particular trial or time
period.
• Transition Probability: Given the system is in state i during one period, the
transition probability pij is the probability that the system will be in state j
during the next period.
• State Probability: The probability that the system will be in any particular
state. (That is, πi(n) is the probability of the system being in state i in period n.)
• Steady State Probability: The probability that the system will be in any
particular state after a large number of transitions. Once steady state has been
reached, the state probabilities do not change from period to period.
• Absorbing State: A state is said to be absorbing if the probability of making
a transition out of that state is zero. Thus once the system has made a
transition into an absorbing state, it will remain there.
• Fundamental Matrix: A matrix necessary for the computation of probabilities
associated with absorbing states of a Markov Process.
THANK YOU