Markov Processes


MARKOV PROCESSES

• Trials of the System: Events / Points of Time
• State of the System
• Transition Probability
• State Probability
• Steady State Probability
• Absorbing State
• Fundamental Matrix
MARKOV PROCESSES

• The Evolution of Systems over Repeated Trials
• The State of the System Cannot be Determined with Certainty
• Transition Probabilities are Used
• Probability of the System being in a Particular State at a Point in Time
MARKOV PROCESSES

• Maintenance: Machine Failure, Replacement, Inspection Strategy
• Brand Switching and its Duration
• Market Share
• Disbursement Analysis
• Accounts Receivable Analysis
MARKOV PROCESSES

MARKOV CHAIN
The probability of being in a particular state at any time period depends only on the state in the immediately preceding time period.
MARKOV PROCESSES

A. MARKET SHARE:

Trials of the Process: Shopping Trips (Daily/Weekly/Monthly)
State of the System: Store Selected in a given period.

Two Shopping Alternatives, hence Two (Finite) States:
State 1. Customer Shops at ABC
State 2. Customer Shops at XYZ

"The System is in State 2 at Trial 4" means the customer shopped at XYZ on the fourth shopping trip.


MARKOV PROCESSES

MARKET RESEARCH:
Data: 100 Shoppers over 30 days.
The probability of selecting a store (state) in a given period is expressed in terms of the store (state) selected during the previous period.
Of all customers who shopped at ABC in a day, 90% shopped at ABC the following day, while 10% switched to XYZ.
Of all customers who shopped at XYZ, 80% shopped at XYZ the following day, while 20% switched to ABC.
Each such move is a transition from a state in a given period to a state in the following period.
MARKOV PROCESSES

TRANSITION PROBABILITIES:

                    Next Period
Current Period    ABC    XYZ
ABC               0.9    0.1
XYZ               0.2    0.8

pij = Probability of making a transition from state i in a given period to state j in the next period.
MARKOV PROCESSES

P = | p11  p12 | = | 0.9  0.1 |
    | p21  p22 |   | 0.2  0.8 |

[State-transition diagram: ABC returns to ABC with probability 0.9 and moves to XYZ with probability 0.1; XYZ returns to XYZ with probability 0.8 and moves to ABC with probability 0.2.]
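The slides give no code, but the matrix is easy to carry around in NumPy; a minimal sketch (the variable names are mine, not from the slides):

```python
import numpy as np

# Brand-switching transition matrix.
# State order: 0 = ABC, 1 = XYZ; entry [i, j] = P(next = j | current = i).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Each row of a transition matrix must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)
```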
MARKOV PROCESSES

πi(n) = Probability that the system is in state i in period n (the state probability);
i = state; n = number of transitions.

[π1(0)  π2(0)] = [1  0] (customer starts at ABC) or [0  1] (customer starts at XYZ)
MARKOV PROCESSES

 (n) = 1(n) 2(n) Vector of the state


probabilities of the system in period n.
State probabilities for period n+1:
 (next period) =  (current period) P
 (n+1) =  (n) P
 (0) = 1, 0
therefore,  (1) = (0) P (Period 1)
MARKOV PROCESSES

π(1) = π(0) P   (Period 1)

or [π1(1)  π2(1)] = [π1(0)  π2(0)] | p11  p12 |
                                   | p21  p22 |

                  = [1  0] | 0.9  0.1 |
                           | 0.2  0.8 |

                  = [0.9  0.1]

π1(1) = 0.9 ;  π2(1) = 0.1


MARKOV PROCESSES

π(2) = π(1) P   (Period 2)

or [π1(2)  π2(2)] = [π1(1)  π2(1)] | p11  p12 |
                                   | p21  p22 |

                  = [0.9  0.1] | 0.9  0.1 |
                               | 0.2  0.8 |

                  = [0.83  0.17]

π1(2) = 0.83 ;  π2(2) = 0.17


MARKOV PROCESSES

π(3) = π(2) P   (Period 3)
π(4) = π(3) P   (Period 4)
……………………………….
π(n+1) = π(n) P

State Probabilities for Future Periods (starting at ABC)

Period (n):   0     1     2     3     4     5     6
π1(n):        1    .9    .83  .781  .747  .723  .706
π2(n):        0    .1    .17  .219  .253  .277  .294
MARKOV PROCESSES

State Probabilities for Future Periods (starting at ABC)

Period (n):   7     8     9     10
π1(n):      .694  .686  .680  .676
π2(n):      .306  .314  .320  .324
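The whole table falls out of iterating π(n+1) = π(n) P; a sketch (start from [0, 1] instead to get the XYZ figures on the next slide):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

pi = np.array([1.0, 0.0])    # period 0: customer at ABC
for n in range(1, 11):
    pi = pi @ P              # pi(n) = pi(n-1) P
    print(f"period {n}: pi1 = {pi[0]:.3f}, pi2 = {pi[1]:.3f}")
# ... period 10: pi1 = 0.676, pi2 = 0.324
```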


MARKOV PROCESSES

1 (10) = 0.676
2 (10) = 0.324

For XYZ:
1 (n) 0 .2 .34 …… 0.648

2 (n) 1 .8 .66 …… 0.352


MARKOV PROCESSES
Steady State Probabilities:
The probabilities after a large number of transitions; they are independent of the beginning state of the system.
π1 = Steady state probability for state 1.
π2 = Steady state probability for state 2.

[π1(n+1)  π2(n+1)] = [π1(n)  π2(n)] | p11  p12 | = [π1(n)  π2(n)] | 0.9  0.1 |
                                    | p21  p22 |                  | 0.2  0.8 |
MARKOV PROCESSES
1 = 0.9 1 + 0.2 2
2 = 0.1 1 + 0.8 2
1 + 2 = 1 or 2 = (1- 1)
1 = 0.9 1 + 0.2 (1- 1)
= 0.9 1 +0.2 -0.2 1
or 1 = 2/3 = 0.667; 2 = 1/3 = 0.333

1000 Customers
ABC = 667
XYZ = 333
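The steady-state equations π = πP together with π1 + π2 = 1 form a small linear system; one way to solve it in NumPy (a sketch, replacing one redundant balance equation with the normalisation condition):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# pi P = pi  <=>  (P^T - I) pi^T = 0; keep one balance row,
# add the normalisation row pi1 + pi2 = 1.
A = np.vstack([(P.T - np.eye(2))[:-1], np.ones(2)])
pi = np.linalg.solve(A, np.array([0.0, 1.0]))
print(pi)   # ~[0.667 0.333] -> about 667 vs. 333 of 1000 customers
```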
MARKOV PROCESSES
Suppose XYZ runs advertisements and the transition matrix becomes:

| 0.85  0.15 |
| 0.20  0.80 |

Then:
π1 = 0.57
π2 = 0.43

Profit/Customer = Rs. ------
Cost of Promotion = Rs. -----
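The same solver applied to the promoted matrix confirms the new long-run shares (a sketch):

```python
import numpy as np

P_new = np.array([[0.85, 0.15],   # ABC now retains only 85% per period
                  [0.20, 0.80]])

A = np.vstack([(P_new.T - np.eye(2))[:-1], np.ones(2)])
pi = np.linalg.solve(A, np.array([0.0, 1.0]))
print(pi.round(2))   # [0.57 0.43]: XYZ's long-run share rises from 1/3 to 0.43
```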
MARKOV PROCESSES
B. ACCOUNTS RECEIVABLE ANALYSIS:
Two aging categories:
(i) 0-3 months   (ii) 4-6 months
Bad Debt: older than 6 months.
On March 31 there are Rs. 5,000 of accounts receivable:

Date          Rupees
December 10     2000
February 10     1000
March 18         500
March 30        1500

How much bad debt as of March 31?
Total Balance Method.
MARKOV PROCESSES

States (Categories):
State 1. Paid
State 2. Bad Debt
State 3. 0-3 months
State 4. 4-6 months
MARKOV PROCESSES
Transition Probabilities:
pij = probability of a rupee in State i in one month moving to State j in the next month.

    | p11  p12  p13  p14 |   | 1    0    0    0  |
P = | p21  p22  p23  p24 | = | 0    1    0    0  |
    | p31  p32  p33  p34 |   | .4   0    .3   .3 |
    | p41  p42  p43  p44 |   | .4   .2   .3   .1 |
MARKOV PROCESSES
Absorbing State:
The probability of a transition to any other state is 0; the system remains in that state indefinitely.

We do not compute steady state probabilities here. Instead, we compute the absorbing state probabilities, e.g. for a rupee in the (4-6) month category.
MARKOV PROCESSES
Fundamental Matrix:
Partition the matrix into 4 parts:

P = | I | 0 |      I = | 1  0 |    R = | .4  0  |    Q = | .3  .3 |
    | R | Q |          | 0  1 |        | .4  .2 |        | .3  .1 |

N = Fundamental Matrix = (I - Q)^-1
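A sketch of the partition-and-invert step in NumPy, with the state order (Paid, Bad Debt, 0-3 months, 4-6 months) used above:

```python
import numpy as np

P = np.array([[1.0, 0.0, 0.0, 0.0],    # Paid (absorbing)
              [0.0, 1.0, 0.0, 0.0],    # Bad Debt (absorbing)
              [0.4, 0.0, 0.3, 0.3],    # 0-3 months
              [0.4, 0.2, 0.3, 0.1]])   # 4-6 months

R = P[2:, :2]                      # transitions into the absorbing states
Q = P[2:, 2:]                      # transitions among the transient states
N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix
print(N.round(2))                  # [[1.67 0.56]
                                   #  [0.56 1.3 ]]
```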
MATRIX
A = |  0.7  -0.3 |
    | -0.3   0.9 |

d = (0.7)(0.9) - (-0.3)(-0.3) = 0.63 - 0.09 = 0.54

A^-1 = | 0.9/0.54  0.3/0.54 | = | 1.67  0.56 |
       | 0.3/0.54  0.7/0.54 |   | 0.56  1.30 |

In general, if A = | a11  a12 |  and d = a11 a22 - a21 a12,
                   | a21  a22 |

then A^-1 = |  a22/d  -a12/d |
            | -a21/d   a11/d |
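The hand-computed inverse is easy to verify (a quick sketch):

```python
import numpy as np

A = np.array([[0.7, -0.3],
              [-0.3, 0.9]])

print(np.linalg.inv(A).round(2))   # [[1.67 0.56]
                                   #  [0.56 1.3 ]]
# A times its inverse must give the identity matrix.
assert np.allclose(A @ np.linalg.inv(A), np.eye(2))
```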
MARKOV PROCESSES
I - Q = |  0.7  -0.3 |
        | -0.3   0.9 |

N = (I - Q)^-1 = | 1.67  0.56 |
                 | 0.56  1.30 |

NR = | 1.67  0.56 | | 0.4  0.0 | = | 0.89  0.11 |
     | 0.56  1.30 | | 0.4  0.2 |   | 0.74  0.26 |
MARKOV PROCESSES
NR gives the probability that an account receivable in state 3 or 4 will eventually reach each of the absorbing states.

Let B = [b1  b2] be the rupees in the (0-3) and (4-6) month categories.
If b1 = Rs. 3000 and b2 = Rs. 2000:

B · NR = [3000  2000] | 0.89  0.11 | = [4150  850]
                      | 0.74  0.26 |
Collected: Rs. 4150; Bad Debt: Rs. 850.
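The same split in code, building N from R and Q directly (a sketch; the exact figures differ slightly from the slides, which round NR to two decimals first):

```python
import numpy as np

R = np.array([[0.4, 0.0],
              [0.4, 0.2]])
Q = np.array([[0.3, 0.3],
              [0.3, 0.1]])
N = np.linalg.inv(np.eye(2) - Q)

B = np.array([3000.0, 2000.0])   # balances in the 0-3 and 4-6 month states
print((N @ R).round(2))          # absorption probabilities, as above
print((B @ N @ R).round())       # [4148. 852.] ~ the slides' [4150 850]
```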
MARKOV PROCESSES
Credit Policy: offer a discount for prompt payment.

        | 1    0    0    0  |
New P = | 0    1    0    0  |
        | .6   0    .3   .1 |
        | .4   .2   .3   .1 |

N = | 1.5   0.17 |     NR = | .97  .03 |
    | 0.5   1.17 |          | .77  .23 |

B · NR = [3000  2000] | .97  .03 |
                      | .77  .23 |
MARKOV PROCESSES

       = [4450  550]
Collected: Rs. 4450; Bad Debt: Rs. 550.

The costs of the discounts must be weighed against the Rs. 300 (6% of the Rs. 5,000 receivables) reduction in bad debt.
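The before-and-after comparison in code, under the same rounding caveat as before (a sketch):

```python
import numpy as np

def collected_and_bad(Q, R, B):
    """Split the balances B between the two absorbing states."""
    N = np.linalg.inv(np.eye(len(Q)) - Q)
    return B @ N @ R

B = np.array([3000.0, 2000.0])
old = collected_and_bad(np.array([[0.3, 0.3], [0.3, 0.1]]),
                        np.array([[0.4, 0.0], [0.4, 0.2]]), B)
new = collected_and_bad(np.array([[0.3, 0.1], [0.3, 0.1]]),
                        np.array([[0.6, 0.0], [0.4, 0.2]]), B)
print(old.round(), new.round())
# [4148.  852.] [4433.  567.] -- the slides' rounded NR gives 4150/850 and 4450/550
```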
Markov Processes
• Trials of the Process: The events that trigger transitions of the system from one
state to another. In many applications successive time periods represent the trials
of the process.
• State of the System: The condition of the system at any particular trial or time
period.
• Transition Probability: Given the system is in state i during one period, the
transition probability pij is the probability that the system will be in state j
during the next period.
• State Probability: The probability that the system will be in any particular
state. (That is, πi(n) is the probability of the system being in state i in period n.)
• Steady State Probability: The probability that the system will be in any
particular state after a large number of transitions. Once steady state has been
reached, the state probabilities do not change from period to period.
• Absorbing State: A state is said to be absorbing if the probability of making
a transition out of that state is zero. Thus once the system has made a
transition into an absorbing state, it will remain there.
• Fundamental Matrix: A matrix necessary for the computation of probabilities
associated with absorbing states of a Markov Process.
THANK YOU
