Markov Analysis

Introduction

Markov analysis is a technique that deals with the probabilities of future occurrences by analyzing presently known probabilities. It has numerous applications in business, including market share analysis, machine breakdown prediction, and the estimation of bad debts in accounts receivable.
Assumptions of Markov Analysis

1. There are a limited or finite number of possible states.
2. The probability of changing states remains the same over time.
3. A future state can be predicted from the current state and the matrix of transition probabilities.
4. The states are mutually exclusive and collectively exhaustive.

The vector of state probabilities for period i is

π(i) = (π1, π2, …, πn)

where
n = number of states
π1, π2, …, πn = P(being in state 1, state 2, …, state n)
Most of the time, problems deal with more than one item. Example: the market shares of three grocery stores.

Vector of state probabilities:

π(0) = (0.4, 0.3, 0.3)

where
Store 1 holds 40% of the customers (state 1)
Store 2 holds 30% (state 2)
Store 3 holds 30% (state 3)

Customers switch stores from period to period. For example, of the customers currently at Store 3, next period 20% visit Store 1, 20% visit Store 2, and 60% return to Store 3. Applied to Store 3's 30% share, that contributes 6%, 6%, and 18% of the total market next period; the corresponding contributions from Store 2's share are 3%, 21%, and 6%, and Store 1's share contributes 4% each to Stores 2 and 3.
Matrix of Transition Probabilities

To calculate periodic changes, it is much more convenient to use a matrix of transition probabilities: a matrix of conditional probabilities of being in a future state given a current state.

    | P11  P12  ...  P1n |
P = | P21  P22  ...  P2n |
    |  .    .         .  |
    | Pm1  Pm2  ...  Pmn |

Important:
Each row must sum to 1.
But, the columns do NOT necessarily sum to 1.
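A quick way to sanity-check a transition matrix in code is to verify exactly these row-sum properties; a minimal sketch in Python (NumPy is assumed available, and the helper name is ours, not standard):

```python
import numpy as np

def is_valid_transition_matrix(P) -> bool:
    """True if all entries lie in [0, 1] and every row sums to 1."""
    P = np.asarray(P, dtype=float)
    return bool(np.all(P >= 0) and np.all(P <= 1)
                and np.allclose(P.sum(axis=1), 1.0))

# The grocery-store matrix used in this section
P = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.7, 0.2],
              [0.2, 0.2, 0.6]])
```

Note that the columns of this P sum to 1.1, 1.0, and 0.9; only the rows are constrained.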
For the three grocery stores:

    | 0.8  0.1  0.1 |
P = | 0.1  0.7  0.2 |
    | 0.2  0.2  0.6 |

Row 1 interpretation:
0.8 = P11 = P (in state 1 after being in state 1)
0.1 = P12 = P (in state 2 after being in state 1)
0.1 = P13 = P (in state 3 after being in state 1)
Given the
1. vector of state probabilities and
2. matrix of transition probabilities,

the state probabilities for the next period can be computed: π(1) = π(0)P. For the grocery stores:

                             | 0.8  0.1  0.1 |
π(1) = π(0)P = (0.4 0.3 0.3) | 0.1  0.7  0.2 |
                             | 0.2  0.2  0.6 |

= [(0.4)(0.8)+(0.3)(0.1)+(0.3)(0.2), (0.4)(0.1)+(0.3)(0.7)+(0.3)(0.2), (0.4)(0.1)+(0.3)(0.2)+(0.3)(0.6)]

= (0.41, 0.31, 0.28)
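The next-period calculation can be verified with a short Python sketch (NumPy assumed; the numbers are those of the grocery-store example):

```python
import numpy as np

# Initial market shares: pi(0) for the three stores
pi_0 = np.array([0.4, 0.3, 0.3])

# Matrix of transition probabilities
P = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.7, 0.2],
              [0.2, 0.2, 0.6]])

# Next-period state probabilities: pi(1) = pi(0) P
pi_1 = pi_0 @ P
# pi_1 is approximately (0.41, 0.31, 0.28)
```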
In general,

π(n) = π(0)P^n

Therefore, the state probabilities n periods in the future can be obtained from the current state probabilities and the matrix of transition probabilities.

Machine Breakdown example

The machine is known to be working in the current period, so

π(0) = initial vector of states = (π1, π2) = (1, 0)

where
π1 = 1 = P (being in state 1) = P (machine working)
π2 = 0 = P (being in state 2) = P (machine broken)
    | 0.8  0.2 |
P = | 0.1  0.9 |

where
P11 = 0.8 = probability the machine works this period, given it was working last period
P12 = 0.2 = probability the machine is broken this period, given it was working last period
P21 = 0.1 = probability the machine works this period, given it was broken last period
P22 = 0.9 = probability the machine is broken this period, given it was broken last period
π(1) = π(0)P = (1, 0)P = (0.8, 0.2)

                          | 0.8  0.2 |
π(2) = π(1)P = (0.8, 0.2) | 0.1  0.9 |

= [(0.8)(0.8)+(0.2)(0.1), (0.8)(0.2)+(0.2)(0.9)]

= (0.66, 0.34)

Thus, if the machine is working this month, then in two months there is a 66% chance that it will be working and a 34% chance it will be broken.
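The two-month result can be reproduced either by applying P one period at a time or in one step through a matrix power, since π(n) = π(0)P^n; a minimal sketch:

```python
import numpy as np

# State 1 = machine working, state 2 = machine broken
P = np.array([[0.8, 0.2],
              [0.1, 0.9]])

pi_0 = np.array([1.0, 0.0])   # machine known to be working now

# Period by period
pi_1 = pi_0 @ P               # (0.8, 0.2)
pi_2 = pi_1 @ P               # (0.66, 0.34)

# Equivalently, in one step: pi(2) = pi(0) P^2
pi_2_direct = pi_0 @ np.linalg.matrix_power(P, 2)
```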
Equilibrium Conditions

Equilibrium conditions exist when the state probabilities no longer change after a large number of periods. At equilibrium, the state probabilities for the next period equal the state probabilities for this period. By multiplying the state vector by P period after period, the probabilities settle toward these equilibrium values.
For the machine example, repeated multiplication gives:

Period    State 1    State 2
  0       1.0        0.0
  1        .8         .2
  2        .66        .34
  3        .562       .438
  4        .4934      .5066
  5        .44538     .55462
  6        .411766    .588234
  7        .388236    .611763
  8        .371765    .628234
  9        .360235    .639754
 10        .352165    .647834
 11        .346515    .653484
 12        .342560    .657439
 13        .339792    .660207
 14        .337854    .662145

The probabilities are converging toward 1/3 and 2/3.
π(n) [current state]  x  P [matrix of transition probabilities]  =  π(n+1) [new state]
Equilibrium Equations

π(i+1) = π(i)P

Assume:

π(i) = (π1, π2)   and   P = | p11  p12 |
                            | p21  p22 |

Then:

(π1, π2) = (π1p11 + π2p21, π1p12 + π2p22)

or:

π1 = π1p11 + π2p21
π2 = π1p12 + π2p22

Therefore:

π1 = π2p21 / (1 - p11)   and   π2 = π1p12 / (1 - p22)
Equilibrium Equations continued

It is always true that

π(next period) = π(this period)P

or

π(n+1) = π(n)P

At equilibrium:

π(n+1) = π(n) = π(n)P

so π(n) = π(n)P, and dropping the n term:

π = πP
Equilibrium Equations continued

Machine Breakdown example

The state probabilities must sum to 1, therefore:

π1 + π2 + … + πn = 1

In this example, then:

π1 + π2 = 1
Equilibrium Equations continued

Machine Breakdown example

Summarizing the equilibrium equations:

π1 = 0.8π1 + 0.1π2
π2 = 0.2π1 + 0.9π2
π1 + π2 = 1

Solving these simultaneous equations:

π1 = 0.333333
π2 = 0.666667

Therefore, in the long run the machine will be functioning 33.33% of the time and broken down 66.67% of the time.
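Rather than solving by hand, the same simultaneous equations can be passed to a linear solver. A sketch with NumPy, using the first equilibrium equation together with the normalization π1 + π2 = 1 (the second equilibrium equation is redundant):

```python
import numpy as np

# pi1 = 0.8*pi1 + 0.1*pi2   ->   -0.2*pi1 + 0.1*pi2 = 0
# pi1 + pi2 = 1                  (probabilities sum to 1)
A = np.array([[-0.2, 0.1],
              [ 1.0, 1.0]])
b = np.array([0.0, 1.0])

pi = np.linalg.solve(A, b)
# pi is approximately (0.333333, 0.666667)
```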
Absorbing States

Any state that, once entered, cannot be left is called an absorbing state.

Accounts Receivable example
The possible states are
Paid
Bad debt
Less than 1 month old debt
1 to 3 months old debt

The matrix of transition probabilities:

        Paid   Bad   <1    1-3
Paid    1      0     0     0
Bad     0      1     0     0
<1      0.6    0     0.2   0.2
1-3     0.4    0.1   0.3   0.2

Paid and Bad debt are absorbing states: once a debt is paid or written off as bad, it remains in that state.
Markov Process
Fundamental Matrix

Partition P into four submatrices:

P = | I  0 |      I = identity matrix (absorbing states)
    | A  B |      0 = matrix of zeros

For the accounts receivable example:

A = | 0.6  0   |      B = | 0.2  0.2 |
    | 0.4  0.1 |          | 0.3  0.2 |

Markov Process
Fundamental Matrix continued

With P partitioned this way, the fundamental matrix is

F = (I - B)^-1
Markov Process
Fundamental Matrix continued

Once the FA matrix is found, multiply it by the vector M of starting values for the non-absorbing states:

MFA

where

M = (M1, M2, M3, …, Mn)

The resulting vector indicates how many observations end up in the first absorbing state and the second absorbing state, respectively.