Unit-5 (Markov Process)
MARKOV PROCESS

One of the interesting models of a random process is the Markov process (Markov model), in which the future of the process depends only on its present value and not on its past values. Mathematically, a random process {X(t)} is called a Markov process if, given the value of X(t), the value of X(v) for v > t does not depend on the values of X(w) for w < t.
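To make the definition concrete, here is a minimal simulation sketch. It is an addition to these notes, not part of the original text; the two states and transition probabilities are the dry/rainy-day values used in Example 7 below. Because each new state is drawn using only the current state, the simulated process satisfies the Markov property by construction.

```python
import random

# Two-state weather chain (cf. Example 7 below):
# state 0 = dry day, state 1 = rainy day.
# transitions[i][j] = probability of moving from state i to state j.
transitions = [
    [1/2, 1/2],   # from dry:  stay dry 1/2, turn rainy 1/2
    [1/3, 2/3],   # from rainy: turn dry 1/3, stay rainy 2/3
]

def step(state):
    """Draw the next state using only the current state (Markov property)."""
    return 0 if random.random() < transitions[state][0] else 1

def simulate(start, n_steps):
    """Return a sample path of the chain of length n_steps + 1."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate(start=0, n_steps=10))
```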
MARKOV CHAIN

A Markov process which takes only discrete values (states), whether t is discrete or continuous, is called a Markov chain. Formally, we define the Markov chain as follows: if

P{X_n = a_n | X_(n-1) = a_(n-1), X_(n-2) = a_(n-2), ..., X_0 = a_0} = P{X_n = a_n | X_(n-1) = a_(n-1)} for all n,

then the process {X_n} is called a Markov chain. Here a_1, a_2, ..., a_n, ... are called the states of the Markov chain.

ONE-STEP TRANSITION PROBABILITY

The conditional probability P{X_n = a_j | X_(n-1) = a_i} is called the one-step transition probability from state a_i to state a_j at the nth step (trial), and is denoted by p_ij(n-1, n).

HOMOGENEOUS MARKOV CHAIN

If the one-step transition probability does not depend on the step, that is, if p_ij(n-1, n) = p_ij(m-1, m), then the Markov chain is called a homogeneous Markov chain, and the chain is said to have stationary transition probabilities.

REMARK: Note that the use of the word "stationary" here does not imply a stationary random sequence.

TRANSITION PROBABILITY MATRIX (TPM)

When the Markov chain is homogeneous, the one-step transition probability is denoted by p_ij. The matrix P = [p_ij] is called the (one-step) transition probability matrix (tpm).

NOTE: The tpm of a Markov chain is a stochastic matrix, since all p_ij >= 0 and the sum of the elements of each row of the matrix is equal to 1.

n-STEP TRANSITION PROBABILITY

The conditional probability that the process is in state a_j at step n, given that it was in state a_i at step 0, i.e., P{X_n = a_j | X_0 = a_i}, is called the n-step transition probability and is denoted by p_ij^(n).

CHAPMAN-KOLMOGOROV THEOREM

If P is the transition probability matrix of a homogeneous Markov chain, then the n-step tpm P^(n) is equal to P^n, i.e., [p_ij^(n)] = [p_ij]^n.

REGULAR MATRIX

A stochastic matrix P is said to be a regular matrix if all the entries of P^m (for some positive integer m) are positive. If the transition probability matrix is regular, the homogeneous Markov chain is called regular.

CLASSIFICATION OF STATES OF A MARKOV CHAIN

Irreducible: A Markov chain is said to be irreducible if every state can be reached from every other state, i.e., if p_ij^(n) > 0 for some n and for all i and j. Note that the tpm of an irreducible chain is an irreducible matrix. Otherwise, the chain is said to be non-irreducible or reducible.

Return state: If p_ii^(n) > 0 for some n >= 1, then state i of the Markov chain is called a return state.

Period: Let i be a return state. Then the period d_i of state i is defined as d_i = GCD{m : p_ii^(m) > 0}, where GCD stands for the greatest common divisor.

NOTE: If d_i = 1, state i is said to be aperiodic; if d_i > 1, state i is said to be periodic with period d_i.

Recurrence time probability: The probability that the chain returns to state i for the first time at the nth step, starting from state i, is called the recurrence time probability or the first return time probability.

REMARK:
1. The first return time probability is denoted by f_ii^(n), and {f_ii^(n)}, n = 1, 2, 3, ..., is the distribution of the recurrence times of state i.
2. If F_ii = sum over n >= 1 of f_ii^(n) = 1, then the return to state i is certain.
3. mu_ii = sum over n >= 1 of n f_ii^(n) is called the mean recurrence time of state i.
4. State i is said to be transient if the return to state i is uncertain, i.e., F_ii < 1.
5. State i is said to be persistent or recurrent if the return to state i is certain, i.e., F_ii = 1.
6. State i is said to be non-null persistent if its mean recurrence time mu_ii is finite, and null persistent if mu_ii is infinite.
7. A non-null persistent and aperiodic state is called ergodic.
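The Chapman-Kolmogorov theorem turns n-step transition probabilities into a matrix power, which makes numerical checks easy. The following sketch is an addition to these notes (the helper names are ours, and the search cutoff in the regularity test is an arbitrary practical choice, not part of the definition). It computes P^(n) = P^n with numpy and tests regularity on the stochastic matrix of Example 5 below.

```python
import numpy as np

def n_step_tpm(P, n):
    """Chapman-Kolmogorov: the n-step tpm P(n) is the matrix power P^n."""
    return np.linalg.matrix_power(P, n)

def is_regular(P, max_power=50):
    """A stochastic matrix is regular if some power has all entries > 0.

    max_power is only a pragmatic search cutoff for this sketch.
    """
    Q = np.eye(P.shape[0])
    for _ in range(max_power):
        Q = Q @ P
        if np.all(Q > 0):
            return True
    return False

# The stochastic matrix of Example 5 below; regular because A^2 > 0.
A = np.array([[0.0, 1.0],
              [0.5, 0.5]])
print(n_step_tpm(A, 2))   # [[0.5  0.5 ], [0.25 0.75]]
print(is_regular(A))      # True
```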
CONSTRUCTING A TRANSITION PROBABILITY MATRIX FOR A MARKOV CHAIN (Random walk with reflecting barriers)

In this example we construct the tpm of a Markov chain. Assume that a travelling sales representative is at an integral point of the X-axis between the origin and the point x = 3. He takes a unit step either to the right, with probability 0.7, or to the left, with probability 0.3. If he is at the origin, going to the left is not possible, and if he is at x = 3, he cannot go right; only at the points x = 1 and x = 2 are both moves possible. The transition probability matrix (rows indexed by the states of X_(n-1), columns by the states of X_n) is

         0     1     2     3
  0 [    0     1     0     0  ]
  1 [  0.3     0   0.7     0  ]
  2 [    0   0.3     0   0.7  ]
  3 [    0     0     1     0  ]

REMARK: p_23, the entry of the above tpm in the row labelled 2 and the column labelled 3, is 0.7. This means that if the process is in state 2 at step n - 1, the probability that it moves to state 3 at step n is 0.7, where n is any positive integer.

DEFINITION: If the probability that the process is in state a_j is p_j (j = 1, 2, ..., k) at an arbitrary step, then the row vector p = (p_1, p_2, ..., p_k) is called the probability distribution of the process at that time. In particular, p^(0) = (p_1^(0), p_2^(0), ..., p_k^(0)) is the initial probability distribution.

Example: For the random walk above, assume the initial distribution of the chain is p^(0) = (1/4, 1/4, 1/4, 1/4), i.e.,

P{X_0 = i} = 1/4, i = 0, 1, 2, 3.   ... (1)

From the tpm we have P{X_1 = 2 | X_0 = 1} = 0.7 and P{X_2 = 1 | X_1 = 2} = 0.3. Hence

P{X_2 = 1, X_1 = 2 | X_0 = 1} = P{X_2 = 1 | X_1 = 2} x P{X_1 = 2 | X_0 = 1} = 0.3 x 0.7 = 0.21.   ... (2)

P{X_2 = 1, X_1 = 2, X_0 = 1} = P{X_0 = 1} x P{X_2 = 1, X_1 = 2 | X_0 = 1}   [P(A and B) = P(B) P(A | B)]
= (1/4)(0.21) = 0.0525, from (1) and (2).   ... (3)

Similarly,
P{X_3 = 3, X_2 = 1, X_1 = 2, X_0 = 1} = P{X_2 = 1, X_1 = 2, X_0 = 1} x P{X_3 = 3 | X_2 = 1, X_1 = 2, X_0 = 1}
= 0.0525 x P{X_3 = 3 | X_2 = 1}   [by the Markov property and (3)]
= 0.0525 x 0 = 0, since p_13 = 0 in the tpm.

EXAMPLE 1

The tpm of a Markov chain is

P = [ 0.2  0.3  0.5 ]
    [ 0.1  0.2  0.7 ]
    [ 0.6  0.3  0.1 ]

with initial probabilities p_1^(0) = 0.4, p_2^(0) = 0.3, p_3^(0) = 0.3. Find p^(1).

Solution: We must find p_1^(1), p_2^(1), p_3^(1), the entries of p^(1) = p^(0) P.

p^(1) = [0.4  0.3  0.3] P = [0.08 + 0.03 + 0.18,  0.12 + 0.06 + 0.09,  0.20 + 0.21 + 0.03] = [0.29  0.27  0.44]

Hence p_1^(1) = 0.29, p_2^(1) = 0.27, p_3^(1) = 0.44.

EXAMPLE 2

Assume a Markov chain on the states {0, 1, 2, 3} whose tpm has the following transition structure: from state 3 no other state is accessible (p_33 = 1); states 1 and 2 are accessible from one another; from state 0 every other state is accessible, but state 0 is not accessible from any other state. Find the essential states.

Solution: From the transition diagram, no other state is accessible from state 3 and p_33 = 1, so state 3 is an absorbing state. States 1 and 2 are accessible to one another. All other states are accessible from state 0, but no other state leads back to 0; thus 0 is not an essential state. Therefore there are no essential states.
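As a quick numerical cross-check of Example 1 (an added sketch, not part of the original notes), the one-step distribution is just the row-vector/matrix product p^(1) = p^(0) P:

```python
import numpy as np

# tpm and initial distribution from Example 1 above
P = np.array([[0.2, 0.3, 0.5],
              [0.1, 0.2, 0.7],
              [0.6, 0.3, 0.1]])
p0 = np.array([0.4, 0.3, 0.3])

p1 = p0 @ P     # distribution after one step
print(p1)       # [0.29 0.27 0.44]
print(p1.sum()) # ~1, so p1 is again a probability distribution
```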
EXAMPLE 3

Consider a Markov chain with state space {0, 1} and transition probability matrix

P = [ 0  1 ]
    [ 1  0 ]

Is the state 0 periodic? If so, what is the period?

Solution: From the transition diagram of the tpm, the chain moves from 0 to 1 and then back to 0, so a return to state 0 requires exactly 2 transitions; p_00^(n) > 0 only for n = 2, 4, 6, .... Hence d_0 = GCD{2, 4, 6, ...} = 2, and state 0 is periodic with period 2.

EXAMPLE 4

Consider the Markov chain with transition probability matrix

         0     1     2     3
  0 [  0.4   0.6     0     0  ]
  1 [  0.3   0.7     0     0  ]
  2 [  0.2   0.4   0.1   0.3  ]
  3 [    0     0     0     1  ]

Is it irreducible? If not, find the classes. Find the nature of the states.

Solution: Since no other state is accessible from state 3, the states do not all communicate with one another, so the chain is not irreducible (and hence not ergodic). The classes of the chain are {0, 1}, {2} and {3}. State 3 is an absorbing state, since no other state is accessible from it and p_33 = 1. States 0 and 1 form a closed communicating class of the finite chain, so they are non-null persistent, and since p_00 > 0 they are aperiodic. State 2 is transient: the chain can leave it for {0, 1} or {3} and then never return.

EXAMPLE 5

Let A = [ 0    1  ]
        [ 1/2  1/2 ]
be a stochastic matrix. Check whether it is regular. [A.U. June '06]

Solution: A is a stochastic matrix, since each row sum is 1 and all entries are non-negative. To prove that A is regular, we have to show that all the entries of some power of A are positive.

A^2 = [ 0    1  ] [ 0    1  ] = [ 1/2  1/2 ]
      [ 1/2  1/2 ] [ 1/2  1/2 ]   [ 1/4  3/4 ]

The sum of the entries in each row is 1 and all the entries are positive. Therefore A is a regular matrix.

EXAMPLE 6

Consider a finite, irreducible Markov chain on the states {1, 2, 3, 4} whose tpm P has p_44 = 1/4. Show that the chain is ergodic. [A.U. Dec '07]

Solution: Consider state 4. Since p_44^(1) = 1/4 > 0, a return to state 4 is possible in a single step, so d_4 = 1 and state 4 is aperiodic.

Next, compute the first return time probabilities f_44^(n) from the tpm. Summing them gives F_44 = sum over n >= 1 of f_44^(n) = 1, so the return to state 4 is certain and state 4 is persistent. The mean recurrence time mu_44 = sum over n >= 1 of n f_44^(n) also works out to a finite value, so state 4 is non-null persistent.

Being non-null persistent and aperiodic, state 4 is ergodic. Since the chain is finite and irreducible, the same argument applies to every state. Hence all the states are ergodic, and the given transition probability matrix is ergodic.

EXAMPLE 7

The probability of a dry day following a rainy day is 1/3, and the probability of a rainy day following a dry day is 1/2. Given that May 1st is a dry day, find the probability that May 3rd is a dry day, and also the probability that May 5th is a dry day.

Solution: Since the transition probability matrix is not given directly, first we form the tpm. The process {X_n} has two states, D (dry day) and R (rainy day). Since P(R | D) = 1/2 we have P(D | D) = 1/2, and since P(D | R) = 1/3 we have P(R | R) = 2/3. The tpm of the Markov chain is therefore

         D     R
  D [  1/2   1/2 ]
  R [  1/3   2/3 ]

Since May 1st is a dry day, the initial probability distribution is p^(0) = [1, 0].

May 3rd is two steps after May 1st, so the required probability is the first entry of p^(2):

p^(1) = p^(0) P = [1/2, 1/2]
p^(2) = p^(1) P = [1/2 x 1/2 + 1/2 x 1/3,  1/2 x 1/2 + 1/2 x 2/3] = [5/12, 7/12]

Therefore P{May 3rd is a dry day} = 5/12.

Similarly, May 5th is four steps after May 1st:

p^(3) = p^(2) P = [5/24 + 7/36,  5/24 + 7/18] = [29/72, 43/72]
p^(4) = p^(3) P = [29/144 + 43/216,  29/144 + 43/108] = [173/432, 259/432]

Therefore P{May 5th is a dry day} = 173/432.
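The fraction arithmetic in Example 7 can be verified exactly with Python's fractions module. The following sketch is an addition to these notes and simply replays the computation:

```python
from fractions import Fraction as F

# tpm of Example 7, states ordered (D, R)
P = [[F(1, 2), F(1, 2)],
     [F(1, 3), F(2, 3)]]

def step(p, P):
    """One step of the chain: the row-vector/matrix product p.P."""
    n = len(P)
    return [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]

p = [F(1), F(0)]           # May 1st is a dry day
for day in range(2, 6):    # distributions for May 2nd .. May 5th
    p = step(p, P)
    print(f"May {day}: P(dry) = {p[0]}")
# May 3: P(dry) = 5/12,  May 5: P(dry) = 173/432
```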
EXAMPLE 8

A raining process is considered as a two-state Markov chain: if it rains, the chain is in state 0, and if it does not rain, the chain is in state 1. The transition probability matrix of the chain is

P = [ 0.6  0.4 ]
    [ 0.2  0.8 ]

Find the probability that it will rain three days from today, given that it is raining today. Taking the initial probabilities of state 0 and state 1 to be 0.4 and 0.6 respectively, find also the unconditional probability that it will rain after three days.

Solution: Let {X_n} represent the raining process. The tpm, with rows indexed by the states of X_(n-1) and columns by the states of X_n, is

         0     1
  0 [  0.6   0.4 ]
  1 [  0.2   0.8 ]

and we are given P(X_0 = 0) = 0.4, P(X_0 = 1) = 0.6.   ... (A)

Now

P^2 = P x P = [ 0.36 + 0.08   0.24 + 0.32 ] = [ 0.44  0.56 ]
              [ 0.12 + 0.16   0.08 + 0.64 ]   [ 0.28  0.72 ]

P^3 = P^2 x P = [ 0.264 + 0.112   0.176 + 0.448 ] = [ 0.376  0.624 ]
                [ 0.168 + 0.144   0.112 + 0.576 ]   [ 0.312  0.688 ]

Since state 0 represents rain, the probability that it will rain on the third day, given that it is raining today, is p_00^(3) = 0.376.

The unconditional probability that it will rain after three days is

P[X_3 = 0] = sum over i of P(X_3 = 0 | X_0 = i) P(X_0 = i)
= p_00^(3) P(X_0 = 0) + p_10^(3) P(X_0 = 1)
= (0.376)(0.4) + (0.312)(0.6), from (A),
= 0.1504 + 0.1872 = 0.3376.

EXAMPLE 9

A fair die is tossed repeatedly. If X_n denotes the maximum of the numbers occurring in the first n tosses, find the transition probability matrix P of the Markov chain {X_n}. Find also P{X_2 = 6} and P^2.

Solution:
Step 1: To find the transition probability matrix. The state space is {1, 2, 3, 4, 5, 6}. The tpm is formed from the following observation. Let X_n be the maximum of the numbers occurring in the first n trials, and suppose X_n = 3, say. Then

X_(n+1) = 3 if the (n+1)th trial results in 1, 2 or 3,
X_(n+1) = 4 if the (n+1)th trial results in 4,
X_(n+1) = 5 if the (n+1)th trial results in 5,
X_(n+1) = 6 if the (n+1)th trial results in 6.

Therefore P{X_(n+1) = 3 | X_n = 3} = 3/6, and P{X_(n+1) = i | X_n = 3} = 1/6 for i = 4, 5, 6. Arguing the same way for every state, the tpm of the chain is

        1     2     3     4     5     6
  1 [ 1/6   1/6   1/6   1/6   1/6   1/6 ]
  2 [  0    2/6   1/6   1/6   1/6   1/6 ]
  3 [  0     0    3/6   1/6   1/6   1/6 ]
  4 [  0     0     0    4/6   1/6   1/6 ]
  5 [  0     0     0     0    5/6   1/6 ]
  6 [  0     0     0     0     0     1  ]

Step 2: To find P^2. Matrix multiplication gives

        1      2      3      4      5      6
  1 [ 1/36   3/36   5/36   7/36   9/36  11/36 ]
  2 [  0     4/36   5/36   7/36   9/36  11/36 ]
  3 [  0      0     9/36   7/36   9/36  11/36 ]
  4 [  0      0      0    16/36   9/36  11/36 ]
  5 [  0      0      0      0    25/36  11/36 ]
  6 [  0      0      0      0      0      1   ]

Step 3: Initial probability distribution. Since the values 1, 2, ..., 6 are equally likely on the first toss, p^(0) = (1/6, 1/6, 1/6, 1/6, 1/6, 1/6), i.e., P(X_0 = 1) = ... = P(X_0 = 6) = 1/6.

Step 4: To find P(X_2 = 6).

P{X_2 = 6} = sum over i of P{X_2 = 6 | X_0 = i} P{X_0 = i}
= (1/6)[p_16^(2) + p_26^(2) + p_36^(2) + p_46^(2) + p_56^(2) + p_66^(2)]

(note that p_16^(2) denotes the entry in the 1st row, 6th column of P^2)

= (1/6)[11/36 + 11/36 + 11/36 + 11/36 + 11/36 + 36/36] = (1/6)(91/36) = 91/216.

EXAMPLE 10

The transition probability matrix of a Markov chain {X_n}, n = 1, 2, 3, ..., having three states 1, 2 and 3, is

P = [ 0.1  0.5  0.4 ]
    [ 0.6  0.2  0.2 ]
    [ 0.3  0.4  0.3 ]

and the initial distribution is p^(0) = (0.7, 0.2, 0.1). Find (i) P{X_2 = 3} and (ii) P{X_3 = 2, X_2 = 3, X_1 = 3, X_0 = 2}. [A.U. Apr '07]

Solution: We are given P(X_0 = 1) = 0.7, P(X_0 = 2) = 0.2, P(X_0 = 3) = 0.1.

P^(2) = P^2 = P x P = [ 0.43  0.31  0.26 ]
                      [ 0.24  0.42  0.34 ]
                      [ 0.36  0.35  0.29 ]

Hence p_13^(2) = 0.26, p_23^(2) = 0.34, p_33^(2) = 0.29.

(i) P(X_2 = 3) = sum over i of P{X_2 = 3 | X_0 = i} P{X_0 = i}
= p_13^(2)(0.7) + p_23^(2)(0.2) + p_33^(2)(0.1)
= 0.26 x 0.7 + 0.34 x 0.2 + 0.29 x 0.1 = 0.182 + 0.068 + 0.029 = 0.279.

(ii) P{X_1 = 3, X_0 = 2} = P{X_1 = 3 | X_0 = 2} x P{X_0 = 2} = p_23 x 0.2 = 0.2 x 0.2 = 0.04.

P{X_2 = 3, X_1 = 3, X_0 = 2} = P{X_2 = 3 | X_1 = 3} x P{X_1 = 3, X_0 = 2}   [by the Markov property]
= p_33 x 0.04 = 0.3 x 0.04 = 0.012.

Therefore P{X_3 = 2, X_2 = 3, X_1 = 3, X_0 = 2} = P{X_3 = 2 | X_2 = 3} x 0.012 = p_32 x 0.012 = 0.4 x 0.012 = 0.0048.

EXAMPLE 11

The tpm of a Markov chain with three states 0, 1, 2 is

P = [ 3/4  1/4   0  ]
    [ 1/4  1/2  1/4 ]
    [  0   3/4  1/4 ]

and the initial state distribution of the chain is P(X_0 = i) = 1/3, i = 0, 1, 2. Find (i) P(X_2 = 2) and (ii) P(X_3 = 1, X_2 = 2, X_1 = 1, X_0 = 2).

Solution:
(i) P^2 = P x P = [ 10/16   5/16   1/16 ]
                  [  5/16   8/16   3/16 ]
                  [  3/16   9/16   4/16 ]

P(X_2 = 2) = sum over i of P{X_2 = 2 | X_0 = i} P{X_0 = i}
= p_02^(2)(1/3) + p_12^(2)(1/3) + p_22^(2)(1/3)
= (1/3)(1/16 + 3/16 + 4/16) = (1/3)(8/16) = 1/6.

(ii) P(X_3 = 1, X_2 = 2, X_1 = 1, X_0 = 2)
= P(X_3 = 1 | X_2 = 2) P(X_2 = 2 | X_1 = 1) P(X_1 = 1 | X_0 = 2) P(X_0 = 2)   [using the Markov property]
= p_21 p_12 p_21 (1/3) = (3/4)(1/4)(3/4)(1/3) = 3/64.

EXAMPLE 12

A gambler has Rs. 2. He bets Re. 1 at a time and wins Re. 1 with probability 1/2. He stops playing if he loses Rs. 2 or wins Rs. 4. (a) What is the tpm of the related Markov chain? (b) What is the probability that he has lost his money at the end of 5 plays? (c) What is the probability that the game lasts more than 7 plays? [A.U. Apr '04, Dec '06]

Solution: Let {X_n} represent the amount with the player at the end of the nth round of play. The state space of {X_n} is {0, 1, 2, 3, 4, 5, 6}, since the game is over either when the player loses all his money (X_n = 2 - 2 = 0) or when he wins Rs. 4, in which case he holds Rs. (2 + 4) = Rs. 6.

(a) The tpm of the Markov chain (states 0 to 6 in order) is

  0 [  1    0    0    0    0    0    0  ]
  1 [ 1/2   0   1/2   0    0    0    0  ]
  2 [  0   1/2   0   1/2   0    0    0  ]
  3 [  0    0   1/2   0   1/2   0    0  ]
  4 [  0    0    0   1/2   0   1/2   0  ]
  5 [  0    0    0    0   1/2   0   1/2 ]
  6 [  0    0    0    0    0    0    1  ]

Since the player starts with Rs. 2, the initial probability distribution of {X_n} is p^(0) = (0, 0, 1, 0, 0, 0, 0). Successively,

p^(1) = p^(0) P = (0, 1/2, 0, 1/2, 0, 0, 0)
p^(2) = p^(1) P = (1/4, 0, 1/2, 0, 1/4, 0, 0)
p^(3) = p^(2) P = (1/4, 1/4, 0, 3/8, 0, 1/8, 0)
p^(4) = p^(3) P = (3/8, 0, 5/16, 0, 1/4, 0, 1/16)
p^(5) = p^(4) P = (3/8, 5/32, 0, 9/32, 0, 1/8, 1/16)
p^(6) = p^(5) P = (29/64, 0, 7/32, 0, 13/64, 0, 1/8)
p^(7) = p^(6) P = (29/64, 7/64, 0, 27/128, 0, 13/128, 1/8)

(b) P{the man has lost his money at the end of 5 plays} = P{X_5 = 0} = the entry for state 0 in p^(5) = 3/8.

(c) P{the game lasts more than 7 plays} = P{the system is in neither state 0 nor state 6 at the end of the seventh round}
= P{X_7 = 1, 2, 3, 4 or 5}
= P(X_7 = 1) + P(X_7 = 2) + P(X_7 = 3) + P(X_7 = 4) + P(X_7 = 5)
= 7/64 + 27/128 + 13/128 = (14 + 27 + 13)/128 = 54/128 = 27/64.

EXAMPLE 13

Three girls G_1, G_2 and G_3 are throwing a ball to each other. G_1 always throws the ball to G_2, and G_2 always throws the ball to G_3, but G_3 is just as likely to throw the ball to G_2 as to G_1. Show that the process is Markovian. Find the transition matrix and classify the states. [A.U. Dec '05]

Solution: Let X_n denote the girl holding the ball after the nth throw. The transition probability matrix of the process {X_n} is

          G_1   G_2   G_3
  G_1 [    0     1     0  ]
  G_2 [    0     0     1  ]
  G_3 [  1/2   1/2     0  ]

(The first row is (0, 1, 0) because G_1 always throws the ball to G_2; the second row is (0, 0, 1) because G_2 always throws the ball to G_3; and since G_3 throws to G_1 and G_2 with equal likelihood, the third row is (1/2, 1/2, 0).)

Clearly, the state of X_n depends only on the state of X_(n-1) and not on the states of X_(n-2), X_(n-3), ... or earlier states. Therefore {X_n} is a Markov chain.

Classification of the states:

P^2 = P x P = [  0     0     1  ]          P^3 = P^2 x P = [ 1/2   1/2    0  ]
              [ 1/2   1/2    0  ]                          [  0    1/2   1/2 ]
              [  0    1/2   1/2 ]                          [ 1/4   1/4   1/2 ]

Between P, P^2 and P^3, every p_ij^(n) is positive for some n <= 3; therefore the chain is irreducible.

For the periods: p_22^(2) > 0 and p_22^(3) > 0, so d_2 = GCD{2, 3, ...} = 1 and state G_2 is aperiodic; similarly state G_3 is aperiodic, and p_11^(3) > 0 together with p_11^(5) > 0 gives d_1 = 1, so state G_1 is aperiodic as well. Since the chain is finite and irreducible, all its states are non-null persistent. Being non-null persistent and aperiodic, all the states are ergodic.

EXAMPLE 14

Consider the Markov chain with tpm

         0     1     2
  0 [    0     1     0  ]
  1 [  1/2     0   1/2  ]
  2 [    0     1     0  ]

Show that P^(2n) = P^2. Also find the nature of the states. [A.U. June '04]

Solution:

P^2 = [ 1/2    0   1/2 ]
      [  0     1    0  ]
      [ 1/2    0   1/2 ]

P^3 = P^2 x P = [  0    1    0  ] = P
                [ 1/2   0   1/2 ]
                [  0    1    0  ]

Hence P^4 = P^3 x P = P x P = P^2, and in general, since P^(n+1) = P^n P, the odd powers all equal P and the even powers all equal P^2. Thus P^(2n) = P^2 for n = 1, 2, 3, ....

Also p_00^(2) > 0, p_01^(1) > 0, p_02^(2) > 0, and similarly every state is accessible from every other state; therefore the Markov chain is irreducible. Since p_ii^(n) > 0 only for even n, all the states of the chain are periodic with period 2.

Conclusion: Since the chain is finite and irreducible, all its states are non-null persistent; but being periodic, the states are not ergodic.

EXAMPLE 15

A man goes to his office by car or by train every day. He never goes two days in a row by train; but if he drives one day, then the next day he is just as likely to drive again as he is to travel by train. Now suppose that on the first day of the week the man tossed a fair die and drove to work if and only if a "6" appeared. Find (i) the probability that he goes by train on the third day, and (ii) the probability that he goes by car to work in the long run. [A.U. Dec '04, Apr '07]

Solution: The travel pattern {X_n} is a Markov chain with state space {T, C} (train, car) and transition probability matrix

         T     C
  T [    0     1  ]
  C [  1/2   1/2  ]

The initial state probability distribution is p^(1) = (5/6, 1/6), since P(going by car) = P(getting a 6 on the die) = 1/6 and P(going by train) = 5/6.

(i) p^(2) = p^(1) P = (1/6 x 1/2,  5/6 + 1/6 x 1/2) = (1/12, 11/12)
    p^(3) = p^(2) P = (11/12 x 1/2,  1/12 + 11/12 x 1/2) = (11/24, 13/24)

Therefore P{the man goes by train on the third day} = 11/24.

(ii) Let pi = (pi_1, pi_2) be the stationary state distribution of the chain. By the property pi P = pi,

(pi_1, pi_2) [ 0  1; 1/2  1/2 ] = (pi_1, pi_2)

which gives (1/2) pi_2 = pi_1, i.e., pi_2 = 2 pi_1 (the second equation is equivalent). Since pi is a distribution, pi_1 + pi_2 = 1; substituting pi_2 = 2 pi_1 gives 3 pi_1 = 1, so pi_1 = 1/3 and pi_2 = 2/3.

Therefore P{the man goes by car in the long run} = pi_2 = 2/3.

EXAMPLE 16

In Reliance Super Market there are two brands of rice, R and S. A customer buys brand R with probability 0.7 if his last purchase was R, and buys brand S with probability 0.4 if his last purchase was S. Assuming a Markov chain model, obtain (i) the one-step tpm P, (ii) the 2-step tpm P^2, and (iii) the stationary distribution, and hence the proportion of customers who would buy brand R and brand S in the long run.

Solution: Let the process {X_n} have the two states brand R and brand S.

(i) The transition probability matrix of the process {X_n} is

         R     S
  R [  0.7   0.3 ]
  S [  0.6   0.4 ]

(If the last purchase was S, the customer buys S again with probability 0.4, and hence switches to R with probability 0.6.)

(ii) By the Chapman-Kolmogorov theorem, P^(n) = P^n, so the 2-step tpm is

P^(2) = P^2 = [ 0.49 + 0.18   0.21 + 0.12 ] = [ 0.67  0.33 ]
              [ 0.42 + 0.24   0.18 + 0.16 ]   [ 0.66  0.34 ]

(iii) The stationary distribution: let pi = (pi_1, pi_2) with pi P = pi. Then

0.7 pi_1 + 0.6 pi_2 = pi_1  =>  0.6 pi_2 = 0.3 pi_1  =>  pi_1 = 2 pi_2.   ... (1)

Since pi is a stationary distribution, pi_1 + pi_2 = 1.   ... (2)

Substituting (1) in (2): 3 pi_2 = 1, so pi_2 = 1/3 and pi_1 = 2/3.

Therefore, in the long run, out of every 3 customers, 2 will buy brand R and 1 will buy brand S.

EXAMPLE 17

Rahul's work habits are as follows. If he works one night, he is 70% sure not to work the next night. On the other hand, the probability that he does not work two nights in succession is 0.6. In the long run, how often does he work?

Solution: The process {X_n} has two states, W (works) and N (does not work). The transition probability matrix is

         W     N
  W [  0.3   0.7 ]
  N [  0.4   0.6 ]

Let pi = (pi_1, pi_2) be the stationary state distribution of the chain. From pi P = pi,

0.3 pi_1 + 0.4 pi_2 = pi_1  =>  0.4 pi_2 = 0.7 pi_1  =>  pi_2 = (7/4) pi_1.

With pi_1 + pi_2 = 1, this gives pi_1 + (7/4) pi_1 = 1, so pi_1 = 4/11 and pi_2 = 7/11.

CONCLUSION: In the long run, Rahul works 4/11 of the nights, i.e., roughly 36% of the time.

EXAMPLE 20

A salesman's territory consists of three cities, A, B and C. He never sells in the same city on successive days. If he sells in city A, then the next day he sells in city B. However, if he sells in either B or C, then the next day he is twice as likely to sell in city A as in the other city. In the long run, how often does he sell in each of the cities?

Solution: Formation of the tpm: let {X_n} represent the salesman's sales pattern. The process {X_n} has three states A, B and C, and its transition probability matrix is

         A     B     C
  A [    0     1     0  ]
  B [  2/3     0   1/3  ]
  C [  2/3   1/3     0  ]

The stationary state distribution: let pi = (pi_1, pi_2, pi_3) with pi P = pi. Componentwise,

(2/3) pi_2 + (2/3) pi_3 = pi_1   ... (1)
pi_1 + (1/3) pi_3 = pi_2         ... (2)
(1/3) pi_2 = pi_3                ... (3)

From (3), pi_2 = 3 pi_3.   ... (4)
Substituting (4) in (2): 3 pi_1 + pi_3 = 3 pi_2 = 9 pi_3, so 3 pi_1 = 8 pi_3, i.e., pi_1 = (8/3) pi_3.   ... (5)

Since pi is a stationary distribution, pi_1 + pi_2 + pi_3 = 1. Substituting (4) and (5):

(8/3) pi_3 + 3 pi_3 + pi_3 = 1  =>  (20/3) pi_3 = 1  =>  pi_3 = 3/20,

and hence pi_2 = 9/20 and pi_1 = 8/20. The steady-state probability distribution is

pi = (8/20, 9/20, 3/20) = (0.40, 0.45, 0.15).

Thus, in the long run, he sells 40% of the time in city A, 45% of the time in city B and 15% of the time in city C.

EXAMPLE 21

On a given day, a retired professor, Dr. Charles Fish, amuses himself with only one of the following activities: reading (activity 1), gardening (activity 2) or working on his book about a river valley (activity 3). For 1 <= i, j <= 3, the probability that he does activity j tomorrow, given that he does activity i today, is the (i, j) entry of

P = [ 0.30  0.25  0.45 ]
    [ 0.40  0.10  0.50 ]
    [ 0.25  0.40  0.35 ]

In the long run, what proportion of his days does he devote to each activity?

Solution: Let pi = (pi_1, pi_2, pi_3) be the stationary distribution, so that pi P = pi and pi_1 + pi_2 + pi_3 = 1. Using pi_3 = 1 - pi_1 - pi_2, the first stationary equation, 0.30 pi_1 + 0.40 pi_2 + 0.25 pi_3 = pi_1, becomes

-0.95 pi_1 + 0.15 pi_2 = -0.25,   ... (6)

and the second, 0.25 pi_1 + 0.10 pi_2 + 0.40 pi_3 = pi_2, becomes

0.25 pi_1 + 0.10 pi_2 + 0.40 - 0.40 pi_1 - 0.40 pi_2 = pi_2, i.e., -0.15 pi_1 - 1.30 pi_2 = -0.40.   ... (7)

Taking (6) x 26 + (7) x 3 (to eliminate pi_2):

[-0.95 x 26 - 0.15 x 3] pi_1 = -0.25 x 26 - 0.40 x 3
[-24.70 - 0.45] pi_1 = -6.5 - 1.20 = -7.70
pi_1 = 7.70 / 25.15 = 0.306.

Substituting pi_1 in (6): -0.95(0.306) + 0.15 pi_2 = -0.25, so 0.15 pi_2 = 0.04 and pi_2 = 0.267. Finally,

pi_3 = 1 - pi_1 - pi_2 = 1 - 0.306 - 0.267 = 1 - 0.573 = 0.427.

CONCLUSION: Dr. Charles devotes approximately 30% of his days to reading, 27% of his days to gardening and 43% of his days to writing.

EXAMPLE 22

There are 2 white balls in bag A and 3 red balls in bag B. At each step of the process, a ball is selected at random from each bag, and the two balls selected are interchanged. Let the state a_i of the system be the number of red balls in bag A after i interchanges. What is the probability that there are 2 red balls in bag A after 3 steps? In the long run, what is the probability that there are 2 red balls in bag A? [A.U. June '07]

Solution: Formation of the tpm: the state space of the chain {X_n} is {0, 1, 2}, since the number of balls in bag A is always 2. Let the transition probability matrix of the chain be P = [p_ij], i, j = 0, 1, 2.

p_00 = 0, since from state 0 (A all white, B all red) the interchange necessarily moves one red ball into A. Likewise p_02 = p_20 = 0, since a single interchange cannot change the number of red balls in A by 2.

Let X_n = 1, i.e., A contains 1 red and 1 white ball, and B contains 1 white and 2 red balls. Then

p_10 = P{X_(n+1) = 0 | X_n = 1} = P{red from A} x P{white from B} = (1/2)(1/3) = 1/6,
p_12 = P{white from A} x P{red from B} = (1/2)(2/3) = 1/3,

and since P is a stochastic matrix, p_11 = 1 - (1/6 + 1/3) = 1/2. Similarly, from state 2 (A all red; B holds 1 red and 2 white), p_21 = 1 x (2/3) = 2/3 and p_22 = 1 - p_21 = 1/3. Hence

P = [  0     1     0  ]
    [ 1/6   1/2   1/3 ]
    [  0    2/3   1/3 ]

Now p^(0) = (1, 0, 0), as there is no red ball in A in the beginning, and

p^(1) = p^(0) P = (0, 1, 0)
p^(2) = p^(1) P = (1/6, 1/2, 1/3)
p^(3) = p^(2) P = (1/12, 23/36, 5/18)

Therefore P{there are 2 red balls in bag A after 3 steps} = P{X_3 = 2} = 5/18.

For the long run, let pi = (pi_0, pi_1, pi_2) be the stationary probability distribution of the chain, so that pi P = pi and pi_0 + pi_1 + pi_2 = 1. Componentwise,

(1/6) pi_1 = pi_0,
pi_0 + (1/2) pi_1 + (2/3) pi_2 = pi_1,
(1/3) pi_1 + (1/3) pi_2 = pi_2.

From the first equation, pi_1 = 6 pi_0; from the third, pi_2 = (1/2) pi_1 = 3 pi_0. Then pi_0 + 6 pi_0 + 3 pi_0 = 1 gives pi_0 = 1/10, pi_1 = 6/10, pi_2 = 3/10.

CONCLUSION: P{there are 2 red balls in bag A in the long run} = pi_2 = 3/10 = 0.3.

EXERCISES

1. For a two-state Markov chain with a given transition matrix P and initial distribution p^(0) = (0.7, 0.3), find p^(1), p^(4) and p^(8).

2. Consider a Markov chain with transition probability matrix
   [ 0.5  0.4  0.1 ]
   [ 0.3  0.4  0.3 ]
   [ 0.2  0.3  0.5 ]
   Find the steady-state probabilities of the system.

3. Does the transition matrix
   [  1     0     0     0   ]
   [ 0.5    0    0.5    0   ]
   [ 0.25  0.25  0.25  0.25 ]
   [  0     0     0     1   ]
   correspond to an absorbing Markov chain?

4. Consider the transition probability matrix of a Markov chain
   [ 0.5   0.5    0     0     0  ]
   [ 0.5   0.5    0     0     0  ]
   [  0     0    0.5   0.5    0  ]
   [  0     0    0.5   0.5    0  ]
   [ 0.25  0.25   0     0    0.5 ]
   Find the transient and recurrent states. Is the chain irreducible?

5. It is observed that when a person purchases a particular brand of toothpaste, he may buy the same brand or another brand on his next purchase. The transition probability matrix relating the three brands, with his present and immediately following purchase, is
   [ 0.7  0.2  0.1 ]
   [ 0.3  0.4  0.3 ]
   [ 0.3  0.3  0.4 ]
   If the initial distribution of purchases of the brands is (0.3, 0.2, 0.5), what will be the distribution of the brands after two purchases?

6. Find the steady-state probabilities of the Markov chain with transition probability matrix
   [ 0.45  0.48  0.07 ]
   [ 0.05  0.70  0.25 ]
   [ 0.01  0.50  0.49 ]

7. A three-state Markov chain is given by the tpm
   [  0    2/3   1/3 ]
   [ 1/2    0    1/2 ]
   [ 1/2   1/2    0  ]
   Prove that the chain is irreducible and that all the states are aperiodic and non-null persistent. Find also the steady-state distribution of the chain.

8. A gambler's luck follows a pattern. If he wins a game, the probability of his winning the next game is 0.6. However, if he loses a game, the probability of his losing the next game is 0.7. There is an even chance that the gambler wins the first game. What is the probability that he wins (i) the second game, (ii) the third game, (iii) in the long run?
   [Ans. 9/20, 87/200, 3/7]

9. A man tosses a fair coin until 3 heads occur in a row. Let X_n denote the longest string of heads ending at the nth trial (so X_n = k means the last tail occurred at the (n - k)th trial). Show that the process is Markovian. Find the transition matrix and classify the states.
   [Ans. P (states 0, 1, 2, 3) =
     [ 1/2  1/2   0    0  ]
     [ 1/2   0   1/2   0  ]
     [ 1/2   0    0   1/2 ]
     [  0    0    0    1  ]
   The chain is not irreducible. State 3 is absorbing and the other states are aperiodic.]

10. Two boys B_1, B_2 and two girls G_1, G_2 are throwing a ball from one to another. Each boy throws the ball to the other boy with probability 1/2 and to each girl with probability 1/4. On the other hand, each girl throws the ball to each boy with probability 1/2 and never to the other girl. In the long run, how often does each receive the ball?
   [Ans. 1/3, 1/3, 1/6, 1/6]
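Several of the worked examples and exercises above (Examples 15 to 17 and 20 to 22, and Exercises 2, 6 and 7) come down to solving the stationary equations pi P = pi together with the normalisation pi_1 + ... + pi_k = 1. The sketch below is an addition to these notes (the helper name is our own choice): it solves that linear system with numpy and reproduces the long-run distribution of Example 20.

```python
import numpy as np

def stationary_distribution(P):
    """Solve pi @ P = pi with sum(pi) = 1 as a least-squares linear system."""
    n = P.shape[0]
    # Stack the equations (P^T - I) pi = 0 with the row 1^T pi = 1.
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# tpm of Example 20 (salesman's cities A, B, C)
P = np.array([[0.0, 1.0, 0.0],
              [2/3, 0.0, 1/3],
              [2/3, 1/3, 0.0]])
print(stationary_distribution(P))   # [0.4  0.45 0.15]
```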