lecturemania success maximizer
Alp Giray Özen | 0533 549 91 08 | [email protected] | www.lecturemania.com

16.2-1. Assume that the probability of rain tomorrow is 0.5 if it is raining today, and assume that the probability of its being clear (no rain) tomorrow is 0.9 if it is clear today. Also assume that these probabilities do not change if information is also provided about the weather before today.
(a) Explain why the stated assumptions imply that the Markovian property holds for the evolution of the weather.
(b) Formulate the evolution of the weather as a Markov chain by defining its states and giving its (one-step) transition matrix.

16.2-2. Consider the second version of the stock market model presented as an example in Sec. 16.2. Whether the stock goes up tomorrow depends upon whether it increased today and yesterday. If the stock increased today and yesterday, it will increase tomorrow with probability α1. If the stock increased today and decreased yesterday, it will increase tomorrow with probability α2. If the stock decreased today and increased yesterday, it will increase tomorrow with probability α3. Finally, if the stock decreased today and yesterday, it will increase tomorrow with probability α4.
(a) Construct the (one-step) transition matrix of the Markov chain.
(b) Explain why the states used for this Markov chain cause the mathematical definition of the Markovian property to hold even though what happens in the future (tomorrow) depends upon what happened in the past (yesterday) as well as the present (today).

16.3-3. A particle moves on a circle through points that have been marked 0, 1, 2, 3, 4 (in a clockwise order). The particle starts at point 0. At each step it has probability 0.5 of moving one point clockwise (0 follows 4) and 0.5 of moving one point counterclockwise. Let Xn (n ≥ 0) denote its location on the circle after step n. {Xn} is a Markov chain.
(a) Construct the (one-step) transition matrix.
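A minimal worked sketch for 16.2-1, taking the two states as 0 = rain and 1 = clear (the labels are a choice, not part of the problem):
(a) The stated assumptions say that the conditional probability of tomorrow's weather given today's weather is unchanged when information about earlier days is added, i.e. P{X(t+1) = j | X(0), ..., X(t)} = P{X(t+1) = j | X(t)}, which is exactly the Markovian property.
(b) With the numbers as stated, the one-step transition matrix is

         State     0      1
P =        0      0.5    0.5
           1      0.1    0.9

For 16.2-2, a sketch assuming the usual state definitions for this model: state 0 = increased today and yesterday, state 1 = increased today, decreased yesterday, state 2 = decreased today, increased yesterday, state 3 = decreased today and yesterday.
(a) Tomorrow's state pairs tomorrow's movement with today's, so for example state 0 can only move to state 0 (an increase again) or state 2 (a decrease after today's increase):

         State     0       1       2       3
P =        0       α1      0     1−α1      0
           1       α2      0     1−α2      0
           2       0       α3      0     1−α3
           3       0       α4      0     1−α4

(b) Each state records both today's and yesterday's movement, so the conditional distribution of tomorrow's state given the whole history depends only on the current state; the dependence on yesterday has been absorbed into the state definition, and the mathematical Markovian property holds.

For 16.3-3, a sketch with the step probabilities as stated (0.5 clockwise, 0.5 counterclockwise): from point i the particle moves to i+1 (mod 5) or to i−1 (mod 5), each with probability 0.5.
(a)

         State     0      1      2      3      4
P =        0       0     0.5     0      0     0.5
           1      0.5     0     0.5     0      0
           2       0     0.5     0     0.5     0
           3       0      0     0.5     0     0.5
           4      0.5     0      0     0.5     0

Since every column of P also sums to 1 (the chain is doubly stochastic), its steady-state distribution is uniform: each of the five points has long-run probability 1/5 = 0.2.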
16.4-3. Given the following (one-step) transition matrix of a Markov chain, determine the classes of the Markov chain and whether they are recurrent.

(one-step transition matrix P over states 0, 1, 2, 3, 4)

16.4-4. Determine the period of each of the states in the Markov chain that has the following (one-step) transition matrix.

(one-step transition matrix P over states 0, 1, 2, 3, 4, 5)

16.5-4. The leading brewery on the West Coast (labeled A) has hired an OR analyst to analyze its market position. It is particularly concerned about its major competitor (labeled B). The analyst believes that brand switching can be modeled as a Markov chain using three states, with states A and B representing customers drinking beer produced from the aforementioned breweries and state C representing all other brands. Data are taken monthly, and the analyst has constructed the following (one-step) transition matrix from past data.

         A      B      C
A       0.7    0.2    0.1
B       0.2    0.75   0.05
C       0.1    0.1    0.8

What are the steady-state market shares for the two major breweries?

16.5-7. Consider the inventory example introduced in Sec. 16.1, but with the following change in the ordering policy. If the number of cameras on hand at the end of each week is 0 or 1, two additional cameras will be ordered. Otherwise, no ordering will take place. Assume that the storage costs are the same as given in the second subsection of Sec. 16.5.
(a) Find the steady-state probabilities of the state of this Markov chain.
(b) Find the long-run expected average storage cost per week.
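For 16.4-3 and 16.4-4 the answers follow directly from the transition graph of the matrix in question: states i and j belong to the same class when each can be reached from the other with positive probability in some number of steps; for a finite chain a class is recurrent exactly when it is closed (no one-step transition leads outside it), and transient otherwise; and the period of a state is the greatest common divisor of all n ≥ 1 with pii(n) > 0, shared by every state in its class. The helper below is a small sketch that applies this recipe to any transition matrix entered as a list of rows; the function name classes_recurrence_periods and the BFS-level method used for the period are choices of this write-up, not anything prescribed by the problems.

import math
from collections import deque
import numpy as np

def classes_recurrence_periods(P):
    """Communicating classes, recurrence (finite chains only), and periods."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]

    # Reachability in zero or more steps (Warshall closure of the transition graph).
    reach = (P > 0) | np.eye(n, dtype=bool)
    for k in range(n):
        reach |= reach[:, [k]] & reach[[k], :]

    # Communicating classes: i ~ j iff each is reachable from the other.
    comm = reach & reach.T
    seen, classes = set(), []
    for i in range(n):
        if i not in seen:
            cls = [j for j in range(n) if comm[i, j]]
            seen.update(cls)
            classes.append(cls)

    results = []
    for cls in classes:
        inside = set(cls)
        # A class of a finite chain is recurrent iff it is closed.
        closed = all(P[i, j] == 0 for i in cls for j in range(n) if j not in inside)
        # Period via BFS levels: gcd of |level(u) + 1 - level(v)| over edges inside the class.
        level = {cls[0]: 0}
        queue = deque([cls[0]])
        while queue:
            u = queue.popleft()
            for v in cls:
                if P[u, v] > 0 and v not in level:
                    level[v] = level[u] + 1
                    queue.append(v)
        g = 0
        for u in cls:
            for v in cls:
                if P[u, v] > 0:
                    g = math.gcd(g, abs(level[u] + 1 - level[v]))
        # g = 0 only for a transient single state that is never revisited.
        results.append((cls, "recurrent" if closed else "transient", g))
    return results

Calling classes_recurrence_periods(P) on the 5-state matrix of 16.4-3 (once its entries are filled in) lists each class together with "recurrent" or "transient"; applied to the 6-state matrix of 16.4-4 it also reports each class's period.

For 16.5-4, writing π = (πA, πB, πC) for the steady-state distribution and using the matrix as printed above, the balance equations π = πP together with πA + πB + πC = 1 read

0.30 πA = 0.20 πB + 0.10 πC
0.25 πB = 0.20 πA + 0.10 πC
0.20 πC = 0.10 πA + 0.05 πB

From the first equation πC = 3πA − 2πB; substituting into the third gives πB = (10/9)πA and hence πC = (7/9)πA, so normalization yields πA = 9/26 ≈ 0.346, πB = 10/26 ≈ 0.385, πC = 7/26 ≈ 0.269. The two major breweries therefore hold steady-state shares of roughly 34.6 percent (A) and 38.5 percent (B). A quick numerical check of the same calculation (a sketch; it replaces one balance equation by the normalization condition):

import numpy as np

P = np.array([[0.70, 0.20, 0.10],
              [0.20, 0.75, 0.05],
              [0.10, 0.10, 0.80]])

n = P.shape[0]
A = np.vstack([(P.T - np.eye(n))[:-1], np.ones(n)])   # two balance equations + normalization
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print(pi)   # approximately [0.3462, 0.3846, 0.2692]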
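For 16.5-7, a sketch assuming, as in the Sec. 16.1 example, Poisson demand with mean 1 per week, so that P{D = 0} = 0.368, P{D = 1} = 0.368, P{D = 2} = 0.184, P{D ≥ 3} = 0.080, and letting the state Xn = 0, 1, 2, 3 be the number of cameras on hand at the end of week n. Under the modified policy a week that ends with 0 cameras starts the next week with 2 (two are ordered), a week that ends with 1 starts with 3, and weeks ending with 2 or 3 start with that same stock, so rows 0 and 2 of the transition matrix coincide, as do rows 1 and 3:

         State     0       1       2       3
P =        0     0.264   0.368   0.368     0
           1     0.080   0.184   0.368   0.368
           2     0.264   0.368   0.368     0
           3     0.080   0.184   0.368   0.368

(a) Solving πP = π with π0 + π1 + π2 + π3 = 1 gives approximately π0 ≈ 0.181, π1 ≈ 0.285, π2 ≈ 0.368, π3 ≈ 0.166.
(b) With C(j) denoting the weekly storage cost charged when j cameras are in stock (the figures given in the second subsection of Sec. 16.5, not reproduced here), the long-run expected average storage cost per week is Σj πj C(j) ≈ 0.181 C(0) + 0.285 C(1) + 0.368 C(2) + 0.166 C(3).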