(a) For a Markov chain on the state space {1, 2, …}, give the definition of a stationary distribution.

(b) Suppose π denotes the stationary distribution associated with a transition matrix P. Show that πP² = π.

(c) Calculate the stationary distribution for the Markov chain {X_n} on state space {1, 2, 3} with the following transition matrix:

    P = ( 1/4  1/2  1/4
          1/4  1/4  1/2
          0    1/2  1/2 ).

(d) Suppose X_0 = 1, and the chain is run for a very long period of time. Approximately what proportion of time would you expect this chain to be in state 3? How about if X_0 = 2?


Detailed Answer
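As a numerical sanity check on part (c), one can find the stationary distribution by computing the left eigenvector of P for eigenvalue 1 (equivalently, the eigenvector of Pᵀ) and normalizing it to sum to 1. The sketch below assumes NumPy is available; solving πP = π by hand gives π = (2/15, 2/5, 7/15), which the code should reproduce.

```python
import numpy as np

# Transition matrix from the problem statement.
P = np.array([
    [1/4, 1/2, 1/4],
    [1/4, 1/4, 1/2],
    [0,   1/2, 1/2],
])

# A stationary distribution satisfies pi P = pi with sum(pi) = 1,
# i.e. pi is a left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1))   # pick the eigenvalue closest to 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                     # normalize to a probability vector

print(pi)  # expected: [2/15, 2/5, 7/15] ≈ [0.1333, 0.4000, 0.4667]

# Check that pi is also stationary for P^2, as part (b) asserts.
print(np.allclose(pi @ P @ P, pi))
```

Since this chain is irreducible and aperiodic, the long-run proportion of time spent in state 3 equals π₃ = 7/15 regardless of the starting state, which is relevant to part (d).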