Problems in Probability

Fall 2011 Strategic Practice 11: Section 3 (Markov Chains) - Question 1
Let X_0, X_1, X_2, . . . be a Markov chain, and define Y_n = X_{2n}. Show that Y_0, Y_1, Y_2, . . . is also a Markov chain, and explain why this makes sense intuitively.
Solution (see the iTunes course for full detailed solutions): By the definition of a Markov chain, X_{2n+1}, X_{2n+2}, . . . ("the future", if we define the "present" to be time 2n) is conditionally independent of X_0, X_1, . . . , X_{2n-1} ("the past") given X_{2n}. In particular, Y_{n+1} = X_{2n+2} depends on Y_0, Y_1, . . . , Y_n only through Y_n = X_{2n}, so the Y-chain satisfies the Markov property. To find the transition probabilities of the Y-chain, let Q be the transition matrix of the X-chain (labeling the states as 1, 2, . . . , M). Then Q^2 gives the "two-step" transition probabilities of the X-chain, and since one-step transitions of the Y-chain correspond to two-step transitions of the X-chain, the Y-chain has transition matrix Q^2. Intuitively, watching the chain only at even times throws away no information relevant to predicting the next even-time state, beyond the current one.
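As a quick numerical sanity check of the Q^2 claim, here is a minimal sketch using a hypothetical 3-state transition matrix (the matrix entries are made up for illustration): it simulates the X-chain, subsamples every other step to get the Y-chain, and compares the empirical one-step transition frequencies of Y against Q^2.

```python
import numpy as np

# Hypothetical transition matrix Q for a 3-state X-chain (each row sums to 1).
Q = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# Claimed one-step transition matrix of the Y-chain (Y_n = X_{2n}) is Q^2.
Q2 = Q @ Q
assert np.allclose(Q2.sum(axis=1), 1.0)  # Q^2 is itself a valid transition matrix

# Simulate the X-chain, then take every other state to form the Y-chain.
rng = np.random.default_rng(0)
n_steps = 200_000
x = np.empty(n_steps, dtype=int)
x[0] = 0
for t in range(1, n_steps):
    x[t] = rng.choice(3, p=Q[x[t - 1]])
y = x[::2]

# Empirical one-step transition frequencies of the Y-chain.
counts = np.zeros((3, 3))
for a, b in zip(y[:-1], y[1:]):
    counts[a, b] += 1
est = counts / counts.sum(axis=1, keepdims=True)

# The estimate should be close to Q^2, with error shrinking as n_steps grows.
print(np.max(np.abs(est - Q2)))
```

The simulation only demonstrates the result for one particular Q; the proof above is what establishes it in general.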
"Mathematics is the logic of certainty, but statistics is the logic of uncertainty."
Copyright © 2011 Stat 110 Harvard. Website layout by former Stat110'er.