
Markov Chains

A stochastic process is a collection of random variables $\{X_n : n = 0, 1, 2, \ldots\}$ taking values in a set S, the state space, where the index n is usually thought of as time.

A stochastic process $\{X_n\}$ is a Markov chain if, for all n and all states, $P(X_{n+1} = j \mid X_0 = x_0, \ldots, X_{n-1} = x_{n-1}, X_n = k) = P(X_{n+1} = j \mid X_n = k)$; that is, given the present state, the future of the process is independent of its past (the Markov property).

For simplicity, theory and details are given for a discrete state space, S.

A Markov chain is stationary if the transition probabilities $P(X_{n+1} = j \mid X_n = k)$ do not depend on n, for all states j and k.

A stationary Markov chain is sometimes referred to as homogeneous in time since, by definition, the probability of moving between two states remains constant over time.

For a stationary Markov chain with discrete state space S, the transition matrix P is the matrix with entries $p_{jk} = P(X_{n+1} = j \mid X_n = k)$ for $j, k \in S$.

Note that this definition is independent of n (stationarity), that the entries of the matrix lie in $[0,1]$ (they are probabilities), and that $\sum_j p_{jk} = 1$, since the chain must move to some state, j. P is sometimes called a transition matrix, and the associated probabilities transition probabilities. It is also worth noting that the matrix $P^n$ is the matrix of n-step transition probabilities $p^{(n)}_{jk} = P(X_{m+n} = j \mid X_m = k)$.
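To make the notation concrete, the following is a minimal numerical sketch (the particular two-state matrix is invented for the illustration, not taken from the text). It follows the convention above, $p_{jk} = P(X_{n+1} = j \mid X_n = k)$, so each column of P sums to one.

    import numpy as np

    # Hypothetical two-state chain: P[j, k] = P(X_{n+1} = j | X_n = k),
    # so each *column* sums to one under the convention used above.
    P = np.array([[0.9, 0.3],
                  [0.1, 0.7]])
    assert np.allclose(P.sum(axis=0), 1.0)

    # P^n holds the n-step transition probabilities p^(n)_{jk}.
    P3 = np.linalg.matrix_power(P, 3)
    print(P3[0, 1])  # probability of being in state 0 three steps after state 1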

A Markov chain is irreducible if, for every pair of states $j, k \in S$, there exists some n such that $p^{(n)}_{jk} > 0$.

That is, there is a non-zero probability of going to state j from state k in n steps, for some n.

The period of a state j is $d(j) = \gcd\{n \ge 1 : p^{(n)}_{jj} > 0\}$. A state is aperiodic if its period is 1, and a chain is aperiodic if all of its states are.

A state j is recurrent if, starting from j, the chain returns to j with probability 1; it is positive recurrent if, in addition, the expected return time is finite.

A state is ergodic if it is aperiodic and positive recurrent. A chain is ergodic if all of its states are ergodic.

A probability distribution $\pi = (\pi_j : j \in S)$ is a stationary distribution of the chain if $\pi_j = \sum_k p_{jk} \pi_k$ for all $j \in S$; that is, a chain started from $\pi$ remains distributed according to $\pi$ at every subsequent time.

The stationary distribution is also referred to as the invariant distribution or equilibrium distribution of a Markov chain.

If a Markov chain is irreducible and ergodic, then it has a unique stationary distribution $\pi$, and $p^{(n)}_{jk} \to \pi_j$ as $n \to \infty$, for every starting state k.
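As an illustration of this convergence (a sketch only, reusing the invented two-state matrix from above), the stationary distribution can be found as the eigenvector of P with eigenvalue 1, and the columns of $P^n$ can be seen to approach it:

    import numpy as np

    P = np.array([[0.9, 0.3],
                  [0.1, 0.7]])

    # pi solves pi = P pi under the column convention above, i.e. it is
    # the (normalised) eigenvector of P corresponding to eigenvalue 1.
    vals, vecs = np.linalg.eig(P)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = pi / pi.sum()
    print(pi)                             # [0.75, 0.25] for this P

    # Every column of P^n approaches pi, whatever the starting state.
    print(np.linalg.matrix_power(P, 50))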

Recall the discussion above regarding stratified and importance sampling. If it were possible to construct a Markov chain that would visit each category the `correct' number of times, then this method could be used to sample from the distribution of interest. Here `correct' means that the equilibrium distribution of the Markov chain is the same as the distribution of interest. In a sense this reverses the theory above: there the chain was given and its equilibrium distribution deduced, whereas here the distribution of interest is known and a chain having it as equilibrium distribution must be constructed.

It is possible to do this under certain conditions, and there are a number of ways of doing so; of primary interest will be the Metropolis-Hastings approach.
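The algorithm itself is developed in the next section; purely as a preview, the sketch below shows the shape of a random-walk Metropolis sampler for a target density known only up to a normalising constant. The standard normal target and the step size are assumptions made for the illustration, not part of the text.

    import numpy as np

    rng = np.random.default_rng(0)

    def target(x):
        # Unnormalised density of interest (a standard normal here,
        # chosen only for illustration; any density known up to a
        # constant would do).
        return np.exp(-0.5 * x**2)

    def random_walk_metropolis(n_samples, step=1.0):
        # Symmetric normal proposal, so the Metropolis-Hastings
        # acceptance ratio reduces to target(y) / target(x).
        x = 0.0
        samples = np.empty(n_samples)
        for i in range(n_samples):
            y = x + step * rng.standard_normal()   # propose a move
            if rng.random() < target(y) / target(x):
                x = y                               # accept; otherwise stay at x
            samples[i] = x
        return samples

    draws = random_walk_metropolis(10_000)
    print(draws.mean(), draws.var())  # roughly 0 and 1

The equilibrium distribution of the chain so constructed is the target density, which is exactly the property required above.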


