MCMC Sampling - Part I

Markov Chain Monte Carlo (MCMC) methods are widely used in Bayesian inference to obtain samples from complex distributions. Today, let us examine two of the most commonly used MCMC methods:

- the Metropolis algorithm
- Gibbs sampling

Before we discuss the two methods, let us start by examining Markov chains. A first-order Markov process is a process in which the transition in the state space depends only on the current state. In simple words, it is "memoryless":

P(X_{t+1} | X_0, X_1, ..., X_t) = P(X_{t+1} | X_t)

A Markov chain is irreducible if it is possible to get from any state to any other state. In other words, all states communicate with each other: for any pair of states i, j there is some number of steps n for which the n-step transition probability p_{i,j}^{(n)} > 0. A chain is aperiodic if there are no cycles, i.e., returns to a state are not restricted to multiples of some period greater than one. A state i is transient, as opposed to recurrent, if there is a non-zero probability that upon starting from state i we will never return to it. A chain is ergodic if it is aperiodic and positive recurrent.
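The properties above can be illustrated with a small simulation. The sketch below (using NumPy; the 3-state transition matrix is an arbitrary example of my own, not taken from any particular model) simulates a memoryless chain and checks that, because every entry of the matrix is positive (so the chain is irreducible and aperiodic, hence ergodic), the long-run state frequencies converge to the stationary distribution:

```python
import numpy as np

# Example transition matrix for a 3-state Markov chain (each row sums to 1).
# All entries are positive, so the chain is irreducible and aperiodic.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

def simulate(P, n_steps, start=0, seed=0):
    """Simulate the chain; each step uses only the current state (memoryless)."""
    rng = np.random.default_rng(seed)
    states = np.empty(n_steps, dtype=int)
    state = start
    for t in range(n_steps):
        # Sample the next state from row P[state] alone -- no history involved.
        state = rng.choice(len(P), p=P[state])
        states[t] = state
    return states

# The stationary distribution pi solves pi = pi P; compute it as the
# eigenvector of P^T with eigenvalue 1, normalized to sum to 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

# For an ergodic chain, empirical state frequencies approach pi.
samples = simulate(P, 100_000)
freq = np.bincount(samples, minlength=3) / len(samples)
print("stationary:", pi)
print("empirical: ", freq)
```

This convergence of visit frequencies to a fixed distribution is exactly what MCMC relies on: the sampling algorithms construct a chain whose stationary distribution is the target distribution we want samples from.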