AIMA - Artificial Intelligence: A Modern Approach

AIMA-exercises is an open-source community of students, instructors, and developers. Anyone can add an exercise, suggest answers to existing questions, or simply help us improve the platform. We accept contributions on this GitHub repository.

Exercise 14.23 [MH-exercise]

The Metropolis–Hastings algorithm is a member of the MCMC family; as such, it is designed to generate samples $\textbf{x}$ (eventually) according to target probabilities $\pi(\textbf{x})$. (Typically we are interested in sampling from $\pi(\textbf{x}) = P(\textbf{x}\mid\textbf{e})$.) Like simulated annealing, Metropolis–Hastings operates in two stages. First, it samples a new state $\textbf{x}'$ from a proposal distribution $q(\textbf{x}'\mid\textbf{x})$, given the current state $\textbf{x}$. Then, it probabilistically accepts or rejects $\textbf{x}'$ according to the acceptance probability $$\alpha(\textbf{x}'\mid\textbf{x}) \;=\; \min\!\left(1,\; \frac{\pi(\textbf{x}')\,q(\textbf{x}\mid\textbf{x}')}{\pi(\textbf{x})\,q(\textbf{x}'\mid\textbf{x})}\right).$$ If the proposal is rejected, the state remains at $\textbf{x}$.
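
To make the two stages concrete, here is a minimal sketch in Python. It is not code from the exercise or the book: the unnormalized standard-normal target `pi`, the Gaussian random-walk proposal `propose`/`q`, and all the names are illustrative assumptions. The acceptance test implements the $\alpha(\textbf{x}'\mid\textbf{x})$ formula from the statement above.

```python
import math
import random

def metropolis_hastings(pi, propose, q, x0, n_samples):
    """Draw n_samples states whose long-run distribution approaches pi.

    pi(x)      -- target probability of state x (may be unnormalized)
    propose(x) -- samples a candidate x' from the proposal q(x' | x)
    q(xp, x)   -- density of proposing xp from x (up to a constant factor)
    """
    x = x0
    samples = []
    for _ in range(n_samples):
        xp = propose(x)                       # stage 1: sample a proposal x'
        # stage 2: accept with probability
        #   alpha = min(1, pi(x') q(x | x') / (pi(x) q(x' | x)))
        alpha = min(1.0, (pi(xp) * q(x, xp)) / (pi(x) * q(xp, x)))
        if random.random() < alpha:
            x = xp                            # accept: move to the new state
        # on rejection the chain simply stays at x
        samples.append(x)
    return samples

# Illustrative target and proposal (assumptions, not part of the exercise):
# an unnormalized standard normal target with a Gaussian random-walk proposal.
def pi(x):
    return math.exp(-0.5 * x * x)

def propose(x):
    return x + random.gauss(0.0, 1.0)

def q(xp, x):
    return math.exp(-0.5 * (xp - x) ** 2)     # symmetric in (xp, x)

samples = metropolis_hastings(pi, propose, q, x0=0.0, n_samples=10_000)
print(sum(samples) / len(samples))            # should be near the target mean, 0
```

Because this random-walk proposal is symmetric, the $q$ terms cancel and the acceptance ratio reduces to $\pi(\textbf{x}')/\pi(\textbf{x})$; they are kept explicit in the code to match the general formula.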

  1. Consider an ordinary Gibbs sampling step for a specific variable $X_i$. Show that this step, considered as a proposal, is guaranteed to be accepted by Metropolis–Hastings. (Hence, Gibbs sampling is a special case of Metropolis–Hastings.)

  2. Show that the two-step process above, viewed as a transition probability distribution, is in detailed balance with $\pi$. (The detailed-balance condition is recalled after this list.)
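
For reference (this restates the standard definition rather than text from the exercise), a transition distribution $k(\textbf{x} \rightarrow \textbf{x}')$ is in detailed balance with $\pi$ when
$$\pi(\textbf{x})\,k(\textbf{x} \rightarrow \textbf{x}') \;=\; \pi(\textbf{x}')\,k(\textbf{x}' \rightarrow \textbf{x}) \quad \text{for all } \textbf{x}, \textbf{x}',$$
which is the condition part 2 asks you to verify for the combined propose-and-accept transition.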
