
All Questions

0 votes
0 answers
26 views

Metropolis sampling with stochastic estimation of component of probability density

Consider the probability distribution \begin{align} p(x) = \frac{1}{Z} x^2 e^{-x^2 / 2} \end{align} where $x \in \mathbb{R}$ and $Z$ is a normalization factor so that $\int_{-\infty}^{\infty} dx \, p(...
David Albandea
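For reference, the distribution in the question above can be targeted with a plain random-walk Metropolis sampler before worrying about stochastic estimation of any component. The sketch below is a minimal illustration, not an answer to the question as posed: the step size, starting point, and chain length are arbitrary choices, and the target is the unnormalized $p(x) \propto x^2 e^{-x^2/2}$ from the excerpt (for which $Z = \sqrt{2\pi}$ and $\mathbb{E}[x^2] = 3$).

```python
import math
import random

def log_target(x):
    # Unnormalized log-density of p(x) ∝ x^2 exp(-x^2/2); -inf at x = 0.
    if x == 0.0:
        return -math.inf
    return 2.0 * math.log(abs(x)) - 0.5 * x * x

def metropolis(n_samples, step=1.0, x0=1.0, seed=0):
    """Random-walk Metropolis with a symmetric Gaussian proposal."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    samples = []
    for _ in range(n_samples):
        y = x + rng.gauss(0.0, step)
        lq = log_target(y)
        # Symmetric proposal, so the acceptance ratio is just p(y)/p(x).
        if math.log(rng.random()) < lq - lp:
            x, lp = y, lq
        samples.append(x)
    return samples
```

Because only the ratio $p(y)/p(x)$ enters the accept step, the normalization $Z$ never needs to be computed; this is exactly why Metropolis-type samplers are used with unnormalized densities.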
0 votes
0 answers
88 views

Multiple Importance Sampling and Metropolis-Hastings on extended state space

Let $(E,\mathcal E,\lambda),(E',\mathcal E',\lambda')$ be measure spaces, $k\in\mathbb N$, $p,q_1,\ldots,q_k:E\to(0,\infty)$ be probability densities on $(E,\mathcal E,\lambda)$, $w_1,\ldots,w_k:E\to[0,...
0xbadf00d • 213
3 votes
0 answers
45 views

Is there a reason why we should run the Metropolis-Hastings algorithm with a target density approximating the density we're actually after?

Let $(E,\mathcal E,\lambda)$ be a measure space, $p:E\to[0,\infty)$ be $\mathcal E$-measurable with $$c:=\int p\:{\rm d}\lambda$$ and $$\mu:=\underbrace{\frac1cp}_{=:\:\tilde p}\lambda$$ denote the ...
0xbadf00d • 213
51 votes
1 answer
25k views

What is the difference between Metropolis-Hastings, Gibbs, Importance, and Rejection sampling?

I have been trying to learn MCMC methods and have come across Metropolis-Hastings, Gibbs, Importance, and Rejection sampling. While some of these differences are obvious, i.e., how Gibbs is a special ...
user1398057 • 2,425
4 votes
1 answer
2k views

Independence-Metropolis-Hastings Algorithm

IMHA is an importance-sampling version of MCMC, where the proposal is drawn from a fixed distribution g. Usually, g is chosen to be an approximation to f. The acceptance probability becomes $r(x,y)=\...
user84756 • 537
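The independence-Metropolis-Hastings scheme described in the excerpt above draws each proposal from a fixed distribution $g$ rather than from a kernel centered at the current state, and the standard acceptance probability is $r(x,y)=\min\{1, w(y)/w(x)\}$ with importance ratio $w=f/g$. The following is a minimal sketch under assumed choices: the target $f(x) \propto x^2 e^{-x^2/2}$ and the proposal $g = N(0, 2^2)$ are picked purely for illustration ($g$ must have heavier tails than $f$ for $w$ to stay bounded).

```python
import math
import random

def log_f(x):
    # Unnormalized log-density of the illustrative target f(x) ∝ x^2 exp(-x^2/2).
    if x == 0.0:
        return -math.inf
    return 2.0 * math.log(abs(x)) - 0.5 * x * x

def log_g(x, sigma=2.0):
    # Log-density (up to a constant) of the fixed proposal g = N(0, sigma^2).
    return -0.5 * (x / sigma) ** 2

def independence_mh(n_samples, sigma=2.0, x0=1.0, seed=0):
    """Independence MH: proposals come from g, ignoring the current state."""
    rng = random.Random(seed)
    x = x0
    log_w = log_f(x) - log_g(x, sigma)  # log w(x), w = f/g up to a constant
    samples = []
    for _ in range(n_samples):
        y = rng.gauss(0.0, sigma)       # draw from the fixed proposal g
        log_w_y = log_f(y) - log_g(y, sigma)
        # Accept with probability r(x, y) = min(1, w(y)/w(x)).
        if math.log(rng.random()) < log_w_y - log_w:
            x, log_w = y, log_w_y
        samples.append(x)
    return samples
```

When $g$ is a good approximation to $f$, the ratio $w(y)/w(x)$ stays near 1 and the chain accepts often, which is why $g$ is usually chosen to approximate $f$ as the excerpt notes.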