I'm trying to compute the marginal likelihood for a statistical model by Monte Carlo methods:

$$f(x) = \int f(x \mid \theta)\, \pi(\theta)\, d\theta$$

The likelihood is well behaved – smooth, log-concave – but high-dimensional. I've tried importance sampling, but the results are wonky and depend heavily on the proposal I'm using. I briefly considered doing Hamiltonian Monte Carlo to compute posterior samples assuming a uniform prior over $\theta$ and taking the harmonic mean, until I saw this. Lesson learned: the harmonic mean estimator can have infinite variance. Is there an alternative MCMC estimator that is nearly as simple, but has a well-behaved variance?
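For concreteness, here is a minimal sketch of the plain importance-sampling estimator of $f(x)$ and its sensitivity to the proposal, on a hypothetical conjugate toy model (the model, the observation `x = 1.5`, and both proposals are made up for illustration; the marginal likelihood is available in closed form so the estimate can be checked):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy model so the answer is checkable in closed form:
# theta ~ N(0, 1), x | theta ~ N(theta, 1)  =>  f(x) = N(x; 0, 2).
x = 1.5

def log_joint(theta):
    # log f(x | theta) + log pi(theta), dropping nothing: both densities are N(.,1)
    return -0.5 * (x - theta) ** 2 - 0.5 * theta ** 2 - np.log(2 * np.pi)

def is_log_ml(mu_q, s_q, n=100_000):
    """Importance-sampling estimate of log f(x) with a N(mu_q, s_q^2) proposal."""
    th = rng.normal(mu_q, s_q, n)
    log_q = -0.5 * ((th - mu_q) / s_q) ** 2 - np.log(s_q * np.sqrt(2 * np.pi))
    lw = log_joint(th) - log_q                  # log importance weights
    m = lw.max()
    return m + np.log(np.mean(np.exp(lw - m)))  # log-sum-exp for stability

exact = -0.25 * x ** 2 - 0.5 * np.log(4 * np.pi)   # log N(x; 0, 2)
good = is_log_ml(0.75, np.sqrt(0.5))   # proposal matched to the posterior
poor = is_log_ml(-3.0, 0.3)            # badly placed proposal
print(exact, good, poor)
```

With the well-matched proposal the estimate sits on top of the exact value; with the badly placed one the weights are dominated by a few samples and the estimate drifts, which is exactly the proposal-dependence described above.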


#### Best Answer

How about annealed importance sampling? It has *much* lower variance than regular importance sampling. I've seen it called the "gold standard", and it's not much harder to implement than "normal" importance sampling. It's slower in the sense that you have to make a bunch of MCMC moves for each sample, but each sample tends to be very high-quality so you don't need as many of them before your estimates settle down.
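To make "a bunch of MCMC moves for each sample" concrete, here is a minimal AIS sketch on a hypothetical 1-D conjugate toy model (the model, the observation, the temperature schedule, and the random-walk step size are all illustrative choices, not prescriptions): each sample starts from the prior, is annealed through tempered targets $\pi(\theta) f(x\mid\theta)^{\beta}$ with a few Metropolis moves per temperature, and accumulates an incremental log-weight at each step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: theta ~ N(0, 1), x | theta ~ N(theta, 1),
# so the exact marginal likelihood is f(x) = N(x; 0, 2).
x_obs = 1.5

def log_prior(theta):
    return -0.5 * theta ** 2 - 0.5 * np.log(2 * np.pi)

def log_lik(theta):
    return -0.5 * (x_obs - theta) ** 2 - 0.5 * np.log(2 * np.pi)

def ais_log_ml(n_samples=500, n_temps=50, n_mcmc=5, step=0.5):
    betas = np.linspace(0.0, 1.0, n_temps)
    log_w = np.zeros(n_samples)
    for i in range(n_samples):
        theta = rng.normal()  # exact draw from the prior (beta = 0)
        for b_prev, b in zip(betas[:-1], betas[1:]):
            # incremental weight: ratio of consecutive tempered targets
            log_w[i] += (b - b_prev) * log_lik(theta)
            # a few Metropolis moves leaving pi(theta) * lik^b invariant
            for _ in range(n_mcmc):
                prop = theta + step * rng.normal()
                log_acc = (log_prior(prop) + b * log_lik(prop)
                           - log_prior(theta) - b * log_lik(theta))
                if np.log(rng.uniform()) < log_acc:
                    theta = prop
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))  # stable log-mean-exp

log_f = ais_log_ml()
exact = -0.25 * x_obs ** 2 - 0.5 * np.log(4 * np.pi)   # log N(x; 0, 2)
print(log_f, exact)
```

Even with modest settings the AIS estimate lands within Monte Carlo error of the exact answer; the weight variance stays controlled because each intermediate ratio is close to one.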

The other major alternative is sequential importance sampling. My sense is that it's also fairly straightforward to implement, but it requires some familiarity with sequential Monte Carlo (AKA particle filtering), which I lack.

Good luck!

**Edited to add**: It looks like the Radford Neal blog post you linked to also recommends Annealed Importance Sampling. Let us know if it works well for you.
