Multiple-try Metropolis

Multiple-try Metropolis (MTM) is a sampling method that is a modified form of the Metropolis–Hastings method, first presented by Liu, Liang, and Wong in 2000. It is designed to help the sampling trajectory converge faster, by increasing both the step size and the acceptance rate.

Background

Problems with Metropolis–Hastings

In Markov chain Monte Carlo, the Metropolis–Hastings algorithm (MH) can be used to sample from a probability distribution which is difficult to sample from directly. However, the MH algorithm requires the user to supply a proposal distribution, which can be relatively arbitrary. In many cases, one uses a Gaussian distribution centered on the current point in the probability space, of the form $Q(\mathbf{x}' \mid \mathbf{x}) = \mathcal{N}(\mathbf{x}, \sigma^2 \mathbf{I})$. This proposal distribution is convenient to sample from and may be the best choice if one has little knowledge about the target distribution, $\pi(\mathbf{x})$. If desired, one can use the more general multivariate normal distribution, $Q(\mathbf{x}' \mid \mathbf{x}) = \mathcal{N}(\mathbf{x}, \boldsymbol{\Sigma})$, where $\boldsymbol{\Sigma}$ is a covariance matrix which the user believes is similar to that of the target distribution.

Although this method must converge to the stationary distribution in the limit of infinite sample size, in practice the progress can be exceedingly slow. If $\sigma^2$ is too large, almost all steps under the MH algorithm will be rejected. On the other hand, if $\sigma^2$ is too small, almost all steps will be accepted, and the Markov chain will be similar to a random walk through the probability space. In the simpler case of $Q(\mathbf{x}' \mid \mathbf{x}) = \mathcal{N}(\mathbf{x}, \mathbf{I})$, we see that $N$ steps only take us a distance of roughly $\sqrt{N}$. In this event, the Markov chain will not fully explore the probability space in any reasonable amount of time. Thus the MH algorithm requires reasonable tuning of the scale parameter ($\sigma^2$ or $\boldsymbol{\Sigma}$).
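
To make the tuning problem concrete, the following is a minimal sketch of one random-walk MH step under the Gaussian proposal described above, written in Python with NumPy. It is an illustration only; the function name rw_metropolis_step and its parameters are ours, not from the original paper.

    import numpy as np

    def rw_metropolis_step(x, log_pi, sigma=1.0, rng=None):
        # One random-walk Metropolis step with proposal Q(x' | x) = N(x, sigma^2 I).
        # The proposal is symmetric, so the Hastings ratio reduces to pi(x')/pi(x).
        rng = np.random.default_rng() if rng is None else rng
        x_new = x + sigma * rng.standard_normal(x.shape)
        # Accept with probability min(1, pi(x_new)/pi(x)), computed in log space.
        if np.log(rng.random()) < log_pi(x_new) - log_pi(x):
            return x_new
        return x

The parameter sigma here is exactly the scale parameter whose tuning the text discusses: too large and nearly every proposal is rejected, too small and the chain diffuses like a random walk.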

Problems with high dimensionality

Even if the scale parameter is well-tuned, as the dimensionality of the problem increases, progress can still remain exceedingly slow. To see this, again consider $Q(\mathbf{x}' \mid \mathbf{x}) = \mathcal{N}(\mathbf{x}, \mathbf{I})$. In one dimension, this corresponds to a Gaussian distribution with mean 0 and variance 1. For one dimension, this distribution has a mean step of zero; however, the mean squared step size is given by

$$\left\langle (x' - x)^2 \right\rangle = \int_{-\infty}^{\infty} (x' - x)^2 \, \frac{1}{\sqrt{2\pi}} e^{-(x' - x)^2/2} \, dx' = 1$$

As the number of dimensions increases, the expected step size becomes larger and larger. In $N$ dimensions, the probability of moving a radial distance $r$ is related to the chi distribution, and is given by

$$P(r) \propto r^{N-1} e^{-r^2/2}$$

This distribution is peaked at $r = \sqrt{N-1}$ (setting the derivative of $\ln P(r)$ to zero gives $(N-1)/r - r = 0$), which is approximately $\sqrt{N}$ for large $N$. This means that the step size will increase as roughly the square root of the number of dimensions. For the MH algorithm, large steps will almost always land in regions of low probability, and therefore be rejected.

If we now add the scale parameter $\sigma^2$ back in, we find that to retain a reasonable acceptance rate we must make the transformation $\sigma^2 \to \sigma^2/N$. In this situation, the acceptance rate can now be made reasonable, but the exploration of the probability space becomes increasingly slow. To see this, consider a slice along any one dimension of the problem. By making the scale transformation above, the expected step size in any one dimension is not $\sigma$ but instead $\sigma/\sqrt{N}$. As this step size is much smaller than the "true" scale of the probability distribution (assuming that $\sigma$ is somehow known a priori, which is the best possible case), the algorithm executes a random walk along every parameter.
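
The $\sqrt{N}$ growth of the raw step length is easy to check numerically. The short Python snippet below is an illustrative check, not part of the original article: it draws standard normal proposal steps in increasing dimension and prints the mean radial distance.

    import numpy as np

    rng = np.random.default_rng(0)
    for n in (1, 10, 100, 1000):
        # 10,000 proposal steps drawn from N(0, I) in n dimensions
        steps = rng.standard_normal((10_000, n))
        r = np.linalg.norm(steps, axis=1)  # radial step lengths
        print(f"N = {n:4d}   mean |step| = {r.mean():6.2f}   sqrt(N) = {n ** 0.5:6.2f}")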

The multiple-try Metropolis algorithm

Suppose $Q(\mathbf{x}, \mathbf{y})$ is an arbitrary proposal function. We require that $Q(\mathbf{x}, \mathbf{y}) > 0$ only if $Q(\mathbf{y}, \mathbf{x}) > 0$. Additionally, $\pi(\mathbf{x})$ is the likelihood function.

Define $w(\mathbf{x}, \mathbf{y}) = \pi(\mathbf{x})\, Q(\mathbf{x}, \mathbf{y})\, \lambda(\mathbf{x}, \mathbf{y})$, where $\lambda(\mathbf{x}, \mathbf{y})$ is a non-negative symmetric function in $\mathbf{x}$ and $\mathbf{y}$ that can be chosen by the user.

Now suppose the current state is $\mathbf{x}$. The MTM algorithm is as follows (a Python sketch of the full procedure is given after the description):

1) Draw $k$ independent trial proposals $\mathbf{y}_1, \ldots, \mathbf{y}_k$ from $Q(\mathbf{x}, \cdot\,)$. Compute the weights $w(\mathbf{y}_j, \mathbf{x})$ for each of these.

2) Select $\mathbf{y}$ from the $\mathbf{y}_j$ with probability proportional to the weights.

3) Now produce a reference set by drawing $\mathbf{x}_1^*, \ldots, \mathbf{x}_{k-1}^*$ from the distribution $Q(\mathbf{y}, \cdot\,)$. Set $\mathbf{x}_k^* = \mathbf{x}$ (the current point).

4) Accept $\mathbf{y}$ with probability

$$r = \min\left(1,\; \frac{w(\mathbf{y}_1, \mathbf{x}) + \cdots + w(\mathbf{y}_k, \mathbf{x})}{w(\mathbf{x}_1^*, \mathbf{y}) + \cdots + w(\mathbf{x}_k^*, \mathbf{y})}\right)$$

It can be shown that this method satisfies the detailed balance property and therefore produces a reversible Markov chain with $\pi(\mathbf{x})$ as the stationary distribution.

If $Q(\mathbf{x}, \mathbf{y})$ is symmetric (as is the case for the multivariate normal distribution), then one can choose $\lambda(\mathbf{x}, \mathbf{y}) = \frac{1}{Q(\mathbf{x}, \mathbf{y})}$, which gives $w(\mathbf{x}, \mathbf{y}) = \pi(\mathbf{x})$.
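
The sketch below implements the four steps above in Python with NumPy, using a symmetric Gaussian proposal and the choice $\lambda(\mathbf{x}, \mathbf{y}) = 1/Q(\mathbf{x}, \mathbf{y})$ just described, so that each weight reduces to $\pi$ evaluated at the corresponding point. It is a minimal illustration under these assumptions; the function name mtm_step and all parameter values are ours, not from the original paper.

    import numpy as np

    def mtm_step(x, log_pi, k=5, sigma=1.0, rng=None):
        # One MTM step with symmetric proposal Q(. | x) = N(x, sigma^2 I) and
        # lambda(x, y) = 1/Q(x, y), so the weights reduce to w(x, y) = pi(x).
        rng = np.random.default_rng() if rng is None else rng
        d = x.shape[0]

        # 1) Draw k independent trials y_1..y_k from Q(x, .) and weight them by pi.
        ys = x + sigma * rng.standard_normal((k, d))
        wy = np.exp(np.array([log_pi(y) for y in ys]))
        if wy.sum() == 0.0:
            return x  # all trials landed in (numerically) zero-density regions

        # 2) Select y among the trials with probability proportional to the weights.
        y = ys[rng.choice(k, p=wy / wy.sum())]

        # 3) Reference set: k-1 draws from Q(y, .) plus the current point x.
        xs = np.vstack([y + sigma * rng.standard_normal((k - 1, d)), x])
        wx = np.exp(np.array([log_pi(z) for z in xs]))

        # 4) Accept y with probability min(1, sum_j w(y_j, x) / sum_j w(x_j*, y)).
        return y if rng.random() * wx.sum() < wy.sum() else x

For example, sampling a two-dimensional standard normal with larger-than-usual proposal steps:

    # log-density of the target, up to an additive constant
    log_pi = lambda v: -0.5 * float(v @ v)

    x = np.zeros(2)
    samples = []
    for _ in range(5_000):
        x = mtm_step(x, log_pi, k=5, sigma=2.5)
        samples.append(x)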

Several further theoretical studies, variants, and extensions can be found in the literature.[1][2][3][4][5][6] A review of MTM schemes and related techniques is given in [7].

Disadvantages

Multiple-try Metropolis needs to evaluate the target density (or energy) at the $k$ trial points and the $k-1$ reference points, i.e. $2k-1$ evaluations per step. If the slow part of the process is calculating the energy, then this method can be slower. If the slow part of the process is finding neighbors of a given point, or generating random numbers, then again this method can be slower. It can be argued that this method only appears faster because it puts much more computation into a "single step" than Metropolis–Hastings does.


References

  1. Bédard, Mylène; Douc, Randal; Moulines, Eric (2012-03-01). "Scaling analysis of multiple-try MCMC methods". Stochastic Processes and Their Applications. 122 (3): 758–786. doi:10.1016/j.spa.2011.11.004.
  2. Martino, Luca; Read, Jesse (2013-07-11). "On the flexibility of the design of multiple try Metropolis schemes". Computational Statistics. 28 (6): 2797–2823. arXiv:1201.0646. doi:10.1007/s00180-013-0429-2. ISSN 0943-4062.
  3. Craiu, Radu V.; Lemieux, Christiane (2007-01-30). "Acceleration of the Multiple-Try Metropolis algorithm using antithetic and stratified sampling". Statistics and Computing. 17 (2): 109–120. doi:10.1007/s11222-006-9009-4. ISSN 0960-3174.
  4. Martino, Luca; Del Olmo, Victor Pascual; Read, Jesse (2012-07-01). "A multi-point Metropolis scheme with generic weight functions". Statistics & Probability Letters. 82 (7): 1445–1453. arXiv:1112.4048. doi:10.1016/j.spl.2012.04.008.
  5. Pandolfi, Silvia; Bartolucci, Francesco; Friel, Nial (2014-04-01). "A generalized multiple-try version of the Reversible Jump algorithm". Computational Statistics & Data Analysis. 72: 298–314. arXiv:1006.0621. doi:10.1016/j.csda.2013.10.007. hdl:10197/8372.
  6. Casarin, Roberto; Craiu, Radu; Leisen, Fabrizio (2011-12-07). "Interacting multiple try algorithms with different proposal distributions". Statistics and Computing. 23 (2): 185–200. arXiv:1011.1170. doi:10.1007/s11222-011-9301-9. ISSN 0960-3174.
  7. Martino, Luca (2018). "A review of multiple try MCMC algorithms for signal processing". Digital Signal Processing. 75: 134–152. arXiv:1801.09065. doi:10.1016/j.dsp.2018.01.004.
  • Liu, J. S.; Liang, F.; Wong, W. H. (2000). "The multiple-try method and local optimization in Metropolis sampling". Journal of the American Statistical Association. 95 (449): 121–134.