A generalization of the Multiple-try Metropolis algorithm for Bayesian estimation and model selection

Silvia Pandolfi, Francesco Bartolucci, Nial Friel
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, PMLR 9:581-588, 2010.

Abstract

We propose a generalization of the Multiple-try Metropolis (MTM) algorithm of Liu et al. (2000), which is based on drawing several proposals at each step and randomly choosing one of them on the basis of weights that may be arbitrarily chosen. In particular, for Bayesian estimation we also introduce a method based on weights depending on a quadratic approximation of the posterior distribution. The resulting algorithm cannot be reformulated as an MTM algorithm and leads to a comparable gain in efficiency with lower computational effort. We also outline the extension of the proposed strategy, and then of the MTM strategy, to Bayesian model selection, casting it in a Reversible Jump framework. The approach is illustrated by real examples.
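Since the abstract describes the algorithm only at a high level, the following is a minimal sketch of one MTM step in the sense of Liu et al. (2000), restricted to the simple case of a symmetric Gaussian random-walk proposal with weights w(y, x) = pi(y). All names (mtm_step, log_target, n_tries, step_size) are illustrative assumptions, and the sketch does not implement the paper's generalization, the quadratic-approximation weights, or the Reversible Jump extension.

```python
import numpy as np

def logsumexp(a):
    """Numerically stable log(sum(exp(a)))."""
    a = np.asarray(a, dtype=float)
    m = a.max()
    return m + np.log(np.exp(a - m).sum())

def mtm_step(x, log_target, n_tries=5, step_size=1.0, rng=None):
    """One Multiple-try Metropolis update for a d-dimensional state x.

    log_target : callable returning the log of the (unnormalized) target.
    With a symmetric proposal T(x, y) = T(y, x) and lambda(x, y) = 1 / T(x, y),
    the selection weights reduce to w(y, x) = pi(y).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.atleast_1d(np.asarray(x, dtype=float))

    # 1. Draw several candidates from the symmetric proposal centred at x.
    ys = x + step_size * rng.standard_normal((n_tries, x.size))
    log_w_y = np.array([log_target(y) for y in ys])

    # 2. Select one candidate with probability proportional to its weight.
    probs = np.exp(log_w_y - log_w_y.max())
    probs /= probs.sum()
    y = ys[rng.choice(n_tries, p=probs)]

    # 3. Draw reference points from the proposal centred at the selected
    #    candidate; the last reference point is the current state x.
    xs_ref = y + step_size * rng.standard_normal((n_tries - 1, x.size))
    log_w_x = np.array([log_target(xr) for xr in xs_ref] + [log_target(x)])

    # 4. Accept y with probability min(1, sum_j w(y_j) / sum_j w(x*_j)),
    #    computed on the log scale for numerical stability.
    log_ratio = logsumexp(log_w_y) - logsumexp(log_w_x)
    if np.log(rng.uniform()) < min(0.0, log_ratio):
        return y
    return x

# Example usage: sample a two-component Gaussian mixture on the real line.
log_target = lambda z: np.logaddexp(-0.5 * (z[0] + 3.0) ** 2,
                                    -0.5 * (z[0] - 3.0) ** 2)
rng = np.random.default_rng(0)
chain = [np.array([0.0])]
for _ in range(5000):
    chain.append(mtm_step(chain[-1], log_target, n_tries=5,
                          step_size=2.0, rng=rng))
```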

Cite this Paper


BibTeX
@InProceedings{pmlr-v9-pandolfi10a,
  title     = {A generalization of the Multiple-try Metropolis algorithm for Bayesian estimation and model selection},
  author    = {Pandolfi, Silvia and Bartolucci, Francesco and Friel, Nial},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {581--588},
  year      = {2010},
  editor    = {Teh, Yee Whye and Titterington, Mike},
  volume    = {9},
  series    = {Proceedings of Machine Learning Research},
  address   = {Chia Laguna Resort, Sardinia, Italy},
  month     = {13--15 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v9/pandolfi10a/pandolfi10a.pdf},
  url       = {https://proceedings.mlr.press/v9/pandolfi10a.html},
  abstract  = {We propose a generalization of the Multiple-try Metropolis (MTM) algorithm of Liu et al. (2000), which is based on drawing several proposals at each step and randomly choosing one of them on the basis of weights that may be arbitrarily chosen. In particular, for Bayesian estimation we also introduce a method based on weights depending on a quadratic approximation of the posterior distribution. The resulting algorithm cannot be reformulated as an MTM algorithm and leads to a comparable gain in efficiency with lower computational effort. We also outline the extension of the proposed strategy, and then of the MTM strategy, to Bayesian model selection, casting it in a Reversible Jump framework. The approach is illustrated by real examples.}
}
Endnote
%0 Conference Paper
%T A generalization of the Multiple-try Metropolis algorithm for Bayesian estimation and model selection
%A Silvia Pandolfi
%A Francesco Bartolucci
%A Nial Friel
%B Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2010
%E Yee Whye Teh
%E Mike Titterington
%F pmlr-v9-pandolfi10a
%I PMLR
%P 581--588
%U https://proceedings.mlr.press/v9/pandolfi10a.html
%V 9
%X We propose a generalization of the Multiple-try Metropolis (MTM) algorithm of Liu et al. (2000), which is based on drawing several proposals at each step and randomly choosing one of them on the basis of weights that may be arbitrarily chosen. In particular, for Bayesian estimation we also introduce a method based on weights depending on a quadratic approximation of the posterior distribution. The resulting algorithm cannot be reformulated as an MTM algorithm and leads to a comparable gain in efficiency with lower computational effort. We also outline the extension of the proposed strategy, and then of the MTM strategy, to Bayesian model selection, casting it in a Reversible Jump framework. The approach is illustrated by real examples.
RIS
TY - CPAPER
TI - A generalization of the Multiple-try Metropolis algorithm for Bayesian estimation and model selection
AU - Silvia Pandolfi
AU - Francesco Bartolucci
AU - Nial Friel
BT - Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
DA - 2010/03/31
ED - Yee Whye Teh
ED - Mike Titterington
ID - pmlr-v9-pandolfi10a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 9
SP - 581
EP - 588
L1 - http://proceedings.mlr.press/v9/pandolfi10a/pandolfi10a.pdf
UR - https://proceedings.mlr.press/v9/pandolfi10a.html
AB - We propose a generalization of the Multiple-try Metropolis (MTM) algorithm of Liu et al. (2000), which is based on drawing several proposals at each step and randomly choosing one of them on the basis of weights that may be arbitrarily chosen. In particular, for Bayesian estimation we also introduce a method based on weights depending on a quadratic approximation of the posterior distribution. The resulting algorithm cannot be reformulated as an MTM algorithm and leads to a comparable gain in efficiency with lower computational effort. We also outline the extension of the proposed strategy, and then of the MTM strategy, to Bayesian model selection, casting it in a Reversible Jump framework. The approach is illustrated by real examples.
ER -
APA
Pandolfi, S., Bartolucci, F. & Friel, N. (2010). A generalization of the Multiple-try Metropolis algorithm for Bayesian estimation and model selection. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 9:581-588. Available from https://proceedings.mlr.press/v9/pandolfi10a.html.