Exponential Integration for Hamiltonian Monte Carlo

Wei-Lun Chao, Justin Solomon, Dominik Michels, Fei Sha
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1142-1151, 2015.

Abstract

We investigate numerical integration of ordinary differential equations (ODEs) for Hamiltonian Monte Carlo (HMC). High-quality integration is crucial for designing efficient and effective proposals for HMC. While the standard method is leapfrog (Störmer-Verlet) integration, we propose the use of an exponential integrator, which is robust to stiff ODEs with highly-oscillatory components. This oscillation is difficult to reproduce using leapfrog integration, even with carefully selected integration parameters and preconditioning. Concretely, we use a Gaussian distribution approximation to segregate stiff components of the ODE. We integrate this term analytically for stability and account for deviation from the approximation using variation of constants. We consider various ways to derive Gaussian approximations and conduct extensive empirical studies applying the proposed “exponential HMC” to several benchmarked learning problems. We compare to state-of-the-art methods for improving leapfrog HMC and demonstrate the advantages of our method in generating many effective samples with high acceptance rates in short running times.
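The stiffness issue the abstract describes can be illustrated with a minimal sketch: for a quadratic (Gaussian) potential U(q) = ½(ωq)², leapfrog becomes unstable once the step size satisfies hω > 2, whereas the Gaussian component can be rotated forward analytically with no energy drift. This is an illustration only, not the paper's full scheme; the variation-of-constants correction for the non-Gaussian residual force is omitted, and the function names (`leapfrog`, `exponential_step`) are ours.

```python
import numpy as np

def leapfrog(q, p, grad_U, h, n_steps):
    """Störmer-Verlet (leapfrog) steps for H(q, p) = U(q) + p*p/2."""
    for _ in range(n_steps):
        p = p - 0.5 * h * grad_U(q)
        q = q + h * p
        p = p - 0.5 * h * grad_U(q)
    return q, p

def exponential_step(q, p, omega, h, n_steps):
    """Exact rotation for the Gaussian (stiff) part U(q) = 0.5 * (omega*q)**2.

    The paper additionally corrects for the non-Gaussian residual force via
    variation of constants; that term is omitted in this sketch.
    """
    c, s = np.cos(omega * h), np.sin(omega * h)
    for _ in range(n_steps):
        q, p = c * q + (s / omega) * p, -omega * s * q + c * p
    return q, p

# Stiff quadratic potential: leapfrog is unstable once h * omega > 2.
omega, h, n = 50.0, 0.05, 20                  # here h * omega = 2.5
grad_U = lambda q: omega**2 * q
H = lambda q, p: 0.5 * (omega * q)**2 + 0.5 * p**2

q0, p0 = 1.0, 0.0
H0 = H(q0, p0)
H_exp = H(*exponential_step(q0, p0, omega, h, n))   # conserved to round-off
H_leap = H(*leapfrog(q0, p0, grad_U, h, n))         # energy explodes
```

In an HMC proposal, an exploding Hamiltonian drives the acceptance probability to zero, which is why stable integration of the stiff component matters for efficient sampling.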

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-chao15,
  title     = {Exponential Integration for Hamiltonian Monte Carlo},
  author    = {Chao, Wei-Lun and Solomon, Justin and Michels, Dominik and Sha, Fei},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {1142--1151},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/chao15.pdf},
  url       = {https://proceedings.mlr.press/v37/chao15.html},
  abstract  = {We investigate numerical integration of ordinary differential equations (ODEs) for Hamiltonian Monte Carlo (HMC). High-quality integration is crucial for designing efficient and effective proposals for HMC. While the standard method is leapfrog (St{\"o}rmer-Verlet) integration, we propose the use of an exponential integrator, which is robust to stiff ODEs with highly-oscillatory components. This oscillation is difficult to reproduce using leapfrog integration, even with carefully selected integration parameters and preconditioning. Concretely, we use a Gaussian distribution approximation to segregate stiff components of the ODE. We integrate this term analytically for stability and account for deviation from the approximation using variation of constants. We consider various ways to derive Gaussian approximations and conduct extensive empirical studies applying the proposed “exponential HMC” to several benchmarked learning problems. We compare to state-of-the-art methods for improving leapfrog HMC and demonstrate the advantages of our method in generating many effective samples with high acceptance rates in short running times.}
}
Endnote
%0 Conference Paper
%T Exponential Integration for Hamiltonian Monte Carlo
%A Wei-Lun Chao
%A Justin Solomon
%A Dominik Michels
%A Fei Sha
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-chao15
%I PMLR
%P 1142--1151
%U https://proceedings.mlr.press/v37/chao15.html
%V 37
%X We investigate numerical integration of ordinary differential equations (ODEs) for Hamiltonian Monte Carlo (HMC). High-quality integration is crucial for designing efficient and effective proposals for HMC. While the standard method is leapfrog (Störmer-Verlet) integration, we propose the use of an exponential integrator, which is robust to stiff ODEs with highly-oscillatory components. This oscillation is difficult to reproduce using leapfrog integration, even with carefully selected integration parameters and preconditioning. Concretely, we use a Gaussian distribution approximation to segregate stiff components of the ODE. We integrate this term analytically for stability and account for deviation from the approximation using variation of constants. We consider various ways to derive Gaussian approximations and conduct extensive empirical studies applying the proposed “exponential HMC” to several benchmarked learning problems. We compare to state-of-the-art methods for improving leapfrog HMC and demonstrate the advantages of our method in generating many effective samples with high acceptance rates in short running times.
RIS
TY  - CPAPER
TI  - Exponential Integration for Hamiltonian Monte Carlo
AU  - Wei-Lun Chao
AU  - Justin Solomon
AU  - Dominik Michels
AU  - Fei Sha
BT  - Proceedings of the 32nd International Conference on Machine Learning
DA  - 2015/06/01
ED  - Francis Bach
ED  - David Blei
ID  - pmlr-v37-chao15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 37
SP  - 1142
EP  - 1151
L1  - http://proceedings.mlr.press/v37/chao15.pdf
UR  - https://proceedings.mlr.press/v37/chao15.html
AB  - We investigate numerical integration of ordinary differential equations (ODEs) for Hamiltonian Monte Carlo (HMC). High-quality integration is crucial for designing efficient and effective proposals for HMC. While the standard method is leapfrog (Störmer-Verlet) integration, we propose the use of an exponential integrator, which is robust to stiff ODEs with highly-oscillatory components. This oscillation is difficult to reproduce using leapfrog integration, even with carefully selected integration parameters and preconditioning. Concretely, we use a Gaussian distribution approximation to segregate stiff components of the ODE. We integrate this term analytically for stability and account for deviation from the approximation using variation of constants. We consider various ways to derive Gaussian approximations and conduct extensive empirical studies applying the proposed “exponential HMC” to several benchmarked learning problems. We compare to state-of-the-art methods for improving leapfrog HMC and demonstrate the advantages of our method in generating many effective samples with high acceptance rates in short running times.
ER  -
APA
Chao, W., Solomon, J., Michels, D. & Sha, F. (2015). Exponential Integration for Hamiltonian Monte Carlo. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:1142-1151. Available from https://proceedings.mlr.press/v37/chao15.html.