Tight Variational Bounds via Random Projections and I-Projections

Lun-Kai Hsu, Tudor Achim, Stefano Ermon
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:1087-1095, 2016.

Abstract

Information projections are the key building block of variational inference algorithms and are used to approximate a target probabilistic model by projecting it onto a family of tractable distributions. In general, there is no guarantee on the quality of the approximation obtained. To overcome this issue, we introduce a new class of random projections to reduce the dimensionality and hence the complexity of the original model. In the spirit of random projections, the projection preserves (with high probability) key properties of the target distribution. We show that information projections can be combined with random projections to obtain provable guarantees on the quality of the approximation obtained, regardless of the complexity of the original model. We demonstrate empirically that augmenting mean field with a random projection step dramatically improves partition function and marginal probability estimates, on both synthetic and real-world data.
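
As a concrete illustration of the two ingredients the abstract combines, the sketch below runs both on a toy pairwise binary model small enough to enumerate: coordinate-ascent mean field as the I-projection (a lower bound on log Z), and random parity (XOR) constraints as the random projection, which preserve the partition function in expectation (E[Z_constrained] = Z / 2^k under k random constraints). This is a minimal sketch of the idea, not the paper's algorithm: the model, potentials, and median-of-repeats aggregation are illustrative choices, and the constrained partition function is enumerated exactly here, whereas the paper performs the I-projection on the projected model itself.

import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 10                                   # binary variables; small enough to enumerate
theta = rng.normal(0.0, 1.0, size=n)     # unary potentials (illustrative)
W = np.triu(rng.normal(0.0, 0.5, size=(n, n)), 1)  # pairwise potentials, i < j

def score(x):
    # Log unnormalized probability: theta . x + sum_{i<j} W_ij x_i x_j
    return theta @ x + x @ W @ x

def logsumexp(vals):
    vals = np.asarray(vals, dtype=float)
    m = vals.max()
    return m + np.log(np.exp(vals - m).sum())

def exact_log_Z(constraints=None):
    # Brute-force log Z, optionally restricted to {x : A x = b (mod 2)}.
    scores = []
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits)
        if constraints is not None:
            A, b = constraints
            if np.any((A @ x) % 2 != b):
                continue
        scores.append(score(x))
    return logsumexp(scores)

def mean_field_lower_bound(iters=200):
    # Coordinate-ascent mean field (the I-projection onto fully factored
    # distributions): q_i <- sigmoid(theta_i + sum_j (W + W^T)_ij q_j).
    # The returned ELBO = E_q[score] + H(q) lower-bounds log Z.
    W_sym = W + W.T
    q = np.full(n, 0.5)
    for _ in range(iters):
        for i in range(n):
            q[i] = 1.0 / (1.0 + np.exp(-(theta[i] + W_sym[i] @ q)))
    entropy = -(q * np.log(q) + (1.0 - q) * np.log(1.0 - q)).sum()
    return theta @ q + q @ W @ q + entropy

# Random projection step: k random XOR constraints A x = b (mod 2) keep each
# configuration with probability 2^-k, so E[Z_constrained] = Z / 2^k and
# k*log(2) + log Z_constrained estimates log Z; we aggregate with a median.
k = 3
estimates = []
for _ in range(25):
    A = rng.integers(0, 2, size=(k, n))
    b = rng.integers(0, 2, size=k)
    estimates.append(k * np.log(2.0) + exact_log_Z((A, b)))

print(f"exact log Z        : {exact_log_Z():.3f}")
print(f"mean-field bound   : {mean_field_lower_bound():.3f}")
print(f"projected estimate : {np.median(estimates):.3f} (median over 25 projections)")

On this toy model the mean-field ELBO sits below the exact log Z, while the median of the projection-based estimates lands close to it, which is the behavior the abstract describes at scale.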

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-hsu16,
  title     = {Tight Variational Bounds via Random Projections and I-Projections},
  author    = {Hsu, Lun-Kai and Achim, Tudor and Ermon, Stefano},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {1087--1095},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/hsu16.pdf},
  url       = {https://proceedings.mlr.press/v51/hsu16.html}
}
APA
Hsu, L., Achim, T. & Ermon, S. (2016). Tight Variational Bounds via Random Projections and I-Projections. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:1087-1095. Available from https://proceedings.mlr.press/v51/hsu16.html.
