Polya-gamma augmentations for factor models

Arto Klami
Proceedings of the Sixth Asian Conference on Machine Learning, PMLR 39:112-128, 2015.

Abstract

Bayesian inference for latent factor models, such as principal component and canonical correlation analysis, is easy for Gaussian likelihoods. In particular, full conjugacy makes both Gibbs samplers and mean-field variational approximations straightforward. For other likelihood potentials one needs either to resort to more complex sampling schemes or to specify dedicated forms of variational lower bounds. Recently, however, it was shown that for specific likelihoods related to the logistic function it is possible to augment the joint density with auxiliary variables following a Polya-Gamma distribution, leading to closed-form updates for binary and over-dispersed count models. In this paper we describe how Gibbs sampling and mean-field variational approximation for various latent factor models can be implemented for these cases, presenting easy-to-implement and efficient inference schemes.
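The closed-form updates the abstract refers to rest on the Pólya-Gamma integral identity of Polson, Scott and Windle (2013); the following is a brief sketch of that identity for context, not material from the paper itself:

```latex
% Pólya-Gamma augmentation identity (Polson, Scott & Windle, 2013).
% For b > 0 and kappa = a - b/2,
%   (e^psi)^a / (1 + e^psi)^b
%     = 2^{-b} e^{kappa psi} \int_0^\infty e^{-omega psi^2 / 2} p(omega) d(omega),
% where p(omega) is the PG(b, 0) density:
\[
  \frac{(e^{\psi})^{a}}{(1+e^{\psi})^{b}}
  = 2^{-b} e^{\kappa\psi} \int_{0}^{\infty}
    e^{-\omega\psi^{2}/2}\, p(\omega)\, d\omega,
  \qquad \kappa = a - \tfrac{b}{2},
  \qquad \omega \sim \mathrm{PG}(b, 0).
\]
```

Conditioned on the auxiliary variable $\omega$, the logistic-type likelihood becomes Gaussian in the linear predictor $\psi$ (in a factor model, $\psi = \mathbf{w}^{\top}\mathbf{z}$), so the usual conjugate Gaussian updates for the loadings and factors apply, while $\omega \mid \psi \sim \mathrm{PG}(b, \psi)$; this is what makes both Gibbs sampling and mean-field updates closed-form.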

Cite this Paper


BibTeX
@InProceedings{pmlr-v39-klami14,
  title     = {{P}olya-gamma augmentations for factor models},
  author    = {Klami, Arto},
  booktitle = {Proceedings of the Sixth Asian Conference on Machine Learning},
  pages     = {112--128},
  year      = {2015},
  editor    = {Phung, Dinh and Li, Hang},
  volume    = {39},
  series    = {Proceedings of Machine Learning Research},
  address   = {Nha Trang City, Vietnam},
  month     = {26--28 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v39/klami14.pdf},
  url       = {https://proceedings.mlr.press/v39/klami14.html}
}
Endnote
%0 Conference Paper
%T Polya-gamma augmentations for factor models
%A Arto Klami
%B Proceedings of the Sixth Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Dinh Phung
%E Hang Li
%F pmlr-v39-klami14
%I PMLR
%P 112--128
%U https://proceedings.mlr.press/v39/klami14.html
%V 39
RIS
TY - CPAPER
TI - Polya-gamma augmentations for factor models
AU - Arto Klami
BT - Proceedings of the Sixth Asian Conference on Machine Learning
DA - 2015/02/16
ED - Dinh Phung
ED - Hang Li
ID - pmlr-v39-klami14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 39
SP - 112
EP - 128
L1 - http://proceedings.mlr.press/v39/klami14.pdf
UR - https://proceedings.mlr.press/v39/klami14.html
ER -
APA
Klami, A. (2015). Polya-gamma augmentations for factor models. Proceedings of the Sixth Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 39:112-128. Available from https://proceedings.mlr.press/v39/klami14.html.