Factorized Asymptotic Bayesian Inference for Mixture Modeling

Ryohei Fujimaki, Satoshi Morinaga
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:400-408, 2012.

Abstract

This paper proposes a novel Bayesian approximation inference method for mixture modeling. Our key idea is to factorize marginal log-likelihood using a variational distribution over latent variables. An asymptotic approximation, a factorized information criterion (FIC), is obtained by applying the Laplace method to each of the factorized components. In order to evaluate FIC, we propose factorized asymptotic Bayesian inference (FAB), which maximizes an asymptotically-consistent lower bound of FIC. FIC and FAB have several desirable properties: 1) asymptotic consistency with the marginal log-likelihood, 2) automatic component selection on the basis of an intrinsic shrinkage mechanism, and 3) parameter identifiability in mixture modeling. Experimental results show that FAB outperforms state-of-the-art VB methods.
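The abstract's key mechanism can be illustrated concretely: FAB behaves like EM, but responsibilities are damped by a factor of roughly exp(-D_k / (2 N_k)), where D_k is the number of parameters in component k and N_k its effective sample size, so underused components shrink and can be pruned. The sketch below is a minimal, hypothetical illustration of this shrinkage idea for a 1-D Gaussian mixture, not the paper's exact algorithm; the function name `fab_gmm` and the pruning threshold are assumptions.

```python
import numpy as np

def fab_gmm(X, K=5, n_iter=50, prune_tol=1e-3, seed=0):
    # Illustrative FAB-style EM sketch (not the paper's exact updates):
    # standard EM responsibilities are damped by exp(-D / (2 N_k)),
    # where D is the per-component parameter count and N_k the
    # effective sample size, so underused components shrink away.
    rng = np.random.default_rng(seed)
    N = len(X)
    mu = rng.choice(X, K)            # initialize means from the data
    var = np.full(K, np.var(X))
    pi = np.full(K, 1.0 / K)
    D = 2.0                          # parameters per component (mean, variance)
    for _ in range(n_iter):
        # E-step: Gaussian log-densities plus log mixing weights
        logp = (-0.5 * np.log(2 * np.pi * var)
                - 0.5 * (X[:, None] - mu) ** 2 / var
                + np.log(pi))
        r = np.exp(logp - logp.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        Nk = r.sum(axis=0)
        # FIC-style shrinkage: damp responsibilities of weak components
        r = r * np.exp(-D / (2 * np.maximum(Nk, 1e-12)))
        r /= r.sum(axis=1, keepdims=True)
        Nk = r.sum(axis=0)
        # Prune components whose effective weight has collapsed
        keep = Nk / N > prune_tol
        r, mu, var, Nk = r[:, keep], mu[keep], var[keep], Nk[keep]
        # M-step: standard weighted updates
        pi = Nk / Nk.sum()
        mu = (r * X[:, None]).sum(axis=0) / Nk
        var = (r * (X[:, None] - mu) ** 2).sum(axis=0) / Nk + 1e-6
    return pi, mu, var
```

On data drawn from two well-separated Gaussians, this sketch fits means near the true cluster centers while the shrinkage term discourages redundant components; the asymptotic analysis and consistency guarantees, of course, belong to the full FIC/FAB treatment in the paper.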

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-fujimaki12,
  title = {Factorized Asymptotic Bayesian Inference for Mixture Modeling},
  author = {Fujimaki, Ryohei and Morinaga, Satoshi},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages = {400--408},
  year = {2012},
  editor = {Lawrence, Neil D. and Girolami, Mark},
  volume = {22},
  series = {Proceedings of Machine Learning Research},
  address = {La Palma, Canary Islands},
  month = {21--23 Apr},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v22/fujimaki12/fujimaki12.pdf},
  url = {https://proceedings.mlr.press/v22/fujimaki12.html},
  abstract = {This paper proposes a novel Bayesian approximation inference method for mixture modeling. Our key idea is to factorize marginal log-likelihood using a variational distribution over latent variables. An asymptotic approximation, a factorized information criterion (FIC), is obtained by applying the Laplace method to each of the factorized components. In order to evaluate FIC, we propose factorized asymptotic Bayesian inference (FAB), which maximizes an asymptotically-consistent lower bound of FIC. FIC and FAB have several desirable properties: 1) asymptotic consistency with the marginal log-likelihood, 2) automatic component selection on the basis of an intrinsic shrinkage mechanism, and 3) parameter identifiability in mixture modeling. Experimental results show that FAB outperforms state-of-the-art VB methods.}
}
Endnote
%0 Conference Paper
%T Factorized Asymptotic Bayesian Inference for Mixture Modeling
%A Ryohei Fujimaki
%A Satoshi Morinaga
%B Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2012
%E Neil D. Lawrence
%E Mark Girolami
%F pmlr-v22-fujimaki12
%I PMLR
%P 400--408
%U https://proceedings.mlr.press/v22/fujimaki12.html
%V 22
%X This paper proposes a novel Bayesian approximation inference method for mixture modeling. Our key idea is to factorize marginal log-likelihood using a variational distribution over latent variables. An asymptotic approximation, a factorized information criterion (FIC), is obtained by applying the Laplace method to each of the factorized components. In order to evaluate FIC, we propose factorized asymptotic Bayesian inference (FAB), which maximizes an asymptotically-consistent lower bound of FIC. FIC and FAB have several desirable properties: 1) asymptotic consistency with the marginal log-likelihood, 2) automatic component selection on the basis of an intrinsic shrinkage mechanism, and 3) parameter identifiability in mixture modeling. Experimental results show that FAB outperforms state-of-the-art VB methods.
RIS
TY  - CPAPER
TI  - Factorized Asymptotic Bayesian Inference for Mixture Modeling
AU  - Ryohei Fujimaki
AU  - Satoshi Morinaga
BT  - Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
DA  - 2012/04/21
ED  - Neil D. Lawrence
ED  - Mark Girolami
ID  - pmlr-v22-fujimaki12
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 22
SP  - 400
EP  - 408
L1  - http://proceedings.mlr.press/v22/fujimaki12/fujimaki12.pdf
UR  - https://proceedings.mlr.press/v22/fujimaki12.html
AB  - This paper proposes a novel Bayesian approximation inference method for mixture modeling. Our key idea is to factorize marginal log-likelihood using a variational distribution over latent variables. An asymptotic approximation, a factorized information criterion (FIC), is obtained by applying the Laplace method to each of the factorized components. In order to evaluate FIC, we propose factorized asymptotic Bayesian inference (FAB), which maximizes an asymptotically-consistent lower bound of FIC. FIC and FAB have several desirable properties: 1) asymptotic consistency with the marginal log-likelihood, 2) automatic component selection on the basis of an intrinsic shrinkage mechanism, and 3) parameter identifiability in mixture modeling. Experimental results show that FAB outperforms state-of-the-art VB methods.
ER  -
APA
Fujimaki, R. & Morinaga, S. (2012). Factorized Asymptotic Bayesian Inference for Mixture Modeling. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:400-408. Available from https://proceedings.mlr.press/v22/fujimaki12.html.