Factorial Mixture of Gaussians and the Marginal Independence Model

Ricardo Silva, Zoubin Ghahramani
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:520-527, 2009.

Abstract

Marginal independence constraints play an important role in learning with graphical models. One way of parameterizing a model of marginal independencies is by building a latent variable model where two independent observed variables have no common latent source. In sparse domains, however, it might be advantageous to model the marginal observed distribution directly, without explicitly including latent variables in the model. There have been recent advances in Gaussian and binary models of marginal independence, but no model with non-linear dependencies between continuous variables has been proposed so far. In this paper, we describe how to generalize the Gaussian model of marginal independencies based on mixtures, and how to learn its parameters. This requires a non-standard parameterization and raises difficult non-linear optimization issues.
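To make the latent-variable view of marginal independence concrete, the following is a minimal numpy sketch (an illustration of the idea only, not the paper's factorial mixture parameterization; all variable names are hypothetical). X1 and X3 are driven by disjoint latent sources and are therefore marginally independent, while X2 shares a source with each and is dependent on both.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000

    # Independent latent sources.
    z1 = rng.normal(size=n)
    z2 = rng.normal(size=n)

    # Observed variables with non-linear dependence on the latents.
    # x1 and x3 have no common latent source, so x1 is marginally
    # independent of x3; x2 depends on both sources.
    x1 = np.tanh(z1) + 0.1 * rng.normal(size=n)
    x2 = z1 + z2 + 0.1 * rng.normal(size=n)
    x3 = np.exp(z2) + 0.1 * rng.normal(size=n)

    print(np.corrcoef(x1, x3)[0, 1])  # approx. 0: no shared latent source
    print(np.corrcoef(x2, x3)[0, 1])  # clearly non-zero: z2 is shared

In a bi-directed graph model of marginal independence, this structure corresponds to no edge between X1 and X3; the paper's proposal is to parameterize such constraints directly with mixtures of Gaussians, without representing the latent sources explicitly.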

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-silva09b,
  title     = {Factorial Mixture of Gaussians and the Marginal Independence Model},
  author    = {Silva, Ricardo and Ghahramani, Zoubin},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {520--527},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/silva09b/silva09b.pdf},
  url       = {https://proceedings.mlr.press/v5/silva09b.html},
  abstract  = {Marginal independence constraints play an important role in learning with graphical models. One way of parameterizing a model of marginal independencies is by building a latent variable model where two independent observed variables have no common latent source. In sparse domains, however, it might be advantageous to model the marginal observed distribution directly, without explicitly including latent variables in the model. There have been recent advances in Gaussian and binary models of marginal independence, but no model with non-linear dependencies between continuous variables has been proposed so far. In this paper, we describe how to generalize the Gaussian model of marginal independencies based on mixtures, and how to learn its parameters. This requires a non-standard parameterization and raises difficult non-linear optimization issues.}
}
Endnote
%0 Conference Paper
%T Factorial Mixture of Gaussians and the Marginal Independence Model
%A Ricardo Silva
%A Zoubin Ghahramani
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-silva09b
%I PMLR
%P 520--527
%U https://proceedings.mlr.press/v5/silva09b.html
%V 5
%X Marginal independence constraints play an important role in learning with graphical models. One way of parameterizing a model of marginal independencies is by building a latent variable model where two independent observed variables have no common latent source. In sparse domains, however, it might be advantageous to model the marginal observed distribution directly, without explicitly including latent variables in the model. There have been recent advances in Gaussian and binary models of marginal independence, but no model with non-linear dependencies between continuous variables has been proposed so far. In this paper, we describe how to generalize the Gaussian model of marginal independencies based on mixtures, and how to learn its parameters. This requires a non-standard parameterization and raises difficult non-linear optimization issues.
RIS
TY  - CPAPER
TI  - Factorial Mixture of Gaussians and the Marginal Independence Model
AU  - Ricardo Silva
AU  - Zoubin Ghahramani
BT  - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
DA  - 2009/04/15
ED  - David van Dyk
ED  - Max Welling
ID  - pmlr-v5-silva09b
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 5
SP  - 520
EP  - 527
L1  - http://proceedings.mlr.press/v5/silva09b/silva09b.pdf
UR  - https://proceedings.mlr.press/v5/silva09b.html
AB  - Marginal independence constraints play an important role in learning with graphical models. One way of parameterizing a model of marginal independencies is by building a latent variable model where two independent observed variables have no common latent source. In sparse domains, however, it might be advantageous to model the marginal observed distribution directly, without explicitly including latent variables in the model. There have been recent advances in Gaussian and binary models of marginal independence, but no model with non-linear dependencies between continuous variables has been proposed so far. In this paper, we describe how to generalize the Gaussian model of marginal independencies based on mixtures, and how to learn its parameters. This requires a non-standard parameterization and raises difficult non-linear optimization issues.
ER  -
APA
Silva, R. & Ghahramani, Z. (2009). Factorial Mixture of Gaussians and the Marginal Independence Model. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:520-527. Available from https://proceedings.mlr.press/v5/silva09b.html.
