Generalized Ideal Parent (GIP): Discovering non-Gaussian Hidden Variables

Yaniv Tenzer, Gal Elidan
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:222-230, 2016.

Abstract

A formidable challenge in uncertainty modeling in general, and when learning Bayesian networks in particular, is the discovery of unknown hidden variables. The few works that tackle this task are typically limited to discrete or Gaussian domains, or to tree structures. We propose a novel general-purpose approach for discovering hidden variables in flexible non-Gaussian domains using the powerful class of Gaussian copula networks. Briefly, we define the concept of a hypothetically optimal predictor of a variable and show that it can be used to discover useful hidden variables in the expressive framework of copula networks. Our approach leads to performance and compactness advantages over competitors in a variety of domains.
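The Gaussian copula networks the abstract refers to rest on a marginal transform: each variable is mapped through its (empirical) CDF and then through the inverse standard-normal CDF, so that dependence can be modeled with a multivariate Gaussian while the marginals remain arbitrary. The sketch below is a generic illustration of that rank-based "normal scores" transform, not code from the paper; the function name and details are our own.

```python
import numpy as np
from statistics import NormalDist

def normal_scores(x):
    """Map a 1-D sample to standard-normal scores via its empirical CDF.

    This is the marginal transform underlying Gaussian copula models:
    values are replaced by their rescaled ranks and pushed through the
    inverse normal CDF, yielding Gaussian marginals whatever the
    original (possibly heavy-tailed, non-Gaussian) distribution was.
    """
    n = len(x)
    # Ranks 1..n; rescale to (0, 1) so inv_cdf never sees 0 or 1.
    ranks = np.argsort(np.argsort(x)) + 1
    u = ranks / (n + 1)
    nd = NormalDist()
    return np.array([nd.inv_cdf(p) for p in u])

# Example: exponential (non-Gaussian) data still gets Gaussian scores,
# and the transform is monotone, so the ordering of the sample is kept.
rng = np.random.default_rng(0)
x = rng.exponential(size=1000)
z = normal_scores(x)
```

Because the transform only depends on ranks, it is invariant to any monotone change of the marginal, which is what lets copula networks separate marginal modeling from dependence modeling.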

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-tenzer16,
  title     = {Generalized Ideal Parent (GIP): Discovering non-Gaussian Hidden Variables},
  author    = {Tenzer, Yaniv and Elidan, Gal},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {222--230},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/tenzer16.pdf},
  url       = {https://proceedings.mlr.press/v51/tenzer16.html},
  abstract  = {A formidable challenge in uncertainty modeling in general, and when learning Bayesian networks in particular, is the discovery of unknown hidden variables. The few works that tackle this task are typically limited to discrete or Gaussian domains, or to tree structures. We propose a novel general-purpose approach for discovering hidden variables in flexible non-Gaussian domains using the powerful class of Gaussian copula networks. Briefly, we define the concept of a hypothetically optimal predictor of a variable and show that it can be used to discover useful hidden variables in the expressive framework of copula networks. Our approach leads to performance and compactness advantages over competitors in a variety of domains.}
}
Endnote
%0 Conference Paper
%T Generalized Ideal Parent (GIP): Discovering non-Gaussian Hidden Variables
%A Yaniv Tenzer
%A Gal Elidan
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-tenzer16
%I PMLR
%P 222--230
%U https://proceedings.mlr.press/v51/tenzer16.html
%V 51
%X A formidable challenge in uncertainty modeling in general, and when learning Bayesian networks in particular, is the discovery of unknown hidden variables. The few works that tackle this task are typically limited to discrete or Gaussian domains, or to tree structures. We propose a novel general-purpose approach for discovering hidden variables in flexible non-Gaussian domains using the powerful class of Gaussian copula networks. Briefly, we define the concept of a hypothetically optimal predictor of a variable and show that it can be used to discover useful hidden variables in the expressive framework of copula networks. Our approach leads to performance and compactness advantages over competitors in a variety of domains.
RIS
TY - CPAPER
TI - Generalized Ideal Parent (GIP): Discovering non-Gaussian Hidden Variables
AU - Yaniv Tenzer
AU - Gal Elidan
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-tenzer16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 51
SP - 222
EP - 230
L1 - http://proceedings.mlr.press/v51/tenzer16.pdf
UR - https://proceedings.mlr.press/v51/tenzer16.html
AB - A formidable challenge in uncertainty modeling in general, and when learning Bayesian networks in particular, is the discovery of unknown hidden variables. The few works that tackle this task are typically limited to discrete or Gaussian domains, or to tree structures. We propose a novel general-purpose approach for discovering hidden variables in flexible non-Gaussian domains using the powerful class of Gaussian copula networks. Briefly, we define the concept of a hypothetically optimal predictor of a variable and show that it can be used to discover useful hidden variables in the expressive framework of copula networks. Our approach leads to performance and compactness advantages over competitors in a variety of domains.
ER -
APA
Tenzer, Y. & Elidan, G. (2016). Generalized Ideal Parent (GIP): Discovering non-Gaussian Hidden Variables. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:222-230. Available from https://proceedings.mlr.press/v51/tenzer16.html.