Learning Sum-Product Networks with Direct and Indirect Variable Interactions

Amirmohammad Rooshenas, Daniel Lowd
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):710-718, 2014.

Abstract

Sum-product networks (SPNs) are a deep probabilistic representation that allows for efficient, exact inference. SPNs generalize many other tractable models, including thin junction trees, latent tree models, and many types of mixtures. Previous work on learning SPN structure has mainly focused on using top-down or bottom-up clustering to find mixtures, which capture variable interactions indirectly through implicit latent variables. In contrast, most work on learning graphical models, thin junction trees, and arithmetic circuits has focused on finding direct interactions among variables. In this paper, we present ID-SPN, a new algorithm for learning SPN structure that unifies the two approaches. In experiments on 20 benchmark datasets, we find that the combination of direct and indirect interactions leads to significantly better accuracy than several state-of-the-art algorithms for learning SPNs and other tractable models.
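The abstract's claim of efficient, exact inference can be illustrated with a toy example (not from the paper; names and structure are purely illustrative). An SPN is a rooted DAG of sum nodes (weighted mixtures) and product nodes (factorizations over disjoint variable scopes) with distributions at the leaves; a single bottom-up pass, linear in the network size, yields exact joint and marginal probabilities. The mixture at the root is the kind of implicit latent variable through which, as the abstract notes, interactions are captured indirectly:

```python
# Toy SPN over two binary variables X1, X2 (illustrative only, not ID-SPN).
# Leaves are Bernoulli distributions; a missing variable in the evidence
# dict is marginalized out by having its leaf return 1.0.

class Leaf:
    """Bernoulli leaf over a single binary variable."""
    def __init__(self, var, p):
        self.var, self.p = var, p
    def value(self, x):
        v = x.get(self.var)           # None => variable summed out exactly
        if v is None:
            return 1.0
        return self.p if v == 1 else 1.0 - self.p

class Product:
    """Product node: children must have disjoint variable scopes."""
    def __init__(self, children):
        self.children = children
    def value(self, x):
        out = 1.0
        for c in self.children:
            out *= c.value(x)
        return out

class Sum:
    """Sum node: nonnegative weights summing to 1 (an implicit latent mixture)."""
    def __init__(self, weighted_children):
        self.weighted_children = weighted_children
    def value(self, x):
        return sum(w * c.value(x) for w, c in self.weighted_children)

# Mixture of two fully factorized components.
spn = Sum([
    (0.6, Product([Leaf("X1", 0.9), Leaf("X2", 0.8)])),
    (0.4, Product([Leaf("X1", 0.2), Leaf("X2", 0.3)])),
])

p_joint    = spn.value({"X1": 1, "X2": 1})   # exact P(X1=1, X2=1) = 0.456
p_marginal = spn.value({"X1": 1})            # exact P(X1=1) = 0.62, X2 summed out
```

Note that the marginal requires no explicit summation over X2: setting the leaf value to 1.0 for unobserved variables performs the sum exactly in one pass, which is what makes SPN inference tractable.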

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-rooshenas14,
  title     = {Learning Sum-Product Networks with Direct and Indirect Variable Interactions},
  author    = {Rooshenas, Amirmohammad and Lowd, Daniel},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {710--718},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/rooshenas14.pdf},
  url       = {https://proceedings.mlr.press/v32/rooshenas14.html}
}
Endnote
%0 Conference Paper
%T Learning Sum-Product Networks with Direct and Indirect Variable Interactions
%A Amirmohammad Rooshenas
%A Daniel Lowd
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-rooshenas14
%I PMLR
%P 710--718
%U https://proceedings.mlr.press/v32/rooshenas14.html
%V 32
%N 1
RIS
TY  - CPAPER
TI  - Learning Sum-Product Networks with Direct and Indirect Variable Interactions
AU  - Amirmohammad Rooshenas
AU  - Daniel Lowd
BT  - Proceedings of the 31st International Conference on Machine Learning
DA  - 2014/01/27
ED  - Eric P. Xing
ED  - Tony Jebara
ID  - pmlr-v32-rooshenas14
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 32
IS  - 1
SP  - 710
EP  - 718
L1  - http://proceedings.mlr.press/v32/rooshenas14.pdf
UR  - https://proceedings.mlr.press/v32/rooshenas14.html
ER  -
APA
Rooshenas, A. & Lowd, D. (2014). Learning Sum-Product Networks with Direct and Indirect Variable Interactions. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(1):710-718. Available from https://proceedings.mlr.press/v32/rooshenas14.html.