Gaussian Process Vine Copulas for Multivariate Dependence

David Lopez-Paz, Jose Miguel Hernández-Lobato, Zoubin Ghahramani
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(2):10-18, 2013.

Abstract

Copulas make it possible to learn marginal distributions separately from the multivariate dependence structure (copula) that links them together into a density function. Vine factorizations ease the learning of high-dimensional copulas by constructing a hierarchy of conditional bivariate copulas. However, to simplify inference, it is common to assume that each of these conditional bivariate copulas is independent of its conditioning variables. In this paper, we relax this assumption by discovering the latent functions that specify the shape of a conditional copula given its conditioning variables. We learn these functions by following a Bayesian approach based on sparse Gaussian processes with expectation propagation for scalable, approximate inference. Experiments on real-world datasets show that, when modeling all conditional dependencies, we obtain better estimates of the underlying copula of the data.
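The abstract's central idea, learning marginals separately from the dependence structure, can be illustrated with a minimal sketch. The snippet below uses a plain Gaussian copula as a stand-in for the paper's GP-conditioned vine copulas (the actual method is far richer); all variable names and the simulated data are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the copula idea: strip away the marginals with the
# probability integral transform, then fit the dependence on the uniform
# scale. A Gaussian copula stands in for the paper's GP vine copulas.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulate correlated data with deliberately non-Gaussian marginals.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=2000)
x = np.column_stack([
    np.exp(z[:, 0]),                               # log-normal marginal
    stats.expon.ppf(stats.norm.cdf(z[:, 1])),      # exponential marginal
])

# Step 1: probability integral transform via empirical CDFs (ranks),
# mapping each marginal into (0, 1) and removing marginal information.
u = stats.rankdata(x, axis=0) / (x.shape[0] + 1)

# Step 2: fit the copula on the uniform scale. For a Gaussian copula,
# transform to normal scores and estimate their correlation.
n_scores = stats.norm.ppf(u)
rho = np.corrcoef(n_scores, rowvar=False)[0, 1]

print(f"estimated copula correlation: {rho:.2f}")  # should land near 0.8
```

Because the rank transform is invariant to the monotone marginal transformations, the estimated copula correlation recovers the dependence of the underlying Gaussian draws regardless of how heavy-tailed the observed marginals are.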

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-lopez-paz13,
  title     = {Gaussian Process Vine Copulas for Multivariate Dependence},
  author    = {Lopez-Paz, David and Hernández-Lobato, Jose Miguel and Ghahramani, Zoubin},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {10--18},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/lopez-paz13.pdf},
  url       = {https://proceedings.mlr.press/v28/lopez-paz13.html},
  abstract  = {Copulas make it possible to learn marginal distributions separately from the multivariate dependence structure (copula) that links them together into a density function. Vine factorizations ease the learning of high-dimensional copulas by constructing a hierarchy of conditional bivariate copulas. However, to simplify inference, it is common to assume that each of these conditional bivariate copulas is independent of its conditioning variables. In this paper, we relax this assumption by discovering the latent functions that specify the shape of a conditional copula given its conditioning variables. We learn these functions by following a Bayesian approach based on sparse Gaussian processes with expectation propagation for scalable, approximate inference. Experiments on real-world datasets show that, when modeling all conditional dependencies, we obtain better estimates of the underlying copula of the data.}
}
Endnote
%0 Conference Paper
%T Gaussian Process Vine Copulas for Multivariate Dependence
%A David Lopez-Paz
%A Jose Miguel Hernández-Lobato
%A Zoubin Ghahramani
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-lopez-paz13
%I PMLR
%P 10--18
%U https://proceedings.mlr.press/v28/lopez-paz13.html
%V 28
%N 2
%X Copulas make it possible to learn marginal distributions separately from the multivariate dependence structure (copula) that links them together into a density function. Vine factorizations ease the learning of high-dimensional copulas by constructing a hierarchy of conditional bivariate copulas. However, to simplify inference, it is common to assume that each of these conditional bivariate copulas is independent of its conditioning variables. In this paper, we relax this assumption by discovering the latent functions that specify the shape of a conditional copula given its conditioning variables. We learn these functions by following a Bayesian approach based on sparse Gaussian processes with expectation propagation for scalable, approximate inference. Experiments on real-world datasets show that, when modeling all conditional dependencies, we obtain better estimates of the underlying copula of the data.
RIS
TY - CPAPER
TI - Gaussian Process Vine Copulas for Multivariate Dependence
AU - David Lopez-Paz
AU - Jose Miguel Hernández-Lobato
AU - Zoubin Ghahramani
BT - Proceedings of the 30th International Conference on Machine Learning
DA - 2013/05/13
ED - Sanjoy Dasgupta
ED - David McAllester
ID - pmlr-v28-lopez-paz13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 28
IS - 2
SP - 10
EP - 18
L1 - http://proceedings.mlr.press/v28/lopez-paz13.pdf
UR - https://proceedings.mlr.press/v28/lopez-paz13.html
AB - Copulas make it possible to learn marginal distributions separately from the multivariate dependence structure (copula) that links them together into a density function. Vine factorizations ease the learning of high-dimensional copulas by constructing a hierarchy of conditional bivariate copulas. However, to simplify inference, it is common to assume that each of these conditional bivariate copulas is independent of its conditioning variables. In this paper, we relax this assumption by discovering the latent functions that specify the shape of a conditional copula given its conditioning variables. We learn these functions by following a Bayesian approach based on sparse Gaussian processes with expectation propagation for scalable, approximate inference. Experiments on real-world datasets show that, when modeling all conditional dependencies, we obtain better estimates of the underlying copula of the data.
ER -
APA
Lopez-Paz, D., Hernández-Lobato, J.M. & Ghahramani, Z. (2013). Gaussian Process Vine Copulas for Multivariate Dependence. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(2):10-18. Available from https://proceedings.mlr.press/v28/lopez-paz13.html.
