Scalable Gaussian Process Classification via Expectation Propagation

Daniel Hernandez-Lobato, Jose Miguel Hernandez-Lobato
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:168-176, 2016.

Abstract

Variational methods have recently been considered for scaling the training of Gaussian process classifiers to large datasets. As an alternative, we describe here how to train these classifiers efficiently using expectation propagation (EP). The proposed EP method makes it possible to train Gaussian process classifiers on very large datasets, with millions of instances, that were out of the reach of previous EP implementations. More precisely, it can be used (i) for training in a distributed fashion, where the data instances are sent to different nodes on which the required computations are carried out, and (ii) for maximizing an estimate of the marginal likelihood using a stochastic approximation of the gradient. Several experiments involving large datasets show that the method described is competitive with the variational approach.

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-hernandez-lobato16,
  title     = {Scalable Gaussian Process Classification via Expectation Propagation},
  author    = {Hernandez-Lobato, Daniel and Hernandez-Lobato, Jose Miguel},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {168--176},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/hernandez-lobato16.pdf},
  url       = {https://proceedings.mlr.press/v51/hernandez-lobato16.html},
  abstract  = {Variational methods have recently been considered for scaling the training of Gaussian process classifiers to large datasets. As an alternative, we describe here how to train these classifiers efficiently using expectation propagation (EP). The proposed EP method makes it possible to train Gaussian process classifiers on very large datasets, with millions of instances, that were out of the reach of previous EP implementations. More precisely, it can be used (i) for training in a distributed fashion, where the data instances are sent to different nodes on which the required computations are carried out, and (ii) for maximizing an estimate of the marginal likelihood using a stochastic approximation of the gradient. Several experiments involving large datasets show that the method described is competitive with the variational approach.}
}
Endnote
%0 Conference Paper
%T Scalable Gaussian Process Classification via Expectation Propagation
%A Daniel Hernandez-Lobato
%A Jose Miguel Hernandez-Lobato
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-hernandez-lobato16
%I PMLR
%P 168--176
%U https://proceedings.mlr.press/v51/hernandez-lobato16.html
%V 51
%X Variational methods have recently been considered for scaling the training of Gaussian process classifiers to large datasets. As an alternative, we describe here how to train these classifiers efficiently using expectation propagation (EP). The proposed EP method makes it possible to train Gaussian process classifiers on very large datasets, with millions of instances, that were out of the reach of previous EP implementations. More precisely, it can be used (i) for training in a distributed fashion, where the data instances are sent to different nodes on which the required computations are carried out, and (ii) for maximizing an estimate of the marginal likelihood using a stochastic approximation of the gradient. Several experiments involving large datasets show that the method described is competitive with the variational approach.
RIS
TY  - CPAPER
TI  - Scalable Gaussian Process Classification via Expectation Propagation
AU  - Daniel Hernandez-Lobato
AU  - Jose Miguel Hernandez-Lobato
BT  - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA  - 2016/05/02
ED  - Arthur Gretton
ED  - Christian C. Robert
ID  - pmlr-v51-hernandez-lobato16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 51
SP  - 168
EP  - 176
L1  - http://proceedings.mlr.press/v51/hernandez-lobato16.pdf
UR  - https://proceedings.mlr.press/v51/hernandez-lobato16.html
AB  - Variational methods have recently been considered for scaling the training of Gaussian process classifiers to large datasets. As an alternative, we describe here how to train these classifiers efficiently using expectation propagation (EP). The proposed EP method makes it possible to train Gaussian process classifiers on very large datasets, with millions of instances, that were out of the reach of previous EP implementations. More precisely, it can be used (i) for training in a distributed fashion, where the data instances are sent to different nodes on which the required computations are carried out, and (ii) for maximizing an estimate of the marginal likelihood using a stochastic approximation of the gradient. Several experiments involving large datasets show that the method described is competitive with the variational approach.
ER  -
APA
Hernandez-Lobato, D., & Hernandez-Lobato, J. M. (2016). Scalable Gaussian Process Classification via Expectation Propagation. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:168-176. Available from https://proceedings.mlr.press/v51/hernandez-lobato16.html.