Fast Near-GRID Gaussian Process Regression

Yuancheng Luo, Ramani Duraiswami
Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics, PMLR 31:424-432, 2013.

Abstract

Gaussian process regression (GPR) is a powerful non-linear technique for Bayesian inference and prediction. One drawback is its O(N^3) computational complexity for both prediction and hyperparameter estimation for N input points, which has led to much work in sparse GPR methods. In the case that the covariance function is expressible as a tensor product kernel (TPK) and the inputs form a multidimensional grid, it was shown that the costs for exact GPR can be reduced to a sub-quadratic function of N. We extend these exact fast algorithms to sparse GPR and remark on a connection to Gaussian process latent variable models (GPLVMs). In practice, the inputs may also violate the multidimensional grid constraints, so we pose and efficiently solve missing and extra data problems for both exact and sparse grid GPR. We demonstrate our method on synthetic, text scan, and magnetic resonance imaging (MRI) data reconstructions.
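The speedup the abstract refers to comes from the Kronecker structure of a tensor product kernel on a grid: if K = K1 ⊗ K2, then eigendecomposing the small per-dimension Gram matrices lets one solve the O(N^3) GPR linear system without ever forming the N x N covariance. Below is a minimal sketch of that idea for a 2-D grid; the squared-exponential kernels, lengthscales, and noise level are illustrative choices, not the paper's, and the paper's actual contributions (sparse GPR and missing/extra data handling) are not shown.

```python
import numpy as np

def rbf(x, lengthscale):
    """1-D squared-exponential Gram matrix (an illustrative TPK factor)."""
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def kron_gp_solve(K1, K2, y, noise):
    """Solve (K1 kron K2 + noise*I) alpha = y without forming the Kronecker product.

    Uses K1 kron K2 = (Q1 kron Q2) (L1 kron L2) (Q1 kron Q2)^T, so only the
    small per-dimension eigendecompositions are needed.
    """
    w1, Q1 = np.linalg.eigh(K1)
    w2, Q2 = np.linalg.eigh(K2)
    n1, n2 = K1.shape[0], K2.shape[0]
    Y = y.reshape(n1, n2)
    # (Q1 kron Q2)^T y  ==  row-major vec of  Q1^T Y Q2
    B = Q1.T @ Y @ Q2
    # eigenvalues of K1 kron K2 are the outer product of w1 and w2
    B /= np.outer(w1, w2) + noise
    return (Q1 @ B @ Q2.T).reshape(-1)

# usage: agree with the dense O(N^3) solve on a small grid
x1 = np.linspace(0, 1, 8)
x2 = np.linspace(0, 1, 6)
K1, K2 = rbf(x1, 0.3), rbf(x2, 0.3)
rng = np.random.default_rng(0)
y = rng.standard_normal(8 * 6)
alpha_fast = kron_gp_solve(K1, K2, y, 0.1)
alpha_dense = np.linalg.solve(np.kron(K1, K2) + 0.1 * np.eye(48), y)
assert np.allclose(alpha_fast, alpha_dense)
```

The fast path never materializes the 48 x 48 (in general N x N) covariance: its cost is dominated by the per-dimension eigendecompositions and a pair of small matrix products, which is what makes grid GPR sub-quadratic in N.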

Cite this Paper


BibTeX
@InProceedings{pmlr-v31-luo13b,
  title     = {Fast Near-GRID Gaussian Process Regression},
  author    = {Luo, Yuancheng and Duraiswami, Ramani},
  booktitle = {Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {424--432},
  year      = {2013},
  editor    = {Carvalho, Carlos M. and Ravikumar, Pradeep},
  volume    = {31},
  series    = {Proceedings of Machine Learning Research},
  address   = {Scottsdale, Arizona, USA},
  month     = {29 Apr--01 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v31/luo13b.pdf},
  url       = {https://proceedings.mlr.press/v31/luo13b.html},
  abstract  = {\emph{Gaussian process regression} (GPR) is a powerful non-linear technique for Bayesian inference and prediction. One drawback is its O(N^3) computational complexity for both prediction and hyperparameter estimation for N input points, which has led to much work in sparse GPR methods. In the case that the covariance function is expressible as a \emph{tensor product kernel} (TPK) and the inputs form a multidimensional grid, it was shown that the costs for exact GPR can be reduced to a sub-quadratic function of N. We extend these exact fast algorithms to sparse GPR and remark on a connection to \emph{Gaussian process latent variable models} (GPLVMs). In practice, the inputs may also violate the multidimensional grid constraints, so we pose and efficiently solve missing and extra data problems for both exact and sparse grid GPR. We demonstrate our method on synthetic, text scan, and magnetic resonance imaging (MRI) data reconstructions.}
}
Endnote
%0 Conference Paper
%T Fast Near-GRID Gaussian Process Regression
%A Yuancheng Luo
%A Ramani Duraiswami
%B Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2013
%E Carlos M. Carvalho
%E Pradeep Ravikumar
%F pmlr-v31-luo13b
%I PMLR
%P 424--432
%U https://proceedings.mlr.press/v31/luo13b.html
%V 31
%X Gaussian process regression (GPR) is a powerful non-linear technique for Bayesian inference and prediction. One drawback is its O(N^3) computational complexity for both prediction and hyperparameter estimation for N input points, which has led to much work in sparse GPR methods. In the case that the covariance function is expressible as a tensor product kernel (TPK) and the inputs form a multidimensional grid, it was shown that the costs for exact GPR can be reduced to a sub-quadratic function of N. We extend these exact fast algorithms to sparse GPR and remark on a connection to Gaussian process latent variable models (GPLVMs). In practice, the inputs may also violate the multidimensional grid constraints, so we pose and efficiently solve missing and extra data problems for both exact and sparse grid GPR. We demonstrate our method on synthetic, text scan, and magnetic resonance imaging (MRI) data reconstructions.
RIS
TY - CPAPER
TI - Fast Near-GRID Gaussian Process Regression
AU - Yuancheng Luo
AU - Ramani Duraiswami
BT - Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics
DA - 2013/04/29
ED - Carlos M. Carvalho
ED - Pradeep Ravikumar
ID - pmlr-v31-luo13b
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 31
SP - 424
EP - 432
L1 - http://proceedings.mlr.press/v31/luo13b.pdf
UR - https://proceedings.mlr.press/v31/luo13b.html
AB - Gaussian process regression (GPR) is a powerful non-linear technique for Bayesian inference and prediction. One drawback is its O(N^3) computational complexity for both prediction and hyperparameter estimation for N input points, which has led to much work in sparse GPR methods. In the case that the covariance function is expressible as a tensor product kernel (TPK) and the inputs form a multidimensional grid, it was shown that the costs for exact GPR can be reduced to a sub-quadratic function of N. We extend these exact fast algorithms to sparse GPR and remark on a connection to Gaussian process latent variable models (GPLVMs). In practice, the inputs may also violate the multidimensional grid constraints, so we pose and efficiently solve missing and extra data problems for both exact and sparse grid GPR. We demonstrate our method on synthetic, text scan, and magnetic resonance imaging (MRI) data reconstructions.
ER -
APA
Luo, Y. & Duraiswami, R. (2013). Fast Near-GRID Gaussian Process Regression. Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 31:424-432. Available from https://proceedings.mlr.press/v31/luo13b.html.