Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP)

Andrew Wilson, Hannes Nickisch
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1775-1784, 2015.

Abstract

We introduce a new structured kernel interpolation (SKI) framework, which generalises and unifies inducing point methods for scalable Gaussian processes (GPs). SKI methods produce kernel approximations for fast computations through kernel interpolation. The SKI framework clarifies how the quality of an inducing point approach depends on the number of inducing (a.k.a. interpolation) points, the interpolation strategy, and the GP covariance kernel. SKI also provides a mechanism to create new scalable kernel methods through the choice of kernel interpolation strategy. Using SKI with local cubic kernel interpolation, we introduce KISS-GP, which 1) is more scalable than inducing point alternatives, 2) naturally enables Kronecker and Toeplitz algebra for substantial additional gains in scalability without requiring any grid data, and 3) can be used for fast and expressive kernel learning. KISS-GP costs O(n) time and storage for GP inference. We evaluate KISS-GP on kernel matrix approximation, kernel learning, and natural sound modelling.
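At the heart of SKI is the approximation K_XX ≈ W K_UU W^T, where K_UU is the m x m covariance matrix over the inducing points U and W is an n x m sparse matrix of interpolation weights. A matrix-vector product with the approximate kernel then costs O(n + m^2) in general, or O(n + m log m) when U is a regular grid so that K_UU is Toeplitz. The sketch below is not from the paper: it illustrates the idea in one dimension with NumPy/SciPy, using local linear rather than cubic interpolation for brevity, and the helper names rbf_kernel and ski_weights are hypothetical.

import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import LinearOperator, cg

def rbf_kernel(a, b, lengthscale=0.5):
    # Squared-exponential kernel k(a, b) = exp(-(a - b)^2 / (2 * lengthscale^2)).
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def ski_weights(x, grid):
    # Sparse n x m interpolation matrix W: each row spreads a data point
    # over its two nearest grid points (linear interpolation; KISS-GP itself
    # uses local cubic interpolation, giving 4 nonzeros per row).
    m = len(grid)
    h = grid[1] - grid[0]                          # regular grid spacing
    idx = np.clip(((x - grid[0]) / h).astype(int), 0, m - 2)
    frac = (x - grid[idx]) / h                     # position within the cell
    rows = np.repeat(np.arange(len(x)), 2)
    cols = np.stack([idx, idx + 1], axis=1).ravel()
    vals = np.stack([1.0 - frac, frac], axis=1).ravel()
    return csr_matrix((vals, (rows, cols)), shape=(len(x), m))

rng = np.random.default_rng(0)
n, m = 2000, 100
x = np.sort(rng.uniform(0.0, 1.0, n))              # inputs need not lie on a grid
y = np.sin(6 * np.pi * x) + 0.1 * rng.standard_normal(n)

grid = np.linspace(0.0, 1.0, m)                    # inducing (interpolation) points
W = ski_weights(x, grid)
K_uu = rbf_kernel(grid, grid)                      # Toeplitz for a regular grid

sigma2 = 0.1 ** 2                                  # noise variance
def mvm(v):
    # (W K_uu W^T + sigma^2 I) v without ever forming the n x n kernel;
    # exploiting the Toeplitz structure of K_uu would cut m^2 to m log m.
    return W @ (K_uu @ (W.T @ v)) + sigma2 * v

A = LinearOperator((n, n), matvec=mvm, dtype=np.float64)
alpha, info = cg(A, y)                             # linear conjugate gradients

x_test = np.linspace(0.0, 1.0, 5)
W_test = ski_weights(x_test, grid)                 # interpolate test covariances too
mean = W_test @ (K_uu @ (W.T @ alpha))             # GP predictive mean
print(mean)

For a full implementation with cubic interpolation and Toeplitz/Kronecker algebra, see for example GPyTorch's GridInterpolationKernel.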

Cite this Paper

BibTeX
@InProceedings{pmlr-v37-wilson15,
  title     = {Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP)},
  author    = {Wilson, Andrew and Nickisch, Hannes},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {1775--1784},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/wilson15.pdf},
  url       = {https://proceedings.mlr.press/v37/wilson15.html},
}
APA
Wilson, A. & Nickisch, H. (2015). Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP). Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:1775-1784. Available from https://proceedings.mlr.press/v37/wilson15.html.
