A la Carte – Learning Fast Kernels

Zichao Yang, Andrew Wilson, Alex Smola, Le Song
Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, PMLR 38:1098-1106, 2015.

Abstract

Kernel methods have great promise for learning rich statistical representations of large modern datasets. However, compared to neural networks, kernel methods have been perceived as lacking in scalability and flexibility. We introduce a family of fast, flexible, general purpose, and lightly parametrized kernel learning methods, derived from Fastfood basis function expansions. We provide mechanisms to learn the properties of groups of spectral frequencies in these expansions, which require only O(m log d) time and O(m) memory, for m basis functions and d input dimensions. We show that the proposed methods can learn a wide class of kernels, outperforming the alternatives in accuracy, speed, and memory consumption.
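The abstract's O(m log d) time / O(m) memory claim comes from the Fastfood construction, which replaces a dense Gaussian projection with products of diagonal and Hadamard matrices. The sketch below (an illustrative reconstruction, not the authors' code; `fastfood_features`, `n_blocks`, and the chi-distributed rescaling are assumptions following the standard Fastfood recipe V = S H G Π H B / (σ√d)) approximates RBF-kernel random features this way:

```python
import numpy as np

def fwht(X):
    """Fast Walsh-Hadamard transform of each row of X; the number of
    columns d must be a power of two. Runs in O(n d log d) time."""
    X = X.astype(float).copy()
    d = X.shape[1]
    h = 1
    while h < d:
        for i in range(0, d, 2 * h):
            a = X[:, i:i + h].copy()
            X[:, i:i + h] = a + X[:, i + h:i + 2 * h]
            X[:, i + h:i + 2 * h] = a - X[:, i + h:i + 2 * h]
        h *= 2
    return X

def fastfood_features(X, n_blocks=8, sigma=1.0, seed=0):
    """Approximate RBF-kernel random features via stacked Fastfood blocks.

    Each block applies V = S H G Pi H B / (sigma * sqrt(d)): random signs B,
    a Hadamard transform H, a permutation Pi, a Gaussian diagonal G, and a
    chi-distributed rescaling S. Only the diagonals and the permutation are
    stored, so m = n_blocks * d frequencies cost O(m) memory and, via the
    Hadamard transform, O(m log d) time per input point."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    assert d & (d - 1) == 0, "pad the input so that d is a power of two"
    projections = []
    for _ in range(n_blocks):
        B = rng.choice([-1.0, 1.0], size=d)   # random sign flips
        Pi = rng.permutation(d)               # random permutation
        G = rng.standard_normal(d)            # Gaussian diagonal
        # S makes the row norms of V chi-distributed, as for a dense
        # Gaussian matrix.
        S = np.sqrt(rng.chisquare(d, size=d)) / np.linalg.norm(G)
        V = fwht(X * B)[:, Pi] * G
        V = fwht(V) * S / (sigma * np.sqrt(d))
        projections.append(V)
    W = np.hstack(projections)                # (n, m) projected inputs
    m = W.shape[1]
    # Random Fourier features: z(x) = [cos(Wx), sin(Wx)] / sqrt(m),
    # so z(x) . z(y) ~= exp(-||x - y||^2 / (2 sigma^2)).
    return np.hstack([np.cos(W), np.sin(W)]) / np.sqrt(m)
```

With enough blocks, the inner product of two feature vectors concentrates around the exact Gaussian kernel value; learning the kernel, as in the paper, amounts to adapting the parameters of these spectral-frequency groups rather than keeping them fixed.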

Cite this Paper


BibTeX
@InProceedings{pmlr-v38-yang15b,
  title     = {{A la Carte -- Learning Fast Kernels}},
  author    = {Yang, Zichao and Wilson, Andrew and Smola, Alex and Song, Le},
  booktitle = {Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {1098--1106},
  year      = {2015},
  editor    = {Lebanon, Guy and Vishwanathan, S. V. N.},
  volume    = {38},
  series    = {Proceedings of Machine Learning Research},
  address   = {San Diego, California, USA},
  month     = {09--12 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v38/yang15b.pdf},
  url       = {https://proceedings.mlr.press/v38/yang15b.html},
  abstract  = {Kernel methods have great promise for learning rich statistical representations of large modern datasets. However, compared to neural networks, kernel methods have been perceived as lacking in scalability and flexibility. We introduce a family of fast, flexible, general purpose, and lightly parametrized kernel learning methods, derived from Fastfood basis function expansions. We provide mechanisms to learn the properties of groups of spectral frequencies in these expansions, which require only O(m log d) time and O(m) memory, for m basis functions and d input dimensions. We show that the proposed methods can learn a wide class of kernels, outperforming the alternatives in accuracy, speed, and memory consumption.}
}
Endnote
%0 Conference Paper
%T A la Carte – Learning Fast Kernels
%A Zichao Yang
%A Andrew Wilson
%A Alex Smola
%A Le Song
%B Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2015
%E Guy Lebanon
%E S. V. N. Vishwanathan
%F pmlr-v38-yang15b
%I PMLR
%P 1098--1106
%U https://proceedings.mlr.press/v38/yang15b.html
%V 38
%X Kernel methods have great promise for learning rich statistical representations of large modern datasets. However, compared to neural networks, kernel methods have been perceived as lacking in scalability and flexibility. We introduce a family of fast, flexible, general purpose, and lightly parametrized kernel learning methods, derived from Fastfood basis function expansions. We provide mechanisms to learn the properties of groups of spectral frequencies in these expansions, which require only O(m log d) time and O(m) memory, for m basis functions and d input dimensions. We show that the proposed methods can learn a wide class of kernels, outperforming the alternatives in accuracy, speed, and memory consumption.
RIS
TY  - CPAPER
TI  - A la Carte – Learning Fast Kernels
AU  - Zichao Yang
AU  - Andrew Wilson
AU  - Alex Smola
AU  - Le Song
BT  - Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
DA  - 2015/02/21
ED  - Guy Lebanon
ED  - S. V. N. Vishwanathan
ID  - pmlr-v38-yang15b
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 38
SP  - 1098
EP  - 1106
L1  - http://proceedings.mlr.press/v38/yang15b.pdf
UR  - https://proceedings.mlr.press/v38/yang15b.html
AB  - Kernel methods have great promise for learning rich statistical representations of large modern datasets. However, compared to neural networks, kernel methods have been perceived as lacking in scalability and flexibility. We introduce a family of fast, flexible, general purpose, and lightly parametrized kernel learning methods, derived from Fastfood basis function expansions. We provide mechanisms to learn the properties of groups of spectral frequencies in these expansions, which require only O(m log d) time and O(m) memory, for m basis functions and d input dimensions. We show that the proposed methods can learn a wide class of kernels, outperforming the alternatives in accuracy, speed, and memory consumption.
ER  -
APA
Yang, Z., Wilson, A., Smola, A. & Song, L. (2015). A la Carte – Learning Fast Kernels. Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 38:1098-1106. Available from https://proceedings.mlr.press/v38/yang15b.html.