On the Impact of Kernel Approximation on Learning Accuracy

Corinna Cortes, Mehryar Mohri, Ameet Talwalkar
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, PMLR 9:113-120, 2010.

Abstract

Kernel approximation is commonly used to scale kernel-based algorithms to applications containing as many as several million instances. This paper analyzes the effect of such approximations of the kernel matrix on the hypothesis generated by several widely used learning algorithms. We give stability bounds based on the norm of the kernel approximation for these algorithms, including support vector machines (SVMs), kernel ridge regression (KRR), and graph Laplacian-based regularization algorithms. These bounds help determine the degree of approximation that can be tolerated in the estimation of the kernel matrix. Our analysis is general and applies to arbitrary approximations of the kernel matrix. However, we also give a specific analysis of the Nyström low-rank approximation in this context and report the results of experiments evaluating the quality of the Nyström low-rank kernel approximation when used with ridge regression.
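
To make the setting studied here concrete, the following is a minimal sketch of kernel ridge regression trained on a Nyström low-rank approximation of the kernel matrix in place of the exact matrix. It is illustrative only, not the authors' code: the RBF kernel, uniform column sampling, and all parameter values are assumptions made for the example.

# Illustrative sketch: KRR with an exact kernel matrix K versus a
# rank-m Nystrom approximation K_tilde = C W^+ C^T of K.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystrom_approximation(K, m, rng):
    # Sample m columns uniformly at random; C is the n x m block of
    # sampled columns, W the m x m intersection block.
    n = K.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    C = K[:, idx]
    W = K[np.ix_(idx, idx)]
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)

K = rbf_kernel(X, X)
K_tilde = nystrom_approximation(K, m=50, rng=rng)

lam = 1e-2  # ridge regularization parameter (illustrative value)
alpha_exact = np.linalg.solve(K + lam * np.eye(500), y)
alpha_approx = np.linalg.solve(K_tilde + lam * np.eye(500), y)

# The paper's stability bounds relate the change in the learned
# hypothesis to the norm of the kernel perturbation ||K_tilde - K||.
print("||K_tilde - K||_2 :", np.linalg.norm(K_tilde - K, 2))
print("max |h(x) - h'(x)| on training points:",
      np.max(np.abs(K @ alpha_exact - K_tilde @ alpha_approx)))

Increasing the number of sampled columns m tightens the approximation of K, and the printed hypothesis gap shrinks accordingly, which is the qualitative behavior the paper's bounds quantify.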

Cite this Paper


BibTeX
@InProceedings{pmlr-v9-cortes10a,
  title     = {On the Impact of Kernel Approximation on Learning Accuracy},
  author    = {Cortes, Corinna and Mohri, Mehryar and Talwalkar, Ameet},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {113--120},
  year      = {2010},
  editor    = {Teh, Yee Whye and Titterington, Mike},
  volume    = {9},
  series    = {Proceedings of Machine Learning Research},
  address   = {Chia Laguna Resort, Sardinia, Italy},
  month     = {13--15 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v9/cortes10a/cortes10a.pdf},
  url       = {https://proceedings.mlr.press/v9/cortes10a.html},
  abstract  = {Kernel approximation is commonly used to scale kernel-based algorithms to applications containing as many as several million instances. This paper analyzes the effect of such approximations of the kernel matrix on the hypothesis generated by several widely used learning algorithms. We give stability bounds based on the norm of the kernel approximation for these algorithms, including support vector machines (SVMs), kernel ridge regression (KRR), and graph Laplacian-based regularization algorithms. These bounds help determine the degree of approximation that can be tolerated in the estimation of the kernel matrix. Our analysis is general and applies to arbitrary approximations of the kernel matrix. However, we also give a specific analysis of the Nystr{\"o}m low-rank approximation in this context and report the results of experiments evaluating the quality of the Nystr{\"o}m low-rank kernel approximation when used with ridge regression.}
}
Endnote
%0 Conference Paper
%T On the Impact of Kernel Approximation on Learning Accuracy
%A Corinna Cortes
%A Mehryar Mohri
%A Ameet Talwalkar
%B Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2010
%E Yee Whye Teh
%E Mike Titterington
%F pmlr-v9-cortes10a
%I PMLR
%P 113--120
%U https://proceedings.mlr.press/v9/cortes10a.html
%V 9
%X Kernel approximation is commonly used to scale kernel-based algorithms to applications containing as many as several million instances. This paper analyzes the effect of such approximations of the kernel matrix on the hypothesis generated by several widely used learning algorithms. We give stability bounds based on the norm of the kernel approximation for these algorithms, including support vector machines (SVMs), kernel ridge regression (KRR), and graph Laplacian-based regularization algorithms. These bounds help determine the degree of approximation that can be tolerated in the estimation of the kernel matrix. Our analysis is general and applies to arbitrary approximations of the kernel matrix. However, we also give a specific analysis of the Nyström low-rank approximation in this context and report the results of experiments evaluating the quality of the Nyström low-rank kernel approximation when used with ridge regression.
RIS
TY - CPAPER
TI - On the Impact of Kernel Approximation on Learning Accuracy
AU - Corinna Cortes
AU - Mehryar Mohri
AU - Ameet Talwalkar
BT - Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
DA - 2010/03/31
ED - Yee Whye Teh
ED - Mike Titterington
ID - pmlr-v9-cortes10a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 9
SP - 113
EP - 120
L1 - http://proceedings.mlr.press/v9/cortes10a/cortes10a.pdf
UR - https://proceedings.mlr.press/v9/cortes10a.html
AB - Kernel approximation is commonly used to scale kernel-based algorithms to applications containing as many as several million instances. This paper analyzes the effect of such approximations of the kernel matrix on the hypothesis generated by several widely used learning algorithms. We give stability bounds based on the norm of the kernel approximation for these algorithms, including support vector machines (SVMs), kernel ridge regression (KRR), and graph Laplacian-based regularization algorithms. These bounds help determine the degree of approximation that can be tolerated in the estimation of the kernel matrix. Our analysis is general and applies to arbitrary approximations of the kernel matrix. However, we also give a specific analysis of the Nyström low-rank approximation in this context and report the results of experiments evaluating the quality of the Nyström low-rank kernel approximation when used with ridge regression.
ER -
APA
Cortes, C., Mohri, M. & Talwalkar, A. (2010). On the Impact of Kernel Approximation on Learning Accuracy. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 9:113-120. Available from https://proceedings.mlr.press/v9/cortes10a.html.