Efficient Approximation of Cross-Validation for Kernel Methods using Bouligand Influence Function

Yong Liu, Shali Jiang, Shizhong Liao
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):324-332, 2014.

Abstract

Model selection is one of the key issues in both recent research and the application of kernel methods. Cross-validation is a commonly employed and widely accepted model selection criterion. However, it requires training the algorithm under consideration multiple times, which is computationally intensive. In this paper, we present a novel strategy for approximating cross-validation based on the Bouligand influence function (BIF), which requires solving the algorithm only once. The BIF measures the impact of an infinitesimally small amount of contamination of the original distribution. We first establish the link between the concept of BIF and the concept of cross-validation; the BIF is related to the first-order term of a Taylor expansion. We then compute the BIF and higher-order BIFs, and apply these theoretical results to approximate the cross-validation error in practice. Experimental results demonstrate that our approximate cross-validation criterion is sound and efficient.
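To make the contamination view of cross-validation concrete, the LaTeX sketch below states the standard definition of the Bouligand influence function, the Taylor-style expansion it induces, and the empirical-distribution identity that turns "remove one fold" into a contamination of the full sample. The notation (T, P_n, P_V, epsilon) and the equal-size-fold assumption are our own illustration of the usual argument, not formulas quoted from the paper.

% Bouligand influence function (BIF) of a map T at distribution P in direction Q
% (standard definition; notation assumed for illustration, not taken from the paper)
\[
  \mathrm{BIF}(Q; T, P) \;=\; \lim_{\epsilon \downarrow 0}
    \frac{T\bigl((1-\epsilon)P + \epsilon Q\bigr) - T(P)}{\epsilon}.
\]
% Higher-order BIFs collect the higher-order terms of the expansion:
\[
  T\bigl((1-\epsilon)P + \epsilon Q\bigr)
    \;\approx\; T(P) \;+\; \sum_{k=1}^{m} \frac{\epsilon^{k}}{k!}\,
      \mathrm{BIF}_{k}(Q; T, P).
\]
% Link to K-fold cross-validation: let P_n be the empirical distribution of all
% n training points and P_V that of one held-out fold of size n/K (equal folds
% assumed). Removing the fold is a contamination of P_n in the direction of P_V
% with a negative weight:
\[
  P_{\setminus V} \;=\; (1-\epsilon)\,P_n + \epsilon\,P_V,
  \qquad \epsilon = -\tfrac{1}{K-1},
\]
% so the solution trained without fold V, T(P_{\setminus V}), can be approximated
% from the single full-data solution T(P_n) via the expansion above, yielding an
% approximate cross-validation error without retraining the algorithm K times.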

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-liua14,
  title     = {Efficient Approximation of Cross-Validation for Kernel Methods using Bouligand Influence Function},
  author    = {Liu, Yong and Jiang, Shali and Liao, Shizhong},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {324--332},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/liua14.pdf},
  url       = {https://proceedings.mlr.press/v32/liua14.html},
  abstract  = {Model selection is one of the key issues in both recent research and the application of kernel methods. Cross-validation is a commonly employed and widely accepted model selection criterion. However, it requires training the algorithm under consideration multiple times, which is computationally intensive. In this paper, we present a novel strategy for approximating cross-validation based on the Bouligand influence function (BIF), which requires solving the algorithm only once. The BIF measures the impact of an infinitesimally small amount of contamination of the original distribution. We first establish the link between the concept of BIF and the concept of cross-validation; the BIF is related to the first-order term of a Taylor expansion. We then compute the BIF and higher-order BIFs, and apply these theoretical results to approximate the cross-validation error in practice. Experimental results demonstrate that our approximate cross-validation criterion is sound and efficient.}
}
APA
Liu, Y., Jiang, S. & Liao, S. (2014). Efficient Approximation of Cross-Validation for Kernel Methods using Bouligand Influence Function. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(1):324-332. Available from https://proceedings.mlr.press/v32/liua14.html.
