NYTRO: When Subsampling Meets Early Stopping

Raffaello Camoriano, Tomás Angles, Alessandro Rudi, Lorenzo Rosasco
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:1403-1411, 2016.

Abstract

Early stopping is a well-known approach to reducing the time complexity of training and model selection for large-scale learning machines. On the other hand, memory/space (rather than time) complexity is the main constraint in many applications, and randomized subsampling techniques have been proposed to tackle this issue. In this paper we ask whether early stopping and subsampling ideas can be combined in a fruitful way. We consider the question in a least squares regression setting and propose a form of randomized iterative regularization based on early stopping and subsampling. In this context, we analyze the statistical and computational properties of the proposed method. Theoretical results are complemented and validated by a thorough experimental analysis.
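To make the combination concrete, below is a minimal sketch of the idea the abstract describes: a Nyström-style subsampled kernel model trained by gradient descent on the least squares objective, with the number of iterations chosen by hold-out error (early stopping). The Gaussian kernel, uniform center sampling, fixed step size, and the nytro_sketch helper are illustrative assumptions, not the paper's exact algorithm or tuning rules.

import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian kernel between the rows of A and the rows of B.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

def nytro_sketch(X, y, X_val, y_val, m=100, sigma=1.0, step=None, max_iter=500):
    """Nystrom-subsampled least squares trained by gradient descent,
    with the iteration count selected on a hold-out set (early stopping).
    Illustrative sketch only: uniform center sampling, fixed step size."""
    n = X.shape[0]
    centers = X[np.random.choice(n, size=min(m, n), replace=False)]
    Knm = gaussian_kernel(X, centers, sigma)       # n x m subsampled kernel matrix
    Kvm = gaussian_kernel(X_val, centers, sigma)   # validation-set features
    if step is None:
        # Conservative step size: inverse Lipschitz constant of the gradient.
        step = n / (np.linalg.norm(Knm, 2) ** 2)
    alpha = np.zeros(Knm.shape[1])
    best_alpha, best_err = alpha.copy(), np.inf
    for t in range(max_iter):
        grad = Knm.T @ (Knm @ alpha - y) / n       # gradient of the subsampled LS risk
        alpha -= step * grad
        err = np.mean((Kvm @ alpha - y_val) ** 2)  # hold-out error as stopping criterion
        if err < best_err:
            best_alpha, best_err = alpha.copy(), err
    return centers, best_alpha

# Usage: predictions at new points X_new are gaussian_kernel(X_new, centers, sigma) @ alpha.

Note the space saving that motivates the combination: each iteration only touches the n x m matrix Knm rather than the full n x n kernel matrix, while early stopping plays the role of the regularization parameter.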

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-camoriano16,
  title     = {NYTRO: When Subsampling Meets Early Stopping},
  author    = {Camoriano, Raffaello and Angles, Tomás and Rudi, Alessandro and Rosasco, Lorenzo},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {1403--1411},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/camoriano16.pdf},
  url       = {https://proceedings.mlr.press/v51/camoriano16.html}
}
Endnote
%0 Conference Paper
%T NYTRO: When Subsampling Meets Early Stopping
%A Raffaello Camoriano
%A Tomás Angles
%A Alessandro Rudi
%A Lorenzo Rosasco
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-camoriano16
%I PMLR
%P 1403--1411
%U https://proceedings.mlr.press/v51/camoriano16.html
%V 51
RIS
TY - CPAPER
TI - NYTRO: When Subsampling Meets Early Stopping
AU - Raffaello Camoriano
AU - Tomás Angles
AU - Alessandro Rudi
AU - Lorenzo Rosasco
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-camoriano16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 51
SP - 1403
EP - 1411
L1 - http://proceedings.mlr.press/v51/camoriano16.pdf
UR - https://proceedings.mlr.press/v51/camoriano16.html
ER -
APA
Camoriano, R., Angles, T., Rudi, A. & Rosasco, L. (2016). NYTRO: When Subsampling Meets Early Stopping. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:1403-1411. Available from https://proceedings.mlr.press/v51/camoriano16.html.
