Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization

Shai Shalev-Shwartz, Tong Zhang
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):64-72, 2014.

Abstract

We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
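To make the object of study concrete: stochastic dual coordinate ascent (SDCA) maintains a dual variable per example and updates one randomly chosen coordinate at a time, keeping the primal iterate in sync. Below is a minimal sketch of the plain (non-accelerated, non-proximal) SDCA inner loop for ridge regression, where the coordinate maximization has a closed form. The function name `sdca_ridge` and its parameters are illustrative assumptions, not the paper's algorithm; the paper's method additionally handles general regularizers via a proximal step and wraps these inner iterations in an accelerated outer loop.

```python
import numpy as np

def sdca_ridge(X, y, lam=0.1, n_epochs=50, seed=0):
    """Illustrative plain-SDCA sketch (not the paper's accelerated variant) for
    ridge regression:
        min_w (1/n) * sum_i 0.5 * (x_i^T w - y_i)^2 + (lam/2) * ||w||^2.
    Maintains dual variables alpha and the primal iterate
        w = (1/(lam*n)) * X^T alpha.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    sq_norms = (X ** 2).sum(axis=1)  # ||x_i||^2, precomputed once
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            # Closed-form coordinate maximization of the dual objective
            # for squared loss: delta = (y_i - x_i^T w - alpha_i) / (1 + ||x_i||^2/(lam*n)).
            delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + sq_norms[i] / (lam * n))
            alpha[i] += delta
            w += delta * X[i] / (lam * n)  # keep w = X^T alpha / (lam*n) in sync
    return w

if __name__ == "__main__":
    # Tiny smoke test on synthetic data (illustrative only).
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 10))
    w_true = rng.normal(size=10)
    y = X @ w_true + 0.01 * rng.normal(size=200)
    w = sdca_ridge(X, y, lam=0.01)
    print("distance to generating weights:", np.linalg.norm(w - w_true))
```

In the paper's accelerated framework, a loop of this kind serves as the inner solver, and an outer iteration applies momentum-style extrapolation to the sequence of inner solutions, which is what yields the improved rates stated in the abstract.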

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-shalev-shwartz14,
  title     = {Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization},
  author    = {Shalev-Shwartz, Shai and Zhang, Tong},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {64--72},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/shalev-shwartz14.pdf},
  url       = {https://proceedings.mlr.press/v32/shalev-shwartz14.html},
  abstract  = {We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.}
}
Endnote
%0 Conference Paper
%T Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization
%A Shai Shalev-Shwartz
%A Tong Zhang
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-shalev-shwartz14
%I PMLR
%P 64--72
%U https://proceedings.mlr.press/v32/shalev-shwartz14.html
%V 32
%N 1
%X We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
RIS
TY  - CPAPER
TI  - Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization
AU  - Shai Shalev-Shwartz
AU  - Tong Zhang
BT  - Proceedings of the 31st International Conference on Machine Learning
DA  - 2014/01/27
ED  - Eric P. Xing
ED  - Tony Jebara
ID  - pmlr-v32-shalev-shwartz14
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 32
IS  - 1
SP  - 64
EP  - 72
L1  - http://proceedings.mlr.press/v32/shalev-shwartz14.pdf
UR  - https://proceedings.mlr.press/v32/shalev-shwartz14.html
AB  - We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
ER  -
APA
Shalev-Shwartz, S. & Zhang, T. (2014). Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(1):64-72. Available from https://proceedings.mlr.press/v32/shalev-shwartz14.html.
