Fast Stochastic Alternating Direction Method of Multipliers

Wenliang Zhong, James Kwok
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):46-54, 2014.

Abstract

We propose a new stochastic alternating direction method of multipliers (ADMM) algorithm, which incrementally approximates the full gradient in the linearized ADMM formulation. Besides having a per-iteration complexity as low as that of existing stochastic ADMM algorithms, it improves the convergence rate on convex problems from O(1/√T) to O(1/T), where T is the number of iterations. This matches the convergence rate of the batch ADMM algorithm, but without the need to visit all the samples in each iteration. Experiments on the graph-guided fused lasso demonstrate that the new algorithm is significantly faster than state-of-the-art stochastic and batch ADMM algorithms.
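To make the abstract's idea concrete, the following is a minimal Python sketch of an incremental-average-gradient linearized ADMM applied to the graph-guided fused lasso with logistic loss. It is an illustration under stated assumptions, not the authors' reference implementation; the names (sag_admm, soft_threshold) and parameter defaults (lam, rho, eta, T) are hypothetical choices, and the paper's exact update rules may differ in detail.

import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1; used for the y-update.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def sag_admm(X, b, A, lam=0.1, rho=1.0, eta=0.1, T=10000, seed=0):
    # Sketch of incremental-average-gradient linearized ADMM for
    #   min_x (1/n) * sum_i log(1 + exp(-b_i * x^T X_i)) + lam * ||A x||_1,
    # split as min f(x) + lam * ||y||_1 subject to A x - y = 0.
    # X: (n, d) features, b: (n,) labels in {-1, +1},
    # A: (m, d) difference matrix encoding the penalty graph.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    x = np.zeros(d)
    y = np.zeros(A.shape[0])
    u = np.zeros(A.shape[0])        # scaled dual variable
    grads = np.zeros((n, d))        # table of stored per-sample gradients
    g_avg = np.zeros(d)             # running average of the stored gradients

    # Prefactorize the x-update system (rho * A^T A + I/eta) once.
    M_inv = np.linalg.inv(rho * A.T @ A + np.eye(d) / eta)

    for _ in range(T):
        i = rng.integers(n)
        # Fresh logistic-loss gradient for sample i at the current iterate.
        margin = b[i] * (X[i] @ x)
        g_new = -b[i] * X[i] / (1.0 + np.exp(margin))
        # Incremental refresh: swap sample i's stored gradient into the average.
        g_avg += (g_new - grads[i]) / n
        grads[i] = g_new

        # Linearized x-update:
        #   argmin_x g_avg^T x + (rho/2)||A x - y + u||^2 + (1/(2*eta))||x - x_t||^2
        x = M_inv @ (x / eta - g_avg + rho * A.T @ (y - u))

        # Standard scaled-ADMM y- and dual updates.
        Ax = A @ x
        y = soft_threshold(Ax + u, lam / rho)
        u += Ax - y
    return x

The point of the stored-gradient table is that each iteration touches only one sample, yet the x-update uses the running average of all n stored gradients, which increasingly approximates the full batch gradient. This is what lets such a method keep the low per-iteration cost of stochastic ADMM while recovering the O(1/T) rate of batch ADMM on convex problems.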

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-zhong14,
  title     = {Fast Stochastic Alternating Direction Method of Multipliers},
  author    = {Zhong, Wenliang and Kwok, James},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {46--54},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/zhong14.pdf},
  url       = {https://proceedings.mlr.press/v32/zhong14.html},
  abstract  = {We propose a new stochastic alternating direction method of multipliers (ADMM) algorithm, which incrementally approximates the full gradient in the linearized ADMM formulation. Besides having a per-iteration complexity as low as that of existing stochastic ADMM algorithms, it improves the convergence rate on convex problems from $O(1/\sqrt{T})$ to $O(1/T)$, where $T$ is the number of iterations. This matches the convergence rate of the batch ADMM algorithm, but without the need to visit all the samples in each iteration. Experiments on the graph-guided fused lasso demonstrate that the new algorithm is significantly faster than state-of-the-art stochastic and batch ADMM algorithms.}
}
EndNote
%0 Conference Paper
%T Fast Stochastic Alternating Direction Method of Multipliers
%A Wenliang Zhong
%A James Kwok
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-zhong14
%I PMLR
%P 46--54
%U https://proceedings.mlr.press/v32/zhong14.html
%V 32
%N 1
%X We propose a new stochastic alternating direction method of multipliers (ADMM) algorithm, which incrementally approximates the full gradient in the linearized ADMM formulation. Besides having a per-iteration complexity as low as that of existing stochastic ADMM algorithms, it improves the convergence rate on convex problems from O(1/√T) to O(1/T), where T is the number of iterations. This matches the convergence rate of the batch ADMM algorithm, but without the need to visit all the samples in each iteration. Experiments on the graph-guided fused lasso demonstrate that the new algorithm is significantly faster than state-of-the-art stochastic and batch ADMM algorithms.
RIS
TY - CPAPER
TI - Fast Stochastic Alternating Direction Method of Multipliers
AU - Wenliang Zhong
AU - James Kwok
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/01/27
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-zhong14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 1
SP - 46
EP - 54
L1 - http://proceedings.mlr.press/v32/zhong14.pdf
UR - https://proceedings.mlr.press/v32/zhong14.html
AB - We propose a new stochastic alternating direction method of multipliers (ADMM) algorithm, which incrementally approximates the full gradient in the linearized ADMM formulation. Besides having a per-iteration complexity as low as that of existing stochastic ADMM algorithms, it improves the convergence rate on convex problems from O(1/√T) to O(1/T), where T is the number of iterations. This matches the convergence rate of the batch ADMM algorithm, but without the need to visit all the samples in each iteration. Experiments on the graph-guided fused lasso demonstrate that the new algorithm is significantly faster than state-of-the-art stochastic and batch ADMM algorithms.
ER -
APA
Zhong, W. & Kwok, J. (2014). Fast Stochastic Alternating Direction Method of Multipliers. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(1):46-54. Available from https://proceedings.mlr.press/v32/zhong14.html.
