Stochastic Alternating Direction Method of Multipliers

Hua Ouyang, Niao He, Long Tran, Alexander Gray
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(1):80-88, 2013.

Abstract

The Alternating Direction Method of Multipliers (ADMM) has received lots of attention recently due to the tremendous demand from large-scale and data-distributed machine learning applications. In this paper, we present a stochastic setting for optimization problems with non-smooth composite objective functions. To solve this problem, we propose a stochastic ADMM algorithm. Our algorithm applies to a more general class of convex and nonsmooth objective functions, beyond the smooth and separable least squares loss used in lasso. We also demonstrate the rates of convergence for our algorithm under various structural assumptions of the stochastic function: O(1/\sqrt{t}) for convex functions and O(\log t/t) for strongly convex functions. Compared to previous literature, we establish the convergence rate of ADMM for convex problems in terms of both the objective value and the feasibility violation. A novel application named Graph-Guided SVM is proposed to demonstrate the usefulness of our algorithm.
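The abstract describes the general pattern: at each step the smooth stochastic loss is handled through a single sampled gradient, while the nonsmooth regularizer keeps its exact proximal update inside the usual ADMM splitting. The following is a minimal sketch of that idea in Python for the simple special case min E[f(x)] + λ‖z‖₁ subject to x − z = 0, with a least-squares loss; it is not the paper's exact algorithm (the graph-guided penalty and parameter choices there differ), and all names and step-size constants here are illustrative.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def stochastic_admm_l1(sample_grad, d, lam=0.01, rho=1.0, T=5000, seed=0):
    """Sketch of stochastic ADMM for min E[f(x)] + lam*||z||_1, s.t. x - z = 0.

    At each iteration the smooth loss f is replaced by a linear model built
    from one stochastic gradient, so the x-update is in closed form.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(d)
    z = np.zeros(d)
    u = np.zeros(d)            # scaled dual variable for the constraint x - z = 0
    x_avg = np.zeros(d)
    z_avg = np.zeros(d)
    for k in range(1, T + 1):
        eta = 1.0 / np.sqrt(k)             # O(1/sqrt(k)) step size
        g = sample_grad(rng, x)            # stochastic gradient of f at x
        # x-update: argmin <g,x> + (rho/2)||x - z + u||^2 + ||x - x_prev||^2 / (2*eta)
        x = (x / eta + rho * (z - u) - g) / (1.0 / eta + rho)
        # z-update: exact proximal step on the l1 term
        z = soft_threshold(x + u, lam / rho)
        # dual ascent on the feasibility residual x - z
        u = u + x - z
        # running ergodic averages, as used in the O(1/sqrt(t)) analysis
        x_avg += (x - x_avg) / k
        z_avg += (z - z_avg) / k
    return x_avg, z_avg

# Usage: recover a sparse vector from noiseless streaming least-squares samples.
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])

def grad(rng, x):
    a = rng.standard_normal(x_true.size)   # one fresh sample per iteration
    return 2.0 * a * (a @ x - a @ x_true)  # gradient of (a@x - b)^2 with b = a@x_true

xbar, zbar = stochastic_admm_l1(grad, d=x_true.size)
```

The averaged pair (xbar, zbar) is the natural output for the convergence guarantees stated in the abstract, which bound both the objective gap and the feasibility violation ‖x − z‖.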

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-ouyang13,
  title     = {Stochastic Alternating Direction Method of Multipliers},
  author    = {Ouyang, Hua and He, Niao and Tran, Long and Gray, Alexander},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {80--88},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/ouyang13.pdf},
  url       = {https://proceedings.mlr.press/v28/ouyang13.html},
  abstract  = {The Alternating Direction Method of Multipliers (ADMM) has received lots of attention recently due to the tremendous demand from large-scale and data-distributed machine learning applications. In this paper, we present a stochastic setting for optimization problems with non-smooth composite objective functions. To solve this problem, we propose a stochastic ADMM algorithm. Our algorithm applies to a more general class of convex and nonsmooth objective functions, beyond the smooth and separable least squares loss used in lasso. We also demonstrate the rates of convergence for our algorithm under various structural assumptions of the stochastic function: O(1/\sqrt{t}) for convex functions and O(\log t/t) for strongly convex functions. Compared to previous literature, we establish the convergence rate of ADMM for convex problems in terms of both the objective value and the feasibility violation. A novel application named Graph-Guided SVM is proposed to demonstrate the usefulness of our algorithm.}
}
Endnote
%0 Conference Paper
%T Stochastic Alternating Direction Method of Multipliers
%A Hua Ouyang
%A Niao He
%A Long Tran
%A Alexander Gray
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-ouyang13
%I PMLR
%P 80--88
%U https://proceedings.mlr.press/v28/ouyang13.html
%V 28
%N 1
%X The Alternating Direction Method of Multipliers (ADMM) has received lots of attention recently due to the tremendous demand from large-scale and data-distributed machine learning applications. In this paper, we present a stochastic setting for optimization problems with non-smooth composite objective functions. To solve this problem, we propose a stochastic ADMM algorithm. Our algorithm applies to a more general class of convex and nonsmooth objective functions, beyond the smooth and separable least squares loss used in lasso. We also demonstrate the rates of convergence for our algorithm under various structural assumptions of the stochastic function: O(1/\sqrt{t}) for convex functions and O(\log t/t) for strongly convex functions. Compared to previous literature, we establish the convergence rate of ADMM for convex problems in terms of both the objective value and the feasibility violation. A novel application named Graph-Guided SVM is proposed to demonstrate the usefulness of our algorithm.
RIS
TY - CPAPER
TI - Stochastic Alternating Direction Method of Multipliers
AU - Hua Ouyang
AU - Niao He
AU - Long Tran
AU - Alexander Gray
BT - Proceedings of the 30th International Conference on Machine Learning
DA - 2013/02/13
ED - Sanjoy Dasgupta
ED - David McAllester
ID - pmlr-v28-ouyang13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 28
IS - 1
SP - 80
EP - 88
L1 - http://proceedings.mlr.press/v28/ouyang13.pdf
UR - https://proceedings.mlr.press/v28/ouyang13.html
AB - The Alternating Direction Method of Multipliers (ADMM) has received lots of attention recently due to the tremendous demand from large-scale and data-distributed machine learning applications. In this paper, we present a stochastic setting for optimization problems with non-smooth composite objective functions. To solve this problem, we propose a stochastic ADMM algorithm. Our algorithm applies to a more general class of convex and nonsmooth objective functions, beyond the smooth and separable least squares loss used in lasso. We also demonstrate the rates of convergence for our algorithm under various structural assumptions of the stochastic function: O(1/\sqrt{t}) for convex functions and O(\log t/t) for strongly convex functions. Compared to previous literature, we establish the convergence rate of ADMM for convex problems in terms of both the objective value and the feasibility violation. A novel application named Graph-Guided SVM is proposed to demonstrate the usefulness of our algorithm.
ER -
APA
Ouyang, H., He, N., Tran, L. &amp; Gray, A. (2013). Stochastic Alternating Direction Method of Multipliers. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(1):80-88. Available from https://proceedings.mlr.press/v28/ouyang13.html.