Online Passive-Aggressive Algorithms for Non-Negative Matrix Factorization and Completion

Mathieu Blondel, Yotaro Kubo, Naonori Ueda
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:96-104, 2014.

Abstract

Stochastic Gradient Descent (SGD) is a popular online algorithm for large-scale matrix factorization. However, SGD can often be difficult to use for practitioners, because its performance is very sensitive to the choice of the learning rate parameter. In this paper, we present non-negative passive-aggressive (NN-PA), a family of online algorithms for non-negative matrix factorization (NMF). Our algorithms are scalable, easy to implement and do not require the tedious tuning of a learning rate parameter. We demonstrate the effectiveness of our algorithms on three large-scale matrix completion problems and analyze them in the regret bound model.
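To make the passive-aggressive idea concrete, the sketch below is an illustrative NumPy example, not the paper's exact NN-PA algorithms: for each observed entry it applies the standard PA-I regression step (epsilon-insensitive loss, aggressiveness cap C) to one factor at a time and then clips the result to the non-negative orthant. All names and hyper-parameter values here (nn_pa_epoch, C, eps, rank) are assumptions made for the example.

import numpy as np

def nn_pa_epoch(observations, W, H, C=1.0, eps=0.1):
    """One pass over the observed entries of a partially observed matrix.

    observations : iterable of (i, j, value) triples
    W : (n_rows, rank) non-negative factor, updated in place
    H : (n_cols, rank) non-negative factor, updated in place
    C : aggressiveness cap (PA-I style)
    eps : epsilon-insensitive tolerance on the reconstruction error
    """
    for i, j, a_ij in observations:
        w, h = W[i], H[j]
        err = w @ h - a_ij
        loss = max(0.0, abs(err) - eps)           # epsilon-insensitive loss
        if loss == 0.0:
            continue                              # passive: prediction is already good enough
        # Aggressive: smallest step on the row factor that drives the loss
        # to zero, capped by C, then projected onto the non-negative orthant.
        tau = min(C, loss / (h @ h + 1e-12))
        W[i] = np.maximum(0.0, w - np.sign(err) * tau * h)
        # Symmetric step on the column factor, using the updated row.
        w = W[i]
        err = w @ h - a_ij
        loss = max(0.0, abs(err) - eps)
        if loss > 0.0:
            tau = min(C, loss / (w @ w + 1e-12))
            H[j] = np.maximum(0.0, h - np.sign(err) * tau * w)


# Tiny usage example on a random low-rank non-negative matrix.
rng = np.random.default_rng(0)
n, m, rank = 50, 40, 5
A = rng.random((n, rank)) @ rng.random((rank, m))
mask = rng.random((n, m)) < 0.3                   # 30% of entries observed
obs = [(i, j, A[i, j]) for i, j in zip(*np.nonzero(mask))]

W = rng.random((n, rank))
H = rng.random((m, rank))
for _ in range(10):
    rng.shuffle(obs)
    nn_pa_epoch(obs, W, H)

rmse = np.sqrt(np.mean((W @ H.T - A)[mask] ** 2))
print(f"observed-entry RMSE after 10 passes: {rmse:.4f}")

Note that the only tunable quantities are the cap C and the tolerance eps; there is no learning rate schedule, which is the practical appeal of passive-aggressive updates highlighted in the abstract.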

Cite this Paper


BibTeX
@InProceedings{pmlr-v33-blondel14,
  title     = {{Online Passive-Aggressive Algorithms for Non-Negative Matrix Factorization and Completion}},
  author    = {Blondel, Mathieu and Kubo, Yotaro and Ueda, Naonori},
  booktitle = {Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics},
  pages     = {96--104},
  year      = {2014},
  editor    = {Kaski, Samuel and Corander, Jukka},
  volume    = {33},
  series    = {Proceedings of Machine Learning Research},
  address   = {Reykjavik, Iceland},
  month     = {22--25 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v33/blondel14.pdf},
  url       = {https://proceedings.mlr.press/v33/blondel14.html},
  abstract  = {Stochastic Gradient Descent (SGD) is a popular online algorithm for large-scale matrix factorization. However, SGD can often be difficult to use for practitioners, because its performance is very sensitive to the choice of the learning rate parameter. In this paper, we present non-negative passive-aggressive (NN-PA), a family of online algorithms for non-negative matrix factorization (NMF). Our algorithms are scalable, easy to implement and do not require the tedious tuning of a learning rate parameter. We demonstrate the effectiveness of our algorithms on three large-scale matrix completion problems and analyze them in the regret bound model.}
}
APA
Blondel, M., Kubo, Y. & Ueda, N. (2014). Online Passive-Aggressive Algorithms for Non-Negative Matrix Factorization and Completion. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 33:96-104. Available from https://proceedings.mlr.press/v33/blondel14.html.

Related Material

Download PDF: http://proceedings.mlr.press/v33/blondel14.pdf