Nuclear Norm Minimization via Active Subspace Selection

Cho-Jui Hsieh, Peder Olsen
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):575-583, 2014.

Abstract

We describe a novel approach to optimizing matrix problems involving nuclear norm regularization and apply it to the matrix completion problem. We combine methods from non-smooth and smooth optimization. At each step we use the proximal gradient to select an active subspace. We then find a smooth, convex relaxation of the smaller subspace problems and solve these using second order methods. We apply our methods to matrix completion problems, including the Netflix dataset, and show that they are more than 6 times faster than state-of-the-art nuclear norm solvers. This is also the first paper to scale nuclear norm solvers to the Yahoo-Music dataset, and the first time in the literature that nuclear norm solvers have been shown to compete in efficiency with non-convex solvers such as Alternating Least Squares (ALS).
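As background for the proximal gradient step the abstract refers to, the sketch below illustrates nuclear norm regularized matrix completion via singular value soft-thresholding, the prox operator of the nuclear norm. It is our own minimal NumPy illustration under simple assumptions (the function names and the toy problem are hypothetical), not the authors' implementation: the paper additionally uses the thresholded SVD to select an active subspace and then applies second order methods to the reduced problem.

import numpy as np

# Illustrative only: proximal gradient for
#   min_X  0.5 * ||P_Omega(X - A)||_F^2 + lam * ||X||_*
# where P_Omega keeps the observed entries of A. The prox of lam*||.||_*
# shrinks singular values by lam; the singular vectors that survive span
# the kind of active subspace the paper restricts its second order solve to.

def svt(M, tau):
    # Singular value soft-thresholding: prox of tau * (nuclear norm) at M.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    keep = s > 0
    return (U[:, keep] * s[keep]) @ Vt[keep, :]

def prox_grad_step(X, A, mask, lam, step=1.0):
    # Gradient of the smooth masked least squares loss, then the prox.
    grad = mask * (X - A)
    return svt(X - step * grad, step * lam)

# Toy run: complete a rank-2 matrix from roughly half of its entries.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(A.shape) < 0.5
X = np.zeros_like(A)
for _ in range(200):
    X = prox_grad_step(X, A, mask, lam=0.5)
print("observed RMSE:", np.sqrt(np.mean((mask * (X - A)) ** 2)))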

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-hsiehb14,
  title     = {Nuclear Norm Minimization via Active Subspace Selection},
  author    = {Hsieh, Cho-Jui and Olsen, Peder},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {575--583},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/hsiehb14.pdf},
  url       = {https://proceedings.mlr.press/v32/hsiehb14.html},
  abstract  = {We describe a novel approach to optimizing matrix problems involving nuclear norm regularization and apply it to the matrix completion problem. We combine methods from non-smooth and smooth optimization. At each step we use the proximal gradient to select an active subspace. We then find a smooth, convex relaxation of the smaller subspace problems and solve these using second order methods. We apply our methods to matrix completion problems, including the Netflix dataset, and show that they are more than 6 times faster than state-of-the-art nuclear norm solvers. This is also the first paper to scale nuclear norm solvers to the Yahoo-Music dataset, and the first time in the literature that nuclear norm solvers have been shown to compete in efficiency with non-convex solvers such as Alternating Least Squares (ALS).}
}
Endnote
%0 Conference Paper
%T Nuclear Norm Minimization via Active Subspace Selection
%A Cho-Jui Hsieh
%A Peder Olsen
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-hsiehb14
%I PMLR
%P 575--583
%U https://proceedings.mlr.press/v32/hsiehb14.html
%V 32
%N 1
%X We describe a novel approach to optimizing matrix problems involving nuclear norm regularization and apply it to the matrix completion problem. We combine methods from non-smooth and smooth optimization. At each step we use the proximal gradient to select an active subspace. We then find a smooth, convex relaxation of the smaller subspace problems and solve these using second order methods. We apply our methods to matrix completion problems, including the Netflix dataset, and show that they are more than 6 times faster than state-of-the-art nuclear norm solvers. This is also the first paper to scale nuclear norm solvers to the Yahoo-Music dataset, and the first time in the literature that nuclear norm solvers have been shown to compete in efficiency with non-convex solvers such as Alternating Least Squares (ALS).
RIS
TY - CPAPER
TI - Nuclear Norm Minimization via Active Subspace Selection
AU - Cho-Jui Hsieh
AU - Peder Olsen
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/01/27
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-hsiehb14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 1
SP - 575
EP - 583
L1 - http://proceedings.mlr.press/v32/hsiehb14.pdf
UR - https://proceedings.mlr.press/v32/hsiehb14.html
AB - We describe a novel approach to optimizing matrix problems involving nuclear norm regularization and apply it to the matrix completion problem. We combine methods from non-smooth and smooth optimization. At each step we use the proximal gradient to select an active subspace. We then find a smooth, convex relaxation of the smaller subspace problems and solve these using second order methods. We apply our methods to matrix completion problems, including the Netflix dataset, and show that they are more than 6 times faster than state-of-the-art nuclear norm solvers. This is also the first paper to scale nuclear norm solvers to the Yahoo-Music dataset, and the first time in the literature that nuclear norm solvers have been shown to compete in efficiency with non-convex solvers such as Alternating Least Squares (ALS).
ER -
APA
Hsieh, C. & Olsen, P. (2014). Nuclear Norm Minimization via Active Subspace Selection. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(1):575-583. Available from https://proceedings.mlr.press/v32/hsiehb14.html.