Fitting Spectral Decay with the k-Support Norm

Andrew McDonald, Massimiliano Pontil, Dimitris Stamos
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:1061-1069, 2016.

Abstract

The spectral k-support norm enjoys good estimation properties in low rank matrix learning problems, empirically outperforming the trace norm. Its unit ball is the convex hull of rank k matrices with unit Frobenius norm. In this paper we generalize the norm to the spectral (k,p)-support norm, whose additional parameter p can be used to tailor the norm to the decay of the spectrum of the underlying model. We characterize the unit ball and we explicitly compute the norm. We further provide a conditional gradient method to solve regularization problems with the norm, and we derive an efficient algorithm to compute the Euclidean projection on the unit ball in the case p=∞. In numerical experiments, we show that allowing p to vary significantly improves performance over the spectral k-support norm on various matrix completion benchmarks, and better captures the spectral decay of the underlying model.
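The abstract notes that the norm is computed explicitly from the spectrum of the matrix. As an illustration (not the paper's own code), the sketch below evaluates the original spectral k-support norm, i.e. the p=2 special case of the (k,p)-support norm studied here, using the standard closed form for the vector k-support norm applied to the singular values; the function names are our own.

```python
import numpy as np

def k_support_norm(w, k):
    """Vector k-support norm via its standard closed form.

    Sort |w| decreasingly as z_1 >= ... >= z_d, find the unique r in
    {0, ..., k-1} such that z_{k-r-1} > (1/(r+1)) * sum_{i>=k-r} z_i >= z_{k-r}
    (with the convention z_0 = +inf), and return
    sqrt( sum_{i < k-r} z_i^2 + (sum_{i >= k-r} z_i)^2 / (r+1) ).
    """
    z = np.sort(np.abs(np.asarray(w, dtype=float)))[::-1]
    d = z.size
    k = min(k, d)
    for r in range(k):
        tail = z[k - r - 1:].sum()                  # z_{k-r} + ... + z_d
        head = np.inf if k - r - 1 == 0 else z[k - r - 2]
        if head > tail / (r + 1) >= z[k - r - 1]:
            return float(np.sqrt((z[:k - r - 1] ** 2).sum() + tail ** 2 / (r + 1)))
    raise ValueError("no valid r found; input may contain NaN")

def spectral_k_support_norm(X, k):
    """Spectral k-support norm: the k-support norm of the singular values."""
    return k_support_norm(np.linalg.svd(X, compute_uv=False), k)
```

As a sanity check, for k=1 the spectral norm reduces to the l1 norm of the singular values (the trace norm), and for k equal to the rank dimension it reduces to the Frobenius norm, matching the interpolation property the abstract relies on.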

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-mcdonald16,
  title     = {Fitting Spectral Decay with the $k$-Support Norm},
  author    = {McDonald, Andrew and Pontil, Massimiliano and Stamos, Dimitris},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {1061--1069},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/mcdonald16.pdf},
  url       = {https://proceedings.mlr.press/v51/mcdonald16.html},
  abstract  = {The spectral k-support norm enjoys good estimation properties in low rank matrix learning problems, empirically outperforming the trace norm. Its unit ball is the convex hull of rank k matrices with unit Frobenius norm. In this paper we generalize the norm to the spectral (k,p)-support norm, whose additional parameter p can be used to tailor the norm to the decay of the spectrum of the underlying model. We characterize the unit ball and we explicitly compute the norm. We further provide a conditional gradient method to solve regularization problems with the norm, and we derive an efficient algorithm to compute the Euclidean projection on the unit ball in the case p=∞. In numerical experiments, we show that allowing p to vary significantly improves performance over the spectral k-support norm on various matrix completion benchmarks, and better captures the spectral decay of the underlying model.}
}
Endnote
%0 Conference Paper
%T Fitting Spectral Decay with the k-Support Norm
%A Andrew McDonald
%A Massimiliano Pontil
%A Dimitris Stamos
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-mcdonald16
%I PMLR
%P 1061--1069
%U https://proceedings.mlr.press/v51/mcdonald16.html
%V 51
%X The spectral k-support norm enjoys good estimation properties in low rank matrix learning problems, empirically outperforming the trace norm. Its unit ball is the convex hull of rank k matrices with unit Frobenius norm. In this paper we generalize the norm to the spectral (k,p)-support norm, whose additional parameter p can be used to tailor the norm to the decay of the spectrum of the underlying model. We characterize the unit ball and we explicitly compute the norm. We further provide a conditional gradient method to solve regularization problems with the norm, and we derive an efficient algorithm to compute the Euclidean projection on the unit ball in the case p=∞. In numerical experiments, we show that allowing p to vary significantly improves performance over the spectral k-support norm on various matrix completion benchmarks, and better captures the spectral decay of the underlying model.
RIS
TY  - CPAPER
TI  - Fitting Spectral Decay with the k-Support Norm
AU  - Andrew McDonald
AU  - Massimiliano Pontil
AU  - Dimitris Stamos
BT  - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA  - 2016/05/02
ED  - Arthur Gretton
ED  - Christian C. Robert
ID  - pmlr-v51-mcdonald16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 51
SP  - 1061
EP  - 1069
L1  - http://proceedings.mlr.press/v51/mcdonald16.pdf
UR  - https://proceedings.mlr.press/v51/mcdonald16.html
AB  - The spectral k-support norm enjoys good estimation properties in low rank matrix learning problems, empirically outperforming the trace norm. Its unit ball is the convex hull of rank k matrices with unit Frobenius norm. In this paper we generalize the norm to the spectral (k,p)-support norm, whose additional parameter p can be used to tailor the norm to the decay of the spectrum of the underlying model. We characterize the unit ball and we explicitly compute the norm. We further provide a conditional gradient method to solve regularization problems with the norm, and we derive an efficient algorithm to compute the Euclidean projection on the unit ball in the case p=∞. In numerical experiments, we show that allowing p to vary significantly improves performance over the spectral k-support norm on various matrix completion benchmarks, and better captures the spectral decay of the underlying model.
ER  -
APA
McDonald, A., Pontil, M. & Stamos, D. (2016). Fitting Spectral Decay with the k-Support Norm. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:1061-1069. Available from https://proceedings.mlr.press/v51/mcdonald16.html.

Related Material