Open Problem: The Oracle Complexity of Smooth Convex Optimization in Nonstandard Settings

Cristóbal Guzmán
Proceedings of The 28th Conference on Learning Theory, PMLR 40:1761-1763, 2015.

Abstract

First-order convex minimization algorithms are currently the methods of choice for large-scale sparse – and more generally parsimonious – regression models. We pose the question of the limits of performance of black-box-oriented methods for convex minimization in non-standard settings, where the regularity of the objective is measured in a norm not necessarily induced by the feasible domain. This question is studied for \ell_p/\ell_q-settings, and their matrix analogues (Schatten norms), where we find surprising gaps between the known lower bounds and state-of-the-art methods. We propose a conjecture on the optimal convergence rates for these settings, for which a positive answer would lead to significant improvements in minimization algorithms for parsimonious regression models.
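For concreteness, here is a minimal formalization of the \ell_p/\ell_q setting referred to above (the notation, including L and \varepsilon, is ours and not quoted from the paper): minimize a convex objective f over the unit \ell_q-ball,

    \min_{x \in B_q} f(x),    B_q = \{ x \in \mathbb{R}^n : \|x\|_q \le 1 \},

where f is L-smooth with respect to the \ell_p-norm, i.e., its gradient satisfies

    \|\nabla f(x) - \nabla f(y)\|_{p^*} \le L \|x - y\|_p    for all x, y \in B_q,

with p^* the conjugate exponent, 1/p + 1/p^* = 1. The standard setting is p = q; the open problem concerns the optimal number of first-order oracle queries needed to reach additive error \varepsilon when p \ne q. For reference, in the Euclidean case p = q = 2, accelerated gradient methods attain the optimal O(\sqrt{L/\varepsilon}) query complexity over the unit ball, which is what makes the gaps in the non-Euclidean settings notable.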

Cite this Paper


BibTeX
@InProceedings{pmlr-v40-Guzman15,
  title     = {Open Problem: The Oracle Complexity of Smooth Convex Optimization in Nonstandard Settings},
  author    = {Guzmán, Cristóbal},
  booktitle = {Proceedings of The 28th Conference on Learning Theory},
  pages     = {1761--1763},
  year      = {2015},
  editor    = {Grünwald, Peter and Hazan, Elad and Kale, Satyen},
  volume    = {40},
  series    = {Proceedings of Machine Learning Research},
  address   = {Paris, France},
  month     = {03--06 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v40/Guzman15.pdf},
  url       = {https://proceedings.mlr.press/v40/Guzman15.html},
  abstract  = {First-order convex minimization algorithms are currently the methods of choice for large-scale sparse -- and more generally parsimonious -- regression models. We pose the question of the limits of performance of black-box-oriented methods for convex minimization in \emph{non-standard} settings, where the regularity of the objective is measured in a norm not necessarily induced by the feasible domain. This question is studied for $\ell_p/\ell_q$-settings, and their matrix analogues (Schatten norms), where we find surprising gaps between the known lower bounds and state-of-the-art methods. We propose a conjecture on the optimal convergence rates for these settings, for which a positive answer would lead to significant improvements in minimization algorithms for parsimonious regression models.}
}
APA
Guzmán, C. (2015). Open Problem: The Oracle Complexity of Smooth Convex Optimization in Nonstandard Settings. Proceedings of The 28th Conference on Learning Theory, in Proceedings of Machine Learning Research 40:1761–1763. Available from https://proceedings.mlr.press/v40/Guzman15.html.

Related Material

PDF: http://proceedings.mlr.press/v40/Guzman15.pdf