Aggregation of supports along the Lasso path

Pierre C. Bellec
29th Annual Conference on Learning Theory, PMLR 49:488-529, 2016.

Abstract

In linear regression with fixed design, we propose two procedures that aggregate a data-driven collection of supports. The collection is a subset of the 2^p possible supports and both its cardinality and its elements can depend on the data. The procedures satisfy oracle inequalities with no assumption on the design matrix. Then we use these procedures to aggregate the supports that appear on the regularization path of the Lasso in order to construct an estimator that mimics the best Lasso estimator. If the restricted eigenvalue condition on the design matrix is satisfied, then this estimator achieves optimal prediction bounds. Finally, we discuss the computational cost of these procedures.
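To make the approach in the abstract concrete, here is a minimal sketch, not the paper's implementation: collect the distinct supports appearing on the Lasso regularization path, refit least squares on each, and combine the refits with exponential weights. The temperature beta, the noise level sigma, and the 2 sigma^2 |S| log(p) complexity penalty are illustrative assumptions rather than the paper's exact procedures and prior; only scikit-learn's lasso_path and NumPy are used.

import numpy as np
from sklearn.linear_model import lasso_path

def aggregate_lasso_supports(X, y, sigma=1.0, beta=None):
    n, p = X.shape
    if beta is None:
        beta = 4.0 * sigma**2  # illustrative temperature, an assumption

    # Distinct supports along the Lasso regularization path.
    _, coefs, _ = lasso_path(X, y)  # coefs has shape (p, n_alphas)
    supports = {tuple(np.flatnonzero(coefs[:, k])) for k in range(coefs.shape[1])}

    # Least-squares refit on each support (the empty support predicts 0).
    fits, sizes = [], []
    for s in supports:
        if not s:
            fits.append(np.zeros(n)); sizes.append(0)
            continue
        S = np.array(s)
        coef_S, *_ = np.linalg.lstsq(X[:, S], y, rcond=None)
        fits.append(X[:, S] @ coef_S); sizes.append(len(S))

    # Exponential weights on a penalized residual-sum-of-squares
    # criterion; a smaller criterion yields a larger weight.
    crit = np.array([np.sum((y - f)**2) + 2.0 * sigma**2 * k * np.log(p)
                     for f, k in zip(fits, sizes)])
    w = np.exp(-(crit - crit.min()) / beta)  # shift by min for stability
    w /= w.sum()

    # Aggregated predictor: convex combination of the refitted fits.
    return sum(wk * fk for wk, fk in zip(w, fits))

A convex combination of this kind, rather than selecting a single support, is what allows oracle inequalities without assumptions on the design matrix; see the paper for the exact aggregation procedures and constants.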

Cite this Paper

BibTeX
@InProceedings{pmlr-v49-bellec16,
  title     = {Aggregation of supports along the Lasso path},
  author    = {Bellec, Pierre C.},
  booktitle = {29th Annual Conference on Learning Theory},
  pages     = {488--529},
  year      = {2016},
  editor    = {Feldman, Vitaly and Rakhlin, Alexander and Shamir, Ohad},
  volume    = {49},
  series    = {Proceedings of Machine Learning Research},
  address   = {Columbia University, New York, New York, USA},
  month     = {23--26 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v49/bellec16.pdf},
  url       = {https://proceedings.mlr.press/v49/bellec16.html},
  abstract  = {In linear regression with fixed design, we propose two procedures that aggregate a data-driven collection of supports. The collection is a subset of the 2^p possible supports and both its cardinality and its elements can depend on the data. The procedures satisfy oracle inequalities with no assumption on the design matrix. Then we use these procedures to aggregate the supports that appear on the regularization path of the Lasso in order to construct an estimator that mimics the best Lasso estimator. If the restricted eigenvalue condition on the design matrix is satisfied, then this estimator achieves optimal prediction bounds. Finally, we discuss the computational cost of these procedures.}
}
Endnote
%0 Conference Paper
%T Aggregation of supports along the Lasso path
%A Pierre C. Bellec
%B 29th Annual Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2016
%E Vitaly Feldman
%E Alexander Rakhlin
%E Ohad Shamir
%F pmlr-v49-bellec16
%I PMLR
%P 488--529
%U https://proceedings.mlr.press/v49/bellec16.html
%V 49
%X In linear regression with fixed design, we propose two procedures that aggregate a data-driven collection of supports. The collection is a subset of the 2^p possible supports and both its cardinality and its elements can depend on the data. The procedures satisfy oracle inequalities with no assumption on the design matrix. Then we use these procedures to aggregate the supports that appear on the regularization path of the Lasso in order to construct an estimator that mimics the best Lasso estimator. If the restricted eigenvalue condition on the design matrix is satisfied, then this estimator achieves optimal prediction bounds. Finally, we discuss the computational cost of these procedures.
RIS
TY - CPAPER
TI - Aggregation of supports along the Lasso path
AU - Pierre C. Bellec
BT - 29th Annual Conference on Learning Theory
DA - 2016/06/06
ED - Vitaly Feldman
ED - Alexander Rakhlin
ED - Ohad Shamir
ID - pmlr-v49-bellec16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 49
SP - 488
EP - 529
L1 - http://proceedings.mlr.press/v49/bellec16.pdf
UR - https://proceedings.mlr.press/v49/bellec16.html
AB - In linear regression with fixed design, we propose two procedures that aggregate a data-driven collection of supports. The collection is a subset of the 2^p possible supports and both its cardinality and its elements can depend on the data. The procedures satisfy oracle inequalities with no assumption on the design matrix. Then we use these procedures to aggregate the supports that appear on the regularization path of the Lasso in order to construct an estimator that mimics the best Lasso estimator. If the restricted eigenvalue condition on the design matrix is satisfied, then this estimator achieves optimal prediction bounds. Finally, we discuss the computational cost of these procedures.
ER -
APA
Bellec, P. C. (2016). Aggregation of supports along the Lasso path. 29th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 49:488-529. Available from https://proceedings.mlr.press/v49/bellec16.html.
