On the Consistency of Feature Selection With Lasso for Non-linear Targets

Yue Zhang, Weihong Guo, Soumya Ray
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:183-191, 2016.

Abstract

An important question in feature selection is whether a selection strategy recovers the “true” set of features, given enough data. We study this question in the context of the popular Least Absolute Shrinkage and Selection Operator (Lasso) feature selection strategy. In particular, we consider the scenario when the model is misspecified so that the learned model is linear while the underlying real target is nonlinear. Surprisingly, we prove that under certain conditions, Lasso is still able to recover the correct features in this case. We also carry out numerical studies to empirically verify the theoretical results and explore the necessity of the conditions under which the proof holds.
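The setting studied in the paper can be illustrated with a small numerical sketch (this is an illustration of the phenomenon, not the paper's actual experiments; the data-generating function, regularization level, and coordinate-descent solver below are all choices made here for the example). A linear Lasso model is fit to a target that is genuinely nonlinear in two of six features, and the support of the recovered coefficient vector is inspected:

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: shrinks z toward 0 by t, zeroing small values.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Coordinate-descent Lasso: minimizes (1/2n)||y - Xw||^2 + lam * ||w||_1.
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature X_j^T X_j / n
    r = y - X @ w                      # current residual
    for _ in range(n_iter):
        for j in range(d):
            r += X[:, j] * w[j]        # remove feature j's contribution
            rho = X[:, j] @ r / n      # correlation with partial residual
            w[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * w[j]        # restore feature j's contribution
    return w

rng = np.random.default_rng(0)
n, d = 2000, 6
X = rng.normal(size=(n, d))
# Nonlinear target that depends only on features 0 and 1 (misspecified for a
# linear model), plus a small amount of noise.
y = np.tanh(2 * X[:, 0]) + X[:, 1] ** 3 + 0.1 * rng.normal(size=n)

w = lasso_cd(X, y, lam=0.3)
selected = np.flatnonzero(np.abs(w) > 1e-6)
print("selected features:", selected)  # the two relevant features
```

Even though the fitted model is linear and the target is not, the Lasso support here lands on exactly the two relevant features, consistent with the consistency result the abstract describes (under the paper's conditions; the example does not verify those conditions).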

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-zhanga16,
  title     = {On the Consistency of Feature Selection With Lasso for Non-linear Targets},
  author    = {Zhang, Yue and Guo, Weihong and Ray, Soumya},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {183--191},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/zhanga16.pdf},
  url       = {https://proceedings.mlr.press/v48/zhanga16.html},
  abstract  = {An important question in feature selection is whether a selection strategy recovers the “true” set of features, given enough data. We study this question in the context of the popular Least Absolute Shrinkage and Selection Operator (Lasso) feature selection strategy. In particular, we consider the scenario when the model is misspecified so that the learned model is linear while the underlying real target is nonlinear. Surprisingly, we prove that under certain conditions, Lasso is still able to recover the correct features in this case. We also carry out numerical studies to empirically verify the theoretical results and explore the necessity of the conditions under which the proof holds.}
}
Endnote
%0 Conference Paper
%T On the Consistency of Feature Selection With Lasso for Non-linear Targets
%A Yue Zhang
%A Weihong Guo
%A Soumya Ray
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-zhanga16
%I PMLR
%P 183--191
%U https://proceedings.mlr.press/v48/zhanga16.html
%V 48
%X An important question in feature selection is whether a selection strategy recovers the “true” set of features, given enough data. We study this question in the context of the popular Least Absolute Shrinkage and Selection Operator (Lasso) feature selection strategy. In particular, we consider the scenario when the model is misspecified so that the learned model is linear while the underlying real target is nonlinear. Surprisingly, we prove that under certain conditions, Lasso is still able to recover the correct features in this case. We also carry out numerical studies to empirically verify the theoretical results and explore the necessity of the conditions under which the proof holds.
RIS
TY - CPAPER
TI - On the Consistency of Feature Selection With Lasso for Non-linear Targets
AU - Yue Zhang
AU - Weihong Guo
AU - Soumya Ray
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-zhanga16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 183
EP - 191
L1 - http://proceedings.mlr.press/v48/zhanga16.pdf
UR - https://proceedings.mlr.press/v48/zhanga16.html
AB - An important question in feature selection is whether a selection strategy recovers the “true” set of features, given enough data. We study this question in the context of the popular Least Absolute Shrinkage and Selection Operator (Lasso) feature selection strategy. In particular, we consider the scenario when the model is misspecified so that the learned model is linear while the underlying real target is nonlinear. Surprisingly, we prove that under certain conditions, Lasso is still able to recover the correct features in this case. We also carry out numerical studies to empirically verify the theoretical results and explore the necessity of the conditions under which the proof holds.
ER -
APA
Zhang, Y., Guo, W., & Ray, S. (2016). On the Consistency of Feature Selection With Lasso for Non-linear Targets. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:183-191. Available from https://proceedings.mlr.press/v48/zhanga16.html.