
On the Consistency of Feature Selection using Greedy Least Squares Regression

Tong Zhang; 10(19):555−568, 2009.


This paper studies the feature selection problem using a greedy least squares regression algorithm. We show that under a certain irrepresentable condition on the design matrix (but independent of the sparse target), the greedy algorithm can select features consistently when the sample size approaches infinity. The condition is identical to a corresponding condition for Lasso.

Moreover, under a sparse eigenvalue condition, the greedy algorithm can reliably identify features as long as each nonzero coefficient is larger than a constant times the noise level. In comparison, Lasso may require the coefficients to be larger than O(√s) times the noise level in the worst case, where s is the number of nonzero coefficients.
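To make the procedure in the abstract concrete, here is a minimal sketch of greedy least squares feature selection: at each step, pick the feature most correlated with the current residual, refit least squares on the selected set, and repeat. All names and the stopping threshold `eps` are illustrative assumptions, not details taken from the paper.

```python
# Sketch of greedy (forward) least squares feature selection.
# Assumptions (not from the paper): correlation-based selection rule,
# full least-squares refit after each addition, and an `eps` stopping rule.
import numpy as np

def greedy_least_squares(X, y, max_features, eps=1e-6):
    """Greedily add the feature most correlated with the residual,
    refitting least squares on the selected set after each addition."""
    n, p = X.shape
    selected = []
    coef = np.zeros(0)
    residual = y.copy()
    for _ in range(max_features):
        # Score each feature by absolute correlation with the residual;
        # exclude features already selected.
        scores = np.abs(X.T @ residual)
        scores[selected] = -np.inf
        j = int(np.argmax(scores))
        if scores[j] < eps:
            break
        selected.append(j)
        # Refit ordinary least squares on the selected feature set.
        coef, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
        residual = y - X[:, selected] @ coef
    beta = np.zeros(p)
    beta[selected] = coef
    return selected, beta

# Toy usage: a sparse target with two nonzero coefficients.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
beta_true = np.zeros(10)
beta_true[[2, 5]] = [3.0, -2.0]
y = X @ beta_true + 0.01 * rng.standard_normal(200)
sel, beta_hat = greedy_least_squares(X, y, max_features=2)
```

When the nonzero coefficients are well above the noise level, as in this toy example, the greedy pass recovers the true support; the paper's conditions characterize when this happens consistently.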

© JMLR 2009.