Sparse Nonlinear Regression: Parameter Estimation under Nonconvexity

Zhuoran Yang, Zhaoran Wang, Han Liu, Yonina Eldar, Tong Zhang
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:2472-2481, 2016.

Abstract

We study parameter estimation for sparse nonlinear regression. More specifically, we assume the data are given by y = f(x^T β^*) + ε, where f is nonlinear. To recover β^*, we propose an ℓ_1-regularized least-squares estimator. Unlike classical linear regression, the corresponding optimization problem is nonconvex because of the nonlinearity of f. In spite of the nonconvexity, we prove that under mild conditions, every stationary point of the objective enjoys an optimal statistical rate of convergence. Detailed numerical results are provided to back up our theory.
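The estimator described in the abstract can be made concrete with a small sketch: minimize the nonconvex objective (1/2n)·||y − f(Xβ)||² + λ||β||_1 by proximal gradient descent (soft-thresholding after each gradient step). The link function f(u) = u + sin(u), the problem sizes, the step size, and the penalty level below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_nonlinear_lasso(X, y, f, f_prime, lam, step=0.02, iters=3000):
    """Proximal gradient descent on the nonconvex objective
    (1/2n) * ||y - f(X @ beta)||^2 + lam * ||beta||_1."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(iters):
        u = X @ beta
        r = f(u) - y                        # residuals
        grad = X.T @ (r * f_prime(u)) / n   # chain-rule gradient of smooth part
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

# Synthetic example (assumed setup): monotone nonlinear link f(u) = u + sin(u)
rng = np.random.default_rng(0)
n, d, s = 500, 20, 3
beta_star = np.zeros(d)
beta_star[:s] = [2.0, -1.5, 1.0]            # sparse true parameter
X = rng.standard_normal((n, d))
u = X @ beta_star
y = u + np.sin(u) + 0.1 * rng.standard_normal(n)

f = lambda u: u + np.sin(u)
f_prime = lambda u: 1.0 + np.cos(u)
beta_hat = sparse_nonlinear_lasso(X, y, f, f_prime, lam=0.05)
```

Because the paper shows every stationary point of this objective attains the optimal statistical rate, it suffices for the solver to reach any stationary point; no global-optimality certificate is needed.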

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-yangc16,
  title = {Sparse Nonlinear Regression: Parameter Estimation under Nonconvexity},
  author = {Yang, Zhuoran and Wang, Zhaoran and Liu, Han and Eldar, Yonina and Zhang, Tong},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages = {2472--2481},
  year = {2016},
  editor = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume = {48},
  series = {Proceedings of Machine Learning Research},
  address = {New York, New York, USA},
  month = {20--22 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v48/yangc16.pdf},
  url = {https://proceedings.mlr.press/v48/yangc16.html},
  abstract = {We study parameter estimation for sparse nonlinear regression. More specifically, we assume the data are given by y = f(x^T β^*) + ε, where f is nonlinear. To recover β^*, we propose an ℓ_1-regularized least-squares estimator. Unlike classical linear regression, the corresponding optimization problem is nonconvex because of the nonlinearity of f. In spite of the nonconvexity, we prove that under mild conditions, every stationary point of the objective enjoys an optimal statistical rate of convergence. Detailed numerical results are provided to back up our theory.}
}
Endnote
%0 Conference Paper
%T Sparse Nonlinear Regression: Parameter Estimation under Nonconvexity
%A Zhuoran Yang
%A Zhaoran Wang
%A Han Liu
%A Yonina Eldar
%A Tong Zhang
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-yangc16
%I PMLR
%P 2472--2481
%U https://proceedings.mlr.press/v48/yangc16.html
%V 48
%X We study parameter estimation for sparse nonlinear regression. More specifically, we assume the data are given by y = f(x^T β^*) + ε, where f is nonlinear. To recover β^*, we propose an ℓ_1-regularized least-squares estimator. Unlike classical linear regression, the corresponding optimization problem is nonconvex because of the nonlinearity of f. In spite of the nonconvexity, we prove that under mild conditions, every stationary point of the objective enjoys an optimal statistical rate of convergence. Detailed numerical results are provided to back up our theory.
RIS
TY - CPAPER
TI - Sparse Nonlinear Regression: Parameter Estimation under Nonconvexity
AU - Zhuoran Yang
AU - Zhaoran Wang
AU - Han Liu
AU - Yonina Eldar
AU - Tong Zhang
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-yangc16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 2472
EP - 2481
L1 - http://proceedings.mlr.press/v48/yangc16.pdf
UR - https://proceedings.mlr.press/v48/yangc16.html
AB - We study parameter estimation for sparse nonlinear regression. More specifically, we assume the data are given by y = f(x^T β^*) + ε, where f is nonlinear. To recover β^*, we propose an ℓ_1-regularized least-squares estimator. Unlike classical linear regression, the corresponding optimization problem is nonconvex because of the nonlinearity of f. In spite of the nonconvexity, we prove that under mild conditions, every stationary point of the objective enjoys an optimal statistical rate of convergence. Detailed numerical results are provided to back up our theory.
ER -
APA
Yang, Z., Wang, Z., Liu, H., Eldar, Y. & Zhang, T. (2016). Sparse Nonlinear Regression: Parameter Estimation under Nonconvexity. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:2472-2481. Available from https://proceedings.mlr.press/v48/yangc16.html.