A Convex Formulation for Mixed Regression with Two Components: Minimax Optimal Rates

Yudong Chen, Xinyang Yi, Constantine Caramanis
Proceedings of The 27th Conference on Learning Theory, PMLR 35:560-604, 2014.

Abstract

We consider the mixed regression problem with two components, under adversarial and stochastic noise. We give a convex optimization formulation that provably recovers the true solution, and provide upper bounds on the recovery errors for both arbitrary noise and stochastic noise settings. We also give matching minimax lower bounds (up to log factors), showing that under certain assumptions, our algorithm is information-theoretically optimal. Our results represent the first (and currently only known) tractable algorithm guaranteeing successful recovery with tight bounds on recovery errors and sample complexity.
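For concreteness, in two-component mixed linear regression each response is generated by one of two unknown regression vectors, and the component labels are never observed. The snippet below is a minimal illustrative simulation of that data model; the variable names and the label-aware least-squares check are our own exposition, not the paper's convex formulation.

import numpy as np

# Illustrative simulation of two-component mixed linear regression
# (hypothetical setup for exposition; not the paper's convex program).
rng = np.random.default_rng(0)

n, p, sigma = 2000, 10, 0.1           # samples, dimension, noise level
beta1 = rng.normal(size=p)            # first unknown regressor
beta2 = rng.normal(size=p)            # second unknown regressor

X = rng.normal(size=(n, p))           # covariates
z = rng.integers(0, 2, size=n)        # hidden component labels (unobserved)
betas = np.where(z[:, None] == 0, beta1, beta2)
y = np.einsum('ij,ij->i', X, betas) + sigma * rng.normal(size=n)

# Oracle check: if the labels were known, per-component least squares
# would recover each beta up to noise; the mixed-regression problem is
# to achieve comparable error without ever seeing z.
b1_hat, *_ = np.linalg.lstsq(X[z == 0], y[z == 0], rcond=None)
b2_hat, *_ = np.linalg.lstsq(X[z == 1], y[z == 1], rcond=None)
print(np.linalg.norm(b1_hat - beta1), np.linalg.norm(b2_hat - beta2))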

Cite this Paper


BibTeX
@InProceedings{pmlr-v35-chen14,
  title     = {A Convex Formulation for Mixed Regression with Two Components: Minimax Optimal Rates},
  author    = {Chen, Yudong and Yi, Xinyang and Caramanis, Constantine},
  booktitle = {Proceedings of The 27th Conference on Learning Theory},
  pages     = {560--604},
  year      = {2014},
  editor    = {Balcan, Maria Florina and Feldman, Vitaly and Szepesvári, Csaba},
  volume    = {35},
  series    = {Proceedings of Machine Learning Research},
  address   = {Barcelona, Spain},
  month     = {13--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v35/chen14.pdf},
  url       = {https://proceedings.mlr.press/v35/chen14.html},
  abstract  = {We consider the mixed regression problem with two components, under adversarial and stochastic noise. We give a convex optimization formulation that provably recovers the true solution, and provide upper bounds on the recovery errors for both arbitrary noise and stochastic noise settings. We also give matching minimax lower bounds (up to log factors), showing that under certain assumptions, our algorithm is information-theoretically optimal. Our results represent the first (and currently only known) tractable algorithm guaranteeing successful recovery with tight bounds on recovery errors and sample complexity.}
}
Endnote
%0 Conference Paper
%T A Convex Formulation for Mixed Regression with Two Components: Minimax Optimal Rates
%A Yudong Chen
%A Xinyang Yi
%A Constantine Caramanis
%B Proceedings of The 27th Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2014
%E Maria Florina Balcan
%E Vitaly Feldman
%E Csaba Szepesvári
%F pmlr-v35-chen14
%I PMLR
%P 560--604
%U https://proceedings.mlr.press/v35/chen14.html
%V 35
%X We consider the mixed regression problem with two components, under adversarial and stochastic noise. We give a convex optimization formulation that provably recovers the true solution, and provide upper bounds on the recovery errors for both arbitrary noise and stochastic noise settings. We also give matching minimax lower bounds (up to log factors), showing that under certain assumptions, our algorithm is information-theoretically optimal. Our results represent the first (and currently only known) tractable algorithm guaranteeing successful recovery with tight bounds on recovery errors and sample complexity.
RIS
TY - CPAPER
TI - A Convex Formulation for Mixed Regression with Two Components: Minimax Optimal Rates
AU - Yudong Chen
AU - Xinyang Yi
AU - Constantine Caramanis
BT - Proceedings of The 27th Conference on Learning Theory
DA - 2014/05/29
ED - Maria Florina Balcan
ED - Vitaly Feldman
ED - Csaba Szepesvári
ID - pmlr-v35-chen14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 35
SP - 560
EP - 604
L1 - http://proceedings.mlr.press/v35/chen14.pdf
UR - https://proceedings.mlr.press/v35/chen14.html
AB - We consider the mixed regression problem with two components, under adversarial and stochastic noise. We give a convex optimization formulation that provably recovers the true solution, and provide upper bounds on the recovery errors for both arbitrary noise and stochastic noise settings. We also give matching minimax lower bounds (up to log factors), showing that under certain assumptions, our algorithm is information-theoretically optimal. Our results represent the first (and currently only known) tractable algorithm guaranteeing successful recovery with tight bounds on recovery errors and sample complexity.
ER -
APA
Chen, Y., Yi, X. & Caramanis, C. (2014). A Convex Formulation for Mixed Regression with Two Components: Minimax Optimal Rates. Proceedings of The 27th Conference on Learning Theory, in Proceedings of Machine Learning Research 35:560-604. Available from https://proceedings.mlr.press/v35/chen14.html.