Learning Heteroscedastic Models by Convex Programming under Group Sparsity

Arnak Dalalyan, Mohamed Hebiri, Katia Meziani, Joseph Salmon
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):379-387, 2013.

Abstract

Sparse estimation methods based on ℓ1 relaxation, such as the Lasso and the Dantzig selector, require knowledge of the noise variance in order to properly tune the regularization parameter. This constitutes a major obstacle to applying these methods in several frameworks, such as time series, random fields, and inverse problems, in which the noise is rarely homoscedastic and the noise level is hard to know in advance. In this paper, we propose a new approach to the joint estimation of the conditional mean and the conditional variance in a high-dimensional (auto-)regression setting. An attractive feature of the proposed estimator is that it is efficiently computable, even for very large-scale problems, by solving a second-order cone program (SOCP). We present theoretical analysis and numerical results assessing the performance of the proposed procedure.
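To give a concrete sense of the group-sparsity mechanism the abstract refers to, the sketch below implements the standard block (group) soft-thresholding operator, the proximal map of a group-Lasso penalty. This is a generic illustration of group-sparse shrinkage, not the paper's estimator (which additionally handles the conditional variance and is solved as an SOCP); the function name and the penalty level `lam` are illustrative choices.

```python
import numpy as np

def group_soft_threshold(v, lam):
    """Block soft-thresholding: the proximal operator of lam * ||v||_2.

    Shrinks the coefficient group v toward zero as a block; the entire
    group is set exactly to zero when its Euclidean norm is at most lam,
    which is what produces group-sparse solutions.
    """
    norm = np.linalg.norm(v)
    if norm <= lam:
        return np.zeros_like(v)
    return (1.0 - lam / norm) * v

# Example: a group with norm 5 survives a small penalty but is
# shrunk uniformly; a large penalty kills the whole group at once.
g = np.array([3.0, 4.0])
shrunk = group_soft_threshold(g, 1.0)   # scaled by (1 - 1/5)
killed = group_soft_threshold(g, 5.0)   # zeroed as a block
```

Iterating this operator group by group inside a proximal-gradient loop is one common way such penalties are handled in practice; the SOCP formulation in the paper instead hands the whole convex problem to a conic solver.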

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-dalalyan13,
  title     = {Learning Heteroscedastic Models by Convex Programming under Group Sparsity},
  author    = {Dalalyan, Arnak and Hebiri, Mohamed and Meziani, Katia and Salmon, Joseph},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {379--387},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {3},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/dalalyan13.pdf},
  url       = {https://proceedings.mlr.press/v28/dalalyan13.html},
  abstract  = {Sparse estimation methods based on l1 relaxation, such as the Lasso and the Dantzig selector, require the knowledge of the variance of the noise in order to properly tune the regularization parameter. This constitutes a major obstacle in applying these methods in several frameworks, such as time series, random fields, inverse problems, for which noise is rarely homoscedastic and the noise level is hard to know in advance. In this paper, we propose a new approach to the joint estimation of the conditional mean and the conditional variance in a high-dimensional (auto-) regression setting. An attractive feature of the proposed estimator is that it is efficiently computable even for very large scale problems by solving a second-order cone program (SOCP). We present theoretical analysis and numerical results assessing the performance of the proposed procedure.}
}
Endnote
%0 Conference Paper
%T Learning Heteroscedastic Models by Convex Programming under Group Sparsity
%A Arnak Dalalyan
%A Mohamed Hebiri
%A Katia Meziani
%A Joseph Salmon
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-dalalyan13
%I PMLR
%P 379--387
%U https://proceedings.mlr.press/v28/dalalyan13.html
%V 28
%N 3
%X Sparse estimation methods based on l1 relaxation, such as the Lasso and the Dantzig selector, require the knowledge of the variance of the noise in order to properly tune the regularization parameter. This constitutes a major obstacle in applying these methods in several frameworks, such as time series, random fields, inverse problems, for which noise is rarely homoscedastic and the noise level is hard to know in advance. In this paper, we propose a new approach to the joint estimation of the conditional mean and the conditional variance in a high-dimensional (auto-) regression setting. An attractive feature of the proposed estimator is that it is efficiently computable even for very large scale problems by solving a second-order cone program (SOCP). We present theoretical analysis and numerical results assessing the performance of the proposed procedure.
RIS
TY  - CPAPER
TI  - Learning Heteroscedastic Models by Convex Programming under Group Sparsity
AU  - Arnak Dalalyan
AU  - Mohamed Hebiri
AU  - Katia Meziani
AU  - Joseph Salmon
BT  - Proceedings of the 30th International Conference on Machine Learning
DA  - 2013/05/26
ED  - Sanjoy Dasgupta
ED  - David McAllester
ID  - pmlr-v28-dalalyan13
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 28
IS  - 3
SP  - 379
EP  - 387
L1  - http://proceedings.mlr.press/v28/dalalyan13.pdf
UR  - https://proceedings.mlr.press/v28/dalalyan13.html
AB  - Sparse estimation methods based on l1 relaxation, such as the Lasso and the Dantzig selector, require the knowledge of the variance of the noise in order to properly tune the regularization parameter. This constitutes a major obstacle in applying these methods in several frameworks, such as time series, random fields, inverse problems, for which noise is rarely homoscedastic and the noise level is hard to know in advance. In this paper, we propose a new approach to the joint estimation of the conditional mean and the conditional variance in a high-dimensional (auto-) regression setting. An attractive feature of the proposed estimator is that it is efficiently computable even for very large scale problems by solving a second-order cone program (SOCP). We present theoretical analysis and numerical results assessing the performance of the proposed procedure.
ER  -
APA
Dalalyan, A., Hebiri, M., Meziani, K. & Salmon, J. (2013). Learning Heteroscedastic Models by Convex Programming under Group Sparsity. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):379-387. Available from https://proceedings.mlr.press/v28/dalalyan13.html.