Sparsity Regret Bounds for Individual Sequences in Online Linear Regression

Sébastien Gerchinovitz
Proceedings of the 24th Annual Conference on Learning Theory, PMLR 19:377-396, 2011.

Abstract

We consider the problem of online linear regression on arbitrary deterministic sequences when the ambient dimension $d$ can be much larger than the number of time rounds $T$. We introduce the notion of a sparsity regret bound, a deterministic online counterpart of recent risk bounds derived in the stochastic setting under a sparsity scenario. We prove such regret bounds for an online-learning algorithm called SeqSEW, which is based on exponential weighting and data-driven truncation. In a second part, we apply a parameter-free version of this algorithm to i.i.d. data and derive risk bounds of the same flavor as in Dalalyan and Tsybakov (2008, 2011), but which solve two questions left open therein. In particular, our risk bounds are adaptive (up to a logarithmic factor) to the unknown variance of the noise when the latter is Gaussian.
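For orientation, a sparsity regret bound is, schematically, a guarantee of the following shape, valid for every individual sequence $(x_t, y_t)_{1 \le t \le T}$ (the constants and exact logarithmic factors in the paper differ; this display only illustrates the form, with $\|u\|_0$ denoting the number of nonzero coordinates of $u$):

$$\sum_{t=1}^{T} \bigl(y_t - \widehat{y}_t\bigr)^2 \;\leq\; \inf_{u \in \mathbb{R}^d} \left\{ \sum_{t=1}^{T} \bigl(y_t - u^{\top} x_t\bigr)^2 \;+\; O\!\bigl((\|u\|_0 + 1)\,\ln(dT)\bigr) \right\},$$

so the price of competing with a sparse comparison vector $u$ grows only logarithmically in the ambient dimension $d$.

The following sketch illustrates the two ingredients named in the abstract, exponential weighting and data-driven truncation, over a finite set of comparison vectors. It is not the paper's SeqSEW algorithm (which aggregates over a continuous prior and tunes its parameters in a data-driven way); the discretized expert set U, the fixed learning rate eta, and all function and variable names are assumptions made for this illustration.

import numpy as np

def ew_truncated_sketch(X, y, U, eta=0.1):
    """Exponentially weighted forecaster with data-driven truncation.

    A minimal sketch in the spirit of (but not identical to) SeqSEW.
    X   : (T, d) array of inputs, revealed one round at a time.
    y   : (T,) array of observations.
    U   : (K, d) finite set of comparison vectors (e.g., a sparse grid);
          this discretization is an assumption made for brevity.
    eta : learning rate (the paper tunes it from the data instead).
    """
    T, K = len(y), len(U)
    losses = np.zeros(K)      # cumulative square loss of each expert
    preds = np.zeros(T)
    B = 0.0                   # truncation level, grown from past observations
    for t in range(T):
        # Exponential weights, shifted by the minimum for numerical stability.
        w = np.exp(-eta * (losses - losses.min()))
        w /= w.sum()
        # Expert forecasts, truncated to the data-driven range [-B, B].
        expert_preds = np.clip(U @ X[t], -B, B)
        preds[t] = w @ expert_preds           # weighted-average prediction
        losses += (y[t] - expert_preds) ** 2  # update cumulative losses
        B = max(B, abs(y[t]))                 # enlarge the truncation range
    return preds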

Cite this Paper


BibTeX
@InProceedings{pmlr-v19-gerchinovitz11a,
  title     = {Sparsity Regret Bounds for Individual Sequences in Online Linear Regression},
  author    = {Gerchinovitz, Sébastien},
  booktitle = {Proceedings of the 24th Annual Conference on Learning Theory},
  pages     = {377--396},
  year      = {2011},
  editor    = {Kakade, Sham M. and von Luxburg, Ulrike},
  volume    = {19},
  series    = {Proceedings of Machine Learning Research},
  address   = {Budapest, Hungary},
  month     = {09--11 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v19/gerchinovitz11a/gerchinovitz11a.pdf},
  url       = {https://proceedings.mlr.press/v19/gerchinovitz11a.html}
}
APA
Gerchinovitz, S. (2011). Sparsity Regret Bounds for Individual Sequences in Online Linear Regression. Proceedings of the 24th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 19:377-396. Available from https://proceedings.mlr.press/v19/gerchinovitz11a.html.
