Open Problem: Better Bounds for Online Logistic Regression

H. Brendan McMahan, Matthew Streeter
Proceedings of the 25th Annual Conference on Learning Theory, PMLR 23:44.1-44.3, 2012.

Abstract

Known algorithms applied to online logistic regression on a feasible set of L_2 diameter D achieve regret bounds like O(e^D log T) in one dimension, but we show a bound of O(√D + log T) is possible in a binary one-dimensional problem. Thus, we pose the following question: is it possible to achieve a regret bound for online logistic regression that is O(poly(D) log(T))? Even if this is not possible in general, it would be interesting to have a bound that reduces to our bound in the one-dimensional case.
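For context, the regret in the bounds above is the standard online-learning notion. A sketch of the setup, assuming the usual logistic-loss formulation (the specific loss is not spelled out in this abstract):

```latex
% Online logistic regression over T rounds: each round the learner picks
% w_t from a feasible set W of L_2 diameter D, then observes an example
% (x_t, y_t) with y_t in {-1, +1} and suffers the logistic loss
%   \ell_t(w) = \log\bigl(1 + e^{-y_t \langle w, x_t \rangle}\bigr).
% Regret compares the learner's cumulative loss to the best fixed
% comparator in W chosen in hindsight:
\mathrm{Regret}_T \;=\; \sum_{t=1}^{T} \ell_t(w_t)
  \;-\; \min_{w^{*} \in W} \sum_{t=1}^{T} \ell_t(w^{*})
```

The open problem asks whether the exponential dependence on D in bounds of the form O(e^D log T) can be reduced to polynomial in D.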

Cite this Paper


BibTeX
@InProceedings{pmlr-v23-mcmahan12,
  title     = {Open Problem: Better Bounds for Online Logistic Regression},
  author    = {McMahan, H. Brendan and Streeter, Matthew},
  booktitle = {Proceedings of the 25th Annual Conference on Learning Theory},
  pages     = {44.1--44.3},
  year      = {2012},
  editor    = {Mannor, Shie and Srebro, Nathan and Williamson, Robert C.},
  volume    = {23},
  series    = {Proceedings of Machine Learning Research},
  address   = {Edinburgh, Scotland},
  month     = {25--27 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v23/mcmahan12/mcmahan12.pdf},
  url       = {https://proceedings.mlr.press/v23/mcmahan12.html},
  abstract  = {Known algorithms applied to online logistic regression on a feasible set of $L_2$ diameter $D$ achieve regret bounds like $O(e^D \log T)$ in one dimension, but we show a bound of $O(\sqrt{D} + \log T)$ is possible in a binary 1-dimensional problem. Thus, we pose the following question: Is it possible to achieve a regret bound for online logistic regression that is $O(\mathrm{poly}(D) \log T)$? Even if this is not possible in general, it would be interesting to have a bound that reduces to our bound in the one-dimensional case.}
}
Endnote
%0 Conference Paper %T Open Problem: Better Bounds for Online Logistic Regression %A H. Brendan McMahan %A Matthew Streeter %B Proceedings of the 25th Annual Conference on Learning Theory %C Proceedings of Machine Learning Research %D 2012 %E Shie Mannor %E Nathan Srebro %E Robert C. Williamson %F pmlr-v23-mcmahan12 %I PMLR %P 44.1--44.3 %U https://proceedings.mlr.press/v23/mcmahan12.html %V 23 %X Known algorithms applied to online logistic regression on a feasible set of L_2 diameter D achieve regret bounds like O(e^D log T) in one dimension, but we show a bound of O(√D + log T) is possible in a binary 1-dimensional problem. Thus, we pose the following question: Is it possible to achieve a regret bound for online logistic regression that is O(poly(D) log(T))? Even if this is not possible in general, it would be interesting to have a bound that reduces to our bound in the one-dimensional case.
RIS
TY - CPAPER TI - Open Problem: Better Bounds for Online Logistic Regression AU - H. Brendan McMahan AU - Matthew Streeter BT - Proceedings of the 25th Annual Conference on Learning Theory DA - 2012/06/16 ED - Shie Mannor ED - Nathan Srebro ED - Robert C. Williamson ID - pmlr-v23-mcmahan12 PB - PMLR DP - Proceedings of Machine Learning Research VL - 23 SP - 44.1 EP - 44.3 L1 - http://proceedings.mlr.press/v23/mcmahan12/mcmahan12.pdf UR - https://proceedings.mlr.press/v23/mcmahan12.html AB - Known algorithms applied to online logistic regression on a feasible set of L_2 diameter D achieve regret bounds like O(e^D log T) in one dimension, but we show a bound of O(√D + log T) is possible in a binary 1-dimensional problem. Thus, we pose the following question: Is it possible to achieve a regret bound for online logistic regression that is O(poly(D) log(T))? Even if this is not possible in general, it would be interesting to have a bound that reduces to our bound in the one-dimensional case. ER -
APA
McMahan, H.B. & Streeter, M. (2012). Open Problem: Better Bounds for Online Logistic Regression. Proceedings of the 25th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 23:44.1-44.3. Available from https://proceedings.mlr.press/v23/mcmahan12.html.