Localized Complexities for Transductive Learning

Ilya Tolstikhin, Gilles Blanchard, Marius Kloft
Proceedings of The 27th Conference on Learning Theory, PMLR 35:857-884, 2014.

Abstract

We show two novel concentration inequalities for suprema of empirical processes when sampling without replacement, which both take the variance of the functions into account. While these inequalities may have broad applications in learning theory in general, we exemplify their significance by studying the transductive setting of learning theory, for which we provide the first excess risk bounds based on the localized complexity of the hypothesis class; these can yield fast rates of convergence in the transductive learning setting as well. We give a preliminary analysis of the localized complexities for the prominent case of kernel classes.
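For orientation, the shape of a variance-sensitive bound of this kind in the classical i.i.d. setting is given by Talagrand's inequality in Bousquet's form, sketched below. This is only the standard i.i.d. template, not the paper's theorem; the paper's contribution is to establish analogues when the sample is drawn without replacement from a finite population.

% Bousquet's form of Talagrand's inequality (i.i.d. sketch; not the
% without-replacement result proved in the paper).
% Let $X_1,\dots,X_n$ be i.i.d., let $\mathcal{F}$ be a countable class with
% $\mathbb{E} f(X_1) = 0$ and $f \le 1$ a.s., and set
% $Z = \sup_{f \in \mathcal{F}} \sum_{i=1}^{n} f(X_i)$,
% $\sigma^2 \ge \sup_{f \in \mathcal{F}} \operatorname{Var} f(X_1)$, and
% $v = n\sigma^2 + 2\,\mathbb{E}[Z]$. Then for every $t > 0$:
\[
  \Pr\!\left( Z \ge \mathbb{E}[Z] + \sqrt{2vt} + \frac{t}{3} \right) \le e^{-t}.
\]
% The deviation scale is governed by the variance proxy $v$, not merely by
% the worst-case range of the functions, which is what "takes the variance
% of the functions into account" refers to above.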

Cite this Paper


BibTeX
@InProceedings{pmlr-v35-tolstikhin14,
  title     = {Localized Complexities for Transductive Learning},
  author    = {Tolstikhin, Ilya and Blanchard, Gilles and Kloft, Marius},
  booktitle = {Proceedings of The 27th Conference on Learning Theory},
  pages     = {857--884},
  year      = {2014},
  editor    = {Balcan, Maria Florina and Feldman, Vitaly and Szepesvári, Csaba},
  volume    = {35},
  series    = {Proceedings of Machine Learning Research},
  address   = {Barcelona, Spain},
  month     = {13--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v35/tolstikhin14.pdf},
  url       = {https://proceedings.mlr.press/v35/tolstikhin14.html},
  abstract  = {We show two novel concentration inequalities for suprema of empirical processes when sampling without replacement, which both take the variance of the functions into account. While these inequalities may have broad applications in learning theory in general, we exemplify their significance by studying the transductive setting of learning theory, for which we provide the first excess risk bounds based on the localized complexity of the hypothesis class; these can yield fast rates of convergence in the transductive learning setting as well. We give a preliminary analysis of the localized complexities for the prominent case of kernel classes.}
}
Endnote
%0 Conference Paper
%T Localized Complexities for Transductive Learning
%A Ilya Tolstikhin
%A Gilles Blanchard
%A Marius Kloft
%B Proceedings of The 27th Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2014
%E Maria Florina Balcan
%E Vitaly Feldman
%E Csaba Szepesvári
%F pmlr-v35-tolstikhin14
%I PMLR
%P 857--884
%U https://proceedings.mlr.press/v35/tolstikhin14.html
%V 35
%X We show two novel concentration inequalities for suprema of empirical processes when sampling without replacement, which both take the variance of the functions into account. While these inequalities may have broad applications in learning theory in general, we exemplify their significance by studying the transductive setting of learning theory, for which we provide the first excess risk bounds based on the localized complexity of the hypothesis class; these can yield fast rates of convergence in the transductive learning setting as well. We give a preliminary analysis of the localized complexities for the prominent case of kernel classes.
RIS
TY  - CPAPER
TI  - Localized Complexities for Transductive Learning
AU  - Ilya Tolstikhin
AU  - Gilles Blanchard
AU  - Marius Kloft
BT  - Proceedings of The 27th Conference on Learning Theory
DA  - 2014/05/29
ED  - Maria Florina Balcan
ED  - Vitaly Feldman
ED  - Csaba Szepesvári
ID  - pmlr-v35-tolstikhin14
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 35
SP  - 857
EP  - 884
L1  - http://proceedings.mlr.press/v35/tolstikhin14.pdf
UR  - https://proceedings.mlr.press/v35/tolstikhin14.html
AB  - We show two novel concentration inequalities for suprema of empirical processes when sampling without replacement, which both take the variance of the functions into account. While these inequalities may have broad applications in learning theory in general, we exemplify their significance by studying the transductive setting of learning theory, for which we provide the first excess risk bounds based on the localized complexity of the hypothesis class; these can yield fast rates of convergence in the transductive learning setting as well. We give a preliminary analysis of the localized complexities for the prominent case of kernel classes.
ER  -
APA
Tolstikhin, I., Blanchard, G. & Kloft, M. (2014). Localized Complexities for Transductive Learning. Proceedings of The 27th Conference on Learning Theory, in Proceedings of Machine Learning Research 35:857-884. Available from https://proceedings.mlr.press/v35/tolstikhin14.html.