Concentration in unbounded metric spaces and algorithmic stability

Aryeh Kontorovich
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):28-36, 2014.

Abstract

We prove an extension of McDiarmid’s inequality for metric spaces with unbounded diameter. To this end, we introduce the notion of the subgaussian diameter, a distribution-dependent refinement of the metric diameter. Our technique provides an alternative to Kutin and Niyogi’s method of weakly difference-bounded functions, and yields nontrivial, dimension-free results in some interesting cases where the former does not. As an application, we give what appears to be the first generalization bound in the algorithmic stability setting that holds for unbounded loss functions. This yields a novel risk bound for some regularized metric regression algorithms. We give two extensions of the basic concentration result: the first allows the independence assumption to be replaced by appropriate strong mixing, and the second generalizes the subgaussian technique to other Orlicz norms.
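The abstract does not reproduce the definition of the subgaussian diameter or the statement of the concentration bound. The LaTeX sketch below records one natural way to formalize a "distribution-dependent refinement of the metric diameter" via a subgaussian condition on the symmetrized distance, together with the general shape of the resulting McDiarmid-type inequality. The notation (the metric rho, the distribution mu, the symbol Delta_SG, and the constant c) is illustrative and should be checked against the paper itself rather than read as its exact statement.

% Hedged sketch, not taken verbatim from the paper: a subgaussian moment
% condition on the symmetrized distance, plus the shape of the resulting
% McDiarmid-type bound. Symbols \rho, \mu, \Delta_{SG}, c are placeholders.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Let $(\mathcal{X},\rho)$ be a metric space, $\mu$ a probability
distribution on $\mathcal{X}$, $X,X'\sim\mu$ independent, and
$\varepsilon$ a uniform $\pm 1$ sign independent of $(X,X')$. Write
$\Xi = \varepsilon\,\rho(X,X')$ for the symmetrized distance. A
subgaussian-diameter-type quantity can then be defined as
\begin{equation*}
  \Delta_{SG}(\mathcal{X},\mu)
  \;=\;
  \inf\Bigl\{\sigma \ge 0 :
    \mathbb{E}\,e^{\lambda \Xi} \le e^{\sigma^2\lambda^2/2}
    \text{ for all } \lambda \in \mathbb{R}\Bigr\}.
\end{equation*}
For a bounded space this is at most of the order of the metric
diameter, but it can be finite even when the diameter is infinite,
e.g.\ for Gaussian $\mu$ on $\mathbb{R}$. The McDiarmid-type statement
then takes the shape: if $f:\mathcal{X}^n\to\mathbb{R}$ is
$1$-Lipschitz in each coordinate with respect to $\rho$ and
$X_1,\dots,X_n$ are drawn i.i.d.\ from $\mu$, then for some universal
constant $c>0$,
\begin{equation*}
  \Pr\bigl(\lvert f(X_1,\dots,X_n) - \mathbb{E} f\rvert > t\bigr)
  \;\le\;
  2\exp\!\Bigl(-\frac{t^2}{c\, n\, \Delta_{SG}^2(\mathcal{X},\mu)}\Bigr),
  \qquad t > 0.
\end{equation*}

\end{document}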

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-kontorovicha14,
  title     = {Concentration in unbounded metric spaces and algorithmic stability},
  author    = {Kontorovich, Aryeh},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {28--36},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/kontorovicha14.pdf},
  url       = {https://proceedings.mlr.press/v32/kontorovicha14.html}
}
APA
Kontorovich, A. (2014). Concentration in unbounded metric spaces and algorithmic stability. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):28-36. Available from https://proceedings.mlr.press/v32/kontorovicha14.html.