Entropy-Based Concentration Inequalities for Dependent Variables

Liva Ralaivola, Massih-Reza Amini
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:2436-2444, 2015.

Abstract

We provide new concentration inequalities for functions of dependent variables. The work extends that of Janson (2004), which proposes concentration inequalities obtained by combining the Laplace transform with the idea of fractional graph coloring, as well as the many works that derive concentration inequalities via the entropy method (see, e.g., Boucheron et al., 2003). We give inequalities for fractionally sub-additive and fractionally self-bounding functions. Along the way, we prove a new Talagrand-type concentration inequality for fractionally sub-additive functions of dependent variables. These results allow us to envision the derivation of generalization bounds for various applications where dependent variables naturally appear, such as bipartite ranking.
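As background (not stated on this page, and paraphrased from Janson, 2004, rather than taken from the paper itself): the starting point being extended is a Hoeffding-type bound for sums of partly dependent variables. For a sum X = X_1 + ... + X_n with each X_i bounded in [a_i, b_i], dependency graph Γ, and fractional chromatic number χ*(Γ), Janson's Laplace-transform/fractional-coloring argument gives a bound of the form

\[
\Pr\bigl(X - \mathbb{E}[X] \ge t\bigr) \;\le\; \exp\!\left(-\frac{2t^2}{\chi^{*}(\Gamma)\,\sum_{i=1}^{n}(b_i - a_i)^2}\right),
\]

which reduces to Hoeffding's inequality when the variables are independent (χ*(Γ) = 1). The present paper pursues analogous results through the entropy method, covering fractionally sub-additive and fractionally self-bounding functions rather than only sums.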

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-ralaivola15,
  title     = {Entropy-Based Concentration Inequalities for Dependent Variables},
  author    = {Ralaivola, Liva and Amini, Massih-Reza},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {2436--2444},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/ralaivola15.pdf},
  url       = {https://proceedings.mlr.press/v37/ralaivola15.html}
}
APA
Ralaivola, L. & Amini, M. (2015). Entropy-Based Concentration Inequalities for Dependent Variables. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:2436-2444. Available from https://proceedings.mlr.press/v37/ralaivola15.html.

Related Material

Download PDF: http://proceedings.mlr.press/v37/ralaivola15.pdf