On the Estimation of $\alpha$-Divergences

Barnabas Poczos, Jeff Schneider
Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, PMLR 15:609-617, 2011.

Abstract

We propose new nonparametric, consistent Rényi-$\alpha$ and Tsallis-$\alpha$ divergence estimators for continuous distributions. Given two independent and identically distributed samples, a ‘brute force’ approach would be simply to estimate the underlying densities, and plug these densities into the corresponding formulas. However, it is not our goal to consistently estimate these possibly high dimensional densities, and our algorithm avoids estimating them. We will use simple $k$-nearest-neighbor statistics, and interestingly enough, we will still be able to prove that the proposed divergence estimators are consistent under certain conditions. We will also show how to use them for mutual information estimation, and demonstrate their efficiency by some numerical experiments.
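The abstract describes estimating divergences directly from $k$-nearest-neighbor distance statistics, without ever forming density estimates. A minimal sketch of an estimator of this kind (the distance-ratio form and the gamma-function bias-correction constant follow the $k$-NN construction described in the paper; the function name, defaults, and exact normalization here are illustrative assumptions, not the authors' reference implementation):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gammaln

def knn_alpha_divergences(x, y, alpha=0.8, k=5):
    """Sketch of k-NN estimators for the Renyi-alpha and Tsallis-alpha
    divergences D(p || q), given i.i.d. samples x ~ p (n x d) and
    y ~ q (m x d).

    The core quantity is an estimate of M = integral p(t)^alpha q(t)^(1-alpha) dt,
    built from ratios of k-th nearest-neighbor distances and corrected by a
    constant depending only on k and alpha.
    """
    x = np.atleast_2d(np.asarray(x, dtype=float))
    y = np.atleast_2d(np.asarray(y, dtype=float))
    n, d = x.shape
    m = y.shape[0]
    # rho[i]: distance from x_i to its k-th nearest neighbor within x \ {x_i}
    # (query k+1 neighbors because the nearest one is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, k]
    # nu[i]: distance from x_i to its k-th nearest neighbor in y.
    nu = cKDTree(y).query(x, k=k)[0][:, k - 1]
    # Bias-correction constant B_{k,alpha} =
    #   Gamma(k)^2 / (Gamma(k - alpha + 1) * Gamma(k + alpha - 1)),
    # computed in log space; requires k > |1 - alpha|.
    logB = 2 * gammaln(k) - gammaln(k - alpha + 1) - gammaln(k + alpha - 1)
    # Average of ((n-1) * rho^d / (m * nu^d))^(1 - alpha) over the sample;
    # continuous samples make zero distances a probability-zero event.
    ratio = ((n - 1) * rho**d) / (m * nu**d)
    M = np.exp(logB) * np.mean(ratio ** (1.0 - alpha))
    renyi = np.log(M) / (alpha - 1.0)
    tsallis = (M - 1.0) / (alpha - 1.0)
    return renyi, tsallis
```

With, say, 1-D Gaussian samples, the estimate for two well-separated Gaussians comes out clearly larger than for two independent samples from the same Gaussian, which is the qualitative behavior a consistent divergence estimator should show.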

Cite this Paper


BibTeX
@InProceedings{pmlr-v15-poczos11a,
  title     = {On the Estimation of $\alpha$-Divergences},
  author    = {Poczos, Barnabas and Schneider, Jeff},
  booktitle = {Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {609--617},
  year      = {2011},
  editor    = {Gordon, Geoffrey and Dunson, David and Dudík, Miroslav},
  volume    = {15},
  series    = {Proceedings of Machine Learning Research},
  address   = {Fort Lauderdale, FL, USA},
  month     = {11--13 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v15/poczos11a/poczos11a.pdf},
  url       = {https://proceedings.mlr.press/v15/poczos11a.html}
}
Endnote
%0 Conference Paper
%T On the Estimation of $\alpha$-Divergences
%A Barnabas Poczos
%A Jeff Schneider
%B Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2011
%E Geoffrey Gordon
%E David Dunson
%E Miroslav Dudík
%F pmlr-v15-poczos11a
%I PMLR
%P 609--617
%U https://proceedings.mlr.press/v15/poczos11a.html
%V 15
RIS
TY - CPAPER
TI - On the Estimation of $\alpha$-Divergences
AU - Barnabas Poczos
AU - Jeff Schneider
BT - Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics
DA - 2011/06/14
ED - Geoffrey Gordon
ED - David Dunson
ED - Miroslav Dudík
ID - pmlr-v15-poczos11a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 15
SP - 609
EP - 617
L1 - http://proceedings.mlr.press/v15/poczos11a/poczos11a.pdf
UR - https://proceedings.mlr.press/v15/poczos11a.html
ER -
APA
Poczos, B. & Schneider, J. (2011). On the Estimation of $\alpha$-Divergences. Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 15:609-617. Available from https://proceedings.mlr.press/v15/poczos11a.html.