Approximating Mutual Information by Maximum Likelihood Density Ratio Estimation

Taiji Suzuki, Masashi Sugiyama, Jun Sese, Takafumi Kanamori
Proceedings of the Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery at ECML/PKDD 2008, PMLR 4:5-20, 2008.

Abstract

Mutual information is useful in various data processing tasks such as feature selection or independent component analysis. In this paper, we propose a new method of approximating mutual information based on maximum likelihood estimation of a density ratio function. Our method, called Maximum Likelihood Mutual Information (MLMI), has several attractive properties, e.g., density estimation is not involved, it is a single-shot procedure, the global optimal solution can be efficiently computed, and cross-validation is available for model selection. Numerical experiments show that MLMI compares favorably with existing methods.
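The abstract describes estimating mutual information by directly fitting the density ratio $w(x,y) = p(x,y)/(p(x)p(y))$ via maximum likelihood, rather than estimating the densities themselves. The sketch below illustrates this idea on toy data; it is not the paper's implementation. All names, the Gaussian-kernel basis, the number of centres, the kernel width, and the EM-style multiplicative solver are illustrative assumptions (the paper formulates a convex optimization with cross-validated model selection).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: correlated Gaussian pair; analytic MI = -0.5*log(1 - rho^2) nats
rho, n = 0.8, 500
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
x, y = z[:, 0], z[:, 1]

# Illustrative choice: Gaussian kernel basis centred on the first 100 paired samples
centers, sigma = z[:100], 1.0
Phi = np.exp(-((x[:, None] - centers[None, :, 0])**2
               + (y[:, None] - centers[None, :, 1])**2) / (2 * sigma**2))

# E_{p(x)p(y)}[phi_l] factorises for product kernels, so the normaliser over
# all n^2 pairs (x_i, y_j) can be computed exactly without forming them.
mx = np.exp(-(x[:, None] - centers[None, :, 0])**2 / (2 * sigma**2)).mean(0)
my = np.exp(-(y[:, None] - centers[None, :, 1])**2 / (2 * sigma**2)).mean(0)
m = mx * my

# Maximise (1/n) sum_i log w(x_i, y_i) subject to E_{p(x)p(y)}[w] = 1, w >= 0,
# with the linear model w = Phi @ alpha.  Substituting beta_l = alpha_l * m_l
# turns this into a mixture-weight MLE, so the EM-style multiplicative update
# below increases the likelihood monotonically (a stand-in for the convex
# solver used in the paper).
F = Phi / m               # f_l(i) = phi_l(x_i, y_i) / E_prod[phi_l]
beta = np.full(centers.shape[0], 1.0 / centers.shape[0])
for _ in range(300):
    r = F * beta                                   # unnormalised responsibilities
    beta = (r / r.sum(axis=1, keepdims=True)).mean(axis=0)

w_joint = F @ beta        # fitted ratio p(x,y)/(p(x)p(y)) at the paired samples
mi_hat = np.log(w_joint).mean()
print(f"ratio-based MI estimate: {mi_hat:.3f}  "
      f"(analytic: {-0.5 * np.log(1 - rho**2):.3f})")
```

Note that the mutual information estimate is simply the sample average of $\log \hat{w}$ over the paired samples, so no density estimate is ever formed; this is the property the abstract highlights.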

Cite this Paper


BibTeX
@InProceedings{pmlr-v4-suzuki08a,
  title     = {Approximating Mutual Information by Maximum Likelihood Density Ratio Estimation},
  author    = {Suzuki, Taiji and Sugiyama, Masashi and Sese, Jun and Kanamori, Takafumi},
  booktitle = {Proceedings of the Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery at ECML/PKDD 2008},
  pages     = {5--20},
  year      = {2008},
  editor    = {Saeys, Yvan and Liu, Huan and Inza, Iñaki and Wehenkel, Louis and Pee, Yves Van de},
  volume    = {4},
  series    = {Proceedings of Machine Learning Research},
  address   = {Antwerp, Belgium},
  month     = {15 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v4/suzuki08a/suzuki08a.pdf},
  url       = {https://proceedings.mlr.press/v4/suzuki08a.html},
  abstract  = {Mutual information is useful in various data processing tasks such as feature selection or independent component analysis. In this paper, we propose a new method of approximating mutual information based on maximum likelihood estimation of a density ratio function. Our method, called Maximum Likelihood Mutual Information (MLMI), has several attractive properties, e.g., density estimation is not involved, it is a single-shot procedure, the global optimal solution can be efficiently computed, and cross-validation is available for model selection. Numerical experiments show that MLMI compares favorably with existing methods.}
}
Endnote
%0 Conference Paper
%T Approximating Mutual Information by Maximum Likelihood Density Ratio Estimation
%A Taiji Suzuki
%A Masashi Sugiyama
%A Jun Sese
%A Takafumi Kanamori
%B Proceedings of the Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery at ECML/PKDD 2008
%C Proceedings of Machine Learning Research
%D 2008
%E Yvan Saeys
%E Huan Liu
%E Iñaki Inza
%E Louis Wehenkel
%E Yves Van de Pee
%F pmlr-v4-suzuki08a
%I PMLR
%P 5--20
%U https://proceedings.mlr.press/v4/suzuki08a.html
%V 4
%X Mutual information is useful in various data processing tasks such as feature selection or independent component analysis. In this paper, we propose a new method of approximating mutual information based on maximum likelihood estimation of a density ratio function. Our method, called Maximum Likelihood Mutual Information (MLMI), has several attractive properties, e.g., density estimation is not involved, it is a single-shot procedure, the global optimal solution can be efficiently computed, and cross-validation is available for model selection. Numerical experiments show that MLMI compares favorably with existing methods.
RIS
TY  - CPAPER
TI  - Approximating Mutual Information by Maximum Likelihood Density Ratio Estimation
AU  - Taiji Suzuki
AU  - Masashi Sugiyama
AU  - Jun Sese
AU  - Takafumi Kanamori
BT  - Proceedings of the Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery at ECML/PKDD 2008
DA  - 2008/09/11
ED  - Yvan Saeys
ED  - Huan Liu
ED  - Iñaki Inza
ED  - Louis Wehenkel
ED  - Yves Van de Pee
ID  - pmlr-v4-suzuki08a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 4
SP  - 5
EP  - 20
L1  - http://proceedings.mlr.press/v4/suzuki08a/suzuki08a.pdf
UR  - https://proceedings.mlr.press/v4/suzuki08a.html
AB  - Mutual information is useful in various data processing tasks such as feature selection or independent component analysis. In this paper, we propose a new method of approximating mutual information based on maximum likelihood estimation of a density ratio function. Our method, called Maximum Likelihood Mutual Information (MLMI), has several attractive properties, e.g., density estimation is not involved, it is a single-shot procedure, the global optimal solution can be efficiently computed, and cross-validation is available for model selection. Numerical experiments show that MLMI compares favorably with existing methods.
ER  -
APA
Suzuki, T., Sugiyama, M., Sese, J. &amp; Kanamori, T. (2008). Approximating Mutual Information by Maximum Likelihood Density Ratio Estimation. Proceedings of the Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery at ECML/PKDD 2008, in Proceedings of Machine Learning Research 4:5-20. Available from https://proceedings.mlr.press/v4/suzuki08a.html.