Nonlinear Information-Theoretic Compressive Measurement Design

Liming Wang, Abolfazl Razi, Miguel Rodrigues, Robert Calderbank, Lawrence Carin
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1161-1169, 2014.

Abstract

We investigate design of general nonlinear functions for mapping high-dimensional data into a lower-dimensional (compressive) space. The nonlinear measurements are assumed contaminated by additive Gaussian noise. Depending on the application, we are either interested in recovering the high-dimensional data from the nonlinear compressive measurements, or performing classification directly based on these measurements. The latter case corresponds to classification based on nonlinearly constituted and noisy features. The nonlinear measurement functions are designed based on constrained mutual-information optimization. New analytic results are developed for the gradient of mutual information in this setting, for arbitrary input-signal statistics. We make connections to kernel-based methods, such as the support vector machine. Encouraging results are presented on multiple datasets, for both signal recovery and classification. The nonlinear approach is shown to be particularly valuable in high-noise scenarios.
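To make the design principle concrete, the linear special case of the measurement model admits a closed-form mutual information: for y = Ax + n with x ~ N(0, Σ) and Gaussian noise, I(x; y) = ½ log det(I + AΣAᵀ/σ²). The sketch below compares a random projection against one aligned with the top eigenvectors of Σ under equal measurement power. This is only an illustrative baseline, not the paper's nonlinear algorithm; the dimensions, spectrum, and eigenvector-aligned design are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear-Gaussian special case of the measurement model:
# x ~ N(0, Sigma) in R^d, compressed to m < d noisy measurements
# y = A x + n, with n ~ N(0, sigma2 * I).
d, m, sigma2 = 20, 4, 0.5

# Hypothetical input statistics: covariance with a decaying spectrum.
U, _ = np.linalg.qr(rng.standard_normal((d, d)))
eigs = 1.0 / np.arange(1, d + 1)
Sigma = U @ np.diag(eigs) @ U.T

def mutual_info(A):
    """I(x; y) = 0.5 * log det(I + A Sigma A^T / sigma2), in nats."""
    k = A.shape[0]
    return 0.5 * np.linalg.slogdet(np.eye(k) + A @ Sigma @ A.T / sigma2)[1]

# Random projection vs. an information-aligned projection (rows are the
# top-m eigenvectors of Sigma), normalized to the same total power m.
A_rand = rng.standard_normal((m, d))
A_rand *= np.sqrt(m / np.sum(A_rand**2))
A_eig = U[:, :m].T  # orthonormal rows, total power = m

print("random design :", mutual_info(A_rand))
print("aligned design:", mutual_info(A_eig))
```

The aligned design captures the high-variance directions of the input and so retains more information per noisy measurement, which is the intuition the paper extends to general nonlinear measurement functions.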

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-wangh14,
  title     = {Nonlinear Information-Theoretic Compressive Measurement Design},
  author    = {Wang, Liming and Razi, Abolfazl and Rodrigues, Miguel and Calderbank, Robert and Carin, Lawrence},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {1161--1169},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/wangh14.pdf},
  url       = {https://proceedings.mlr.press/v32/wangh14.html},
  abstract  = {We investigate design of general nonlinear functions for mapping high-dimensional data into a lower-dimensional (compressive) space. The nonlinear measurements are assumed contaminated by additive Gaussian noise. Depending on the application, we are either interested in recovering the high-dimensional data from the nonlinear compressive measurements, or performing classification directly based on these measurements. The latter case corresponds to classification based on nonlinearly constituted and noisy features. The nonlinear measurement functions are designed based on constrained mutual-information optimization. New analytic results are developed for the gradient of mutual information in this setting, for arbitrary input-signal statistics. We make connections to kernel-based methods, such as the support vector machine. Encouraging results are presented on multiple datasets, for both signal recovery and classification. The nonlinear approach is shown to be particularly valuable in high-noise scenarios.}
}
Endnote
%0 Conference Paper
%T Nonlinear Information-Theoretic Compressive Measurement Design
%A Liming Wang
%A Abolfazl Razi
%A Miguel Rodrigues
%A Robert Calderbank
%A Lawrence Carin
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-wangh14
%I PMLR
%P 1161--1169
%U https://proceedings.mlr.press/v32/wangh14.html
%V 32
%N 2
%X We investigate design of general nonlinear functions for mapping high-dimensional data into a lower-dimensional (compressive) space. The nonlinear measurements are assumed contaminated by additive Gaussian noise. Depending on the application, we are either interested in recovering the high-dimensional data from the nonlinear compressive measurements, or performing classification directly based on these measurements. The latter case corresponds to classification based on nonlinearly constituted and noisy features. The nonlinear measurement functions are designed based on constrained mutual-information optimization. New analytic results are developed for the gradient of mutual information in this setting, for arbitrary input-signal statistics. We make connections to kernel-based methods, such as the support vector machine. Encouraging results are presented on multiple datasets, for both signal recovery and classification. The nonlinear approach is shown to be particularly valuable in high-noise scenarios.
RIS
TY  - CPAPER
TI  - Nonlinear Information-Theoretic Compressive Measurement Design
AU  - Liming Wang
AU  - Abolfazl Razi
AU  - Miguel Rodrigues
AU  - Robert Calderbank
AU  - Lawrence Carin
BT  - Proceedings of the 31st International Conference on Machine Learning
DA  - 2014/06/18
ED  - Eric P. Xing
ED  - Tony Jebara
ID  - pmlr-v32-wangh14
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 32
IS  - 2
SP  - 1161
EP  - 1169
L1  - http://proceedings.mlr.press/v32/wangh14.pdf
UR  - https://proceedings.mlr.press/v32/wangh14.html
AB  - We investigate design of general nonlinear functions for mapping high-dimensional data into a lower-dimensional (compressive) space. The nonlinear measurements are assumed contaminated by additive Gaussian noise. Depending on the application, we are either interested in recovering the high-dimensional data from the nonlinear compressive measurements, or performing classification directly based on these measurements. The latter case corresponds to classification based on nonlinearly constituted and noisy features. The nonlinear measurement functions are designed based on constrained mutual-information optimization. New analytic results are developed for the gradient of mutual information in this setting, for arbitrary input-signal statistics. We make connections to kernel-based methods, such as the support vector machine. Encouraging results are presented on multiple datasets, for both signal recovery and classification. The nonlinear approach is shown to be particularly valuable in high-noise scenarios.
ER  -
APA
Wang, L., Razi, A., Rodrigues, M., Calderbank, R. & Carin, L. (2014). Nonlinear Information-Theoretic Compressive Measurement Design. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):1161-1169. Available from https://proceedings.mlr.press/v32/wangh14.html.