Supervised Dimension Reduction with Topic Models

Khoat Than, Tu Bao Ho, Duy Khuong Nguyen, Ngoc Khanh Pham
Proceedings of the Asian Conference on Machine Learning, PMLR 25:395-410, 2012.

Abstract

We consider supervised dimension reduction (SDR) for problems with discrete variables. Existing methods are computationally expensive and often do not take the local structure of data into consideration when searching for a low-dimensional space. In this paper, we propose a novel framework for SDR which is (1) general and flexible, so that it can be easily adapted to various unsupervised topic models, (2) able to inherit the scalability of unsupervised topic models, and (3) able to exploit label information and the local structure of data well when searching for a new space. Extensive experiments with adaptations to three models demonstrate that our framework can yield scalable, high-quality methods for SDR. One of those adaptations performs better than the state-of-the-art method for SDR while being significantly faster.

Cite this Paper


BibTeX
@InProceedings{pmlr-v25-than12,
  title     = {Supervised Dimension Reduction with Topic Models},
  author    = {Than, Khoat and Ho, Tu Bao and Nguyen, Duy Khuong and Pham, Ngoc Khanh},
  booktitle = {Proceedings of the Asian Conference on Machine Learning},
  pages     = {395--410},
  year      = {2012},
  editor    = {Hoi, Steven C. H. and Buntine, Wray},
  volume    = {25},
  series    = {Proceedings of Machine Learning Research},
  address   = {Singapore Management University, Singapore},
  month     = {04--06 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v25/than12/than12.pdf},
  url       = {https://proceedings.mlr.press/v25/than12.html},
  abstract  = {We consider supervised dimension reduction (SDR) for problems with discrete variables. Existing methods are computationally expensive and often do not take the local structure of data into consideration when searching for a low-dimensional space. In this paper, we propose a novel framework for SDR which is (1) general and flexible, so that it can be easily adapted to various unsupervised topic models, (2) able to inherit the scalability of unsupervised topic models, and (3) able to exploit label information and the local structure of data well when searching for a new space. Extensive experiments with adaptations to three models demonstrate that our framework can yield scalable, high-quality methods for SDR. One of those adaptations performs better than the state-of-the-art method for SDR while being significantly faster.}
}
Endnote
%0 Conference Paper %T Supervised Dimension Reduction with Topic Models %A Khoat Than %A Tu Bao Ho %A Duy Khuong Nguyen %A Ngoc Khanh Pham %B Proceedings of the Asian Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2012 %E Steven C. H. Hoi %E Wray Buntine %F pmlr-v25-than12 %I PMLR %P 395--410 %U https://proceedings.mlr.press/v25/than12.html %V 25 %X We consider supervised dimension reduction (SDR) for problems with discrete variables. Existing methods are computationally expensive and often do not take the local structure of data into consideration when searching for a low-dimensional space. In this paper, we propose a novel framework for SDR which is (1) general and flexible, so that it can be easily adapted to various unsupervised topic models, (2) able to inherit the scalability of unsupervised topic models, and (3) able to exploit label information and the local structure of data well when searching for a new space. Extensive experiments with adaptations to three models demonstrate that our framework can yield scalable, high-quality methods for SDR. One of those adaptations performs better than the state-of-the-art method for SDR while being significantly faster.
RIS
TY - CPAPER TI - Supervised Dimension Reduction with Topic Models AU - Khoat Than AU - Tu Bao Ho AU - Duy Khuong Nguyen AU - Ngoc Khanh Pham BT - Proceedings of the Asian Conference on Machine Learning DA - 2012/11/17 ED - Steven C. H. Hoi ED - Wray Buntine ID - pmlr-v25-than12 PB - PMLR DP - Proceedings of Machine Learning Research VL - 25 SP - 395 EP - 410 L1 - http://proceedings.mlr.press/v25/than12/than12.pdf UR - https://proceedings.mlr.press/v25/than12.html AB - We consider supervised dimension reduction (SDR) for problems with discrete variables. Existing methods are computationally expensive and often do not take the local structure of data into consideration when searching for a low-dimensional space. In this paper, we propose a novel framework for SDR which is (1) general and flexible, so that it can be easily adapted to various unsupervised topic models, (2) able to inherit the scalability of unsupervised topic models, and (3) able to exploit label information and the local structure of data well when searching for a new space. Extensive experiments with adaptations to three models demonstrate that our framework can yield scalable, high-quality methods for SDR. One of those adaptations performs better than the state-of-the-art method for SDR while being significantly faster. ER -
APA
Than, K., Ho, T.B., Nguyen, D.K. &amp; Pham, N.K. (2012). Supervised Dimension Reduction with Topic Models. Proceedings of the Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 25:395-410. Available from https://proceedings.mlr.press/v25/than12.html.