Discriminative Features via Generalized Eigenvectors

Nikos Karampatziakis, Paul Mineiro
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):494-502, 2014.

Abstract

Representing examples in a way that is compatible with the underlying classifier can greatly enhance the performance of a learning system. In this paper we investigate scalable techniques for inducing discriminative features by taking advantage of simple second order structure in the data. We focus on multiclass classification and show that features extracted from the generalized eigenvectors of the class conditional second moments lead to classifiers with excellent empirical performance. Moreover, these features have attractive theoretical properties, such as inducing representations that are invariant to linear transformations of the input. We evaluate classifiers built from these features on three different tasks, obtaining state of the art results.
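The core construction in the abstract — features from generalized eigenvectors of class-conditional second moments — can be sketched briefly. This is a minimal illustration, not the paper's implementation: function and parameter names are made up here, and a small ridge term is added only to keep the denominator matrix positive definite.

```python
import numpy as np
from scipy.linalg import eigh


def gev_features(X, y, class_a, class_b, k=1, reg=1e-6):
    """Illustrative sketch: top-k generalized eigenvectors of the
    class-conditional (uncentered) second moment matrices."""
    Xa = X[y == class_a]
    Xb = X[y == class_b]
    # Class-conditional second moments E[x x^T | class].
    Ca = Xa.T @ Xa / len(Xa)
    Cb = Xb.T @ Xb / len(Xb)
    d = X.shape[1]
    # Solve Ca v = lambda Cb v; the ridge keeps Cb invertible.
    # eigh returns eigenvalues in ascending order, so the last
    # columns are the directions along which class_a has the most
    # second-moment energy relative to class_b.
    _, V = eigh(Ca, Cb + reg * np.eye(d))
    return V[:, -k:]
```

Projections onto these directions (one set per class pair) would then serve as the induced features fed to the downstream classifier.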

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-karampatziakis14,
  title = {Discriminative Features via Generalized Eigenvectors},
  author = {Karampatziakis, Nikos and Mineiro, Paul},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages = {494--502},
  year = {2014},
  editor = {Xing, Eric P. and Jebara, Tony},
  volume = {32},
  number = {1},
  series = {Proceedings of Machine Learning Research},
  address = {Beijing, China},
  month = {22--24 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v32/karampatziakis14.pdf},
  url = {https://proceedings.mlr.press/v32/karampatziakis14.html},
  abstract = {Representing examples in a way that is compatible with the underlying classifier can greatly enhance the performance of a learning system. In this paper we investigate scalable techniques for inducing discriminative features by taking advantage of simple second order structure in the data. We focus on multiclass classification and show that features extracted from the generalized eigenvectors of the class conditional second moments lead to classifiers with excellent empirical performance. Moreover, these features have attractive theoretical properties, such as inducing representations that are invariant to linear transformations of the input. We evaluate classifiers built from these features on three different tasks, obtaining state of the art results.}
}
Endnote
%0 Conference Paper
%T Discriminative Features via Generalized Eigenvectors
%A Nikos Karampatziakis
%A Paul Mineiro
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-karampatziakis14
%I PMLR
%P 494--502
%U https://proceedings.mlr.press/v32/karampatziakis14.html
%V 32
%N 1
%X Representing examples in a way that is compatible with the underlying classifier can greatly enhance the performance of a learning system. In this paper we investigate scalable techniques for inducing discriminative features by taking advantage of simple second order structure in the data. We focus on multiclass classification and show that features extracted from the generalized eigenvectors of the class conditional second moments lead to classifiers with excellent empirical performance. Moreover, these features have attractive theoretical properties, such as inducing representations that are invariant to linear transformations of the input. We evaluate classifiers built from these features on three different tasks, obtaining state of the art results.
RIS
TY - CPAPER
TI - Discriminative Features via Generalized Eigenvectors
AU - Nikos Karampatziakis
AU - Paul Mineiro
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/01/27
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-karampatziakis14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 1
SP - 494
EP - 502
L1 - http://proceedings.mlr.press/v32/karampatziakis14.pdf
UR - https://proceedings.mlr.press/v32/karampatziakis14.html
AB - Representing examples in a way that is compatible with the underlying classifier can greatly enhance the performance of a learning system. In this paper we investigate scalable techniques for inducing discriminative features by taking advantage of simple second order structure in the data. We focus on multiclass classification and show that features extracted from the generalized eigenvectors of the class conditional second moments lead to classifiers with excellent empirical performance. Moreover, these features have attractive theoretical properties, such as inducing representations that are invariant to linear transformations of the input. We evaluate classifiers built from these features on three different tasks, obtaining state of the art results.
ER -
APA
Karampatziakis, N. & Mineiro, P. (2014). Discriminative Features via Generalized Eigenvectors. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(1):494-502. Available from https://proceedings.mlr.press/v32/karampatziakis14.html.