Affinity Weighted Embedding

Jason Weston, Ron Weiss, Hector Yee
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1215-1223, 2014.

Abstract

Supervised linear embedding models like Wsabie (Weston et al., 2011) and supervised semantic indexing (Bai et al., 2010) have proven successful at ranking, recommendation and annotation tasks. However, despite being scalable to large datasets they do not take full advantage of the extra data due to their linear nature, and we believe they typically underfit. We propose a new class of models which aim to provide improved performance while retaining many of the benefits of the existing class of embedding models. Our approach works by reweighting each component of the embedding of features and labels with a potentially nonlinear affinity function. We describe several variants of the family, and show its usefulness on several datasets.
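The reweighting idea in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's actual model: the embedding matrices `U`, `V` and the Gaussian per-component affinity `G` are hypothetical stand-ins (the paper explores several choices of affinity function), shown only to contrast a plain linear embedding score with an affinity-weighted one.

```python
import numpy as np

rng = np.random.default_rng(0)
d_feat, d_label, d_embed = 50, 30, 10

# Hypothetical learned embedding matrices for features (U) and labels (V);
# in a Wsabie-style linear model the score is the inner product of Ux and Vy.
U = rng.normal(size=(d_embed, d_feat))
V = rng.normal(size=(d_embed, d_label))

x = rng.normal(size=d_feat)   # feature vector
y = rng.normal(size=d_label)  # label vector

def linear_score(x, y):
    """Plain linear embedding score: (Ux) . (Vy)."""
    return float((U @ x) @ (V @ y))

def affinity_weighted_score(x, y, gamma=0.1):
    """Reweight each embedding component by a (here illustrative,
    nonlinear) affinity G_i(x, y) before summing."""
    ex, ey = U @ x, V @ y
    # Illustrative affinity: per-component Gaussian similarity of the
    # two embeddings; with gamma = 0 this reduces to the linear model.
    G = np.exp(-gamma * (ex - ey) ** 2)
    return float(np.sum(G * ex * ey))
```

With `gamma = 0` the affinity weights are all one and the model collapses to the ordinary linear embedding score, which is one way to see the family as a strict generalization.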

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-weston14,
  title     = {Affinity Weighted Embedding},
  author    = {Weston, Jason and Weiss, Ron and Yee, Hector},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {1215--1223},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/weston14.pdf},
  url       = {https://proceedings.mlr.press/v32/weston14.html},
  abstract  = {Supervised linear embedding models like Wsabie (Weston et al., 2011) and supervised semantic indexing (Bai et al., 2010) have proven successful at ranking, recommendation and annotation tasks. However, despite being scalable to large datasets they do not take full advantage of the extra data due to their linear nature, and we believe they typically underfit. We propose a new class of models which aim to provide improved performance while retaining many of the benefits of the existing class of embedding models. Our approach works by reweighting each component of the embedding of features and labels with a potentially nonlinear affinity function. We describe several variants of the family, and show its usefulness on several datasets.}
}
Endnote
%0 Conference Paper
%T Affinity Weighted Embedding
%A Jason Weston
%A Ron Weiss
%A Hector Yee
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-weston14
%I PMLR
%P 1215--1223
%U https://proceedings.mlr.press/v32/weston14.html
%V 32
%N 2
%X Supervised linear embedding models like Wsabie (Weston et al., 2011) and supervised semantic indexing (Bai et al., 2010) have proven successful at ranking, recommendation and annotation tasks. However, despite being scalable to large datasets they do not take full advantage of the extra data due to their linear nature, and we believe they typically underfit. We propose a new class of models which aim to provide improved performance while retaining many of the benefits of the existing class of embedding models. Our approach works by reweighting each component of the embedding of features and labels with a potentially nonlinear affinity function. We describe several variants of the family, and show its usefulness on several datasets.
RIS
TY - CPAPER
TI - Affinity Weighted Embedding
AU - Jason Weston
AU - Ron Weiss
AU - Hector Yee
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/06/18
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-weston14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 2
SP - 1215
EP - 1223
L1 - http://proceedings.mlr.press/v32/weston14.pdf
UR - https://proceedings.mlr.press/v32/weston14.html
AB - Supervised linear embedding models like Wsabie (Weston et al., 2011) and supervised semantic indexing (Bai et al., 2010) have proven successful at ranking, recommendation and annotation tasks. However, despite being scalable to large datasets they do not take full advantage of the extra data due to their linear nature, and we believe they typically underfit. We propose a new class of models which aim to provide improved performance while retaining many of the benefits of the existing class of embedding models. Our approach works by reweighting each component of the embedding of features and labels with a potentially nonlinear affinity function. We describe several variants of the family, and show its usefulness on several datasets.
ER -
APA
Weston, J., Weiss, R. & Yee, H. (2014). Affinity Weighted Embedding. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):1215-1223. Available from https://proceedings.mlr.press/v32/weston14.html.