Breaking Sticks and Ambiguities with Adaptive Skip-gram

Sergey Bartunov, Dmitry Kondrashkin, Anton Osokin, Dmitry Vetrov
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:130-138, 2016.

Abstract

The recently proposed Skip-gram model is a powerful method for learning high-dimensional word representations that capture rich semantic relationships between words. However, Skip-gram, like most prior work on learning word representations, does not take word ambiguity into account and maintains only a single representation per word. Although a number of Skip-gram modifications have been proposed to overcome this limitation and learn multi-prototype word representations, they either require the number of word meanings to be known in advance or learn it with greedy heuristics. In this paper we propose the Adaptive Skip-gram model, a nonparametric Bayesian extension of Skip-gram capable of automatically learning the required number of representations for all words at a desired semantic resolution. We derive an efficient online variational learning algorithm for the model and empirically demonstrate its effectiveness on the word-sense induction task.
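The "sticks" in the title refer to the stick-breaking construction of the Dirichlet process, which gives each word a prior over a potentially unbounded number of senses. The sketch below is illustrative rather than the paper's implementation (the function name, truncation level, and printed summary are ours): it samples sense probabilities from a truncated stick-breaking prior and shows how the concentration parameter alpha governs how many senses receive non-negligible mass, i.e. the semantic resolution.

    import numpy as np

    def stick_breaking_sense_probs(alpha, max_senses=30, rng=None):
        # Truncated stick-breaking (GEM) prior:
        #   beta_k ~ Beta(1, alpha)
        #   pi_k   = beta_k * prod_{r<k} (1 - beta_r)
        rng = np.random.default_rng() if rng is None else rng
        betas = rng.beta(1.0, alpha, size=max_senses)
        remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
        return betas * remaining

    for alpha in (0.1, 1.0, 5.0):
        pi = stick_breaking_sense_probs(alpha, rng=np.random.default_rng(0))
        print(f"alpha={alpha}: {np.sum(pi > 0.01)} senses with mass > 1%")

With a small alpha nearly all mass falls on the first sense, while a larger alpha spreads it across several; this is the mechanism by which such a nonparametric prior lets the number of prototypes per word grow or shrink with the data.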

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-bartunov16,
  title     = {Breaking Sticks and Ambiguities with Adaptive Skip-gram},
  author    = {Bartunov, Sergey and Kondrashkin, Dmitry and Osokin, Anton and Vetrov, Dmitry},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {130--138},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/bartunov16.pdf},
  url       = {https://proceedings.mlr.press/v51/bartunov16.html}
}
EndNote
%0 Conference Paper
%T Breaking Sticks and Ambiguities with Adaptive Skip-gram
%A Sergey Bartunov
%A Dmitry Kondrashkin
%A Anton Osokin
%A Dmitry Vetrov
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-bartunov16
%I PMLR
%P 130--138
%U https://proceedings.mlr.press/v51/bartunov16.html
%V 51
RIS
TY  - CPAPER
TI  - Breaking Sticks and Ambiguities with Adaptive Skip-gram
AU  - Sergey Bartunov
AU  - Dmitry Kondrashkin
AU  - Anton Osokin
AU  - Dmitry Vetrov
BT  - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA  - 2016/05/02
ED  - Arthur Gretton
ED  - Christian C. Robert
ID  - pmlr-v51-bartunov16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 51
SP  - 130
EP  - 138
L1  - http://proceedings.mlr.press/v51/bartunov16.pdf
UR  - https://proceedings.mlr.press/v51/bartunov16.html
ER  -
APA
Bartunov, S., Kondrashkin, D., Osokin, A. & Vetrov, D. (2016). Breaking Sticks and Ambiguities with Adaptive Skip-gram. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:130-138. Available from https://proceedings.mlr.press/v51/bartunov16.html.
