Compositional Morphology for Word Representations and Language Modelling

Jan Botha, Phil Blunsom
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1899-1907, 2014.

Abstract

This paper presents a scalable method for integrating compositional morphological representations into a vector-based probabilistic language model. Our approach is evaluated in the context of log-bilinear language models, rendered suitably efficient for implementation inside a machine translation decoder by factoring the vocabulary. We perform both intrinsic and extrinsic evaluations, presenting results on a range of languages which demonstrate that our model learns morphological representations that both perform well on word similarity tasks and lead to substantial reductions in perplexity. When used for translation into morphologically rich languages with large vocabularies, our models obtain improvements of up to 1.2 BLEU points relative to a baseline system using back-off n-gram models.
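The core idea of compositional morphological representations can be sketched as an additive model: a word's vector is the sum of a vector for its surface form and vectors for its morphemes. This is an illustrative assumption-laden sketch, not the authors' implementation; the toy segmentation, the `compose` helper, and the random embeddings are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4

# Toy embedding table covering one surface form and its (assumed) morphemes.
vecs = {m: rng.standard_normal(dim)
        for m in ["imperfection", "im", "perfect", "ion"]}

def compose(word, morphemes):
    """Additive composition: surface-form vector plus its morpheme vectors."""
    return vecs[word] + sum(vecs[m] for m in morphemes)

v = compose("imperfection", ["im", "perfect", "ion"])
print(v.shape)  # (4,)
```

Because composition is a plain sum, rare or unseen inflections of a known stem still receive informative vectors through their shared morphemes, which is what drives the perplexity and word-similarity gains reported above.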

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-botha14,
  title     = {Compositional Morphology for Word Representations and Language Modelling},
  author    = {Botha, Jan and Blunsom, Phil},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {1899--1907},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/botha14.pdf},
  url       = {https://proceedings.mlr.press/v32/botha14.html},
  abstract  = {This paper presents a scalable method for integrating compositional morphological representations into a vector-based probabilistic language model. Our approach is evaluated in the context of log-bilinear language models, rendered suitably efficient for implementation inside a machine translation decoder by factoring the vocabulary. We perform both intrinsic and extrinsic evaluations, presenting results on a range of languages which demonstrate that our model learns morphological representations that both perform well on word similarity tasks and lead to substantial reductions in perplexity. When used for translation into morphologically rich languages with large vocabularies, our models obtain improvements of up to 1.2 BLEU points relative to a baseline system using back-off n-gram models.}
}
Endnote
%0 Conference Paper
%T Compositional Morphology for Word Representations and Language Modelling
%A Jan Botha
%A Phil Blunsom
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-botha14
%I PMLR
%P 1899--1907
%U https://proceedings.mlr.press/v32/botha14.html
%V 32
%N 2
%X This paper presents a scalable method for integrating compositional morphological representations into a vector-based probabilistic language model. Our approach is evaluated in the context of log-bilinear language models, rendered suitably efficient for implementation inside a machine translation decoder by factoring the vocabulary. We perform both intrinsic and extrinsic evaluations, presenting results on a range of languages which demonstrate that our model learns morphological representations that both perform well on word similarity tasks and lead to substantial reductions in perplexity. When used for translation into morphologically rich languages with large vocabularies, our models obtain improvements of up to 1.2 BLEU points relative to a baseline system using back-off n-gram models.
RIS
TY  - CPAPER
TI  - Compositional Morphology for Word Representations and Language Modelling
AU  - Jan Botha
AU  - Phil Blunsom
BT  - Proceedings of the 31st International Conference on Machine Learning
DA  - 2014/06/18
ED  - Eric P. Xing
ED  - Tony Jebara
ID  - pmlr-v32-botha14
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 32
IS  - 2
SP  - 1899
EP  - 1907
L1  - http://proceedings.mlr.press/v32/botha14.pdf
UR  - https://proceedings.mlr.press/v32/botha14.html
AB  - This paper presents a scalable method for integrating compositional morphological representations into a vector-based probabilistic language model. Our approach is evaluated in the context of log-bilinear language models, rendered suitably efficient for implementation inside a machine translation decoder by factoring the vocabulary. We perform both intrinsic and extrinsic evaluations, presenting results on a range of languages which demonstrate that our model learns morphological representations that both perform well on word similarity tasks and lead to substantial reductions in perplexity. When used for translation into morphologically rich languages with large vocabularies, our models obtain improvements of up to 1.2 BLEU points relative to a baseline system using back-off n-gram models.
ER  -
APA
Botha, J. & Blunsom, P. (2014). Compositional Morphology for Word Representations and Language Modelling. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):1899-1907. Available from https://proceedings.mlr.press/v32/botha14.html.

Related Material