Associative Long Short-Term Memory

Ivo Danihelka, Greg Wayne, Benigno Uria, Nal Kalchbrenner, Alex Graves
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1986-1994, 2016.

Abstract

We investigate a new method to augment recurrent neural networks with extra memory without increasing the number of network parameters. The system has an associative memory based on complex-valued vectors and is closely related to Holographic Reduced Representations and Long Short-Term Memory networks. Holographic Reduced Representations have limited capacity: as they store more information, each retrieval becomes noisier due to interference. Our system, in contrast, creates redundant copies of stored information, which enables retrieval with reduced noise. Experiments demonstrate faster learning on multiple memorization tasks.
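As a concrete illustration of the mechanism the abstract describes, the following minimal NumPy sketch (not the paper's code; the vector sizes, item counts, and variable names are illustrative) binds key-value pairs by elementwise complex multiplication, the frequency-domain form of the circular convolution used in Holographic Reduced Representations, retrieves with the key's complex conjugate, and shows how averaging over redundant copies addressed through independent key permutations reduces retrieval noise:

    import numpy as np

    rng = np.random.default_rng(0)

    def random_key(n):
        # Unit-modulus complex vector: elementwise multiplication by it
        # is a unitary "binding" operation (the frequency-domain view of
        # Holographic Reduced Representation binding).
        return np.exp(1j * rng.uniform(0, 2 * np.pi, n))

    n, items = 64, 20
    keys = [random_key(n) for _ in range(items)]
    values = [rng.standard_normal(n) + 1j * rng.standard_normal(n)
              for _ in range(items)]

    # Plain HRR trace: superpose all key-value bindings in one vector.
    trace = sum(k * v for k, v in zip(keys, values))

    # Retrieval: unbind with the complex conjugate of the key. The other
    # stored items do not cancel exactly, so they contribute interference
    # noise that grows with the number of stored items.
    noisy = np.conj(keys[0]) * trace
    print("single-copy error:",
          np.linalg.norm(noisy - values[0]) / np.linalg.norm(values[0]))

    # Redundant copies: keep several traces, each addressed through an
    # independent random permutation of the key. The retrieval noise in
    # each copy is approximately independent, so averaging reduces it.
    copies = 8
    perms = [rng.permutation(n) for _ in range(copies)]
    traces = [sum(k[p] * v for k, v in zip(keys, values)) for p in perms]
    avg = np.mean([np.conj(keys[0][p]) * t for p, t in zip(perms, traces)],
                  axis=0)
    print("redundant-copy error:",
          np.linalg.norm(avg - values[0]) / np.linalg.norm(values[0]))

With a single trace, the relative retrieval error grows with the number of stored items; averaging the redundant copies shrinks it roughly by the square root of the number of copies, which is the capacity benefit the abstract claims.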

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-danihelka16,
  title     = {Associative Long Short-Term Memory},
  author    = {Danihelka, Ivo and Wayne, Greg and Uria, Benigno and Kalchbrenner, Nal and Graves, Alex},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {1986--1994},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/danihelka16.pdf},
  url       = {https://proceedings.mlr.press/v48/danihelka16.html}
}
APA
Danihelka, I., Wayne, G., Uria, B., Kalchbrenner, N. & Graves, A. (2016). Associative Long Short-Term Memory. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:1986-1994. Available from https://proceedings.mlr.press/v48/danihelka16.html.
