Iterative Learning and Denoising in Convolutional Neural Associative Memories

Amin Karbasi, Amir Hesam Salavati, Amin Shokrollahi
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(1):445-453, 2013.

Abstract

The task of a neural associative memory is to retrieve a set of previously memorized patterns from their noisy versions by using a network of neurons. Hence, an ideal network should be able to 1) gradually learn a set of patterns, 2) retrieve the correct pattern from noisy queries, and 3) maximize the number of memorized patterns while maintaining reliability in responding to queries. We show that by considering the inherent redundancy in the memorized patterns, one can obtain all three properties at once. This is in sharp contrast with previous work, which could only improve one or two aspects at the expense of the third. More specifically, we devise an iterative algorithm that learns the redundancy among the patterns. The resulting network has a retrieval capacity that is exponential in the size of the network. Lastly, by considering the local structures of the network, the asymptotic error correction performance can be made linear in the size of the network.
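To make the abstract's idea concrete, here is a toy sketch in the spirit of the paper, not the authors' exact algorithm: patterns confined to a low-dimensional subspace carry redundancy, which a network can learn iteratively as a set of linear constraints and then exploit to denoise a corrupted query. All dimensions, learning rates, and update rules below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 16, 4                      # network size, pattern-subspace dimension

basis = rng.standard_normal((k, n))      # hypothetical pattern subspace
def sample_pattern():
    return basis.T @ rng.standard_normal(k)

# Iterative learning: drive m = n - k "constraint" vectors w toward
# w . x = 0 for every memorized pattern (learning the redundancy).
m, lr = n - k, 0.05
W = rng.standard_normal((m, n))
for _ in range(500):
    x = sample_pattern()
    W -= lr * np.outer(W @ x, x) / (x @ x)          # shrink violations w . x
    W /= np.linalg.norm(W, axis=1, keepdims=True)   # keep unit-norm rows

# Retrieval: a noisy query violates the learned constraints; removing the
# violating component projects the query back toward the pattern subspace.
x_true = sample_pattern()
x_noisy = x_true.copy()
x_noisy[3] += 2.0                                   # corrupt one neuron
x_hat = x_noisy - np.linalg.pinv(W) @ (W @ x_noisy)

violation = np.linalg.norm(W @ x_true) / np.linalg.norm(x_true)
err_before = np.linalg.norm(x_noisy - x_true)
err_after = np.linalg.norm(x_hat - x_true)
print(violation, err_before, err_after)
```

After training, memorized patterns satisfy the learned constraints almost exactly (small `violation`), while a noisy query does not; the projection step reduces the retrieval error. The paper's actual construction is convolutional and uses local, message-passing-style correction rather than the global pseudo-inverse used here for brevity.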

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-karbasi13,
  title     = {Iterative Learning and Denoising in Convolutional Neural Associative Memories},
  author    = {Karbasi, Amin and Salavati, Amir Hesam and Shokrollahi, Amin},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {445--453},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/karbasi13.pdf},
  url       = {https://proceedings.mlr.press/v28/karbasi13.html},
  abstract  = {The task of a neural associative memory is to retrieve a set of previously memorized patterns from their noisy versions by using a network of neurons. Hence, an ideal network should be able to 1) gradually learn a set of patterns, 2) retrieve the correct pattern from noisy queries and 3) maximize the number of memorized patterns while maintaining the reliability in responding to queries. We show that by considering the inherent redundancy in the memorized patterns, one can obtain all the mentioned properties at once. This is in sharp contrast with the previous work that could only improve one or two aspects at the expense of the third. More specifically, we devise an iterative algorithm that learns the redundancy among the patterns. The resulting network has a retrieval capacity that is exponential in the size of the network. Lastly, by considering the local structures of the network, the asymptotic error correction performance can be made linear in the size of the network.}
}
Endnote
%0 Conference Paper
%T Iterative Learning and Denoising in Convolutional Neural Associative Memories
%A Amin Karbasi
%A Amir Hesam Salavati
%A Amin Shokrollahi
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-karbasi13
%I PMLR
%P 445--453
%U https://proceedings.mlr.press/v28/karbasi13.html
%V 28
%N 1
%X The task of a neural associative memory is to retrieve a set of previously memorized patterns from their noisy versions by using a network of neurons. Hence, an ideal network should be able to 1) gradually learn a set of patterns, 2) retrieve the correct pattern from noisy queries and 3) maximize the number of memorized patterns while maintaining the reliability in responding to queries. We show that by considering the inherent redundancy in the memorized patterns, one can obtain all the mentioned properties at once. This is in sharp contrast with the previous work that could only improve one or two aspects at the expense of the third. More specifically, we devise an iterative algorithm that learns the redundancy among the patterns. The resulting network has a retrieval capacity that is exponential in the size of the network. Lastly, by considering the local structures of the network, the asymptotic error correction performance can be made linear in the size of the network.
RIS
TY  - CPAPER
TI  - Iterative Learning and Denoising in Convolutional Neural Associative Memories
AU  - Amin Karbasi
AU  - Amir Hesam Salavati
AU  - Amin Shokrollahi
BT  - Proceedings of the 30th International Conference on Machine Learning
DA  - 2013/02/13
ED  - Sanjoy Dasgupta
ED  - David McAllester
ID  - pmlr-v28-karbasi13
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 28
IS  - 1
SP  - 445
EP  - 453
L1  - http://proceedings.mlr.press/v28/karbasi13.pdf
UR  - https://proceedings.mlr.press/v28/karbasi13.html
AB  - The task of a neural associative memory is to retrieve a set of previously memorized patterns from their noisy versions by using a network of neurons. Hence, an ideal network should be able to 1) gradually learn a set of patterns, 2) retrieve the correct pattern from noisy queries and 3) maximize the number of memorized patterns while maintaining the reliability in responding to queries. We show that by considering the inherent redundancy in the memorized patterns, one can obtain all the mentioned properties at once. This is in sharp contrast with the previous work that could only improve one or two aspects at the expense of the third. More specifically, we devise an iterative algorithm that learns the redundancy among the patterns. The resulting network has a retrieval capacity that is exponential in the size of the network. Lastly, by considering the local structures of the network, the asymptotic error correction performance can be made linear in the size of the network.
ER  -
APA
Karbasi, A., Salavati, A. H., & Shokrollahi, A. (2013). Iterative Learning and Denoising in Convolutional Neural Associative Memories. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(1):445-453. Available from https://proceedings.mlr.press/v28/karbasi13.html.