Motivation

How can we relate the characteristics of associative memory to the functioning of neural networks?

References

http://arxiv.org/pdf/1606.01164v1.pdf Dense Associative Memory for Pattern Recognition

We propose a simple duality between this dense associative memory and neural networks commonly used in deep learning. The proposed duality makes it possible to apply energy-based intuition from associative memory to analyze computational properties of neural networks with unusual activation functions: the higher rectified polynomials, which until now have not been used for training neural networks.
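
A minimal sketch of the memory model behind this duality, assuming the binary (±1) units, stored patterns ξ, and rectified polynomial F(x) = max(x, 0)^n described in the paper; the function and variable names here are illustrative, not the authors' code:

```python
import numpy as np

def rect_poly(x, n=3):
    # Rectified polynomial F(x) = x^n for x > 0, else 0 (degree-n interactions).
    return np.maximum(x, 0.0) ** n

def dam_recall(state, patterns, n=3, sweeps=5):
    # Asynchronously update each binary unit to lower the energy
    # E = -sum_mu F(xi^mu . sigma); 'state' and 'patterns' hold +/-1 entries.
    sigma = state.astype(float).copy()
    for _ in range(sweeps):
        for i in range(sigma.size):
            partial = patterns @ sigma - patterns[:, i] * sigma[i]  # overlaps without unit i
            # Energy gap between setting sigma_i = +1 and sigma_i = -1.
            gap = (rect_poly(partial + patterns[:, i], n)
                   - rect_poly(partial - patterns[:, i], n)).sum()
            sigma[i] = 1.0 if gap >= 0 else -1.0
    return sigma

# Toy usage: store random patterns, then recall one from a corrupted cue.
rng = np.random.default_rng(0)
patterns = rng.choice([-1.0, 1.0], size=(20, 100))  # 20 patterns, 100 units
cue = patterns[0].copy()
cue[:30] *= -1                                      # flip 30 of the 100 bits
recalled = dam_recall(cue, patterns)
print(np.mean(recalled == patterns[0]))             # close to 1.0 on successful recall
```

With n = 1 this reduces to the classical Hopfield update; higher n sharpens the energy landscape around each stored pattern, which is what lets the dense model store many more patterns.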

http://arxiv.org/pdf/1510.04935v2.pdf Holographic Embeddings of Knowledge Graphs

The proposed method is related to holographic models of associative memory in that it employs circular correlation to create compositional representations. By using correlation as the compositional operator, HOLE can capture rich interactions but simultaneously remains efficient to compute, easy to train, and scalable to very large datasets.
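
The circular-correlation composition is cheap because of the Fourier identity a ⋆ b = ifft(conj(fft(a)) ⊙ fft(b)), which is how the paper proposes computing it. A minimal sketch of the HolE triple score under that identity; names are illustrative, and the full model learns the embeddings and passes the score through a sigmoid:

```python
import numpy as np

def circular_correlation(a, b):
    # [a * b]_k = sum_i a_i * b_((i + k) mod d), computed in O(d log d)
    # via the identity corr(a, b) = ifft(conj(fft(a)) * fft(b)).
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

def hole_score(subj, rel, obj):
    # HolE plausibility of a (subject, relation, object) triple:
    # the relation embedding dotted with the composed entity pair.
    return rel @ circular_correlation(subj, obj)

# Toy usage with random d-dimensional embeddings.
rng = np.random.default_rng(0)
d = 64
e_s, r, e_o = (rng.standard_normal(d) for _ in range(3))
print(hole_score(e_s, r, e_o))
```

Unlike the tensor product used in earlier holographic and bilinear models, correlation keeps the composed representation at d dimensions, which is the source of the efficiency claim.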

https://arxiv.org/abs/1610.06258 Using Fast Weights to Attend to the Recent Past

https://arxiv.org/abs/1701.00939v1 Dense Associative Memory is Robust to Adversarial Inputs

https://arxiv.org/pdf/1709.06493v1.pdf Learning to update Auto-associative Memory in Recurrent Neural Networks for Improving Sequence Memorization

https://arxiv.org/pdf/1709.07116v1.pdf Variational Memory Addressing in Generative Models

https://arxiv.org/abs/1610.08613v2 Can Active Memory Replace Attention?

We propose an extended model of active memory that matches existing attention models on neural machine translation and generalizes better to longer sentences. We investigate this model and explain why previous active memory models did not succeed. Finally, we discuss when active memory brings the most benefit and where attention can be a better choice.