Associative memory architectures are designed for memorization but, through their retrieval method, also offer a form of generalization to unseen inputs: stored memories can be seen as prototypes from this point of view. Focusing on Modern Hopfield Networks (MHN), we show that a large memorization capacity undermines the opportunity for generalization, and we offer a solution for better optimizing this tradeoff. It relies on Minimum Description Length (MDL) to determine, during training, which memories to store and how many of them. (Comment: 4 pages, Associative Memory & Hopfield Networks Workshop at NeurIPS202)
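As a minimal illustration of the retrieval rule the abstract above refers to (not the paper's MDL-based training procedure), here is a sketch of one-step Modern Hopfield retrieval, assuming the standard softmax update over stored patterns; the function name and example patterns are hypothetical.

```python
import math

def mhn_retrieve(patterns, query, beta=1.0):
    """One-step Modern Hopfield retrieval: return a softmax-weighted
    convex combination of the stored patterns, weighted by their
    (inverse-temperature-scaled) similarity to the query."""
    # dot-product similarity of the query to each stored pattern
    scores = [beta * sum(p_d * q_d for p_d, q_d in zip(p, query))
              for p in patterns]
    # numerically stable softmax over the similarity scores
    m = max(scores)
    weights = [math.exp(s - m) for s in scores]
    z = sum(weights)
    weights = [w / z for w in weights]
    # retrieved state: mixture of stored patterns; with large beta it
    # snaps to the single closest memory (memorization), with small
    # beta it blends memories into a prototype (generalization)
    dim = len(query)
    return [sum(weights[k] * patterns[k][d] for k in range(len(patterns)))
            for d in range(dim)]

# a corrupted version of the first pattern is pulled back toward it
stored = [[1.0, 1.0, -1.0, -1.0], [-1.0, 1.0, 1.0, -1.0]]
noisy = [1.0, 0.5, -1.0, -0.5]
out = mhn_retrieve(stored, noisy, beta=4.0)
```

The `beta` parameter makes the memorization/generalization tradeoff concrete: sharper softmax means higher effective storage capacity but less prototype-like blending.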
We apply genetic algorithms to fully connected Hopfield associative memory networks. Previously, w...
Determining the memory capacity of two layer neural networks with $m$ hidden neurons and input dimen...
A large number of neural network models of associative memory have been proposed in the literature. ...
We present a Hopfield-like autoassociative network for memories representing examples of concepts. E...
A large number of neural network models of associative memory have been proposed in the literature. ...
The Little-Hopfield network is an auto-associative computational model of neural memory storage and ...
The Hopfield model is a pioneering neural network model with associative memory retrieval. The analy...
The information capacity of general forms of memory is formalized. The number of bits of information...
Sequence memory is an essential attribute of natural and artificial intelligence that enables agents...
The original publication is available at www.springerlink.com (Copyright Springer). The performance ch...
We propose a genetic algorithm for mutually connected neural networks to obtain a higher capacity of...
Hopfield neural networks are a possible basis for modelling associative memory in living organisms. ...
We give a review on the rigorous results concerning the storage capacity of the Hopfield model. We d...
This paper presents a further theoretical analysis on the asymptotic memory capacity of the generali...
The consequences of imposing a sign constraint on the standard Hopfield architecture associative mem...