Recurrent neural networks can store memory patterns as fixed-point attractors of the network dynamics. The prototypical learning rule for storing memories in attractor neural networks is Hebbian learning, which can store up to 0.138N uncorrelated patterns in a recurrent network of N neurons. This is far below the maximal capacity of 2N, which can be achieved by supervised rules, e.g. the perceptron learning rule. However, such rules are problematic for neurons in the neocortex or the hippocampus, since they rely on the computation of a supervisory error signal for each neuron of the network. We show here that the total synaptic input received by a neuron during the presentation of a sufficiently strong ...
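The capacity figures in this abstract can be illustrated with a minimal Hopfield-style sketch: Hebbian (outer-product) storage of a few random ±1 patterns in a network of N = 100 neurons, well below the 0.138N limit, followed by retrieval from a corrupted cue. The network size, pattern count, and corruption level below are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5  # 5 patterns << 0.138 * 100, so retrieval should succeed

# P random uncorrelated binary (+/-1) patterns
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian (outer-product) learning rule; no self-connections
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def retrieve(state, steps=20):
    """Synchronous sign updates until a fixed point (or step limit)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt 10% of the first pattern's bits, then let the dynamics clean it up
cue = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1
recovered = retrieve(cue)
overlap = (recovered * patterns[0]).sum() / N
print(f"overlap with stored pattern: {overlap:.2f}")
```

At this low load the stored pattern acts as a fixed-point attractor, so the corrupted cue is pulled back toward it; near the 0.138N limit, crosstalk between patterns degrades retrieval.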
A fundamental problem in neuroscience is understanding how working memory—the ability to store infor...
Attractor networks are an influential theory for memory storage in brain systems. This theory has re...
We study the storage and retrieval of phase-coded patterns as stable ...
Understanding the theoretical foundations of how memories are encoded and retrieved in neural popula...
In standard attractor neural network models, specific patterns of activity are stored in the synapti...
The CA3 region of the hippocampus is a recurrent neural network that is essential for the storage an...
For the last twenty years, several assumptions have been expressed in the fields of information proc...
Threshold-linear (graded response) units approximate the real firing behaviour of pyramidal neurons ...
The neural net computer simulations which will be presented here are based on the acceptance of a se...
We study the storage of multiple phase-coded patterns as stable dynamical attractors in recurrent ne...