A general mean-field theory is presented for an attractor neural network in which each elementary unit is described by one input and one output real variable, and whose synaptic strengths are determined by a covariance imprinting rule. In the case of threshold-linear units, a single equation is shown to yield the storage capacity for the retrieval of random activity patterns drawn from any given probability distribution. If this distribution produces binary patterns, the storage capacity is essentially the same as for networks of binary units. To explore the effects of storing more structured patterns, the case of a ternary distribution is studied. It is shown that the number of patterns that can be stored can be much higher than in the bin...
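The network the abstract describes can be sketched in a few lines of NumPy: threshold-linear units whose couplings come from a covariance imprinting rule, cued with a degraded stored pattern. Everything below (the parameter values, the binary pattern distribution, the global rescaling used as a crude stand-in for inhibition) is an illustrative assumption, not the paper's actual model or analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200   # number of units (illustrative)
P = 5     # number of stored patterns (well below capacity)
a = 0.2   # mean activity (sparseness) of the pattern distribution

# Random binary activity patterns with sparseness a -- one simple instance
# of "random activity patterns drawn from a given probability distribution"
patterns = (rng.random((P, N)) < a).astype(float)

# Covariance imprinting rule: couplings are sums over patterns of
# activity fluctuations about the mean activity, with no self-coupling
J = (patterns - a).T @ (patterns - a) / N
np.fill_diagonal(J, 0.0)

def threshold_linear(h, thr=0.0, gain=1.0):
    """Graded (threshold-linear) transfer function: zero below the
    threshold, linear above it."""
    return gain * np.maximum(h - thr, 0.0)

def retrieve(cue, steps=20):
    """Iterate deterministic rate dynamics from a cue; the global
    rescaling is an assumed stand-in for inhibition that keeps the
    mean activity near a."""
    v = cue.copy()
    for _ in range(steps):
        v = threshold_linear(J @ v)
        s = v.sum()
        if s > 0:
            v *= (N * a) / s
    return v

# Cue the network with a degraded copy of pattern 0 (20% of units silenced)
cue = patterns[0] * (rng.random(N) < 0.8)
v = retrieve(cue)

# Overlap of the final state with each stored pattern; at this low
# loading the cued pattern should dominate
overlaps = (patterns - a) @ v / N
```

At low loading the overlap with the cued pattern is the largest of the `P` overlaps, which is the pattern-completion behaviour the mean-field theory characterizes at the level of storage capacity.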
Most analytical results concerning the long-time behaviour of associative memory networks have been ...
A fundamental problem in neuroscience is understanding how working memory—the ability to store infor...
The link between the structure of a neural network and its attractor states is investigated, with a ...
Threshold-linear (graded response) units approximate the real firing behaviour of pyramidal neurons ...
Memory is a fundamental part of computational systems like the human brain. Theoretical models ident...
Real neuronal networks in the brain are composed of neurons with graded, not binary, firing rates....
Attractor neural networks such as the Hopfield model can be used to model associative memory. An eff...
A number of neural network models, in which fixed-point and limit-cycle attractors of the underlying...
The persistent and graded activity often observed in cortical circuits is sometimes seen as a signat...
This paper presents an Attractor Neural Network (ANN) model of Recall and Recognition. It is shown ...