The authors investigate the optimal storage capacity of attractor neural networks whose weights have signs prescribed a priori. The storage capacity is calculated by considering the fractional volume of weights that can store a set of random patterns as attractors, for a given stability parameter. It is found that this volume is independent of the particular distribution of signs (gauge invariance), and that the storage capacity of such constrained networks is exactly one half that of the unconstrained network at the corresponding value of the stability parameter.
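The fractional-volume calculation summarized above follows the Gardner framework; the sketch below is an assumption about the paper's conventions (stability parameter \kappa, Gaussian measure, spherical normalization of the couplings), not a quotation from it.

```latex
% Fractional volume of coupling space storing p = \alpha N random patterns
% \xi^\mu with stability at least \kappa (unconstrained case):
V = \int \prod_{j} dJ_j \,
    \delta\!\Big(\textstyle\sum_j J_j^2 - N\Big)
    \prod_{\mu=1}^{p}
    \Theta\!\Big(\frac{\xi^\mu}{\sqrt{N}} \textstyle\sum_j J_j \xi_j^\mu - \kappa\Big).
% The vanishing of the typical volume defines the critical capacity
\alpha_c(\kappa)
  = \Big[ \int_{-\kappa}^{\infty} \frac{dt}{\sqrt{2\pi}}\,
          e^{-t^2/2}\,(t+\kappa)^2 \Big]^{-1},
% so that \alpha_c(0) = 2.  The sign-constrained result quoted above reads
\alpha_c^{\pm}(\kappa) = \tfrac{1}{2}\,\alpha_c(\kappa),
\qquad \alpha_c^{\pm}(0) = 1.
```

The halving of the capacity reflects the fact that fixing each sign restricts every coupling to a half-line, cutting the accessible volume of the spherical coupling space by a factor that is independent of which particular sign pattern is imposed (the gauge invariance noted above).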
The study of neural networks by physicists started as an extension of the theory of spin glasses. Fo...
The authors study neural network models in which the synaptic efficacies are restricted to have a pr...
International audienceThe optimal storage properties of three different neural network models are st...
In standard attractor neural network models, specific patterns of activity are stored in the synapti...
The authors consider the retrieval properties of attractor neural networks whose synaptic matrices h...
By adapting an attractor neural network to an appropriate training overlap, the authors optimize its...
Recurrent neural networks have been shown to be able to store memory patterns as fixed point attract...
We display a synaptic matrix that can efficiently store, in attractor neural networks (ANN) and perc...
A general mean-field theory is presented for an attractor neural network in which each elementary un...
We study the storage of multiple phase-coded patterns as stable dynamical attractors in recurrent ne...
Biological neural networks do not allow the synapses to choose their own sign: excitatory or inhibit...
Neurophysiological experiments show that the strength of synaptic connections can undergo substantia...
We study the storage of phase-coded patterns as stable dynamical attra...
We performed a systematic study of the sizes of the basins of attraction in a Hebbian-type neural ne...