The optimal storage properties of three different neural network models are studied. For two of these models the architecture of the network is a perceptron with ±J interactions, whereas for the third model the output can be an arbitrary function of the inputs. Analytic bounds and numerical estimates of the optimal capacities and of the minimal fraction of errors are obtained for the first two models. The third model can be solved exactly, and the exact solution is compared to the bounds and to the results of the numerical simulations used for the other two models.
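As a rough illustration of the kind of numerical estimate mentioned above (not the paper's actual procedure), the sketch below estimates the storage capacity of a perceptron with binary ±J couplings by brute force: for small N it enumerates all 2^N coupling vectors and checks whether some vector correctly classifies p random unbiased patterns, then records how the fraction of solvable instances falls as the load α = p/N grows. The function names and parameters are illustrative assumptions, and the exhaustive search is only feasible for small N.

```python
# Minimal sketch: numerical estimate of the storage capacity of a +/-J perceptron
# by exhaustive enumeration over coupling vectors (assumption: small N only).
import itertools
import numpy as np

def is_storable(patterns, targets):
    """True if some J in {-1,+1}^N satisfies sign(J . xi^mu) = sigma^mu for all mu."""
    n = patterns.shape[1]
    for signs in itertools.product((-1, 1), repeat=n):
        J = np.array(signs)
        if np.all(np.sign(patterns @ J) == targets):
            return True
    return False

def solvable_fraction(n, p, trials=200, seed=0):
    """Monte Carlo estimate of the probability that p random patterns are storable."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(trials):
        xi = rng.choice((-1, 1), size=(p, n))   # random unbiased +/-1 patterns
        sigma = rng.choice((-1, 1), size=p)     # random +/-1 desired outputs
        hits += is_storable(xi, sigma)
    return hits / trials

if __name__ == "__main__":
    n = 9  # odd N, so the local field J . xi is never exactly zero
    for p in range(2, 16):
        print(f"alpha = {p/n:.2f}  solvable fraction ~ {solvable_fraction(n, p):.2f}")
```

Plotting the solvable fraction against α and locating where it drops from near 1 to near 0 gives a finite-size estimate of the optimal capacity, to be compared with the analytic bounds discussed in the paper.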