Networks of threshold automata are random dynamical systems with a large number of attractors, which J. Hopfield proposed to use as associative memories. We establish the scaling laws relating the maximum number of "useful" attractors and the radius of the attraction basin to the number of automata. A by-product of our analysis is a better choice of thresholds, which doubles the performance in terms of the maximum number of "useful" attractors.
First Asia-Pacific Conference on Simulated Evolution and Learning. We apply genetic algorithms to full...
Recently, Hopfield and Krotov introduced the concept of dense associative memories [DAM] (close to s...
The addition of noise to the deterministic Hopfield network, trained with one shot Hebbian learning,...
We study generalizations of the Hopfield model for associative memory which contain interactions of ...
The original publication is available at www.springerlink.com. Copyright Springer. The performance ch...
We solve the mean field equations for a stochastic Hopfield network with temperature (noise) in the...
We apply evolutionary computations to Hopfield's neural network model of associative memory. We repo...
Attractor neural networks such as the Hopfield model can be used to model associative memory. An eff...
The consequences of imposing a sign constraint on the standard Hopfield architecture associative mem...
We review some recent rigorous results in the theory of neural networks, and in particular on ...
We consider the multitasking associative network in the low-storage limit and we study its phase dia...
The storage capacity of a Q-state Hopfield network is determined via finite size scaling for paralle...
Three variants of the Hopfield network are examined, each of which is trained using a different iter...
We performed a systematic study of the sizes of the basins of attraction in a Hebbian-type neural ne...
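Several of the abstracts above concern the same basic construction: a binary Hopfield network stored with one-shot Hebbian learning and recalled by threshold (sign) updates. As a minimal sketch, not the specific model of any one paper listed here (the function names and parameter values are our own illustrative choices), the train/recall cycle looks like:

```python
import numpy as np

def train_hebbian(patterns):
    # One-shot Hebbian rule: W = (1/N) * sum_mu xi^mu (xi^mu)^T, with zero diagonal.
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, state, rng, max_sweeps=50):
    # Deterministic asynchronous updates (threshold at zero) until a fixed point.
    s = state.copy()
    for _ in range(max_sweeps):
        prev = s.copy()
        for i in rng.permutation(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
        if np.array_equal(s, prev):
            break
    return s

rng = np.random.default_rng(0)
n_units, n_patterns = 100, 5  # load well below the ~0.138*N Hebbian capacity
patterns = rng.choice([-1, 1], size=(n_patterns, n_units))
w = train_hebbian(patterns)

# Probe with a stored pattern corrupted in 10% of its bits.
probe = patterns[0].copy()
probe[rng.choice(n_units, size=10, replace=False)] *= -1
recovered = recall(w, probe, rng)
print("overlap with stored pattern:", float(np.mean(recovered == patterns[0])))
```

At this low loading the corrupted probe typically falls inside the basin of attraction of the stored pattern, which is exactly the regime the basin-size and capacity studies above quantify.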