We study the learning of an external signal by a neural network and the time to forget it when this network is subjected to noise. The presentation of an external stimulus to the recurrent network of binary neurons may change the state of the synapses. Multiple presentations of a single signal lead to its learning. Then, during the forgetting time, the presentation of other signals (noise) may also modify the synaptic weights. We construct an estimator of the initial signal from the synaptic currents and thereby define a probability of error. In our model, these synaptic currents evolve as Markov chains. We study the dynamics of these Markov chains and obtain a lower bound on the number of external stimuli that the network can rec...
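As a rough illustration of the kind of model this abstract describes (not the paper's actual construction), one can simulate a per-neuron synaptic current as an integer-valued Markov chain driven first by repeated signal presentations and then by noisy ones, and estimate the initial signal by the sign of the current. All parameter values below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50          # number of neurons (illustrative size)
T_learn = 20    # presentations of the signal (learning phase)
T_noise = 200   # subsequent noisy presentations (forgetting phase)

signal = rng.choice([-1, 1], size=N)

# Synaptic "current" per neuron, modelled as a Markov chain:
# each signal presentation nudges it toward the signal's sign,
# each noise stimulus moves it by an independent +/-1 step.
current = np.zeros(N, dtype=int)
for _ in range(T_learn):
    current += signal
for _ in range(T_noise):
    current += rng.choice([-1, 1], size=N)

# Estimator of the initial signal: the sign of the current.
estimate = np.where(current >= 0, 1, -1)
p_error = np.mean(estimate != signal)
print(f"per-neuron error probability: {p_error:.3f}")
```

The drift accumulated during learning (here 20 steps) competes with the diffusion of the noise phase (standard deviation of order the square root of 200), which is what makes the error probability small but nonzero.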
Abstract. Encoding, storing, and recalling a temporal sequence of stimuli in a neuronal network can...
How does reliable computation emerge from networks of noisy neurons? While individual neurons are in...
Understanding the theoretical foundations of how memories are encoded and retrieved in neural popula...
In standard attractor neural network models, specific patterns of activity are stored in the synapti...
We discuss the long term maintenance of acquired memory in synaptic connections of a perpetually lea...
We study unsupervised Hebbian learning in a recurrent network in which synapses have a finite number...
Attractor networks are an influential theory for memory storage in brain systems. This theory has re...
We study numerically the memory that forgets, introduced by Parisi in 1986 by bounding the synaptic ...
We present a model of long-term memory: learning within irreversible bounds. The best bound values ...
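The bounded-synapse "memory that forgets" mentioned in the two abstracts above can be sketched as a Hopfield-style network whose Hebbian increments are clipped to a fixed range, so that recent patterns overwrite older ones (a palimpsest). This is only a minimal illustration: the sizes, the learning increment, and the use of hard clipping in place of irreversible bounds are all assumptions of the sketch, not the original models:

```python
import numpy as np

rng = np.random.default_rng(1)

N = 200      # neurons (illustrative size)
P = 40       # patterns presented sequentially
eps = 0.3    # Hebbian learning increment (illustrative)
bound = 1.0  # synaptic weights are clipped to [-bound, bound]

patterns = rng.choice([-1, 1], size=(P, N))

# Sequential Hebbian learning with hard bounds on every synapse:
# clipping erases the trace of old patterns as new ones arrive.
W = np.zeros((N, N))
for xi in patterns:
    W += eps * np.outer(xi, xi)
    np.clip(W, -bound, bound, out=W)
np.fill_diagonal(W, 0.0)

def overlap(xi):
    """One synchronous update starting from the pattern itself;
    overlap 1.0 means perfect one-step recall, 0.5 is chance."""
    s = np.sign(W @ xi)
    s[s == 0] = 1
    return np.mean(s == xi)

print("oldest pattern overlap:", overlap(patterns[0]))
print("newest pattern overlap:", overlap(patterns[-1]))
```

Running this shows the palimpsest effect: the most recently presented pattern is recalled almost perfectly, while the oldest has been largely overwritten.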
Recurrent neural networks have been shown to be able to store memory patterns as fixed point attract...
We show that a message-passing process allows us to store in binary "material" synapses a number o...
Attractor neural networks (ANNs) are one of the leading theoretical frameworks for the formation and...
Gated working memory is defined as the capacity of holding arbitrary informati...
Figure caption: All networks are of size N = 1000. A: Recall as a function of the load for different...