We study the learning of an external signal by a neural network and the time needed to forget it when the network is subsequently exposed to other signals, treated as noise. The presentation of an external stimulus changes the state of the synapses in a network of binary neurons. Repeated presentations of a single signal lead to its learning. Afterwards, the presentation of other signals also changes the synaptic weights (this is the forgetting period). We study the number of external signals to which the network can be exposed before the initial signal is considered forgotten. We construct an estimator of the initial signal from the synaptic currents. In our model, these synaptic currents evolve as Markov chains; we study these Markov chains mathematically...
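The mechanism sketched in the abstract (synaptic currents nudged by each presented pattern, then read out to recover the learned signal) can be illustrated with a minimal simulation. The update rule, the hard bound, the network size, and the overlap threshold below are illustrative assumptions, not the paper's definitions; the sketch only shows that each bounded synaptic current driven by the presented signals is a finite-state Markov chain, and that the initial signal can be estimated from the signs of the currents until enough noise presentations have accumulated.

```python
# Minimal sketch, under assumed parameters: binary neurons, integer synaptic
# currents bounded by +/-BOUND, Hebbian-like updates, sign-based estimator.
import numpy as np

rng = np.random.default_rng(0)

N = 200      # number of binary neurons (assumed)
BOUND = 5    # hard bound on synaptic currents (assumed)

def present(currents, pattern):
    """Each current moves one step up or down depending on whether the
    corresponding pair of neurons agrees, clipped at +/-BOUND: a bounded
    random walk, i.e. a finite-state Markov chain."""
    hebb = np.outer(pattern, pattern)          # +1 if the neurons agree, -1 otherwise
    return np.clip(currents + hebb, -BOUND, BOUND)

def estimate(currents, reference_neuron=0):
    """Estimate the stored pattern from the signs of the currents attached to
    one reference neuron; the global sign ambiguity is fixed by setting the
    reference neuron to +1."""
    est = np.sign(currents[reference_neuron]).astype(int)
    est[est == 0] = 1
    est[reference_neuron] = 1
    return est

def overlap(a, b):
    """Normalized overlap, insensitive to a global sign flip."""
    return abs(np.dot(a, b)) / len(a)

# Learn the initial signal by repeated presentations.
signal = rng.choice([-1, 1], size=N)
currents = np.zeros((N, N), dtype=int)
for _ in range(10):
    currents = present(currents, signal)

# Present other (noise) signals until the initial one is "forgotten", i.e. the
# estimator's overlap with it falls below an assumed threshold.
t, threshold = 0, 0.6
while overlap(estimate(currents), signal) > threshold and t < 10_000:
    currents = present(currents, rng.choice([-1, 1], size=N))
    t += 1
print(f"forgetting time (number of noise presentations): {t}")
```

Under these assumptions, the value printed at the end plays the role of the forgetting time studied in the abstract: the number of noise presentations after which the sign-based estimator no longer recovers the learned signal above the chosen overlap threshold.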