We propose Hebb-like learning rules to store a static pattern as a dynamical attractor in a neural network with chaotic dynamics. We show that this kind of rule reduces the attractor dimension during learning, until a fixed point is reached. We choose to stop learning while the system is still on a strange attractor or a limit cycle. We associate this attractor with the learned pattern: after learning, presenting the pattern to the chaotic network leads to a reduction of the dynamics onto this attractor. With a very simple network, this reproduces the results observed by Freeman in the olfactory bulb of the rabbit. Introduction. We are concerned with the coding of patterns by dynamical attractors (limit cycles, strange attractors). Instead o...
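The first abstract describes Hebbian learning collapsing the chaotic dynamics of a random recurrent network onto a lower-dimensional attractor while a static pattern is presented. A minimal sketch of the idea, assuming a discrete-time tanh rate network and a plain Hebb rule; the network size, gain, learning rate, and the trajectory-dispersion proxy for attractor size are all illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50                   # network size (illustrative)
g = 3.0                  # gain large enough for an irregular regime
W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))   # random recurrent weights
pattern = rng.choice([-1.0, 1.0], size=N)          # static input pattern

def run(W, steps=500, inp=None):
    """Iterate the discrete-time rate dynamics x <- tanh(W x + I)."""
    x = rng.normal(0.0, 0.1, size=N)
    traj = []
    for _ in range(steps):
        drive = W @ x + (inp if inp is not None else 0.0)
        x = np.tanh(drive)
        traj.append(x.copy())
    return np.array(traj)

def dispersion(traj):
    """Mean per-unit std of the late trajectory: a crude proxy for attractor size."""
    tail = traj[len(traj) // 2:]
    return tail.std(axis=0).mean()

before = dispersion(run(W, inp=pattern))

# Hebbian updates applied while the pattern drives the network
eps = 0.01
for _ in range(200):
    traj = run(W, steps=100, inp=pattern)
    x = traj[-1]
    W += eps * np.outer(x, x) / N   # Hebb rule: dW_ij proportional to x_i x_j

after = dispersion(run(W, inp=pattern))
print(before, after)
```

Under this kind of rule, the accumulated rank-one Hebbian component tends to stabilize the pattern-driven state, so the late-trajectory dispersion typically shrinks as learning proceeds.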
In neural networks, two specific dynamical behaviours are well known: 1) Networks naturally find pat...
Abstract. In contrast with Hopfield-like networks, random recurrent neural networks (RRNN), wher...
An algorithm is introduced that trains a neural network to identify chaotic dynamics from a single m...
The neural net computer simulations which will be presented here are based on the acceptance of a se...
The aim of the present paper is to study the effects of Hebbian learning in random recurrent neural ...
In many complex systems, elementary units live in a chaotic environment and need to adapt their stra...
Poster presentation A central problem in neuroscience is to bridge local synaptic plasticity and the...
For the last twenty years, several assumptions have been expressed in the fields of information proc...
We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural net...
We propose an autonomous dynamical pattern recognition and learning system. It is demonstrated that...
We evolve small continuous-time recurrent neural networks with fixed weights that perform Hebbian le...
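The last abstract refers to small continuous-time recurrent neural networks (CTRNNs). As a point of reference, here is a minimal Euler integration of the standard CTRNN equation tau_i dy_i/dt = -y_i + sum_j W_ij sigma(y_j + theta_j) + I_i; the 3-neuron circuit and all parameter values are illustrative, not the evolved networks from that paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ctrnn_step(y, W, theta, tau, I, dt=0.01):
    """One Euler step of tau * dy/dt = -y + W @ sigmoid(y + theta) + I."""
    return y + dt * (-y + W @ sigmoid(y + theta) + I) / tau

rng = np.random.default_rng(1)
n = 3                                  # illustrative 3-neuron circuit
W = rng.normal(0.0, 1.0, size=(n, n))  # fixed recurrent weights
theta = np.zeros(n)                    # biases
tau = np.ones(n)                       # time constants
y = np.zeros(n)                        # neuron states
for _ in range(1000):
    y = ctrnn_step(y, W, theta, tau, I=np.zeros(n))
```

In the evolutionary setting the abstract describes, W, theta, and tau would be the parameters under selection, with learning-like behaviour emerging from the fixed-weight dynamics themselves.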