We study, by numerical simulation, the possible limit behaviors of synchronous discrete-time deterministic recurrent neural networks composed of N binary neurons, as a function of the network's level of dilution and asymmetry. The network dilution measures the fraction of neuron pairs that are connected, and the network asymmetry measures to what extent the underlying connectivity matrix is asymmetric. For each given neural network, we study the dynamical evolution of all the different initial conditions, thus characterizing the full dynamical landscape without imposing any learning rule. Because the dynamics is deterministic, each trajectory converges to an attractor, which can be either a fixed point or a limit cycle. These attractors form ...
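The setup described above can be illustrated with a minimal sketch: a diluted random coupling matrix, a synchronous sign-rule update, and cycle detection to classify the attractor reached from a given initial condition. The network size, dilution value, and function names below are illustrative assumptions, not the authors' actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 8            # number of binary neurons (illustrative size)
dilution = 0.5   # assumed fraction of neuron pairs that stay connected
# Random couplings; entries are zeroed independently to dilute the network.
J = rng.standard_normal((N, N))
J[rng.random((N, N)) > dilution] = 0.0
np.fill_diagonal(J, 0.0)  # no self-coupling

def step(s, J):
    """Synchronous deterministic update: s_i(t+1) = sign(sum_j J_ij s_j(t))."""
    h = J @ s
    return np.where(h >= 0, 1, -1)

def find_attractor(s0, J, max_steps=100_000):
    """Iterate from s0 until a state repeats; the finite state space (2^N
    configurations) and deterministic dynamics guarantee this happens.
    Returns (transient_length, cycle_length); cycle_length == 1 is a fixed point."""
    seen = {}
    s = s0.copy()
    for t in range(max_steps):
        key = tuple(s)
        if key in seen:
            return seen[key], t - seen[key]
        seen[key] = t
        s = step(s, J)
    raise RuntimeError("no attractor found within max_steps")

s0 = rng.choice([-1, 1], size=N)
transient, period = find_attractor(s0, J)
```

Enumerating all 2^N initial conditions with `find_attractor` would characterize the full dynamical landscape for one realization of J, in the spirit of the study above.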
Recurrent neural networks (RNN) are powerful tools to explain how attractors may emerge from noisy, ...
A neural network model in which individual memories are stored in limit cycles is studied analytical...
This paper deals with finite size recurrent neural networks which consist of general (possibly with ...
The comprehension of the mechanisms at the basis of the functioning of complexly interconnected netw...
Previous explanations of computations performed by recurrent networks have focused on symmetrically ...
The study of neural networks by physicists started as an extension of the theory of spin glasses. Fo...
It is possible to construct diluted asymmetric models of neural networks for w...
The Monte Carlo adaptation rule has been proposed to design asymmetric neural networks. By adjusting ...
We present a mathematical analysis of the effects of Hebbian learning in random recurrent neu...
Recurrent neural networks are complex non-linear systems, capable of ongoing activity in the absence...
We present exact analytical equilibrium solutions for a class of recurrent neural network models, wi...
Deterministic behavior can be modeled conveniently in the framework of finite automata. We ...
We study the stability of the dynamics of a network of n formal neurons interacting through an asymm...
Symmetrically connected recurrent networks have recently been used as models of a host of neural com...