Recurrent neural networks (RNNs) are powerful tools to explain how attractors may emerge from noisy, high-dimensional dynamics. We study here how to learn the ~N^2 pairwise interactions in an RNN with N neurons to embed L manifolds of dimension D << N. We show that the capacity, i.e. the maximal ratio L/N, decreases as |log ε|^(-D), where ε is the error on the position encoded by the neural activity along each manifold. Hence, RNNs are flexible memory devices capable of storing a large number of manifolds at high spatial resolution. Our results rely on a combination of analytical tools from statistical mechanics and random matrix theory, extending Gardner's classical theory of learning to the case of patterns with strong spatial correlations.
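To make the trade-off explicit, the scaling stated in this abstract can be written as a single relation (the prefactor C_D is a placeholder introduced here for illustration; it is not given in the excerpt):

```latex
% Capacity-resolution trade-off, restated from the abstract above.
% alpha_c: maximal load L/N; epsilon: positional error along each manifold;
% D: manifold dimension; C_D: unspecified D-dependent prefactor (assumption).
\[
  \alpha_c \;=\; \frac{L_{\max}}{N} \;\sim\; C_D\,|\log \epsilon|^{-D},
  \qquad D \ll N .
\]
```

Because the error enters only through |log ε|, capacity degrades only logarithmically as the spatial resolution is refined, which is why such networks can store many manifolds at high resolution.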
We present exact analytical equilibrium solutions for a class of recurrent neural network models, wi...
We study with numerical simulation the possible limit behaviors of synchronous discrete-time determi...
This thesis examines so-called folding neural networks as a mechanism for machine learning. Foldi...
Recurrent neural networks have been shown to be able to store memory patterns as fixed point attract...
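As an illustration of the fixed-point storage mentioned above, here is a minimal Hopfield-style sketch in Python (Hebbian couplings, asynchronous sign updates). It is a generic textbook construction for storing patterns as fixed-point attractors, not the specific model of the cited work; the sizes and noise level are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                         # neurons, stored patterns
xi = rng.choice([-1, 1], size=(P, N))  # random binary patterns

# Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-coupling
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def recall(s, sweeps=20):
    """Asynchronous sign updates; returns the state after `sweeps` passes."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Corrupt ~10% of one stored pattern, then let the dynamics settle back
noisy = xi[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
print("overlap after recall:", recall(noisy) @ xi[0] / N)  # close to 1.0
```

With a load P/N = 0.05, well below the classical Hopfield capacity of roughly 0.14, the corrupted pattern is pulled back to (or very near) the stored fixed point.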
We study the storage of phase-coded patterns as stable dynamical attra...
In standard attractor neural network models, specific patterns of activity are stored in the synapti...
Many cognitive processes involve transformations of distributed representations in neural population...
How sensory information is encoded and processed by neuronal circuits is a central question in compu...
The static and dynamical properties of neural networks having many-neuron interactions are studied a...
We study the storage of multiple phase-coded patterns as stable dynamical attractors in recurrent ne...
Understanding the mechanisms underlying the functioning of complexly interconnected netw...
The typical fraction of the space of interactions between each pair of N Ising spins which...
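For reference, the classical result this excerpt describes (Gardner's calculation of the critical storage capacity per synapse for random unbiased patterns, as a function of the stability margin κ) takes the standard form below; this is quoted from the well-known literature, not recovered from the truncated excerpt:

```latex
% Gardner's critical capacity alpha_c as a function of the stability
% margin kappa; at kappa = 0 this yields the classical alpha_c = 2.
\[
  \alpha_c(\kappa) \;=\; \left[ \int_{-\kappa}^{\infty}
      \frac{dt}{\sqrt{2\pi}}\; e^{-t^2/2}\,(t+\kappa)^2 \right]^{-1}
\]
```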
One way to understand the brain is in terms of the computations it performs that allow an organism t...
We provide a characterization of the expressive powers of several models of deterministic and nondet...