A) We also trained RNNs without a rank constraint. For these networks, the initial entries of the recurrent weight matrix J were drawn from a zero-mean Gaussian with variance \(g^2/N\). Here we added a regularisation term to the loss that keeps the average firing rates close to 0, to avoid a rate-coding solution (Eq 6). We set g to 0.6 and used a learning rate of 0.001, with the training setup otherwise as for the low-rank networks (Training). To find a basis similar to the one used for plotting the dynamics of the low-rank RNN, we took the following approach. First, we calculated basis vectors for the activity due to the recurrent dynamics, J tanh(x(t)), by performing a Principal Component Analysis via singular value decomposition, \(X = USV^\top\), where X is an mat...
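The basis-finding step above can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes the standard initialisation convention (i.i.d. Gaussian entries of variance \(g^2/N\)), a simple Euler-discretised rate dynamics \(\tau\,\dot{x} = -x + J\tanh(x)\), and illustrative parameter values; all variable names are hypothetical.

```python
import numpy as np

# Sketch of the PCA-basis computation described above (assumed setup,
# not the original code). We initialise a full-rank recurrent matrix J
# with zero-mean Gaussian entries of variance g^2/N, simulate the rate
# dynamics, collect the recurrent contribution J tanh(x(t)) over time,
# and run a PCA via SVD on the resulting activity matrix X.

rng = np.random.default_rng(0)
N, T, dt, tau, g = 200, 500, 0.1, 1.0, 0.6   # illustrative values; g as in the text

J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))   # variance g^2 / N

x = rng.normal(0.0, 1.0, size=N)   # random initial state
X = np.empty((T, N))               # row t holds the recurrent input at step t
for t in range(T):
    rec = J @ np.tanh(x)           # recurrent contribution J tanh(x(t))
    X[t] = rec
    x = x + (dt / tau) * (-x + rec)   # Euler step of the rate dynamics

# PCA via singular value decomposition of the mean-centred matrix X
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
basis = Vt[:2]                                # leading principal directions
var_explained = S[:2] ** 2 / np.sum(S ** 2)   # fraction of variance captured
```

The rows of `Vt` are orthonormal, so `basis` can be used directly to project network activity onto the leading principal directions for plotting, analogous to the low-rank case.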
In this chapter, we describe the basic concepts behind the functioning of recurrent neural networks ...
(A) Fixed points of the latent variable κ in the rank-one approximation. The lines show the dynamics...
(A) Influence of reciprocal correlations on fixed points of the latent variable κ in the rank-one ap...
A) RNNs receive transient stimuli as input, along with a reference oscillation. Networks are trained...
A) Here we analysed to what degree a model will learn a phase- versus a rate-coding solution, as a f...
A) We here detail a simple rate-coding model that performs the working memory task. Such a model con...
A large body of work has suggested that neural populations exhibit lo...
∗ Equal contribution Recurrent neural networks (RNNs) are useful tools for learning nonlinear rela-t...
Recurrent Neural Networks (RNNs) are commonly used models to study neural computation. However, a com...
Neural population dynamics are often highly coordinated, allowing task-related...
Training recurrent neural networks (RNNs) is a long-standing open problem both in theoretical neuros...
An emerging paradigm proposes that neural computations can be understood at the level of dynamic sys...
Network pruning techniques are widely employed to reduce the memory requirements and increase the in...
This paper concerns the construction and training of basis function networks for the identification ...
We propose a novel low-rank initialization framework for training low-rank deep neural networks -- n...