In the context of sequence processing, we study the relationship between single-layer feedforward neural networks, which have simultaneous access to all items composing a sequence, and single-layer recurrent neural networks, which access information one step at a time. We treat both linear and nonlinear networks, describing a constructive procedure, based on linear autoencoders for sequences, that given a feedforward neural network shows how to define a recurrent neural network implementing the same function in time. Upper bounds on the required number of hidden units for the recurrent network, as a function of some features of the feedforward network, are given. By separating the functional from the memory component, the proposed procedure su...
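The linear autoencoder for sequences that this construction builds on can be sketched numerically: stack the reversed prefixes of a sequence into a data matrix, take its SVD to get a lossless linear state for each time step, and recover the recurrence weights by least squares. This is a minimal illustrative sketch, not the paper's exact notation; all variable names and the least-squares recovery step are assumptions made here for clarity.

```python
import numpy as np

# Illustrative sketch of a linear autoencoder for sequences.
# Shapes and names are assumptions, not the paper's notation.
rng = np.random.default_rng(0)
T, n = 6, 3                      # sequence length, input dimension
X = rng.standard_normal((T, n))  # one input sequence x_1 .. x_T

# Data matrix: row t holds the reversed prefix [x_t, x_{t-1}, ..., x_1],
# zero-padded to a fixed width T*n.
Xi = np.zeros((T, T * n))
for t in range(T):
    for k in range(t + 1):
        Xi[t, k * n:(k + 1) * n] = X[t - k]

# Thin SVD gives an optimal linear encoding of each prefix:
# the hidden state y_t is row t of V * S.
V, S, Ut = np.linalg.svd(Xi, full_matrices=False)
p = int(np.sum(S > 1e-10))       # effective number of memory units needed
Y = V[:, :p] * S[:p]             # one hidden state per time step

# The states obey a linear recurrence y_t = A x_t + B y_{t-1};
# recover A and B by least squares over the observed transitions.
Yprev = np.vstack([np.zeros(p), Y[:-1]])
Z = np.hstack([X, Yprev])        # recurrence inputs [x_t, y_{t-1}]
W, *_ = np.linalg.lstsq(Z, Y, rcond=None)
A, B = W[:n].T, W[n:].T

# Each prefix (hence each x_t) is decodable from y_t alone,
# and the recurrence reproduces the states exactly.
x_rec = (Y @ Ut[:p])[:, :n]
print(np.allclose(x_rec, X), np.allclose(Z @ W, Y))
```

The key point the abstract relies on is that `p`, the rank of the prefix matrix, bounds the number of memory units the equivalent recurrent network needs.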
We survey learning algorithms for recurrent neural networks with hidden units, and put the various t...
This thesis studies the introduction of a priori structure into the design of learning systems based...
Sequence processing involves several tasks such as clustering, classification, prediction, and trans...
Recurrent neural networks can learn complex transduction problems that require maintaining and activ...
A recurrent neural network with context units that can handle temporal sequences is pro...
We propose a pre-training technique for recurrent neural networks based on linear autoencoder networ...
Learning to solve sequential tasks with recurrent models requires the ability to memorize long seque...
This paper suggests the use of Fourier-type activation functions in fully recurrent neural networks....
A big challenge of reservoir-based Recurrent Neural Networks (RNNs) is the optimization of the conne...