Recurrent networks constitute an elegant way of increasing the capacity of feedforward networks to deal with complex data in the form of sequences of vectors. They are well known for their power to model temporal dependencies and to process sequences for classification, recognition, and transduction. In this paper, we propose a nonmonotone conjugate gradient training algorithm for recurrent neural networks, equipped with an adaptive tuning strategy for the nonmonotone learning horizon. Simulation results show that this modification of the conjugate gradient method is more effective than the original CG in four applications using three different recurrent network architectures.
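To make the idea concrete, the following is a minimal sketch of a nonmonotone conjugate gradient minimizer in the general style the abstract describes: a Polak-Ribiere (PR+) CG direction combined with a nonmonotone Armijo backtracking rule that compares the trial value against the maximum of the last few function values rather than the current one. This is an illustrative reconstruction, not the authors' algorithm: the function name `nonmonotone_cg` and the parameters `horizon`, `gamma`, and the backtracking factor are assumptions, and the horizon is kept fixed here, whereas the paper's contribution is an adaptive tuning strategy for it.

```python
import numpy as np

def nonmonotone_cg(f, grad, x0, horizon=10, gamma=1e-4,
                   max_iter=5000, tol=1e-6):
    """Polak-Ribiere (PR+) CG with a nonmonotone Armijo line search.

    A step length a is accepted when
        f(x + a*d) <= max(last `horizon` f-values) + gamma * a * g.T @ d,
    i.e. the classical Armijo test relaxed over a sliding window of
    recent function values (the "nonmonotone learning horizon").
    NOTE: `horizon` is fixed here for illustration; the paper tunes it
    adaptively during training.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    history = [f(x)]                       # recent f-values defining the horizon
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                     # safeguard: restart on non-descent d
            d = -g
        fmax = max(history[-horizon:])     # nonmonotone reference value
        a = 1.0
        while f(x + a * d) > fmax + gamma * a * (g @ d):
            a *= 0.5                       # simple backtracking
            if a < 1e-12:
                break
        x_new = x + a * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PR+ update
        d = -g_new + beta * d
        x, g = x_new, g_new
        history.append(f(x))
    return x
```

Allowing the objective to rise temporarily, as the sliding-window test does, lets the iterates traverse narrow valleys and plateaus that force a strictly monotone line search into very small steps; in network training, where the error surface is highly nonconvex, this is the usual motivation for nonmonotone strategies. As a quick check, the sketch minimizes the two-dimensional Rosenbrock function from the standard starting point (-1.2, 1) to the optimum (1, 1).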
A novel variant of a familiar recurrent network learning algorithm is described. This algorithm is c...
Bellec et al. present a mathematically founded approximation for gradient descent training of recurr...
An adaptive algorithm for function minimization based on conjugate gradients for the problem of find...
Recurrent networks constitute an elegant way of increasing the capacity of feedforward networks to d...
In this paper we propose a nonmonotone approach to recurrent neural networks training for temporal s...
Abstract—In this paper, we evaluate the performance of descent conjugate gradient methods and we pro...
In this paper, we present a formulation of the learning problem that allows deterministic nonmonoton...
In this paper, we present nonmonotone variants of the Levenberg–Marquardt (LM) method for training r...
We derive two second-order algorithms, based on the conjugate gradient method, for online training o...
The exact form of a gradient-following learning algorithm for completely recurrent networks running ...
We survey learning algorithms for recurrent neural networks with hidden units, and put the various t...
This research was partially supported by the Italian MURST. A new second order algorithm based on Sc...
Conjugate gradient methods (CG) constitute excellent neural network training methods that are simpli...
In this chapter, we describe the basic concepts behind the functioning of recurrent neural networks ...
Abstract—Conjugate gradient methods constitute an excellent choice for efficiently training large ne...