This research was partially supported by the Italian MURST. A new second-order algorithm based on the scaled conjugate gradient method for training recurrent and locally recurrent neural networks is proposed. The algorithm extracts second-order information by performing the corresponding first-order method twice, so its computational complexity is only about twice that of the first-order method. Simulation results show faster training than the first-order algorithm. This second-order algorithm is particularly useful for tracking fast-varying systems.
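For concreteness, the key trick can be sketched in a few lines. The following is a minimal Python sketch, assuming (as in Moller's scaled conjugate gradient) that the second-order information is a Hessian-vector product approximated by a finite difference of two gradient evaluations; the names grad, w, p, and scg_step are illustrative stand-ins, not the paper's actual implementation.

import numpy as np

def scg_step(grad, w, p, lam=1e-3, sigma0=1e-4):
    """One scaled-conjugate-gradient step along the search direction p.

    Costs two gradient evaluations, which is where the "about two times
    the first-order method" complexity in the abstract comes from.
    """
    g = grad(w)                                  # first gradient evaluation
    sigma = sigma0 / (np.linalg.norm(p) + 1e-12)
    s = (grad(w + sigma * p) - g) / sigma        # second evaluation: H @ p estimate
    delta = p @ s + lam * (p @ p)                # regularised curvature along p
    while delta <= 0.0:                          # simplified positivity fix; Moller's
        lam *= 10.0                              # update adjusts lam more carefully
        delta = p @ s + lam * (p @ p)
    alpha = -(g @ p) / delta                     # step size from the quadratic model
    return w + alpha * p

# Toy usage on a quadratic error E(w) = 0.5 * w @ A @ w:
A = np.diag([1.0, 10.0])
grad_E = lambda w: A @ w
w = np.array([1.0, 1.0])
w = scg_step(grad_E, w, p=-grad_E(w))            # one descent step toward the minimum

Because the curvature estimate comes from gradients alone, no explicit Hessian is ever formed, which is what keeps the per-step cost within a constant factor of the first-order method.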
Since the discovery of the back-propagation method, many modified and new algorithms have been propo...
Two concurrent implementations of the method of conjugate gradients for training Elman networks are ...
The model of multi-layered neural networks of the back-propagation type is well-known for their univ...
We derive two second-order algorithms, based on the conjugate gradient method, for online training o...
Recurrent networks constitute an elegant way of increasing the capacity of feedforward networks to d...
We propose a new learning algorithm for locally recurrent neural networks, called truncated recursiv...
Recurrent networks constitute an elegant way of increasing the capacity of feedforward networks to d...
In this paper, we evaluate the performance of descent conjugate gradient methods and we pro...
This paper concerns dynamic neural networks for signal processing: architectural issues are consider...
The chapter contains the description of a set of fast (second-order) training algorithms
We survey learning algorithms for recurrent neural networks with hidden units, and put the various t...
This paper focuses on on-line learning procedures for locally recurrent neural networks with emphasi...
This paper proposes an improved stochastic second order learning algorithm for supervised neural net...
A recurrent neural network approach to robust approximate pole assignment for second-order systems i...
An iterative pruning method for second-order recurrent neural networks is presented. Each step consi...