We propose a new learning algorithm for locally recurrent neural networks, called truncated recursive backpropagation, which can be easily implemented on-line with good performance. Moreover, it generalises the algorithm proposed by Waibel et al. (1989) for TDNNs, and includes the Back and Tsoi (1991) algorithm, as well as BPS and standard on-line backpropagation, as particular cases. The proposed algorithm has a memory and computational complexity that can be adjusted through a careful choice of two parameters, h and h', making it more flexible than a previous algorithm we proposed. Although, for the sake of brevity, we present the new algorithm only for IIR-MLP networks, it can also be applied to any other locally recurrent neural network. Some compute...
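The abstract above only names the truncation parameters h and h', so the snippet below is merely a minimal NumPy sketch of the general idea of truncating the backward recursion in an on-line recurrent learning rule; it is not the authors' truncated recursive backpropagation for IIR-MLPs. The toy single-layer recurrent network, the single horizon parameter h (standing in for the paper's h and h'), and the function truncated_bptt_step are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer recurrent network: s_t = tanh(W s_{t-1} + U x_t), y_t = V s_t.
n_in, n_state, n_out = 2, 8, 1
W = rng.normal(scale=0.1, size=(n_state, n_state))
U = rng.normal(scale=0.1, size=(n_state, n_in))
V = rng.normal(scale=0.1, size=(n_out, n_state))

def truncated_bptt_step(x_hist, s_hist, target, h, lr=0.01):
    """One on-line update: the current error is backpropagated at most h steps back.

    x_hist holds the last h inputs and s_hist the last h+1 states
    (s_hist[-1] is the current state), so memory per update is bounded by h.
    """
    global W, U, V
    y = V @ s_hist[-1]
    e = y - target                                   # instantaneous output error
    dV = np.outer(e, s_hist[-1])
    delta = (V.T @ e) * (1.0 - s_hist[-1] ** 2)      # error at current pre-activations
    dW = np.zeros_like(W)
    dU = np.zeros_like(U)
    for k in range(1, min(h, len(x_hist)) + 1):      # truncation: stop after h steps
        s_prev = s_hist[-k - 1]
        dW += np.outer(delta, s_prev)
        dU += np.outer(delta, x_hist[-k])
        delta = (W.T @ delta) * (1.0 - s_prev ** 2)  # push error one step further back
    W -= lr * dW
    U -= lr * dU
    V -= lr * dV
    return float(0.5 * e @ e)

# On-line training loop: only the last h inputs/states are ever stored,
# so per-step cost and memory are set by the truncation horizon.
h = 5
s = np.zeros(n_state)
x_hist, s_hist = [], [s]
for t in range(1000):
    x = rng.normal(size=n_in)
    target = np.array([x.sum()])                     # arbitrary toy target
    s = np.tanh(W @ s + U @ x)
    x_hist.append(x)
    s_hist.append(s)
    x_hist, s_hist = x_hist[-h:], s_hist[-(h + 1):]
    loss = truncated_bptt_step(x_hist, s_hist, target, h)
```

Choosing a larger h lengthens the backward recursion and the stored history, which is the trade-off between accuracy of the gradient and per-step cost that the abstract attributes to its two parameters.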
We survey learning algorithms for recurrent neural networks with hidden units, and put the various t...
Error backpropagation in feedforward neural network models is a popular learning algorithm that has ...
This paper concerns dynamic neural networks for signal processing: architectural issues are consider...
This paper focuses on on-line learning procedures for locally recurrent neural networks with emphasi...
Neural networks with internal temporal dynamics can be applied to non-linear DSP problems. The class...
This paper is focused on the learning algorithms for dynamic multilayer perceptron neural networks w...
We survey learning algorithms for recurrent neural networks with hidden units and attempt to put the...
This paper introduces a general framework for describing dynamic neural networks—the layer...
A new second order algorithm based on Sc...
A general method for deriving backpropagation algorithms for networks with recurrent and higher orde...
An adaptive back-propagation algorithm for multilayered feedforward perceptrons was discussed. It was...
We introduce the "NoBackTrack" algorithm to train the parameters of dynamical systems su...
In this chapter, we describe the basic concepts behind the functioning of recurrent neural networks ...