Back propagation is a steepest-descent-type algorithm that typically converges slowly, and its search for the global minimum often becomes trapped in poor local minima. This paper proposes the modified recursive prediction error (MRPE) algorithm for training multilayer perceptron (MLP) networks. MRPE is a modified version of the recursive prediction error (RPE) algorithm. Both RPE and MRPE are Gauss-Newton-type algorithms, which generally perform better than steepest-descent-type algorithms such as back propagation. The current study investigates the performance of the MRPE algorithm for training MLP networks and compares it with the widely used back propagation algorithm. Three data sets were used for the comparison. It...
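Since the abstract contrasts steepest-descent (back-propagation-style) training with Gauss-Newton-type training, a minimal sketch may help make the distinction concrete. The following Python toy is not the paper's MRPE or RPE algorithm; it only contrasts a plain gradient-descent update with a damped (Levenberg-style) Gauss-Newton update on a small least-squares MLP fit. The network size, data, learning rate, and damping value are illustrative assumptions.

```python
# Illustrative sketch (not the paper's MRPE): steepest-descent vs. damped
# Gauss-Newton updates on the same small MLP least-squares problem.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (assumed for illustration)
x = np.linspace(-1.0, 1.0, 40)
y = np.sin(np.pi * x)

H = 6                      # hidden units (assumed)
n_params = 3 * H + 1       # W1 (H), b1 (H), W2 (H), b2 (1)

def predict(theta, x):
    """One-hidden-layer MLP with tanh activation, parameters packed in theta."""
    W1, b1, W2, b2 = theta[:H], theta[H:2*H], theta[2*H:3*H], theta[3*H]
    hidden = np.tanh(np.outer(x, W1) + b1)   # (N, H)
    return hidden @ W2 + b2                  # (N,)

def residuals(theta):
    return predict(theta, x) - y

def jacobian(theta, eps=1e-6):
    """Finite-difference Jacobian of the residual vector, shape (N, n_params)."""
    r0 = residuals(theta)
    J = np.empty((r0.size, theta.size))
    for j in range(theta.size):
        t = theta.copy()
        t[j] += eps
        J[:, j] = (residuals(t) - r0) / eps
    return J

def gd_step(theta, lr=0.05):
    """Steepest-descent step: theta <- theta - lr * J^T r (back-prop style)."""
    J = jacobian(theta)
    return theta - lr * (J.T @ residuals(theta))

def gauss_newton_step(theta, damping=1e-3):
    """Damped Gauss-Newton step: solve (J^T J + damping*I) d = J^T r."""
    J = jacobian(theta)
    g = J.T @ residuals(theta)
    d = np.linalg.solve(J.T @ J + damping * np.eye(theta.size), g)
    return theta - d

theta_gd = rng.normal(scale=0.5, size=n_params)
theta_gn = theta_gd.copy()
for _ in range(50):
    theta_gd = gd_step(theta_gd)
    theta_gn = gauss_newton_step(theta_gn)

print("SSE after 50 steps, steepest descent:", np.sum(residuals(theta_gd)**2))
print("SSE after 50 steps, Gauss-Newton:    ", np.sum(residuals(theta_gn)**2))
```

The Gauss-Newton step uses curvature information from J^T J (plus damping), which is why second-order-type methods of this kind usually need far fewer iterations than a fixed-step steepest-descent update on the same problem.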
Multilayer Neural Networks (MNNs) are commonly trained using gradient descent-based methods, such as...
A fundamental limitation of the use of error back propagation (BP) in the training of a layered feed...
This paper concerns dynamic neural networks for signal processing: architectural issues are consider...
A multilayer perceptron is a feedforward artificial neural network model that maps sets of input da...
The multilayer perceptron is one of the most commonly used types of feedforward neural networks and ...
Multilayer perceptrons (MLPs) (1) are the most common artificial neural networks employed in a large...
An adaptive back-propagation algorithm for multilayered feedforward perceptrons was discussed. It was...
Summary form only given, as follows. A novel learning algorithm for multilayered neural networks is ...
Recurrent neural networks have the potential to perform significantly better than the commonly used ...
Back propagation is one of the well-known training algorithms for the multilayer perceptron. Ho...
This paper focuses on on-line learning procedures for locally recurrent neural networks with emphasi...
A learning based error back propagation algorithm, a proposed non-differential digital back propagat...
In this chapter, we describe the basic concepts behind the functioning of recurrent neural networks ...
A new fast training algorithm for the Multilayer Perceptron (MLP) is proposed. This new alg...