The conjugate gradient optimization algorithm is combined with a modified back-propagation algorithm to yield a computationally efficient algorithm for training multilayer perceptron (MLP) networks (CGFR/AG). The computational efficiency is enhanced by adaptively modifying the initial search direction, as described in the following steps: (1) modification of the standard back-propagation algorithm by introducing a gain-variation term in the activation function; (2) calculation of the gradient of the error with respect to the weight and gain values; and (3) determination of a new search direction using the information calculated in step (2). The performance of the proposed method is demonstrated by comparing accuracy and computation time wi...
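The three steps above can be sketched in code. This is a minimal illustration under stated assumptions, not the authors' CGFR/AG implementation: a one-hidden-layer MLP whose sigmoid activation carries a trainable gain c, so s(x) = 1/(1+exp(-c*x)) (step 1), gradients taken with respect to both weights and gains (step 2), and Fletcher-Reeves conjugate gradient supplying the search direction (step 3). The XOR data, layer sizes, and backtracking line search are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR toy data (illustrative; not from the paper)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hid, n_out = 2, 4, 1
n_w1 = (n_in + 1) * n_hid          # hidden-layer weights (+bias row)
n_w2 = (n_hid + 1) * n_out         # output-layer weights (+bias row)
n_par = n_w1 + n_w2 + n_hid + n_out

def unpack(p):
    W1 = p[:n_w1].reshape(n_in + 1, n_hid)
    W2 = p[n_w1:n_w1 + n_w2].reshape(n_hid + 1, n_out)
    c1 = p[n_w1 + n_w2:n_w1 + n_w2 + n_hid]    # hidden-layer gains
    c2 = p[-n_out:]                            # output-layer gains
    return W1, W2, c1, c2

def loss_grad(p):
    """Forward pass with the gained sigmoid s(x) = 1/(1+exp(-c*x)),
    then gradients w.r.t. weights AND gains (step 2 of the abstract)."""
    W1, W2, c1, c2 = unpack(p)
    Xb = np.hstack([X, np.ones((len(X), 1))])
    a1 = Xb @ W1
    h = 1.0 / (1.0 + np.exp(-c1 * a1))
    Hb = np.hstack([h, np.ones((len(X), 1))])
    a2 = Hb @ W2
    y = 1.0 / (1.0 + np.exp(-c2 * a2))
    e = y - Y
    L = 0.5 * np.sum(e ** 2)
    d2 = e * y * (1 - y)
    gW2 = Hb.T @ (d2 * c2)                 # dL/dW2
    gc2 = np.sum(d2 * a2, axis=0)          # dL/dc2
    d1 = ((d2 * c2) @ W2[:-1].T) * h * (1 - h)
    gW1 = Xb.T @ (d1 * c1)                 # dL/dW1
    gc1 = np.sum(d1 * a1, axis=0)          # dL/dc1
    return L, np.concatenate([gW1.ravel(), gW2.ravel(), gc1, gc2])

def train(iters=200):
    p = rng.normal(scale=0.5, size=n_par)
    p[-(n_hid + n_out):] = 1.0             # gains start at 1: standard sigmoid
    L, g = loss_grad(p)
    L0, d = L, -g
    for _ in range(iters):
        step = 1.0                         # simple backtracking line search
        while step > 1e-10:
            Lnew, gnew = loss_grad(p + step * d)
            if Lnew < L:
                break
            step *= 0.5
        else:
            break                          # no descent step found
        p += step * d
        beta = (gnew @ gnew) / (g @ g)     # Fletcher-Reeves coefficient (step 3)
        d = -gnew + beta * d
        L, g = Lnew, gnew
    return L0, L

initial, final = train()
print(f"loss: {initial:.4f} -> {final:.4f}")
```

Because the line search only accepts steps that reduce the error, the loss is monotonically non-increasing; restarting the search direction from the gain-aware gradient is what the abstract describes as adaptively modifying the initial search direction.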
A multilayer perceptron is a feed forward artificial neural network model that maps sets of input da...
The back-propagation algorithm calculates the weight changes of an artificial neural network, and a ...
In this paper we explore different strategies to guide the backpropagation algorithm used for training a...
The conjugate gradient optimization algorithm usually used for nonlinear least squares is presented ...
This paper develops a neural network (NN) using conjugate gradient (CG). The modification of this me...
The important feature of this work is the combination of minimizing a function with desirable proper...
Training of artificial neural networks (ANN) is normally a time consuming task due to iteratively se...
Conjugate gradient methods (CG) constitute excellent neural network training methods that are simpli...
Since the discovery of the back-propagation method, many modified and new algorithms have been propo...
Abstract—Conjugate gradient methods constitute an excellent choice for efficiently training large ne...
Multi-layered neural networks of the back-propagation type are well known for their universal approx...
The back propagation algorithm has been successfully applied to a wide range of practical problems. Si...
A supervised learning algorithm (Scaled Conjugate Gradient, SCG) with superlinear convergence rate i...
Abstract-The Back-propagation (BP) training algorithm is a renowned representative of all iterative ...