This paper presents two compensation methods for multilayer perceptrons (MLPs) that are difficult to train with traditional back-propagation (BP). For an MLP trapped in a local minimum, the compensation methods use constructive techniques to correct the wrong outputs one by one until all outputs are correct, allowing the network to escape the local minimum and reach a global minimum. For a three-layer perceptron with binary inputs, a single hidden neuron is added as compensation; for a three-layer perceptron with real inputs, one or two hidden neurons are added. For a perceptron with more than three layers, the second hidden layer from the output is temporarily treated as the input layer during compensation, hence the ...
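The binary-input case above can be sketched in code. This is a minimal illustration, not the paper's exact construction: it assumes hard-threshold units, and the `compensate` method, the weight scale `big`, and the class name are all hypothetical. The idea is that one added hidden neuron can be made to fire on exactly one binary pattern, so a large signed output weight corrects that one wrong output without disturbing any other input.

```python
import numpy as np

def step(z):
    """Hard-threshold activation: 1 if z >= 0, else 0."""
    return (z >= 0).astype(float)

class ThreeLayerPerceptron:
    """Threshold-unit MLP with one hidden layer (inputs -> hidden -> outputs)."""

    def __init__(self, W1, b1, W2, b2):
        self.W1, self.b1 = W1, b1  # hidden-layer weights and biases
        self.W2, self.b2 = W2, b2  # output-layer weights and biases

    def forward(self, x):
        h = step(self.W1 @ x + self.b1)
        return step(self.W2 @ h + self.b2)

    def compensate(self, x, target, big=10.0):
        """Add one hidden neuron that fires only on the binary pattern x,
        pushing the output toward `target` on x while leaving every other
        binary input unchanged (hypothetical sketch of the compensation step)."""
        w = 2.0 * x - 1.0   # +1 where x_i = 1, -1 where x_i = 0
        b = 0.5 - w @ x     # w @ x' attains its maximum w @ x only at x' == x
        self.W1 = np.vstack([self.W1, w])
        self.b1 = np.append(self.b1, b)
        # A large signed weight overrides the old (wrong) output on x only.
        col = np.where(target == 1, big, -big).reshape(-1, 1)
        self.W2 = np.hstack([self.W2, col])
```

For example, a 2-2-1 threshold network that misclassifies two XOR points can be corrected by calling `compensate` once per wrong output, after which all four outputs are right; this mirrors the paper's "correct the wrong outputs one by one until all outputs are right" procedure.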
An adaptive back-propagation algorithm for multilayered feedforward perceptrons was discussed. It was...
Abstract. Typically the response of a multilayered perceptron (MLP) network on points which are far ...
Supervised Learning in Multi-Layered Neural Networks (MLNs) has been recently proposed through the w...
Multilayer perceptrons (MLPs) (1) are the most common artificial neural networks employed in a large...
A multilayer perceptron is a feed forward artificial neural network model that maps sets of input da...
Training a multilayer perceptron by an error backpropagation algorithm is slow and uncertain. This p...
Backpropagation (BP) is one of the most widely used algorithms for training feed-forward neural netw...
Abstract—The response of a multilayered perceptron (MLP) network on points which are far away from t...
Several neural network architectures have been developed over the past several years. One of the mos...
ABSTRACT A new fast training algorithm for the Multilayer Perceptron (MLP) is proposed. This new alg...
Abstract. We propose a novel learning algorithm to train networks with multi-layer linear-threshold ...
A new training algorithm is presented as a fast alternative to the backpropagation method. The new a...
Abstract. Recently, the back-propagation neural network (BPNN) has been applied successfully in many areas w...
In this paper, the authors propose a new training algorithm which does not only rely upon the traini...