A fast iterative algorithm is proposed for the construction and the learning of a neural net achieving a classification task, with an input layer, one intermediate layer, and an output layer. The network is able to learn an arbitrary training set. The algorithm does not depend on a special learning scheme (e.g., the couplings can be determined by modified Hebbian prescriptions or by more complex learning procedures). During the process the intermediate units are constructed systematically by collecting the patterns into smaller subsets. For simplicity, we consider only the case of one output neuron, but this restriction is not actually necessary.
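The abstract does not specify how the intermediate units are built, so the following is only a minimal sketch of the general idea behind such constructive schemes: hidden threshold units are added one at a time until the whole training set is covered, and the output neuron combines them. The deliberately naive variant below has each hidden unit isolate a single positive pattern (so the number of units, the unit construction, and the OR-style output are assumptions, not the paper's actual grouping procedure); it illustrates only the guarantee that an arbitrary binary training set can be learned this way.

```python
import numpy as np

def build_network(X, y):
    """Constructively add one hidden threshold unit per positive pattern.

    For a binary input vector x, the unit (w, b) with w = 2x - 1 and
    b = 0.5 - sum(x) fires on x and on no other binary vector, so the
    resulting net memorises any training set. This is a naive stand-in
    for the subset-collecting construction described in the abstract.
    """
    hidden = []
    for i in range(len(X)):
        if y[i] == 1:
            x = X[i]
            w = 2 * x - 1        # +1 where the bit is 1, -1 where it is 0
            b = 0.5 - x.sum()    # fires only on exactly this pattern
            hidden.append((w, b))
    return hidden

def predict(hidden, x):
    # The single output neuron acts as an OR over the hidden units.
    return int(any(np.dot(w, x) + b > 0 for w, b in hidden))

# XOR: not linearly separable, yet learned exactly by the construction.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])
net = build_network(X, y)
print([predict(net, x) for x in X])  # [0, 1, 1, 0]
```

A real constructive algorithm of this family would group many patterns per unit rather than one, which is what keeps the intermediate layer small.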
This thesis addresses the issue of applying a "globally" convergent optimization scheme to the train...
An adaptive back-propagation algorithm for multilayered feedforward perceptrons was discussed. It was...
The problem of saturation in neural network classification problems is discussed. The listprop algor...
A fast iterative algorithm is proposed for the construction and the learning of a neural net achievi...
The perceptron is essentially an adaptive linear combiner with the output quantized to ...
The traditional multilayer perceptron (MLP) using a McCulloch-Pitts neuron model is inherently limit...
In this paper, we propose a genetic algorithm for the training and construction of a multilayer perc...
Neural networks as a general mechanism for learning and adaptation became increasingly popular in re...
A general method for building and training multilayer perceptrons composed of linear threshold units...
A general method for building and training multilayer perceptrons composed of linear threshold units...
One connectionist approach to the classification problem, which has gained popularity in recent year...
Ellerbrock TM. Multilayer neural networks : learnability, network generation, and network simplifica...
The neural network model (NN) comprised of relatively simple computing elements, operating in parall...
We study learning and generalisation ability of a specific two-layer feed-forward neural network and...
Training a multilayer perceptron by an error backpropagation algorithm is slow and uncertain. This p...