A fast, parsimonious linear-programming-based algorithm for training neural networks is proposed that suppresses redundant features while using a minimal number of hidden units. This is achieved by propagating sideways to newly added hidden units the task of separating successive groups of unclassified points. Computational results show improvements of 26.53% and 19.76% in tenfold cross-validation test correctness over a parsimonious perceptron on two publicly available datasets.
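The constructive idea described above (each new hidden unit inherits the points the earlier units failed to separate) can be illustrated with a minimal sketch. This is an assumption-laden stand-in, not the paper's method: the linear-programming subproblem is replaced here by a simple perceptron rule, and the function names (`train_linear_unit`, `constructive_train`) and stopping rules are hypothetical.

```python
import numpy as np

def train_linear_unit(X, y, epochs=100, lr=0.1):
    """Fit one linear threshold unit with a perceptron update.

    Hypothetical stand-in for the paper's LP subproblem: any procedure
    that fits a separating hyperplane to (X, y) could be substituted.
    Labels y are assumed to be +1 / -1.
    """
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            pred = 1 if xi @ w > 0 else -1
            if pred != yi:
                w += lr * yi * xi  # standard perceptron correction
    return w

def constructive_train(X, y, max_units=10):
    """Grow hidden units one at a time.

    Each new unit is trained only on the points the previous units left
    unclassified, mirroring the 'sideways propagation' of successive
    groups of unclassified points.
    """
    units = []
    remaining = np.arange(len(X))  # indices still unclassified
    while len(remaining) > 0 and len(units) < max_units:
        w = train_linear_unit(X[remaining], y[remaining])
        units.append(w)
        Xb = np.hstack([X[remaining], np.ones((len(remaining), 1))])
        correct = np.where(Xb @ w > 0, 1, -1) == y[remaining]
        if not correct.any():  # new unit made no progress; stop growing
            break
        remaining = remaining[~correct]  # leftovers go to the next unit
    return units
```

On a linearly separable toy problem a single unit suffices, so the loop terminates after one iteration; on harder data each pass peels off the points the current unit handles and defers the rest.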
Abstract—We develop, in this brief, a new constructive learning algorithm for feedforward neural net...
Starting with two hidden units, we train a simple single hidden layer feed-forward neural network to...
We present a neural network architecture and a training algorithm designed to enable very rapid trai...
Abstract- We propose a novel learning algorithm to train networks with multi-layer linear-threshold ...
An algorithm for the training of multilayered feedforward neural networks is presented. The strategy...
Interest in algorithms which dynamically construct neural networks has been growing in recent years....
The back propagation algorithm calculates the weight changes of artificial neural networks, and a co...
An algorithm for the training of a special multilayered feed-forward neural network is presented. Th...
A multilayer perceptron is a feed forward artificial neural network model that maps sets of input da...
The back propagation algorithm caused a tremendous breakthrough in the application of multilayer per...
This paper presents two compensation methods for multilayer perceptrons (MLPs) which are very diffic...
In this paper, the authors propose a new training algorithm which does not only rely upon the traini...
A critical question in the neural network research today concerns how many hidden neurons to use. Th...
Summary form only given, as follows. A novel learning algorithm for multilayered neural networks is ...
A new method of pruning away hidden neurons in neural networks is presented in this paper. The hidde...