We present a novel training algorithm for a feed-forward neural network with a single hidden layer of nodes (i.e., two layers of connection weights). Our algorithm can train networks on hard problems, such as the classic two-spirals problem. The weights in the first layer are determined using a quasirandom number generator. These weights are frozen---they are never modified during training. The second layer of weights is trained as a simple linear discriminator using methods such as the pseudo-inverse, with possible iterations. We also study the problem of reducing the hidden layer: pruning low-weight nodes and a genetic-algorithm search for good subsets of hidden nodes.
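The scheme described above can be sketched in a few lines: fix the first-layer weights once, then solve for the output weights with a pseudo-inverse. This is a minimal illustration under stated assumptions, not the paper's exact method — the function names, the tanh activation, and the use of a pseudorandom (rather than quasirandom) generator are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(X, y, n_hidden=50):
    """X: (n_samples, n_features); y: (n_samples,) labels in {-1, +1}.

    First-layer weights are drawn once and frozen (a pseudorandom stand-in
    for the quasirandom generator in the abstract); the output weights are
    found by a single pseudo-inverse solve, i.e. a linear discriminator
    on the hidden activations.
    """
    W = rng.standard_normal((X.shape[1], n_hidden))  # frozen first layer
    H = np.tanh(X @ W)                               # hidden activations
    beta = np.linalg.pinv(H) @ y                     # pseudo-inverse solve
    return W, beta

def predict(X, W, beta):
    # Forward pass: frozen hidden layer, then the trained linear output.
    return np.sign(np.tanh(X @ W) @ beta)

# Toy usage on a linearly separable problem.
X = rng.standard_normal((200, 2))
y = np.sign(X[:, 0] + X[:, 1])
W, beta = train(X, y)
acc = np.mean(predict(X, W, beta) == y)
```

Because only the second layer is trained, the whole fit reduces to one linear least-squares problem, which is what makes the approach cheap compared with back-propagating through both layers.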
Abstract: The quasi-Newton training method is the most effective method for feed-forward neural netw...
Abstract: In this paper we present a simple modification of some cascade-correlation type constructi...
This study highlights the subject of weight initialization in back-propagation feed-forward netw...
Interest in algorithms which dynamically construct neural networks has been growing in recent years....
Abstract- We propose a novel learning algorithm to train networks with multi-layer linear-threshold ...
In this paper, we propose a hybrid learning algorithm for the single hidden layer feedforward neural...
In this paper, the authors propose a new training algorithm which does not only rely upon the traini...
Artificial neural networks have, in recent years, been very successfully applied in a wide range of ...
Abstract: To reduce random access memory (RAM) requirements and to increase speed of recognition alg...
Starting with two hidden units, we train a simple single hidden layer feed-forward neural network to...
This study highlights the subject of weight initialization in multi-layer feed-forward networks....
The back-propagation algorithm caused a tremendous breakthrough in the application of multilayer per...
In recent years, multi-layer feedforward neural networks have been popularly used for pattern classi...
An algorithm for the training of multilayered feedforward neural networks is presented. The strategy...