We present a novel training algorithm for a feedforward neural network with a single hidden layer of nodes (i.e., two layers of connection weights). Our algorithm can train networks on hard problems, such as the classic two-spirals problem. The weights in the first layer are determined using a quasi-random number generator and then frozen: they are never modified during the training process. The second layer of weights is trained as a simple linear discriminator using methods such as the pseudo-inverse, with possible iterative refinement. We also study the problem of reducing the hidden layer, both by pruning low-weight nodes and by using a genetic algorithm to search for good subsets of hidden nodes.
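The scheme described above can be illustrated with a minimal sketch. The details here are assumptions for illustration, not the paper's exact method: a Halton sequence stands in for the quasi-random generator, tanh is assumed as the hidden activation, and the second layer is solved in one shot with `np.linalg.pinv` (no iterative refinement).

```python
import numpy as np

def halton(n, dim, primes=(2, 3, 5, 7, 11, 13)):
    """First n points of a Halton quasi-random sequence in [0, 1)^dim."""
    out = np.empty((n, dim))
    for d in range(dim):
        base = primes[d]
        for i in range(n):
            f, x, k = 1.0, 0.0, i + 1
            while k > 0:
                f /= base
                x += f * (k % base)
                k //= base
            out[i, d] = x
    return out

def train(X, y, n_hidden=64, scale=4.0):
    """Freeze quasi-random first-layer weights (plus biases); train the
    second layer as a linear discriminator via the pseudo-inverse."""
    d = X.shape[1]
    # One Halton point per hidden node: d input weights + 1 bias, frozen.
    W = scale * (halton(n_hidden, d + 1) - 0.5)
    H = np.tanh(X @ W[:, :d].T + W[:, d])   # hidden-layer activations
    beta = np.linalg.pinv(H) @ y            # least-squares output weights
    return W, beta

def predict(X, W, beta):
    d = X.shape[1]
    H = np.tanh(X @ W[:, :d].T + W[:, d])
    return H @ beta

# Tiny usage example on an XOR-style problem (labels in {-1, +1}).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., 1., 1., -1.])
W, beta = train(X, y, n_hidden=16)
scores = predict(X, W, beta)
```

With more hidden units than training points, the pseudo-inverse gives the minimum-norm exact interpolant of the training labels, which is why only the linear readout needs training at all.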