An algorithm for training a special multilayered feed-forward neural network is presented. The strategy closely resembles the well-known tiling algorithm, yet the resulting architecture is completely different: neurons are added to a single hidden layer only. The output of the network is the product of its k hidden units, which for ±1 units realizes the parity operation. The capacity αc of a network trained with this algorithm is estimated for the storage of randomly defined classifications; the asymptotic dependence is found to be αc ~ k ln k for k → ∞. This agrees with recent analytic results for the algorithm-independent storage capacity of the parity machine.
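The output rule described above can be sketched in a few lines: each of the k hidden perceptrons emits ±1, and the network output is their product, which is +1 exactly when an even number of hidden units are negative. The weight matrix W below is a hypothetical placeholder for weights found by the constructive algorithm; this is a minimal illustration of the parity-machine output, not the training procedure itself.

```python
import numpy as np

def parity_machine(x, W):
    """Parity-machine output: the product of the +/-1 outputs of the
    k hidden perceptrons with weight rows W (shape (k, N))."""
    hidden = np.sign(W @ x)      # k hidden units, each +1 or -1
    return int(np.prod(hidden))  # product = parity of the negative units

# Example with k = 3 hypothetical hidden units on N = 5 inputs
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 5))      # stand-in for trained weights
x = np.sign(rng.normal(size=5))  # a random +/-1 input pattern
print(parity_machine(x, W))      # prints +1 or -1
```

Because the product of ±1 values flips sign with every negative factor, this output is the parity of the hidden-layer activations, matching the ±1-unit interpretation given in the abstract.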