This paper addresses saturation phenomena at hidden nodes during the learning phase of neural networks. Hidden-node saturation tends to cause a "plateau," a region of very little or no change in the error learning curve. We investigate these saturation phenomena in multilayer perceptrons (MLPs) on the well-known neural-network benchmark, the parity problem, and describe how to augment their learning capacity to solve it perfectly by avoiding hidden-node saturation in conjunction with a dynamic-programming-like recursive gradient formula. To make the parity problem especially challenging, we first show that the seven-bit parity problem can in principle be solved using only four hid...
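To make the saturation issue concrete, here is a minimal NumPy sketch (not the paper's method): a one-hidden-layer sigmoid MLP trained by plain batch gradient descent on the seven-bit parity problem, with a crude monitor for hidden-node saturation. The hidden-layer size, learning rate, and the 0.01/0.99 saturation thresholds are illustrative assumptions; the paper's recursive gradient formula and minimal node count are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def parity_data(n_bits):
    """All 2**n_bits binary inputs paired with their parity label."""
    X = np.array([[(i >> b) & 1 for b in range(n_bits)]
                  for i in range(2 ** n_bits)], dtype=float)
    y = (X.sum(axis=1) % 2).reshape(-1, 1)
    return X, y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_bits, n_hidden, lr = 7, 8, 2.0   # illustrative choices, not the paper's
X, y = parity_data(n_bits)

# Small random initial weights: large ones would drive the sigmoids into
# saturation from the very start of training.
W1 = rng.normal(0.0, 0.5, (n_bits, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
b2 = np.zeros(1)

for epoch in range(20001):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)        # hidden activations
    out = sigmoid(h @ W2 + b2)      # network output

    # Backward pass (plain batch backprop on mean squared error). The
    # factors h * (1 - h) and out * (1 - out) shrink toward zero when a
    # unit saturates, which is what stalls learning on a plateau.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out) / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * (X.T @ d_h) / len(X)
    b1 -= lr * d_h.mean(axis=0)

    if epoch % 2000 == 0:
        mse = float(((out - y) ** 2).mean())
        # Fraction of hidden activations pinned near 0 or 1 (the threshold
        # is an arbitrary illustrative choice, not the paper's criterion).
        saturated = float(((h < 0.01) | (h > 0.99)).mean())
        print(f"epoch {epoch:6d}  mse {mse:.4f}  saturated {saturated:.2f}")
```

Watching the printed saturation fraction alongside the error makes the plateau visible: epochs where the error barely moves typically coincide with a large share of hidden activations stuck near 0 or 1.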
This paper presents two compensation methods for multilayer perceptrons (MLPs) which are very diffic...
We deal with the problem of efficient learning of feedforward neural networks. First, we con...
In artificial neural networks, learning from data is a computationally demanding task in which a lar...
An algorithm for the training of multilayered feedforward neural networks is presented. The strategy...
Starting with two hidden units, we train a simple single hidden layer feed-forward neural network to...
An algorithm for the training of a special multilayered feed-forward neural network is presented. Th...
We propose a novel learning algorithm to train networks with multi-layer linear-threshold ...
We develop, in this brief, a new constructive learning algorithm for feedforward neural net...
The Multi-Layer Perceptron (MLP) is one of the most widely applied and researched Artificial Neural ...
A fast parsimonious linear-programming-based algorithm for training neural networks is proposed that...
We study the effect of regularization in an on-line gradient-descent learning scenario for a general...
We study on-line gradient-descent learning in multilayer networks analytically and numerically. The ...
A fast algorithm is proposed for optimal supervised learning in multiple-layer neural networks. The ...
Several neural network architectures have been developed over the past several years. One of the mos...
The performance of an Artificial Neural Network (ANN) strongly depends on its hidden layer architect...