Abstract We present an exact analysis of learning a rule by on-line gradient descent in a two-layered neural network with adjustable hidden-to-output weights (backpropagation of error). Results are compared with the training of networks having the same architecture but fixed weights in the second layer. The ability of neural networks to learn a rule from examples [1] has been studied successfully in a statistical mechanics context, see e.g. [2-4] for recent reviews. So far most of the analysis has been restricted to very simple networks such as the single-layer perceptron [1] or networks with one layer of hidden units and a fixed hidden-to-output relation, e.g. the so-called committee machine [3]. In the following we extend the recent investiga...
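The on-line learning scenario described above can be sketched in a small teacher-student simulation (a minimal illustration, not the paper's analysis: tanh is used in place of the error-function activation common in these studies, and all sizes, seeds, and learning rates are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 50, 2            # input dimension, number of hidden units (arbitrary)
eta = 0.05              # learning rate (arbitrary)
g = np.tanh
dg = lambda a: 1.0 - np.tanh(a) ** 2

# Teacher: a fixed two-layer network defining the rule to be learned.
B = rng.standard_normal((K, N))   # input-to-hidden weights
v = rng.standard_normal(K)        # hidden-to-output weights

# Student: same architecture, but with BOTH layers adjustable.
J = 0.1 * rng.standard_normal((K, N))
w = 0.1 * rng.standard_normal(K)

def output(X, W, u):
    """Two-layer network output for each row of X."""
    return g(X @ W.T / np.sqrt(N)) @ u

def gen_error(J, w, n=2000):
    """Monte Carlo estimate of the generalization error 0.5<(teacher - student)^2>."""
    X = rng.standard_normal((n, N))
    return 0.5 * np.mean((output(X, B, v) - output(X, J, w)) ** 2)

eg_init = gen_error(J, w)
for _ in range(20000):                 # one fresh random example per step (on-line)
    x = rng.standard_normal(N)
    a = J @ x / np.sqrt(N)             # student hidden fields
    delta = output(x[None], B, v)[0] - g(a) @ w   # output error on this example
    # Gradient descent on 0.5*delta**2 for both layers ("backpropagation of error"):
    w += eta * delta * g(a)                              # hidden-to-output update
    J += eta * np.outer(delta * w * dg(a), x) / np.sqrt(N)  # input-to-hidden update
eg_final = gen_error(J, w)
```

The fixed-second-layer comparison (the committee-machine case mentioned above) is obtained by simply omitting the update of `w`; with a small hidden layer one typically sees the generalization error fall, often after a plateau caused by near-symmetric hidden units.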
The influence of biases on the learning dynamics of a two-layer neural network, a normalized soft-co...
This paper presents some simple techniques to improve the backpropagation algorithm. Since learning ...
Abstract The multi-layer perceptron is a type of feed-forward neural network f...
We present an exact analysis of learning a rule by on-line gradient descent in a two-layered neural ...
Supervised Learning in Multi-Layered Neural Networks (MLNs) has been recently proposed through the w...
An adaptive back-propagation algorithm parameterized by an inverse temperature 1/T is studied and co...
We present an analytic solution to the problem of on-line gradient-descent learning for two-layer ne...
In this paper, the authors propose a new training algorithm which does not only rely upon the traini...
This report contains some remarks about the backpropagation method for neural net learning. We conce...
We study on-line gradient-descent learning in multilayer networks analytically and numerically. The ...
Artificial neural networks have, in recent years, been very successfully applied in a wide range of ...
Rumelhart, Hinton and Williams [Rumelhart et al. 86] describe a learning procedure for layered netwo...
Backpropagation (BP) is one of the most widely used algorithms for training feed-forward neural netw...
Introduction Backpropagation and contrastive Hebbian learning (CHL) are two supervised learning alg...
What follows extends some of our results of [1] on learning from examples in layered feed-forward n...