This study focuses on weight initialization in multi-layer feed-forward networks. The training data are analyzed and the notion of a critical point is introduced for determining the initial weights of the input-to-hidden-layer synaptic connections. The proposed method has been applied to artificial data. The experimental results show that the proposed method takes almost half the training time required by standard back-propagation.
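The abstract above does not spell out the critical-point rule, but the general idea of data-dependent initialization of the input-to-hidden weights can be sketched. The snippet below is only an illustrative assumption, not the paper's method: it combines a standard fan-in-scaled uniform range with a per-feature scale estimated from the training data, so that initial net inputs stay in the sigmoid's active region. The function name `init_input_hidden_weights` is hypothetical.

```python
import numpy as np

def init_input_hidden_weights(X, n_hidden, rng=None):
    """Return an (n_in, n_hidden) weight matrix for the input-to-hidden layer.

    Illustrative sketch only (not the paper's critical-point method):
    X is the (n_samples, n_in) training-input matrix, used to estimate
    each feature's spread so the initial weighted sums are small.
    """
    rng = np.random.default_rng(rng)
    n_in = X.shape[1]
    # Per-feature standard deviation of the training data (avoid div-by-zero).
    scale = X.std(axis=0, keepdims=True).T + 1e-8      # shape (n_in, 1)
    # Common fan-in heuristic for the base uniform range.
    limit = 1.0 / np.sqrt(n_in)
    W = rng.uniform(-limit, limit, size=(n_in, n_hidden))
    # Shrink weights on large-scale features so net inputs stay comparable.
    return W / scale

# Example: two input features with very different scales.
X = np.array([[0.0, 10.0], [1.0, 20.0], [2.0, 30.0]])
W = init_input_hidden_weights(X, n_hidden=4, rng=0)
```

Because each row of `W` is divided by its feature's standard deviation, features with a large numeric range receive proportionally smaller initial weights, which is one simple way to keep early back-propagation updates out of the sigmoid's flat saturation regions.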
Abstract- We propose a novel learning algorithm to train networks with multi-layer linear-threshold ...
A general method for building and training multilayer perceptrons composed of linear threshold units...
In this paper we describe several different training algorithms for feed forward neural networks(FFN...
This study highlights the subject of weight initialization in back-propagation feed-forward netw...
A method has been proposed for weight initialization in back-propagation feed-forward networks. Trai...
The back propagation algorithm caused a tremendous breakthrough in the application of multilayer per...
The training of multilayer perceptron is generally a difficult task. Excessive training times and la...
Rumelhart, Hinton and Williams [Rumelhart et al. 86] describe a learning procedure for layered netwo...
Artificial neural networks have, in recent years, been very successfully applied in a wide range of ...
In this paper, the authors propose a new training algorithm which does not only rely upon the traini...
Since the discovery of the back-propagation method, many modified and new algorithms have been propo...
A new methodology for neural learning is presented, whereby only a single iteration is required to t...
We present an analytic solution to the problem of on-line gradient-descent learning for two-layer ne...
The main problem for Supervised Multi-layer Neural Network (SMNN) models such as Back propagation net...