We present a global algorithm for training multilayer neural networks in this Letter. The algorithm controls the local fields induced in the neurons by the input samples through random adaptations of the synaptic weights. Unlike the backpropagation algorithm, the networks may have discrete-state weights and may use either differentiable or nondifferentiable neural transfer functions. As an example, a two-layer network is trained to separate a linearly inseparable set of samples into two categories, and its strong generalization capacity is demonstrated. The extension to more general cases is straightforward.
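The abstract gives no implementation details, so the following is only a minimal sketch of the idea it describes: a two-layer network with discrete ±1 weights and nondifferentiable sign activations, trained on a linearly inseparable task (XOR in ±1 encoding) by random adaptations of the synaptic weights. The accept-if-not-worse rule and the occasional random restart are assumptions for illustration, not the Letter's actual update rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR-like task in +/-1 encoding: the target is +1 iff the two inputs differ.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]])
T = np.array([-1, 1, 1, -1])

def forward(p, x):
    """Two-layer net with sign activations; every parameter is +/-1.

    Parameter layout: hidden weights (2x2), hidden biases (2),
    output weights (2), output bias (1) -- 9 parameters in total.
    """
    W1 = p[:4].reshape(2, 2)
    b1 = p[4:6]
    w2 = p[6:8]
    b2 = p[8]
    h = np.sign(x @ W1.T + b1)      # local fields of the hidden neurons
    return np.sign(h @ w2 + b2)    # local field of the output neuron

def n_errors(p):
    return int(np.sum(forward(p, X) != T))

p = rng.choice([-1, 1], size=9)    # discrete-state synaptic weights
err = n_errors(p)
for step in range(50000):
    if err == 0:
        break
    if step > 0 and step % 2000 == 0:
        # assumed safeguard: random restart if the walk stalls
        p = rng.choice([-1, 1], size=9)
        err = n_errors(p)
    q = p.copy()
    q[rng.integers(9)] *= -1        # random adaptation: flip one synapse
    e = n_errors(q)
    if e <= err:                    # keep the flip unless it worsens the error
        p, err = q, e

print(err)  # number of misclassified samples after training
```

Because the local fields are odd sums of ±1 terms, `np.sign` never sees a zero, so no gradient or differentiability assumption is needed anywhere in the update.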