This report presents Pscg, a new global optimization method for training multilayer perceptrons. Rather than settling for local minima, it seeks global minima of the error function. The method is hybrid in that it combines three very different optimization techniques: Random Line Search, Scaled Conjugate Gradient, and a one-dimensional minimization algorithm named P. The hybrid retains the best features of each component: the simplicity of Random Line Search, the efficiency of Scaled Conjugate Gradient, and the efficiency and convergence toward a global minimum of P. Pscg is empirically shown to perform better, or much better, than three other global random optimization methods and a global deterministic optimization method...
This paper presents some numerical experiments related to a new global "pseudo-backpropagation" algo...
We present deterministic nonmonotone learning strategies for multilayer perceptrons (MLPs), i.e., de...
The back-propagation algorithm is mainly used for multilayer perceptrons. This algorithm is, howeve...
In this paper we consider a possible improvement of conjugate gradient methods commonly used for tr...
A fast algorithm is proposed for optimal supervised learning in multiple-layer neural networks. The ...
Multilayer perceptrons (MLPs) (1) are the most common artificial neural networks employed in a large...
The proposed metaheuristic optimization algorithm based on the two-step Adams-Bashforth scheme (MOAB...
ABSTRACT A new fast training algorithm for the Multilayer Perceptron (MLP) is proposed. This new alg...
This thesis addresses the issue of applying a "globally" convergent optimization scheme to the train...
The Multi-Layer Perceptron (MLP) is one of the most widely applied and researched Artificial Neural ...
Several neural network architectures have been developed over the past several years. One of the mos...
In this paper we propose a Monte Carlo-based learning algorithm for multi-layer perceptron (MLP) whi...
This paper presents two compensation methods for multilayer perceptrons (MLPs) which are very diffic...
Training a multilayer perceptron (MLP) with algorithms employing global search strategies has been a...
The problem of finding the global minimum of multidimensional functions is often applied to a wide r...