Neural network learning algorithms based on conjugate gradient techniques and quasi-Newton techniques such as the Broyden, DFP, BFGS, and SSVM algorithms require exact or inexact line searches in order to satisfy their convergence criteria. Line searches are very costly and slow down the learning process. This paper presents new neural network learning algorithms based on Hoshino's weak line search technique and Davidon's optimally conditioned line-search-free technique. A practical method of using these optimization algorithms is also presented so that, for the most part, they avoid becoming trapped in local minima. The global minimization problem is a serious one when quadratically convergent techniques such as quasi-Newton...
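To make the cost of a line search concrete, here is a minimal sketch (not the paper's algorithm; the function names and the toy quadratic loss are assumptions) of one BFGS iteration, in which each trial step of the Armijo backtracking loop requires a full extra evaluation of the error function, which is exactly the overhead that line-search-free methods aim to remove:

    # Minimal sketch: one BFGS iteration with an optional Armijo backtracking
    # line search. A toy quadratic stands in for a network error function.
    import numpy as np

    def backtracking(f, x, p, g, alpha=1.0, rho=0.5, c=1e-4):
        # Each trial step costs one full loss evaluation.
        fx = f(x)
        while f(x + alpha * p) > fx + c * alpha * g.dot(p):
            alpha *= rho
        return alpha

    def bfgs_step(f, grad, x, H, use_line_search=True):
        g = grad(x)
        p = -H.dot(g)                      # quasi-Newton search direction
        a = backtracking(f, x, p, g) if use_line_search else 1e-2
        x_new = x + a * p
        s, y = x_new - x, grad(x_new) - g
        sy = s.dot(y)
        if sy > 1e-10:                     # curvature condition; else skip update
            rho_k = 1.0 / sy
            I = np.eye(len(x))
            H = (I - rho_k * np.outer(s, y)) @ H @ (I - rho_k * np.outer(y, s)) \
                + rho_k * np.outer(s, s)
        return x_new, H

    A = np.array([[3.0, 0.5], [0.5, 1.0]])  # toy convex quadratic loss
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    x, H = np.array([2.0, -1.5]), np.eye(2)
    for _ in range(20):
        x, H = bfgs_step(f, grad, x, H)
    print(f(x))                            # approaches 0

With use_line_search=False the per-iteration cost drops to a single gradient evaluation, but convergence is no longer guaranteed without safeguards of the kind the paper's line-search-free techniques provide.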
This paper develops a neural network (NN) using the conjugate gradient (CG) method. The modification of this me...
This paper proposes a zero-order method of nonlinear optimization using back-propagation...
Work in machine learning has grown tremendously in recent years, but has had little to no impact o...
In the modern digital economy, optimal decision support systems, as well as machine learning systems...
In this paper the problem of neural network training is formulated as the unconstrained minimization...
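As a sketch of the formulation this entry refers to (notation assumed here, not taken from the paper), training is posed as the unconstrained minimization of a sum-of-squares error over the weight vector:

    \min_{w \in \mathbb{R}^n} E(w), \qquad
    E(w) = \frac{1}{2} \sum_{p=1}^{P} \bigl\| t^{(p)} - f\bigl(x^{(p)}; w\bigr) \bigr\|^2,

where f(x; w) is the network map and (x^{(p)}, t^{(p)}), p = 1, ..., P, are the training pairs; any unconstrained method (gradient descent, CG, quasi-Newton) can then be applied to E.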
Constrained optimization problems entail the minimization or maximization of a linear or quadratic o...
Recent years have seen the proposal of several different gradient-based optimization methods for tra...
A supervised learning algorithm (Scaled Conjugate Gradient, SCG) with superlinear convergence rate i...
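For context, the step-size rule at the heart of SCG (following Møller's standard formulation; the symbols below are assumed rather than quoted from this entry) replaces the line search with a finite-difference curvature estimate along the search direction p_k, damped by a Levenberg-Marquardt term λ_k:

    s_k = \frac{E'(w_k + \sigma_k p_k) - E'(w_k)}{\sigma_k} + \lambda_k p_k,
    \qquad \delta_k = p_k^{\top} s_k,
    \qquad \alpha_k = \frac{-p_k^{\top} E'(w_k)}{\delta_k}.

Two extra gradient evaluations per iteration thus take the place of the repeated loss evaluations of an exact or inexact line search; if δ_k ≤ 0, λ_k is increased until the curvature estimate becomes positive.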
Neural networks (NNs) have seen a surge in popularity due to their unprecedented practical success i...
This paper presents a novel quasi-Newton method for the minimization of the error function of a feed-...
Interest in algorithms which dynamically construct neural networks has been growing in recent years....
In this dissertation the problem of training feedforward artificial neural networks and its a...
The chapter presents a novel neural learning methodology by using different combination strategies f...
The important feature of this work is the combination of minimizing a function with desirable proper...
This paper presents a novel quasi-Newton method for the minimization of the error function...