Using a single floating-point number for the line search in Newton's method may be inadequate. A column vector of the same size as the gradient could serve better than a single scalar, accelerating each gradient element at its own rate. Moreover, a square matrix of the same order as the Hessian might help to correct the Hessian itself. Chiang applied something between a column vector and a square matrix, namely a diagonal matrix, to accelerate the gradient, and further proposed a faster gradient variant called the quadratic gradient. In this paper, we present a new way to build the quadratic gradient. This new quadratic gradient does not satisfy the convergence conditions of the fixed H...
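To make the idea concrete, here is a minimal Python sketch of a quadratic-gradient-style update. The diagonal construction B_ii = eps + sum_j |H_ij| and the function name are illustrative assumptions, not necessarily the exact construction proposed in the paper:

```python
import numpy as np

def quadratic_gradient_step(x, grad, hess, gamma=1.0, eps=1e-8):
    # Assumed diagonal preconditioner: one positive entry per coordinate,
    # built from absolute Hessian row sums (a diagonally-dominant choice).
    B = eps + np.abs(hess).sum(axis=1)
    # Each gradient element is accelerated at its own rate 1/B[i],
    # instead of using one scalar learning rate for all coordinates.
    return x - (1.0 + gamma) * grad / B
```

On a quadratic test function f(x) = 0.5 * x.T @ A @ x, one would pass grad = A @ x and hess = A, and the update reduces to per-coordinate scaled gradient descent.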
This paper develops a modified quasi-Newton method for structured unconstrained optimization with pa...
Computing the gradient of a function provides fundamental information about its behavior. This infor...
Finding roots of equations is at the heart of most computational science. A well-known and widely us...
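The well-known method this truncated abstract presumably refers to is Newton's iteration for root finding; a minimal self-contained sketch:

```python
def newton_root(f, fprime, x0, tol=1e-10, max_iter=50):
    """Newton iteration x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x -= fx / fprime(x)
    return x

# Example: the positive root of x^2 - 2 = 0.
root = newton_root(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
# root is approximately 1.4142135623730951
```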
Batch gradient descent, $\Delta w(t) = -\eta \, dE/dw(t)$, converges to a minimum of quadratic form with a time ...
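To illustrate the stated update rule on a quadratic form E(w) = 0.5 * w.T @ A @ w - b.T @ w (my choice of test problem, not necessarily the paper's):

```python
import numpy as np

def batch_gd_quadratic(A, b, w0, eta, steps):
    # Gradient of E(w) = 0.5 * w^T A w - b^T w is A w - b,
    # so the update is w <- w - eta * (A w - b).
    w = np.asarray(w0, dtype=float).copy()
    for _ in range(steps):
        w -= eta * (A @ w - b)
    return w
```

For convergence the step size must satisfy eta < 2 / lambda_max(A), and the convergence time constant is governed by the condition number lambda_max / lambda_min, which is the slowness the abstract alludes to.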
Training algorithms for Multilayer Perceptrons optimize the set of W weights and biase...
An algorithm was recently presented that minimizes a nonlinear function in several variables...
A bound on the possible deterioration in the condition number of the inverse Hessian approxi...
It is well known that the minimization of a smooth function f(x) is equivalent to minimizing its gr...
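The truncated claim presumably refers to the stationarity link between minimizing f and driving its gradient to zero; in LaTeX, hedging that the equivalence is exact only under convexity:

```latex
\min_x f(x) \quad\Longleftrightarrow\quad \nabla f(x^\ast) = 0
\quad (\text{for convex } f),
\qquad\text{so one may instead minimize } \tfrac{1}{2}\,\|\nabla f(x)\|_2^2 .
```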
The seminal paper by Barzilai and Borwein (1988) has given rise to an extensive investigation, leadi...
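For context, Barzilai and Borwein (1988) proposed gradient steps whose length is chosen from a secant condition; a minimal sketch using the BB1 choice alpha = s.T s / s.T y (the fallback step size is my own safeguard, not part of the original method):

```python
import numpy as np

def bb_gradient_descent(grad, x0, alpha0=1e-3, tol=1e-8, max_iter=1000):
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad(x_prev)
    x = x_prev - alpha0 * g_prev          # plain step to seed the history
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        s, y = x - x_prev, g - g_prev     # iterate and gradient differences
        sy = s @ y
        alpha = (s @ s) / sy if sy > 0 else alpha0   # BB1 step length
        x_prev, g_prev = x, g
        x = x - alpha * g
    return x
```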
We consider a variant of the inexact Newton method, called Newton-MR, in which the least-squares sub-pro...
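As I understand Newton-MR, the exact Newton system is replaced by a least-squares subproblem solved inexactly with MINRES; schematically:

```latex
p_k \;\in\; \arg\min_{p}\; \bigl\| H_k\, p + \nabla f(x_k) \bigr\|_2,
\qquad x_{k+1} = x_k + \alpha_k\, p_k .
```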
The secant equation, which underlies all standard ‘quasi-Newton’ minimisation methods, arise...
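For reference, the secant equation imposed on the next Hessian approximation B_{k+1} by standard quasi-Newton updates is:

```latex
B_{k+1}\, s_k = y_k,
\qquad s_k = x_{k+1} - x_k,
\qquad y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
```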
We propose a new gradient method for quadratic programming, named SDC, which alternates some steepes...
There are several benefits of taking the Hessian of the objective function into account when designi...
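The canonical way an optimizer uses the Hessian is the Newton step, which rescales the gradient by local curvature:

```latex
x_{k+1} = x_k - \bigl[\nabla^2 f(x_k)\bigr]^{-1} \nabla f(x_k).
```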
In this paper, we propose some improvements on a new gradient-type method for solving large-...