A gradient-secant algorithm for unconstrained optimization problems is presented. The algorithm performs Armijo gradient iterations until it reaches a region where Newton's method is more efficient, then switches to a secant mode of operation. It is concluded that an efficient method for unconstrained minimization has been developed, and that any convergent minimization method can be substituted for the Armijo gradient method.
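The two-phase scheme this abstract describes can be illustrated with a minimal one-dimensional sketch: Armijo-backtracking gradient steps globalize the search, and once the gradient is small the iteration switches to secant updates on f'(x) = 0, which converge superlinearly near the minimizer. All names, tolerances, and the switching test below are illustrative assumptions, not the paper's actual algorithm.

```python
def gradient_secant_minimize(f, df, x0, switch_tol=1e-2, tol=1e-10, max_iter=200):
    """Hedged sketch of a gradient-secant hybrid for minimizing a smooth
    one-dimensional f: Armijo gradient phase, then a secant phase on df."""
    x = x0
    # Phase 1: Armijo gradient descent until near a stationary point.
    for _ in range(max_iter):
        g = df(x)
        if abs(g) < switch_tol:  # assumed switching test: gradient is small
            break
        step, beta, sigma = 1.0, 0.5, 1e-4
        # Armijo backtracking: shrink the step until the sufficient-decrease
        # condition f(x - t*g) <= f(x) - sigma * t * g**2 holds.
        while f(x - step * g) > f(x) - sigma * step * g * g:
            step *= beta
        x = x - step * g
    # Phase 2: secant iterations on df(x) = 0 near the minimizer.
    x_prev, g_prev = x - 1e-4, df(x - 1e-4)
    for _ in range(max_iter):
        g = df(x)
        if abs(g) < tol or g == g_prev:
            break
        x, x_prev, g_prev = x - g * (x - x_prev) / (g - g_prev), x, g
    return x

# Example: minimize f(x) = (x - 2)**2 + 1, whose minimizer is x = 2.
x_star = gradient_secant_minimize(lambda x: (x - 2) ** 2 + 1,
                                  lambda x: 2 * (x - 2), x0=10.0)
```

Per the abstract's closing remark, the Armijo phase is interchangeable: any convergent descent scheme could drive the iterate into the region where the secant phase takes over.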
The secant method is a very effective numerical procedure used for solving nonlinear equations of th...
A theory of inexact Newton methods with secant preconditioners for solving large nonlinear systems o...
AbstractA new algorithm for unconstrained optimization is presented which is based on a modified one...
In this work some interesting relations between results on basic optimization and algorithms for non...
AbstractThe secant equation, which underlies all standard ‘quasi-Newton’ minimisation methods, arise...
AbstractThis paper presents a family of improved secant algorithms via two preconditional curvilinea...
Finding the unconstrained minimizer of a function of more than one variable is an important problem ...
In this paper we present a new algorithm of steepest descent type. A new technique for steplength co...
AbstractFollowing the approach proposed by Dai and Liao, we introduce two nonlinear conjugate gradie...
AbstractSome modifications of the secant method for solving nonlinear equations are revisited and th...
This paper includes a twofold result for the Nonlinear Conjugate Gradient (NCG) method, in large sca...
The limited memory steepest descent method (Fletcher, 2012) for unconstrained optimization problems ...
AbstractIn this paper the development, convergence theory and numerical testing of a class of gradie...
A new accelerated gradient method for finding the minimum of a function f(x) whose variables are unc...
This paper presents a diagonal-secant modification of the successive element correction method, a fi...