New algorithms for solving unconstrained optimization problems are presented, based on the idea of combining two types of descent directions: the anti-gradient direction and either the Newton or a quasi-Newton direction. The use of the latter directions improves the convergence rate. Global and superlinear convergence properties of these algorithms are established. Numerical experiments on standard unconstrained test problems are reported, and the proposed algorithms are compared with similar existing methods. This comparison demonstrates the efficiency of the proposed combined methods.
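The combined-direction idea described above can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the switching rule (fall back to the anti-gradient when the Newton step is unavailable or is not a descent direction) and the Armijo backtracking line search are generic assumptions, and the function names are hypothetical.

```python
import numpy as np

def combined_descent(f, grad, hess, x0, tol=1e-8, max_iter=200):
    """Minimize f by combining anti-gradient and Newton directions.

    Generic sketch: at each step the Newton direction is tried first
    and the anti-gradient is used as a fallback whenever the Newton
    system cannot be solved or its solution is not a descent direction.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g  # anti-gradient: always a descent direction
        try:
            dn = np.linalg.solve(hess(x), -g)  # Newton direction
            # accept the Newton step only if it is a descent direction
            if dn @ g < -1e-12 * np.linalg.norm(dn) * np.linalg.norm(g):
                d = dn
        except np.linalg.LinAlgError:
            pass  # singular Hessian: keep the anti-gradient
        # backtracking (Armijo) line search
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

# Example: the Rosenbrock function, minimized at (1, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
hess = lambda x: np.array([
    [2 - 400 * (x[1] - 3 * x[0]**2), -400 * x[0]],
    [-400 * x[0], 200.0],
])
x_star = combined_descent(f, grad, hess, [-1.2, 1.0])
```

The fallback to the anti-gradient is what gives such hybrids their global convergence, while the Newton steps taken near the solution account for the superlinear rate.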
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization proble...
A tolerant derivative-free nonmonotone line-search technique is proposed and analyzed. Several conse...
In this thesis we propose new iteratively constructed preconditioners, to be paired with Conjugate G...
We begin by developing a line search method for unconstrained optimization which can be regarded as ...
Many methods for solving minimization problems are variants of Newton's method, which requires the spe...
In this paper, we investigate quasi-Newton methods for solving unconstrained optimization problems. ...
In this thesis, we are mainly concerned with finding the numerical solution of nonlinear unconstrain...
Peer reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/77127/1/AIAA-1998-4751-286.pd
Conjugate gradient methods are widely used for unconstrained optimization, especially large scale p...
In this paper we present a new search direction known as the CG-BFGS method, which uses the search d...
In this paper, based on a new quasi-Newton equation and the conjugacy condition, we propose...
In this paper, we define an unconstrained optimization algorithm employing only first-order derivati...
In this paper, a new conjugate gradient method is proposed for large-scale unconstrained o...
Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are pr...
We propose in this paper novel global descent methods for unconstrained global optimization problems...