Conjugate gradient methods are conjugate direction or gradient deflection methods which lie somewhere between the method of steepest descent and Newton's method. Their principal advantage is that they do not require the storage of any matrices, as Newton's method and quasi-Newton methods do, and they are designed to converge faster than the method of steepest descent. Unlike quasi-Newton or variable-metric methods, these are fixed-metric methods in which the search direction at each iteration is based on an approximation to the inverse Hessian constructed by updating a fixed, symmetric, positive definite matrix, typically the identity matrix. The resulting approximation is usually not symmetric, although some variants force sym...
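As an illustration of the matrix-free character described above, here is a minimal sketch of a nonlinear conjugate gradient iteration with the Fletcher–Reeves parameter and a simple backtracking line search. This is a generic textbook-style sketch, not the specific method of any paper quoted here; the function names, tolerances, and line-search constants are illustrative assumptions.

```python
import numpy as np

def nonlinear_cg_fr(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimal Fletcher-Reeves nonlinear CG sketch (illustrative only).
    Only vectors are stored: no Hessian, no quasi-Newton matrix."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first step is steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search, standing in for the exact or
        # Wolfe line searches assumed by the convergence theory.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves parameter
        d = -g_new + beta * d                # deflected gradient direction
        x, g = x_new, g_new
    return x

# Example: minimize the quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = nonlinear_cg_fr(lambda x: 0.5 * x @ A @ x - b @ x,
                         lambda x: A @ x - b,
                         x0=np.zeros(2))
```

Only the current iterate, gradient, and direction vectors are kept between iterations, which is what distinguishes this family from Newton-type methods that carry a full (approximate) Hessian.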
The main subject of the research in this thesis is the study of conjugate gradient methods for optim...
In this paper we develop a new class of conjugate gradient methods for unconstrained optimiz...
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization proble...
Conjugate gradient methods are effective in solving linear equations and solving non-linear optimiza...
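Since this snippet also mentions linear equations, the classical linear conjugate gradient iteration for a symmetric positive definite system Ax = b is sketched below; this is the standard textbook algorithm, included only for reference.

```python
import numpy as np

def linear_cg(A, b, x0=None, tol=1e-10, max_iter=None):
    """Textbook conjugate gradient for A x = b with A symmetric positive
    definite (illustrative sketch)."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x          # residual = negative gradient of 0.5 x^T A x - b^T x
    d = r.copy()
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ad = A @ d
        alpha = rs_old / (d @ Ad)          # exact line search along d
        x += alpha * d
        r -= alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs_old) * d      # A-conjugate update of the direction
        rs_old = rs_new
    return x

# Usage on a small SPD system; the result is close to np.linalg.solve(A, b)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(linear_cg(A, b))
```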
A new conjugate gradient method (the LS method) that takes into account the effect of inexact line s...
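The LS method is commonly attributed to Liu and Storey. In the usual notation, with gradient g_k, search direction d_k, and y_{k-1} = g_k - g_{k-1}, its parameter is typically stated as follows (taken from the standard literature, since the truncated abstract does not show the formula):

\[
\beta_k^{LS} = \frac{g_k^{\top} y_{k-1}}{-\,d_{k-1}^{\top} g_{k-1}}, \qquad
d_k = -g_k + \beta_k^{LS}\, d_{k-1}.
\]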
In this paper, based on a new quasi-Newton equation and the conjugacy condition, we propose...
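For orientation, the conjugacy condition in its classical form requires consecutive directions to be conjugate with respect to the gradient change, and the Dai–Liao extension relaxes it with a parameter t >= 0 to account for inexact line searches; which quasi-Newton equation this particular paper combines with it is not visible in the truncated abstract.

\[
d_{k+1}^{\top} y_k = 0 \quad \text{(classical)}, \qquad
d_{k+1}^{\top} y_k = -t\, s_k^{\top} g_{k+1} \quad \text{(Dai--Liao)},
\]

with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.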
The quasi-Newton method is a well-known and effective method for solving optimization problems. Sinc...
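For context, the classical quasi-Newton framework updates an inverse-Hessian approximation H_k so that it satisfies the secant (quasi-Newton) equation, the BFGS update being the standard choice:

\[
H_{k+1} y_k = s_k, \qquad
H_{k+1} = \Bigl(I - \frac{s_k y_k^{\top}}{y_k^{\top} s_k}\Bigr) H_k \Bigl(I - \frac{y_k s_k^{\top}}{y_k^{\top} s_k}\Bigr) + \frac{s_k s_k^{\top}}{y_k^{\top} s_k},
\qquad d_{k+1} = -H_{k+1} g_{k+1}.
\]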
The conjugate gradient method provides a very powerful tool for solving unconstrained optimization p...
In this paper we present a new line search method known as the HBFGS method, which uses the search d...
A new hybrid quasi-Newton search direction is proposed. It uses the update formula of Broyden–F...
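Both of the preceding snippets build their directions on the standard BFGS machinery. As a point of reference, a plain (non-hybrid) BFGS direction computation looks like the sketch below; the hybridization rules of the quoted papers are omitted because the abstracts are truncated.

```python
import numpy as np

def bfgs_direction(H, g):
    """Search direction d = -H g from the current inverse-Hessian
    approximation H (standard BFGS building block, not the hybrid rule
    of the quoted papers)."""
    return -H @ g

def bfgs_update(H, s, y):
    """Standard BFGS update of the inverse-Hessian approximation, applied
    only when the curvature condition y^T s > 0 holds."""
    ys = y @ s
    if ys <= 1e-12:
        return H                 # skip the update to keep H positive definite
    rho = 1.0 / ys
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)
```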
This paper reports two proposals of possible preconditioners for the Nonlinear Conjugate Gradient (N...
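In preconditioned nonlinear CG, the gradient is rescaled by a symmetric positive definite preconditioner M_{k+1} (typically an approximation of the Hessian) before the deflection step. In generic form, since the specific preconditioners proposed in the paper are not visible in the truncated abstract:

\[
d_{k+1} = -M_{k+1}^{-1} g_{k+1} + \beta_k d_k,
\]

where \beta_k is one of the usual conjugate gradient parameters evaluated with the preconditioned gradients.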
In unconstrained optimization algorithms, we employ the memoryless quasi-Newton procedure to constru...
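A memoryless quasi-Newton (memoryless BFGS) direction is obtained by applying the BFGS update to the identity matrix at every iteration and discarding the result afterwards, so no matrix is ever stored. With s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, the standard form reads:

\[
d_{k+1} = -g_{k+1}
+ \frac{(g_{k+1}^{\top} s_k)\, y_k + (g_{k+1}^{\top} y_k)\, s_k}{y_k^{\top} s_k}
- \Bigl(1 + \frac{y_k^{\top} y_k}{y_k^{\top} s_k}\Bigr) \frac{g_{k+1}^{\top} s_k}{y_k^{\top} s_k}\, s_k.
\]

Whether the quoted paper uses exactly this form or a scaled variant is not visible in the truncated abstract.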
In this paper, a new gradient-related algorithm for solving large-scale unconstrained optimi...