In this paper, a class of non-quasi-Newton methods is presented on the basis of the DFP method. Under certain conditions, the global convergence of these methods with the Goldstein line search is proved for uniformly convex objective functions.
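For context, the sketch below illustrates the classical DFP inverse-Hessian update combined with a Goldstein line search; it is a minimal illustration, not the specific non-quasi-Newton family analyzed in the paper, and the function names (`dfp`, `goldstein_step`), the parameter choices, and the quadratic test problem are assumptions introduced only for this example.

```python
import numpy as np

def goldstein_step(f, g, x, d, c=0.25, alpha=1.0, max_iter=50):
    """Find a step length satisfying the Goldstein conditions
    f(x + a d) <= f(x) + c a g'd   and   f(x + a d) >= f(x) + (1 - c) a g'd,
    with 0 < c < 1/2, by simple bracketing/bisection."""
    fx, gd = f(x), g(x) @ d          # gd < 0 for a descent direction d
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        fa = f(x + alpha * d)
        if fa > fx + c * alpha * gd:           # insufficient decrease: step too long
            hi = alpha
        elif fa < fx + (1 - c) * alpha * gd:   # below the lower Goldstein line: step too short
            lo = alpha
        else:
            return alpha
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * alpha
    return alpha

def dfp(f, g, x0, tol=1e-8, max_iter=200):
    """Classical DFP quasi-Newton iteration with a Goldstein line search."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)               # inverse-Hessian approximation
    for _ in range(max_iter):
        grad = g(x)
        if np.linalg.norm(grad) < tol:
            break
        d = -H @ grad                # quasi-Newton search direction
        alpha = goldstein_step(f, g, x, d)
        s = alpha * d                # displacement s_k
        y = g(x + s) - grad          # gradient change y_k
        if s @ y > 1e-12:            # skip update if curvature condition fails
            Hy = H @ y
            H = H + np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)
        x = x + s
    return x

# Hypothetical usage on a uniformly convex quadratic:
A = np.array([[3.0, 1.0], [1.0, 2.0]])
xmin = dfp(lambda x: 0.5 * x @ A @ x, lambda x: A @ x, x0=[1.0, 1.0])
```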