Abstract. In this paper, we discuss the convergence of the DFP algorithm with a revised search direction. Under some inexact line searches, we prove that the algorithm is globally convergent for continuously differentiable functions and that its rate of convergence is one-step superlinear and n-step second order for uniformly convex objective functions. From the proof, we also obtain the superlinear and n-step second-order convergence of the DFP algorithm for uniformly convex objective functions.
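For orientation: the classical DFP method maintains an inverse-Hessian approximation H_k, takes the search direction d_k = -H_k g_k, and after a line-search step applies a rank-two update to H_k. The abstract's "revised search direction" is not spelled out here, so the sketch below shows only the textbook DFP recursion; the function name and NumPy implementation are my own.

```python
import numpy as np

def dfp_update(H, s, y):
    """Classical DFP update of the inverse-Hessian approximation H.

    s = x_{k+1} - x_k (the step), y = g_{k+1} - g_k (the gradient change).
    The update keeps H symmetric positive definite whenever the curvature
    condition s^T y > 0 holds, which a Wolfe-type inexact line search enforces.
    """
    Hy = H @ y
    return H + np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)
```

The search direction for the next iteration is then d_k = -H_k g_k; the revised direction studied in the paper is a modification of this, not detailed in the abstract.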
Abstract. In this paper, we propose a modification of the BFGS method for unconstrained optimization. ...
Direct-search algorithms form one of the main classes of algorithms for smooth unconstrained derivat...
In this paper, we study greedy variants of quasi-Newton methods. They are based on the updating foru...
Global Convergence of a Class of Collinear Scaling Algorithms with Inexact Line Searches on Convex F...
In this paper, on the basis of the DFP method, a class of non-quasi-Newton methods is presented. Unde...
Abstract. We investigate the convergence properties of the restricted Broyden class of variable metric metho...
We begin by developing a line search method for unconstrained optimization which can be regarded as ...
Based on a result by Taylor et al. (J Optim Theory Appl 178(2):455–476, 2018) on the attainable conv...
Abstract. Techniques for obtaining safely positive definite Hessian approximations with self-scaling...
Abstract. In this paper, a new nonmonotone MBFGS algorithm for unconstrained optimization will be prop...
The aim of this paper is to deepen the convergence analysis of the scaled gradient projection (SGP) ...
Abstract. A lucid description of the variable metric (DFP) method due to Davidon (1959), Fletcher and...
Standard global convergence proofs are examined to determine why some algorithms perform better than...
A tolerant derivative-free nonmonotone line-search technique is proposed and analyzed. Several conse...
In this thesis we develop a unified theory for establishing the local and q-superlinear convergence ...