We study the convergence properties of a class of low-memory methods for solving large-scale unconstrained problems. This class belongs to the quasi-Newton family, except that the approximation to the Hessian is, at each step, updated by means of a diagonal matrix. Using appropriate scaling, we show that the methods can be implemented so as to be globally and \(R\)-linearly convergent with standard inexact line searches. Preliminary numerical results suggest that the methods are a good alternative to other low-memory methods such as the CG and spectral gradient methods.
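To illustrate the general idea of diagonal quasi-Newton updating, the sketch below shows one well-known diagonal update that satisfies the weak secant equation \(s_k^T B_{k+1} s_k = s_k^T y_k\). This is a generic sketch, not the paper's exact scheme; the function name and safeguard threshold are illustrative, and the paper's scaling may differ.

```python
import numpy as np

def diagonal_qn_update(D, s, y):
    """One diagonal quasi-Newton update satisfying the weak secant
    equation s^T D_new s = s^T y, where D stores the diagonal of the
    Hessian approximation (a sketch; not the paper's exact scheme).

    D : current diagonal Hessian approximation (1-D array)
    s : step, x_{k+1} - x_k
    y : gradient change, g_{k+1} - g_k
    """
    s2 = s * s
    denom = np.dot(s2, s2)  # sum_i s_i^4
    if denom == 0.0:
        return D  # zero step: leave the approximation unchanged
    # Choose mu so that sum_i D_new_i s_i^2 = s^T y exactly.
    mu = (np.dot(s, y) - np.dot(s2, D)) / denom
    D_new = D + mu * s2
    # Safeguard (illustrative): keep the diagonal positive definite.
    return np.where(D_new > 1e-10, D_new, D)
```

Because only a vector of diagonal entries is stored and updated, the cost per iteration is O(n) in both memory and arithmetic, which is what makes such methods competitive with CG and spectral gradient methods on large problems.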
The main focus of this paper is to derive new diagonal updating scheme via the direct weak secant eq...
One of the well-known methods in solving large scale unconstrained optimization is limited memory qu...
Subspace quasi-Newton (SQN) method has been widely used in large scale unconstrained optimization pr...
In this work, we present a new class of diagonal quasi-Newton methods for solving large-scale uncons...
We study the numerical performance of a limited memory quasi-Newton method for large scale optimizat...
In this paper a new class of quasi-Newton methods, named LQN, is introduced in order to solve unconst...
One of the widely used methods for solving a nonlinear system of equations is the quasi-Newton metho...
Recently, subspace quasi-Newton (SQN) method has been widely used in solving large scale unconstrain...
Abstract. The use of the L-BFGS method is very efficient for the resolution of large scale optimizat...
Abstract: "We propose a quasi-Newton algorithm for solving large optimization problems with nonlinea...
In this paper, we propose an improved multi-step diagonal updating method for large scale unconstrai...
In this paper a new class of quasi-Newton methods, named LQN, is introduced in order to solve uncons...
In this paper a new class of quasi-Newton methods, named LQN, is introduced in order to solve uncons...
An optimization algorithm for minimizing a smooth function over a convex set is described. Each ite...
This paper concerns the memoryless quasi-Newton method, that is precisely the quasi-Newton method fo...