The limited memory steepest descent (LMSD) method (Fletcher, 2012) for unconstrained optimization problems stores a few past gradients to compute multiple stepsizes at once. We review this method and propose new variants. For strictly convex quadratic objective functions, we study the numerical behavior of different techniques to compute new stepsizes. In particular, we introduce a method to improve the use of harmonic Ritz values. We also show the existence of a secant condition associated with LMSD, where the approximating Hessian is projected onto a low-dimensional space. In the general nonlinear case, we propose two new alternatives to Fletcher's method: first, the addition of symmetry constraints to the secant condition valid for the quadratic...
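To make the quadratic case concrete, the sketch below shows how a batch of stepsizes can be obtained from Ritz values of the Hessian restricted to the span of a few stored gradients. It is a minimal illustration under simplifying assumptions, not Fletcher's implementation: it assumes explicit access to the Hessian A, whereas LMSD recovers the same small eigenvalue problem from inner products of the stored gradients alone; the function name and parameters are ours.

```python
import numpy as np

def lmsd_quadratic(A, b, x0, m=5, sweeps=30, tol=1e-10):
    """Minimal LMSD-style sketch for f(x) = 0.5 x'Ax - b'x with A symmetric
    positive definite. One 'sweep' = take one gradient step per stored
    stepsize, then refresh the stepsizes from Ritz values."""
    x = x0.copy()
    g = A @ x - b
    G = []                                 # up to m most recent gradients
    alphas = [1.0 / np.linalg.norm(g)]     # crude starting stepsize
    for _ in range(sweeps):
        for alpha in alphas:               # steepest descent steps
            x -= alpha * g
            g = A @ x - b
            G = (G + [g])[-m:]             # sliding window of m gradients
        if np.linalg.norm(g) < tol:
            break
        # Ritz values of A on span(G): eigenvalues of the projected Hessian.
        Q, _ = np.linalg.qr(np.column_stack(G))
        theta = np.linalg.eigvalsh(Q.T @ A @ Q)
        # Reciprocals of the Ritz values give the next batch of stepsizes;
        # largest Ritz value first, i.e., smallest stepsize first.
        alphas = sorted(1.0 / theta)
    return x
```

In Fletcher's method the small projected matrix is assembled from the QR factors of the gradient matrix, so the Hessian is never formed explicitly; harmonic Ritz values, obtained from a closely related small eigenproblem, yield an alternative set of stepsizes.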
This paper studies Newton-type methods for minimization of partly smooth conve...
A very simple gradient-only algorithm for unconstrained minimization is proposed that, in te...
This paper deals with gradient methods for minimizing n-dimensional strictly convex quadratic functi...
Gradient Projection (GP) methods are a very popular tool to address box-constrained quadratic proble...
In this paper, we propose an interior-point method for linearly constrained optimization problems (p...
It is well known that the minimization of a smooth function f(x) is equivalent to minimizing its gr...
The seminal paper by Barzilai and Borwein (1988) has given rise to an extensive investigation, leadi...
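For reference, the two stepsizes introduced by Barzilai and Borwein (1988) and discussed throughout this literature are simple to state; the sketch below uses the standard notation s = x_k - x_{k-1} and y = g_k - g_{k-1} (the function name is ours).

```python
import numpy as np

def bb_stepsizes(s, y):
    """The two Barzilai-Borwein stepsizes from the iterate difference s
    and the gradient difference y of the last two iterations."""
    bb1 = (s @ s) / (s @ y)   # "long" BB stepsize
    bb2 = (s @ y) / (y @ y)   # "short" BB stepsize
    return bb1, bb2
```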
Steplength selection is a crucial issue for the effectiveness of stochastic gradient methods...
A gradient-secant algorithm for unconstrained optimization problems is presented. The algorithm uses...
In this paper the development, convergence theory and numerical testing of a class of gradie...
The possibilities inherent in steepest descent methods have been considerably amplified by the intro...
In this paper, we propose an interior-point method for linearly constrained-an...
This paper presents a family of improved secant algorithms via two preconditional curvilinea...
Gradient projection methods represent effective tools for solving large-scale constrained optimizat...