The possibilities inherent in steepest descent methods have been considerably amplified by the introduction of the Barzilai-Borwein choice of step-size, and other related ideas. These methods have proved competitive with conjugate gradient methods for large-scale unconstrained minimization problems. This paper suggests a method that takes advantage of a few additional 'long' vectors of storage to achieve a significant improvement in performance, for both quadratic and non-quadratic objective functions. It makes use of certain Ritz values related to the Lanczos process (Lanczos, J Res Nat Bur Stand 45:255-282, 1950). Some underlying theory is provided, and numerical evidence is s...
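The Barzilai-Borwein idea mentioned above can be illustrated with a minimal sketch: a gradient method in which the step size is chosen from the previous step and gradient differences rather than by line search. The function and parameter names below are illustrative, not taken from any of the cited papers.

```python
import numpy as np

def bb_gradient_descent(grad, x0, n_iter=100, alpha0=1e-3):
    """Gradient descent with the Barzilai-Borwein (BB1) step size
    alpha_k = (s^T s) / (s^T y), where s = x_k - x_{k-1} and
    y = g_k - g_{k-1}. A sketch, assuming a smooth objective."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0  # initial step before any BB information exists
    for _ in range(n_iter):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s = x_new - x          # iterate difference
        y = g_new - g          # gradient difference
        sy = s @ y
        if sy > 1e-12:         # guard against division by (near) zero
            alpha = (s @ s) / sy
        x, g = x_new, g_new
    return x

# Quadratic example: f(x) = 0.5 x^T A x - b^T x, so grad f(x) = A x - b
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
x_star = bb_gradient_descent(lambda x: A @ x - b, np.zeros(3))
```

For quadratics the BB step equals a Rayleigh quotient of the Hessian, which is why the convergence behavior is often nonmonotone yet much faster than classical steepest descent.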
This paper studies recent modifications of the limited memory BFGS (L-BFGS) method for solving large s...
The steplength selection is a crucial issue for the effectiveness of the stochastic gradient methods...
Simple modifications of the limited-memory BFGS method (L-BFGS) for large scale unconstraine...
We study the numerical performance of a limited memory quasi-Newton method for large scale optimizat...
The steepest descent method is a simple gradient method for optimization. This method has a slow converg...
In this paper we present two new numerical methods for unconstrained large-scale optimization. These...
The limited memory steepest descent method (Fletcher, 2012) for unconstrained optimization problems ...
A preconditioned steepest descent (SD) method for solving very large (with dimensions up to 10^6) un...
This paper aims to extend a certain damped technique, suitable for the Broyden–Fletcher–Goldfarb...
We present some extensions to the limited memory steepest descent method based...
Gradient Projection (GP) methods are a very popular tool to address box-constrained quadratic proble...
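The gradient projection approach for box-constrained quadratic problems can be sketched as follows: take a gradient step on the quadratic objective, then project the iterate back onto the box by componentwise clipping. This is a generic fixed-step sketch, not the specific algorithm of the paper above; the names and the 1/L step choice are assumptions.

```python
import numpy as np

def projected_gradient_box(A, b, lo, hi, n_iter=500):
    """Minimize 0.5 x^T A x - b^T x subject to lo <= x <= hi
    with a fixed-step projected gradient method. The step 1/L uses
    the largest eigenvalue L of A as a Lipschitz constant."""
    x = np.clip(np.zeros_like(b), lo, hi)   # feasible starting point
    L = np.linalg.eigvalsh(A).max()
    for _ in range(n_iter):
        g = A @ x - b                        # gradient of the quadratic
        x = np.clip(x - g / L, lo, hi)       # gradient step + box projection
    return x

# Example: unconstrained minimizer [5, -5] is clipped to the box [-1, 1]^2
A = np.diag([2.0, 2.0])
b = np.array([10.0, -10.0])
x_box = projected_gradient_box(A, b, lo=-1.0, hi=1.0)
```

For a box, the projection is just `np.clip`, which is what makes these methods attractive for very large bound-constrained problems.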
A new gradient algorithm (LFOPC) for unconstrained minimization, requiring no line searches ...