We establish or refute the optimality of inexact second-order methods for unconstrained nonconvex optimization from the point of view of worst-case evaluation complexity, improving and generalizing the results of [15, 19]. To this aim, we consider a new general class of inexact second-order algorithms for unconstrained optimization that includes regularization and trust-region variations of Newton's method as well as their linesearch variants. For each method in this class and arbitrary accuracy threshold ϵ ∈ (0, 1), we exhibit a smooth objective function with bounded range, whose gradient is globally Lipschitz continuous and whose Hessian is α-Hölder continuous (for given α ∈ [0, 1]), for which the method in question takes at least ⌊ϵ^{−(2+α)/(1+α)}⌋ objective function evaluations to generate a first iterate whose gradient is smaller than ϵ in norm.
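To place this bound, a short LaTeX note (a sketch added for orientation; the special-case rates below are the classical ones for these smoothness classes, not part of the quoted abstract):

% Hedged worked note: how the lower bound interpolates between the two
% classical smoothness regimes (standard rates, stated for orientation).
\[
  \left\lfloor \epsilon^{-\frac{2+\alpha}{1+\alpha}} \right\rfloor
  \quad\text{evaluations:}\qquad
  \alpha = 1 \;\Rightarrow\; \epsilon^{-3/2},
  \qquad
  \alpha = 0 \;\Rightarrow\; \epsilon^{-2}.
\]
% At \alpha = 1 (Lipschitz continuous Hessian) this matches the optimal
% rate attained by cubic regularization; at \alpha = 0 it recovers the
% rate familiar from gradient-type methods under Lipschitz gradients.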
An Adagrad-inspired class of algorithms for smooth unconstrained optimization is pr...
In this paper we propose an accelerated version of the cubic regularization of Newton's method [6]. ...
We provide sharp worst-case evaluation complexity bounds for nonconvex minimization problems with ge...
An adaptive regularization algorithm is proposed that uses Taylor models of the objective of order p...
The worst-case behaviour of a general class of regularization algorithms is considered in the case w...
The adaptive cubic regularization algorithms described in Cartis, Gould and Toint [Adaptive cubic re...
This paper examines worst-case evaluation bounds for finding weak minimizers in unconstraine...
The worst-case evaluation complexity of finding an approximate first-order critical point using grad...
Regularized minimization problems with nonconvex, nonsmooth, perhaps non-Lipschitz penalty functions...
An adaptive regularization algorithm using inexact function and derivatives evaluations is proposed ...