In this paper, we study regularized second-order methods for unconstrained minimization of a twice-differentiable (convex or nonconvex) objective function. For the objective at hand, these methods automatically achieve the best possible global complexity estimates among the different Hölder classes containing the Hessian of the objective. We show that such methods aimed at the functional residual and at the norm of the gradient must be different. For the development of the latter methods, we introduce two new line-search acceptance criteria, which can be seen as generalizations of the Armijo condition.
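To make the class of methods discussed above concrete, the following is a minimal, illustrative sketch of an adaptively regularized Newton iteration with an Armijo-style sufficient-decrease test. The solver interface (`f`, `grad`, `hess`), the quadratic regularizer, and the specific acceptance rule are assumptions made for illustration only; they are not the two acceptance criteria introduced in the paper.

```python
import numpy as np

def regularized_newton(f, grad, hess, x0, sigma0=1.0, tol=1e-8, max_iter=100):
    """Illustrative adaptively regularized Newton method.

    Each step solves (hess(x) + sigma * I) d = -grad(x), and sigma is
    adapted with a generic Armijo-style sufficient-decrease test on f.
    """
    x, sigma = np.asarray(x0, dtype=float), sigma0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = hess(x)
        accepted = False
        while not accepted and sigma < 1e16:
            # Regularized Newton step: (H + sigma * I) d = -g.
            d = np.linalg.solve(H + sigma * np.eye(len(x)), -g)
            # Decrease predicted by the regularized quadratic model.
            pred = -(g @ d + 0.5 * d @ (H @ d) + 0.5 * sigma * (d @ d))
            # Armijo-style acceptance: require a fraction of the predicted decrease.
            if pred > 0 and f(x + d) <= f(x) - 0.25 * pred:
                accepted = True
                x = x + d
                sigma = max(sigma / 2.0, 1e-12)  # accepted: be more optimistic
            else:
                sigma *= 2.0                     # rejected: regularize more
        if not accepted:
            break  # regularization grew too large; give up
    return x

if __name__ == "__main__":
    # Quick check on a smooth function: f(x) = x1^4 + x2^2.
    f = lambda x: x[0] ** 4 + x[1] ** 2
    g = lambda x: np.array([4 * x[0] ** 3, 2 * x[1]])
    H = lambda x: np.array([[12 * x[0] ** 2, 0.0], [0.0, 2.0]])
    print(regularized_newton(f, g, H, np.array([2.0, -1.5])))
```

Rejected steps double the regularization parameter while accepted steps shrink it, which is meant only to mirror, in spirit, how such methods adapt to an unknown Hölder class of the Hessian without requiring its constants.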