An example is presented where Newton's method for unconstrained minimization is applied to find an ε-approximate first-order critical point of a smooth function and takes a multiple of ε⁻² iterations and function evaluations to terminate, which is as many as the steepest-descent method requires in the worst case. The novel feature of the proposed example is that the objective function has a globally Lipschitz-continuous Hessian, whereas a previous example published by the same authors only ensured this critical property along the path of iterates, which is impossible to verify a priori.
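To make the setting concrete, the following is a minimal sketch (not the paper's construction) of pure Newton's method for unconstrained minimization with the ε-approximate first-order stopping rule ||∇f(x)|| ≤ ε discussed in the abstract; the test function, starting point, and tolerance are illustrative choices only.

```python
import numpy as np

def newton_minimize(grad, hess, x0, eps=1e-6, max_iter=10_000):
    """Run pure Newton steps until the gradient norm drops below eps.

    Returns the final iterate and the number of iterations used; the
    iteration count is what the worst-case ε⁻² bound refers to.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:          # ε-approximate first-order point
            return x, k
        x = x - np.linalg.solve(hess(x), g)   # full Newton step, no globalization
    return x, max_iter

# Illustrative smooth test problem f(x) = x1**4 + x2**2 (a hypothetical choice,
# not the paper's example); its degenerate minimizer already slows Newton down
# from quadratic to linear convergence.
grad = lambda x: np.array([4.0 * x[0] ** 3, 2.0 * x[1]])
hess = lambda x: np.array([[12.0 * x[0] ** 2, 0.0], [0.0, 2.0]])

x_star, iters = newton_minimize(grad, hess, x0=[1.0, 1.0], eps=1e-6)
print(f"reached ||grad|| <= eps after {iters} iterations at {x_star}")
```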
This paper deals with two kinds of the one-dimensional global optimization problem over a closed fin...
We develop general approximate Newton methods for solving Lipschitz continuous equations by replacin...
We use Newton’s method to solve systems of equations with constant rank derivatives. Motiv...
An example of slow convergence for Newton’s method on a function with globally Lipschitz continuous ...
The problem of globalizing the Newton method when the actual Hessian matrix is not used at every ite...
In this paper, we propose a continuous Newton-type method in the form of an ordinary differential eq...
We establish or refute the optimality of inexact second-order methods for unconstrained nonconvex op...
In this note we discuss the convergence of Newton's method for minimization. We present examples in ...
Inexact Newton methods for finding a zero of F are variations of Newton’s method in wh...
In this paper, we study the regularized second-order methods for unconstrained minimization of a twi...
The aim of this paper is to present a new semi-local convergence analysis for Newton’s method ...
This paper presents some globally convergent descent methods for solving systems of nonlinear equati...
We give a framework for the globalization of a nonsmooth Newton method introduced by B. Kummer. We s...
Under a weak Lipschitz condition, local convergence properties of inexact Newton methods and N...
An algorithm based on a combination of the polyhedral and quadratic approximation is given for findi...