We propose and analyze exact and inexact regularized Newton-type methods for finding a global saddle point of a convex-concave unconstrained min-max optimization problem. Compared to their first-order counterparts, second-order methods for min-max optimization have received relatively little attention, as obtaining global rates of convergence with second-order information is much more involved. In this paper, we highlight how second-order information can be used to speed up the dynamics of dual extrapolation methods despite inexactness. Specifically, we show that the proposed algorithms generate iterates that remain within a bounded set and that the averaged iterates converge to an ϵ-saddle point within O(ϵ^{-2/3}) iterations in terms of a gap function.
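To give a concrete flavor of how second-order information can enter an extragradient-style update, below is a minimal Python sketch on a toy bilinear saddle-point problem. It is an illustration under simplifying assumptions, not the paper's algorithm: the helper names (newton_extragradient, F, JF), the toy objective, the regularization rule lam ≈ gamma·||d||, and the 1/lam corrector step are all choices made for this example.

```python
import numpy as np

# Toy convex-concave problem f(x, y) = 0.5*||x||^2 + x^T B y - 0.5*||y||^2,
# whose unique saddle point is the origin.  The monotone operator
# F(z) = (grad_x f, -grad_y f) is linear here, with constant Jacobian M.
rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
M = np.block([[np.eye(n), B], [-B.T, np.eye(n)]])

def F(z):
    return M @ z

def JF(z):
    return M

def newton_extragradient(F, JF, z0, gamma=1.0, iters=200):
    """Sketch of a regularized-Newton extragradient loop (hypothetical helper,
    not the paper's exact method): the look-ahead point comes from a Newton
    system whose regularization is tied to the step length."""
    z = z0.copy()
    z_avg, weight = np.zeros_like(z0), 0.0
    for _ in range(iters):
        # Couple the regularization lam to the step length ||d|| by a short
        # fixed-point loop: solve (JF(z) + lam*I) d = -F(z), set lam ~ gamma*||d||.
        lam = gamma
        for _ in range(8):
            d = np.linalg.solve(JF(z) + lam * np.eye(z.size), -F(z))
            lam = max(gamma * np.linalg.norm(d), 1e-8)
        d = np.linalg.solve(JF(z) + lam * np.eye(z.size), -F(z))  # d consistent with final lam
        w = z + d                      # second-order look-ahead (extrapolation) point
        z = z - (1.0 / lam) * F(w)     # corrector step, scaled by the regularization
        z_avg += (1.0 / lam) * w       # step-size-weighted averaging of look-ahead points
        weight += 1.0 / lam
    return z_avg / weight

z_bar = newton_extragradient(F, JF, rng.standard_normal(2 * n))
print(np.linalg.norm(F(z_bar)))  # residual at the averaged iterate; should be close to zero
```

On this toy instance the residual at the averaged iterate shrinks toward zero; the O(ϵ^{-2/3}) gap-function guarantee stated above is a property of the analyzed methods, not of this simplified sketch.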