Classical first-order methods for large-scale convex-concave saddle point problems and variational inequalities with monotone operators are proximal algorithms. At each iteration they require minimizing the sum of a linear form and a strongly convex (proximal) function. To make such an algorithm practical, the problem domain X should be proximal-friendly, i.e., it should admit a strongly convex function whose linear perturbations are easy to minimize. As a byproduct, X admits a computationally cheap Linear Minimization Oracle (LMO) capable of minimizing linear forms over X. There are, however, important situations where a cheap LMO is indeed available but X is not proximal-friendly. This motivates the search for algorithms b...
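To make the role of the LMO concrete, here is a minimal sketch of the conditional-gradient (Frank–Wolfe) iteration, which needs only an LMO and no proximal/projection step. The choice of set (the ℓ1 ball), objective, and step rule are illustrative assumptions, not taken from the abstract above.

```python
import numpy as np

def lmo_l1_ball(g, radius=1.0):
    """LMO for the l1 ball: argmin_{||s||_1 <= radius} <g, s>.
    Puts all mass on the coordinate with the largest |g_i|."""
    i = np.argmax(np.abs(g))
    s = np.zeros_like(g)
    s[i] = -radius * np.sign(g[i])
    return s

def frank_wolfe(grad, x0, steps=500, radius=1.0):
    """Conditional gradient: each step calls the LMO, then takes
    a convex combination with the classical open-loop step size."""
    x = x0.copy()
    for t in range(steps):
        s = lmo_l1_ball(grad(x), radius)
        gamma = 2.0 / (t + 2.0)
        x = (1.0 - gamma) * x + gamma * s
    return x

# Toy smooth objective f(x) = 0.5 * ||A x - b||^2 with a sparse
# solution inside the l1 ball, so the optimal value is zero.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.zeros(5)
x_true[0] = 0.8                    # ||x_true||_1 <= 1
b = A @ x_true
grad = lambda x: A.T @ (A @ x - b)
x = frank_wolfe(grad, np.zeros(5))
```

Note that every iterate is a convex combination of points of the ℓ1 ball, so feasibility is maintained automatically, which is exactly what makes the method attractive when the prox step is expensive but the LMO is cheap.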
We consider convex optimization and variational inequality problems with a given separable...
A class of convexification and concavification methods is proposed for solving some classes of non-...
First-order methods for composite minimization min_{x∈X} f(x) + h(x), where f and h are convex, f is smooth, ...
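A minimal sketch of one such first-order method for the composite problem, the proximal-gradient (forward-backward) iteration, assuming for illustration that h is a scaled ℓ1 norm so its proximal map has the closed-form soft-thresholding expression; the data and step size below are assumptions, not from the abstract.

```python
import numpy as np

def soft_threshold(v, tau):
    """Prox of tau * ||.||_1: elementwise shrinkage toward zero."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(grad_f, prox_h, x0, step, iters=500):
    """Forward-backward iteration:
    x_{k+1} = prox_{step * h}(x_k - step * grad_f(x_k))."""
    x = x0.copy()
    for _ in range(iters):
        x = prox_h(x - step * grad_f(x), step)
    return x

# Illustrative instance (lasso): f(x) = 0.5||Ax - b||^2, h(x) = lam*||x||_1
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
lam = 0.5
grad_f = lambda x: A.T @ (A @ x - b)
prox_h = lambda v, t: soft_threshold(v, t * lam)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L with L = ||A||_2^2
x = proximal_gradient(grad_f, prox_h, np.zeros(10), step)
```

With the step size at most 1/L (L the Lipschitz constant of grad_f), each iteration is guaranteed not to increase the composite objective, and the fixed points of the iteration are exactly the minimizers.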
The standard algorithms for solving large-scale convex–concave saddle point pr...
The majority of first-order methods for large-scale convex–concave saddle poin...
Following the works of R.T. Rockafellar, to search for a zero of a maximal monotone operator, and of...
In this paper, we develop a composite version of the Mirror Prox algorithm for solv...
In recent years, the proximal point algorithm has seen many developments connected with the expansi...
This paper demonstrates a customized application of the classical proximal point algorithm (PPA) to ...
We give a simple and natural method for computing approximately optimal solutions for minimizing a c...
The problem of minimax optimization arises in a wide range of applications. When the objective funct...
For those acquainted with CVX (aka disciplined convex programming) of M. Grant and S. Boyd, the moti...
This paper describes two optimal subgradient algorithms for solving structured large-scale convex co...
We seek to solve convex optimization problems in composite form: minimize_{x∈R^n} f(x) := g(x) + h(x), ...