We introduce a model-based excessive gap technique to analyze first-order primal-dual methods for constrained convex minimization. As a result, we construct first-order primal-dual methods with optimal convergence rates on the primal objective residual and the primal feasibility gap of their iterates separately. Through a dual smoothing and prox-center selection strategy, our framework subsumes the augmented Lagrangian, alternating direction, and dual fast-gradient methods as special cases, where our rates apply.
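To make the dual smoothing and prox-center idea concrete, here is a minimal, hedged sketch (not the paper's exact scheme) of a smoothed dual fast-gradient method on a toy linearly constrained problem, min_x 0.5‖x‖² s.t. Ax = b. The prox-center term (gamma/2)‖x‖² (centered at 0, both the parameter `gamma` and the problem data are illustrative assumptions) makes the dual function smooth and strongly concave, so accelerated ascent on the dual drives the primal feasibility gap ‖Ax − b‖ to zero while the primal objective is tracked separately, mirroring the two rates mentioned in the abstract.

```python
import numpy as np

# Toy instance: min_x 0.5*||x||^2  subject to  A x = b  (data assumed).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 10))
b = A @ rng.standard_normal(10)

gamma = 1e-3                                  # smoothing weight (assumed)
svals = np.linalg.svd(A, compute_uv=False)
L = svals[0] ** 2 / (1 + gamma)               # Lipschitz constant of the dual gradient
mu = svals[-1] ** 2 / (1 + gamma)             # strong concavity of the smoothed dual
beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))

def primal(lam):
    # Closed-form argmin_x 0.5*(1 + gamma)*||x||^2 + lam @ (A x - b)
    return -A.T @ lam / (1 + gamma)

lam = np.zeros(5)
lam_prev = lam.copy()
for _ in range(2000):
    y = lam + beta * (lam - lam_prev)         # Nesterov momentum on the dual
    lam_prev = lam
    lam = y + (A @ primal(y) - b) / L         # gradient ascent step on the smoothed dual

x = primal(lam)
feas_gap = np.linalg.norm(A @ x - b)          # primal feasibility gap
obj_res = 0.5 * x @ x                         # primal objective value
```

The two reported quantities, `feas_gap` and `obj_res`, correspond to the feasibility gap and objective residual that the framework bounds separately; in this quadratic toy case the dual subproblem has a closed form, which is what makes the sketch so short.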
We study primal solutions obtained as a by-product of subgradient methods when solving the Lagrangia...
Many statistical learning problems can be posed as minimization of a sum of two convex functions, on...
We propose a new self-adaptive and double-loop smoothing algorithm to solve composite, nonsmooth, an...
We present a primal-dual algorithmic framework to obtain approximate solutions to a prototypical con...
In this paper we introduce a new primal-dual technique for convergence analysis of gradient schemes ...
We propose a new first-order primal-dual optimization framework for a conve...
We provide Frank–Wolfe (≡ Conditional Gradients) method with a convergence analysis allowing to appr...
An optimal first-order primal-dual gap reduction framework for cons...
Nonlinearly constrained optimization problems may be solved by minimizing a sequence of simpler subp...
We introduce a novel primal-dual flow for affine constrained convex optimization problems. As a modi...