We study the linear convergence of the primal-dual hybrid gradient (PDHG) method. After reviewing existing analyses, we show that they do not properly explain the behavior of the algorithm, even on the simplest problems. We therefore introduce the quadratic error bound of the smoothed gap, a new regularity assumption that holds for a wide class of optimization problems. Equipped with this tool, we prove tighter convergence rates. We then show that averaging and restarting PDHG makes it possible to better exploit the regularity constant. Numerical experiments on linear and quadratic programs, ridge regression, and image denoising illustrate the findings of the paper.
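For concreteness, the sketch below spells out one plausible form of PDHG with ergodic averaging and periodic restarts, applied to a ridge-regression saddle point. It is a minimal illustration only: the step sizes, the fixed restart frequency, and the helper name `pdhg_restart` are assumptions made here for the example, not the precise scheme or restart rule analyzed in the paper.

```python
import numpy as np

def pdhg_restart(A, b, lam, n_iters=2000, restart_every=100):
    """Sketch of PDHG with ergodic averaging and fixed-frequency restarts
    on the ridge-regression saddle point
        min_x max_y  (lam/2)||x||^2 + <Ax, y> - (1/2)||y||^2 - <b, y>,
    whose primal problem is  min_x (1/2)||Ax - b||^2 + (lam/2)||x||^2.
    The restart rule (jump to the running average every `restart_every`
    iterations) is illustrative, not the paper's rule."""
    m, n = A.shape
    L = np.linalg.norm(A, 2)               # operator norm of A
    tau = sigma = 0.99 / L                 # step sizes with tau*sigma*L^2 < 1
    x, y = np.zeros(n), np.zeros(m)
    x_avg, y_avg, k_avg = x.copy(), y.copy(), 0

    for k in range(1, n_iters + 1):
        x_old = x
        # primal step: prox of tau*(lam/2)||.||^2 is a scaling
        x = (x - tau * (A.T @ y)) / (1.0 + tau * lam)
        # dual step with extrapolated primal point 2x - x_old;
        # prox of sigma*g* where g*(y) = (1/2)||y||^2 + <b, y>
        y = (y + sigma * (A @ (2 * x - x_old)) - sigma * b) / (1.0 + sigma)
        # running (ergodic) averages of the iterates
        k_avg += 1
        x_avg += (x - x_avg) / k_avg
        y_avg += (y - y_avg) / k_avg
        # fixed-frequency restart from the averaged point
        if k % restart_every == 0:
            x, y = x_avg.copy(), y_avg.copy()
            x_avg, y_avg, k_avg = x.copy(), y.copy(), 0

    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    b = rng.standard_normal(50)
    lam = 0.1
    x = pdhg_restart(A, b, lam)
    x_ref = np.linalg.solve(A.T @ A + lam * np.eye(20), A.T @ b)
    print("distance to closed-form ridge solution:", np.linalg.norm(x - x_ref))
```

The toy run compares the restarted-averaged PDHG iterate to the closed-form ridge solution; on such strongly convex problems the smoothed-gap error bound of the paper would predict a linear rate, which this kind of restart scheme is meant to exploit.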