In this paper, we propose a restart scheme for FISTA (Fast Iterative Shrinkage-Thresholding Algorithm). This method, a generalization of Nesterov's accelerated gradient algorithm, is widely used for large-scale convex optimization problems, and it provides fast convergence rates under a strong convexity assumption. These convergence rates can be extended to weaker hypotheses, such as the Łojasiewicz property, but doing so requires prior knowledge of the function of interest. In particular, most schemes that provide fast convergence for non-strongly convex functions satisfying a quadratic growth condition involve the growth parameter, which is generally not known. Recent works show that restarting FISTA could ensure a fast conver...
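The abstract above can be illustrated with a minimal sketch of restarted FISTA on the LASSO objective 0.5*||Ax - b||^2 + lam*||x||_1. This is not the paper's own scheme: as a stand-in restart rule it uses the well-known function-value heuristic (restart the momentum whenever the objective increases), and the test problem, names, and parameters are all illustrative.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista_restart(A, b, lam, L, n_iter=500):
    """FISTA on F(x) = 0.5*||Ax - b||^2 + lam*||x||_1 with a function-value
    restart: whenever the objective increases, reset the momentum and redo
    the step as a plain proximal-gradient step (which cannot increase F)."""
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    F = lambda z: 0.5 * np.linalg.norm(A @ z - b) ** 2 + lam * np.abs(z).sum()
    f_prev = F(x)
    for _ in range(n_iter):
        x_new = soft_threshold(y - A.T @ (A @ y - b) / L, lam / L)
        f_new = F(x_new)
        if f_new > f_prev:                 # objective went up: restart momentum
            t = 1.0
            x_new = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
            f_new = F(x_new)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_next) * (x_new - x)
        x, t, f_prev = x_new, t_next, f_new
    return x

# Usage on a synthetic sparse-recovery instance (illustrative).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
lam = 0.1 * np.max(np.abs(A.T @ b))
L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the smooth part
x_hat = fista_restart(A, b, lam, L)
```

Because a restarted step falls back to a plain proximal-gradient step, the objective along the iterates is monotone, so the heuristic never degrades the last iterate's function value.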
Fast iterative soft threshold algorithm (FISTA) is one of the algorithms for the reconstruction part...
Motivated by recent work of Renegar, we present new computational methods and associated computation...
The Łojasiewicz inequality shows that sharpness bounds on the minimum of convex optimization problem...
We consider a combined restarting and adaptive backtracking strategy for the popular Fast Iterative S...
In this work, we are interested in the famous FISTA algorithm. We show that FISTA is an automatic ge...
Convex-composite optimization, which minimizes an objective function represented by the sum of a dif...
By analyzing accelerated proximal gradient methods under a local quadratic gro...
Accelerated algorithms for minimizing smooth strongly convex functions usually require knowledge of ...
In this note, we consider a special instance of the scaled, inexact and adapti...
We propose a scaled adaptive version of the Fast Iterative Soft-Thresholding Algorithm, named S-FIST...
We consider a variable metric and inexact version of the FISTA-type algorithm ...
We present and analyse a backtracking strategy for a general Fast Iterative Sh...
FISTA is a popular convex optimisation algorithm which is known to converge at...
FISTA is a classical optimization algorithm to minimize convex functions. The ...
Gradient descent is slow to converge for ill-conditioned problems and non-convex problems. An import...
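The slowdown of gradient descent on ill-conditioned problems, and the gain from acceleration, can be seen on a small quadratic. The sketch below compares plain gradient descent against Nesterov's accelerated gradient with the standard momentum sequence t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2; the test function, eigenvalue spectrum, and iteration count are illustrative choices, not taken from any of the works above.

```python
import numpy as np

def gradient_descent(grad, x0, step, n_iter):
    """Plain gradient descent with a fixed step size."""
    x = x0.copy()
    for _ in range(n_iter):
        x = x - step * grad(x)
    return x

def nesterov(grad, x0, step, n_iter):
    """Nesterov's accelerated gradient: take the gradient step from an
    extrapolated point y, then update the momentum coefficient t."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x_new = y - step * grad(y)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_next) * (x_new - x)
        x, t = x_new, t_next
    return x

# Ill-conditioned quadratic f(x) = 0.5 * x^T diag(d) x, condition number 1e3.
d = np.logspace(0, 3, 50)            # eigenvalues from 1 to 1000
f = lambda x: 0.5 * np.sum(d * x * x)
grad = lambda x: d * x
x0 = np.ones(50)
step = 1.0 / d.max()                 # standard 1/L step size
x_gd = gradient_descent(grad, x0, step, 200)
x_nag = nesterov(grad, x0, step, 200)
```

With a 1/L step, the modes associated with the smallest eigenvalues contract only by a factor (1 - 1/kappa) per iteration for gradient descent, which is what the momentum term speeds up.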