A number of recent works have emphasized the prominent role played by the Kurdyka-Łojasiewicz inequality in proving the convergence of iterative algorithms for solving possibly nonsmooth/nonconvex optimization problems. In this work, we consider the minimization of an objective function satisfying this property, which is the sum of a (not necessarily convex) differentiable function and a (not necessarily differentiable or convex) function. The latter function is expressed as a separable sum of functions of blocks of variables. Such an optimization problem can be addressed with the Forward-Backward algorithm, which can be accelerated by using variable metrics derived from the Majorize-Minimize principle. We propose to combine the latter ...
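As a rough illustration (the exact update rule is not reproduced in the truncated abstract above, so the notation below is assumed), a variable metric forward-backward update of a selected block of variables, indexed by $j$, could take the form

% Assumed notation: f is the differentiable term, g_j the function acting on the j-th block,
% x^{(j)} the j-th block of x, \gamma_k > 0 a step size, A_j(x_k) a symmetric positive definite
% matrix supplied by the Majorize-Minimize construction, and prox the proximity operator of g_j
% in the metric induced by A_j(x_k).
\begin{equation*}
  x_{k+1}^{(j)} = \operatorname{prox}_{\gamma_k^{-1} A_j(x_k),\, g_j}\Big( x_k^{(j)} - \gamma_k\, A_j(x_k)^{-1} \nabla_{\! j} f(x_k) \Big),
\end{equation*}

the remaining blocks being left unchanged at iteration $k$.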
Nonconvex optimization problems arise in many areas of computational science and engineeri...
Optimization methods play a central role in the solution of a wide array of pr...
The forward–backward splitting method (FBS) for mi...
We consider the minimization of a function $G$ defined on $R^N$, which is the sum of a (not necessarily c...
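To make the splitting concrete, here is a minimal Python sketch of the plain (non-accelerated) forward-backward iteration for such a sum; the smooth term $f(x) = \frac{1}{2}\|Ax - b\|^2$ and the nonsmooth term $\lambda\|x\|_1$ are illustrative assumptions, not choices taken from the works summarized here.

import numpy as np

def prox_l1(v, t):
    # Proximity operator of t * ||.||_1 (soft-thresholding); illustrative choice of nonsmooth term.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, step, n_iter=200):
    # Minimizes 0.5*||A x - b||^2 + lam*||x||_1 by alternating forward (gradient) and backward (prox) steps.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                   # forward step: gradient of the smooth term
        x = prox_l1(x - step * grad, step * lam)   # backward step: prox of the nonsmooth term
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 50))
b = rng.standard_normal(30)
step = 1.0 / np.linalg.norm(A, 2) ** 2             # step size below 1 / Lipschitz constant of the gradient
print(forward_backward(A, b, lam=0.1, step=step)[:5])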
Forward-backward methods are a very useful tool for the minimization of a functional given by the su...
Forward-backward methods are valid tools to solve a variety of optimization problems where the objec...
This paper deals with a general framework for inexact forward-backward algorithms aimed at minimizin...
One of the most popular approaches for the minimization of a convex functional given by the sum of a...
In this paper we propose an alternating block version of a variable metric linesearch proximal gradi...
We propose a forward-backward proximal-type algorithm with inertial/memory effects for min...
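As an illustration of the inertial/memory idea, the following Python sketch adds a heavy-ball-type term $\beta (x_k - x_{k-1})$ to the forward step before applying the proximity operator; the parameter choices and the test problem are assumptions for demonstration only, not the rules analyzed in that paper.

import numpy as np

def inertial_forward_backward(grad_f, prox_g, x0, step, beta, n_iter=200):
    # One possible inertial variant: gradient step plus a memory term beta*(x_k - x_{k-1}),
    # followed by the proximity operator of the nonsmooth term.
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(n_iter):
        y = x - step * grad_f(x) + beta * (x - x_prev)
        x_prev, x = x, prox_g(y, step)
    return x

# Illustrative problem (assumed): minimize 0.5*||x - c||^2 + 0.1*||x||_1.
c = np.array([1.0, -0.05, 0.3, 0.02])
grad_f = lambda x: x - c
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - 0.1 * t, 0.0)
print(inertial_forward_backward(grad_f, prox_g, np.zeros(4), step=0.9, beta=0.3))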
In view of the minimization of a function which is the sum of a differentiable function $f$ and a c...
Many inverse problems require the minimization of a criterion which is the sum of a non n...