In this paper, we study the Kurdyka–Łojasiewicz (KL) exponent, an important quantity for analyzing the convergence rate of first-order methods. Specifically, we develop various calculus rules to deduce the KL exponent of new (possibly nonconvex and nonsmooth) functions formed from functions with known KL exponents. In addition, we show that the well-studied Luo–Tseng error bound together with a mild assumption on the separation of stationary values implies that the KL exponent is 1/2. The Luo–Tseng error bound is known to hold for a large class of concrete structured optimization problems, and thus we deduce the KL exponent of a large class of functions whose exponents were previously unknown. Building upon this and the calculus rules, we ar...
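For reference, the KL property at a stationary point \bar x with exponent \alpha is commonly stated as follows (a sketch of the standard definition; the paper's exact normalization may differ):

\[
\operatorname{dist}\big(0, \partial f(x)\big) \ge c\,\big(f(x) - f(\bar x)\big)^{\alpha}
\]

for all x near \bar x with f(\bar x) < f(x) < f(\bar x) + \epsilon, where c > 0 and \alpha \in [0, 1). The case \alpha = 1/2 is the one that typically yields local linear convergence of first-order methods, which is why the exponent 1/2 above is significant.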
In view of the minimization of a nonsmooth nonconvex function f, we prove an a...
We consider a reformulation of Reduced-Rank Regression (RRR) and Sparse Reduce...
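For context, classical reduced-rank regression solves the following problem (a standard background formulation; the reformulation the abstract refers to is truncated above, so this is not necessarily the authors' exact statement):

\[
\min_{C}\ \|Y - XC\|_F^2 \quad \text{subject to} \quad \operatorname{rank}(C) \le r,
\]

and sparse variants additionally impose row sparsity on a factor of C (e.g., writing C = AB and penalizing the rows of A).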
The Łojasiewicz exponent of the gradient of a convergent power series h(X,Y) with complex coefficien...
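One common convention for the Łojasiewicz exponent of the gradient at an isolated zero of \nabla h (a sketch; conventions vary across the literature):

\[
\mathcal{L}_0(\nabla h) = \inf\big\{\theta > 0 : \|\nabla h(x)\| \ge c\,\|x\|^{\theta} \text{ for some } c > 0 \text{ and all } x \text{ near } 0\big\}.
\]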
This paper shows that error bounds can be used as effective tools for deriving complexity results fo...
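A typical error bound of the kind used for such complexity results takes the form (a generic sketch, not the paper's specific statement):

\[
\operatorname{dist}(x, \mathcal{X}^*) \le \tau\, r(x)^{\gamma} \quad \text{whenever } r(x) \le \epsilon,
\]

where \mathcal{X}^* is the target set, r(x) is a computable residual (e.g., the norm of a proximal-gradient step), and \tau, \gamma > 0; the exponent \gamma governs the derived convergence rate.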
We investigate convergence of subgradient-oriented descent methods in non-smoo...
This thesis deals with first-order descent methods for minimization problems. It comp...
Difference-of-Convex programming and related algorithms, which constitute the ...
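To fix ideas, the basic DC algorithm (DCA) for minimizing f = g - h with g, h convex iterates as follows (a standard sketch; the variants studied in this literature add regularization or extrapolation):

\[
y_k \in \partial h(x_k), \qquad x_{k+1} \in \operatorname*{arg\,min}_{x}\ \big\{\, g(x) - \langle y_k, x \rangle \,\big\},
\]

i.e., each iteration linearizes the concave part -h at x_k and minimizes the resulting convex majorant of f.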
This bachelor thesis pursues the Łojasiewicz inequality. The Łojasiewicz inequality is proved here for gen...
We study the convergence of general abstract descent methods applied to a lower semicontinuous nonco...
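Such abstract descent methods are typically axiomatized by two conditions (a sketch in the spirit of the standard framework; the paper's exact hypotheses may differ):

\[
f(x_{k+1}) + a\,\|x_{k+1} - x_k\|^2 \le f(x_k), \qquad \operatorname{dist}\big(0, \partial f(x_{k+1})\big) \le b\,\|x_{k+1} - x_k\|,
\]

with constants a, b > 0 (sufficient decrease and relative error). Combined with the KL inequality, these yield finite length of the iterate sequence and convergence to a critical point.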
The classical Łojasiewicz inequality and its extensions for partial differential equation problems (...
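In its classical finite-dimensional form, the inequality states that for a real-analytic f and a critical point \bar x there exist C > 0 and \theta \in [1/2, 1) such that

\[
|f(x) - f(\bar x)|^{\theta} \le C\,\|\nabla f(x)\| \quad \text{for all } x \text{ near } \bar x;
\]

the PDE extensions alluded to (Łojasiewicz–Simon type inequalities) replace \nabla f by the derivative of an energy functional on a function space.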
The scaled gradient projection (SGP) method is a first-order optimization method applicable to the c...
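A minimal sketch of the SGP iteration, assuming a user-supplied Euclidean projection and a fixed scaling matrix D (the helper names are hypothetical; practical SGP chooses D and the step size adaptively, e.g. by Barzilai–Borwein rules with a line search, and projects in the norm induced by D^{-1} rather than the Euclidean one):

    import numpy as np

    def sgp_step(x, grad_f, project, D, alpha, lam=1.0):
        # Scaled gradient step followed by projection onto the feasible set,
        # then a move of length lam along the resulting feasible direction.
        y = project(x - alpha * D @ grad_f(x))
        return x + lam * (y - x)

    # Toy usage: minimize ||A x - b||^2 over the nonnegative orthant.
    A = np.array([[2.0, 0.0], [0.0, 1.0]])
    b = np.array([1.0, 1.0])
    grad = lambda x: 2.0 * A.T @ (A @ x - b)
    proj = lambda z: np.maximum(z, 0.0)
    x = np.zeros(2)
    for _ in range(200):
        x = sgp_step(x, grad, proj, D=np.eye(2), alpha=0.1)
    # x is now close to (0.5, 1.0), the unconstrained minimizer,
    # which happens to be feasible here.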
We study the convergence rate of gradient-based local search methods for solving low-rank matrix rec...
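As one concrete instance of such a local search, gradient descent on a symmetric low-rank factorization (a minimal sketch, assuming a matrix-sensing model with known measurement matrices; A_list, b, and the step size are illustrative choices, not the paper's setting):

    import numpy as np

    def factored_gd(A_list, b, n, r, steps=500, lr=0.01, seed=0):
        # Minimize f(U) = (1/2) * sum_i (<A_i, U U^T> - b_i)^2 over U in R^{n x r};
        # the gradient is grad f(U) = sum_i res_i * (A_i + A_i^T) @ U.
        rng = np.random.default_rng(seed)
        U = 0.1 * rng.standard_normal((n, r))
        for _ in range(steps):
            X = U @ U.T
            res = np.array([np.sum(A * X) for A in A_list]) - b
            grad = sum(ri * (A + A.T) for ri, A in zip(res, A_list)) @ U
            U -= lr * grad
        return U

Under restricted-isometry-type assumptions on the measurements, such iterates converge locally at a linear rate; quantifying that rate is exactly the kind of question the KL/error-bound analysis addresses.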
We present global convergence rates for a line-search method which is based on random first-order mo...
The paper addresses parametric inequality systems described by polynomial functions in finite dimens...
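The Łojasiewicz-type error bounds studied for such polynomial systems typically take a Hölderian form (a generic sketch that suppresses the parameter dependence; the cited line of work makes the exponent explicit in terms of degree and dimension):

\[
\operatorname{dist}(x, S) \le c\,\Big(\sum_{i} \big[g_i(x)\big]_+\Big)^{\gamma} \quad \text{for } x \text{ in a bounded set},
\]

where S = \{x : g_i(x) \le 0 \ \forall i\}, [t]_+ = \max\{t, 0\}, and \gamma \in (0, 1] depends on the number of variables and the degrees of the polynomials g_i.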