In this article, we investigate an accelerated first-order method, namely the method of similar triangles, which is optimal for the class of convex (and strongly convex) problems with a Lipschitz-continuous gradient. The paper considers a model of additive noise in the gradient and a Euclidean prox-structure on not necessarily bounded sets. Convergence estimates are obtained both with and without strong convexity, and a stopping criterion is proposed for problems that are not strongly convex.
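To illustrate the basic iteration, the following is a minimal sketch of one common unconstrained Euclidean formulation of the method of similar triangles, driven by a possibly inexact (additively noisy) gradient oracle. The function name `similar_triangles`, the coefficient rule L·α² = A + α, the fixed iteration count, and the toy quadratic are illustrative assumptions for this sketch; they are not the exact scheme, noise model, or stopping criterion analyzed in the paper.

```python
import numpy as np

def similar_triangles(grad, x0, L, n_iters=1000):
    """Sketch of a similar-triangles iteration (unconstrained, Euclidean prox).

    grad    : gradient oracle, possibly returning an additively inexact gradient
    x0      : starting point
    L       : Lipschitz constant of the gradient
    n_iters : fixed iteration budget (the paper's stopping criterion is not reproduced here)
    """
    x = y = u = np.asarray(x0, dtype=float)
    A = 0.0
    for _ in range(n_iters):
        # alpha solves L*alpha^2 = A + alpha, so that A_new = A + alpha = L*alpha^2
        alpha = (1.0 + np.sqrt(1.0 + 4.0 * L * A)) / (2.0 * L)
        A_new = A + alpha
        y = (alpha * u + A * x) / A_new   # extrapolation point
        u = u - alpha * grad(y)           # Euclidean prox (mirror) step, no constraints
        x = (alpha * u + A * x) / A_new   # "similar triangles" combination
        A = A_new
    return x

# Toy usage (hypothetical): quadratic objective with an additively noisy gradient.
rng = np.random.default_rng(0)
Q = np.diag([1.0, 10.0])
b = np.array([1.0, -2.0])
delta = 1e-3  # assumed noise level, for illustration only
noisy_grad = lambda z: Q @ z - b + delta * rng.standard_normal(2)
x_approx = similar_triangles(noisy_grad, np.zeros(2), L=10.0, n_iters=500)
```

With this coefficient rule, A grows on the order of N²/L, which in the exact-gradient, convex case yields the accelerated O(L R² / N²) rate; how the additive noise enters the estimates is the subject of the paper itself.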