We analyze the global and local behavior of gradient-like flows under stochastic errors with the aim of solving convex optimization problems with noisy gradient input. We first study the unconstrained differentiable convex case, using a stochastic differential equation where the drift term is minus the gradient of the objective function and the diffusion term is either bounded or square-integrable. In this context, under Lipschitz continuity of the gradient, our first main result shows almost sure convergence of the objective and of the trajectory process towards a minimizer of the objective function. We also provide a comprehensive complexity analysis by establishing several new pointwise and ergodic convergence rates in expectation for the...
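For concreteness, dynamics of this kind can be simulated with an Euler-Maruyama discretization of dX_t = -∇f(X_t) dt + σ(t) dW_t. The sketch below is illustrative only: the quadratic objective f(x) = ½‖Ax − b‖², the square-integrable diffusion coefficient σ(t) = σ₀/(1+t), and the step and horizon parameters are all assumptions, not the specific setting of the paper.

```python
# Minimal sketch (illustrative assumptions): Euler-Maruyama discretization of
#   dX_t = -grad f(X_t) dt + sigma(t) dW_t,
# with f(x) = 0.5 * ||A x - b||^2 and a square-integrable sigma(t) = s0 / (1 + t).
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def grad_f(x):
    return A.T @ (A @ x - b)  # gradient of the least-squares objective

def sigma(t, s0=0.5):
    return s0 / (1.0 + t)     # hypothetical vanishing diffusion coefficient

dt, T = 1e-3, 50.0
x = np.zeros(5)
for k in range(int(T / dt)):
    t = k * dt
    dW = np.sqrt(dt) * rng.standard_normal(5)   # Brownian increment
    x = x - grad_f(x) * dt + sigma(t) * dW      # Euler-Maruyama step

x_star = np.linalg.lstsq(A, b, rcond=None)[0]   # exact minimizer for comparison
print("distance to minimizer:", np.linalg.norm(x - x_star))
```

With a square-integrable σ the injected noise is summable along the trajectory, which is the mechanism behind the almost sure convergence claimed above.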
This paper proposes a thorough theoretical analysis of Stochastic Gradient Descent...
A common problem in statistics consists in estimating the minimizer of a convex function. When we have...
Stochastic gradient descent (SGD) is a simple and popular method to solve stochastic optimization problems...
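As a baseline for the methods these abstracts study, a minimal SGD loop looks as follows. The least-squares data, the single-sample gradient estimator, and the step-size schedule γ_k = c/√k are illustrative assumptions, not the setting of any one paper.

```python
# Minimal sketch of SGD: x_{k+1} = x_k - gamma_k * g_k with E[g_k] = grad f(x_k).
# Here g_k is the gradient of 0.5 * (a_i . x - b_i)^2 for one sampled row (a_i, b_i),
# an unbiased estimate of the gradient of the averaged least-squares loss.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
b = rng.standard_normal(200)

x = np.zeros(5)
for k in range(1, 10_001):
    i = rng.integers(len(b))          # sample one data point
    g = (A[i] @ x - b[i]) * A[i]      # stochastic gradient estimate
    x -= 0.5 / np.sqrt(k) * g         # decreasing step size gamma_k = c / sqrt(k)

print("mean squared residual:", np.mean((A @ x - b) ** 2))
```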
In view of solving convex optimization problems with noisy gradient input, we analyze the asymptotic...
In this article, a family of SDEs is derived as a tool to understand the behavior of numerical optimization...
In this paper, we examine a class of non-convex stochastic optimization problems...
In this paper, we are interested in the development of efficient first-order methods for convex optimization...
In this thesis we want to give a theoretical and practical introduction to stochastic gradient descent...
We propose a stochastic gradient framework for solving stochastic composite convex optimization problems...
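A standard template behind such composite frameworks is the stochastic proximal gradient iteration x_{k+1} = prox_{γg}(x_k − γ ĝ_k), alternating a gradient step on a sampled smooth loss with the proximal operator of the nonsmooth term. The sketch below instantiates it for ℓ1-regularized least squares, where the prox is soft-thresholding; the problem data and parameters are assumptions for illustration.

```python
# Minimal sketch of stochastic proximal gradient for
#   min_x (1/n) sum_i 0.5 * (a_i . x - b_i)^2 + lam * ||x||_1:
# a stochastic gradient step on the smooth part, then the l1 prox.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((200, 5))
b = rng.standard_normal(200)
lam, gamma = 0.1, 0.01

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)  # prox of tau * ||.||_1

x = np.zeros(5)
for _ in range(20_000):
    i = rng.integers(len(b))
    g = (A[i] @ x - b[i]) * A[i]                  # stochastic gradient, smooth part
    x = soft_threshold(x - gamma * g, gamma * lam)  # proximal step, nonsmooth part

print("solution:", np.round(x, 3))
```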
Consider the problem of minimizing functions that are Lipschitz and strongly convex, but not necessarily differentiable...
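In that Lipschitz, strongly convex, possibly nonsmooth setting, the classical scheme is stochastic subgradient descent with step size 1/(λt), often combined with iterate averaging. The sketch below applies it to a λ-strongly convex absolute-loss objective; the objective, data, and averaging scheme are illustrative assumptions.

```python
# Minimal sketch: stochastic subgradient descent with step 1/(lam * t) and a
# running average of iterates, for the lam-strongly convex nonsmooth objective
#   f(x) = (lam/2) * ||x||^2 + (1/n) sum_i |a_i . x - b_i|.
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((200, 5))
b = rng.standard_normal(200)
lam = 0.1

x = np.zeros(5)
x_avg = np.zeros(5)
for t in range(1, 5_001):
    i = rng.integers(len(b))
    g = lam * x + np.sign(A[i] @ x - b[i]) * A[i]  # stochastic subgradient
    x -= g / (lam * t)                             # step size 1/(lam * t)
    x_avg += (x - x_avg) / t                       # running average of iterates

print("averaged iterate:", np.round(x_avg, 3))
```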
We develop the mathematical foundations of the stochastic modified equations (SME) framework for analyzing...
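In the SME viewpoint, SGD with step size η is approximated, to first order, by an SDE whose noise amplitude scales like √η, e.g. dX_t = −∇f(X_t) dt + √η Σ(X_t)^{1/2} dW_t. The sketch below compares SGD on a one-dimensional quadratic with an Euler-Maruyama discretization of such an SDE; the constant gradient-noise level and the identification of one SGD step with SDE time dt = η are illustrative assumptions.

```python
# Minimal sketch comparing SGD with step eta to an SME-style SDE,
#   dX_t = -f'(X_t) dt + sqrt(eta) * s * dW_t   (constant noise level s assumed),
# for f(x) = 0.5 * x^2, where stochastic gradients are f'(x) + s * N(0, 1).
import numpy as np

rng = np.random.default_rng(3)
eta, s, n_steps = 0.05, 1.0, 2_000

x_sgd, x_sde = 2.0, 2.0
dt = eta  # one SGD iteration corresponds to time eta along the SDE
for _ in range(n_steps):
    x_sgd -= eta * (x_sgd + s * rng.standard_normal())  # SGD step with noisy gradient
    x_sde += -x_sde * dt + np.sqrt(eta) * s * np.sqrt(dt) * rng.standard_normal()

# Both processes fluctuate around 0 with stationary variance close to eta * s^2 / 2.
print(x_sgd, x_sde)
```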
Driven by the need to solve increasingly complex optimization problems in signal processing and machine learning...