We consider in this work a system of two stochastic differential equations named the perturbed compositional gradient flow. By introducing a separation of fast and slow scales of the two equations, we show that the limit of the slow motion is given by an averaged ordinary differential equation. We then demonstrate that the deviation of the slow motion from the averaged equation, after proper rescaling, converges to a stochastic process with Gaussian inputs. This indicates that the slow motion can be approximated in the weak sense by a standard perturbed gradient flow or the continuous-time stochastic gradient descent algorithm that solves the optimization problem for a composition of two functions. As an application, the perturbed compositi...
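The continuous-time stochastic gradient descent for a composition of two functions mentioned above can be sketched in discrete time. The sketch below is illustrative only (the quadratic outer function, linear inner map, and step sizes are assumptions, not the paper's setting); the two step sizes mirror the fast/slow scale separation, with an auxiliary variable tracking the inner function on the fast scale.

```python
import numpy as np

# Illustrative sketch (not the paper's exact scheme): stochastic compositional
# gradient descent for min_x f(g(x)), where g is observed only through noisy
# samples. The auxiliary variable y tracks g(x) on a faster time scale,
# mirroring the fast/slow separation described above.
rng = np.random.default_rng(0)
A = np.array([[2.0, 0.0], [0.0, 1.0]])   # inner map g(x) = A x (assumed)

def g_sample(x):
    # noisy evaluation of the inner function g
    return A @ x + 0.01 * rng.normal(size=2)

def f_grad(y):
    # gradient of the outer function f(y) = 0.5 ||y||^2 (assumed)
    return y

alpha = 0.05     # slow step size for the iterate x
beta = 0.5       # fast step size for the auxiliary variable y
x = np.array([1.0, -1.0])
y = g_sample(x)  # fast variable: running estimate of g(x)
for _ in range(2000):
    y = (1 - beta) * y + beta * g_sample(x)   # track g(x) on the fast scale
    x = x - alpha * A.T @ f_grad(y)           # descend on f(g(x)) on the slow scale

print(np.round(x, 2))  # iterate approaches the minimizer 0 of f(g(x))
```

With beta much larger than alpha, y equilibrates around g(x) before x moves appreciably, which is exactly the averaging regime the abstract describes.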
We show that accelerated gradient descent, averaged gradient descent and the heavy-ball method for q...
In view of solving convex optimization problems with noisy gradient input, we analyze the asymptotic...
Stochastic gradient descent is an optimisation method that combines classical gradient des...
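The method this abstract refers to can be illustrated with a minimal example (the least-squares objective, data, and step size below are assumptions for illustration, not taken from the abstract): each update uses the gradient at a single randomly drawn data point in place of the full-data gradient.

```python
import numpy as np

# Minimal illustration of stochastic gradient descent for least squares:
# each step follows the gradient of one randomly sampled term of the objective.
rng = np.random.default_rng(0)
n, d = 500, 3
w_true = np.array([1.0, -2.0, 0.5])            # assumed ground-truth weights
X = rng.normal(size=(n, d))
y = X @ w_true + 0.1 * rng.normal(size=n)      # noisy linear observations

w = np.zeros(d)
for _ in range(20_000):
    i = rng.integers(n)                        # sample one data point
    grad = (X[i] @ w - y[i]) * X[i]            # gradient of 0.5 * (x_i^T w - y_i)^2
    w -= 0.01 * grad                           # noisy descent step

print(np.round(w, 1))  # close to w_true
```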
We consider in this work small random perturbations (of multiplicative noise type) of the gradient f...
The problem of minimizing the sum, or composition, of two objective functions ...
In this work we use the stochastic flow decomposition technique to get components that represent the...
We analyze the global and local behavior of gradient-like flows under stochastic errors towards the ...
Classical stochastic gradient methods are well suited for minimizing expected-value objective funct...
This paper proposes a thorough theoretical analysis of Stochastic Gradient Descen...
We develop the mathematical foundations of the stochastic modified equations (SME) framework for ana...
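The SME viewpoint can be checked numerically in a toy setting (the quadratic objective, noise model, and learning rate below are illustrative assumptions, not the paper's derivation): SGD with learning rate eta on f(x) = 0.5 x² with additive gradient noise of variance sigma² behaves like the SDE dX_t = -X_t dt + sqrt(eta) sigma dW_t, whose stationary variance is eta sigma² / 2.

```python
import numpy as np

# Illustrative check of the stochastic-modified-equations prediction on a toy
# quadratic: the long-run variance of the SGD iterates should be close to
# eta * sigma**2 / 2, the stationary variance of the modelling SDE.
rng = np.random.default_rng(1)
eta, sigma = 0.01, 1.0
x, xs = 1.0, []
for k in range(200_000):
    grad = x + sigma * rng.normal()   # noisy gradient of f(x) = 0.5 * x**2
    x -= eta * grad                   # SGD step
    if k > 50_000:                    # discard burn-in before measuring
        xs.append(x)

print(np.var(xs), eta * sigma**2 / 2)  # empirical variance vs. SME prediction
```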