In this paper, a general stochastic optimization procedure is studied, unifying several variants of stochastic gradient descent such as, among others, the stochastic heavy ball method, the Stochastic Nesterov Accelerated Gradient algorithm (S-NAG), and the widely used Adam algorithm. The algorithm is seen as a noisy Euler discretization of a nonautonomous ordinary differential equation, recently introduced by Belotto da Silva and Gazeau, which is analyzed in depth. Assuming that the objective function is non-convex and differentiable, the stability and the almost sure convergence of the iterates to the set of critical points are established. A noteworthy special case is the convergence proof of S-NAG in a nonconvex ...
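For concreteness, here is a minimal Python sketch of a stochastic Nesterov accelerated gradient (S-NAG) recursion of the kind covered by this framework, written as a noisy discretization of a damped second-order dynamic; the damping parameter r, the step size, and the quadratic toy objective are illustrative assumptions, not the exact scheme or constants analyzed in the paper.

```python
import numpy as np

# Minimal S-NAG sketch: a noisy discretization of a damped second-order dynamic.
# The damping parameter r, step size gamma, and toy objective are illustrative
# assumptions, not the scheme analyzed in the paper.

def snag(x0, grad, steps=5000, gamma=1e-2, r=3.0, seed=0):
    rng = np.random.default_rng(seed)
    x_prev, x = x0.copy(), x0.copy()
    for n in range(1, steps + 1):
        mu = n / (n + r)                        # Nesterov-style extrapolation weight
        y = x + mu * (x - x_prev)               # look-ahead point
        x_prev, x = x, y - gamma * grad(y, rng) # noisy gradient step from the look-ahead
    return x

# Noisy gradient oracle for f(x) = ||x||^2 / 2 (stand-in for a stochastic objective).
noisy_grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
print(snag(np.array([2.0, -3.0]), noisy_grad))  # iterates hover near the critical point 0
```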
This paper deals with a natural stochastic optimization procedure derived from the so-called Heavy-b...
We study stochastic gradient descent (SGD) and the stochastic heavy ball method (SHB, otherwise know...
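A minimal Python sketch of the two recursions named here, SGD and the stochastic heavy ball method, on a quadratic toy objective with artificial gradient noise; the step size, momentum value, and objective are illustrative choices, not those of the analysis.

```python
import numpy as np

def sgd(x0, grad, steps=2000, gamma=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(steps):
        x = x - gamma * grad(x, rng)            # plain stochastic gradient step
    return x

def shb(x0, grad, steps=2000, gamma=1e-2, beta=0.9, seed=0):
    rng = np.random.default_rng(seed)
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        v = beta * v - gamma * grad(x, rng)     # "heavy ball" momentum/velocity update
        x = x + v
    return x

# Noisy gradient oracle for f(x) = ||x||^2 / 2 (illustrative stochastic objective).
noisy_grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
print(sgd(np.array([2.0, -3.0]), noisy_grad))
print(shb(np.array([2.0, -3.0]), noisy_grad))
```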
We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective ...
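A minimal Python sketch of the Adam update with exponentially decaying moment estimates and bias correction; the quadratic toy objective is an illustrative stand-in for a stochastic objective, and the hyperparameters are the commonly used defaults.

```python
import numpy as np

# Adam sketch: exponential moving averages of the gradient and its square,
# bias-corrected, driving a coordinate-wise rescaled gradient step.
# The toy objective below is an illustrative assumption, not the paper's setup.

def adam(x0, grad, steps=10000, alpha=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, seed=0):
    rng = np.random.default_rng(seed)
    x = x0.copy()
    m = np.zeros_like(x)                        # first-moment (mean) estimate
    v = np.zeros_like(x)                        # second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        g = grad(x, rng)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)            # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)            # bias-corrected second moment
        x = x - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Noisy gradient oracle for f(x) = ||x||^2 / 2 (illustrative stochastic objective).
noisy_grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
print(adam(np.array([2.0, -3.0]), noisy_grad))  # iterates settle near the minimizer 0
```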
The subject of this thesis is the analysis of several stochastic algorithms in a nonconvex setting. ...
Adam is a popular variant of stochastic gradient descent for finding a local minimizer of a func...
We develop the mathematical foundations of the stochastic modified equations (SME) framework for ana...
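A minimal Python sketch of the SME viewpoint, assuming the simplest setting: one SGD step with learning rate eta is matched against one Euler-Maruyama step (time increment eta) of an SDE of the form dX = -f'(X) dt + sqrt(eta) * sigma dW; the scalar quadratic objective and constant noise level sigma are illustrative assumptions, not the state-dependent gradient-noise covariance treated by the framework.

```python
import numpy as np

# SME-style comparison: an SGD trajectory with learning rate eta next to an
# Euler-Maruyama discretization (dt = eta) of dX = -f'(X) dt + sqrt(eta)*sigma dW.
# Quadratic objective and constant noise level are illustrative assumptions.

rng = np.random.default_rng(0)
eta, sigma, n_steps = 1e-2, 0.5, 1000
fprime = lambda x: x                                       # f(x) = x^2 / 2

x_sgd, x_sde = 2.0, 2.0
for _ in range(n_steps):
    g = fprime(x_sgd) + sigma * rng.standard_normal()      # noisy gradient oracle
    x_sgd -= eta * g                                       # one SGD step
    dw = np.sqrt(eta) * rng.standard_normal()              # Brownian increment, dt = eta
    x_sde += -fprime(x_sde) * eta + np.sqrt(eta) * sigma * dw

print(x_sgd, x_sde)   # both trajectories hover near the minimizer 0
```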
The subject of this thesis is the analysis of various stochastic algorithms aimed at solving a probl...