The continuous-time model of Nesterov's momentum provides a thought-provoking perspective for understanding the nature of the acceleration phenomenon in convex optimization. One of the main ideas in this line of research comes from the field of classical mechanics and proposes to link Nesterov's trajectory to the solution of a set of Euler-Lagrange equations relative to the so-called Bregman Lagrangian. In recent years, this approach has led to the discovery of many new (stochastic) accelerated algorithms and has provided a solid theoretical foundation for the design of structure-preserving accelerated methods. In this work, we revisit this idea and provide an in-depth analysis of the action relative to the Bregman Lagrangian...
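For reference, a minimal sketch of the Bregman Lagrangian in the parameterization of Wibisono, Wilson, and Jordan, on which this line of work builds; the scaling functions $\alpha_t$, $\beta_t$, $\gamma_t$ and the distance-generating function $h$ below are assumed notation, and the paper's own conventions may differ:
\[
\mathcal{L}(X, V, t) \;=\; e^{\alpha_t + \gamma_t}\Big( D_h\big(X + e^{-\alpha_t} V,\, X\big) - e^{\beta_t} f(X) \Big),
\qquad
D_h(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle .
\]
Under the ideal scaling conditions $\dot{\beta}_t \le e^{\alpha_t}$ and $\dot{\gamma}_t = e^{\alpha_t}$, the associated Euler-Lagrange equation reduces to
\[
\ddot{X}_t + \big(e^{\alpha_t} - \dot{\alpha}_t\big)\,\dot{X}_t + e^{2\alpha_t + \beta_t}\,\big[\nabla^2 h\big(X_t + e^{-\alpha_t}\dot{X}_t\big)\big]^{-1} \nabla f(X_t) = 0 .
\]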
We revisit the Ravine method of Gelfand and Tsetlin from a dynamical system pe...
Motivated by variational models in continuum mechanics, we introduce a novel algorithm for performin...
(Based on joint work with Z. Chbani and H. Riahi.) In a Hilbert space setting $\mathcal{H}$, given...
Accelerated gradient methods play a central role in optimization, achieving optimal rates in many se...
Many of the new developments in machine learning are connected with gradient-based optimization meth...
The problem of learning from data is prevalent in the modern scientific age, and optimization provid...
In this paper, we study the behavior of solutions of the ODE associated to Nes...
In this article, a family of second-order ODEs associated with inertial gradient descent is studied. Th...
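As a point of reference, the prototypical family studied in this literature couples a convex differentiable objective $f$ with an asymptotically vanishing viscous damping; the parameter $\alpha > 0$ below is assumed notation, and the exact family considered in the article above may differ:
\[
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \nabla f\big(x(t)\big) = 0, \qquad t \ge t_0 > 0 .
\]
For such dynamics, $f(x(t)) - \min f$ decays at rate $O(1/t^2)$ whenever $\alpha \ge 3$, and at rate $o(1/t^2)$ when $\alpha > 3$; the case $\alpha = 3$ recovers the continuous-time limit of Nesterov's accelerated gradient method.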
First-order methods play a central role in large-scale convex optimization. Even though many variati...
We introduce the "continuized" Nesterov acceleration, a close variant of Nesterov acceleration whose...
We investigate convex differentiable optimization and explore the temporal discretization of damped ...
We derive a second-order ordinary differential equation (ODE), which is the limit of Nesterov’s acce...
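Concretely, writing $s$ for the step size in Nesterov's scheme and identifying iterate $k$ with time $t \approx k\sqrt{s}$, the limiting ODE usually obtained in this setting (the paper's precise statement may differ in notation) is
\[
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\big(X(t)\big) = 0, \qquad X(0) = x_0, \quad \dot{X}(0) = 0,
\]
whose trajectories satisfy $f(X(t)) - \min f = O(1/t^2)$, mirroring the $O(1/k^2)$ rate of the discrete method.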
Recent research on accelerated gradient methods used in optimization has demonstrated that these m...
First-order methods play a central role in large-scale convex optimization. Despite their various fo...