First-order methods are often analyzed through their continuous-time models, whose worst-case convergence properties are typically established via Lyapunov functions. In this work, we provide a systematic and principled approach to finding and verifying Lyapunov functions for classes of ordinary and stochastic differential equations. More precisely, we extend the performance estimation framework, originally proposed by Drori and Teboulle [10], to continuous-time models. We recover convergence results comparable to those of the corresponding discrete-time methods using fewer assumptions and convexity inequalities, and we provide new results for stochastic accelerated gradient flows.
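As a minimal illustration of the kind of certificate such an approach is meant to produce (this specific example is standard and is an assumption on our part, not taken from the abstract), consider the gradient flow $\dot{x}(t) = -\nabla f(x(t))$ for a convex differentiable $f$ with minimizer $x_\star$, together with the candidate Lyapunov function
\[
\mathcal{V}(t) \;=\; t\,\bigl(f(x(t)) - f(x_\star)\bigr) \;+\; \tfrac{1}{2}\,\|x(t) - x_\star\|^2 .
\]
Differentiating along the flow and using the convexity inequality $f(x_\star) \ge f(x(t)) + \langle \nabla f(x(t)),\, x_\star - x(t)\rangle$ gives
\[
\dot{\mathcal{V}}(t) \;=\; f(x(t)) - f(x_\star) \;-\; t\,\|\nabla f(x(t))\|^2 \;-\; \langle \nabla f(x(t)),\, x(t) - x_\star\rangle \;\le\; -\,t\,\|\nabla f(x(t))\|^2 \;\le\; 0,
\]
so $\mathcal{V}(t) \le \mathcal{V}(0)$ and hence $f(x(t)) - f(x_\star) \le \|x(0) - x_\star\|^2 / (2t)$. A performance-estimation-type approach aims to search over and verify such Lyapunov candidates systematically rather than guessing them by hand.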