In the history of first-order algorithms, Nesterov's accelerated gradient descent (NAG) is one of the milestones. However, the cause of the acceleration was a mystery for a long time; the role of the gradient correction term was not revealed until the high-resolution differential equation framework proposed in [Shi et al., 2021]. In this paper, we continue to investigate the acceleration phenomenon. First, we provide a significantly simplified proof based on a precise observation and a tighter inequality for $L$-smooth functions. Then, a new implicit-velocity high-resolution differential equation framework, together with the corresponding implicit-velocity versions of the phase-space representation and the Lyapunov function, is proposed to ...
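For context, NAG in its standard convex form iterates $x_k = y_{k-1} - s\nabla f(y_{k-1})$ and $y_k = x_k + \frac{k-1}{k+2}(x_k - x_{k-1})$ with step size $s$; the high-resolution ODE of [Shi et al., 2021] associated with this recursion is

$$\ddot{X}(t) + \frac{3}{t}\dot{X}(t) + \sqrt{s}\,\nabla^2 f(X(t))\,\dot{X}(t) + \Big(1 + \frac{3\sqrt{s}}{2t}\Big)\nabla f(X(t)) = 0,$$

whose third term, $\sqrt{s}\,\nabla^2 f(X)\dot{X}$, is the gradient correction. A minimal runnable sketch of the iteration follows; the names `nag` and `grad_f` and the step-size choice $s = 1/L$ are illustrative assumptions, not details from the paper.

```python
import numpy as np

def nag(grad_f, x0, s, num_iters):
    """Sketch of Nesterov's accelerated gradient for a convex, L-smooth f,
    with the classical (k - 1)/(k + 2) momentum schedule and step size s."""
    x_prev = np.asarray(x0, dtype=float)
    y = x_prev.copy()
    for k in range(1, num_iters + 1):
        x = y - s * grad_f(y)                     # gradient step at the extrapolated point
        y = x + (k - 1) / (k + 2) * (x - x_prev)  # momentum extrapolation
        x_prev = x
    return x_prev

# Example: quadratic f(x) = 0.5 * x^T A x with L = 100, so s = 1/L = 0.01.
A = np.diag([1.0, 100.0])
x = nag(lambda v: A @ v, [1.0, 1.0], s=0.01, num_iters=500)
```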
Nesterov's accelerated gradient (AG) is a popular technique to optimize objective functions comprisi...
Can we accelerate convergence of gradient descent without changing the algorithm -- just by carefull...
The proximal point method (PPM) is a fundamental method in optimization that is often used as a buil...
For first-order smooth optimization, research on the acceleration phenomenon has a long his...
Nesterov's accelerated gradient algorithm is derived from first principles. The first principles are...
In this paper, we study the behavior of solutions of the ODE associated to Nes...
Policy gradient methods have recently been shown to enjoy global convergence at a $\Theta(1/t)$ rate...
First-order methods play a central role in large-scale convex optimization. Despite their various fo...
We provide a novel accelerated first-order method that achieves the asymptotically optimal convergen...
The forward-backward algorithm is a powerful tool for solving optimization pro...
Nesterov's Accelerated Gradient (NAG) for optimization has better performance than its continuous ti...
We introduce a generic scheme for ac...
(Based on a joint work with Z. Chbani and H. Riahi) In a Hilbert space setting $\mathcal{H}$, given...
We show that accelerated gradient descent, averaged gradient descent and the h...