We develop a theoretical foundation for the application of Nesterov’s accelerated gradient descent method (AGD) to the approximation of solutions of a wide class of partial differential equations (PDEs). This is achieved by proving the existence of an invariant set and exponential convergence rates when its preconditioned version (PAGD) is applied to minimize locally Lipschitz smooth, strongly convex objective functionals. We introduce a second-order ordinary differential equation (ODE) with a preconditioner built-in and show that PAGD is an explicit time-discretization of this ODE, which requires a natural time step restriction for energy stability. At the continuous time level, we show an exponential convergence of the ODE solution to its...
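For orientation, a minimal sketch of the classical (unpreconditioned) Nesterov iteration that the PAGD scheme above augments with a preconditioner; the constant momentum coefficient below is the standard choice for a $\mu$-strongly convex, $L$-smooth objective $f$ and is stated as an assumption about the setting, not quoted from the abstract:
\[
x_{k+1} = y_k - s\,\nabla f(y_k), \qquad
y_{k+1} = x_{k+1} + \frac{1-\sqrt{\mu s}}{1+\sqrt{\mu s}}\,(x_{k+1}-x_k), \qquad 0 < s \le 1/L.
\]
In a preconditioned variant, the gradient is replaced by a preconditioned gradient (e.g., $\mathcal{P}^{-1}\nabla f$ for an illustrative symmetric positive definite operator $\mathcal{P}$), which is what makes such schemes attractive for discretized PDE problems.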
We propose two numerical methods for accelerating the convergence of the standard fixed point method...
In this paper, we study the behavior of solutions of the ODE associated to Nes...
We study gradient-based optimization methods obta...
We derive a second-order ordinary differential equation (ODE), which is the limit of Nesterov’s acce...
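For reference, the limiting ODE in question (derived by Su, Boyd, and Candès) is
\[
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\big(X(t)\big) = 0, \qquad X(0) = x_0, \quad \dot{X}(0) = 0,
\]
whose solution satisfies $f(X(t)) - f^\star = O(1/t^2)$, mirroring the $O(1/k^2)$ rate of Nesterov's accelerated gradient method.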
We study accelerated mirror descent dynamics in continuous and discrete time. Combining the original...
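For orientation (not taken from the truncated abstract): the unaccelerated continuous-time mirror descent dynamics, on which the accelerated variant builds, evolve the dual variable by
\[
\frac{d}{dt}\,\nabla\psi\big(x(t)\big) = -\nabla f\big(x(t)\big),
\]
where $\psi$ is the mirror map (a strongly convex distance-generating function); the accelerated dynamics couple such a dual flow with an averaging/momentum term and a time-dependent weight to recover a Nesterov-type $O(1/t^2)$ rate in continuous time.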
We investigate convex differentiable optimization and explore the temporal discretization of damped ...
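The damped inertial dynamics alluded to are typically second-order systems of the form
\[
\ddot{x}(t) + \gamma(t)\,\dot{x}(t) + \nabla f\big(x(t)\big) = 0,
\]
with a viscous damping coefficient $\gamma(t)$ that is either constant (heavy-ball flow) or asymptotically vanishing, e.g. $\gamma(t) = \alpha/t$ as in Nesterov-type dynamics; this generic form is given for context and is not quoted from the truncated abstract.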
In this paper, we study the behavior of solutions of the ODE associated to Nesterov acceleration. It...
We revisit the Ravine method of Gelfand and Tsetlin from a dynamical system pe...
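As a rough indication of the connection drawn there (hedged, since the abstract is truncated): Nesterov's scheme maintains two sequences, a gradient step $x_{k+1} = y_k - s\,\nabla f(y_k)$ followed by an extrapolation $y_{k+1} = x_{k+1} + \alpha_k\,(x_{k+1} - x_k)$, and the Ravine method can be viewed as the same two-step recursion with the extrapolated points $y_k$ reported as the iterates instead of the $x_k$, which is why dynamical-system analyses of the two methods run in parallel.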