We consider the problem of minimizing the sum of two convex functions. One of those functions has Lipschitz-continuous gradients, and can be accessed via stochastic oracles, whereas the other is "simple". We provide a Bregman-type algorithm with accelerated convergence in function values to a ball containing the minimum. The radius of this ball depends on problem-dependent constants, including the variance of the stochastic oracle. We further show that this algorithmic setup naturally leads to a variant of Frank-Wolfe achieving acceleration under parallelization. More precisely, when minimizing a smooth convex function on a bounded domain, we show that one can achieve an $\epsilon$ primal-dual gap (in expectation) in $\tilde{O}(1/ \sqrt{\ep...
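The abstract above, like several entries below, builds on the conditional-gradient (Frank-Wolfe) template: each step calls a linear minimization oracle over the feasible set and takes a convex combination. As a point of reference, here is a minimal deterministic Frank-Wolfe loop with the standard 2/(t+2) open-loop step size; it is a textbook sketch, not the accelerated or stochastic variant described above, and the quadratic objective and simplex domain are illustrative assumptions:

```python
def frank_wolfe(grad, lmo, x0, n_iters=500):
    """Classic Frank-Wolfe (conditional gradient) loop.

    grad: gradient oracle of the smooth objective f
    lmo:  linear minimization oracle, s = argmin_{s in C} <g, s>
    """
    x = list(x0)
    for t in range(n_iters):
        g = grad(x)
        s = lmo(g)                       # call the linear minimization oracle
        gamma = 2.0 / (t + 2.0)          # standard open-loop step size
        # convex combination keeps the iterate feasible
        x = [(1 - gamma) * xi + gamma * si for xi, si in zip(x, s)]
    return x

# Illustrative instance: minimize f(x) = ||x - b||^2 / 2 over the
# probability simplex, whose LMO returns a standard basis vector.
b = [0.1, 0.7, 0.2]
grad = lambda x: [xi - bi for xi, bi in zip(x, b)]

def lmo(g):
    i = min(range(len(g)), key=lambda j: g[j])
    return [1.0 if j == i else 0.0 for j in range(len(g))]

x_star = frank_wolfe(grad, lmo, [1 / 3, 1 / 3, 1 / 3])
```

Since every iterate is a convex combination of simplex vertices, the method is projection-free: feasibility is maintained by construction, which is why the per-iteration cost reduces to one linear optimization.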
First, we introduce a splitting algorithm to minimize a sum of three convex functions. Th...
Linear optimization is often algorithmically simpler than non-linear convex optimizat...
We discuss the possibility to accelerate solving extremely large-scale well st...
A broad class of convex optimization problems can be formulated as a semidefinite program (SDP), min...
We propose a randomized block-coordinate variant of the classic Frank-Wolfe algorithm for convex opt...
We propose a stochastic gradient framework for solving stochastic composite convex optimization prob...
In this paper, we introduce various mechanisms to obtain accelerated first-ord...
This thesis aims at developing efficient algorithms for solving complex and constrained convex optim...
A new stochastic primal-dual algorithm for solving a composite optimization pr...
We propose a stochastic conditional gradient method (CGM) for minimizing convex finite-sum objective...
Regularized risk minimization often involves non-smooth optimization, either because of the loss fun...
Mini-batch algorithms have been proposed as a way to speed up stochastic convex optimization problem...
Consider the problem of minimizing the expected value of a (possibly nonconvex) cost functi...
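Several entries above concern composite objectives f + g, where f is smooth and accessed through a stochastic gradient oracle and g is "simple", meaning its proximal operator is cheap. A minimal stochastic proximal-gradient sketch illustrates this template; the l1 regularizer, bounded zero-mean gradient noise, and constant step size are illustrative assumptions, not any one paper's method:

```python
import random

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1, applied coordinate-wise.
    return [max(abs(vi) - t, 0.0) * (1.0 if vi > 0 else -1.0) for vi in v]

def stochastic_prox_grad(stoch_grad, lam, x0, step=0.05, n_iters=2000, seed=0):
    """Stochastic proximal-gradient iteration for f(x) + lam * ||x||_1."""
    rng = random.Random(seed)
    x = list(x0)
    for _ in range(n_iters):
        g = stoch_grad(x, rng)                         # unbiased gradient of f
        y = [xi - step * gi for xi, gi in zip(x, g)]   # gradient step on f
        x = soft_threshold(y, step * lam)              # proximal step on g
    return x

# Illustrative instance: f(x) = ||x - b||^2 / 2 with bounded zero-mean noise.
b = [1.0, -0.2, 0.0]

def noisy_grad(x, rng):
    return [xi - bi + rng.uniform(-0.1, 0.1) for xi, bi in zip(x, b)]

x_hat = stochastic_prox_grad(noisy_grad, lam=0.1, x0=[0.0, 0.0, 0.0])
# For this instance the exact minimizer is soft_threshold(b, 0.1).
```

With a constant step size the iterates converge only to a neighborhood of the minimizer whose radius scales with the oracle variance, mirroring the "ball containing the minimum" phenomenon described in the first abstract above.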