We give a simple proof that the Frank-Wolfe algorithm obtains a stationary point at a rate of $O(1/\sqrt{t})$ on non-convex objectives with a Lipschitz continuous gradient. Our analysis is affine invariant and is, to the best of our knowledge, the first to give a rate comparable to what was already proven for projected gradient methods (though on a slightly different measure of stationarity).
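Not part of the abstract, but for concreteness: a minimal sketch of the Frank-Wolfe loop tracking the Frank-Wolfe gap $g_t = \max_{v \in \mathcal{D}} \langle \nabla f(x_t),\, x_t - v \rangle$, the affine-invariant stationarity measure typically reported in this non-convex setting. The feasible set (the $\ell_1$ ball), the toy non-convex objective, and the classical $2/(t+2)$ step size are all illustrative assumptions, not taken from the paper, whose analysis may use a different step-size schedule.

```python
import numpy as np

def lmo_l1(grad, radius=1.0):
    """Linear minimization oracle over the l1 ball:
    argmin_{||v||_1 <= radius} <grad, v> is a signed vertex."""
    i = np.argmax(np.abs(grad))
    v = np.zeros_like(grad)
    v[i] = -radius * np.sign(grad[i])
    return v

def frank_wolfe(grad_f, x0, steps=500):
    """Run Frank-Wolfe and return the final iterate together with the
    smallest FW gap seen, min_t g_t, which the O(1/sqrt(t)) rate bounds."""
    x = x0.copy()
    best_gap = np.inf
    for t in range(steps):
        g = grad_f(x)
        v = lmo_l1(g)
        gap = float(g @ (x - v))   # FW gap: >= 0, and 0 exactly at stationary points
        best_gap = min(best_gap, gap)
        gamma = 2.0 / (t + 2)      # illustrative open-loop step size (assumption)
        x = x + gamma * (v - x)    # convex combination: stays feasible
    return x, best_gap

# Toy smooth non-convex objective f(x) = sum(sin(x)) + 0.5*||x||^2
# over the unit l1 ball; its gradient is cos(x) + x.
grad_f = lambda x: np.cos(x) + x
x, best_gap = frank_wolfe(grad_f, x0=np.zeros(3))
```

Since each iterate is a convex combination of feasible points, no projection is ever needed; the minimum gap `best_gap` is the quantity the $O(1/\sqrt{t})$ rate controls.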