Generalized self-concordance is a key property present in the objective functions of many important learning problems. We establish the convergence rate of a simple Frank-Wolfe variant that uses the open-loop step size strategy $\gamma_t = 2/(t+2)$, obtaining a $\mathcal{O}(1/t)$ convergence rate for this class of functions in terms of both the primal gap and the Frank-Wolfe gap, where $t$ is the iteration count. This avoids the second-order information or the estimates of local smoothness parameters required by previous work. We also show improved convergence rates for various common cases, e.g., when the feasible region under consideration is uniformly convex or polyhedral.
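For concreteness, a minimal sketch of one iteration of the vanilla conditional-gradient update with the stated open-loop step size, assuming a compact convex feasible region $\mathcal{X}$ and access to a linear minimization oracle (the precise variant analyzed may differ in details), is
$$v_t \in \arg\min_{v \in \mathcal{X}} \langle \nabla f(x_t), v \rangle, \qquad x_{t+1} = (1-\gamma_t)\, x_t + \gamma_t\, v_t, \qquad \gamma_t = \frac{2}{t+2},$$
with the Frank-Wolfe gap $\langle \nabla f(x_t), x_t - v_t \rangle$ serving as the computable optimality measure referenced above.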