In this thesis, we focus on Frank-Wolfe (a.k.a. Conditional Gradient) algorithms, a family of iterative algorithms for convex optimization that assume projections onto the feasible region are prohibitively expensive, while linear optimization problems over the feasible region can be solved efficiently. We present several algorithms that either locally or globally improve upon existing convergence guarantees. In Chapters 2-4 we focus on the case where the objective function is smooth and strongly convex and the feasible region is a polytope, and in Chapter 5 we focus on the case where the function is generalized self-concordant and the feasible region is a compact convex set.
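The projection-free idea described above can be sketched as follows: each iteration calls a linear minimization oracle (LMO) over the feasible region instead of a projection, then takes a convex combination so the iterate stays feasible. This is a minimal illustration, not code from the thesis; the quadratic objective, the probability-simplex feasible region, and the classic 2/(t+2) step size are all illustrative choices.

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, iters=200):
    """Frank-Wolfe: each step solves a linear problem over the
    feasible region (via the LMO) instead of projecting onto it."""
    x = x0.astype(float)
    for t in range(iters):
        g = grad(x)
        v = lmo(g)                        # vertex minimizing <g, v> over the region
        gamma = 2.0 / (t + 2.0)           # classic open-loop step size
        x = (1 - gamma) * x + gamma * v   # convex combination stays feasible
    return x

# Example: minimize ||x - b||^2 over the probability simplex.
# The simplex LMO returns the standard basis vector e_i with
# i = argmin_i g_i (the cheapest vertex in the gradient direction).
b = np.array([0.1, 0.2, 0.7])
grad = lambda x: 2 * (x - b)
lmo = lambda g: np.eye(len(g))[np.argmin(g)]
x = frank_wolfe(grad, lmo, x0=np.ones(3) / 3)
```

Since b lies in the simplex, the minimizer is b itself, and the standard O(1/t) guarantee bounds the objective gap after t iterations by 2LD^2/(t+2), where L is the smoothness constant and D the diameter of the feasible region.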
The primary concern of this thesis is to explore efficient first-order methods of computing approxim...
© 2018 Informa UK Limited, trading as Taylor & Francis Group. We suggest simple modificati...
Projection-free optimization via different variants of the Frank-Wolfe (FW), a.k.a. Conditional Grad...
In the first chapter of this thesis, we analyze the global convergence rate of a proximal quasi-Newt...
Abstract. Linear optimization is often algorithmically simpler than non-linear convex optimizat...
The purpose of this survey is to serve both as a gentle introduction and a coherent overview of stat...
The move from hand-designed to learned optimizers in machine learning has been quite successful for ...
The Frank-Wolfe algorithms, a.k.a. conditional gradient algorithms, solve constrained optimization p...
In Chapter 2, we present the Frank-Wolfe algorithm (FW) and all necessary background material. We ex...
The Frank-Wolfe method (a.k.a. conditional gradient algorithm) for smooth optimization has regained ...
The dissertation addresses the research topics of machine learning outlined below. We developed the ...
This thesis aims at developing efficient algorithms for solving complex and constrained convex optim...
As a projection-free algorithm, the Frank-Wolfe (FW) method, also known as conditional gradient, has rec...
Machine learning has become one of the most exciting research areas in the world, with various appli...
Many fundamental machine learning tasks can be formulated as min-max optimization. This motivates us...