Introduction. Recent work has shown many connections between the conditional gradient method and other first-order optimization methods, such as herding [3] and subgradient descent [2]. By considering a type of proximal conditional gradient method, which we call boosted mirror descent (BMD), we unify all of these algorithms into a single framework, which can be interpreted as taking successive arg-mins of a sequence of surrogate functions. Using a standard online learning analysis based on ...
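To make the successive-arg-min view concrete, the sketch below instantiates two members of this family on the probability simplex: conditional gradient (Frank-Wolfe), whose surrogate at each step is the linearized objective over the feasible set, and entropic mirror descent, whose surrogate adds a KL (Bregman) proximity term to the linearization. This is a minimal Python illustration under our own assumptions, not the paper's BMD algorithm itself; the function names, step sizes, and the quadratic test objective are chosen for the example.

```python
import numpy as np

def conditional_gradient(grad, vertices, x0, steps=200):
    """Frank-Wolfe over the convex hull of `vertices`: each step
    arg-mins a linear surrogate <grad(x), v> of the objective."""
    x = x0.astype(float)
    for t in range(1, steps + 1):
        g = grad(x)
        s = min(vertices, key=lambda v: g @ v)  # arg-min of the linear surrogate
        x += (2.0 / (t + 1)) * (s - x)          # standard 2/(t+1) step size
    return x

def mirror_descent_simplex(grad, x0, eta=0.1, steps=200):
    """Entropic mirror descent on the simplex: each step arg-mins the
    linearized objective plus a KL proximity term, which has the
    closed-form multiplicative-weights update used below."""
    x = x0 / x0.sum()
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))
        x /= x.sum()                            # renormalize onto the simplex
    return x

if __name__ == "__main__":
    y = np.array([0.7, 0.2, 0.1])
    f_grad = lambda x: 2 * (x - y)              # gradient of ||x - y||^2
    verts = [np.eye(3)[i] for i in range(3)]    # simplex vertices
    print(conditional_gradient(f_grad, verts, np.ones(3) / 3))
    print(mirror_descent_simplex(f_grad, np.ones(3) / 3))
```

On this toy objective both iterate sequences converge to y (which already lies in the simplex); the two methods differ only in which surrogate function they arg-min at each iteration, which is exactly the axis along which the BMD framework organizes them.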