Motivated by applications to machine learning and imaging science, we study a class of online and stochastic optimization problems with loss functions that are not Lipschitz continuous; in particular, the loss functions encountered by the optimizer could exhibit gradient singularities or be singular themselves. Drawing on tools and techniques from Riemannian geometry, we examine a Riemann-Lipschitz (RL) continuity condition which is tailored to the singularity landscape of the problem's loss functions. In this way, we are able to tackle cases beyond the Lipschitz framework provided by a global norm, and we derive optimal regret bounds and last-iterate convergence results through the use of regularized learning methods ...
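As a concrete illustration of the regularized learning methods referred to above, the sketch below runs entropic (exponentiated-gradient) mirror descent on the probability simplex against a loss whose gradient blows up at the boundary of the domain. The 1/sqrt(t) step-size schedule, the particular loss f(x) = -sum_i sqrt(x_i), and the function names are illustrative assumptions for this sketch, not the paper's actual construction.

import numpy as np

def mirror_descent_simplex(grad, x0, steps=5000, eta0=0.1):
    # Entropic (exponentiated-gradient) mirror descent on the probability simplex:
    # a dual gradient step under the negative-entropy regularizer, then renormalization.
    x = np.asarray(x0, dtype=float)
    for t in range(1, steps + 1):
        eta = eta0 / np.sqrt(t)        # illustrative 1/sqrt(t) step-size schedule
        g = grad(x)                    # (possibly stochastic) gradient of the loss at x
        x = x * np.exp(-eta * g)       # multiplicative update = entropic mirror step
        x = x / x.sum()                # map back onto the simplex
    return x

# Illustrative loss with a gradient singularity: f(x) = -sum_i sqrt(x_i),
# whose gradient -1 / (2 sqrt(x_i)) blows up at the boundary of the simplex.
grad_f = lambda x: -0.5 / np.sqrt(np.maximum(x, 1e-12))

rng = np.random.default_rng(0)
x0 = rng.dirichlet(np.ones(5))              # random starting point on the simplex
print(mirror_descent_simplex(grad_f, x0))   # converges towards the uniform distribution,
                                            # the minimizer of f over the simplex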
Stochastic mirror descent (SMD) algorithms have recently garnered a great deal of attention in optim...
This dissertation presents several contributions at the interface of methods for convex optimization...
We propose a new family of adaptive first-order methods for a class of convex ...
Several important problems in learning theory and data science involve high-dimensional optimization...
Several important problems arising from statistical learning and data science involve ...
We consider the problem of maximizing a non-concave Lipschitz multivariate function f over a compact...
Stochastic approximation techniques have been used in various contexts in data ...
Online convex optimization (OCO) is a powerful algorithmic framework that has extensive applications...
We consider the problem of maximizing a non-concave Lipschitz multivariate function over a compact d...
Motivated by applications in machine learning and operations research, we study regret minimization ...
We present a simple unified analysis of adaptive Mirror Descent (MD) and Follow-the-Regularized-Lea...
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Brain and Cognitive Sciences, 2...
This paper proposes a learning framework and a set of algorithms for nonsmooth ...