Optimization problems arise naturally in supervised machine learning. A typical example is the empirical risk minimization (ERM) formulation, which seeks the best a posteriori estimator, i.e., the one minimizing the regularized empirical risk on a given dataset. The current challenge is to design efficient optimization algorithms able to handle large amounts of data in high-dimensional feature spaces. Classical methods such as gradient descent and its accelerated variants are computationally expensive in this setting, because each evaluation of the gradient requires a full pass through the entire dataset. This motivated the recent development of incremental algorithms. By loading a single da...
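To make the cost contrast concrete, here is a minimal sketch in plain NumPy. The squared loss, the l2 penalty, and all names are assumptions chosen for illustration, not the specific formulation studied in this work: one exact gradient touches all n data points, while an incremental (SGD-style) step loads a single point.

```python
import numpy as np

# Illustrative regularized ERM instance (an assumption for this sketch):
#   f(w) = (1/n) * sum_i 0.5 * (x_i . w - y_i)**2 + (lam / 2) * ||w||**2

def full_gradient(X, y, w, lam):
    # Exact gradient: one evaluation passes through all n data points.
    n = X.shape[0]
    return X.T @ (X @ w - y) / n + lam * w

def single_point_gradient(x_i, y_i, w, lam):
    # Incremental estimate: loads a single data point per step.
    return x_i * (x_i @ w - y_i) + lam * w

rng = np.random.default_rng(0)
n, d, lam, step = 1000, 20, 0.1, 0.01
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

w = np.zeros(d)
for _ in range(5000):          # plain SGD, the simplest incremental method
    i = rng.integers(n)        # draw one example instead of doing a full pass
    w -= step * single_point_gradient(X[i], y[i], w, lam)

# Check progress with one (expensive) full-gradient evaluation at the end.
print("||full gradient at w||:", np.linalg.norm(full_gradient(X, y, w, lam)))
```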
ABSTRACT: The main objective of this work is to propose optimization methods of the first and...
We introduce a generic scheme for ac...
We propose an inexact variable-metric proximal point algorithm to accelerate g...
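For background, the classical (exact, Euclidean) proximal point iteration that such methods build on can be written as follows; the notation is generic, not the paper's own:

$$x_{k+1} = \operatorname{prox}_{\lambda f}(x_k) = \operatorname*{arg\,min}_{x} \Big\{ f(x) + \frac{1}{2\lambda}\,\|x - x_k\|^2 \Big\}.$$

An inexact variant solves this subproblem only approximately, and a variable metric replaces the Euclidean norm with $\|x\|_{H_k} = \sqrt{x^\top H_k x}$ for a positive definite matrix $H_k$.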
http://jmlr.org/papers/volume18/17-748/17-748.pdf
We introduce a generic scheme...
This thesis manuscript is devoted to the optimization of composite convex functions in a determinist...
A goal of this thesis is to explore several topics in optimization for high-dimensional stochastic p...
In the age of digitization, data has become cheap and easy to obtain. This results in many new optimization...
One common way of describing the tasks addressable by machine learning is to break them down into th...
Optimization Under Uncertainty is a fundamental research direction in many companies nowadays, due to ...
We introduce a generic scheme to solve nonconvex optimization problems using gradient-based algorith...
The massive, automated processing of data requires the development of filtering techniques...
In many fields, including optimization, the performance of a method is often characterized by...