This thesis aims to develop efficient optimization algorithms for solving large-scale machine learning problems. To cope with the increasing scale and complexity of such problems, we focus on first-order and stochastic methods, in which updates are carried out using only (noisy) information about function values and (sub)gradients. The broad question we ask is: given such algorithmic oracles, how can we best design algorithms that both have strong theoretical guarantees and are practically useful? To this end, we touch upon a wide range of problems and investigate various methods, both newly proposed and successful existing ones, for solving them. We pay significant attention to developing and analyzing methods that are easy to run at ...
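As a minimal sketch of the stochastic first-order oracle model described above (the function names and step-size schedule here are illustrative assumptions, not the thesis's own algorithms), an SGD-style method queries only a noisy (sub)gradient at each iterate:

import numpy as np

def sgd(grad_oracle, w0, steps=1000, lr=0.1):
    """Minimal SGD: w_{t+1} = w_t - lr_t * g_t, where g_t is the noisy
    (sub)gradient returned by the oracle at w_t."""
    w = np.asarray(w0, dtype=float)
    for t in range(steps):
        g = grad_oracle(w)                 # stochastic (sub)gradient oracle call
        w = w - (lr / np.sqrt(t + 1)) * g  # O(1/sqrt(t)) step-size decay
    return w

# Toy usage: least-squares objective with additive gradient noise.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(50, 5)), rng.normal(size=50)
oracle = lambda w: A.T @ (A @ w - b) / len(b) + 0.01 * rng.normal(size=5)
w_hat = sgd(oracle, np.zeros(5))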
This work considers optimization methods for large-scale machine learning (ML). Optimization in ML ...
Machine learning has been a topic of study in academia and industry for decades. The performance of machine learning ...
Stochastic and data-distributed optimization algorithms have received considerable attention from the machine learning ...
Recently, Stochastic Gradient Descent (SGD) and its variants have become the dominant methods in the...
Technological developments in devices and storage have made large collections of data more ...
Optimization has been the workhorse for solving machine learning problems. However, the efficiency of ...
The dissertation addresses the machine learning research topics outlined below. We develop the ...
Recent years have witnessed huge advances in machine learning (ML) and its applications, especially ...
Stochastic gradient descent is the method of choice for solving large-scale optimization problems in...
This thesis aims to develop efficient algorithms for solving some fundamental engineering problems ...
In this paper, we introduce various mechanisms to obtain accelerated first-order ...
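One classical such mechanism is Nesterov's momentum, which takes the gradient step at an extrapolated point; the sketch below assumes an L-smooth convex objective and uses the standard momentum schedule (whether this matches the paper's constructions is an assumption, and the names are hypothetical):

import numpy as np

def nesterov(grad, x0, L, steps=100):
    """Accelerated gradient descent for an L-smooth convex f:
    gradient step at the extrapolated point y, then momentum update."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(steps):
        x_next = y - grad(y) / L                        # gradient step at y
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2       # momentum schedule
        y = x_next + ((t - 1) / t_next) * (x_next - x)  # extrapolation
        x, t = x_next, t_next
    return x

# Toy usage: f(x) = 0.5 * ||x||^2 has gradient x and smoothness L = 1.
x_min = nesterov(lambda x: x, np.ones(3), L=1.0)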
Current machine learning practice requires solving huge-scale empirical risk minimization problems q...
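For reference, the empirical risk minimization (ERM) problem referred to here takes the standard form (the notation is generic, not taken from the paper):

\[
  \min_{w \in \mathbb{R}^d} \; \frac{1}{n} \sum_{i=1}^{n} \ell(w; x_i, y_i) \;+\; \lambda\, r(w),
\]

where n is the (huge) number of training examples, \ell is a per-example loss, and r is an optional regularizer. Stochastic methods are attractive here because each update touches only one (or a few) of the n terms.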
In this thesis we study several machine learning problems that are all linked to the minimization ...
Modern machine learning systems pose several new statistical, scalability, privacy, and ethical challenges ...