Traditional machine learning models can be formulated as the expected risk minimization (ERM) problem: min_{w ∈ R^d} E_ξ[l(w; ξ)], where w ∈ R^d denotes the model parameters, ξ represents a training sample, and l(·) is the loss function. Numerous optimization algorithms, such as stochastic gradient descent (SGD), have been developed to solve the ERM problem. However, a wide range of emerging machine learning models lie beyond this class of optimization problems, such as model-agnostic meta-learning (Finn, Abbeel, and Levine 2017). Of particular interest to my research is the stochastic nested optimization (SNO) problem, whose objective function has a nested structure. Specifically, I have been focusing on two instances of this kind of problem: stochasti...
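The ERM setup above can be made concrete with a minimal SGD sketch. This is an illustrative example, not taken from the source: it instantiates l(w; ξ) as a least-squares loss on a single sample ξ = (x, y), and all names (sgd, lr, epochs) and hyperparameter values are assumptions chosen for the sketch.

```python
import numpy as np

def sgd(X, y, lr=0.01, epochs=50, seed=0):
    """Minimal SGD for min_w E_xi[l(w; xi)] with the illustrative
    squared loss l(w; (x, y)) = 0.5 * (x @ w - y)**2."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):           # one stochastic sample xi per step
            grad = (X[i] @ w - y[i]) * X[i]    # gradient of l(w; xi) w.r.t. w
            w -= lr * grad                     # descent step
    return w

# Usage: recover a planted parameter vector from noiseless linear data.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
w_true = np.arange(1.0, 6.0)
y = X @ w_true
w_hat = sgd(X, y)
```

On noiseless data the stochastic gradients vanish at the true parameters, so this sketch converges to w_true; with noisy data a decaying step size would be needed for exact convergence.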
Thesis (Ph.D.)--University of Washington, 2019Tremendous advances in large scale machine learning an...
In recent years, the rapid development of new generation information technology has resulted in an u...
International audienceWe consider stochastic optimization problems defined over reproducing kernel H...
Optimization has been the workhorse of solving machine learning problems. However, the efficiency of...
Stochastic and data-distributed optimization algorithms have received lots of attention from the mac...
Modern machine learning systems pose several new statistical, scalability, privacy and ethical chall...
This dissertation deals with developing optimization algorithms which can be distributed over a netw...
The area of machine learning has made considerable progress over the past decade, enabled by the wid...
Distributed machine learning bridges the traditional fields of distributed systems and machine learn...
We study optimization algorithms for the finite sum problems frequently arising in machine learning...
Recent years have witnessed huge advances in machine learning (ML) and its applications, especially ...
Abstract—We consider the problem of distributed stochastic optimization, where each of several machi...
University of Minnesota Ph.D. dissertation. April 2020. Major: Computer Science. Advisor: Arindam Ba...
Classical optimization techniques have found widespread use in machine learning. Convex optimization...
The first part of this dissertation considers distributed learning problems over networked agents. T...