The scale of modern datasets necessitates the development of efficient distributed optimization methods for machine learning. We present a general-purpose framework for distributed computing environments, CoCoA, that has an efficient communication scheme and is applicable to a wide variety of problems in machine learning and signal processing. We extend the framework to cover general non-strongly-convex regularizers, including L1-regularized problems like lasso, sparse logistic regression, and elastic net regularization, and show how earlier work can be derived as a special case. We provide convergence guarantees for the class of convex regularized loss minimization objectives, leveraging a novel approach in handling non-strongly-convex regularizers...
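The L1-regularized objectives named above (lasso, elastic net) minimize a smooth loss plus a non-smooth penalty. As a point of reference for that objective class, here is a minimal single-machine sketch of lasso solved by proximal gradient descent (ISTA); this is an illustration of the objective, not the CoCoA distributed scheme, and the function names are ours:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrink each coordinate toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(A, b, lam, n_iters=500):
    # Minimize (1/2n) * ||A x - b||^2 + lam * ||x||_1 via proximal gradient.
    n, d = A.shape
    x = np.zeros(d)
    # Step size 1/L, where L is the Lipschitz constant of the smooth part.
    L = np.linalg.norm(A, 2) ** 2 / n
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b) / n       # gradient of the smooth loss
        x = soft_threshold(x - grad / L, lam / L)  # proximal step
    return x
```

The non-smooth L1 term is handled entirely through its proximal operator, which is the same structural property the framework exploits to cover non-strongly-convex regularizers.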
We analyze two communication-efficient algorithms for distributed optimization in statistical settings...
We present a globally convergent method for regularized risk minimization problems. Our method appl...
Classical optimization techniques have found widespread use in machine learning. Convex optimization...
The scale of modern datasets necessitates the development of efficient distributed optimization meth...
The scale of modern datasets necessitates the development of efficient distributed and parallel opti...
New computing systems have emerged in response to the increasing size and complexity of modern datas...
Modern machine learning systems pose several new statistical, scalability, privacy and ethical chall...
We propose a new distributed algorithm for empirical risk minimization in machine learning. The alg...
Distributed machine learning bridges the traditional fields of distributed systems and machine learn...
We propose a novel incomplete cooperative algorithm for distributed constraint optimization problems...
In distributed optimization for large-scale learning, a major performance limi...
The dissertation addresses the research topics of machine learning outlined below. We developed the ...
In Part I of this paper, we proposed and analyzed a novel algorithmic framework for the minimization...
A wide variety of machine learning problems can be described as minimizing a regularized risk functi...
Thesis (Ph.D.)--University of Washington, 2016-08. Design and analysis of tractable methods for estima...