Learning sparse combinations is a frequent theme in machine learning. In this paper, we study the associated optimization problem in the distributed setting, where the elements to be combined are not centrally located but spread over a network. We address the key challenge of balancing communication costs and optimization errors. To this end, we propose a distributed Frank-Wolfe (dFW) algorithm. We obtain theoretical guarantees on the optimization error and communication cost that do not depend on the total number of combining elements. We further show that the communication cost of dFW is optimal by deriving a lower bound on the communication cost required to construct an ε-approximate solution. We validate our theoretical analysis with em...
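The abstract does not spell out the update rule of dFW itself, but for context, a minimal sketch of the classical (centralized) Frank-Wolfe iteration it builds on, here over an ℓ1 ball to produce sparse combinations, might look like the following; the objective and parameters are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def frank_wolfe_l1(grad, x0, tau, n_iters=200):
    """Classical Frank-Wolfe over the l1 ball of radius tau.

    After t iterations the iterate is a convex combination of at most
    t vertices (signed coordinate vectors), hence sparse -- the
    'sparse combinations' the abstract refers to.
    """
    x = x0.copy()
    for t in range(n_iters):
        g = grad(x)
        # Linear minimization oracle over the l1 ball: the minimizer
        # of <g, s> is a signed vertex along the coordinate with the
        # largest absolute gradient entry.
        i = np.argmax(np.abs(g))
        s = np.zeros_like(x)
        s[i] = -tau * np.sign(g[i])
        gamma = 2.0 / (t + 2)  # standard diminishing step size
        x = (1 - gamma) * x + gamma * s
    return x

# Illustrative use: sparse least squares, min ||Ax - b||^2 s.t. ||x||_1 <= tau
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -0.5, 0.8]
b = A @ x_true
grad = lambda x: 2 * A.T @ (A @ x - b)
x_hat = frank_wolfe_l1(grad, np.zeros(20), tau=2.5, n_iters=300)
```

In the distributed setting studied by the paper, the key point is that each iteration only requires communicating the selected vertex (a single index and coefficient), which is why the communication cost can be made independent of the total number of combining elements.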
Learning theory of distributed algorithms has recently attracted enormous attention in the machine l...
A recent paper [1] proposes a general model for distributed learning that bounds the commun...
Training a large-scale model over a massive data set is an extremely computation and storage intensi...
We propose distributed algorithms for high-dimensional sparse optimization. In...
Distributed machine learning bridges the traditional fields of distributed systems and machine learn...
This thesis is concerned with the design of distributed algorithms for solving optimization probl...
In distributed optimization for large-scale learning, a major performance limi...
In distributed optimization and machine learning, multiple nodes coordinate to solve large problems....
Recently, decentralized optimization has attracted much attention in machine learning because it is more c...
This dissertation deals with developing optimization algorithms which can be distributed over a netw...
This work presents and studies a distributed algorithm for solving optimization problems over networ...
This paper addresses the problem of distributed training of a machine learning model over the nodes ...
We study distributed inference, learning and optimization in scenarios which involve networked entit...
We investigate an existing distributed algorithm for learning sparse signals or data over networks. ...