The size of modern datasets has spurred interest in distributed statistical estimation. We consider a scenario in which randomly drawn data are spread across a set of machines, and the task is to estimate the location parameter of the distribution from which the data were drawn. We give a one-shot protocol for computing this estimate that generalizes the results of Braverman et al. [2], who provide a protocol under the assumption that the distribution is Gaussian, and of Duchi et al. [4], who assume that the distribution is supported on the compact set [−1, 1]. Like that of Braverman et al., our protocol is optimal in the case that the distribution is Gaussian.
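To make the one-shot setting concrete, below is a minimal Python sketch of the simplest such scheme: each machine holds its own shard of i.i.d. samples, sends a single message (here, its local sample mean) to a fusion center, and the center averages the messages. This naive averaging is only an illustration of the communication model under assumed Gaussian data; it is not the protocol proposed in the paper, and all names and parameters are illustrative.

```python
import numpy as np


def simulate_one_shot_mean_estimation(theta=0.5, n_machines=20,
                                      samples_per_machine=100, seed=0):
    """Toy simulation of a one-shot distributed location-estimation setting.

    Each machine draws i.i.d. Gaussian samples centred at the unknown theta,
    sends one real number (its local sample mean) to a fusion center, and the
    center averages the received messages. Illustrative only; not the paper's
    protocol.
    """
    rng = np.random.default_rng(seed)
    # Each machine computes its local estimate from its own shard of the data.
    local_means = [
        rng.normal(loc=theta, scale=1.0, size=samples_per_machine).mean()
        for _ in range(n_machines)
    ]
    # One-shot: a single round of communication, one message per machine.
    return float(np.mean(local_means))


if __name__ == "__main__":
    estimate = simulate_one_shot_mean_estimation()
    print(f"fused estimate of the location parameter: {estimate:.4f}")
```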