We consider the problem of private distributed computation. Our main interest in this problem stems from machine learning applications. A master node (referred to as the master) possesses a tremendous amount of confidential data (e.g., personal, genomic, or medical data) and wants to perform intensive computations on it. The master divides these computations into smaller computational tasks and distributes them to untrusted workers that perform the tasks in parallel. The workers return their results to the master, which processes them to obtain the result of its original computation. In large-scale systems, the appearance of slow and unresponsive workers, called stragglers, is inevitable. Stragglers incur large delays in the computation if they are not accounted for.
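The master–worker setting with stragglers described above can be sketched in a few lines. The following is a minimal illustration (not any particular paper's scheme) of straggler mitigation by task replication: the master splits a matrix–vector product into row-block tasks, assigns each task to `replicas` workers with simulated random delays, and keeps whichever replica of each task finishes first. The function and parameter names are hypothetical.

```python
# Minimal sketch of straggler-tolerant distributed computation via
# task replication (illustrative only; simulated workers with random delays).
import concurrent.futures
import random
import time

def worker(block, x):
    # Simulate a worker that may straggle before returning its partial product.
    time.sleep(random.uniform(0.0, 0.05))
    return [sum(a * b for a, b in zip(row, x)) for row in block]

def master_multiply(A, x, num_blocks=2, replicas=2):
    # Master splits A into row blocks and replicates each task `replicas` times.
    size = len(A) // num_blocks
    blocks = [A[i * size:(i + 1) * size] for i in range(num_blocks)]
    result = [None] * num_blocks
    with concurrent.futures.ThreadPoolExecutor(max_workers=num_blocks * replicas) as pool:
        futures = {}
        for i, blk in enumerate(blocks):
            for _ in range(replicas):
                futures[pool.submit(worker, blk, x)] = i
        for fut in concurrent.futures.as_completed(futures):
            i = futures[fut]
            if result[i] is None:  # keep only the fastest replica per task
                result[i] = fut.result()
    return [v for blk in result for v in blk]

print(master_multiply([[1, 0], [0, 1], [2, 0], [0, 2]], [3, 4]))  # [3, 4, 6, 8]
```

Replication trades extra computation for latency: the master never waits for the slowest replica of a task. Coded-computation schemes refine this idea by sending linear combinations of blocks so that any sufficiently large subset of responses suffices to decode.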
Coded computation techniques provide robustness against straggling workers in distributed computing....
In this paper, we address the problem of privacy-preserving distributed learning and the evaluation ...
Distributed machine learning bridges the traditional fields of distributed systems and machine learn...
We consider the setting of a master server who possesses confidential data (genomic, medical data, e...
We consider a scenario involving computations over a massive dataset stored distributedly across mul...
With the unprecedented rate at which data is generated daily, it has become difficult and ineff...
When gradient descent (GD) is scaled to many parallel workers for large-scale machine learning appli...
We propose a privacy-preserving federated learning (FL) scheme that is resilient against straggling ...
In many practical settings, a user needs to perform computations---for example, using machine learni...
Existing approaches to distributed matrix computations involve allocating coded combinations of subm...
Secure computation consists of protocols for secure arithmetic: secret values are added a...
We consider the problem of secure and private multiparty computation (MPC), in which the goal is to ...
© 2017 Kim Sasha Ramchen. A fundamental problem in large distributed systems is how to enable parties ...
Over the past decade, distributed representation learning has emerged as a popular alternative to co...
In the problem of private “swarm” computing, n agents wish to securely and distributively perform a ...