We propose a generic distributed learning framework for robust statistical learning on big contaminated data. The Distributed Robust Learning (DRL) framework can reduce the computational cost of traditional robust learning methods by several orders of magnitude. We provide a sharp analysis of the robustness of DRL, showing that DRL not only preserves the robustness of the base robust learning methods but also tolerates the breakdown of a constant fraction of computing nodes. Moreover, under favorable conditions, DRL can raise the breakdown point of existing robust learning methods to exceed 50%. This enhanced robustness stands in sharp contrast with the naive divide-and-fusion method, whose breakdown point may be reduced by several orders of magnitude. We ...
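To make the divide-and-fuse idea concrete, the following is a minimal sketch, not the paper's exact algorithm: the data are partitioned across nodes, each node runs a base robust estimator locally (here, a coordinate-wise median for mean estimation), and the local estimates are fused with a robust aggregator (here, the geometric median via Weiszfeld iterations). The fusion step is what lets the combined estimate survive the breakdown of a minority of nodes; the specific choice of local estimator and aggregator is an illustrative assumption.

```python
import numpy as np

def weiszfeld_geometric_median(points, n_iter=100, eps=1e-8):
    """Approximate the geometric median of the row vectors of `points`
    using Weiszfeld's fixed-point iteration."""
    median = points.mean(axis=0)
    for _ in range(n_iter):
        dists = np.linalg.norm(points - median, axis=1)
        dists = np.maximum(dists, eps)          # guard against division by zero
        weights = 1.0 / dists
        new_median = (weights[:, None] * points).sum(axis=0) / weights.sum()
        if np.linalg.norm(new_median - median) < eps:
            break
        median = new_median
    return median

def distributed_robust_mean(data, n_nodes, corrupt_nodes=()):
    """Divide-and-fuse sketch of distributed robust mean estimation:
    each node computes a local robust estimate (coordinate-wise median),
    then the local estimates are fused with the geometric median, which
    tolerates arbitrary outputs from a minority of broken nodes."""
    chunks = np.array_split(data, n_nodes)
    local = np.array([np.median(chunk, axis=0) for chunk in chunks])
    for i in corrupt_nodes:                     # simulate node breakdowns
        local[i] = np.full(local.shape[1], 1e6)
    return weiszfeld_geometric_median(local)
```

Because the expensive robust estimation runs only on each node's small shard, the per-node cost drops roughly by a factor of the number of nodes, while the geometric-median fusion keeps the aggregate estimate near the truth even when several nodes return wildly corrupted values.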