This work focuses on decentralized stochastic optimization in the presence of Byzantine attacks. During the optimization process, an unknown number of malfunctioning or malicious nodes, which we term Byzantine workers, disobey the algorithmic protocol and send wrong messages to their neighbors. Although various Byzantine-resilient algorithms have been developed for distributed stochastic optimization, we show that two major challenges remain in the design of robust aggregation rules suitable for decentralized stochastic optimization: disagreement and non-doubly stochastic mixing matrices. This paper provides a comprehensive analysis disclosing the negative effects of these two issues and gives guidelines for designing ...
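To make the setting concrete, here is a minimal sketch of one decentralized SGD step in which a worker robustly aggregates its neighbors' models before taking a local gradient step. The coordinate-wise trimmed mean shown here is just one common robust aggregation rule used for illustration; the function names, the trimming parameter `b`, and the toy data are assumptions of this sketch, not the aggregation rule proposed in the paper.

```python
import numpy as np

def trimmed_mean(vectors, b):
    """Coordinate-wise trimmed mean: at every coordinate, discard the b
    largest and b smallest values across the received vectors, then
    average what remains. Tolerates up to b Byzantine neighbors."""
    X = np.sort(np.stack(vectors), axis=0)      # shape: (num_vectors, dim)
    return X[b:len(vectors) - b].mean(axis=0)   # drop extremes, average rest

def decentralized_sgd_step(x_i, neighbor_models, grad_i, lr=0.1, b=1):
    """One update at worker i: robustly aggregate its own model together
    with the neighbors' models, then take a local stochastic gradient step."""
    agg = trimmed_mean([x_i] + neighbor_models, b)
    return agg - lr * grad_i

# Toy example: four honest models near [1, 1] plus one Byzantine outlier.
honest = [np.array([1.0, 1.0]) + 0.01 * k for k in range(4)]
byzantine = [np.array([100.0, -100.0])]
agg = trimmed_mean(honest + byzantine, b=1)
# agg stays close to the honest models despite the outlier.
```

With `b=1`, the single Byzantine vector is trimmed away at every coordinate, so the aggregate remains near the honest consensus; a plain average, by contrast, would be dragged far off by the outlier.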
Byzantine resilience emerged as a prominent topic within the distributed machine learning community....
The problem of distributed optimization requires a group of networked agents to compute a parameter ...
This paper proposes a novel approach to resilient distributed optimization with quadratic costs in a...
In this paper, we propose a class of robust stochastic subgradient methods for distributed learning ...
This paper studies the problem of distributed stochastic optimization in an adversarial setting wher...
Asynchronous distributed machine learning solutions have proven very effective so far, but always as...
A very common optimization technique in Machine Learning is Stochastic Gradient Descent (SGD). SGD c...
Byzantine-resilient Stochastic Gradient Descent (SGD) aims at shielding model training from Byzantin...
We report on \emph{Krum}, the first \emph{provably} Byzantine-tolerant aggregation rule for distribu...
In this paper, we propose an iterative scheme for distributed Byzantine-resilient estimation of a gra...
We present AGGREGATHOR, a framework that implements state-of-the-art robust (Byzantine-resilient) di...
Algorithms are everywhere. The recipe for the frangipane cake is an algorithm. If all the listed ing...
This paper investigates several voting consensus protocols with low computational complexity in nois...
For many data-intensive real-world applications, such as recognizing objects from images, detecting ...
This paper considers the problem of Byzantine fault-tolerance in distributed multi-agent optimizatio...