This paper addresses the problem of combining Byzantine resilience with privacy in machine learning (ML). Specifically, we study if a distributed implementation of the renowned Stochastic Gradient Descent (SGD) learning algorithm is feasible with both differential privacy (DP) and (alpha, f)-Byzantine resilience. To the best of our knowledge, this is the first work to tackle this problem from a theoretical point of view. A key finding of our analyses is that the classical approaches to these two (seemingly) orthogonal issues are incompatible. More precisely, we show that a direct composition of these techniques makes the guarantees of the resulting SGD algorithm depend unfavourably upon the number of parameters of the ML model, making the t...
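The abstract above concerns composing differential privacy with Byzantine-resilient SGD. As context, here is a minimal sketch of the standard DP-SGD update (per-example gradient clipping followed by calibrated Gaussian noise); the function name and parameters are illustrative, not taken from the paper. Note that the injected noise vector has expected L2 norm on the order of sigma * clip * sqrt(d) for a d-parameter model, which is the dimension dependence the abstract refers to.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1, clip=1.0, sigma=1.0, rng=None):
    """One illustrative DP-SGD step: clip each per-example gradient to
    L2 norm <= clip, average, add Gaussian noise scaled to the clip bound."""
    rng = np.random.default_rng(0) if rng is None else rng
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down gradients whose L2 norm exceeds the clipping bound.
        clipped.append(g * min(1.0, clip / max(norm, 1e-12)))
    # Noise standard deviation is calibrated to the per-example sensitivity
    # (clip / batch size); larger models add noise in every coordinate.
    noise = rng.normal(0.0, sigma * clip / len(per_example_grads), size=params.shape)
    return params - lr * (np.mean(clipped, axis=0) + noise)
```

With sigma = 0 this reduces to plain clipped-gradient SGD, which makes the clipping behaviour easy to check in isolation.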
Differentially private stochastic gradient descent (DP-SGD) has been widely adopted in deep learning...
Deep Learning (DL) has become increasingly popular in recent years. While DL models can achieve high...
Asynchronous distributed machine learning solutions have proven very effective so far, but always as...
Privacy and Byzantine resilience (BR) are two crucial requirements of modern-day distributed machine...
This paper aims at jointly addressing two seemingly conflicting issues in federated learning: different...
Differential privacy is a recent framework for computation on sensitive data, which has sh...
A central issue in machine learning is how to train models on sensitive user data. Industry has wide...
In this paper, we apply machine learning to distributed private data owned by multiple data owners, ...
Prior work on differential privacy analysis of randomized SGD algorithms relies on composition theor...
Because learning sometimes involves sensitive data, machine learning algorithms have been extended t...
We analyse the privacy leakage of noisy stochastic gradient descent by modeling Rényi divergence dyn...
Bayesian deep learning has recently been regarded as an intrinsic way to characterize the weight uncertain...
Privacy is a key concern in many distributed systems that are rich in personal data such as networks...
Decentralized machine learning has been playing an essential role in improving training efficiency. ...
Nowadays, owners and developers of deep learning models must consider stringent privacy-preservation...