Because learning sometimes involves sensitive data, machine learning algorithms have been extended to offer differential privacy for training data. In practice, this has been mostly an afterthought, with privacy-preserving models obtained by re-running training with a different optimizer, but using the model architectures that already performed well in a non-privacy-preserving setting. This approach leads to less than ideal privacy/utility tradeoffs, as we show here. To improve these tradeoffs, prior work introduces variants of differential privacy that weaken the proved privacy guarantee in order to increase model utility. We show this is not necessary and instead propose that utility be improved by choosing activation functions designed explicitly...
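A minimal sketch of such a bounded activation family, assuming a scaled and shifted sigmoid of the form ψ(x) = s·σ(T·x) − o (the parameter names `s`, `T`, and `o` are illustrative, not taken from the truncated abstract):

```python
import numpy as np

def tempered_sigmoid(x, s=2.0, T=2.0, o=1.0):
    """Bounded activation family: a sigmoid scaled by s, sharpened by
    inverse temperature T, and shifted down by offset o. Outputs lie in
    the open interval (-o, s - o), so activations stay bounded, which
    limits the bias introduced by per-example gradient clipping."""
    return s / (1.0 + np.exp(-T * x)) - o
```

With the defaults s=2, T=2, o=1 this family reduces exactly to tanh, which makes it easy to sanity-check against a standard activation.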
The Gradient Boosting Decision Tree (GBDT) is a popular machine learning model for various tasks in ...
Prior work on differential privacy analysis of randomized SGD algorithms relies on composition theor...
Differential privacy is a recent framework for computation on sensitive data, which has sh...
Training large neural networks with meaningful/usable differential privacy security guarantees is a ...
Nowadays, owners and developers of deep learning models must consider stringent privacy-preservation...
Deep Learning (DL) has become increasingly popular in recent years. While DL models can achieve high...
Differentially Private Stochastic Gradient Descent (DP-SGD) is a key method for applying privacy in ...
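The core DP-SGD update mentioned here can be sketched as follows (a minimal numpy sketch; the function and parameter names `clip_norm` and `noise_multiplier` are illustrative conventions, not from the truncated abstract):

```python
import numpy as np

def clip_grad(g, clip_norm):
    """Rescale a single example's gradient so its L2 norm is at most clip_norm."""
    norm = np.linalg.norm(g)
    return g * min(1.0, clip_norm / max(norm, 1e-12))

def dp_sgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
                noise_multiplier=1.1, rng=None):
    """One DP-SGD update: clip each per-example gradient, average the
    clipped gradients, then add Gaussian noise with standard deviation
    noise_multiplier * clip_norm / batch_size before the descent step."""
    rng = rng if rng is not None else np.random.default_rng(0)
    clipped = [clip_grad(g, clip_norm) for g in per_example_grads]
    batch_size = len(clipped)
    mean_grad = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm / batch_size,
                       size=mean_grad.shape)
    return params - lr * (mean_grad + noise)
```

The clipping bounds each example's influence on the update, and the Gaussian noise calibrated to that bound is what yields the differential privacy guarantee.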
In this paper, we introduce a data augmentation-based defense strategy for preventing the reconstruc...
Data holders are increasingly seeking to protect their users’ privacy, whilst still maximizing their...
In this paper, we focus on developing a novel mechanism to preserve differential privacy in deep neu...
Privacy in AI remains a topic that draws attention from researchers and the general public in recent...
Training even moderately sized generative models with differentially private stochastic gradient des...
Brinkrolf J, Berger K, Hammer B. Differential private relevance learning. In: Verleysen M, ed. Proce...
This paper addresses the problem of combining Byzantine resilience with privacy in machine learning ...
Deep learning models have revolutionized AI tasks by producing accurate predictions. These models’ s...