Attacks that aim to identify the training data of neural networks represent a severe threat to the privacy of individuals in the training dataset. One possible protection is to anonymize the training data, or the training function itself, with differential privacy. However, data scientists must choose between local and central differential privacy, and need to select meaningful privacy parameters. Comparing local and central differential privacy on the basis of their privacy parameters alone can furthermore lead data scientists to incorrect conclusions, since the parameters reflect different types of mechanisms. Instead, we empirically compare the relative privacy-accuracy trade-off of one central and two local differential ...
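The distinction the abstract draws between central and local differential privacy can be illustrated with a minimal sketch. The function names and the specific mechanisms below (a Laplace mechanism for the central model, randomized response for the local model) are illustrative assumptions, not the mechanisms evaluated in the paper:

```python
import math
import random

def central_dp_mean(values, lo, hi, epsilon, rng):
    """Central model: a trusted curator sees the raw values, computes the
    true mean, and adds Laplace noise scaled to the query's sensitivity."""
    n = len(values)
    true_mean = sum(values) / n
    sensitivity = (hi - lo) / n  # one record moves the mean by at most this
    scale = sensitivity / epsilon
    u = rng.random() - 0.5       # Laplace sample via inverse CDF
    sign = 1.0 if u >= 0 else -1.0
    noise = -scale * sign * math.log(1 - 2 * abs(u))
    return true_mean + noise

def local_dp_bit(bit, epsilon, rng):
    """Local model: each user randomizes their own bit (randomized
    response) before it leaves their device; no trusted curator needed."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if rng.random() < p_truth else 1 - bit
```

Even at the same nominal epsilon, the local model perturbs every individual contribution while the central model adds a single noise draw to the aggregate, which is why comparing the two settings by privacy parameters alone is misleading.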
Federated learning (FL) was originally regarded as a framework for collaborative learning among clie...
We study the privacy implications of training recurrent neural networks (RNNs) with sensitive traini...
Neural network pruning has been an essential technique to reduce the computation and memory requirem...
Deep Learning (DL) has become increasingly popular in recent years. While DL models can achieve high...
Does a neural network's privacy have to be at odds with its accuracy? In this work, we study the eff...
Data holders are increasingly seeking to protect their users’ privacy, whilst still maximizing their...
Deep ensemble learning has been shown to improve accuracy by training multiple neural networks and a...
This position paper deals with privacy for deep neural networks, more precisel...
Differential Privacy (DP) is the de facto standard for reasoning about the privacy guarantees of a t...
Hierarchical text classification consists of classifying text documents into a hierarchy of classes ...
Machine learning models are commonly trained on sensitive and personal data such as pictures, medica...
Artificial Intelligence has been widely applied today, and the subsequent privacy leakage problems h...
We study the privacy risks that are associated with training a neural network's weights with self-su...
Over the last decade there have been great strides made in developing techniques to compute function...
Large capacity machine learning (ML) models are prone to membership inference attacks (MIAs), which ...