A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the Differential Privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide Differentially Private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that, for the dataset used, this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolk...
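The abstract above describes shaping the covariance of the protective noise (the "cloaking" method) rather than adding it independently per prediction. As a minimal, hedged sketch of the simpler baseline that such a method improves on, the snippet below applies the standard Gaussian mechanism to a GP posterior mean; the RBF kernel, the toy data, the `sensitivity` bound, and the function name `dp_gp_posterior_mean` are illustrative assumptions, not the paper's cloaking construction (which, roughly, would replace the i.i.d. noise with a correlated Gaussian draw calibrated to each training point's influence on the mean).

```python
# Simplified sketch: GP regression with Gaussian-mechanism output perturbation.
# NOT the cloaking method; `sensitivity` is an assumed L2 bound on how much the
# released mean vector can change when one individual's data changes.
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def dp_gp_posterior_mean(X, y, X_star, eps, delta, sensitivity,
                         noise_var=0.1, seed=0):
    """GP posterior mean at X_star, perturbed with the Gaussian mechanism."""
    rng = np.random.default_rng(seed)
    K = rbf_kernel(X, X) + noise_var * np.eye(len(X))
    K_star = rbf_kernel(X_star, X)
    mean = K_star @ np.linalg.solve(K, y)
    # Gaussian mechanism: sigma >= sensitivity * sqrt(2 ln(1.25/delta)) / eps
    # gives (eps, delta)-DP for eps < 1 (Dwork & Roth).
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return mean + rng.normal(0.0, sigma, size=mean.shape)

# Toy usage on synthetic data; the sensitivity value here is a placeholder.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
X_star = np.linspace(-3, 3, 20)[:, None]
print(dp_gp_posterior_mean(X, y, X_star, eps=1.0, delta=1e-5, sensitivity=0.5))
```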
The availability of large amounts of informative data is crucial for successful machine learning. Ho...
This work addresses the problem of learning from large collections of data wit...
Domains involving sensitive human data, such as health care, human mobility, and online activity, ar...
A continuing challenge for machine learning is providing methods to perform computation on data whil...
In this paper, we present a notion of differential privacy (DP) for data that comes from different c...
Differentially Private Stochastic Gradient Descent (DP-SGD) is a key method for applying privacy in ...
The framework of differential privacy protects an individual's privacy while publishing query respon...
Differentially private stochastic gradient descent (DP-SGD) is the workhorse algorithm for recent ad...
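Several of the abstracts above refer to DP-SGD. As a minimal sketch of the mechanism they describe (per-example gradient clipping followed by calibrated Gaussian noise on the summed gradient), the snippet below uses a toy linear-regression loss; the hyperparameters `clip` and `noise_mult` are illustrative assumptions, and the accounting that converts the noise multiplier into a final (ε, δ) guarantee over all iterations is omitted.

```python
# Minimal DP-SGD sketch: clip each per-example gradient to L2 norm <= clip,
# sum, add Gaussian noise scaled by clip * noise_mult, then average and step.
import numpy as np

def dp_sgd(X, y, epochs=20, lr=0.1, batch_size=16, clip=1.0, noise_mult=1.0, seed=0):
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    n = len(X)
    for _ in range(epochs):
        idx = rng.choice(n, size=batch_size, replace=False)
        # Per-example gradients of squared error: g_i = 2 * (x_i . w - y_i) * x_i
        resid = X[idx] @ w - y[idx]
        grads = 2.0 * resid[:, None] * X[idx]                  # shape (batch, d)
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads / np.maximum(1.0, norms / clip)          # clip to L2 <= clip
        noisy_sum = grads.sum(axis=0) + rng.normal(0.0, noise_mult * clip, size=w.shape)
        w -= lr * noisy_sum / batch_size
    return w

# Toy usage on synthetic data (privacy accounting across epochs not shown).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)
print(dp_sgd(X, y))
```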
Differential privacy has seen remarkable success as a rigorous and practical formalization of data p...
While modern machine learning models rely on increasingly large training datasets, data is often lim...
Training large neural networks with meaningful/usable differential privacy security guarantees is a ...
Differential privacy (DP) has become a rigorous central concept in privacy protection for the past d...