Machine learning (ML) models can leak information about users, and differential privacy (DP) provides a rigorous way to bound that leakage under a given budget. This DP budget can be regarded as a new type of compute resource in workloads where multiple ML models train on user data. Once spent, the DP budget is consumed forever; it is therefore crucial to allocate it as efficiently as possible so that as many models as possible can be trained. This paper presents a scheduler for privacy that optimizes for efficiency. We formulate privacy scheduling as a new type of multidimensional knapsack problem, called privacy knapsack, which maximizes DP budget efficiency. We show that privacy knapsack is NP-hard, hence practical algorithms are necessarily approximate...
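To make the privacy-knapsack formulation concrete, here is a minimal greedy sketch of the general idea, not the paper's actual algorithm: each data block carries a finite epsilon budget (one knapsack dimension per block), each training task demands some epsilon from a subset of blocks, and since the exact problem is NP-hard a simple heuristic schedules tasks by profit per unit of demanded budget. All names (`Task`, `greedy_privacy_knapsack`, the block/task values) are hypothetical, for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Task:
    """A model-training request (hypothetical structure, for illustration)."""
    name: str
    profit: float             # value of training this model
    demand: dict[str, float]  # epsilon demanded from each data block

def greedy_privacy_knapsack(tasks, capacity):
    """Toy greedy heuristic for the (NP-hard) privacy knapsack:
    consider tasks in decreasing profit-per-unit-of-demanded-budget,
    skipping any task that would exceed a block's remaining epsilon."""
    remaining = dict(capacity)
    order = sorted(tasks,
                   key=lambda t: t.profit / sum(t.demand.values()),
                   reverse=True)
    scheduled = []
    for t in order:
        # A task runs only if every block it touches still has enough budget.
        if all(remaining.get(b, 0.0) >= eps for b, eps in t.demand.items()):
            for b, eps in t.demand.items():
                remaining[b] -= eps  # budget, once consumed, is gone forever
            scheduled.append(t.name)
    return scheduled, remaining

blocks = {"block0": 1.0, "block1": 1.0}  # epsilon budget per data block
tasks = [
    Task("A", profit=3.0, demand={"block0": 0.5}),
    Task("B", profit=4.0, demand={"block0": 0.6, "block1": 0.6}),
    Task("C", profit=1.0, demand={"block1": 0.5}),
]
chosen, left = greedy_privacy_knapsack(tasks, blocks)
```

In this toy instance, task A (ratio 6.0) is scheduled first and exhausts enough of block0's budget that B (ratio ~3.3) no longer fits, after which C still fits on block1; the heuristic illustrates why efficiency-aware packing, rather than first-come-first-served allocation, is the natural objective.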