In this paper we address the problem of improving recent milestone results on estimating the generalization capability of a randomized learning algorithm via Differential Privacy (DP). In particular, we derive new DP-based multiplicative Chernoff and Bennett type generalization bounds, which improve over the current state-of-the-art Hoeffding type bound. We then prove that a randomized algorithm built on the Boltzmann distributions of Catoni (2007) [10], with a prior that depends on the data-generating distribution and a data-dependent posterior, is differentially private and exhibits better generalization properties than the Gibbs classifier associated with the same distributions; we illustrate this result with a simple example. Finally, we discuss the advant...
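As a point of reference for the construction mentioned above, the sketch below samples a single hypothesis from a Boltzmann (Gibbs) posterior Q(h) ∝ π(h) exp(−β L̂(h)) over a finite hypothesis set. This is a simplified, illustrative setting, not the data-generating-dependent-prior construction analysed in the paper: with a data-independent prior π and empirical losses bounded in [0, 1] averaged over n samples, releasing one draw is (2β/n)-differentially private by the standard exponential-mechanism argument. The function name, the uniform prior, the value of β, and the finite hypothesis class are all assumptions made for illustration.

import numpy as np

def gibbs_posterior_sample(losses, prior, beta, rng=None):
    """Draw one hypothesis index from Q(h) proportional to prior(h) * exp(-beta * losses[h]).

    With a data-independent prior and empirical losses in [0, 1] averaged over
    n samples, one draw is (2*beta/n)-differentially private (exponential
    mechanism with utility -losses[h], whose sensitivity is 1/n).
    """
    rng = np.random.default_rng() if rng is None else rng
    log_w = np.log(np.asarray(prior, dtype=float)) - beta * np.asarray(losses, dtype=float)
    log_w -= log_w.max()               # subtract the max for numerical stability
    weights = np.exp(log_w)
    probs = weights / weights.sum()
    return rng.choice(len(probs), p=probs)

# Toy usage (hypothetical numbers): 5 candidate classifiers with empirical 0-1
# losses computed on n = 100 points, uniform prior, beta = 10 => roughly 0.2-DP per draw.
losses = [0.30, 0.22, 0.25, 0.40, 0.21]
prior = [0.2] * 5
print(gibbs_posterior_sample(losses, prior, beta=10.0))

A larger β concentrates the posterior on hypotheses with low empirical loss (better fit, weaker privacy), while a smaller β keeps the draw closer to the prior (stronger privacy).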
In recent years, privacy-enhancing technologies have gained tremendous momentum and they are expecte...
In modern settings of data analysis, we may be running our algorithms on datasets that are ...
Motivated by the increasing concern about privacy in today's data-intensive online learning systems...
In the context of assessing the generalization abilities of a randomized model or learning algorithm...
We address the problem of randomized learning and generalization of fair and private classifiers. Fr...
This work studies the problem of privacy-preserving classification – namely, learning a classifier f...
Prior work on differential privacy analysis of randomized SGD algorithms relies on composition theor...
Differential privacy is now the de facto industry standard for ensuring privacy while publicly relea...
Propose-Test-Release (PTR) is a differential privacy framework that works with local sensitivity of ...
We prove new upper and lower bounds on the sample complexity of (ε, δ)-differentially private algori...
Using results from PAC-Bayesian bounds in learning theory, we formulate differentially-private learn...
Producing statistics that respect the privacy of the samples while still maintaining their accuracy ...
In this thesis, we study when algorithmic tasks can be performed on sensitive data while protecting ...
This dissertation studies the trade-off between differential privacy and statistical accuracy in par...