We develop differentially private hypothesis testing methods for the small-sample regime. Given a sample V from a categorical distribution p over some domain Σ, an explicitly described distribution q over Σ, a privacy parameter ε, an accuracy parameter α, and requirements βI and βII for the type I and type II errors of our test, the goal is to distinguish between p = q and dTV(p, q) ≥ α. We provide theoretical bounds on the sample size |V| so that our method both satisfies (ε, 0)-differential privacy and guarantees βI and βII type I and type II errors. We show that differential privacy may come for free in some regimes of parameters, and we always beat the sample complexity resulting from running the χ²-test with n...
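The setup above suggests a natural mechanism: compute an identity-testing statistic from the sample and privatize it with Laplace noise before thresholding. The sketch below is only illustrative and is not the authors' actual test; it assumes the standard chi-squared statistic, and the names dp_identity_test, tau, and sensitivity are hypothetical (a real analysis would calibrate tau and the sensitivity bound from α, βI, and βII).

```python
import numpy as np

def dp_identity_test(sample, q, eps, tau, sensitivity):
    """Hedged sketch of a privatized identity test.

    sample      -- array of category indices drawn from the unknown p
    q           -- explicitly described null distribution over the domain
    eps         -- privacy parameter for (eps, 0)-differential privacy
    tau         -- rejection threshold (hypothetical; would be calibrated
                   from alpha, beta_I, beta_II)
    sensitivity -- assumed upper bound on how much the statistic can
                   change when one sample point changes
    """
    q = np.asarray(q, dtype=float)
    n = len(sample)
    counts = np.bincount(sample, minlength=len(q))

    # Chi-squared-style identity-testing statistic: small in expectation
    # when p = q, large when dTV(p, q) >= alpha.
    stat = np.sum((counts - n * q) ** 2 / (n * q))

    # Laplace mechanism: noise with scale sensitivity / eps makes the
    # released statistic (eps, 0)-differentially private.
    noisy_stat = stat + np.random.laplace(scale=sensitivity / eps)

    return "reject p = q" if noisy_stat > tau else "accept p = q"
```

Because only the single noised statistic is released, the privacy guarantee follows from the Laplace mechanism alone; the sample-complexity claims in the abstract concern how large |V| must be before the added noise stops degrading the type I and type II errors.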
Differential privacy is now the de facto industry standard for ensuring privacy while publicly relea...
Differential privacy (DP) uses a probabilistic framework to measure the level of privacy protection ...
Algorithms such as Differentially Private SGD enable training machine learning models with formal pr...
Statistical tests are ...
In modern settings of data analysis, we may be running our algorithms on datasets that are ...
The challenge of producing accurate statistics while respecting the privacy of...
Hypothesis testing is one of the most common types of data analysis and forms the backbone of scient...
Recent years have witnessed growing concerns about the privacy of sensitive data. In response to the...
Data analysis is inherently adaptive, where previous results may influence which tests are carried o...
While running any experiment, we often have to consider the statistical power to ensure an effective...
A statistical hypothesis test determines whether a hypothesis should be rejected based on samples fr...
We provide improved differentially private algorithms for identity testing of high-dimensional distr...
We develop differentially private methods for estimating v...
In this thesis, we study when algorithmic tasks can be performed on sensitive data while protecting ...
This work studies the problem of privacy-preserving classification – namely, learning a classifier f...