Recent works have investigated deep learning models trained by optimising PAC-Bayes bounds, with priors that are learnt on subsets of the data. This combination has been shown to lead not only to accurate classifiers, but also to remarkably tight risk certificates, bearing promise towards self-certified learning (i.e. using all the data to learn a predictor and certify its quality). In this work, we empirically investigate the role of the prior. We experiment on 6 datasets with different strategies and amounts of data to learn data-dependent PAC-Bayes priors, and we compare them in terms of their effect on test performance of the learnt predictors and tightness of their risk certificates. We ask what is the optimal amount of data which should ...
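The abstract above contrasts the tightness of risk certificates obtained with different prior/bound data splits. As a minimal sketch of what such a certificate looks like, the snippet below evaluates a McAllester-style PAC-Bayes bound: with probability at least 1 - delta, the true risk is at most the empirical risk plus sqrt((KL(Q||P) + ln(2*sqrt(n)/delta)) / (2n)). The numbers plugged in are hypothetical, chosen only to illustrate the trade-off the paper studies: learning the prior on part of the data can shrink the KL term enough to tighten the bound, even though fewer examples (n) remain for evaluating it.

```python
import math

def pac_bayes_bound(emp_risk: float, kl: float, n: int, delta: float = 0.05) -> float:
    """McAllester-style PAC-Bayes risk certificate.

    emp_risk : empirical risk of the posterior Q on the n bound examples
    kl       : KL divergence between posterior Q and prior P
    n        : number of examples used to evaluate the bound
    delta    : confidence parameter (bound holds with prob. >= 1 - delta)
    """
    complexity = (kl + math.log(2.0 * math.sqrt(n) / delta)) / (2.0 * n)
    return emp_risk + math.sqrt(complexity)

# Hypothetical scenario: a data-free prior uses all 60k examples for the
# bound but incurs a large KL; a prior learnt on half the data leaves only
# 30k bound examples but drastically reduces the KL term.
loose = pac_bayes_bound(emp_risk=0.05, kl=5000.0, n=60000)
tight = pac_bayes_bound(emp_risk=0.05, kl=500.0, n=30000)
print(f"data-free prior certificate:      {loose:.3f}")
print(f"data-dependent prior certificate: {tight:.3f}")
```

The function name and the specific KL/risk values are illustrative assumptions, not figures from any of the papers listed here; the bound itself is the standard McAllester form.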
In machine learning, it is common ...
In recent years, Neural Networks (NN) have become a popular data-analytic tool in Statistics, Compu...
We investigate deep Bayesian neural networks with Gaussian priors on the weigh...
This paper presents an empirical study regarding training probabilistic neural networks using traini...
A learning method is self-certified if it uses all available data to simultane...
In this paper we further develop the idea that the PAC-Bayes prior can be defined based on the data-...
Risk bounds, which are also called generalisation bounds in the statistical learning literature, are...
Deep neural networks have bested notable benchmarks across computer vision, reinforcement learning, ...
The Bayesian treatment of neural networks dictates that a prior distribution is specified over their...
What is the best way to exploit extra data -- be it unlabeled data from the same task, or labeled da...
PAC-Bayes bounds have been proposed to get risk estimates based on a training sample. In this paper...
The Bayesian posterior minimizes the "inferential risk" which itself bounds the "predictive risk". T...
The aim of this thesis is to present a set of principles known as the theory of PAC-...
This paper introduces a new neural network based prior for real valued functions on $\mathbb R^d$ wh...
Recent studies have empirically investigated different methods to train stochastic neural networks o...