The connection between Bayesian neural networks and Gaussian processes has gained a lot of attention in recent years, with the flagship result that hidden units converge to a Gaussian process limit as the layer widths tend to infinity. Underpinning this result is the fact that hidden units become independent in the infinite-width limit. Our aim is to shed some light on the dependence properties of hidden units in practical finite-width Bayesian neural networks. In addition to theoretical results, we empirically assess the impact of depth and width on the dependence properties of hidden units.
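The Gaussian-process limit described above can be illustrated with a minimal simulation (a sketch of the classic Neal-style result, not the paper's own experiments; all function names and parameters here are illustrative assumptions): with i.i.d. Gaussian priors and a 1/sqrt(width) output scaling, the output of a one-hidden-layer ReLU network at a fixed input becomes approximately Gaussian as the width grows.

```python
import numpy as np

def network_output(x, width, n_samples, rng):
    """Sample the scalar output f(x) of a random one-hidden-layer ReLU net.

    Weights are i.i.d. N(0, 1); the output layer is scaled by 1/sqrt(width),
    the standard scaling under which the infinite-width limit is a Gaussian
    process (Neal, 1996).
    """
    w1 = rng.standard_normal((n_samples, width))      # input-to-hidden weights
    h = np.maximum(w1 * x, 0.0)                       # ReLU hidden units
    w2 = rng.standard_normal((n_samples, width))      # hidden-to-output weights
    return (h * w2).sum(axis=1) / np.sqrt(width)      # scaled network output

rng = np.random.default_rng(0)
outs = network_output(1.0, width=2000, n_samples=20_000, rng=rng)

# For z ~ N(0, 1), E[ReLU(z)^2] = 1/2, so Var[f(x)] should be close to 0.5
# at x = 1, and the empirical distribution close to Gaussian (excess
# kurtosis near 0) at large width.
print(outs.var())
```

At small widths the same simulation shows heavier tails (positive excess kurtosis), which is one symptom of the finite-width hidden-unit dependence the abstract refers to.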
This thesis aims to study recent theoretical work in machine learning research that seeks to better ...
A remarkable characteristic of overparameterized deep neural networks (DNNs) is that their accuracy ...
Understanding capabilities and limitations of different network architectures is of fundamental impo...
We investigate deep Bayesian neural networks with Gaussian priors on the weights and ReLU-like nonli...
This article studies the infinite-width limit of deep feedforward neural networks whose weights are ...
It took until the last decade to finally see a machine match human performance on essentially any ta...
10 pages, 5 figures, ICML'19 conference. We investigate deep Bayesian neural net...
Comparing Bayesian neural networks (BNNs) with different widths is challenging because, as the width...
Recent years have witnessed an increasing interest in the correspondence between infinitely wide net...
Neural networks (NNs) are effective tools that achieve state-of-the-art performance in ...
The need to avoid confident predictions on unfamiliar data has sparked interest in out-of-distributi...