We investigate deep Bayesian neural networks with Gaussian priors on the weights and ReLU-like nonlinearities, shedding light on novel distributional properties at the level of the network units. The main thrust of the paper is to establish that the prior distribution induced on the units, both before and after activation, becomes increasingly heavy-tailed with depth. We show that first-layer units are Gaussian, second-layer units are sub-Exponential, and we introduce sub-Weibull distributions to characterize the units of deeper layers. This result provides new theoretical insight into deep Bayesian neural networks, underpinning their practical potential. This workshop paper is based on the original paper by Vladimirova et al. ...
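To make the heavier-tails claim concrete: a random variable X is sub-Weibull with tail parameter θ when P(|X| ≥ x) ≤ exp(−(x/K)^{1/θ}) for some K > 0, so θ = 1/2 recovers the sub-Gaussian case and θ = 1 the sub-Exponential case; in the paper, units of layer ℓ are sub-Weibull with θ = ℓ/2. The following Monte Carlo sketch (not the paper's code; the width, depth, input, and fan-in weight scaling are illustrative assumptions) draws network weights from their Gaussian prior many times and estimates the excess kurtosis of a unit's pre-activation at each layer. Heavier tails show up as excess kurtosis growing with depth, starting from roughly zero at the Gaussian first layer.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_preactivations(depth, width=30, n_samples=10_000):
    """For each prior draw of all weights, run one forward pass through a
    ReLU network and record the pre-activation of the first unit of each layer."""
    x = np.ones(width) / np.sqrt(width)   # fixed unit-norm input (arbitrary choice)
    records = np.empty((n_samples, depth))
    for s in range(n_samples):
        h = x
        for layer in range(depth):
            # Gaussian prior on the weights, scaled by fan-in
            W = rng.normal(0.0, 1.0 / np.sqrt(h.size), size=(width, h.size))
            g = W @ h                     # pre-activation induced by the prior
            records[s, layer] = g[0]
            h = np.maximum(g, 0.0)        # ReLU
    return records

def excess_kurtosis(a):
    """Sample excess kurtosis; 0 for Gaussian, positive for heavier tails."""
    a = a - a.mean()
    return (a ** 4).mean() / ((a ** 2).mean() ** 2) - 3.0

pre = sample_preactivations(depth=4)
for layer in range(4):
    print(f"layer {layer + 1}: excess kurtosis ~ {excess_kurtosis(pre[:, layer]):.2f}")
```

Kurtosis is only a crude proxy for the tail parameter, but it suffices to see the qualitative effect: conditionally on the previous layer, each pre-activation is Gaussian with a random variance, and that variance mixing compounds across layers, fattening the tails.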
10 pages, 5 figures, ICML'19 conference.