Bayesian Neural Networks (BNNs) are trained to optimize an entire distribution over their weights instead of a single set of values, a choice that brings significant advantages in terms of, e.g., interpretability, multi-task learning, and calibration. Because the resulting optimization problem is intractable, most BNNs are either sampled through Monte Carlo methods or trained by minimizing a suitable Evidence Lower Bound (ELBO) on a variational approximation. In this paper, we propose an optimized version of the latter, in which we replace the Kullback–Leibler divergence in the ELBO term with a Maximum Mean Discrepancy (MMD) estimator, inspired by recent work in variational inference. After motivating our proposal based on the properties of the MMD term,...
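To make the substitution described above concrete, here is a minimal sketch, assuming a standard variational setup with data \mathcal{D}, weights \mathbf{w}, variational posterior q(\mathbf{w}), prior p(\mathbf{w}), and a generic kernel k; these symbols and the particular (biased) MMD estimator below are illustrative assumptions, not taken from the excerpt. The usual ELBO reads

\mathcal{L}_{\mathrm{ELBO}}(q) = \mathbb{E}_{q(\mathbf{w})}\bigl[\log p(\mathcal{D} \mid \mathbf{w})\bigr] - \mathrm{KL}\bigl(q(\mathbf{w}) \,\|\, p(\mathbf{w})\bigr),

and the proposal amounts to replacing the KL term with a kernel-based MMD estimate between the variational posterior and the prior,

\mathcal{L}_{\mathrm{MMD}}(q) = \mathbb{E}_{q(\mathbf{w})}\bigl[\log p(\mathcal{D} \mid \mathbf{w})\bigr] - \widehat{\mathrm{MMD}}^{2}\bigl(q(\mathbf{w}), p(\mathbf{w})\bigr),

where, given samples \mathbf{w}_1,\dots,\mathbf{w}_m \sim q(\mathbf{w}) and \mathbf{w}'_1,\dots,\mathbf{w}'_n \sim p(\mathbf{w}), one standard estimator is

\widehat{\mathrm{MMD}}^{2}(q,p) = \frac{1}{m^2}\sum_{i,j} k(\mathbf{w}_i,\mathbf{w}_j) \;-\; \frac{2}{mn}\sum_{i=1}^{m}\sum_{j=1}^{n} k(\mathbf{w}_i,\mathbf{w}'_j) \;+\; \frac{1}{n^2}\sum_{i,j} k(\mathbf{w}'_i,\mathbf{w}'_j).

The exact kernel, sample sizes, and weighting of the divergence term used in the paper are not recoverable from this excerpt.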
Bayesian Neural Networks provide a tool to estimate the uncertainty of a neural network by consideri...
Bayesian neural networks (BNNs) offer a promising probabilistic take on neural networks, allowing un...
Bayesian Neural Networks (BNNs) have long been considered an ideal, yet unscal...
Bayesian neural networks (BNNs) hold great promise as a flexible and principled solution to deal wit...
The ability to output accurate predictive uncertainty estimates is vital to a reliable classifier. S...
A classic inferential statistical problem is the goodness-of-fit (GOF) test. Such a test can be chal...
Bayesian statistics is a powerful framework for modeling the world and reasoning over uncertainty. I...
In this work, I will focus on ways in which we can build machine learning models that appropriately ...
Máster Universitario en Investigación e Innovación en Inteligencia Computacional y Sistemas Inter...
Deep Learning-based models are becoming more and more relevant for an increasing number of applicati...
Stochastic variational inference for Bayesian deep neural networks (DNNs) requires specifying priors a...
An artificial neural network (ANN) is a powerful machine learning method that is used in many modern...
Kullback-Leibler (KL) divergence is widely used for variational inference of Bayesian Neural Network...
We provide a rigorous analysis of training by variational inference (VI) of Bayesian neural networks...