Kullback-Leibler (KL) divergence is widely used for variational inference of Bayesian Neural Networks (BNNs). However, the KL divergence has limitations such as unboundedness and asymmetry. We examine the Jensen-Shannon (JS) divergence, which is more general, bounded, and symmetric. We formulate a novel loss function for BNNs based on the geometric JS divergence and show that the conventional KL divergence-based loss function is a special case of it. We evaluate the divergence part of the proposed loss function in closed form for a Gaussian prior; for any other prior, Monte Carlo approximations can be used. We provide algorithms for implementing both of these cases. We demonstrate that the proposed loss function offers an additional par...
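As a concrete illustration of the closed-form Gaussian case, the sketch below computes a skewed geometric JS divergence between two univariate Gaussians in Python. It assumes the convention JS^G_alpha(p, q) = (1 - alpha) KL(p, G) + alpha KL(q, G), where G is the normalised weighted geometric mean p^(1-alpha) q^alpha (itself Gaussian when p and q are Gaussian); the function names and the alpha convention here are illustrative assumptions, not necessarily the paper's exact formulation. For a non-Gaussian prior, the two KL terms would instead be estimated by Monte Carlo averages over samples.

    import math

    def kl_gauss(mu1, var1, mu2, var2):
        # KL(N(mu1, var1), N(mu2, var2)) for univariate Gaussians, in nats.
        return 0.5 * (math.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

    def geo_js_gauss(mu_p, var_p, mu_q, var_q, alpha=0.5):
        # Normalised weighted geometric mean of two Gaussians is again Gaussian:
        # precisions mix linearly and the mean is the precision-weighted average.
        prec_g = (1.0 - alpha) / var_p + alpha / var_q
        var_g = 1.0 / prec_g
        mu_g = var_g * ((1.0 - alpha) * mu_p / var_p + alpha * mu_q / var_q)
        # Skewed geometric JS divergence (assumed convention, see lead-in).
        return ((1.0 - alpha) * kl_gauss(mu_p, var_p, mu_g, var_g)
                + alpha * kl_gauss(mu_q, var_q, mu_g, var_g))

    # Zero exactly when p = q; symmetric in its arguments when alpha = 0.5.
    assert abs(geo_js_gauss(0.0, 1.0, 0.0, 1.0)) < 1e-12
    assert abs(geo_js_gauss(0.0, 1.0, 3.0, 0.5) - geo_js_gauss(3.0, 0.5, 0.0, 1.0)) < 1e-12

For alpha strictly between 0 and 1 the value vanishes only when the two Gaussians coincide, and the alpha = 0.5 case is symmetric, in contrast to plain KL.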
To obtain uncertainty estimates with real-world Bayesian deep learning models, practical inference a...
Neural networks can be regarded as statistical models, and can be analysed in a Bayesian framework. ...
The ability to output accurate predictive uncertainty estimates is vital to a reliable classifier. S...
Bayesian Neural Networks (BNNs) are trained to optimize an entire distribution over their weights in...
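For grounding, the standard variational training objective behind these abstracts is the evidence lower bound (ELBO), in which the divergence term acts as the regulariser; this is the textbook form, not any one paper's specific loss: $\mathcal{L}(\theta) = \mathbb{E}_{q_\theta(w)}[\log p(\mathcal{D} \mid w)] - \mathrm{KL}(q_\theta(w), p(w))$, where $q_\theta(w)$ is the variational posterior over weights $w$, $p(w)$ the prior, and $\mathcal{D}$ the data. Swapping the KL term for another divergence, such as the geometric JS divergence above, changes only this regularisation term.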
Statistical divergences quantify the difference between probability distributions, finding multiple u...
Dropout, a stochastic regularisation technique for training of neural networks, has recently been re...
Kullback–Leibler divergence KL(p, q) is the standard measure of error when we have a true probabili...
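For reference, the definition in question, a standard fact: $\mathrm{KL}(p, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx$. It is asymmetric, with $\mathrm{KL}(p, q) \neq \mathrm{KL}(q, p)$ in general, and it is unbounded wherever $q$ assigns vanishing mass to regions where $p$ does not, which is precisely the limitation the JS-based alternatives above target.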
Recent work has attempted to directly approximate the 'function-space' or predictive posterior distr...
We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled ...
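The Jensen-Shannon divergence being decomposed is, in its standard form, $\mathrm{JSD}(p, q) = \tfrac{1}{2}\mathrm{KL}(p, m) + \tfrac{1}{2}\mathrm{KL}(q, m)$ with mixture $m = \tfrac{1}{2}(p + q)$; it is symmetric and bounded above by $\log 2$ in nats, stated here only as the definition the decomposition builds on.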
Bayesian neural networks (BNNs) hold great promise as a flexible and principled solution to deal wit...
Neural networks have shown great predictive power when dealing with various unstructured data such a...
This paper introduces a new neural network based prior for real valued functions on $\mathbb R^d$ wh...