Bayesian neural networks (BNNs) hold great promise as a flexible and principled solution to deal with uncertainty when learning from finite data. Among the approaches for realizing probabilistic inference in deep neural networks, variational Bayes (VB) is theoretically grounded, generally applicable, and computationally efficient. Given these widely recognized advantages, why has variational Bayes seen so little practical use for BNNs in real applications? We argue that variational inference in neural networks is fragile: successful implementations require careful initialization and tuning of prior variances, as well as controlling the variance of Monte Carlo gradient estimates. We provide two innovations that aim to turn VB into...
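To make concrete where the fragility described above enters, here is a minimal sketch (not the paper's method): a single mean-field Gaussian variational layer trained with reparameterized Monte Carlo estimates of the ELBO gradient. The prior scale `sigma_prior`, the number of samples `n_mc`, and every other name are illustrative assumptions.

```python
# Minimal sketch, assuming a fully factorized Gaussian posterior over one
# linear layer's weights and a zero-mean Gaussian prior. Shows the two
# sensitivities named in the abstract: the Monte Carlo gradient noise and
# the dependence on the chosen prior variance.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

n_in, n_out, n_mc = 5, 1, 4           # layer size, number of MC samples (assumed)
sigma_prior = 1.0                      # prior std -- a sensitive hyperparameter

# Variational parameters of q(W) = N(mu, sigma^2), with sigma = softplus(rho)
mu = torch.zeros(n_in, n_out, requires_grad=True)
rho = torch.full((n_in, n_out), -3.0, requires_grad=True)

x = torch.randn(32, n_in)              # toy batch standing in for the dataset
y = torch.randn(32, n_out)

sigma = F.softplus(rho)

# Reparameterization trick: W = mu + sigma * eps, eps ~ N(0, I).
# The likelihood term is a Monte Carlo estimate, hence a noisy gradient.
nll = 0.0
for _ in range(n_mc):
    eps = torch.randn_like(mu)
    w = mu + sigma * eps
    nll = nll + F.mse_loss(x @ w, y, reduction="mean")
nll = nll / n_mc

# Closed-form KL between N(mu, sigma^2) and the N(0, sigma_prior^2) prior.
kl = (torch.log(sigma_prior / sigma)
      + (sigma**2 + mu**2) / (2 * sigma_prior**2) - 0.5).sum()

neg_elbo = nll + kl / 32               # KL scaled by the (toy) dataset size
neg_elbo.backward()
print(mu.grad.norm(), rho.grad.norm())  # gradients: MC-noisy likelihood part + exact KL part
```

The gradient printed at the end mixes a Monte Carlo term whose variance shrinks only with more samples and an exact KL term that depends directly on `sigma_prior`, which is one way to see why initialization, prior variances, and gradient-variance control all matter at once.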
We propose a new variational family for Bayesian neural networks. We decompose the variational poste...
The last decade witnessed a growing interest in Bayesian learning. Yet, the technicality of the topi...
We reinterpret multiplicative noise in neural networks as auxiliary random variables that augment th...
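A minimal sketch of the reinterpretation being described, under generic assumptions (Gaussian multiplicative noise of variance `alpha` on the inputs of one linear layer; none of the names come from the paper): injecting multiplicative noise on the activations is the same stochastic forward pass as absorbing that noise into the weights as auxiliary random variables.

```python
# Illustrative only: multiplicative input noise viewed as auxiliary variables
# z that scale the weights, so the effective weight sample is diag(z) @ W.
import torch

torch.manual_seed(0)
n_in, n_out, batch = 5, 3, 8
alpha = 0.5                               # variance of the multiplicative noise (assumed)

W = torch.randn(n_in, n_out)              # deterministic weight matrix
x = torch.randn(batch, n_in)

# Auxiliary variables: one multiplicative factor per input unit, z ~ N(1, alpha).
z = 1.0 + alpha**0.5 * torch.randn(n_in)

# Two equivalent views of the same stochastic forward pass:
out_noise_on_inputs = (x * z) @ W               # noise injected on the activations
out_noise_on_weights = x @ (torch.diag(z) @ W)  # the same noise absorbed into W

print(torch.allclose(out_noise_on_inputs, out_noise_on_weights, atol=1e-6))
```

Once the noise is written as explicit auxiliary variables z, those variables can be given their own distribution and treated as part of the approximate posterior, which is the direction the abstract points to; that richer construction is not sketched here.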
Bayesian statistics is a powerful framework for modeling the world and reasoning over uncertainty. I...
Stochastic variational inference for Bayesian deep neural networks (DNNs) requires specifying priors a...
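As a generic illustration of what specifying priors and an approximate posterior typically amounts to, a standard fully factorized Gaussian choice is written below; this is a common default, not necessarily the specification this work uses.

```latex
p(\theta) = \prod_i \mathcal{N}\!\left(\theta_i \mid 0,\, \sigma_p^2\right),
\qquad
q_\phi(\theta) = \prod_i \mathcal{N}\!\left(\theta_i \mid \mu_i,\, \sigma_i^2\right),
\qquad \phi = \{\mu_i, \sigma_i\}.
```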
Variational inference (VI) or Variational Bayes (VB) is a popular alternative to MCMC, which doesn't...
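For reference, the objective that VI/VB maximizes in place of exact posterior computation is the evidence lower bound (ELBO), stated here in standard notation (q_phi is the approximate posterior, as in the factorized specification above, and D is the data).

```latex
\mathcal{L}(\phi) \;=\; \mathbb{E}_{q_\phi(\theta)}\!\left[\log p(\mathcal{D} \mid \theta)\right]
\;-\; \mathrm{KL}\!\left(q_\phi(\theta)\,\|\,p(\theta)\right)
\;\le\; \log p(\mathcal{D}).
```

Maximizing this bound over phi trades data fit against closeness of q to the prior; in stochastic VI the expectation term is the part that is usually estimated with Monte Carlo samples.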
Deep Learning-based models are becoming more and more relevant for an increasing number of applicati...
In this work, I will focus on ways in which we can build machine learning models that appropriately ...
Deep neural networks have bested notable benchmarks across computer vision, reinforcement learning, ...