Stochastic variational inference for Bayesian deep neural networks (DNNs) requires specifying priors and approximate posterior distributions over neural network weights. Specifying meaningful weight priors is a challenging problem, particularly when scaling variational inference to deeper architectures with high-dimensional weight spaces. We propose the MOdel Priors with Empirical Bayes using DNN (MOPED) method to choose informed weight priors in Bayesian neural networks. We formulate a two-stage hierarchical model: first, find the maximum likelihood estimates of the weights with a deterministic DNN; then, set the weight priors using an empirical Bayes approach and infer the posterior with variational inference. We empirically evaluate the proposed approach on r...
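The two-stage procedure described above lends itself to a compact illustration. The following is a minimal sketch, not the authors' reference implementation: it assumes a mean-field Gaussian variational layer in PyTorch, and the `BayesLinear` class, the `set_moped_prior` helper, and the `delta` scale are illustrative assumptions. Stage one supplies maximum likelihood weights; stage two centres the prior and the initial variational posterior on them before variational inference proceeds.

```python
# Minimal sketch (assumed names, not the authors' code): a mean-field Gaussian
# Bayesian linear layer whose prior and initial variational posterior are
# centred on maximum-likelihood weights, in the spirit of the MOPED procedure.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BayesLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        # Variational posterior parameters: q(w) = N(mu, softplus(rho)^2)
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_rho = nn.Parameter(torch.full((out_features, in_features), -5.0))
        # Prior parameters: p(w) = N(prior_mu, prior_sigma^2), held fixed
        self.register_buffer("prior_mu", torch.zeros(out_features, in_features))
        self.register_buffer("prior_sigma", torch.ones(out_features, in_features))

    def set_moped_prior(self, w_mle, delta=0.1):
        """Empirical Bayes step: centre the prior and the initial variational
        posterior on the MLE weights. The `delta` scale for the initial
        posterior std is an assumed heuristic, not a prescribed value."""
        with torch.no_grad():
            self.prior_mu.copy_(w_mle)            # prior mean at MLE weights
            self.w_mu.copy_(w_mle)                # posterior mean initialised at MLE
            init_sigma = delta * w_mle.abs() + 1e-6
            # invert softplus so that softplus(w_rho) == init_sigma
            self.w_rho.copy_(torch.log(torch.expm1(init_sigma)))

    def forward(self, x):
        # Reparameterised sample: w = mu + sigma * eps
        sigma = F.softplus(self.w_rho)
        w = self.w_mu + sigma * torch.randn_like(sigma)
        return F.linear(x, w)

    def kl(self):
        # KL(q(w) || p(w)) between diagonal Gaussians, summed over all weights
        sigma = F.softplus(self.w_rho)
        return (torch.log(self.prior_sigma / sigma)
                + (sigma ** 2 + (self.w_mu - self.prior_mu) ** 2)
                  / (2 * self.prior_sigma ** 2) - 0.5).sum()


# Stage 1 (assumed): w_mle would come from ordinary deterministic training.
layer = BayesLinear(16, 4)
w_mle = torch.randn(4, 16)          # stand-in for learned MLE weights
layer.set_moped_prior(w_mle, delta=0.1)
out = layer(torch.randn(8, 16))     # stochastic forward pass
kl_penalty = layer.kl()             # added to the negative log-likelihood in the ELBO
```

In this sketch the KL term is what ties the variational posterior back to the empirically chosen prior during training; under a vague zero-centred prior the same layer would instead be pulled away from the MLE solution.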
Isotropic Gaussian priors are the de facto standard for modern Bayesian neural network inference. Ho...
Recent advances in statistical inference have significantly expanded the toolbox of probabilistic mo...
Bayesian Neural Networks (BNNs) have long been considered an ideal, yet unscal...
Bayesian...
Deep neural networks have bested notable benchmarks across computer vision, reinforcement learning, ...
Bayesian inference is known to provide a general framework for incorporating prior knowledge or spec...
Deep Learning-based models are becoming more and more relevant for an increasing number of applicati...
We consider the problem of Bayesian parameter estimation for deep neural networks, which is importan...
In this work, I will focus on ways in which we can build machine learning models that appropriately ...
Uncertainty estimates are crucial in many deep learning problems, e.g. for active learning or safety...
Modern deep learning methods constitute incredibly powerful tools to tackle a myriad of challenging ...
The Bayesian treatment of neural networks dictates that a prior distribution is specified over their...
Bayesian statistics is a powerful framework for modeling the world and reasoning over uncertainty. I...