The Laplace approximation yields a tractable marginal likelihood for Bayesian neural networks. This enables empirical Bayes methods that optimize BNN hyperparameters directly on the training data, even when no validation data is available. Sequential decision-making problems are a typical setting in which only training data is available. Hence, this thesis asks whether hyperparameter optimization using the Laplace marginal likelihood benefits Bayesian neural networks in sequential decision-making problems. The answer is mixed: while online model selection improves decision-making performance for large sample sizes, maximum-marginal-likelihood models in the small-data regime behave like constant predictors. Therefore, this thesis further inves...
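As a minimal sketch of the evidence computation the abstract refers to: the Laplace approximation estimates the log marginal likelihood as log p(D|θ*) + log p(θ*) + (d/2) log 2π − (1/2) log det H, where θ* is the MAP estimate and H is the Hessian of the negative log joint at θ*. The toy below uses Bayesian linear regression, where this approximation happens to be exact, as a stand-in for a BNN; all names (`laplace_log_evidence`, `alpha`, `beta`) are illustrative assumptions, not from the thesis.

```python
import numpy as np

def laplace_log_evidence(X, y, alpha, beta):
    """Laplace log evidence for linear regression with Gaussian prior.
    alpha: prior precision on the weights, beta: observation noise precision."""
    n, d = X.shape
    H = beta * X.T @ X + alpha * np.eye(d)      # Hessian of the negative log joint
    w_map = beta * np.linalg.solve(H, X.T @ y)  # MAP weights (posterior mean here)
    resid = y - X @ w_map
    log_lik = 0.5 * n * (np.log(beta) - np.log(2 * np.pi)) - 0.5 * beta * resid @ resid
    log_prior = 0.5 * d * (np.log(alpha) - np.log(2 * np.pi)) - 0.5 * alpha * w_map @ w_map
    _, logdet_H = np.linalg.slogdet(H)
    # Laplace: log Z ~= log lik + log prior + (d/2) log 2*pi - (1/2) log det H
    return log_lik + log_prior + 0.5 * d * np.log(2 * np.pi) - 0.5 * logdet_H

# Empirical Bayes: choose the prior precision that maximizes the evidence
# on the training data alone -- no validation split required.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)
alphas = [0.01, 0.1, 1.0, 10.0]
best = max(alphas, key=lambda a: laplace_log_evidence(X, y, a, beta=100.0))
```

For a neural network the same quantity is approximated at the trained weights with a (generalized) Gauss-Newton or diagonal Hessian surrogate; the selection principle, maximizing the evidence over hyperparameters, is unchanged.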
Advances in machine learning have had, and continue to have, a profound effect on scientific researc...
Training probability-density estimating neural networks with the expectation-maximization (EM) algor...
Deep neural networks have recently become astonishingly successful at many machine learning problems...
How do we compare between hypotheses that are entirely consistent with observations? The marginal li...
Approximate marginal Bayesian computation and inference are developed for neural network models. The...
We describe two specific examples of neural-Bayesian approaches for complex modeling tasks: ...
Since Bayesian learning for neural networks was introduced by MacKay it was applied to real world pr...
The goal of this thesis was to implement a practical tool for optimizing hyperparameters of neural...
We give a short review on Bayesian techniques for neural networks and demonstrate the advantages of ...
Summary: The application of the Bayesian learning paradigm to neural networks results in a flexible ...
We propose the approximate Laplace approxim...
One common problem in building deep learning architectures is the choice of the ...
We discuss Bayesian methods for model averaging and model selection among Bayesian-network...