Training probability-density estimating neural networks with the expectation-maximization (EM) algorithm aims to maximize the likelihood of the training set and therefore leads to overfitting for sparse data. In this article, a regularization method for mixture models with generalized linear kernel centers is proposed, which adopts the Bayesian evidence approach and optimizes the hyperparameters of the prior by type II maximum likelihood. This includes a marginalization over the parameters, which is done by Laplace approximation and requires the derivation of the Hessian of the log-likelihood function. The incorporation of this approach into the standard training scheme leads to a modified form of the EM algorithm, which includes a regulari...
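The regularized EM idea in this abstract can be illustrated with a minimal sketch. The code below is not the paper's evidence-based scheme with Laplace-approximated marginalization; it is a simpler MAP-EM for a 1-D Gaussian mixture, where a Dirichlet-style pseudo-count prior on the mixture weights and a variance-floor prior replace the plain maximum-likelihood M-step updates, which is the standard way a prior turns into a regularization term inside EM. All names (`map_em_gmm`, `alpha`, `beta`) are illustrative assumptions, not from the paper.

```python
import numpy as np

def map_em_gmm(x, k=2, alpha=2.0, beta=0.1, iters=100, seed=0):
    """MAP-EM for a 1-D Gaussian mixture (illustrative sketch).

    alpha: Dirichlet pseudo-count prior on the mixture weights.
    beta:  prior strength on the variances; acts as a variance floor,
           preventing component collapse on sparse data.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    mu = rng.choice(x, k, replace=False).astype(float)
    var = np.full(k, np.var(x))
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        logp = (-0.5 * (x[:, None] - mu) ** 2 / var
                - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        logp -= logp.max(axis=1, keepdims=True)   # numerical stability
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        nk = r.sum(axis=0)
        # M-step with MAP corrections: the prior terms regularize the
        # plain ML updates that would otherwise overfit sparse data.
        pi = (nk + alpha - 1) / (n + k * (alpha - 1))
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = ((r * (x[:, None] - mu) ** 2).sum(axis=0) + 2 * beta) / (nk + 2)
    return pi, mu, var
```

On a bimodal sample, e.g. `map_em_gmm(np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 200)]))`, the recovered means land near the two cluster centers while the priors keep the weights and variances away from degenerate values.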
The Laplace approximation yields a tractable marginal likelihood for Bayesian neural networks. This ...
This paper reviews the Bayesian approach to learning in neural networks, then introduces a new adapt...
Neural networks can be regarded as statistical models, and can be analysed in a Bayesian framework. ...
Summary The application of the Bayesian learning paradigm to neural networks results in a flexible ...
Mixture Density Networks are a principled method to model conditional probability density functions ...
We compare two regularization methods which can be used to improve the generalization capabilities ...
Ensemble learning by variational free energy minimization is a tool introduced to neural networks by...
This paper addresses the estimation of parameters of a Bayesian network from incomplete data. The ta...
The Bayesian framework offers a flexible tool for regularization in the high dimensional setting. In...
Since Bayesian learning for neural networks was introduced by MacKay it was applied to real world pr...
Bayesian treatments of learning in neural networks are typically based either on a local Gaussian ap...
Deep Learning-based models are becoming more and more relevant for an increasing number of applicati...