A Bayesian SOM (BSOM) [8] is proposed and applied to the unsupervised learning of Gaussian mixture distributions, and its performance is compared with the expectation-maximisation (EM) algorithm. The BSOM is found to yield results as good as the well-known EM algorithm, but with far fewer iterations and, more importantly, it can be used as an on-line training method. The neighbourhood function and distance measures of the traditional SOM [3] are replaced by the neurons' on-line estimated posterior probabilities, which can be interpreted as a Bayesian inference of each neuron's opportunity to share in the winning response and so to adapt to the input pattern. Such posteriors, starting from uniform priors, are gradually sharpened when more...
In this paper, we address the problem of learning discrete Bayesian networks from noisy data. A grap...
We compare EM, SEM, and MCMC algorithms to estimate the parameters of the Gaussian mixture model. We...
www-public.int-evry.fr/˜pieczyn The idea behind the Pairwise Mixture Model (PMM) we propose in this...
A Bayesian self-organising map (BSOM) is proposed for learning mixtures of Gaussian distributions. I...
We build up the mathematical connection between the "Expectation-Maximization" (EM) algori...
The learning of variational inference can be widely seen as first estimating the class assignment va...
This thesis mainly proposes variational inference for Bayesian mixture models and their applications ...
An extended self-organising learning scheme is proposed, namely the Bayesian self-organising map (BS...
A completely unsupervised mixture distribution network, namely the self-organising mixture network, ...
Abstract—In this paper, we propose a new expectation-maximization (EM) algorithm, named GMM-EM, to b...
We consider the approach to unsupervised learning whereby a normal mixture model is fitted to the ...
This paper uses Gaussian mixture model instead of linear Gaussian model to fit the distribution of e...
We propose a Gaussian Mixture Model (GMM) learning algorithm, based on our previous work of GMM exp...
The self-organizing mixture network (SOMN) is a learning algorithm for mixture densities, derived fr...