A Bayesian SOM (BSOM) [8] is proposed and applied to the unsupervised learning of Gaussian mixture distributions, and its performance is compared with the expectation-maximisation (EM) algorithm. The BSOM is found to yield results as good as those of the well-known EM algorithm but with far fewer iterations and, more importantly, it can be used as an on-line training method. The neighbourhood function and distance measures of the traditional SOM [3] are replaced by the neuron's on-line estimated posterior probabilities, which can be interpreted as a Bayesian inference of the neuron's opportunity to share in the winning response and so to adapt to the input pattern. Such posteriors, starting from uniform priors, are gradually sharpened when m...
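A minimal sketch of the kind of posterior-weighted on-line update described above, assuming spherical Gaussian components and a fixed learning rate; the function and parameter names (bsom_online_step, eta) are illustrative and not taken from [8].

```python
import numpy as np

def bsom_online_step(x, means, variances, priors, eta=0.05):
    """One on-line update of a Gaussian mixture, sketched in the BSOM spirit:
    every neuron adapts in proportion to its posterior probability of having
    generated the input x (spherical Gaussian components assumed)."""
    d = x.shape[0]
    diff = x - means                                    # (K, d) deviations from each mean
    sq_dist = np.sum(diff ** 2, axis=1)                 # (K,) squared distances
    # Likelihood of x under each spherical Gaussian component.
    lik = np.exp(-0.5 * sq_dist / variances) / (2 * np.pi * variances) ** (d / 2)
    # Posterior (responsibility): the neuron's share of the winning response.
    post = priors * lik
    post = post / max(post.sum(), 1e-300)
    # Every neuron adapts towards x, weighted by its posterior.
    means += eta * post[:, None] * diff
    variances += eta * post * (sq_dist / d - variances)
    priors += eta * (post - priors)                     # priors keep summing to 1
    return means, variances, priors, post
```

Starting from uniform priors and broad variances, repeated calls over a data stream gradually sharpen the posteriors, which is the on-line behaviour contrasted above with batch EM.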
A new variational Bayesian learning algorithm for Student-t mixture models is introduced. This algor...
We compare EM, SEM, and MCMC algorithms to estimate the parameters of the Gaussian mixture model. We...
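For reference, the batch EM baseline named in these abstracts alternates a responsibility (E) step with closed-form parameter (M) updates; the sketch below is a standard textbook version, not code from any of the cited papers.

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=100, seed=0):
    """Standard batch EM for a Gaussian mixture (textbook sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, K, replace=False)]
    covs = np.array([np.cov(X.T) + 1e-6 * np.eye(d)] * K)
    weights = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        resp = np.column_stack([
            w * multivariate_normal.pdf(X, mean=m, cov=c)
            for w, m, c in zip(weights, means, covs)
        ])
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: closed-form re-estimates weighted by the responsibilities.
        Nk = resp.sum(axis=0)
        weights = Nk / n
        means = (resp.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - means[k]
            covs[k] = (resp[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(d)
    return weights, means, covs
```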
The idea behind the Pairwise Mixture Model (PMM) we propose in this...
A Bayesian self-organising map (BSOM) is proposed for learning mixtures of Gaussian distributions. I...
We establish the mathematical connection between the "Expectation-Maximization" (EM) algori...
Learning by variational inference can broadly be seen as first estimating the class assignment va...
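As a concrete instance of that two-step view, the following coordinate-ascent variational inference (CAVI) sketch handles a deliberately simplified Bayesian mixture (1-D data, unit component variances, uniform weights, Gaussian priors on the means); these simplifications and all names are assumptions for illustration rather than the formulation used in the cited work.

```python
import numpy as np

def cavi_gmm_means(x, K, prior_var=10.0, n_iter=50, seed=0):
    """CAVI for a toy Bayesian mixture: 1-D data, unit-variance components,
    uniform weights, and a N(0, prior_var) prior on each component mean."""
    rng = np.random.default_rng(seed)
    m = rng.normal(size=K)          # variational means of q(mu_k)
    s2 = np.ones(K)                 # variational variances of q(mu_k)
    for _ in range(n_iter):
        # Step 1: q(z_i) -- class assignment posteriors given current q(mu).
        logits = np.outer(x, m) - 0.5 * (s2 + m ** 2)   # (n, K)
        logits -= logits.max(axis=1, keepdims=True)
        r = np.exp(logits)
        r /= r.sum(axis=1, keepdims=True)
        # Step 2: q(mu_k) -- closed-form Gaussian parameter posteriors.
        Nk = r.sum(axis=0)
        s2 = 1.0 / (1.0 / prior_var + Nk)
        m = s2 * (r.T @ x)
    return m, s2, r
```

Each iteration first updates the class assignment posteriors q(z) and then the parameter posteriors q(mu), mirroring the two stages described above.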
An extended self-organising learning scheme is proposed, namely the Bayesian self-organising map (BS...
This thesis mainly proposes variational inference for Bayesian mixture models and their applications ...
This paper presents a scheme for unsupervised classification with Gaussian mixture models by means o...
In this paper, we address the problem of learning discrete Bayesian networks from noisy data. A grap...
This paper uses a Gaussian mixture model instead of a linear Gaussian model to fit the distribution of e...
A completely unsupervised mixture distribution network, namely the self-organising mixture network, ...
In this paper, we propose a new expectation-maximization (EM) algorithm, named GMM-EM, to b...
This paper proposes a technique for simplifying a given Gaussian mixture model, i.e., reformulating...