A method for implicit variable selection in mixture-of-experts frameworks is proposed. We introduce a prior structure in which information is drawn from a set of independent covariates, and robust predictors of class membership are identified through a normal-gamma prior. The resulting model is applied within a finite mixture of Bernoulli distributions to find homogeneous clusters of women in Mozambique based on their information sources on HIV. Fully Bayesian inference is carried out via a Gibbs sampler.
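For concreteness, a minimal sketch of the model structure implied by the abstract is given below; the notation is chosen here for illustration only (the symbols $y_i$, $x_i$, $\theta_{kd}$, $\beta_k$, $\psi_{kj}$ and the particular Gamma and Beta parameterizations are assumptions, not taken from the paper). Binary responses follow a finite mixture of Bernoulli distributions whose weights depend on covariates through a multinomial-logit gating function, and a normal-gamma prior on the gating coefficients induces the implicit variable selection.

\[
p(y_i \mid x_i) \;=\; \sum_{k=1}^{K} \eta_k(x_i) \prod_{d=1}^{D} \theta_{kd}^{\,y_{id}} \,(1-\theta_{kd})^{1-y_{id}},
\qquad
\eta_k(x_i) \;=\; \frac{\exp(x_i^{\top}\beta_k)}{\sum_{l=1}^{K} \exp(x_i^{\top}\beta_l)},
\]
\[
\beta_{kj} \mid \psi_{kj} \;\sim\; \mathcal{N}(0,\psi_{kj}),
\qquad
\psi_{kj} \;\sim\; \mathrm{Gamma}(a,b),
\qquad
\theta_{kd} \;\sim\; \mathrm{Beta}(c_0,d_0),
\]

with $\beta_1 = 0$ fixed for identification. Small values of the shape parameter $a$ concentrate prior mass near zero and shrink the gating coefficients of covariates that carry little information about class membership, which is what yields the implicit variable selection, while strong predictors retain large coefficients. Conditional on the latent class indicators, the Bernoulli parameters $\theta_{kd}$ have conjugate Beta updates, and the normal-gamma hierarchy gives tractable conditionals for $\beta_k$ and $\psi_{kj}$ (for instance via a data-augmentation step for the multinomial-logit gating), which is what makes a Gibbs sampler a natural choice for fully Bayesian inference.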