This paper deals with probabilistic models that take the form of mixtures of Student distributions. Student distributions are known to be statistically more robust than Gaussian distributions with regard to outliers (i.e. data that cannot be reasonably explained by any component in the mixture and yet do not justify an extra component). Our contribution is as follows: we show how several mixtures of Student distributions may be aggregated into a single mixture without resorting to sampling. The trick is that, as is well known, a Student distribution may be expressed as an infinite mixture of Gaussians whose precisions follow a Gamma distribution (equivalently, whose variances follow an inverse-Gamma distribution).
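As a reminder of that representation (the standard scale-mixture identity, stated here for clarity and not quoted from the paper), a univariate Student distribution with nu degrees of freedom can be written as

\[
\operatorname{St}\!\left(x \mid \mu, \sigma^{2}, \nu\right)
\;=\;
\int_{0}^{\infty} \mathcal{N}\!\left(x \;\middle|\; \mu, \tfrac{\sigma^{2}}{u}\right)\,
\operatorname{Ga}\!\left(u \;\middle|\; \tfrac{\nu}{2}, \tfrac{\nu}{2}\right)\mathrm{d}u,
\]

where Ga denotes the Gamma distribution with shape and rate both equal to nu/2; integrating out the latent precision-scaling variable u recovers the Student density.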
This thesis presents new methods for mixture model learning based on information geometry. We focus ...
Dirichlet process mixture of Gaussians (DPMG) has been used in the literature for clustering and den...
This chapter is dedicated to model-based supervised and unsupervised classification. Probability di...
This paper addresses merging of Gaussian mixture models, which answers growing...
Bayesian approaches to density estimation and clustering using mixture distributions allow...
In this dissertation, we extend several relatively new developments in statistical model selection a...
A new variational Bayesian learning algorithm for Student-t mixture models is introduced. This algor...
Aggregating statistical representations of classes is an important task for cu...
Mixture models form one of the most fundamental classes of generative models for clustered data...
This thesis deals with distributed statistical estimation, with its motivation from, and appli- ...
We discuss recent results giving algorithms for learning mixtures of unstructured distributions
Gaussian mixture models are a widespread tool for modeling various and complex probabilit...
A method for implicit variable selection in mixture-of-experts frameworks is proposed. We introduce...
This paper proposes a solution to the problem of aggregating versatile proba...
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Comp...