Blind separation of signals through the info-max algorithm may be viewed as maximum likelihood learning in a latent variable model. In this paper we present an alternative approach to maximum likelihood learning in these models, namely Bayesian inference. It has already been shown how Bayesian inference can be applied to determine latent dimensionality in principal component analysis models (Bishop, 1999a). Here we derive a similar approach for removing unnecessary source dimensions in an independent component analysis model. We present results on a toy data set and on some artificially mixed images.
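The info-max view of blind separation described above can be illustrated with a minimal sketch. The following is a hypothetical NumPy implementation of the natural-gradient info-max (Bell–Sejnowski/Amari) update applied to artificially mixed super-Gaussian signals; the mixing matrix, learning rate, and iteration count are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two super-Gaussian (Laplacian) sources, artificially mixed
n = 5000
S = rng.laplace(size=(2, n))             # true sources
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])               # mixing matrix (illustrative choice)
X = A @ S                                # observed mixtures

# Natural-gradient info-max update: W <- W + lr * (I - E[tanh(u) u^T]) W,
# with tanh as the score nonlinearity for super-Gaussian sources
W = np.eye(2)
lr = 0.02
for _ in range(1000):
    U = W @ X
    W += lr * (np.eye(2) - np.tanh(U) @ U.T / n) @ W

# Up to permutation and scaling, W @ A should be close to diagonal,
# i.e. the unmixed signals U = W @ X recover the sources.
P = np.abs(W @ A)
```

After convergence, each row of `P` should be dominated by a single entry, indicating that each recovered component aligns with one source up to scale and sign.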
A central issue in dimension reduction is choosing a sensible number of dimens...
Bayesian learning of latent variable models. 2.1 Bayesian modeling and variational learning. Unsupe...
Solving a source separation problem using a maximum likelihood approach offers...
We consider the problem of reducing the dimensions of parameters and da...
We present a Bayesian model selection approach to estimate the intrinsic dimen...
Supervised dimensionality reduction has shown great advantages in finding predictive subspaces. Prev...
We show that different theories recently proposed for independent component analysis (ICA) l...
Dimensionality reduction can be efficiently achieved by generative latent vari...
The problem of learning a latent model for sparse or low-dimensional representation of high-dimen...
In an exploratory approach to data analysis, it is often useful to consider the observations as gene...
This thesis proposes Bayesian parametric and nonparametric models for signal representation. The fir...
With modern high-dimensional data, complex statistical models are necessary, requiring computational...
We introduce a variational inference framework for training the Gaussian process latent variable mod...