The Gaussian process latent variable model (GP-LVM) provides a flexible approach to non-linear dimensionality reduction that has been widely applied. However, the current approach to training GP-LVMs is based on maximum likelihood, where the latent projection variables are maximised over rather than integrated out. In this paper we present a Bayesian method for training GP-LVMs by introducing a non-standard variational inference framework that allows us to approximately integrate out the latent variables and subsequently train a GP-LVM by maximising an analytic lower bound on the exact marginal likelihood. We apply this method to learning a GP-LVM from i.i.d. observations and to learning non-linear dynamical systems where the observations ...
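The key property claimed in the abstract is that the variational objective is a lower bound on the exact marginal likelihood, tight when the variational distribution matches the true posterior over the latent variables. As a minimal sketch of that property (on a toy linear-Gaussian model where everything is analytic, not on the GP-LVM itself; the model, parameter values, and function names below are illustrative assumptions):

```python
import math

# Toy conjugate model, used purely to illustrate the variational bound:
#   x ~ N(0, 1),   y | x ~ N(w*x, s2)
# Here the exact marginal likelihood p(y) is available in closed form,
# so we can verify that  E_q[log p(y|x)] - KL(q(x) || p(x))  never
# exceeds log p(y) and is tight at the true posterior q(x) = p(x | y).
w, s2, y = 2.0, 0.5, 1.5

def elbo(mu, v):
    """Lower bound on log p(y) for a Gaussian q(x) = N(mu, v)."""
    # E_q[log N(y; w*x, s2)], using E_q[(y - w*x)^2] = (y - w*mu)^2 + w^2*v
    exp_loglik = -0.5 * math.log(2 * math.pi * s2) \
                 - ((y - w * mu) ** 2 + w ** 2 * v) / (2 * s2)
    # KL(N(mu, v) || N(0, 1)) in closed form
    kl = 0.5 * (v + mu ** 2 - 1.0 - math.log(v))
    return exp_loglik - kl

# Exact log marginal likelihood: integrating x out gives y ~ N(0, w^2 + s2)
log_z = -0.5 * math.log(2 * math.pi * (w ** 2 + s2)) \
        - y ** 2 / (2 * (w ** 2 + s2))

# Any choice of q(x) gives a valid lower bound ...
assert elbo(0.0, 1.0) <= log_z
# ... and the bound is tight at the exact posterior mean and variance
mu_star = w * y / (w ** 2 + s2)
v_star = s2 / (w ** 2 + s2)
assert abs(elbo(mu_star, v_star) - log_z) < 1e-9
```

In the Bayesian GP-LVM the same structure applies, but the expectation of the log-likelihood under q(X) is intractable for a non-linear GP mapping; the paper's contribution is a variational framework that keeps this bound analytic in that setting.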
The analysis of time series data is important in fields as disparate as the social sciences, biology...
We introduce stochastic variational inference for Gaussian process models. This enables the applicat...
The results in this thesis are based on applications of the expectation propagation algorithm to app...
The Gaussian process latent variable model (GP-LVM) provides a flexible approach for non-linear dime...
We introduce a variational inference framework for training the Gaussian process latent variable mod...
Gaussian process latent variable models (GPLVM) are a flexible and non-linear approach to dimensiona...
Often in machine learning, data are collected as a combination of multiple conditions, e.g., the voi...
High dimensional time series are endemic in applications of machine learning such as robotics (senso...
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
Real engineering and scientific applications often involve one or more qualitative inputs. Standard ...
A Bayesian inference framework for supervised Gaussian process latent variable models is introduced....
Uncertainty propagation across components of complex probabilistic models is vital for improving reg...
Latent Gaussian models (LGMs) are perhaps the most commonly used class of models in statistical appl...
Variational inference techniques based on inducing variables provide an elegant framework for scalab...
Gaussian Process Latent Variable Model (GPLVM) is a flexible framework to handle uncertain inputs in...