Gaussian processes (GPs) are ubiquitously used in science and engineering as metamodels. Standard GPs, however, can only handle numerical or quantitative variables. In this paper, we introduce latent map Gaussian processes (LMGPs) that inherit the attractive properties of GPs but are also applicable to mixed data that have both quantitative and qualitative inputs. The core idea behind LMGPs is to learn a low-dimensional manifold where all qualitative inputs are represented by some quantitative features. To learn this manifold, we first assign a unique prior vector representation to each combination of qualitative inputs. We then use a linear map to project these priors onto a manifold that characterizes the posterior representations. As the p...
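To make the mapping described in that abstract concrete, here is a minimal sketch, not the paper's implementation: each combination of qualitative levels is assumed to get a one-hot prior vector, a linear map `A` projects those priors onto a 2-D latent manifold, and the latent coordinates enter a standard RBF correlation function alongside the quantitative inputs. In practice `A` and the kernel hyperparameters would be estimated jointly, e.g. by maximizing the GP likelihood; all names and shapes here are illustrative assumptions.

```python
# Minimal LMGP-style kernel sketch: qualitative inputs -> latent coordinates
# via a linear map, then an ordinary RBF correlation over the combined inputs.
import numpy as np

def latent_coordinates(levels, A):
    """Map integer-coded combinations of qualitative inputs to latent points.

    levels : (n,) integer code of the qualitative-level combination per sample
    A      : (n_combinations, d_latent) linear map acting on one-hot priors,
             so row i of A is the latent position of combination i
    """
    onehot = np.eye(A.shape[0])[levels]     # unique prior vector per combination
    return onehot @ A                       # linear projection onto the manifold

def lmgp_kernel(Xq, levels, A, length_scales, latent_scale=1.0):
    """RBF correlation over [scaled quantitative inputs, latent coordinates]."""
    Z = np.hstack([Xq / length_scales,
                   latent_coordinates(levels, A) / latent_scale])
    sq = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * sq)

# toy usage: two quantitative inputs, one qualitative input with three levels
rng = np.random.default_rng(0)
Xq = rng.uniform(size=(8, 2))
levels = rng.integers(0, 3, size=8)
A = rng.normal(size=(3, 2))                 # would be learned from data
K = lmgp_kernel(Xq, levels, A, length_scales=np.ones(2))
print(K.shape)                              # (8, 8)
```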
Purely data-driven approaches for machine learning present difficulties when data are scarce relativ...
We introduce a variational inference framework for training the Gaussian process latent variable mod...
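For orientation, the sketch below shows the plain GP-LVM objective with point estimates of the latent inputs, i.e. the latent matrix X is optimized directly by maximizing the GP marginal likelihood of the observed Y. This is not the variational framework the abstract describes, which instead integrates the latent variables out; the function names and toy data are purely illustrative.

```python
# GP-LVM with point-estimate latents: optimize X to maximize the GP marginal
# likelihood of the observed data Y (all output dimensions share the kernel).
import numpy as np
from scipy.optimize import minimize

def rbf(X, var=1.0, ls=1.0, jitter=1e-6):
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return var * np.exp(-0.5 * sq / ls**2) + jitter * np.eye(len(X))

def neg_log_marginal(x_flat, Y, q):
    """Negative log marginal likelihood of Y under a GP prior on each column."""
    n, d = Y.shape
    K = rbf(x_flat.reshape(n, q))
    _, logdet = np.linalg.slogdet(K)
    return 0.5 * d * logdet + 0.5 * np.trace(np.linalg.solve(K, Y @ Y.T))

# toy data: 20 five-dimensional observations, compressed to a 2-D latent space
rng = np.random.default_rng(0)
Y = rng.normal(size=(20, 5))
q = 2
x0 = rng.normal(scale=0.1, size=20 * q)     # PCA is the usual initialization
res = minimize(neg_log_marginal, x0, args=(Y, q), method="L-BFGS-B")
X_latent = res.x.reshape(20, q)
print(X_latent.shape)                       # (20, 2)
```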
Density modeling is notoriously difficult for high dimensional data. One approach to the problem is ...
Often in machine learning, data are collected as a combination of multiple conditions, e.g., the voi...
Triggered by a market relevant application that involves making joint predictions of pedestrian and ...
The Gaussian process latent variable model (GP-LVM) provides a flexible approach for non-linear dime...
Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kern...
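As a reminder of what that probabilistic treatment buys, the sketch below implements plain GP regression with a fixed RBF kernel: the posterior returns a variance alongside the mean. Hyperparameters (lengthscale, signal variance, noise) are hard-coded for brevity as an assumption; in practice they would be learned by maximizing the marginal likelihood.

```python
# Basic GP regression with an RBF kernel and fixed hyperparameters.
import numpy as np

def rbf(A, B, ls=0.5, var=1.0):
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return var * np.exp(-0.5 * sq / ls**2)

def gp_posterior(Xtr, ytr, Xte, noise=1e-2):
    """Posterior mean and pointwise variance at the test inputs Xte."""
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    Ks = rbf(Xtr, Xte)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    v = np.linalg.solve(L, Ks)
    mean = Ks.T @ alpha
    var = np.diag(rbf(Xte, Xte) - v.T @ v)
    return mean, var

# toy usage: noisy sine observations, predictions on a dense grid
Xtr = np.linspace(0, 1, 10)[:, None]
ytr = np.sin(2 * np.pi * Xtr[:, 0]) + 0.05 * np.random.default_rng(0).normal(size=10)
Xte = np.linspace(0, 1, 50)[:, None]
mean, var = gp_posterior(Xtr, ytr, Xte)
print(mean.shape, var.shape)                # (50,) (50,)
```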
In nonlinear latent variable models or dynamic models, if we consider the latent variables as conf...
Latent variable models represent the probability density of data in a space of several dimensions in...
Off-the-shelf Gaussian Process (GP) covariance functions encode smoothness assumptions on the struct...
Gaussian process (GP) is a stochastic process that has been studied for a long time and gained wide ...