We introduce a novel variational method that allows one to approximately integrate out kernel hyperparameters, such as length-scales, in Gaussian process regression. The approach is a novel variant of the variational framework recently developed for the Gaussian process latent variable model, which additionally makes use of a standardised representation of the Gaussian process. We consider this technique for learning Mahalanobis distance metrics in a Gaussian process regression setting and provide experimental evaluations and comparisons with existing methods on datasets with high-dimensional inputs.
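To make the setting concrete, below is a minimal Python/NumPy sketch (not code from the paper) of GP regression with a Mahalanobis-distance squared-exponential kernel, where the metric is parameterised by a linear map L with M = L L^T. The names L, sigma_f and sigma_n are illustrative assumptions rather than identifiers from the method, and the variational treatment of the hyperparameters described in the abstract is not shown.

import numpy as np

def mahalanobis_se_kernel(X1, X2, L, sigma_f):
    # Project inputs through L so that squared Euclidean distance in the
    # projected space equals the Mahalanobis distance (x - x')^T L L^T (x - x').
    Z1, Z2 = X1 @ L, X2 @ L
    sq = np.sum(Z1**2, 1)[:, None] + np.sum(Z2**2, 1)[None, :] - 2.0 * Z1 @ Z2.T
    return sigma_f**2 * np.exp(-0.5 * np.maximum(sq, 0.0))

def gp_posterior_mean(X, y, X_star, L, sigma_f=1.0, sigma_n=0.1):
    # Standard GP regression predictive mean with the metric L held fixed.
    K = mahalanobis_se_kernel(X, X, L, sigma_f) + sigma_n**2 * np.eye(len(X))
    K_star = mahalanobis_se_kernel(X_star, X, L, sigma_f)
    return K_star @ np.linalg.solve(K, y)

# Toy usage: high-dimensional inputs with a low-rank metric (10 -> 2 dimensions).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
L = 0.5 * rng.normal(size=(10, 2))
print(gp_posterior_mean(X, y, X[:5], L))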
In multi-output regression applications the correlations between the response variables may vary wit...
Excellent variational approximations to Gaussian process posteriors have been developed which avoid ...
Gaussian processes (GP) provide an attractive machine learning model due to their non-parametric fo...
We introduce a novel variational method that allows one to approximately integrate out kernel hyperparam...
The Gaussian process latent variable model (GP-LVM) provides a flexible approach for non-linear dime...
We introduce stochastic variational inference for Gaussian process models. This enables the applicat...
A natural extension to standard Gaussian process (GP) regression is the use of non-stationary Gaussi...
We introduce a variational inference framework for training the Gaussian process latent variable mod...
We develop a fast deterministic variational approximation scheme for Gaussian process (GP) reg...
Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have b...
High dimensional time series are endemic in applications of machine learning such as robotics (senso...
Abstract. The contributions of this work are threefold. First, various metric learning techniques ar...
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
Training the Gaussian Process regression model on training centers only, which makes it applicable o...
Monge-Kantorovich distances, otherwise known as Wasserstein distances, have re...