Gaussian processes (GPs) are a good choice for function approximation as they are flexible, robust to overfitting, and provide well-calibrated predictive uncertainty. Deep Gaussian processes (DGPs) are multi-layer generalizations of GPs, but inference in these models has proved challenging. Existing approaches to inference in DGP models assume approximate posteriors that force independence between the layers, and do not work well in practice. We present a doubly stochastic variational inference algorithm that does not force independence between layers. With our method of inference we demonstrate that a DGP model can be used effectively on data ranging in size from hundreds to a billion points. We provide strong empirical evidence that our i...
Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have b...
Inter-domain Gaussian processes (GPs) allow for high flexibility and low computational cost when per...
Implicit processes (IPs) are a generalization of Gaussian processes (GPs). IPs may lack a closed-for...
Deep Gaussian processes (DGPs) are multi-layer generalizations of GPs, but inference in these models...
Deep Gaussian Processes (DGPs) are multi-layer, flexible extensions of Gaussian Processes but their ...
We introduce stochastic variational inference for Gaussian process models. This enables the applicat...
Transformed Gaussian Processes (TGPs) are stochastic processes specified by transforming samples fro...
In this paper we introduce deep Gaussian process (GP) models. Deep GPs are a deep belief network ba...
Deep Gaussian processes (DGPs) can model complex marginal densities as well as complex mappings. Non...
Many modern machine learning methods, including deep neural networks, utilize a discrete sequence of...
A natural extension to standard Gaussian process (GP) regression is the use of non-stationary Gaussi...
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
Variational inference techniques based on inducing variables provide an elegant framework for scalab...
Deep Gaussian processes provide a flexible approach to probabilistic modelling of data using either ...
Deep Gaussian Process (DGP) models offer a powerful nonparametric approach for Bayesian inference, b...