Deep Gaussian Processes (DGPs) are multi-layer, flexible extensions of Gaussian Processes, but their training remains challenging. Sparse approximations simplify the training but often require optimization over a large number of inducing inputs and their locations across layers. In this paper, we simplify the training by setting the locations to a fixed subset of the data and sampling the inducing inputs from a variational distribution. This reduces the number of trainable parameters and the computational cost without significant performance degradation, as demonstrated by our empirical results on regression problems. Our modifications simplify and stabilize DGP training while making it amenable to sampling schemes for setting the inducing inputs.
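To make the idea concrete, here is a minimal single-layer sketch of the two ingredients the abstract names: inducing locations fixed to a subset of the data (so they are never optimized) and inducing variables drawn from a variational Gaussian. This is an illustrative NumPy sketch, not the paper's implementation; the kernel, subset size M, and all names are assumptions.

    import numpy as np

    def rbf(A, B, lengthscale=1.0, variance=1.0):
        # Squared-exponential kernel matrix between row sets A and B.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return variance * np.exp(-0.5 * d2 / lengthscale**2)

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))            # training inputs
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

    M = 20                                           # number of inducing points (assumed)
    Z = X[rng.choice(len(X), M, replace=False)]      # fixed subset of data: no location optimization

    # Variational distribution q(u) = N(m, L L^T) over the inducing variables;
    # in practice m and L would be the only per-layer variational parameters trained.
    m = np.zeros(M)
    L = 0.1 * np.eye(M)

    Kzz = rbf(Z, Z) + 1e-6 * np.eye(M)
    Kxz = rbf(X, Z)
    A = Kxz @ np.linalg.inv(Kzz)                     # predictive weights

    # Sample the inducing variables from q(u) and propagate one function sample.
    u = m + L @ rng.standard_normal(M)
    f_sample = A @ u
    print(f_sample[:5])

In a DGP, this sampling step would be repeated layer by layer, with each layer's sample feeding the next.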
In this article, we propose a scalable Gaussian process (GP) regression method that combines the adv...
Deep Gaussian processes provide a flexible approach to probabilistic modelling of data using either ...
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
Gaussian processes (GPs) are a good choice for function approximation as they are flexible, robust t...
Deep Gaussian processes (DGPs) are multi-layer generalizations of GPs, but inference in these models...
We develop a scalable deep non-parametric g...
Variational inference techniques based on inducing variables provide an elegant framework for scalab...
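For background, the collapsed variational bound at the core of this inducing-variable framework (a standard form due to Titsias, 2009; stated here as context since the abstract is truncated, with K_nm the kernel matrix between the n data points and m inducing points) is

\[
\log p(\mathbf{y}) \;\ge\; \log \mathcal{N}\!\left(\mathbf{y} \,\middle|\, \mathbf{0},\; \mathbf{Q}_{nn} + \sigma^2 \mathbf{I}\right) \;-\; \frac{1}{2\sigma^2}\,\operatorname{tr}\!\left(\mathbf{K}_{nn} - \mathbf{Q}_{nn}\right),
\qquad
\mathbf{Q}_{nn} = \mathbf{K}_{nm}\mathbf{K}_{mm}^{-1}\mathbf{K}_{mn}.
\]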
In this paper we introduce deep Gaussian process (GP) models. Deep GPs are a deep belief network ba...
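A deep GP is built by composing GP mappings: a sample from layer 1 becomes the input to layer 2, and so on. The following minimal sketch (NumPy, RBF kernel; all names and sizes are illustrative assumptions) draws one function from a two-layer deep GP prior.

    import numpy as np

    def rbf(A, B, ls=1.0):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ls**2)

    def sample_gp(X, rng, jitter=1e-6):
        # Draw one function sample from a zero-mean GP prior evaluated at X.
        K = rbf(X, X) + jitter * np.eye(len(X))
        return np.linalg.cholesky(K) @ rng.standard_normal(len(X))

    rng = np.random.default_rng(1)
    X = np.linspace(-2, 2, 100)[:, None]

    h = sample_gp(X, rng)             # layer 1: h = f1(X)
    f = sample_gp(h[:, None], rng)    # layer 2: f = f2(h), a draw from a 2-layer deep GP
    print(f[:5])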
Gaussian processes (GP) provide an attractive machine learning model due to their non-parametric fo...
…constitute two of the most important foci of modern machine learning research. In this preliminary work...
Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have b...
Implicit processes (IPs) are a generalization of Gaussian processes (GPs). IPs may lack a closed-for...
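An implicit process defines a prior over functions only through a sampling procedure, for example a neural network with random weights; unlike a GP, the induced distribution over function values generally has no closed form. A minimal sketch of drawing such function samples (all names and sizes are illustrative assumptions):

    import numpy as np

    def sample_ip_function(x, rng, hidden=50):
        # One draw from an implicit process: a random-weight neural network.
        # The distribution over outputs is defined only by this sampling step.
        W1 = rng.standard_normal((1, hidden))
        b1 = rng.standard_normal(hidden)
        W2 = rng.standard_normal((hidden, 1)) / np.sqrt(hidden)
        return np.tanh(x @ W1 + b1) @ W2

    rng = np.random.default_rng(2)
    x = np.linspace(-3, 3, 100)[:, None]
    draws = np.stack([sample_ip_function(x, rng) for _ in range(5)])
    print(draws.shape)  # (5, 100, 1): five function samples at 100 inputs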
We develop a fast deterministic variational approximation scheme for Gaussian process (GP) reg...
We introduce stochastic variational inference for Gaussian process models. This enables the applicat...
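The step that makes stochastic variational inference possible for GPs (Hensman et al., 2013) is an uncollapsed bound that decomposes as a sum over data points, so it can be estimated from minibatches. A standard form, given here as background since the abstract is truncated, with q(u) = N(m, S) over the inducing variables:

\[
\mathcal{L} \;=\; \sum_{i=1}^{n} \mathbb{E}_{q(f_i)}\!\left[\log p(y_i \mid f_i)\right] \;-\; \mathrm{KL}\!\left[\,q(\mathbf{u}) \,\Vert\, p(\mathbf{u})\,\right],
\qquad
q(f_i) = \int p(f_i \mid \mathbf{u})\, q(\mathbf{u})\, d\mathbf{u}.
\]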