This paper proposes minimizing α-divergences for approximate inference in deep Gaussian processes (DGPs). The proposed method generalizes variational inference (VI) and expectation propagation (EP), two methods previously used for approximate inference in DGPs, both of which are based on minimizing a Kullback-Leibler divergence. The method builds on a scalable version of power expectation propagation, which introduces an extra parameter α specifying the targeted α-divergence to be optimized. In particular, it recovers the VI solution when α → 0 and the EP solution when α → 1. An exhaustive experimental evaluation shows that the mini...
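For context, the α-divergence family that power EP targets (in the standard Amari parameterization, as used in Minka's power-EP work; not taken from the truncated abstract itself) can be written as follows, and its limits recover the two KL divergences that VI and EP respectively minimize:

```latex
% Amari \alpha-divergence between the true posterior p and approximation q
D_\alpha\!\left[p \,\|\, q\right]
  = \frac{1}{\alpha(1-\alpha)}
    \left( 1 - \int p(\theta)^{\alpha}\, q(\theta)^{1-\alpha} \, d\theta \right)

% Limiting cases:
%   \alpha \to 0:  D_\alpha[p\|q] \to \mathrm{KL}\!\left[q \,\|\, p\right]
%                  (the divergence minimized by variational inference)
%   \alpha \to 1:  D_\alpha[p\|q] \to \mathrm{KL}\!\left[p \,\|\, q\right]
%                  (the divergence minimized by expectation propagation)
```

Intermediate values of α interpolate between the mode-seeking behavior of VI (α → 0) and the mass-covering behavior of EP (α → 1), which is why a single α-parameterized objective can subsume both.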
Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (G...
Deep Gaussian processes provide a flexible approach to probabilistic modelling of data using either ...
Multi-fidelity approaches improve the inference of a high-fidelity model which...
Deep Gaussian Process (DGP) models offer a powerful nonparametric approach for Bayesian inference, b...
Deep Gaussian processes (DGPs) are multi-layer generalizations of GPs, but inference in these models...
In this paper we introduce deep Gaussian process (GP) models. Deep GPs are a deep belief network bas...
Gaussian processes (GPs) are a good choice for function approximation as they are flexible, robust t...
Transformed Gaussian Processes (TGPs) are stochastic processes specified by transforming samples fro...
We provide approximation guarantees for a linear-time inferential framework for Gaussian processes, ...
Deep Gaussian processes (DGPs) can model complex marginal densities as well as complex mappings. Non...
Deep Gaussian Processes (DGPs) are multi-layer, flexible extensions of Gaussian Processes but their ...
This paper introduces the $\textit{variational Rényi bound}$ (VR) that extends traditional variation...