Neural processes (NPs) constitute a family of variational approximate models for stochastic processes with promising properties in computational efficiency and uncertainty quantification. These processes use neural networks with latent variable inputs to induce a predictive distribution. However, the expressiveness of vanilla NPs is limited because they use only a global latent variable, while target-specific local variation can be crucial. To address this challenge, we investigate NPs systematically and present a new NP variant that we call Doubly Stochastic Variational Neural Process (DSVNP). This model combines the global latent variable and local latent variables for prediction. We evaluate this model in several experiment...
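The global-plus-local latent structure described in this abstract can be illustrated with a toy sketch: a global latent z is sampled from an aggregated context summary, a per-target local latent u is sampled conditioned on each target input and the global draw, and a decoder combines both. All encoders, the decoder, and every function name below are illustrative assumptions for intuition only, not the DSVNP architecture from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_global(context_x, context_y):
    # Toy encoder: aggregate the context set into mean/std of a global latent z.
    r = np.tanh(np.concatenate([context_x, context_y], axis=-1).mean(axis=0))
    return r, np.full_like(r, 0.1)

def encode_local(target_x, z):
    # Toy encoder: a per-target local latent u_i conditioned on the input
    # and on the sampled global latent (hence "doubly stochastic").
    mu = np.tanh(target_x + z.mean())
    return mu, np.full_like(mu, 0.1)

def predict(context_x, context_y, target_x, n_samples=8):
    # Monte Carlo predictive: sample global z, then local u per target,
    # and decode both into a prediction.
    mu_z, sd_z = encode_global(context_x, context_y)
    preds = []
    for _ in range(n_samples):
        z = mu_z + sd_z * rng.standard_normal(mu_z.shape)   # sample global latent
        mu_u, sd_u = encode_local(target_x, z)
        u = mu_u + sd_u * rng.standard_normal(mu_u.shape)   # sample local latent
        preds.append(np.tanh(u + z.mean()))                 # toy decoder
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.std(axis=0)
```

The two nested sampling steps are the point of the sketch: a single global draw ties all targets together, while the per-target draws let predictive uncertainty vary locally.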
We present a novel extension of multi-output Gaussian processes for handling heterogeneous outputs. ...
Bayesian statistics is a powerful framework for modeling the world and reasoning over uncertainty. I...
Recent advances in statistical inference have significantly expanded the toolbox of probabilistic mo...
We propose a simple method that combines neural networks and Gaussian processes. The proposed method...
We introduce implicit processes (IPs), stochastic processes that place implicitly defined multi...
We view perceptual tasks such as vision and speech recognition as inference problems where the goal ...
Gaussian processes (GPs) are a good choice for function approximation as they are flexible, robust t...
Bayesian neural networks (BNNs) hold great promise as a flexible and principled solution to deal wit...
Implicit processes (IPs) are a generalization of Gaussian processes (GPs). IPs may lack a closed-for...
Variational methods have been previously explored as a tractable approximation to Bayesian inference...
Deep Gaussian Process (DGP) models offer a powerful nonparametric approach for Bayesian inference, b...
The Gaussian process latent variable model (GP-LVM) provides a flexible approach for non-linear dime...
The Neural Process (NP) combines the advantages of neural networks and the Gaussian Process (G...
Neural Processes (NPs; Garnelo et al., 2018a,b) are a rich class of models for meta-learning that ma...
Science and engineering fields use computer simulation extensively. These simulations are often run ...