We introduce the Convolutional Conditional Neural Process (ConvCNP), a new member of the Neural Process family that models translation equivariance in the data. Translation equivariance is an important inductive bias for many learning problems, including time-series modelling, spatial data, and images. The model embeds data sets into an infinite-dimensional function space, as opposed to a finite-dimensional vector space. To formalize this notion, we extend the theory of neural representations of sets to include functional representations, and demonstrate that any translation-equivariant embedding can be represented using a convolutional deep set. We evaluate ConvCNPs in several settings, demonstrating that they achieve state-of-the-art performance...
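The functional embedding described in this abstract can be illustrated with a minimal sketch. The code below is an assumption-laden toy version of the ConvDeepSets construction (not the authors' implementation): each context point contributes a stationary RBF bump, producing a two-channel function (a "density" channel marking where data lives, and a signal channel carrying the observed values). Because the kernel depends only on differences t - x, shifting the context set shifts the embedding function by the same amount, which is the translation equivariance the paper builds on.

```python
import numpy as np

def rbf(d, length_scale=0.5):
    # Stationary RBF kernel: depends only on the difference d,
    # which is what makes the embedding translation equivariant.
    return np.exp(-0.5 * (d / length_scale) ** 2)

def functional_embedding(x_ctx, y_ctx, t_grid, length_scale=0.5):
    # Toy ConvDeepSets-style embedding evaluated on a grid:
    #   E(Z)(t) = sum_c [1, y_c] * psi(t - x_c)
    # Channel 0 ("density") records where context points lie;
    # channel 1 carries the observed values.
    w = rbf(t_grid[:, None] - x_ctx[None, :], length_scale)  # (T, C)
    density = w.sum(axis=1)
    signal = w @ y_ctx
    return np.stack([density, signal], axis=-1)  # (T, 2)

# Equivariance check: translating the context by delta and evaluating
# on an equally translated grid reproduces the original embedding.
x_ctx = np.array([-1.0, 0.3, 2.0])
y_ctx = np.array([0.5, -1.2, 0.7])
t = np.linspace(-3, 3, 121)
delta = 0.7

E = functional_embedding(x_ctx, y_ctx, t)
E_shift = functional_embedding(x_ctx + delta, y_ctx, t + delta)
print(np.allclose(E, E_shift))  # True
```

In the full model, a CNN is then applied to this gridded functional representation; since CNNs commute with translations, the end-to-end map from context set to predictive function inherits the equivariance demonstrated above.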
This repository contains the datasets used in our paper "Generalization capabilities of translationa...
Learning the cumulative distribution function (CDF) of an outcome variable conditional on a set of f...
NeurIPS 2019. Despite the phenomenal success of deep neural networks in a broad ...
Stationary stochastic processes (SPs) are a key component of many probabilistic models, such as thos...
Aside from developing methods to embed the equivariant priors into the architectures, one can also s...
Neural Processes (NPs; Garnelo et al., 2018a,b) are a rich class of models for meta-learning that ma...
Conditional Neural Processes (CNPs; Garnelo et al., 2018a) are meta-learning models which leverage t...
Conditional Neural Processes (CNP; Garnelo et al., 2018) are an attractive family of meta-learning m...
In this thesis we have examined the complexity of neural networks, especially convolutional neura...
Convolutional Neural Networks (CNNs) have been shown to yield very strong results in several Computer Vis...
The purpose of this short and simple note is to clarify a common misconception about convolutional n...
The aggressive resurgence of convolutional neural network (CNN) models for prediction has led to new...
A natural progression in machine learning research is to automate and learn from data increasingly m...
Equivariances provide useful inductive biases in neural network modeling, with the translation equiv...