Probabilistic meta-learning methods have recently achieved impressive success in few-shot image classification. However, they introduce a huge number of random variables for neural network weights and therefore face severe computational and inferential challenges. In this paper, we propose a novel probabilistic meta-learning method called amortized Bayesian prototype meta-learning. In contrast to previous methods, we introduce only a small number of random variables for latent class prototypes rather than a huge number for network weights; we learn to learn the posterior distributions of these latent prototypes via amortized inference, with no need for an extra amortization network, such that we can easily approximate their posteriors conditiona...
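To make the idea concrete, below is a minimal sketch of the general recipe the abstract describes: per-class Gaussian posteriors over latent prototypes inferred directly from support embeddings (no separate amortization network), with Monte Carlo marginalisation when classifying queries. This is not the authors' code; the choice of embedding mean/spread as posterior parameters, the squared-distance classifier, and all names (prototype_posteriors, n_samples, d, ...) are illustrative assumptions.

import torch
import torch.nn.functional as F

def prototype_posteriors(support_emb, support_lbl, n_classes):
    """Per-class Gaussian posterior (mu, sigma) over the latent prototype,
    computed amortised from that class's support embeddings (assumption)."""
    mus, sigmas = [], []
    for c in range(n_classes):
        emb_c = support_emb[support_lbl == c]             # (shots, d)
        mu = emb_c.mean(dim=0)                            # posterior mean
        # Spread of the shots as a crude posterior scale; a small floor
        # keeps 1-shot classes well-defined (illustrative choice).
        sigma = emb_c.std(dim=0, unbiased=False) + 1e-3
        mus.append(mu)
        sigmas.append(sigma)
    return torch.stack(mus), torch.stack(sigmas)          # (C, d), (C, d)

def classify_queries(query_emb, mus, sigmas, n_samples=10):
    """Average the softmax over negative squared distances across
    Monte Carlo samples of the latent prototypes."""
    probs = 0.0
    for _ in range(n_samples):
        protos = mus + sigmas * torch.randn_like(sigmas)  # reparameterised sample
        d2 = torch.cdist(query_emb, protos) ** 2          # (Q, C)
        probs = probs + F.softmax(-d2, dim=-1) / n_samples
    return probs

# Toy usage: a 5-way 5-shot episode with random stand-in "embeddings".
C, K, Q, d = 5, 5, 15, 64
support_emb = torch.randn(C * K, d)
support_lbl = torch.arange(C).repeat_interleave(K)
query_emb = torch.randn(Q, d)
mus, sigmas = prototype_posteriors(support_emb, support_lbl, C)
print(classify_queries(query_emb, mus, sigmas).shape)     # torch.Size([15, 5])

In a real meta-learning setup the embeddings would come from a shared encoder trained episodically; the point of the sketch is only that the posterior over each prototype is read off the support set in closed form, which is what makes the inference "amortized" without an extra network.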
Meta-learning has been shown to be an effective strategy for few-shot learning. The key idea is to l...
We introduce a new, rigorously-formulated Bayesian meta-learning algorithm that learns a probability...
We propose a novel amortized variational inference scheme for an empirical Bay...
One of the fundamental problems in machine learning is training high-quality neural network models u...
A variety of...
Few-shot learning aims to scale visual recognition to open-ended growth of new classes with limited ...
Few-shot learning focuses on learning a new visual concept with very limited labelled examples. A su...
The prototypical network is a prototype classifier based on meta-learning and is widely used for few...
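For reference, the standard prototypical-network classifier that this line refers to (Snell et al., 2017) computes each class prototype as the mean embedding of that class's support set and classifies a query by a softmax over negative distances:

\[
\mathbf{c}_k = \frac{1}{|S_k|} \sum_{(x_i, y_i) \in S_k} f_\phi(x_i),
\qquad
p_\phi(y = k \mid x) = \frac{\exp\!\big(-d\big(f_\phi(x), \mathbf{c}_k\big)\big)}{\sum_{k'} \exp\!\big(-d\big(f_\phi(x), \mathbf{c}_{k'}\big)\big)},
\]

where $f_\phi$ is the shared embedding network, $S_k$ is the support set of class $k$, and $d$ is typically squared Euclidean distance.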
This paper introduces a new framework for data-efficient and versatile learning. Specifically: 1) We...
In recent years, there has been rapid progress in computing performance and communication techniques...
Neural networks are known to suffer from catastrophic forgetting when trained on sequential datasets...
Modern deep learning requires large-scale, extensively labelled datasets for training. Few-shot learn...
Object classes that surround us have a natural tendency to emerge at varying levels of abstraction. ...
In few-shot classification, we are interested in learning algorithms that train a classifier from on...