The prototypical network is a prototype classifier based on meta-learning that is widely used for few-shot learning because it classifies unseen examples by constructing class-specific prototypes, without adjusting hyper-parameters during meta-testing. Interestingly, recent work has attracted considerable attention by showing that training a new linear classifier, without any meta-learning algorithm, performs comparably to the prototypical network. However, training a new linear classifier requires retraining the classifier every time a new class appears. In this paper, we analyze how a prototype classifier can work equally well without training a new linear classifier or meta-learning. We experimentally find that directly...
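The prototype-classifier idea described above can be sketched in a few lines: average each class's support embeddings into a prototype, then assign a query to the nearest prototype. This is a minimal illustration, not the paper's implementation; the encoder is assumed to exist elsewhere, and plain lists of floats stand in for its output embeddings.

```python
import math

def class_prototypes(support):
    """Average each class's support embeddings into a single prototype.

    support: dict mapping class label -> list of embedding vectors
    (each vector is a list of floats produced by some encoder).
    """
    protos = {}
    for label, vectors in support.items():
        dim = len(vectors[0])
        protos[label] = [sum(v[d] for v in vectors) / len(vectors)
                         for d in range(dim)]
    return protos

def classify(query, protos):
    """Assign the query embedding to the class with the nearest
    (Euclidean-distance) prototype."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(protos, key=lambda label: dist(query, protos[label]))

# Toy 2-way, 2-shot episode with hypothetical 2-D embeddings.
support = {
    "cat": [[0.9, 0.1], [1.1, -0.1]],
    "dog": [[-1.0, 0.0], [-0.8, 0.2]],
}
protos = class_prototypes(support)
print(classify([0.8, 0.0], protos))  # -> cat
```

Note that adding a new class only requires averaging its support embeddings into a new prototype; no classifier weights are retrained, which is the property the abstract contrasts with linear-classifier baselines.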
In this paper, we move towards combining large parametric models with non-parametric prototypical ne...
Gradient-based meta-learning techniques aim to distill useful prior knowledge from a set of training...
Few-shot learning for neural networks (NNs) is an important problem that aims to train NNs with a fe...
Few-shot classification aims to adapt to new tasks with limited labeled examples. To fully use the a...
Few-Shot Learning (FSL) is a challenging task, i.e., how to recognize novel classes with few example...
© 2019 International Joint Conferences on Artificial Intelligence. All rights reserved. A variety of...
The prototypical network for few-shot learning tries to learn an embedding function in the encoder that ...
Probabilistic meta-learning methods have recently achieved impressive success in few-shot image clas...
We introduce ProtoPool, an interpretable image classification model with a pool of prototypes shared...
Model-agnostic meta-learning (MAML) is arguably one of the most popular meta-learning algorithms now...
In this paper, we consider the framework of multi-task representation (MTR) learning where the goal ...
A two-stage training paradigm consisting of sequential pre-training and meta-training stages has bee...
The development of complex, powerful classifiers and their constant improvement have contributed muc...
One of the fundamental problems in machine learning is training high-quality neural network models u...
Incorporating large-scale pre-trained models with the prototypical neural networks is a de-facto par...