Reducing the number of labels required to train convolutional neural networks without degrading performance is key to effectively reducing human annotation effort. We propose Reliable Label Bootstrapping (ReLaB), an unsupervised preprocessing algorithm that improves the performance of semi-supervised algorithms in extremely low supervision settings. Given a dataset with few labeled samples, we first learn meaningful self-supervised latent features for the data. Second, a label propagation algorithm propagates the known labels over the unsupervised features, effectively labeling the full dataset automatically. Third, we select a subset of correctly labeled (reliable) samples using a label noise detection algorithm. Finally, we tr...
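To make the pipeline above concrete, here is a minimal sketch of steps 2 and 3 in Python, assuming the self-supervised embeddings from step 1 have already been computed. It uses scikit-learn's LabelSpreading as a stand-in for the paper's label propagation, and a simple per-class confidence cutoff as a stand-in for its label noise detection; the function name `bootstrap_labels` and the parameter `n_reliable_per_class` are illustrative, not taken from the original work.

```python
import numpy as np
from sklearn.semi_supervised import LabelSpreading

def bootstrap_labels(features, labels, n_reliable_per_class=100):
    """Propagate a few known labels over self-supervised features, then
    keep only the most confident pseudo-labels as a reliable subset.

    features : (n_samples, d) array of self-supervised embeddings (step 1)
    labels   : (n_samples,) int array; -1 marks unlabeled samples
    """
    # Step 2: diffuse the known labels through a k-NN graph on the features.
    propagator = LabelSpreading(kernel="knn", n_neighbors=10, alpha=0.2)
    propagator.fit(features, labels)
    pseudo_labels = propagator.transduction_           # a label for every sample
    confidence = propagator.label_distributions_.max(axis=1)

    # Step 3 (stand-in for the paper's noise detection): per class, keep the
    # samples whose propagated label has the highest confidence.
    reliable_idx = []
    for c in np.unique(pseudo_labels):
        class_idx = np.flatnonzero(pseudo_labels == c)
        ranked = class_idx[np.argsort(confidence[class_idx])[::-1]]
        reliable_idx.extend(ranked[:n_reliable_per_class])
    return pseudo_labels, np.sort(np.array(reliable_idx))
```

The reliable subset, `(features[reliable_idx], pseudo_labels[reliable_idx])`, would then seed a standard semi-supervised learner in the final step.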
In this paper we revisit the idea of pseudo-labeling in the context of semi-supervised learning wher...
Semi-supervised learning, i.e. jointly learning from labeled and unlabeled samples, is an active res...
Training with fewer annotations is a key issue for applying deep models to various practical domains...
Noisy labels are an unavoidable consequence of labeling processes and detecting them is an important...
Supervised learning, the standard paradigm in machine learning, only works well if a sufficiently la...
Supervised learning of deep neural networks heavily relies on large-scale datasets annotated by high...
Given an unlabeled dataset and an annotation budget, we study how to selectively label a fixed numbe...
Many state-of-the-art noisy-label learning methods rely on learning mechanisms that estimate the sam...
While semi-supervised learning (SSL) algorithms provide an efficient way to make use of both labelle...