Training deep neural networks (DNNs) with limited supervision has been a popular research topic, as it can significantly alleviate the annotation burden. Self-training has been successfully applied to semi-supervised learning tasks, but one drawback of self-training is that it is vulnerable to label noise from incorrect pseudo labels. Inspired by the fact that samples with similar labels tend to share similar representations, we develop a neighborhood-based sample selection approach to tackle the issue of noisy pseudo labels. We further stabilize self-training by aggregating the predictions from different rounds during sample selection. Experiments on eight tasks show that our proposed method outperforms the strongest self-training baseline.
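To make the two ingredients of the abstract concrete, here is a minimal sketch of (a) neighborhood-based selection, which keeps a pseudo-labeled sample only if most of its nearest neighbors in representation space carry the same pseudo label, and (b) aggregation of class-probability predictions across self-training rounds. The function names, the use of cosine similarity, and the hyperparameters k and agreement are illustrative assumptions, not the authors' exact procedure.

import numpy as np

def select_by_neighborhood(embeddings, pseudo_labels, k=10, agreement=0.6):
    """Keep pseudo-labeled samples whose k nearest neighbors (by cosine
    similarity in representation space) mostly share the same pseudo label.
    k and the agreement threshold are illustrative choices."""
    # Normalize embeddings so dot products equal cosine similarities.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = z @ z.T
    np.fill_diagonal(sims, -np.inf)            # exclude self-matches
    nn_idx = np.argsort(-sims, axis=1)[:, :k]  # indices of the k nearest neighbors
    neighbor_labels = pseudo_labels[nn_idx]    # (n, k) pseudo labels of neighbors
    agree = (neighbor_labels == pseudo_labels[:, None]).mean(axis=1)
    return np.where(agree >= agreement)[0]     # indices of retained samples

def aggregate_rounds(prob_history):
    """Average the class-probability predictions from successive
    self-training rounds and re-derive pseudo labels from the mean."""
    mean_probs = np.mean(np.stack(prob_history), axis=0)  # (n_samples, n_classes)
    return mean_probs.argmax(axis=1), mean_probs

# Toy usage with random data standing in for model outputs.
rng = np.random.default_rng(0)
emb = rng.normal(size=(100, 32))                  # hypothetical sample embeddings
probs_r1 = rng.dirichlet(np.ones(5), size=100)    # round-1 predicted probabilities
probs_r2 = rng.dirichlet(np.ones(5), size=100)    # round-2 predicted probabilities
labels, _ = aggregate_rounds([probs_r1, probs_r2])
kept = select_by_neighborhood(emb, labels, k=5, agreement=0.5)
print(f"kept {len(kept)} of 100 pseudo-labeled samples")

Averaging predictions over rounds before selection damps round-to-round fluctuations in the pseudo labels, which is the stabilizing effect the abstract refers to.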