Label noise is ubiquitous in machine learning scenarios such as self-labeling with model predictions and erroneous data annotation. Many existing approaches rely on heuristics such as sample losses, which may not be flexible enough to achieve optimal solutions. Meta-learning-based methods address this issue by learning a data selection function, but can be hard to optimize. In light of these pros and cons, we propose Selection-Enhanced Noisy label Training (SENT), which does not rely on meta learning while retaining the flexibility of being data-driven. SENT transfers the noise distribution to a clean set and trains a model to distinguish noisy labels from clean ones using model-based features. Empirically, on a wide range of task...
Most studies on learning from noisy labels rely on unrealistic models of i.i.d. label noise, such as...
Obtaining a sufficient number of accurate labels to form a training set for learning a classifier ca...
Despite the large progress in supervised learning with neural networks, there are significant challe...
Leveraging weak or noisy supervision for building effective machine learning models has long been an...
Learning with noisy labels is a vital topic for practical deep learning as models should be robust t...
Noisy labels are commonly present in data sets automatically collected from the internet, mislabeled...
Designing robust loss functions is popular in learning with noisy labels while existing designs did ...
Recent studies on learning with noisy labels have shown remarkable performance by exploiting a small...
We investigate the problem of learning with noisy labels in real-world annotation scenarios, where n...
In this paper, we address the problem of effectively self-training neural networks in a low-resource ...
Over the past decades, deep neural networks have achieved unprecedented success in image classificat...
Many state-of-the-art noisy-label learning methods rely on learning mechanisms that estimate the sam...