Recent research reveals that deep neural networks are sensitive to label noise, leading to poor generalization performance in some tasks. Although different robust loss functions have been proposed to remedy this issue, they suffer from an underfitting problem and are thus not sufficient to learn accurate models. On the other hand, the commonly used Cross Entropy (CE) loss, which shows high performance in standard supervised learning (with clean supervision), is non-robust to label noise. In this paper, we propose a general framework to learn robust deep neural networks with complementary loss functions. In our framework, CE and robust loss play complementary roles in a joint learning objective as per their learning sufficiency and robu...
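The abstract above describes a joint objective in which CE (which fits well but memorizes noise) and a robust loss (which is noise-tolerant but underfits) complement each other. The excerpt does not give the exact objective, so the sketch below assumes a simple convex combination of CE and MAE (a common robust loss), with a hypothetical weight `alpha`:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def ce_loss(probs, labels):
    # Cross entropy: learns quickly but can memorize noisy labels.
    return -np.log(probs[np.arange(len(labels)), labels] + 1e-12)

def mae_loss(probs, labels, num_classes):
    # Mean absolute error: robust to symmetric label noise, but tends to underfit.
    one_hot = np.eye(num_classes)[labels]
    return np.abs(one_hot - probs).sum(axis=1)

def complementary_loss(logits, labels, alpha=0.5):
    # Hypothetical joint objective: a convex combination of the two losses.
    # The actual weighting scheme of the paper is not given in this excerpt.
    probs = softmax(logits)
    num_classes = logits.shape[1]
    return alpha * ce_loss(probs, labels) + (1 - alpha) * mae_loss(probs, labels, num_classes)

# Toy usage: two confident, correctly classified samples give a small per-sample loss.
logits = np.array([[4.0, 0.0, 0.0], [0.0, 0.0, 4.0]])
labels = np.array([0, 2])
loss = complementary_loss(logits, labels)
```

Tuning `alpha` trades off fitting speed (CE-dominated) against noise robustness (MAE-dominated).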
Over the past decades, deep neural networks have achieved unprecedented success in image classificat...
Regularization of (deep) learning models can be realized at the model, loss, or data level. As a tec...
In the process of machine learning, models are essentially defined by a group of parameters in multi...
Robust learning in the presence of label noise is an important problem of current interest. Training dat...
Deep neural networks are able to memorize noisy labels easily with a softmax cross-entropy (CE) loss...
In many applications of classifier learning, training data suffers from label noise. Deep networks a...
Deep neural networks trained with standard cross-entropy loss memorize noisy labels, which degrades ...
Designing robust loss functions is popular in learning with noisy labels while existing designs did ...
Labelling of data for supervised learning can be costly and time-consuming and the risk of incorporati...
Recent advances in Artificial Intelligence (AI) have been built on large scale datasets. These advan...
Learning with noisy labels is one of the most practical but challenging tasks in deep learning. One ...
We consider a number of enhancements to the standard neural network training paradigm. First, we sho...