Convex potential minimisation is the de facto approach to binary classification. However, Long and Servedio [2008] proved that under symmetric label noise (SLN), minimisation of any convex potential over a linear function class can result in classification performance equivalent to random guessing. This ostensibly shows that convex losses are not SLN-robust. In this paper, we propose a convex, classification-calibrated loss and prove that it is SLN-robust. The loss avoids the Long and Servedio [2008] result by virtue of being negatively unbounded. The loss is a modification of the hinge loss, where one does not clamp at zero; hence, we call it the unhinged loss. We show that the optimal unhinged solution is equivalent to that of a strongly ...
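The abstract above defines the unhinged loss as the hinge loss without the clamp at zero, i.e. on margin z = y·f(x) the hinge loss max(0, 1 − z) becomes simply 1 − z, which is linear and negatively unbounded. A minimal sketch of the two losses (function names are my own, for illustration only):

```python
import numpy as np

def hinge_loss(margin):
    """Standard hinge loss: clamps at zero for margins >= 1."""
    return np.maximum(0.0, 1.0 - margin)

def unhinged_loss(margin):
    """Hinge loss with the clamp removed: linear in the margin,
    so it can take arbitrarily negative values for large margins."""
    return 1.0 - margin
```

For a confidently correct prediction (margin z = 3), the hinge loss is 0 while the unhinged loss is −2; the linearity of the unhinged loss is what lets it escape the Long and Servedio negative result for convex potentials under symmetric label noise.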
Problems of data classification can be studied in the framework of regularization theory as ill-pose...
The paper brings together methods from two disciplines: machine learning theory and robust statistic...
International audienceIn many real-world classification problems, the labels of training examples ar...
In many applications, the training data, from which one needs to learn a classifier, is corrupted wi...
We discuss binary classification from only pos-itive and unlabeled data (PU classification), which i...
In this paper, we theoretically study the problem of binary classification in the presence of random...
Extreme Classification (XC) refers to supervised learning where each training/test instance is label...
In many applications of classifier learning, training data suffers from label noise. Deep networks a...
Learning a classifier from positive and unlabeled data is an important class of classification probl...
Loss function plays an important role in data classification. Manyloss functions have been proposed ...
In many real-world classification problems, the labels of training examples are randomly corrupted. ...
In this letter, we investigate the impact of choosing different loss functions from the viewpoint of...