Machine learning has grown exponentially in the last decade and is now a central topic across many scientific disciplines. The basic idea of ML is to ``learn through examples''. To do so, one minimizes the regularized empirical risk, which requires choosing an error measure, called the loss function, and a regularization term, introduced to prevent overfitting. Choosing the right loss for the right problem can therefore make a significant difference. In this work, we present a large class of loss functions, named Fenchel-Young losses and introduced by Blondel et al. The name comes from a well-known inequality in convex analysis, the Fenchel-Young inequality, which relates a function to its convex conjugate. These loss functions...
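As a brief sketch of the objects mentioned above (the notation here is our own, since the abstract does not fix one): given training pairs $(x_i, y_i)_{i=1}^n$, a predictor $f$, a loss $\ell$ and a regularizer $R$ with weight $\lambda > 0$, regularized empirical risk minimization reads
\[
\min_{f} \; \frac{1}{n}\sum_{i=1}^{n} \ell\bigl(y_i, f(x_i)\bigr) \;+\; \lambda\, R(f).
\]
The Fenchel-Young inequality states that a convex function $\Omega$ and its conjugate $\Omega^*(\theta) = \sup_{\mu}\,\langle \theta, \mu\rangle - \Omega(\mu)$ satisfy $\Omega(\mu) + \Omega^*(\theta) \ge \langle \theta, \mu\rangle$ for all $\mu, \theta$. Following Blondel et al., the Fenchel-Young loss generated by $\Omega$ is the (nonnegative) gap in this inequality,
\[
L_{\Omega}(\theta; y) \;=\; \Omega^*(\theta) + \Omega(y) - \langle \theta, y\rangle \;\ge\; 0,
\]
which vanishes exactly when $y \in \partial \Omega^*(\theta)$. For instance, taking $\Omega$ to be the negative Shannon entropy restricted to the probability simplex gives $\Omega^*(\theta) = \log \sum_j e^{\theta_j}$ and, for a one-hot target $y = e_k$, recovers the multinomial logistic loss $\log \sum_j e^{\theta_j} - \theta_k$.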