Cross entropy loss has served as the main objective function for classification-based tasks. Widely used to train neural network classifiers, it is both effective and admits a probabilistic interpretation. Recently, following the success of self-supervised contrastive representation learning methods, supervised contrastive methods have been proposed to learn representations, showing superior and more robust performance compared to training solely with cross entropy loss. However, cross entropy loss is still needed to train the final classification layer. In this work, we investigate the possibility of learning both the representation and the classifier using one objective function that combines the robustness of contrastive lear...
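Since the abstract above centers on a supervised contrastive objective, a minimal sketch may help make it concrete. The following is an illustrative PyTorch implementation of a supervised contrastive (SupCon-style) loss over a batch of embeddings; the function name, the temperature value, and the weighting against cross entropy are assumptions for illustration, not any paper's exact formulation.

import torch
import torch.nn.functional as F

def supervised_contrastive_loss(features: torch.Tensor,
                                labels: torch.Tensor,
                                temperature: float = 0.1) -> torch.Tensor:
    """features: (N, D) embeddings; labels: (N,) integer class labels.

    Illustrative SupCon-style loss; hyperparameters are assumptions.
    """
    z = F.normalize(features, dim=1)                  # unit-norm embeddings
    sim = z @ z.t() / temperature                     # pairwise similarities
    n = z.size(0)
    # Exclude each anchor's similarity with itself.
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))
    # Positives: other samples in the batch sharing the anchor's label.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    # Log-softmax over each row of similarities.
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Average log-probability over each anchor's positives; anchors with
    # no positive in the batch are dropped from the mean.
    pos_log_prob = log_prob.masked_fill(~pos_mask, 0.0)
    pos_counts = pos_mask.sum(dim=1)
    loss = -pos_log_prob.sum(dim=1) / pos_counts.clamp(min=1)
    return loss[pos_counts > 0].mean()

In the combined setting the abstract describes, this term would typically be added to the usual classification loss, e.g. total = F.cross_entropy(logits, labels) + lam * supervised_contrastive_loss(features, labels), where lam is an assumed weighting hyperparameter.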
Contrastive Learning has recently received interest due to its success in self-supervised representa...
Self-supervised Contrastive Learning (CL) has been recently shown to be very effective in preventing...
We study the impact of different pruning techniques on the representation learned by deep neural net...
Learning discriminative image representations plays a vital role in long-tailed image classification...
Is it possible to train several classifiers to perform meaningful crowd-sourcing to produce a better...
Deep neural networks are able to memorize noisy labels easily with a softmax cross-entropy (CE) loss...
The increasing size of modern datasets combined with the difficulty of obtaining real label informat...
Recent research reveals that deep neural networks are sensitive to label noise, leading to po...
In the last decade, deep neural networks (DNNs) have been proven to outperform conventional machine ...
As a seminal tool in self-supervised representation learning, contrastive learning has gained unprec...
Learning invariant representations is a critical first step in a number of machine learning tasks. A...