During the last few years, deep supervised learning models have been shown to achieve state-of-the-art results for Natural Language Processing tasks. Most of these models are trained by minimizing the commonly used cross-entropy loss. However, the latter may suffer from several shortcomings, such as sub-optimal generalization and unstable fine-tuning. Inspired by recent work on self-supervised contrastive representation learning, we present SimSCL, a framework for the binary text classification task that relies on two simple concepts: (i) sampling positive and negative examples given an anchor, treating sentences that belong to the same class as the anchor as positive examples and samples belonging to a differe...
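The sampling rule in concept (i) can be sketched as follows. This is a minimal illustration in Python/NumPy, not the paper's implementation; the function name `sample_pairs` and the toy label batch are illustrative assumptions:

```python
import numpy as np

def sample_pairs(labels, anchor_idx):
    """Split a batch into positives/negatives relative to an anchor:
    examples sharing the anchor's class are positives (the anchor itself
    excluded), and all other examples are negatives."""
    labels = np.asarray(labels)
    anchor_label = labels[anchor_idx]
    idx = np.arange(len(labels))
    not_anchor = idx != anchor_idx                       # exclude the anchor itself
    positives = idx[(labels == anchor_label) & not_anchor]
    negatives = idx[labels != anchor_label]
    return positives, negatives

# Toy batch of binary class labels.
labels = [0, 1, 0, 1, 0]
pos, neg = sample_pairs(labels, anchor_idx=0)
# For anchor 0 (class 0): positives are the other class-0 indices,
# negatives are the class-1 indices.
```

In a contrastive loss, these index sets determine which sentence-embedding pairs are pulled together (anchor/positive) and pushed apart (anchor/negative).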
Few-shot text classification has recently been promoted by the meta-learning paradigm which aims to ...
Cross entropy loss has served as the main objective function for classification-based tasks. Widely ...
The demand for Natural Language Processing has been thriving rapidly due to the various emerging Int...
In the last decade, Deep neural networks (DNNs) have been proven to outperform conventional machine ...
This paper presents SimCTC, a simple contrastive learning (CL) framework that greatly advances the s...
Unsupervised sentence representation learning is a fundamental problem in natural language processin...
Contrastive self-supervised learning has become a prominent technique in representation learning. Th...
Natural Language Inference (NLI) is a growingly essential task in natural language understanding, wh...
Despite their promising performance across various natural language processing (NLP) tasks, current ...
We propose a simple and general method to regularize the fine-tuning of Transformer-based encoders f...
Traditional text classification requires thousands of annotated data or an additional Neural Machine...
In this paper, we explore how to utilize pre-trained language model to perform few-shot text classif...
Contrastive Learning has recently received interest due to its success in self-supervised representa...
Semantic representation learning for sentences is an important and well-studied problem in NLP. The ...