Contrastive learning has become a new paradigm for unsupervised sentence embeddings. Previous studies focus on instance-wise contrastive learning, attempting to construct positive pairs with textual data augmentation. In this paper, we propose a novel Contrastive learning method with Prompt-derived Virtual semantic Prototypes (ConPVP). Specifically, with the help of prompts, we construct a virtual semantic prototype for each instance, and derive negative prototypes by using the negative form of the prompts. Using a prototypical contrastive loss, we enforce the anchor sentence embedding to be close to its corresponding semantic prototype, and far from the negative prototypes as well as the prototypes of other sentences. Extensive experi...
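To make the described loss concrete, below is a minimal PyTorch sketch of a prototypical contrastive objective of this shape. All names (prototypical_contrastive_loss, anchor_emb, proto_emb, neg_proto_emb, temperature) are illustrative assumptions, not the authors' implementation; in particular, how the prototype embeddings are extracted from the prompted inputs (e.g., from a [MASK] token representation) is left outside this sketch.

```python
# Hedged sketch of an InfoNCE-style prototypical contrastive loss:
# each anchor is pulled toward its own prompt-derived prototype and
# pushed away from its negative prototype and all other prototypes
# in the batch. Names and the temperature value are assumptions.
import torch
import torch.nn.functional as F

def prototypical_contrastive_loss(anchor_emb, proto_emb, neg_proto_emb,
                                  temperature=0.05):
    """anchor_emb:    (B, d) sentence embeddings of the anchors
    proto_emb:     (B, d) virtual semantic prototypes, one per anchor
    neg_proto_emb: (B, d) negative prototypes from the negated prompt
    """
    anchor = F.normalize(anchor_emb, dim=-1)
    proto = F.normalize(proto_emb, dim=-1)
    neg_proto = F.normalize(neg_proto_emb, dim=-1)

    # Cosine similarities of each anchor to every prototype in the
    # batch: column i of row i is the positive pair, the rest are
    # in-batch negatives (prototypes of other sentences).
    pos_logits = anchor @ proto.t() / temperature          # (B, B)
    # Similarity of each anchor to its own negative prototype.
    neg_logits = (anchor * neg_proto).sum(dim=-1,
                                          keepdim=True) / temperature  # (B, 1)

    logits = torch.cat([pos_logits, neg_logits], dim=1)    # (B, B+1)
    # The target "class" for anchor i is its own prototype at column i.
    labels = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits, labels)
```

Under these assumptions, the cross-entropy over the concatenated logits simultaneously pulls each anchor toward its own prototype and pushes it away from both its negative prototype and the prototypes of the other sentences in the batch, matching the three forces the abstract describes.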
Learning high-quality dialogue representations is essential for solving a variety of dialogue-orient...
To obtain high-quality sentence embeddings from pretrained language models (PLMs), they must either ...
Learning sentence embeddings in an unsupervised manner is fundamental in natural language processing...
Contrastive learning has been demonstrated to be effective in enhancing pre-trained language models ...
Several prior studies have suggested that word frequency biases can cause the BERT model to learn in...
Semantic representation learning for sentences is an important and well-studied problem in NLP. The ...
Unsupervised sentence representation learning is a fundamental problem in natural language processin...
Though offering amazing contextualized token-level representations, current pre-trained language mod...
Pretrained language models can be effectively stimulated by textual prompts or demonstrations, espec...
Contrastive learning-based methods, such as unsup-SimCSE, have achieved state-of-the-art (SOTA) perf...
In the context of continual learning, prototypes (as representative class embeddings) offer advantages...
Unsupervised sentence embedding learning has recently been dominated by contrastive learning method...
The impressive performance of GPT-3 using natural language prompts and in-context learning has inspi...
Prompt learning has recently become an effective linguistic tool for motivating the PLMs' knowledge on few-...
Contrastive learning methods achieve state-of-the-art results in unsupervised sentence representatio...