The recent tremendous success of unsupervised word embeddings in a multitude of applications raises the obvious question of whether similar methods could be derived to improve embeddings (i.e., semantic representations) of word sequences as well. We present a simple but efficient unsupervised objective to train distributed representations of sentences. Our method outperforms the state-of-the-art unsupervised models on most benchmark tasks, highlighting the robustness of the produced general-purpose sentence embeddings.
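To make the idea of a sentence embedding concrete, the following is a minimal sketch of one common way to compose a sentence representation from word vectors, namely averaging them. This is a standard compositional baseline and not necessarily the exact model described above; the `word_vectors` dictionary and `embed_sentence` helper are purely illustrative, and in practice the vectors would be learned from large corpora with an unsupervised objective.

```python
import numpy as np

# Illustrative "pretrained" word vectors (random here; in practice learned
# in an unsupervised manner from a large text corpus).
dim = 300
word_vectors = {
    "the": np.random.randn(dim),
    "cat": np.random.randn(dim),
    "sat": np.random.randn(dim),
}

def embed_sentence(tokens, vectors, dim=300):
    """Compose a sentence embedding as the average of its word vectors.

    This averaging composition is a common baseline for general-purpose
    sentence embeddings; out-of-vocabulary tokens are simply skipped.
    """
    found = [vectors[t] for t in tokens if t in vectors]
    if not found:
        return np.zeros(dim)
    return np.mean(found, axis=0)

sentence = ["the", "cat", "sat"]
emb = embed_sentence(sentence, word_vectors, dim=dim)
print(emb.shape)  # (300,)
```

Such fixed-length vectors can then be compared with cosine similarity or fed to downstream classifiers, which is how general-purpose sentence embeddings are typically evaluated on benchmark tasks.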
For a long time, natural language processing (NLP) has relied on generative models with task specifi...
Word embedding has been widely used in many natural language processing tasks. In this paper, we foc...
Sentence-level representations are necessary for various NLP tasks. Recurrent neural networks have p...
We propose a new neural model for word embeddings, which uses Unitary Matrices as the primary device...
Word embedding is a feature learning technique which aims at mapping words from a vocabulary into ve...
Many modern NLP systems rely on word embeddings, previously trained in an unsupervised manner on lar...
Most unsupervised NLP models represent each word with a single point or single region in semantic sp...
The field of Natural Language Processing (NLP) has progressed rapidly in recent years due to the evo...
Pre-trained word vectors are ubiquitous in Natural Language Processing applications. In this paper, ...
In recent years, several methods have been proposed to encode sentences into fixed length ...
Real-valued word embeddings have transformed natural language processing (NLP) applications, recogni...
A variety of contextualised language models have been proposed in the NLP community, which are train...
To obtain high-quality sentence embeddings from pretrained language models (PLMs), they must either ...
Distributed word representations have been widely used and proven to be useful in quite a few natura...
We address the task of unsupervised Semantic Textual Similarity (STS) by ensembling diverse pre-...