Word vector representations play a fundamental role in many NLP applications. Exploiting human-curated knowledge has been shown to improve the quality of word embeddings and their performance on many downstream tasks. Retrofitting is a simple and popular technique for refining distributional word embeddings based on relations drawn from a semantic lexicon. Inspired by this technique, we present two methods for incorporating knowledge into contextualized embeddings. We evaluate these methods with BERT embeddings on three biomedical datasets for relation extraction and one movie review dataset for sentiment analysis. We demonstrate that the retrofitted vectors do not substantially impact performance on these tasks, and...
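For reference, the retrofitting technique the abstract builds on is, presumably, the post-processing method of Faruqui et al. (2015): vectors are iteratively pulled toward their neighbours in a semantic lexicon while staying anchored to their original distributional positions. Below is a minimal Python sketch of that update; the function name, the uniform edge weights, and the `alpha` anchoring weight are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def retrofit(embeddings, lexicon, n_iters=10, alpha=1.0):
    """Retrofit distributional word vectors to a semantic lexicon
    (a sketch after Faruqui et al., 2015).

    embeddings: dict mapping word -> np.ndarray
    lexicon:    dict mapping word -> list of related words
    alpha:      weight anchoring each vector to its original position
    """
    new_vecs = {w: v.copy() for w, v in embeddings.items()}
    # Only words present in both the lexicon and the vocabulary move.
    shared = [w for w in lexicon if w in embeddings]
    for _ in range(n_iters):
        for word in shared:
            neighbours = [n for n in lexicon[word] if n in embeddings]
            if not neighbours:
                continue
            beta = 1.0 / len(neighbours)  # uniform edge weights (assumption)
            # Closed-form coordinate update: average of the original vector
            # and the current neighbour vectors, weighted by alpha and beta.
            num = alpha * embeddings[word] + beta * sum(new_vecs[n] for n in neighbours)
            new_vecs[word] = num / (alpha + beta * len(neighbours))
    return new_vecs
```

Each pass solves the convex objective coordinate-wise, so a handful of iterations typically suffices; words absent from the lexicon keep their original vectors unchanged.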
This paper presents the first unsupervised approach to lexical semantic change that makes use of con...
Vector space word representations are typically learned using only co-occurrence statistics from tex...
Semantic specialization of distributional word vectors, referred to as retrofitting, is a process of...
Contextualized word embeddings, i.e. vector representations for words in context, are naturally seen...
Contextual embeddings build multidimensional representations of word tokens based on their context o...
Verbs play a fundamental role in many biomedical tasks and applications such as relation and event e...
Domain adaptation of word embeddings has mainly been explored in the context o...
Despite the success of contextualized language models on various NLP tasks, it is still unclear what...
The often observed unavailability of large amounts of training data typically required by deep learn...
Large pre-trained language models such as BERT have been the driving force behind recent improvement...
The way the words are used evolves through time, mirroring cultural or technological evolution of so...
Word embeddings learned on unlabeled data are a popular tool in semantics, but may not capture the d...
Word embeddings are vectorial semantic representations built with either counting or predicting tech...