Contextualized word embeddings have been employed effectively across several tasks in Natural Language Processing, as they have proved to carry useful semantic information. However, it is still hard to link them to structured sources of knowledge. In this paper we present ARES (context-AwaRe Embeddings of Senses), a semi-supervised approach to producing sense embeddings for the lexical meanings within a lexical knowledge base that lie in a space that is comparable to that of contextualized word vectors. ARES representations enable a simple 1 Nearest-Neighbour algorithm to outperform state-of-the-art models, not only in the English Word Sense Disambiguation task, but also in the multilingual one, whilst training on sense-annotated data in En...
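A minimal sketch of the 1 Nearest-Neighbour disambiguation step the abstract above refers to, under the assumption that sense embeddings (e.g. ARES-style vectors) live in the same space as the contextualized vector of the target word. The function and variable names (`disambiguate`, `sense_vectors`, `contextual_vector`) and the toy WordNet-style sense keys are illustrative, not the authors' released code; real vectors would come from the sense inventory and from the underlying language model.

```python
# Sketch of 1-NN WSD over sense embeddings that share a space with contextual vectors.
# All inputs below are placeholders standing in for real precomputed embeddings.
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def disambiguate(contextual_vector: np.ndarray,
                 candidate_senses: list,
                 sense_vectors: dict) -> str:
    """Return the candidate sense whose embedding is nearest (by cosine
    similarity) to the target word's contextualized vector."""
    return max(candidate_senses,
               key=lambda s: cosine(contextual_vector, sense_vectors[s]))

# Toy usage: random vectors stand in for real sense and contextual embeddings.
rng = np.random.default_rng(0)
sense_vectors = {
    "bank%1:14:00::": rng.normal(size=2048),  # illustrative "financial institution" sense
    "bank%1:17:01::": rng.normal(size=2048),  # illustrative "river bank" sense
}
contextual_vector = rng.normal(size=2048)     # would be produced by the language model
print(disambiguate(contextual_vector, list(sense_vectors), sense_vectors))
```

Because the nearest-neighbour search only needs vectors in a shared space, the same procedure applies unchanged to languages other than English, which is what allows training on English sense-annotated data alone.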
Word Sense Disambiguation (WSD) is the task of identifying the meaning of a word in a given context....
Over the past few years, Word Sense Disambiguation (WSD) has received renewed interest: recently pro...
Vector representations of text are an essential tool for modern Natural Language Processing (NLP), a...
Contextual representations of words derived by neural language models have proven to effectively enc...
Distributional semantics based on neural approaches is a cornerstone of Natural Language Processing,...
Supervised models for Word Sense Disambiguation (WSD) currently yield state-of-the-art results in...
In this paper, we develop a new way of creating sense vectors for any dictiona...
Word sense induction (WSI) is the problem of automatically building an inventory of senses for a set o...
Natural Language Understanding has seen an increasing number of publications in the last years, espe...
Supervised word sense disambiguation (WSD) for truly polysemous words (in contrast to homonyms) is d...
Contextualised word embeddings generated from Neural Language Models (NLMs), such as BERT, represent...
Most word representation methods assume that each word owns a single semantic vector. This is usual...
Word embeddings are widely used in Natural Language Processing, mainly due to their success in captu...
We present a multilingual approach to Word Sense Disambiguation (WSD), which automatically assigns t...
Words have different meanings (i.e., senses) depending on the context. Disambiguating the correct se...