Over the past years, distributed semantic representations have proved to be effective and flexible keepers of prior knowledge to be integrated into downstream applications. This survey focuses on the representation of meaning. We start from the theoretical background behind word vector space models and highlight one of their major limitations: the meaning conflation deficiency, which arises from representing a word with all its possible meanings as a single vector. Then, we explain how this deficiency can be addressed through a transition from the word level to the more fine-grained level of word senses (in its broader acceptation) as a method for modelling unambiguous lexical meaning. We present a comprehensive overview of the wide range o...
Natural Language Understanding has seen an increasing number of publications in recent years, espe...
While word embeddings are now a de facto standard representation of words in most NLP tasks, recentl...
Contextualized word embeddings have been employed effectively across several tasks in Natural Langua...
Representation learning lies at the core of Artificial Intelligence (AI) and Natural Language Proces...
Vector representations of text are an essential tool for modern Natural Language Processing (NLP), a...
Words are not detached individuals but part of a beautiful interconnected web of related concepts, a...
Word embeddings are vectorial semantic representations built with either counting or predicting tech...
Word embeddings are widely used in Natural Language Processing, mainly due to their success in captu...
Word embeddings typically represent different meanings of a word in a single conflated vector. Emp...
One major deficiency of most semantic representation techniques is that they usually model a word ty...
Word embeddings are widely used in Natural Language Processing, mainly due to their success in captu...
In this paper, we develop a new way of creating sense vectors for any dictiona...
The representation of written language semantics is a central problem of language technology and a c...
Language, in both its written and oral forms, is the foundation of living in society. The same...
Human languages are naturally ambiguous, which makes it difficult to automatically understand the se...