Word embeddings have advanced the state of the art in NLP across numerous tasks. Understanding the contents of dense neural representations is of utmost interest to the computational semantics community. We propose to focus on relating these opaque word vectors with human-readable definitions, as found in dictionaries. This problem naturally divides into two subtasks: converting definitions into embeddings, and converting embeddings into definitions. This task was conducted in a multilingual setting, using comparable sets of embeddings trained homogeneously.
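The division into two subtasks can be made concrete with a toy sketch. The following is a minimal, self-contained illustration, not the shared task's reference implementation: the reverse-dictionary direction composes a gloss into a vector by averaging word vectors (a common weak baseline), and the definition-modelling direction is reduced here to retrieval over candidate glosses, where real systems would instead generate text. The vocabulary, dimensionality, and random vectors are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["small", "furry", "domesticated", "feline", "animal"]
emb = {w: rng.normal(size=8) for w in vocab}  # toy 8-dim word vectors

def definition_to_embedding(gloss):
    """Reverse-dictionary direction: gloss -> vector (mean of word vectors)."""
    vecs = [emb[w] for w in gloss.split() if w in emb]
    return np.mean(vecs, axis=0)

def embedding_to_definition(vec, glosses):
    """Definition-modelling direction: vector -> gloss, here by retrieving
    the candidate whose composed vector is closest in cosine similarity."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(glosses, key=lambda g: cos(vec, definition_to_embedding(g)))

glosses = ["small domesticated feline", "furry animal"]
target = definition_to_embedding("small domesticated feline")
print(embedding_to_definition(target, glosses))  # -> "small domesticated feline"
```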
Distributional semantics has been revolutionized by neural-based word embedding methods such as wor...
Prediction without justification has limited utility. Much of the success of neural models can be at...
Word embeddings serve as a useful resource for many downstream natural language processing tasks. T...
Distilling knowledge from a well-trained cumbersome network to a small one has recently become a new...
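For context, the distillation objective in its standard form trains the small student network against the temperature-softened output distribution of the large teacher. The sketch below illustrates only that loss; the logits and temperature are made-up numbers, and this is not the cited work's full training recipe.

```python
import numpy as np

def softmax(z, T=1.0):
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

teacher_logits = np.array([4.0, 1.0, 0.2])  # hypothetical teacher outputs
student_logits = np.array([2.5, 0.8, 0.4])  # hypothetical student outputs
T = 2.0  # temperature > 1 softens the teacher's distribution

p_teacher = softmax(teacher_logits, T)  # soft targets
p_student = softmax(student_logits, T)
# cross-entropy of the student against the teacher's softened distribution
distill_loss = -(p_teacher * np.log(p_student)).sum()
print(round(float(distill_loss), 3))
```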
What is a word embedding? Suppose you have a dictionary of words. The i-th word in the dictionary is...
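The standard construction behind this setup: the i-th word corresponds to a one-hot vector of vocabulary size, and its dense embedding is the i-th row of a learned matrix, so the lookup is just a matrix product. A toy sketch with random stand-in values (the vocabulary and dimensions are hypothetical, not trained vectors):

```python
import numpy as np

vocab = ["the", "cat", "sat"]   # the "dictionary of words"
V, d = len(vocab), 4            # vocabulary size, embedding dimension
rng = np.random.default_rng(0)
E = rng.normal(size=(V, d))     # embedding matrix, one row per word

i = vocab.index("cat")
one_hot = np.eye(V)[i]          # sparse |V|-dimensional representation
dense = one_hot @ E             # equals E[i]: the word's dense embedding
assert np.allclose(dense, E[i])
```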
Word embeddings are vectorial semantic representations built with either counting or predicting tech...
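The count-based route the abstract contrasts with prediction can be sketched in a few lines: build a word-context co-occurrence matrix over a sliding window and reweight it with PPMI, so each row becomes a word vector; predicting methods such as skip-gram instead learn the vectors by gradient descent. The corpus and window size below are hypothetical.

```python
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

# word-context co-occurrence counts within a symmetric window
counts = np.zeros((len(vocab), len(vocab)))
window = 2
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if i != j:
            counts[idx[w], idx[corpus[j]]] += 1

total = counts.sum()
p_w = counts.sum(axis=1, keepdims=True) / total
p_c = counts.sum(axis=0, keepdims=True) / total
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((counts / total) / (p_w * p_c))
ppmi = np.maximum(pmi, 0.0)  # each row is a count-based word vector
print(ppmi[idx["cat"]].round(2))
```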
Language Models have long been a prolific area of study in the field of Natural Language Processing ...
Research on word representation has always been an important area of interest in the history of Na...
Recent advances in neural language models have contributed new methods for learning distributed vect...
We describe a novel approach to generate high-quality lexical word embeddings from an Enhanced Neura...
Distributed representations of words have been shown to capture lexical semantics, based on their ef...
Pre-trained word embeddings encode general word semantics and lexical regularities of natural langua...
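One classic instance of such lexical regularities is the analogy solved by vector arithmetic. In the toy sketch below the vectors are constructed so that the pattern holds exactly; in real pre-trained embeddings it holds only approximately.

```python
import numpy as np

# basis directions chosen so the analogy holds by construction
male, female, royal = np.eye(3)
emb = {
    "man": male,
    "woman": female,
    "person": male + female,
    "king": royal + male,
    "queen": royal + female,
}

query = emb["king"] - emb["man"] + emb["woman"]
def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
candidates = [w for w in emb if w not in {"king", "man", "woman"}]
print(max(candidates, key=lambda w: cos(query, emb[w])))  # -> "queen"
```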
Words are not detached individuals but part of a beautiful interconnected web of related concepts, a...
Real-valued word embeddings have transformed natural language processing (NLP) applications, recogni...