We put forward an approach that exploits the knowledge encoded in lexical resources to induce representations for words that were not encountered frequently during training. Our approach provides an advantage over past work in that it enables vocabulary expansion not only for morphological variations, but also for infrequent domain-specific terms. We performed evaluations in different settings, showing that the technique can provide consistent improvements on multiple benchmarks across domains.

The authors gratefully acknowledge the support of the MRC grant No. MR/M025160/1 for PheneBank.
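Since the abstract only sketches the approach, the following is a minimal, hypothetical illustration of the general idea of inducing a vector for a rare word from a lexical resource: the words found in the target's WordNet synonyms and glosses are looked up in an existing embedding table and averaged. The function name induce_embedding, the use of NLTK's WordNet interface, and the plain averaging step are assumptions made for illustration, not the paper's actual method.

# Minimal sketch (illustrative only): induce a vector for a rare or unseen word
# by averaging the embeddings of words found in its lexical-resource entry,
# here WordNet synonyms and gloss words accessed through NLTK. The dictionary
# `embeddings` is assumed to map frequent in-vocabulary words to numpy vectors.
import numpy as np
from nltk.corpus import wordnet as wn

def induce_embedding(word, embeddings, dim=300):
    related = []
    for synset in wn.synsets(word):
        related.extend(synset.lemma_names())          # synonyms of the rare word
        related.extend(synset.definition().split())   # words from the gloss
    vectors = [embeddings[w.lower()] for w in related
               if w.lower() in embeddings and w.lower() != word]
    if not vectors:
        return np.zeros(dim)   # nothing usable in the resource: back off to zeros
    return np.mean(vectors, axis=0)

Under these assumptions, a rare domain-specific term would receive the centroid of the in-vocabulary words appearing in its definition and synonym set, which is one simple way to expand the vocabulary beyond morphological variants of known words.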
Lexical embedding, the embedding of words within other words (e.g. bar in barn), complicates the...
Using low dimensional vector space to represent words has been very effective in many NLP tasks. Howe...
Pretraining deep neural network architectures with a language modeling objective has brought large i...
Word embedding techniques heavily rely on the abundance of training data for individual words. Given...
Word embeddings are a key component of high-performing natural language processing (NLP) systems, bu...
Most existing corpus-based approaches to semantic representation suffer from inaccurate modeling of ...
The techniques of using neural networks to learn distributed word representations (i.e., word embedd...
Language gives humans an ability to construct a new, previously never used word in such a way that ot...
One of the bottlenecks in Natural Language Processing for a given language is ...
Learning high-quality embeddings for rare words is a hard problem because of sparse context informat...
In this paper, we propose an approach for enhancing word representations twice based on large-scale ...
Motivations like domain adaptation, transfer learning, and feature learning have fueled interest in ...
Vector-space word representations have been very successful in recent years at improving performanc...