Word embeddings have been widely used in many natural language processing tasks. In this paper, we focus on learning word embeddings through selective higher-order relationships in sentences, making the embeddings less sensitive to local context and more accurate in capturing semantic compositionality. We present a novel multi-order dependency-based strategy to compose and represent the context under several essential constraints. To realize selective learning from word contexts, we automatically assign strengths to the different dependencies between co-occurring words during stochastic gradient descent. We evaluate and analyze our proposed approach on several direct and indirect tasks for word embeddings. Ex...
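The selective-learning idea above lends itself to a short illustration. Below is a minimal NumPy sketch of how per-order dependency strengths could be learned jointly with the word vectors inside a skip-gram-style negative-sampling update; all names (VOCAB, DIM, ORDERS, sgd_step, the softmax parameterisation of the strengths) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical toy setup: vocabulary size, embedding dimension,
# and the number of dependency orders considered (1st, 2nd, 3rd).
rng = np.random.default_rng(0)
VOCAB, DIM, ORDERS, LR = 1000, 50, 3, 0.025

W_in = rng.normal(0, 0.1, (VOCAB, DIM))    # target-word vectors
W_out = rng.normal(0, 0.1, (VOCAB, DIM))   # context-word vectors
strength = np.zeros(ORDERS)                # learnable per-order strengths

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgd_step(target, context, order, label):
    """One negative-sampling update for a (target, context) pair linked
    by a dependency of the given order. label is +1 for an observed pair
    and -1 for a negative sample. The pair's contribution is scaled by a
    softmax-normalised strength, so the model learns how much each
    dependency order should influence the vectors."""
    w = np.exp(strength) / np.exp(strength).sum()
    dot = W_in[target] @ W_out[context]
    g = label * sigmoid(-label * dot)       # d log sigma(label*dot) / d dot
    g_in = w[order] * g * W_out[context]    # compute both gradients before
    g_out = w[order] * g * W_in[target]     # either vector is updated
    W_in[target] += LR * g_in
    W_out[context] += LR * g_out
    # The strengths follow the gradient of the same weighted objective
    # (softmax Jacobian), so informative orders are up-weighted automatically.
    logp = np.log(sigmoid(label * dot))
    onehot = np.zeros(ORDERS)
    onehot[order] = 1.0
    strength[:] += LR * logp * w[order] * (onehot - w)

# e.g. a target-context pair linked by a 1st-order dependency:
sgd_step(target=17, context=42, order=0, label=+1)
```

A full implementation would also need negative sampling over the vocabulary, subsampling of frequent words, and a dependency parser to extract the multi-order pairs; the point of the sketch is only that the strength update falls out of the same SGD objective as the vector updates.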
The semantic representation of words is a fundamental task in natural language processing and text m...
We present a novel learning method for word embeddings designed for relation classification. Our wor...
A variety of contextualised language models have been proposed in the NLP community, which are train...
The existing word embedding techniques are mostly based on Bag of Words models where words that co-...
Word embeddings are an effective tool to analyze language. They have been recently extended to model...
Word embedding techniques in the literature are mostly based on Bag of Words models where words that co-...
Distributional Semantic Models (DSMs) construct vector representations of word meanings based on the...
Unsupervised word representations have demonstrated improvements in predictive generalization on var...
Pre-trained word vectors are ubiquitous in Natural Language Processing applications. In this paper, ...
Deep compositional models of meaning acting on distributional representations of words in order to p...
Real-valued word embeddings have transformed natural language processing (NLP) applications, recogni...
While continuous word embeddings are gaining popularity, current models are based solely on linear c...
The recent tremendous success of unsupervised word embeddings in a multitude of applications raises ...
The relationship between words in a sentence often tells us more about the underlying semantic conte...
Word embeddings, which are distributed word representations learned by neural language ...