The relationship between words in a sentence often tells us more about the underlying semantic content of a document than the words themselves do individually. In this work, we propose two novel algorithms, called Flexible Lexical Chain II and Fixed Lexical Chain II. These algorithms combine the semantic relations derived from lexical chains, prior knowledge from lexical databases, and the robustness of the distributional hypothesis in word embeddings as building blocks of a single system. In short, our approach has three main contributions: (i) a set of techniques that fully integrate word embeddings and lexical chains; (ii) a more robust semantic representation that considers the latent relation between words in a document; and (iii) ...
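The abstract only names the ingredients, so the following sketch shows, in a rough and simplified form, how lexical chains and word embeddings can be combined: words are greedily grouped into chains using WordNet relations, and each chain is then represented by the centroid of its members' embeddings. This is an illustrative stand-in, not the FLLC II or FXLC II algorithms themselves; the greedy chaining heuristic, the centroid representation, the function names, and the toy random vectors are all assumptions made for the example, and it requires NLTK with the WordNet corpus downloaded.

    # Minimal, illustrative sketch of combining lexical chains with word embeddings.
    # NOT the FLLC II / FXLC II algorithms: heuristics and names are assumptions.
    import numpy as np
    from nltk.corpus import wordnet as wn

    def related(w1, w2):
        # Heuristic relatedness: shared synset, or a direct hypernym/hyponym link.
        s1, s2 = set(wn.synsets(w1)), set(wn.synsets(w2))
        if s1 & s2:
            return True
        return any(set(s.hypernyms()) & s2 or set(s.hyponyms()) & s2 for s in s1)

    def build_chains(tokens):
        # Greedy chaining: attach each token to the first chain with a related member.
        chains = []
        for tok in tokens:
            for chain in chains:
                if any(related(tok, member) for member in chain):
                    chain.append(tok)
                    break
            else:
                chains.append([tok])
        return chains

    def chain_vectors(chains, embeddings):
        # Represent each chain by the centroid of its members' embeddings.
        vectors = []
        for chain in chains:
            vecs = [embeddings[w] for w in chain if w in embeddings]
            if vecs:
                vectors.append(np.mean(vecs, axis=0))
        return vectors

    # Toy vectors stand in for pre-trained embeddings (word2vec, GloVe, fastText).
    rng = np.random.default_rng(0)
    tokens = ["car", "vehicle", "engine", "banana", "fruit"]
    emb = {w: rng.normal(size=50) for w in tokens}
    chains = build_chains(tokens)  # grouping depends on the WordNet relations found
    print(chains)
    print([v.shape for v in chain_vectors(chains, emb)])

In practice, pre-trained vectors would replace the toy embeddings, and a lexical database such as WordNet (as the abstract suggests) would supply the prior knowledge that decides which words belong to the same chain.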