Distributed word representations capture relational similarities through vector arithmetic, giving high accuracy on analogy detection. We empirically investigate the use of syntactic dependencies for improving Chinese analogy detection based on distributed word representations, showing that dependency-based embeddings do not perform better than n-gram-based embeddings, but that dependency structures can be used to improve analogy detection by filtering candidates. In addition, we show that distributed representations of dependency structures can be used to measure relational similarities, thereby helping analogy mining.
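As a minimal illustration of the vector-offset (3CosAdd) analogy method that the abstract refers to, the NumPy sketch below answers "a is to b as c is to ?" by ranking candidates with cosine similarity against b - a + c. The toy vocabulary and three-dimensional vectors are illustrative assumptions, not the paper's data or trained embeddings.

```python
import numpy as np

def normalize(v):
    # Unit-normalize so that a dot product equals cosine similarity.
    return v / np.linalg.norm(v)

# Hypothetical embeddings; in practice these would come from a model trained
# on n-gram or dependency contexts.
emb = {
    "king":   normalize(np.array([0.80, 0.20, 0.10])),
    "man":    normalize(np.array([0.70, 0.10, 0.00])),
    "woman":  normalize(np.array([0.60, 0.10, 0.50])),
    "queen":  normalize(np.array([0.70, 0.20, 0.60])),
    "prince": normalize(np.array([0.85, 0.15, 0.10])),
}

def analogy(a, b, c, emb):
    """Return the word d maximizing cos(d, b - a + c), excluding a, b, c."""
    target = normalize(emb[b] - emb[a] + emb[c])
    scores = {w: v @ target for w, v in emb.items() if w not in (a, b, c)}
    return max(scores, key=scores.get)

print(analogy("man", "king", "woman", emb))  # expected: "queen"
```

Candidate filtering with dependency structures, as described in the abstract, would act on top of such a ranking by discarding candidate words whose syntactic relation to the query words does not match; the sketch above only shows the base vector-arithmetic step.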