Several studies on sentence processing suggest that the mental lexicon keeps track of the mutual expectations between words. Current Distributional Semantic Models (DSMs), however, represent context words as separate features, losing information that is important for modeling word expectations, such as word order and the interrelations among context words. In this paper, we present a DSM that addresses this issue by defining verb contexts as joint dependencies. We test our representation in a verb similarity task on two datasets, showing that joint contexts are more efficient than single dependencies, even with a relatively small amount of training data.
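To make the contrast concrete, the following is a minimal sketch of the difference between single-dependency contexts and joint-dependency contexts for verbs. It is an illustration under assumed conventions, not the paper's actual implementation: the toy parses, helper names, and feature encoding are all hypothetical.

```python
from collections import Counter
from math import sqrt

# Hypothetical toy parses: (verb, {relation: dependent_lemma}) pairs.
# In practice these would come from dependency-parsing a large corpus.
PARSES = [
    ("eat",    {"nsubj": "dog", "dobj": "bone"}),
    ("eat",    {"nsubj": "cat", "dobj": "fish"}),
    ("devour", {"nsubj": "dog", "dobj": "bone"}),
    ("sleep",  {"nsubj": "cat"}),
]

def single_dep_vector(verb):
    """Classic dependency-based DSM: each dependency of the verb
    is counted as a separate, independent feature."""
    v = Counter()
    for head, deps in PARSES:
        if head == verb:
            for rel, lemma in deps.items():
                v[(rel, lemma)] += 1
    return v

def joint_dep_vector(verb):
    """Joint contexts: all dependents of a clause form one combined
    feature, preserving their interrelation (e.g., who eats what)."""
    v = Counter()
    for head, deps in PARSES:
        if head == verb:
            feature = tuple(sorted(deps.items()))  # one joint feature per clause
            v[feature] += 1
    return v

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    num = sum(a[f] * b[f] for f in set(a) & set(b))
    den = sqrt(sum(x * x for x in a.values())) * sqrt(sum(x * x for x in b.values()))
    return num / den if den else 0.0

# "eat" and "devour" share a whole joint context (dog-subject, bone-object),
# while "eat" and "sleep" share none.
print(cosine(joint_dep_vector("eat"), joint_dep_vector("devour")))  # ~0.707
print(cosine(joint_dep_vector("eat"), joint_dep_vector("sleep")))   # 0.0
```

Note that in the single-dependency view, "eat" and "sleep" would still share the feature ("nsubj", "cat"), whereas the joint representation only rewards verbs whose arguments co-occur in the same configuration.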