Distributional models of semantics are a popular way of capturing the similarity between words or concepts. More recently, such models have also been used to generate properties associated with a concept; model-generated properties are typically compared against collections of semantic feature norms. In the present paper, we propose a novel way of testing the plausibility of the properties generated by a distributional model using data from a visual world experiment. We show that model-generated properties, when embedded in a sentential context, bias participants' expectations towards a semantically associated target word in real time. This effect is absent in a neutral context that contains no relevant properties.
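The property-generation idea described above can be illustrated with a minimal sketch: in a distributional space, candidate property words can be ranked by their vector similarity to a concept. The tiny hand-built vectors below are purely hypothetical stand-ins (a real model would learn them from corpus co-occurrence statistics), and `generate_properties` is an illustrative helper, not the method used in the paper.

```python
import math

# Hypothetical toy vectors; a real distributional model would learn
# these from co-occurrence patterns in a large corpus.
vectors = {
    "canary": [0.9, 0.1, 0.8, 0.0],
    "sings":  [0.8, 0.0, 0.7, 0.1],
    "yellow": [0.7, 0.2, 0.6, 0.0],
    "barks":  [0.1, 0.9, 0.0, 0.8],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def generate_properties(concept, candidates, top_n=2):
    """Rank candidate property words by similarity to the concept vector."""
    scored = [(w, cosine(vectors[concept], vectors[w])) for w in candidates]
    return sorted(scored, key=lambda pair: -pair[1])[:top_n]

# For "canary", semantically associated properties outrank unrelated ones.
print(generate_properties("canary", ["sings", "yellow", "barks"]))
```

Under this sketch, the top-ranked candidates ("sings", "yellow") are the kind of model-generated properties that could then be embedded in a sentential context and tested against behavioural data such as feature norms or eye-tracking measures.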
In this work, we address the evaluation of distributional semantic models trained on smaller, domain...
Distributional models of semantics learn word meanings from contextual co‐occurrence patterns across...
When we communicate with each other, a large chunk of what we express is conveyed by the words we us...
In recent years, distributional models (DMs) have shown great success in representing lexical seman...
This paper proposes a framework for investigating which types of semantic properties are represented...
Distributional semantic models represent words in a vector space and are competent in various semant...
Distributional Semantic Models have emerged as a strong theoretical and practical approach to model ...
Opportunities for associationist learning of word meaning, where a word is heard or read contemporan...
What do powerful models of word meaning created from distributional data (e.g. Word2vec (Mikolov e...
Do distributional word representations encode the linguistic regularities that...
The aim of distributional semantics is to design computational techniques that can automatically lea...
Motivated by the widespread use of distributional models of semantics within the cognitive science c...
We present a distributional semantic model combining text- and image-based features. We evaluate thi...