Published as Coyote Papers: Working Papers in Linguistics, Language in Cognitive Science

The paper presents a preliminary evaluation of a corpus-based representation of individual words and a method for generalizing over these representations. The vector space is constructed in a way that gives weight to the fact that two words co-occur rather than to the frequency of their co-occurrence. This format is hypothesized to allow the vector space to be reduced, minimizing the negative effects of data sparseness and enhancing the model's ability to generalize words to novel contexts. The model is assessed by comparing computer-calculated probabilities of different verb-argument combinations with human subjects' judgements about the appropriateness of these combinations.
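The contrast the abstract draws — recording only *whether* two words co-occur within a context window, rather than *how often* — can be sketched as follows. This is a minimal illustration with a toy corpus, not the paper's actual implementation; the function name, window size, and corpus are all illustrative assumptions.

```python
from collections import defaultdict

def cooccurrence_vectors(corpus, window=2, binary=True):
    """Build co-occurrence vectors from a list of tokenized sentences.

    With binary=True, each cell records only WHETHER a context word
    appeared within the window of the target word (the weighting the
    abstract hypothesizes reduces sparseness effects); with
    binary=False, it records the raw co-occurrence count instead.
    """
    vectors = defaultdict(lambda: defaultdict(int))
    for sent in corpus:
        for i, target in enumerate(sent):
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if i == j:
                    continue  # skip the target word itself
                if binary:
                    vectors[target][sent[j]] = 1
                else:
                    vectors[target][sent[j]] += 1
    return {w: dict(ctx) for w, ctx in vectors.items()}

# Toy corpus (illustrative only)
corpus = [["the", "dog", "chased", "the", "cat"],
          ["the", "dog", "chased", "the", "ball"]]

binary_vecs = cooccurrence_vectors(corpus, binary=True)
count_vecs = cooccurrence_vectors(corpus, binary=False)
```

In the binary representation, `dog` and `chased` get a cell value of 1 regardless of how many sentences pair them, whereas the count representation accumulates 2 across the two sentences; downstream, verb-argument plausibility could then be estimated by comparing such vectors rather than raw frequencies.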