Several psycholinguistic models represent words as vectors in a high-dimensional state space, such that distances between vectors encode the strengths of paradigmatic relations between the represented words. This chapter argues that such an organization develops because it facilitates fast sentence processing. A model is presented in which sentences, in the form of word-vector sequences, serve as input to a recurrent neural network that provides random dynamics. The word vectors are adjusted by a process of self-organization, aimed at reducing fluctuations in the dynamics. As it turns out, the resulting word vectors are organized paradigmatically.
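The core idea of the abstract above can be sketched in a few lines: drive a recurrent network with fixed random weights using word-vector sequences, and adapt only the word vectors so that the temporal fluctuation (variance over time) of the hidden states shrinks. This is a minimal illustrative sketch, not the paper's actual implementation; all sizes, hyperparameters, and the finite-difference update rule are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, DIM, HIDDEN = 5, 4, 8                  # toy sizes (assumed)
W_in = rng.normal(0, 0.5, (HIDDEN, DIM))      # fixed random input weights
W_rec = rng.normal(0, 0.5, (HIDDEN, HIDDEN))  # fixed random recurrent weights
words = rng.normal(0, 1.0, (VOCAB, DIM))      # word vectors to be self-organized

def run(sentence):
    """Drive the recurrent network with a sequence of word indices."""
    h = np.zeros(HIDDEN)
    states = []
    for w in sentence:
        h = np.tanh(W_in @ words[w] + W_rec @ h)
        states.append(h)
    return np.array(states)

def fluctuation(sentence):
    """Mean variance of each hidden unit over time: the quantity to reduce."""
    return run(sentence).var(axis=0).mean()

# Crude self-organization step (illustrative, not the paper's method):
# nudge each word-vector coordinate by finite differences so that the
# average fluctuation over a tiny toy corpus decreases.
corpus = [[0, 1, 2], [3, 1, 4], [0, 4, 2]]
eps, lr = 1e-4, 0.05
before = np.mean([fluctuation(s) for s in corpus])
for _ in range(60):
    grad = np.zeros_like(words)
    for w in range(VOCAB):
        for d in range(DIM):
            words[w, d] += eps
            plus = np.mean([fluctuation(s) for s in corpus])
            words[w, d] -= 2 * eps
            minus = np.mean([fluctuation(s) for s in corpus])
            words[w, d] += eps
            grad[w, d] = (plus - minus) / (2 * eps)
    words -= lr * grad
after = np.mean([fluctuation(s) for s in corpus])
print(before, after)  # fluctuation decreases as the word vectors adapt
```

Only `words` is updated; the random network itself stays fixed, which is what lets the resulting organization of the word vectors be attributed to the self-organization process rather than to weight learning.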
Previous words in the sentence can influence the processing of the current wor...
Words are time-bound signals and are amenable to temporal processing. The human brain has an innate ...
This paper demonstrates how associative neural networks as standard models for Hebbian cell assembli...
As potential candidates for explaining human cognition, connectionist models of sentence processing ...
In this paper we present a self-organizing connectionist model of the acquisition of word meaning. O...
Recent experimental evidence on morphological learning and processing has prompted a less determinis...
We present a self-organizing neural network model that can acquire an incremental lexicon. The model...
We present a novel approach to unsupervised temporal sequence processing in the form of an...
Hypernetworks are a generalized graph structure representing higher-order interactions between varia...
Sentence processing requires the ability to establish thematic relations between constituents. Here ...
Human lexical knowledge does not appear to be organised to minimise storage, but rather to maximise ...
To understand the brain mechanisms underlying language phenomena, and sentence construction in parti...
The classical connectionist models are not well suited to working with data varying over time. Accor...
The ability to accurately model a sentence at varying stages (e.g., word-phrase-sentence) plays a c...