Words are time-bound signals and are therefore amenable to temporal processing. The human brain has an innate ability to encode serial events into spatial patterns of neural activity (Beiser & Houk, 1998). Temporal Hebbian SOMs (THSOMs) allow us to take these two observations seriously. They provide a novel computational framework that accounts for many paradigm-based generalizations in a natural and insightful way. This claim is validated on inflectional data from German, English, and Italian.
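The abstract does not spell out the mechanics of a THSOM, but the usual formulation combines a standard Kohonen map with a matrix of temporal connections between map units that is trained by a Hebbian rule: whenever unit j wins immediately after unit i, the link i→j is strengthened, so serial input unfolds as a spatial path of activations over the map. The sketch below is a minimal illustration under that assumption; the map size, learning rates, one-hot letter encoding, and toy training words are illustrative choices, not values taken from the paper.

```python
# Minimal sketch of a Temporal Hebbian SOM (THSOM), assuming the common
# formulation: a Kohonen map whose units are additionally linked by a matrix
# of temporal (Hebbian) weights recording which unit tends to fire after which.
import numpy as np

class THSOM:
    def __init__(self, n_units, dim, alpha=0.3, beta=0.3, sigma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.random((n_units, dim))    # spatial codebook vectors
        self.T = np.zeros((n_units, n_units))  # temporal Hebbian weights
        self.alpha, self.beta, self.sigma = alpha, beta, sigma
        self.prev = None                       # previous winner (BMU)

    def bmu(self, x):
        # Spatial similarity: negative Euclidean distance to the input.
        spatial = -np.linalg.norm(self.W - x, axis=1)
        # Temporal bias: activation flowing from the previous winner.
        temporal = self.T[self.prev] if self.prev is not None else 0.0
        return int(np.argmax(spatial + temporal))

    def train_step(self, x):
        b = self.bmu(x)
        # SOM update: move units toward the input, weighted by a Gaussian
        # neighbourhood kernel centred on the winner (1-D index distance here).
        dist = np.abs(np.arange(len(self.W)) - b)
        h = np.exp(-(dist ** 2) / (2 * self.sigma ** 2))
        self.W += self.alpha * h[:, None] * (x - self.W)
        # Hebbian update of temporal weights: strengthen the link from the
        # previous winner to the current one, decay its other outgoing links.
        if self.prev is not None:
            self.T[self.prev] *= (1 - self.beta)
            self.T[self.prev, b] += self.beta
        self.prev = b
        return b

    def reset_context(self):
        self.prev = None  # call at word boundaries

# Toy usage: expose the map to letter sequences (one-hot encoded words),
# so each word is learned as a temporal chain of map activations.
alphabet = "abcdefghijklmnopqrstuvwxyz"
def one_hot(ch):
    v = np.zeros(len(alphabet)); v[alphabet.index(ch)] = 1.0; return v

net = THSOM(n_units=20, dim=len(alphabet))
for epoch in range(50):
    for word in ["walk", "walked", "walks", "walking"]:
        net.reset_context()
        for ch in word:
            net.train_step(one_hot(ch))
```

Because inflected forms of the same paradigm share prefixes, their activation paths overlap on the map, which is one way to read the claim that THSOMs capture paradigm-based generalizations.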