Connectionist models of sentence processing must learn to behave systematically by generalizing from a small training set. The extent to which recurrent neural networks manage this generalization task is investigated here. In contrast to the findings of Van der Velde et al. (Connection Science, 16, pp. 21-46, 2004), simple recurrent networks are found to show so-called weak combinatorial systematicity, although their performance remains limited. It is argued that these limitations arise from overfitting in large networks. Generalization can be improved by increasing the size of the recurrent layer without training its connections, thereby combining a large short-term memory with a small long-term memory capacity. Performance can be improved further by increasing...
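
The following is a minimal sketch, not the paper's implementation, of the idea described above: a simple recurrent network whose recurrent layer is made large but left untrained, so that only a small readout carries trainable long-term memory while the fixed recurrent dynamics provide short-term memory. The layer sizes, the tanh nonlinearity, the spectral-radius scaling and the ridge-regression readout are all assumptions for illustration.

# Sketch under the assumptions stated above; only the readout weights are trained.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_rec, n_out = 20, 500, 20                   # large, untrained recurrent layer
W_in  = rng.uniform(-0.1, 0.1, (n_rec, n_in))      # fixed input weights
W_rec = rng.uniform(-0.5, 0.5, (n_rec, n_rec))     # fixed recurrent weights
W_rec *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_rec)))  # keep the dynamics stable

def run(inputs):
    """Collect recurrent-layer states for a sequence of input vectors."""
    h = np.zeros(n_rec)
    states = []
    for x in inputs:
        h = np.tanh(W_in @ x + W_rec @ h)          # short-term memory, never trained
        states.append(h.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-4):
    """Train only the output weights (ridge regression), keeping the
    trainable long-term memory small despite the large recurrent layer."""
    S, T = states, targets
    return np.linalg.solve(S.T @ S + ridge * np.eye(n_rec), S.T @ T)

# Tiny usage example with random data, just to show the shapes involved.
X = rng.random((50, n_in))                         # 50 time steps of input
Y = rng.random((50, n_out))                        # e.g. next-word targets
W_out = train_readout(run(X), Y)
pred = run(X) @ W_out                              # predictions, shape (50, n_out)

Because the recurrent and input weights stay fixed, training reduces to a single linear fit on the collected states, which is what makes the large recurrent layer cheap to use.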