In recent years it has been shown that first-order recurrent neural networks trained by gradient descent can learn not only regular but also simple context-free and context-sensitive languages. However, the success rate was generally low and severe instability issues were encountered. The present study examines the hypothesis that a combination of evolutionary hill climbing with incremental learning and a well-balanced training set enables first-order recurrent networks to reliably learn context-free and mildly context-sensitive languages. In particular, we trained the networks to predict symbols in string sequences of the context-sensitive language {aⁿbⁿcⁿ; n≥1}. Comparative experiments with and without incremental learning indicated that ...
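The prediction task described in this abstract can be made concrete with a short sketch: for strings of {aⁿbⁿcⁿ}, the next symbol is ambiguous while reading the a's, but fully determined once the first b appears. The sketch below (function names are illustrative, not from the paper) generates such strings and the admissible next-symbol targets, plus a short-strings-first ordering of the kind an incremental training set would use.

```python
def anbncn(n):
    """Return the string a^n b^n c^n for a given n >= 1."""
    return "a" * n + "b" * n + "c" * n

def prediction_targets(s):
    """For each position in s, the set of symbols a correct predictor
    may output next. While reading a's the continuation is ambiguous
    ('a' or 'b'); from the first 'b' onward, the remainder of
    a^n b^n c^n is fully determined."""
    targets = []
    for i in range(len(s) - 1):
        if s[i] == "a":
            targets.append({"a", "b"})   # still inside the a-block: ambiguous
        else:
            targets.append({s[i + 1]})   # b/c region: deterministic continuation
    return targets

# An incremental training set orders examples by depth: short strings first.
train = [anbncn(n) for n in range(1, 6)]
```

For example, `prediction_targets("aabbcc")` yields `[{"a","b"}, {"a","b"}, {"b"}, {"c"}, {"c"}]`: only the deterministic tail (from the first b) can be scored exactly, which is why such studies typically evaluate prediction accuracy on that portion.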
As potential candidates for explaining human cognition, connectionist models of sentence processing ...
Recent theories suggest that language acquisition is assisted by the evolution of languages towards ...
Simple second-order recurrent networks are shown to readily learn small known regular grammars when...
Recurrent neural network processing of regular languages is reasonably well understood. Recent work ...
Continuous-valued recurrent neural networks can learn mechanisms for processing context-free languag...
We address the problem of processing a context-sensitive language with a recurrent neural network (R...
Recurrent neural networks are capable of learning context-free tasks, however learning performance i...
The long short-term memory (LSTM) is not the only neural network which learns a context sensitive la...
Recurrent neural networks (RNNs) are capable of learning languages. However, the learning performanc...
This work describes an approach for inferring Deterministic Context-free (DCF) Grammars in a Connect...
A number of experiments have demonstrated what seems to be a bias in human phonological learning for...
Two classes of recurrent neural network models are compared in this report, simple recurrent network...