This work describes an approach for inferring deterministic context-free (DCF) grammars in a connectionist paradigm using a recurrent neural network pushdown automaton (NNPDA). The NNPDA consists of a recurrent neural network coupled to an external stack memory through a common error function. We show that the NNPDA is able to learn the dynamics of an underlying pushdown automaton from examples of grammatical and non-grammatical strings. The network learns not only the state transitions of the automaton but also the actions required to control the stack. In order to use continuous optimization methods, we develop an analog stack that reverts to a discrete stack by quantization of all activations after the network has learned ...
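The analog stack described above can be illustrated with a small sketch. The class below is an illustrative assumption, not the paper's implementation: the network emits a continuous action strength, where a positive value pushes the current symbol with that strength, a negative value pops that much total strength off the top, and quantizing the action to discrete values after training recovers an ordinary stack.

```python
# Sketch of a continuous ("analog") stack of the kind an NNPDA controls.
# The class name, update rule, and read-out are hypothetical; they only
# demonstrate the continuous push/pop idea.

class AnalogStack:
    def __init__(self):
        self.items = []  # list of (symbol, strength) pairs, top at the end

    def update(self, action, symbol):
        """action > 0: push `symbol` with strength `action`;
           action < 0: pop |action| total strength off the top."""
        if action > 0:
            self.items.append((symbol, action))
        elif action < 0:
            remaining = -action
            while self.items and remaining > 0:
                sym, s = self.items[-1]
                if s <= remaining:              # consume the whole element
                    self.items.pop()
                    remaining -= s
                else:                           # shave part of its strength
                    self.items[-1] = (sym, s - remaining)
                    remaining = 0.0

    def top(self, depth=1.0):
        """Weighted read of the symbols within `depth` of the top."""
        read, remaining = {}, depth
        for sym, s in reversed(self.items):
            if remaining <= 0:
                break
            w = min(s, remaining)
            read[sym] = read.get(sym, 0.0) + w
            remaining -= w
        return read
```

Because the push/pop strength is a continuous variable, the stack contents vary smoothly with the network's outputs, which is what makes gradient-based training possible; rounding the action to the nearest of {-1, 0, 1} afterwards yields a conventional discrete pushdown automaton.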
Continuous-valued recurrent neural networks can learn mechanisms for processing context-free languag...
Neural network learning of context free languages has been applied only to very simple languages and...
The objective of this thesis is twofold. Firstly, we want to study the potential of recurrent neural...
In order for neural networks to learn complex languages or grammars, they must have sufficient compu...
This paper develops a new model, a neural network pushdown automaton (NNPDA), which is a hybrid syst...
This paper presents (i) an active learning algorithm for visibly pushdown gram...
We describe a novel neural architecture for learning deterministic context-free grammars, or equival...
Although considerable interest has been shown in language inference and automata induction using rec...
This paper examines the inductive inference of a complex grammar with neural networks, specifically, ...