Seq2seq models have been shown to struggle with compositional generalisation, i.e. generalising to new and potentially more complex structures than those seen during training. Taking inspiration from grammar-based models that excel at compositional generalisation, we present a flexible end-to-end differentiable neural model that composes two structural operations: a fertility step, which we introduce in this work, and a reordering step based on previous work (Wang et al., 2021). Our model outperforms seq2seq models by a wide margin on challenging compositional splits of realistic semantic parsing tasks that require generalisation to longer examples. It also compares favourably to other models targeting compositional generalisation.
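The two structural operations named in this abstract can be sketched minimally. The function names, the hard (integer) fertilities, and the explicit permutation below are illustrative assumptions only: the actual model learns soft, end-to-end differentiable versions of both steps rather than the discrete decoding shown here.

```python
# Illustrative sketch (not the paper's implementation): a fertility step
# copies each input token 0..k times, and a reordering step then permutes
# the expanded sequence. The real model relaxes both steps differentiably.

def fertility_step(tokens, fertilities):
    """Repeat each token according to its (assumed integer) fertility >= 0."""
    out = []
    for tok, f in zip(tokens, fertilities):
        out.extend([tok] * f)
    return out

def reordering_step(tokens, permutation):
    """Permute the fertility-expanded sequence by the given index list."""
    return [tokens[i] for i in permutation]

# Hypothetical example: expand the first token twice, drop the second,
# then apply a permutation over the expanded sequence.
expanded = fertility_step(["jump", "twice"], [2, 0])
reordered = reordering_step(expanded, [1, 0])
```

Composing the two steps in this order (expand, then permute) mirrors the abstract's description of the model as a composition of a fertility operation followed by a reordering operation.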
Neural networks have revolutionized language modeling and excelled in various downstream tasks. Howe...
The human ability to understand the world in terms of reusable ``building blocks'' allows us t...
Despite the success of neural models in solving reasoning tasks, their compositional generalization ...
Generic unstructured neural networks have been shown to struggle on out-of-distribution compositiona...
Compositional generalization is a basic mechanism in human language learning, which current neural n...
Compositionality---the principle that the meaning of a complex expression is built from the meanings...
Flexible neural sequence models outperform grammar- and automaton-based counterparts on a variety of...
While recent work has convincingly showed that sequence-to-sequence models struggle to generalize to...
In tasks like semantic parsing, instruction following, and question answering, standard deep network...
Prior work in semantic parsing has shown that conventional seq2seq models fail at compositional gene...
Humans can systematically generalize to novel compositions of existing concepts. Recent studies argu...
To appear in Findings of NAACL 2022. In text-to-SQL tasks -- as in much of NLP -- compositional genera...
We present a neural-symbolic learning model of sentence production which displays strong semantic sy...
Human intelligence exhibits compositional generalization (i.e., the capacity to understand and produ...
In the last decade, deep artificial neural networks have achieved astounding performance in many nat...