Generic unstructured neural networks have been shown to struggle on out-of-distribution compositional generalization. Compositional data augmentation via example recombination has transferred some prior knowledge about compositionality to such black-box neural models for several semantic parsing tasks, but this often required task-specific engineering or provided limited gains. We present a more powerful data recombination method using a model called Compositional Structure Learner (CSL). CSL is a generative model with a quasi-synchronous context-free grammar backbone, which we induce from the training data. We sample recombined examples from CSL and add them to the fine-tuning data of a pre-trained sequence-to-sequence model (T5). This p...
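The pipeline above (induce a synchronous grammar, sample aligned source/target pairs from it, and add them as augmentation data) can be sketched with a toy example. The grammar, rule names, and symbols below are illustrative assumptions, not the actual CSL grammar or induction procedure; this only shows the sampling-for-recombination idea.

```python
import random

# Toy synchronous grammar: each nonterminal maps to a list of aligned
# (source, target) rule pairs. These rules are made up for illustration.
GRAMMAR = {
    "S": [("VERB the OBJ", "VERB ( OBJ )")],
    "VERB": [("find", "FIND"), ("count", "COUNT")],
    "OBJ": [("red cube", "CUBE[red]"), ("blue ball", "BALL[blue]")],
}

def sample(symbol, rng):
    """Recursively expand a nonterminal, keeping source and target aligned.

    Assumes each nonterminal occurs at most once per rule (true for the
    toy grammar above), so a simple dict suffices to share expansions
    between the source and target sides.
    """
    if symbol not in GRAMMAR:
        return symbol, symbol  # terminal token: identical on both sides
    src_rule, tgt_rule = rng.choice(GRAMMAR[symbol])
    expansions = {}
    src_out = []
    for tok in src_rule.split():
        if tok in GRAMMAR:
            expansions[tok] = sample(tok, rng)
            src_out.append(expansions[tok][0])
        else:
            src_out.append(tok)
    tgt_out = [expansions[tok][1] if tok in expansions else tok
               for tok in tgt_rule.split()]
    return " ".join(src_out), " ".join(tgt_out)

# Sample a few recombined (utterance, logical form) pairs; in the paper's
# setting, pairs like these would be added to T5's fine-tuning data.
rng = random.Random(0)
augmented = [sample("S", rng) for _ in range(3)]
for src, tgt in augmented:
    print(src, "->", tgt)
```

Because the two sides of each rule share nonterminals, every sampled utterance comes with a structurally aligned target, which is what makes grammar-backed recombination safer than free-form augmentation.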
In the last decade, deep artificial neural networks have achieved astounding performance in many nat...
The human ability to understand the world in terms of reusable ``building blocks'' allows us t...
Neural networks drive the success of natural language processing. A fundamental property of language...
In tasks like semantic parsing, instruction following, and question answering, standard deep network...
Compositionality---the principle that the meaning of a complex expression is built from the meanings...
Compositional generalization is a basic mechanism in human language learning, which current neural n...
Humans can systematically generalize to novel compositions of existing concepts. Recent studies argu...
Seq2seq models have been shown to struggle with compositional generalisation, i.e. generalising to n...
Neural networks have revolutionized language modeling and excelled in various downstream tasks. Howe...
Compositional generalisation (CG), in NLP and in machine learning more generally, has been assessed ...
When writing programs, people have the ability to tackle a new complex task by decomposing it into s...
While recent work has convincingly showed that sequence-to-sequence models struggle to generalize to...
Flexible neural sequence models outperform grammar- and automaton-based counterparts on a variety of...
To appear in Findings of NAACL 2022. In text-to-SQL tasks -- as in much of NLP -- compositional genera...
Systematic generalization is the ability to combine known parts into novel meaning; an important asp...