Funding Information: We thank Yonatan Bisk for his valuable feedback and suggestions on this work. We also acknowledge the computational resources provided by the Aalto Science-IT project and the support within the Academy of Finland Flagship programme: Finnish Center for Artificial Intelligence (FCAI).

Publisher Copyright: © 2022 Association for Computational Linguistics.

Abstract: We provide a study of how induced model sparsity can help achieve compositional generalization and better sample efficiency in grounded language learning problems. We consider simple language-conditioned navigation problems in a grid world environment with disentangled observations. We show that standard neural architectures do not always yield compositional generalization...
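To make the setup concrete, the sketch below illustrates one way sparsity can be induced in a language-conditioned model over disentangled observations: an instruction embedding produces selection weights over the attribute channels of each grid cell, and an L1 penalty pushes most of those weights toward zero. This is a minimal illustrative example, not the paper's actual architecture; the module name, dimensions, and the choice of a sigmoid gate with an L1 term are assumptions made here for illustration.

```python
import torch
import torch.nn as nn


class SparseAttributeSelector(nn.Module):
    """Toy language-conditioned selector over disentangled attribute channels.

    The instruction embedding is mapped to soft selection weights over the
    attribute channels of each grid cell; an L1 penalty on those weights
    encourages the model to keep only the few attributes the instruction
    actually refers to (hypothetical sketch, not the paper's method).
    """

    def __init__(self, instr_dim: int, num_attributes: int):
        super().__init__()
        self.to_weights = nn.Linear(instr_dim, num_attributes)

    def forward(self, instruction: torch.Tensor, grid: torch.Tensor):
        # instruction: (batch, instr_dim) embedding of the command
        # grid: (batch, cells, num_attributes) disentangled observation
        weights = torch.sigmoid(self.to_weights(instruction))   # (batch, num_attributes)
        selected = grid * weights.unsqueeze(1)                   # gate attributes per cell
        sparsity_penalty = weights.abs().mean()                  # L1 term inducing sparsity
        return selected, sparsity_penalty


# Usage: add the sparsity penalty to the task loss with a small coefficient.
model = SparseAttributeSelector(instr_dim=32, num_attributes=8)
instruction = torch.randn(4, 32)
grid = torch.randn(4, 36, 8)            # e.g. a 6x6 grid flattened to 36 cells
selected, penalty = model(instruction, grid)
task_loss = selected.pow(2).mean()      # stand-in for the real navigation objective
loss = task_loss + 1e-2 * penalty
loss.backward()
```

The intuition, under these assumptions, is that a sparse selection over disentangled attributes forces the model to bind each word in the instruction to a small set of observation channels, which is one plausible route to recombining known attributes in novel instructions.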