Computational language models (LMs), most notably exemplified by the widespread success of OpenAI's ChatGPT chatbot, show impressive performance on a wide range of linguistic tasks, providing cognitive science and linguistics with a computational working model for empirically studying different aspects of human language. Here, we use LMs to test the hypothesis that languages with more speakers tend to be easier to learn. In two experiments, we train several LMs, ranging from very simple n-gram models to state-of-the-art deep neural networks, on written cross-linguistic corpus data covering 1293 different languages and statistically estimate learning difficulty. Using a variety of quantitative methods and machine learning techniques to account ...
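The core quantity in such experiments can be illustrated with a minimal sketch: assuming, as is common in this line of work, that learning difficulty is proxied by a model's held-out cross-entropy, even a character n-gram model with add-one smoothing yields a bits-per-character score that can be computed for any language. The function below is a simplified illustration of that idea, not the authors' actual pipeline.

```python
import math
from collections import Counter

def bits_per_char(train_text: str, test_text: str, n: int = 3) -> float:
    """Held-out cross-entropy (bits/char) of a character n-gram model
    with add-one (Laplace) smoothing; lower = easier to predict.
    A toy proxy for 'learning difficulty', not the paper's exact method."""
    vocab = set(train_text) | set(test_text)
    ngrams = Counter(train_text[i:i + n] for i in range(len(train_text) - n + 1))
    contexts = Counter(train_text[i:i + n - 1] for i in range(len(train_text) - n + 2))
    nll_bits = 0.0
    for i in range(n - 1, len(test_text)):
        gram = test_text[i - n + 1:i + 1]
        p = (ngrams[gram] + 1) / (contexts[gram[:-1]] + len(vocab))
        nll_bits -= math.log2(p)
    return nll_bits / (len(test_text) - n + 1)
```

Trained and evaluated on equal-sized samples per language, such scores give one number per language that can then be related to speaker population.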
Scaling existing applications and solutions to multiple human languages has traditionally proven to ...
The nature and amount of information needed for learning a natural language, and the underlying mech...
Languages with many speakers tend to be structurally simple while small communities sometimes develo...
Currently, N-gram models are the most common and widely used models for statistical language modelin...
Cross-linguistic differences in morphological complexity could have important consequences for langu...
According to the Language Familiarity Effect (LFE), people are better at discriminating between spea...
In recent years neural language models (LMs) have set state-of-the-art performance for several bench...
How cross-linguistically applicable are NLP models, specifically language models? A fair comparison ...
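One standard answer to this comparability problem, sketched below under the assumption that token-level perplexities are confounded by language-specific segmentation, is to renormalize the model's total negative log-likelihood by a tokenizer-independent unit such as UTF-8 bytes. The helper name is illustrative, not taken from any specific paper.

```python
import math

def ppl_to_bits_per_byte(ppl: float, n_tokens: int, text: str) -> float:
    """Convert token-level perplexity into bits per UTF-8 byte.
    Total NLL in nats is n_tokens * ln(ppl); dividing by ln(2) gives bits,
    and dividing by the byte count removes the tokenizer dependence."""
    total_nll_bits = n_tokens * math.log(ppl) / math.log(2)
    return total_nll_bits / len(text.encode("utf-8"))
```

Two models with different tokenizers, or two languages with different scripts, can then be scored on the same scale.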
Large-scale empirical evidence indicates a fascinating statistical relationship between the estimate...
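How such a relationship is typically quantified can be shown with a toy example: rank-correlate per-language difficulty estimates against log speaker counts. All numbers below are hypothetical placeholders; real studies additionally control for corpus size, script, and phylogenetic relatedness.

```python
import math
from scipy.stats import spearmanr

# Hypothetical difficulty estimates (bits/char, as in the sketch above)
# and rough speaker populations; values are purely illustrative.
difficulty = {"fin": 2.10, "deu": 1.92, "vie": 1.78, "eng": 1.85}
speakers = {"fin": 5.5e6, "deu": 1.3e8, "vie": 8.5e7, "eng": 1.4e9}

langs = sorted(difficulty)
rho, p = spearmanr([difficulty[l] for l in langs],
                   [math.log10(speakers[l]) for l in langs])
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```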
One of the fundamental questions about human language is whether all languages are equally complex. ...
The purpose of this study is to examine how a computer model learns a second language with a differe...