Several popular Transformer-based language models have been found to be successful for text-driven brain encoding. However, existing literature leverages only pretrained text Transformer models and has not explored the efficacy of task-specific learned Transformer representations. In this work, we explore transfer learning from representations learned for ten popular natural language processing tasks (two syntactic and eight semantic) for predicting brain responses from two diverse datasets: Pereira (subjects reading sentences from paragraphs) and Narratives (subjects listening to spoken stories). Encoding models based on task features are used to predict activity in different regions across the whole brain. Featur...
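The encoding setup described above is the standard voxel-wise linear model used throughout this literature: stimulus features from a (task-tuned) Transformer are regressed onto fMRI responses, and prediction quality is scored per voxel. As a minimal sketch, not the paper's actual pipeline, the following assumes feature matrix X and response matrix Y aligned per stimulus; every variable name and dimension is illustrative.

```python
# Minimal voxel-wise encoding model: cross-validated ridge regression
# from task-specific Transformer features (X) to fMRI responses (Y).
# All shapes and names below are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.standard_normal((600, 768))   # e.g. 600 stimuli x 768-dim task features
Y = rng.standard_normal((600, 1000))  # e.g. 600 stimuli x 1000 voxels

# Hold out stimuli (sentences or story segments) for evaluation.
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)

# One ridge regression per voxel, with the regularization strength
# selected by cross-validation (a common choice in encoding studies).
model = RidgeCV(alphas=np.logspace(-1, 4, 10))
model.fit(X_tr, Y_tr)
Y_pred = model.predict(X_te)

# Performance is typically the Pearson correlation between predicted
# and observed responses, computed independently for each voxel.
r = np.array([np.corrcoef(Y_te[:, v], Y_pred[:, v])[0, 1]
              for v in range(Y.shape[1])])
print(f"mean voxel-wise correlation: {r.mean():.3f}")
```

The per-voxel correlations from such a model are what get mapped onto brain regions when comparing how well different task representations predict activity across the whole brain.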
Several major innovations in artificial intelligence (AI) (e.g. convolutional neural networks, exper...
In contextually rich language comprehension settings listeners can rel...
How is information organized in the brain during natural reading? Where and when do the required pro...
Considerable progress has recently been made in natural language processing: d...
Deep learning algorithms trained to predict masked words from large amounts of ...
The neuroscience of perception has recently been revolutionized with an integrative modeling approac...
Neural Language Models (NLMs) have made tremendous advances during the last years, achieving impress...
Much research in cognitive neuroscience supports prediction as a canonical computation of cognition ...
The aim of the study was to test the cross-language generative capability of a model that predicts n...
Deep learning (DL) approaches may also inform the analysis of human brain activity. Here, a state-of...
Deep language algorithms, like GPT-2, have demonstrated remarkable abilities t...
Decoding human brain activities based on linguistic representations has been actively studied in rec...
Linking computational natural language processing (NLP) models and neural responses to language in t...