Several studies have investigated the linguistic information implicitly encoded in Neural Language Models. Most of these works focused on quantifying the amount and type of information available within their internal representations and across their layers. In line with this research, we proposed a different study, based on Lasso regression, aimed at understanding how the information encoded in BERT sentence-level representations is arranged within its hidden units. Using a suite of several probing tasks, we showed that there is a relationship between the implicit knowledge learned by the model and the number of individual units involved in encoding that competence. Moreover, we found that it is possible to identify groups of hidden u...
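A minimal sketch of this kind of Lasso-based probing may help make the setup concrete. It is not the paper's actual pipeline: the mean-pooled sentence embeddings, the toy sentence-length target, the example sentences, and the alpha value are all assumptions for illustration; the key idea is that L1 regularization zeroes out most unit weights, so the surviving coefficients point to the hidden units most involved in encoding the probed property.

```python
# Sketch of Lasso-regression probing over BERT sentence embeddings.
# Assumptions (not from the paper): mean-pooled last-layer states as the
# sentence representation, sentence length as the probed property, and
# alpha=0.1; the actual probing suite and hyperparameters may differ.
import numpy as np
import torch
from sklearn.linear_model import Lasso
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def sentence_embedding(sentence: str) -> np.ndarray:
    """Mean-pool the last-layer token states into one 768-d sentence vector."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0).numpy()

sentences = [
    "The cat sat on the mat.",
    "Colorless green ideas sleep furiously.",
    "She read the book that her brother had recommended yesterday.",
    "Dogs bark.",
]
X = np.stack([sentence_embedding(s) for s in sentences])
y = np.array([len(s.split()) for s in sentences], dtype=float)  # toy target

# The L1 penalty drives most unit weights to exactly zero; the nonzero
# coefficients identify the hidden units most relevant to the property.
probe = Lasso(alpha=0.1).fit(X, y)
relevant_units = np.nonzero(probe.coef_)[0]
print(f"{len(relevant_units)} of {X.shape[1]} units selected:", relevant_units[:10])
```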
Since the first bidirectional deep learning model for natural language understanding, BERT, emerge...
Pretrained language models (PLMs) form the basis of most state-of-the-art NLP technologies. Neverthe...
Over the past decades natural language processing has evolved from a niche research area into a fast...
In this paper we present a comparison between the linguistic knowledge encoded in the internal repre...
Classifiers trained on auxiliary probing tasks are a popular tool to analyze the representations lea...
BERT is a recent language representation model that has surprisingly performed...
The outstanding performance recently reached by Neural Language Models (NLMs) across many Natural La...
The adaptation of pretrained language models to solve supervised tasks has become a baseline in NLP,...
Language models have become nearly ubiquitous in natural language processing applications achieving ...
While most theories regarding the various aspects of human language are couched in the language of d...
In this paper, we present an in-depth investigation of the linguistic knowledge encoded by the trans...
Peeking into the inner workings of BERT has shown that its layers resemble the classical NLP pipelin...
Recent studies (Blevins et al. 2018; Tenney et al. 2019; etc.) have presented evidence that linguisti...
Recent progress in pretraining language models on large textual corpora led to a surge of improvemen...