Contextual embeddings build multidimensional representations of word tokens based on their context of occurrence. Such models have been shown to achieve state-of-the-art performance on a wide variety of tasks. Yet, the community struggles to understand what kind of semantic knowledge these representations encode. We report a series of experiments aimed at investigating to what extent one such model, BERT, is able to infer the semantic relations that, according to Dowty’s Proto-Roles theory, a verbal argument receives by virtue of its role in the event described by the verb. This hypothesis was put to the test by learning a linear mapping from BERT’s verb embeddings to an interpretable space of semantic properties built from the li...
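The setup this abstract describes, learning a linear map from contextual verb embeddings to an interpretable space of semantic properties, can be illustrated with the minimal sketch below. It is not the paper's code: the property names, the toy sentences, the ridge-regression choice, and the `verb_embedding` helper are assumptions introduced here for illustration only.

```python
# Minimal sketch (not the paper's code): learn a linear map from BERT verb
# embeddings to an interpretable space of semantic-property scores.
# Property names, toy data, and ridge regression are illustrative assumptions.
import numpy as np
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import Ridge

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def verb_embedding(sentence: str, verb: str) -> np.ndarray:
    """Contextual embedding of `verb` in `sentence`: last hidden layer,
    averaged over the verb's word pieces."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]          # (seq_len, 768)
    verb_ids = tokenizer(verb, add_special_tokens=False)["input_ids"]
    tokens = enc["input_ids"][0].tolist()
    # locate the verb's word pieces inside the encoded sentence
    for i in range(len(tokens) - len(verb_ids) + 1):
        if tokens[i:i + len(verb_ids)] == verb_ids:
            return hidden[i:i + len(verb_ids)].mean(dim=0).numpy()
    raise ValueError(f"verb {verb!r} not found in {sentence!r}")

# Hypothetical training items: (sentence, verb, property scores), where each
# score vector rates proto-role-style properties of the verb's argument,
# e.g. [volition, sentience, change_of_state].
train = [
    ("The dog chased the cat.", "chased", [0.9, 0.9, 0.1]),
    ("The vase broke.",         "broke",  [0.0, 0.0, 0.9]),
    ("The child slept.",        "slept",  [0.2, 0.9, 0.0]),
]

X = np.stack([verb_embedding(s, v) for s, v, _ in train])   # (n, 768)
Y = np.array([y for _, _, y in train])                      # (n, n_properties)

# One linear (ridge) map from embedding space to the property space.
probe = Ridge(alpha=1.0).fit(X, Y)
print(probe.predict(verb_embedding("The cat scratched the sofa.", "scratched")[None, :]))
```

With real annotated data in place of the toy items, the quality of the learned map (e.g. correlation between predicted and gold property scores on held-out verbs) is what indicates how much proto-role information the embeddings encode.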
One of the most remarkable properties of word embeddings is the fact that they capture certain types...
Several studies investigated the linguistic information implicitly encoded in Neural Language Models...
We present an event-related potentials (ERP) study that addresses the question of how pieces of info...
Contextualized word embeddings, i.e. vector representations for words in context, are naturally seen...
We probe nouns in BERT contextual embedding space for grammatical role (subject vs. object of a clau...
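A grammatical-role probe of this kind can be sketched as follows, assuming noun embeddings have already been extracted (for instance with a helper like the one above). The random stand-in data, the subject/object labels, and the logistic-regression probe are illustrative assumptions, not the study's setup.

```python
# Hedged sketch of a grammatical-role probe: classify a noun's contextual
# embedding as subject vs. object of its clause. Stand-in data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# X: (n_nouns, 768) contextual embeddings; y: 0 = subject, 1 = object.
# Random placeholders here; in practice they would come from BERT.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 768))
y = rng.integers(0, 2, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("probe accuracy:", probe.score(X_te, y_te))  # chance-level on random stand-ins
```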
The thesis sets out to present the results of three experiments aimed at investigating the encoding of inf...
Distributional semantics represents words as multidimensional vectors recording their statistical di...
Nowadays, contextual language models can solve a wide range of language tasks such as text classific...
Word vector representations play a fundamental role in many NLP applications. ...
The latest work on language representations carefully integrates contextualized features into langua...
In this work, we carry out two experiments in order to assess the ability of BERT to capture the mean...
When performing Polarity Detection for different words in a sentence, we need to look at the words a...
Despite the success of contextualized language models on various NLP tasks, it is still unclear what...