In this paper, we present an in-depth investigation of the linguistic knowledge encoded by the Transformer models currently available for the Italian language. In particular, we investigate how the complexity of two different probing model architectures affects the Transformers' performance in encoding a wide spectrum of linguistic features. Moreover, we explore how this implicit knowledge varies across textual genres and language varieties.
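To make the probing approach concrete, the sketch below trains diagnostic models of different complexity on sentence representations extracted from a pretrained Italian Transformer. It is a minimal illustration, not the paper's actual experimental code: the checkpoint (dbmdz/bert-base-italian-cased), the toy sentences, the target feature (sentence length), and the probe hyperparameters are all assumptions made for the example.

```python
# Minimal sketch of a probing setup, not the paper's exact pipeline.
# Assumptions: the Italian BERT checkpoint, the toy sentences, the target
# feature (sentence length), and the probe hyperparameters are illustrative.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

sentences = [
    "Il gatto dorme sul divano.",
    "Domani andremo tutti insieme al mare.",
    "Piove.",
    "Il libro che mi hai prestato è molto interessante.",
]
y = [len(s.split()) for s in sentences]  # toy linguistic feature: sentence length

tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-italian-cased")
model = AutoModel.from_pretrained("dbmdz/bert-base-italian-cased")
model.eval()

with torch.no_grad():
    enc = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    hidden = model(**enc).last_hidden_state  # (batch, tokens, hidden_dim)
    X = hidden.mean(dim=1).numpy()           # mean-pooled sentence vectors (padding included, for brevity)

# Two probing architectures of different complexity: a linear model and a small MLP.
probes = {
    "linear": LinearRegression(),
    "mlp": MLPRegressor(hidden_layer_sizes=(100,), max_iter=2000, random_state=0),
}
for name, probe in probes.items():
    probe.fit(X, y)
    print(f"{name} probe R^2 on toy data: {probe.score(X, y):.3f}")
```

Under this framing, a gap between the scores of the simpler and the more expressive probe would indicate how readily the feature can be read out of the representations, which is the kind of comparison the two probing architectures are meant to support.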