We analyze the Knowledge Neurons framework, which attributes factual and relational knowledge to particular neurons in a transformer network. Our experiments use a 12-layer multilingual BERT model. The study reveals several interesting phenomena. We observe that factual knowledge can mostly be attributed to the middle and higher layers of the network (layers $\ge 6$). Further analysis reveals that the middle layers ($6-9$) are chiefly responsible for relational information, which is then refined into the actual factual knowledge, the "correct answer", in the last few layers ($10-12$). Our experiments also show that the model handles prompts in different languages that express the same fact similarly, providing further evidence for eff...
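For concreteness, below is a minimal sketch of the kind of attribution the Knowledge Neurons framework performs: integrated gradients over the intermediate FFN activations of one encoder layer, taken with respect to the probability of the correct answer at the [MASK] position. This is an illustration under stated assumptions, not the paper's implementation: the checkpoint name, the example prompt, the zero-activation baseline, the `n_steps` value, and the helper `attribute_layer` are all our own illustrative choices.

```python
# Hedged sketch of knowledge-neuron attribution via integrated gradients
# over BERT FFN intermediate activations. Assumptions (not from the paper):
# the mBERT checkpoint, the prompt/answer pair, n_steps, and a zero baseline.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
model = BertForMaskedLM.from_pretrained("bert-base-multilingual-cased")
model.eval()

prompt = "The capital of France is [MASK]."
answer = "Paris"  # illustrative fact; assumed to be a single vocab token
inputs = tokenizer(prompt, return_tensors="pt")
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero().item()
answer_id = tokenizer.convert_tokens_to_ids(answer)

def attribute_layer(layer_idx: int, n_steps: int = 20) -> torch.Tensor:
    """Integrated-gradients attribution for the FFN intermediate neurons
    of one encoder layer, w.r.t. the probability of the correct answer."""
    ffn = model.bert.encoder.layer[layer_idx].intermediate
    saved = {}

    # 1) Record the actual activation of the FFN intermediate layer.
    def record_hook(module, inp, out):
        saved["act"] = out.detach()
    handle = ffn.register_forward_hook(record_hook)
    with torch.no_grad():
        model(**inputs)
    handle.remove()

    # 2) Re-run the model with the activation scaled by alpha in (0, 1]
    #    (zero baseline), accumulating gradients of the answer probability
    #    w.r.t. the intermediate neurons at the masked position.
    grads = torch.zeros_like(saved["act"][0, mask_pos])
    for step in range(1, n_steps + 1):
        alpha = step / n_steps

        def scale_hook(module, inp, out):
            scaled = out * alpha
            scaled.retain_grad()
            saved["scaled"] = scaled
            return scaled  # replaces the module output for this pass

        handle = ffn.register_forward_hook(scale_hook)
        logits = model(**inputs).logits
        prob = torch.softmax(logits[0, mask_pos], dim=-1)[answer_id]
        model.zero_grad()
        prob.backward()
        handle.remove()
        grads += saved["scaled"].grad[0, mask_pos]

    # Riemann approximation of the path integral: activation * mean gradient.
    return saved["act"][0, mask_pos] * grads / n_steps

scores = attribute_layer(layer_idx=9)  # a middle/higher layer, per the findings
print(scores.topk(5))  # top-scoring candidate "knowledge neurons"
```

Thresholding these per-layer scores across many prompts is what lets one ask where high-attribution neurons concentrate; the finding above is that they cluster in the middle and higher layers.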
In this paper, we present an in-depth investigation of the linguistic knowledge encoded by the trans...
This chapter presents an overview of the state of the art in natural language processing, exploring ...
The transformers that drive chatbots and other AI systems constitute large language models (LLMs). T...
Pretrained transformer-based language models achieve state-of-the-art performance in many NLP tasks,...
Despite being designed for performance rather than cognitive plausibility, transformer language mode...
Language Generation Models produce words based on the previous context. Although existing methods of...
In Neural Machine Translation (and, more generally, conditional language modeling), the generation o...
Combining structured information with language models is a standing problem in NLP. Building on prev...
Transformer based language models exhibit intelligent behaviors such as understanding natural langua...
Recent results achieved by statistical approaches involving Deep Neural Learning architectures sugge...
Since language models are used to model a wide variety of languages, it is natural to ask whether th...
Pre-trained transformer is a class of neural networks behind many recent natural language processing...
In the last decade, the size of deep neural architectures implied in Natural Language Processing (NL...
We propose a synthetic task, LEGO (Learning Equality and Group Operations), that encapsulates the pr...
In Neural Machine Translation (NMT), each token prediction is conditioned on the source sentence and...