When classifying texts using a linear classifier, the texts are commonly represented as feature vectors. Previous methods of representing features as vectors have been unable to capture the context of individual words in the texts, in theory leading to a poor representation of natural language. Bidirectional Encoder Representations from Transformers (BERT) uses a multi-headed self-attention mechanism to create deep bidirectional feature representations, capable of modeling the full context of every word in a sequence. A BERT model uses a transfer learning approach: it is pre-trained on a large amount of data and can then be fine-tuned for several downstream tasks. This thesis uses one multilingual, and two dedicated Swedish BERT models, f...
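The self-attention mechanism the abstract above refers to can be sketched in a few lines. This is a minimal toy illustration of scaled dot-product self-attention with hypothetical dimensions, not BERT's actual multi-headed configuration: every token's output vector is a weighted mix of all tokens in the sequence, which is what makes the representations bidirectional and context-aware.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model) token embeddings; Wq/Wk/Wv: learned projections."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (seq_len, seq_len) affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the sequence
    return weights @ V                                # context-mixed representations

# Toy setup (random weights stand in for trained parameters)
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.standard_normal((seq_len, d_model))
Wq, Wk, Wv = (rng.standard_normal((d_model, d_k)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # each of the 4 token vectors now depends on the whole sequence
```

In a real BERT encoder this computation is repeated across several heads and layers, with trained projection matrices instead of random ones.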
This thesis considers sentiment polarity analysis in Swedish. Despite being the most widely spoken ...
Bidirectional Encoder Representations from Transformers (BERT) is a recently proposed language repre...
Formality Style Transfer (FST) is the task of automatically transforming a piece of text from one le...
This thesis investigates how pre-trained contextualized language models can be adapted for multi-lab...
Assigning categories to text communications is a common task of Natural Language Processing (NLP). I...
Nowadays, contextual language models can solve a wide range of language tasks such as text classific...
This thesis is a proof-of-concept for embedding Swedish documents using continuous vectors. These ve...
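Document embeddings like those described above are typically compared by cosine similarity. A minimal sketch, using made-up toy vectors rather than the thesis's actual embedding method:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = same direction)."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical document vectors for illustration only
doc_a = np.array([0.2, 0.8, 0.1])
doc_b = np.array([0.25, 0.75, 0.0])
doc_c = np.array([-0.9, 0.1, 0.4])

# doc_a is far closer in direction to doc_b than to doc_c
print(cosine_similarity(doc_a, doc_b) > cosine_similarity(doc_a, doc_c))  # True
```

Cosine similarity is preferred over raw dot products here because it ignores vector length, so long and short documents with similar content score alike.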
Language model pre-training architectures have been shown to be useful for learning language represent...
In today’s digital world, more and more emails and messages must be sent, processed and hand...
Relying on large pretrained language models such as Bidirectional Encoder Representations from Trans...
This degree project examines and evaluates the performance of various ways of improving contextualiz...
This thesis explores the possibility to extend monolingual and bilingual text classifiers to multipl...
Document classification or categorization with algorithms is a well-known problem in information sci...
Technology has come to dominate a large part of human life, and technology users use language conti...