Unsupervised word embedding methods are frequently used in natural language processing applications. However, these unsupervised methods overlook known lexical relations that could help capture more accurate semantic relations between words. This thesis explores whether Swedish word embeddings can benefit from prior linguistic knowledge. Four knowledge graphs extracted from Svenska Akademiens ordlista (SAOL) are incorporated during training using the Probabilistic Word Embeddings with Laplacian Priors (PELP) model. The four implemented PELP models are compared with baseline results to evaluate the use of side information. The results suggest that several lexical relations in SAOL are of interest for generating more accurate Swedish word embeddings.
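The abstract does not spell out how a Laplacian prior acts on the embeddings, so the following is a minimal sketch of how a graph-Laplacian penalty over a lexical knowledge graph is commonly formed and differentiated. The function names, the toy synonym graph, and the plain gradient step are illustrative assumptions, not the thesis's actual PELP implementation.

    import numpy as np

    def graph_laplacian(A):
        """Unnormalized Laplacian L = D - A of a symmetric adjacency matrix."""
        return np.diag(A.sum(axis=1)) - A

    def laplacian_penalty(W, L, lam):
        """lam * tr(W^T L W): small when graph-adjacent words have similar vectors."""
        return lam * np.trace(W.T @ L @ W)

    def laplacian_grad(W, L, lam):
        """Gradient of the penalty with respect to the embedding matrix W."""
        return 2.0 * lam * L @ W

    # Toy example: 4 words, 3-dimensional embeddings, one synonym edge (0, 1)
    # taken from a lexical knowledge graph such as one extracted from SAOL.
    rng = np.random.default_rng(0)
    W = rng.normal(size=(4, 3))
    A = np.zeros((4, 4))
    A[0, 1] = A[1, 0] = 1.0
    L = graph_laplacian(A)

    # One gradient step on the prior alone pulls words 0 and 1 together;
    # in practice this term is added to the corpus-based embedding objective.
    lam, lr = 0.1, 0.5
    W -= lr * laplacian_grad(W, L, lam)

In a probabilistic formulation such as PELP, a quadratic penalty of this kind corresponds to placing a Gaussian Markov random field prior, defined by the graph Laplacian, over the word vectors, so graph-linked words are encouraged toward nearby embeddings.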
Semantic vector models are a powerful technique in which the meaning of words can be represented by vectors, which ...
Identifying word classes is an important part of describing a language. Research about sign language...
Distributed word representations (word embeddings) have recently contributed to competitive performa...
Unsupervised word embedding methods are frequently used for natural language processing applications...
Word vectors, embeddings of words into a low-dimensional space, have been shown to be useful for a l...
In the 21st century, textual data has become abundant, large and easily available, which makes quant...
We apply real-valued word vectors combined with two different types of classif...
In this thesis, we aim to explore the combination of different lexical normalization methods and pro...
In this thesis, we investigate how natural language processing (NLP) tools and techniques can be app...
The concept of knowledge is unique to human beings, owing to the faculty of understanding. T...
Educational content recommendation is a cornerstone of AI-enhanced learning. In particular, to facil...
Incorporating external information during a learning process is expected to improve its efficiency. ...
In recent years it has become clear that data is a new source of power and wealth. The compani...
Various studies have shown the usefulness of word embedding models for a wide variety of natural lan...
This thesis investigates Swedish lexical blends. A lexical blend is defined as the concatenation of ...