In this thesis, we explore the impact of M-BERT and of different transfer sizes on the choice of transfer languages in dependency parsing. To investigate our research questions, we conduct a series of experiments on Universal Dependencies treebanks with UUParser. The main conclusions and contributions of this study are as follows. First, we train a variety of languages in several different scripts with M-BERT, a state-of-the-art deep learning model based on the Transformer architecture, added into the parsing framework. In general, we obtain improved results with M-BERT compared with the randomly initialized embeddings in UUParser. Second, since it is a common way to choose a source ...
Thesis (Master's)--University of Washington, 2014Dependency parsing is an important natural language...
We study multi-source transfer parsing for resource-poor target languages; specifically methods for ...
As the interest of the NLP community grows to develop several treebanks also for languages other tha...
Current methods of cross-lingual parser transfer focus on predicting the best parser for a low-resou...
The field of natural language processing with machine learning has gone through a quantum leap in th...
Universal Dependency (UD) annotations, despite their usefulness for cross-lingual tasks and semantic...
This work presents two experiments with the goal of replicating the transferability of dependency pa...
We investigate whether off-the-shelf deep bidirectional sentence representations (Devlin et al., 201...
We present a study that compares data-driven dependency parsers obtained by means of annotation proj...
The growing work in multi-lingual parsing faces the challenge of fair comparative evaluation and per...
For many (minority) languages, the resources needed to train large models are not available. We inve...
Many downstream applications are using dependency trees, and are thus relying on dependency parsers p...
This article presents a comparative analysis of dependency parsing results for a set of 16 languages...
Multilingual dependency parsing encapsulates any attempt to parse multiple languages. It can involve...