We study multi-source transfer parsing for resource-poor target languages; specifically methods for target language adaptation of delexicalized discriminative graph-based dependency parsers. We first show how recent insights on selective parameter sharing, based on typological and language-family features, can be applied to a discriminative parser by carefully decomposing its model features. We then show how the parser can be relexicalized and adapted using unlabeled target language data and a learning method that can incorporate diverse knowledge sources through ambiguous labelings. In the latter scenario, we exploit two sources of knowledge: arc marginals derived from the base parser in a self-training algorithm, and arc predictions from ...
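The abstract above describes adapting a delexicalized base parser with unlabeled target-language data by treating arc marginals as one source of ambiguous labelings in a self-training loop. As a rough illustration only, the Python sketch below shows one way such marginals could be turned into ambiguous arc sets for a learner that accepts ambiguous labelings; the function names, threshold value, and fallback logic are assumptions made for the example, not the paper's actual procedure.

```python
# Minimal sketch (not the authors' implementation): turning base-parser arc
# marginals into ambiguous labelings for self-training, as described in the
# abstract above. All names and the threshold value are illustrative assumptions.

from typing import Dict, List, Set, Tuple

Arc = Tuple[int, int]  # (head index, dependent index); 0 is the artificial root


def ambiguous_arcs(marginals: Dict[Arc, float], threshold: float = 0.3) -> Set[Arc]:
    """Keep every candidate arc whose marginal probability under the base
    parser exceeds the threshold; the result is an ambiguous labeling rather
    than a single committed tree."""
    return {arc for arc, p in marginals.items() if p >= threshold}


def self_training_step(sentences: List[List[str]],
                       marginals_per_sentence: List[Dict[Arc, float]],
                       threshold: float = 0.3) -> List[Set[Arc]]:
    """One round of target-language adaptation: convert base-parser marginals
    on unlabeled target sentences into ambiguous arc sets that a learner
    supporting ambiguous labelings could be trained against."""
    constraints = []
    for sent, marg in zip(sentences, marginals_per_sentence):
        allowed = ambiguous_arcs(marg, threshold)
        # Fall back to all candidate heads if the threshold removed every
        # incoming arc for some token, so the sentence stays parseable.
        for dep in range(1, len(sent) + 1):
            if not any(d == dep for _, d in allowed):
                allowed |= {(h, d) for h, d in marg if d == dep}
        constraints.append(allowed)
    return constraints


if __name__ == "__main__":
    # Toy example: a two-word target-language sentence with made-up marginals.
    sent = ["dog", "barks"]
    marg = {(0, 1): 0.15, (2, 1): 0.85, (0, 2): 0.90, (1, 2): 0.10}
    print(self_training_step([sent], [marg]))
```

In this sketch the threshold trades off noise against coverage: a lower value keeps more of the base parser's uncertainty in the ambiguous set, which is the point of training on ambiguous labelings rather than on single predicted trees.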
Although multilingual pretrained models (mPLMs) enabled support of various natural language processi...
We show how we can adapt parsing to low-resource domains by combining treebanks across languages for...
Thesis (Master's)--University of Washington, 2014. Dependency parsing is an important natural language...
Cross-lingual model transfer has been a promising approach for inducing dependency parsers for low-r...
Current methods of cross-lingual parser transfer focus on predicting the best parser for a low-resou...
We present a novel method for the cross-lingual transfer of dependency parsers. Our goal is to induc...
This paper studies cross-lingual transfer for dependency parsing, focusing on ...
Recent advances in multilingual language modeling have brought the idea of a truly universal parser ...
This work presents two experiments with the goal of replicating the transferability of dependency pa...
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Comp...
Adapter modules have emerged as a general parameter-efficient means to specialize a pretrained encod...
Cross-lingual transfer has been shown effective for dependency parsing of some low-resource language...
In this thesis, we explore the impact of M-BERT and different transfer sizes on the choice of differ...
We present a study that compares data-driven dependency parsers obtained by means of annotation proj...