Recent advances in multilingual language modeling have brought the idea of a truly universal parser closer to reality. However, such models are still not immune to the “curse of multilinguality”: Cross-language interference and restrained model capacity remain major obstacles. To address this, we propose a novel language adaptation approach by introducing contextual language adapters to a multilingual parser. Contextual language adapters make it possible to learn adapters via language embeddings while sharing model parameters across languages based on contextual parameter generation. Moreover, our method allows for an easy but effective integration of existing linguistic typology features into the parsing model. Because not all typological ...
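The abstract above describes contextual parameter generation: rather than learning a separate adapter per language, a shared generator network maps a language embedding to that language's adapter weights, so parameters are shared across languages. A minimal sketch of that idea, assuming toy sizes and illustrative names (`gen_down`, `gen_up`, `lang_emb` are not from the paper):

```python
# Minimal sketch of contextual parameter generation for language adapters.
# A shared generator maps a learned language embedding to the adapter's
# down- and up-projection weights; all sizes and names are illustrative.
import random

random.seed(0)

HIDDEN = 4      # encoder hidden size (toy)
BOTTLE = 2      # adapter bottleneck size (toy)
LANG_DIM = 3    # language-embedding size (toy)

def rand_matrix(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

def matvec(m, v):
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in m]

# One embedding per language (normally learned; e.g. from typology features).
lang_emb = {"en": [0.2, -0.1, 0.4], "fi": [-0.3, 0.5, 0.1]}

# Shared generators: map a language embedding to flattened adapter weights.
gen_down = rand_matrix(BOTTLE * HIDDEN, LANG_DIM)
gen_up = rand_matrix(HIDDEN * BOTTLE, LANG_DIM)

def generate_adapter(lang):
    """Produce (down, up) adapter matrices for `lang` from its embedding."""
    e = lang_emb[lang]
    flat_down = matvec(gen_down, e)
    flat_up = matvec(gen_up, e)
    down = [flat_down[i * HIDDEN:(i + 1) * HIDDEN] for i in range(BOTTLE)]
    up = [flat_up[i * BOTTLE:(i + 1) * BOTTLE] for i in range(HIDDEN)]
    return down, up

def adapter_forward(h, lang):
    """Bottleneck adapter with a residual connection, using generated weights."""
    down, up = generate_adapter(lang)
    z = [max(0.0, x) for x in matvec(down, h)]   # down-project + ReLU
    out = matvec(up, z)                          # up-project
    return [h_i + o_i for h_i, o_i in zip(h, out)]  # residual connection

h = [1.0, -0.5, 0.3, 0.8]
print("en:", adapter_forward(h, "en"))
print("fi:", adapter_forward(h, "fi"))
```

Only the generators and the language embeddings are trained, so adding a language costs one small embedding rather than a full adapter, which is what allows the cross-lingual sharing the abstract refers to.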
Today, the top performing parsing algorithms rely on the availability of annotated data for learning...
We study multi-source transfer parsing for resource-poor target languages; specifically methods for ...
Adapter modules have emerged as a general parameter-efficient means to specialize a pretrained encod...
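The parameter efficiency claimed above comes from the adapter's bottleneck shape: only two small projection matrices are trained per task, while the pretrained encoder stays frozen. A back-of-the-envelope sketch, assuming typical (illustrative) sizes:

```python
# Sketch of why bottleneck adapters are parameter-efficient: count the
# trainable parameters of one adapter against one transformer feed-forward
# block. Sizes are common defaults, used here purely for illustration.

HIDDEN = 768      # transformer hidden size
BOTTLENECK = 64   # adapter bottleneck size

def adapter_params(d, b):
    """One adapter: down-projection (d->b), up-projection (b->d), biases."""
    return d * b + b + b * d + d

def ffn_params(d):
    """One transformer feed-forward block (d -> 4d -> d), with biases."""
    return d * 4 * d + 4 * d + 4 * d * d + d

adapter = adapter_params(HIDDEN, BOTTLENECK)
ffn = ffn_params(HIDDEN)
print(f"adapter: {adapter:,} params; FFN block: {ffn:,} params")
print(f"adapter is {adapter / ffn:.1%} of one FFN block")  # roughly 2%
```

Each new task or language therefore adds only a few percent of a layer's parameters, instead of a full fine-tuned copy of the encoder.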
Recent advances in multilingual language modeling have brought the idea of a truly universal parser ...
Recent advances in multilingual dependency parsing have brought the idea of a truly universal parser...
This work presents two experiments with the goal of replicating the transferability of dependency pa...
The existence of universal models to describe the syntax of languages has been...
This paper presents a new approach to the problem of cross-lingual dependency ...
Thesis: Ph.D., Massachusetts Institute of Technology, Department of Electrical Engineering and Comp...
Current methods of cross-lingual parser transfer focus on predicting the best parser for a low-resou...
This thesis presents several studies in neural dependency parsing for typologically diverse language...
We show how we can adapt parsing to low-resource domains by combining treebanks across languages for...
We propose a novel approach to cross-lingual part-of-speech tagging and depend...
Modern dependency parsers achieve results comparable to those of human experts. However...
Languages evolve and diverge over time. Their evolutionary history is often de...