NLP technologies are unevenly distributed across the world's languages: state-of-the-art models are available for only a handful of them. This is because developing such language-specific models requires rich monolingual resources and labelled datasets, which are partly or completely missing for many low-resource languages. This inequality in multilingual resources, and the limited capabilities of existing NLP models for low-resource languages in particular, drive us to explore more sophisticated solutions. This thesis presents a unified approach consisting of a set of novel methods for multilingual learning and adaptation that aim to move current NLP technologies beyond a small set of resource-rich languages. We evaluate these techniques by...
Some natural languages belong to the same family or share similar syntactic and/or semantic regulari...
The Helsinki-NLP team participated in the AmericasNLP 2023 Shared Task with 6 submissions for all 11...
Neural Machine Translation (NMT) models are typically trained by considering humans as end-users and...
The subject area of multilingual natural language processing (NLP) is concerned with the p...
© Dr Long Duong. Natural language processing (NLP) aims, broadly speaking, to teach computers to under...
Multilingual Neural Machine Translation (MNMT) for low-resource languages (LRL) can be enhanced by ...
Recent work has shown that neural models can be successfully trained on multiple languages simultaneously...
For many (minority) languages, the resources needed to train large models are not available. We inve...
Thesis (Ph.D.)--University of Washington, 2022. Modern NLP systems have been highly successful at a wi...
The current generation of neural network-based natural language processing models excels at learning...
Substantial progress has been made in the field of natural language processing (NLP) due to the adve...