Neural Machine Translation (NMT) is notorious for its need for large amounts of bilingual data. An effective approach to compensating for this requirement is Multi-Task Learning (MTL), which leverages different linguistic resources as a source of inductive bias. Current MTL architectures are based on SEQ2SEQ transduction and (partially) share different components of the model among the tasks. However, this MTL approach often suffers from task interference and cannot fully capture commonalities among subsets of tasks. We address this issue by extending the recurrent units with multiple blocks along with a trainable routing network. The routing network enables adaptive collaboration by dynamically sharing blocks conditioned on the task ...
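The abstract above gives enough detail to sketch a toy version of the mechanism: each recurrent step is computed by several candidate blocks, and a small router, conditioned on a task embedding, decides how to mix their outputs. The sketch below is a minimal, hypothetical PyTorch rendering under those assumptions; the names (RoutedGRUCell, num_blocks, task_emb_dim), the use of GRU cells, and the soft-mixing choice are illustrative, not the paper's actual implementation.

# Hypothetical sketch: multi-block recurrent cell with a task-conditioned router.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RoutedGRUCell(nn.Module):
    def __init__(self, input_dim, hidden_dim, num_blocks, num_tasks, task_emb_dim=16):
        super().__init__()
        # Multiple parallel recurrent blocks that can be shared across tasks.
        self.blocks = nn.ModuleList(
            [nn.GRUCell(input_dim, hidden_dim) for _ in range(num_blocks)]
        )
        self.task_emb = nn.Embedding(num_tasks, task_emb_dim)
        # The router maps a task embedding to a distribution over blocks.
        self.router = nn.Linear(task_emb_dim, num_blocks)

    def forward(self, x, h, task_id):
        # Soft routing weights over blocks, conditioned on the task: (B, K).
        weights = F.softmax(self.router(self.task_emb(task_id)), dim=-1)
        # Each block proposes a new hidden state: (B, K, H).
        candidates = torch.stack([blk(x, h) for blk in self.blocks], dim=1)
        # The router mixes the candidates into one hidden state: (B, H).
        return (weights.unsqueeze(-1) * candidates).sum(dim=1)

# Toy usage: two tasks (e.g., translation and POS tagging) sharing four blocks.
cell = RoutedGRUCell(input_dim=32, hidden_dim=64, num_blocks=4, num_tasks=2)
x = torch.randn(8, 32)                    # batch of 8 input vectors
h = torch.zeros(8, 64)                    # initial hidden state
task = torch.zeros(8, dtype=torch.long)   # all examples from task 0
h_next = cell(x, h, task)                 # (8, 64)

A soft mixture is used here purely for simplicity; a trained router of this kind could equally produce hard (top-k) block selections, which is closer to "dynamic sharing" in the sense of activating different block subsets per task.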
Recent developments in machine translation experiment with the idea that a model can improve the tra...
In Neural Machine Translation (NMT) the usage of sub-words and characters as source and target units...
The current generation of neural network-based natural language processing models excels at learning...
Neural Machine Translation (NMT), a data-hungry technology, suffers from the lack of bilingual data ...
The Transformer model is a recent, fast, and powerful architecture for neural machine translation. W...
Using a mix of shared and language-specific (LS) parameters has shown promise in multilingual neural...
Neural machine translation requires large amounts of parallel training text to learn a reasonable-qu...
Neural machine translation is known to require large numbers of parallel training sentences, which g...
Multilingual Neural Machine Translation (MNMT) trains a single NMT model that supports translation b...
Although machine translation (MT) traditionally pursues “human-oriented” objectives, humans are not ...
Scarcity of parallel sentence-pairs poses a significant hurdle for training high-quality Neural Mach...
In Neural Machine Translation (NMT) the usage of subwords and characters as source and target units ...
Domain adaptation for SMT usually adapts models to a single, specific domain. However, it often ...
There are several approaches for improving neural machine trans...
The Helsinki-NLP team participated in the AmericasNLP 2023 Shared Task with 6 submissions for all 11...