The attention mechanism in Neural Machine Translation (NMT) models added flexibility to translation systems and made it possible to visualize soft alignments between source and target representations. While there is much debate about the relationship between attention and the output yielded by neural models [26, 35, 43, 38], in this paper we propose a different assessment, investigating soft-alignment interpretability in low-resource scenarios. We experimented with different architectures (RNN [5], 2D-CNN [15], and Transformer [39]), comparing them with regard to their ability to produce directly exploitable alignments. To evaluate exploitability, we replicated the Unsupervised Word Segmentation (UWS) task from Go…
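To make the notion of a "directly exploitable" alignment concrete, the sketch below illustrates one common way attention soft-alignments can be turned into an unsupervised word segmentation: each source unit (e.g. a phone) is assigned to the target word receiving the most attention mass, and a boundary is placed wherever consecutive source units switch target words. This is a minimal illustration under assumed conventions, not the authors' implementation; the function name `segment_from_attention` and the use of NumPy are my own choices.

```python
# Minimal sketch (assumption, not the paper's code): turn an attention
# soft-alignment matrix into a hypothesised word segmentation of an
# unsegmented source sequence.
import numpy as np

def segment_from_attention(attention, source_units):
    """attention: (n_target_words, n_source_units) soft-alignment matrix.
    source_units: list of unsegmented source symbols (e.g. phones).
    Returns the source sequence grouped into hypothesised words."""
    # Hard alignment: for every source unit, pick the target word
    # with the largest attention weight (column-wise argmax).
    hard_align = attention.argmax(axis=0)          # shape: (n_source_units,)

    words, current = [], [source_units[0]]
    for i in range(1, len(source_units)):
        if hard_align[i] != hard_align[i - 1]:     # alignment changes -> word boundary
            words.append(current)
            current = []
        current.append(source_units[i])
    words.append(current)
    return ["".join(w) for w in words]

# Toy usage with a hypothetical 2-target-word x 5-source-phone matrix.
att = np.array([[0.8, 0.7, 0.1, 0.2, 0.1],
                [0.2, 0.3, 0.9, 0.8, 0.9]])
print(segment_from_attention(att, list("abcde")))  # -> ['ab', 'cde']
```

Under this reading, an architecture produces "directly exploitable" alignments when the column-wise argmax of its attention matrices yields segmentations close to the true word boundaries without any post-hoc retraining.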