Translation into morphologically-rich languages challenges neural machine translation (NMT) models with extremely sparse vocabularies where atomic treatment of surface forms is unrealistic. This problem is typically addressed by either pre-processing words into subword units or performing translation directly at the level of characters. The former is based on word segmentation algorithms optimized using corpus-level statistics with no regard to the translation task. The latter learns directly from translation data but requires rather deep architectures. In this paper, we propose to translate words by modeling word formation through a hierarchical latent variable model which mimics the process of morphological inflection. Our model generates...
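The subword pre-processing this abstract contrasts against is typically byte-pair encoding (BPE), whose merge rules are learned purely from corpus-level frequency statistics with no translation signal. A minimal sketch of that merge-learning loop, on a hypothetical toy word-frequency dictionary:

```python
from collections import Counter

def learn_bpe(words, num_merges):
    """Learn BPE merge rules from a word-frequency dict.
    The statistics are corpus-level only, independent of any translation task."""
    # Represent each word as a tuple of characters plus an end-of-word marker.
    vocab = {tuple(w) + ("</w>",): f for w, f in words.items()}
    merges = []
    for _ in range(num_merges):
        # Count every adjacent symbol pair, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Apply the winning merge everywhere in the vocabulary.
        new_vocab = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges

merges = learn_bpe({"low": 5, "lower": 2, "lowest": 2}, num_merges=3)
# The most frequent character pairs are merged first, e.g. ('l', 'o'), then ('lo', 'w').
```

This is a sketch under stated assumptions, not the authors' proposed model: their hierarchical latent variable approach is precisely an alternative to segmenters like this one, which fix the segmentation before translation ever sees the data.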
Recently, neural machine translation (NMT) has emerged as a powerful alternative to conventional st...
Out-of-vocabulary words present a great challenge for machine translation. Recently, various characte...
Neural architectures are prominent in the construction of language models (LMs). However, word-leve...
This thesis addresses some of the challenges of translating morphologically rich languages (MRLs). W...
Neural machine translation (NMT) models are typically trained with fixed-size input and output vocab...
Neural Machine Translation (NMT) models generally perform translation using a fixed-size lexical voc...
A morphologically complex word (MCW) is a hierarchical constituent with meaning-preserving subunits,...
We propose a novel pipeline for translation into morphologically rich languages which consists of tw...
Language models play an important role in many natural language processing tasks. In this thesis, we...
Treating morphologically complex words (MCWs) as atomic units in translation would not yield a desi...
The state of the art of handling rich morphology in neural machine translation (NMT) is to break wor...
The requirement for neural machine translation (NMT) models to use fixed-size input and output vocab...
Neural machine translation (NMT) models are conventionally trained with fixed-size vocabularies to...
Lexical sparsity is a major challenge for machine translation into morphologically rich languages. W...
Neural machine translation (NMT) suffers a performance deficiency when a limited vocabulary fails to...