Non-autoregressive neural machine translation (NAT) models are proposed to accelerate the inference process while maintaining relatively high performance. However, existing NAT models struggle to achieve the desired efficiency-quality trade-off: fully NAT models with efficient inference perform worse than their autoregressive counterparts, while iterative NAT models achieve comparable performance but diminish the advantage in speed. In this paper, we propose RenewNAT, a flexible framework with high efficiency and effectiveness, to combine the merits of fully and iterative NAT models. RenewNAT first generates the potential translation results and then renews them in a single pass. It can achi...
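The generate-then-renew idea above can be illustrated with a minimal sketch of two-pass parallel decoding. This is not the paper's actual implementation: the `MASK` sentinel, the confidence-based re-masking heuristic, and the toy `logits_fn` interface are all assumptions made purely for illustration. The sketch predicts every target position in one parallel pass, then re-masks the least confident positions and predicts them again in a single extra pass.

```python
import numpy as np

MASK = -1  # sentinel id for masked positions (assumption, for illustration)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def nat_generate_then_renew(logits_fn, length, renew_frac=0.3):
    """Two-pass parallel decoding sketch: (1) predict all positions at
    once with no left-to-right dependence, (2) re-mask the least
    confident positions and predict them again in one extra pass."""
    # Pass 1: fully parallel draft over all target positions.
    draft_logits = logits_fn(np.full(length, MASK))
    probs = softmax(draft_logits)
    tokens = probs.argmax(axis=-1)
    conf = probs.max(axis=-1)

    # Pass 2 ("renew"): re-predict the least confident positions,
    # now with the confident draft tokens available as context.
    n_renew = max(1, int(renew_frac * length))
    weak = np.argsort(conf)[:n_renew]
    ctx = tokens.copy()
    ctx[weak] = MASK
    renew_logits = logits_fn(ctx)
    tokens[weak] = renew_logits[weak].argmax(axis=-1)
    return tokens

# Toy stand-in "decoder": fixed per-position logits over a 3-word
# vocabulary. A real NAT decoder would condition on the unmasked
# tokens in `ctx`; this toy version ignores them.
base = np.array([[2.0, 0.1, 0.0],
                 [0.1, 0.2, 0.3],   # deliberately low-confidence position
                 [0.0, 0.1, 2.0]])

def toy_logits_fn(ctx):
    return base.copy()

out = nat_generate_then_renew(toy_logits_fn, length=3)
```

With `renew_frac=0.3` and length 3, exactly one position (the low-confidence middle one) is re-masked and re-predicted in the renew pass; the two confident positions are kept from the parallel draft.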
Neural machine translation (NMT) has become the de facto standard in the machine translation communi...
Although neural machine translation models reached high translation quality, the autoregressive natu...
Non-autoregressive machine translation (NAT) has recently made great progress....
How do we perform efficient inference while retaining high translation quality? Existing neural mach...
Non-autoregressive neural machine translation (NAT) models suffer from the multi-modality problem th...
Non-autoregressive neural machine translation (NAT) generates each target word in parallel and has a...
Non-autoregressive translation (NAT) models remove the dependence on previous target tokens and gene...
Non-autoregressive approaches aim to improve the inference speed of translation models by only requi...
Non-autoregressive translation (NAT) models, which remove the dependence on previous target tokens f...
As a new neural machine translation approach, Non-Autoregressive machine Translation (NAT) has attrac...
Humans benefit from communication but suffer from language barriers. Machine translation (MT) aims t...
The Transformer translation model (Vaswani et al., 2017), which relies on self-attention mechanisms, ...
In recent years, a number of methods for improving the decoding speed of neural machine translation ...
Non-Autoregressive Neural Machine Translation (NAT) achieves significant decoding speedup through ge...
Benefiting from the sequence-level knowledge distillation, the Non-Autoregressive Transformer (NAT) ...