We propose on-the-fly ensembling of a machine translation model with an LLM prompted on the same task and input. We perform experiments on 4 language pairs (both directions) with varying amounts of training data. We find that an LLM that is slightly weaker at translation can nevertheless improve the translations of an NMT model, and that ensembling with an LLM can produce better translations than ensembling two stronger MT models. We combine our method with various techniques from LLM prompting, such as in-context learning and translation context.
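The core idea above can be sketched as token-level ensembling at decoding time: at each step, the NMT model's and the LLM's next-token distributions are combined and decoding proceeds over the combined distribution. A minimal sketch follows; the combination rule (a simple weighted average of probabilities) and the helper names (`ensemble_step`, `greedy_decode`, the toy model functions) are illustrative assumptions, not the paper's exact method.

```python
# Hedged sketch of on-the-fly ensembling: interpolate two models'
# next-token distributions at each decoding step. The averaging rule
# and all names here are assumptions for illustration.
import numpy as np

def ensemble_step(p_nmt, p_llm, weight=0.5):
    """Weighted average of two next-token distributions, renormalized."""
    p = weight * np.asarray(p_nmt) + (1 - weight) * np.asarray(p_llm)
    return p / p.sum()

def greedy_decode(nmt_step, llm_step, eos_id, max_len=20, weight=0.5):
    """Greedy decoding over the ensembled distribution.

    nmt_step / llm_step map a prefix (list of token ids) to a
    next-token probability vector; both are hypothetical stand-ins
    for the real models' decoders.
    """
    prefix = []
    for _ in range(max_len):
        p = ensemble_step(nmt_step(prefix), llm_step(prefix), weight)
        tok = int(np.argmax(p))
        prefix.append(tok)
        if tok == eos_id:
            break
    return prefix

# Toy vocabulary {0: "a", 1: "b", 2: "<eos>"}. The two "models"
# disagree on the first token; the average favors the more
# confident one (0.5 for token 1 vs 0.4 for token 0).
def toy_nmt(prefix):
    return [0.6, 0.3, 0.1] if not prefix else [0.1, 0.1, 0.8]

def toy_llm(prefix):
    return [0.2, 0.7, 0.1] if not prefix else [0.2, 0.1, 0.7]

print(greedy_decode(toy_nmt, toy_llm, eos_id=2))  # → [1, 2]
```

In practice the two models would need a shared (or mapped) vocabulary for this interpolation to be well defined; that alignment step is omitted here.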