Semantic representations have long been argued to be potentially useful for enforcing meaning preservation and for improving the generalization performance of machine translation methods. In this work, we are the first to incorporate information about the predicate-argument structure of source sentences (namely, semantic-role representations) into neural machine translation. We use Graph Convolutional Networks (GCNs) to inject a semantic bias into sentence encoders, and we achieve improvements in BLEU scores over linguistically agnostic and syntax-aware versions on the English–German language pair.
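The GCN-over-encoder idea in the abstract above can be sketched roughly as follows. This is a minimal numpy illustration under simplifying assumptions, not the paper's implementation: the actual model uses directed, labelled edges with per-label weights and edge gating, and stacks GCN layers on top of a BiLSTM or CNN encoder. Here a single layer updates each word vector from its graph neighbours defined by semantic-role edges.

```python
import numpy as np

def gcn_layer(H, A, W, b):
    """One simplified Graph Convolutional Network layer: each node
    (word) representation is updated from its neighbours in the graph.
    H: (n, d_in) node features, A: (n, n) adjacency matrix,
    W: (d_in, d_out) weights, b: (d_out,) bias."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)      # node degrees
    A_norm = A_hat / deg                        # row-normalise neighbourhood
    return np.maximum(0.0, A_norm @ H @ W + b)  # ReLU activation

# Toy example: 4 words; word 1 is a predicate linked to two arguments.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))         # e.g. states from a sentence encoder
A = np.zeros((4, 4))
A[1, 0] = A[1, 2] = 1.0             # predicate -> argument edges
A = A + A.T                         # treated as undirected for simplicity
W = rng.normal(size=(8, 8)) * 0.1
b = np.zeros(8)
H_sem = gcn_layer(H, A, W, b)       # semantically biased word encodings
```

In the full model, `H_sem` would replace the plain encoder states fed to the attention-based decoder, so that translation decisions can condition on who-did-what-to-whom structure.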
We explored the syntactic information encoded implicitly by neural machine translation (NMT) models ...
We propose semantic role features for a Tree-to-String transducer to model the re-ordering/deletion ...
Neural language models learn word representations that capture rich linguistic and conceptual inform...
Machine translation has its roots in the domain of textual processing, which focuses on the usage ...
Neural Machine Translation (NMT) systems require a massive amount of ...
Maintaining semantic relations between words during the translation process yields more accurate tar...
In Interlingua-based machine translation, source language sentences have to be converted to a semant...
Recently, sequence-to-sequence models have achieved impressive performance on a number of semantic p...
Graph-based semantic parsing aims to represent textual meaning through directed graphs. As one of th...
Typically it is seen that multimodal neural machine translation (MNMT) systems trained on a combinat...
Neural networks learn patterns from data to solve complex problems. To understand and infer meaning ...
End-to-end neural machine translation has overtaken statistical machine translation in terms of...
The construction of high-quality word embeddings is essential in natural language processing. In exi...
Thesis (Ph. D.)--University of Rochester. Department of Computer Science, 2018. In recent years, ther...
A translation memory (TM) has been proved helpful for improving neural machine translation (NMT). Exist...