In this paper, we investigate whether multilingual neural translation models learn stronger semantic abstractions of sentences than bilingual ones. We test this hypothesis by measuring the perplexity of such models when applied to paraphrases of the source language. The intuition is that an encoder produces better representations if a decoder can recognize synonymous sentences in the same language, even though the model was never trained for that task. In our setup, we add 16 different auxiliary languages to a bidirectional bilingual baseline model (English-French) and test it with in-domain and out-of-domain paraphrases in English. The results show that perplexity is significantly reduced in each case, indicating that...
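The evaluation above scores paraphrases by model perplexity. As a minimal sketch of that metric (not the paper's actual implementation), perplexity can be computed from the per-token log-probabilities a decoder assigns to a sentence; the function name and inputs here are illustrative assumptions:

```python
import math

def sentence_perplexity(token_logprobs):
    """Perplexity of a sentence from per-token natural-log
    probabilities assigned by a (hypothetical) decoder.

    Perplexity = exp(mean negative log-likelihood); lower
    values mean the model finds the sentence more likely.
    """
    n = len(token_logprobs)
    avg_nll = -sum(token_logprobs) / n
    return math.exp(avg_nll)

# A model that assigns higher probability to a paraphrase
# yields lower perplexity for it:
confident = sentence_perplexity([-0.1, -0.2, -0.15])
uncertain = sentence_perplexity([-1.0, -2.0, -1.5])
assert confident < uncertain
```

Under this reading, a drop in perplexity on English paraphrases after adding auxiliary languages suggests the shared encoder-decoder captures meaning beyond the specific training pairs.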
Models of lexical semantics are a key component of natural language understanding. The bulk of work ...
For machine translation to tackle discourse phenomena, models must have access to extra-sentential l...
In this paper, we propose a multilingual encoder-decoder architecture capable of obtaining multiling...
We present PARABANK, a large-scale English paraphrase dataset that surpasses prior work in both quan...
Neural machine translation (NMT) systems encode an input sentence into an intermediate representatio...
Natural language allows for the same meaning (semantics) to be expressed in multiple different ways,...
Paraphrase generation is the task of generating, given a word sequence, another word sequence that kee...
This paper presents FISKMÖ, a project that focuses on the development of resources and tools for cro...
We explored the syntactic information encoded implicitly by neural machine translation (NMT) models ...
Neural language models learn word representations that capture rich linguistic and conceptual inform...
Neural machine translation has considerably improved the quality of automatic translations by learni...
We present a methodology that explores how sentence structure is reflected in neural representations...
Paraphrasing and translation have previously been treated as unconnected natural language process...
End-to-end neural machine translation has overtaken statistical machine translation in terms of...