Recent research has shown that a shared bilingual word embedding space can be induced by projecting monolingual word embedding spaces from two languages using a self-learning paradigm without any bilingual supervision. However, it has also been shown that for distant language pairs such fully unsupervised self-learning methods are unstable and often get stuck in poor local optima due to reduced isomorphism between the starting monolingual spaces. In this work, we propose a new robust framework for learning unsupervised multilingual word embeddings that mitigates these instability issues. We learn a shared multilingual embedding space for a variable number of languages by incrementally adding new languages one by one to the current multilingu...
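The projection-based self-learning loop that this line of work builds on can be illustrated compactly. Below is a minimal sketch, not the paper's released implementation: it assumes toy random matrices in place of pretrained monolingual embeddings, the helper names `procrustes`, `induce_dictionary`, and `self_learning_map` are hypothetical, and the incremental step simply maps each new language into the space anchored by the first one.

```python
# Minimal sketch of projection-based self-learning mapping (illustrative
# only; toy random data stands in for real monolingual embeddings).
import numpy as np

def procrustes(X, Y):
    # Orthogonal W minimizing ||XW - Y||_F, solved in closed form:
    # if X^T Y = U S V^T, then W = U V^T.
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

def induce_dictionary(X, Y):
    # Nearest-neighbour dictionary induction under cosine similarity:
    # pair every source word with its closest target word.
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    Yn = Y / np.linalg.norm(Y, axis=1, keepdims=True)
    nn = (Xn @ Yn.T).argmax(axis=1)
    return np.arange(X.shape[0]), nn

def self_learning_map(X, Y, n_iters=10):
    # Alternate between dictionary induction and Procrustes re-estimation.
    # NOTE: starting from the identity, as done here, is exactly what fails
    # for distant language pairs; real systems need a careful initialization.
    W = np.eye(X.shape[1])
    for _ in range(n_iters):
        src, tgt = induce_dictionary(X @ W, Y)
        W = procrustes(X[src], Y[tgt])
    return W

rng = np.random.default_rng(0)
dim, vocab = 50, 500
spaces = [rng.normal(size=(vocab, dim)) for _ in range(3)]  # 3 toy "languages"

shared = spaces[0]    # the language added first anchors the shared space
maps = [np.eye(dim)]  # identity map for the anchor language
for X in spaces[1:]:  # incrementally add the remaining languages one by one
    maps.append(self_learning_map(X, shared))
```

On toy random data the loop converges trivially; the robustness contribution described in the abstract (choosing the order in which languages are added and keeping the growing multilingual space stable) is not reproduced in this sketch.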
We propose a new model for learning bilingual word representations from non-parallel document-aligne...
We present a method that consumes a large corpus of multilingual text and produces a single, unified...
Cross-lingual embeddings are vector space representations where word translations tend to be co-loca...
Recent efforts in cross-lingual word embedding (CLWE) learning have predominantly focused on fully u...
Cross-lingual word embeddings aim to bridge the gap between high-resource and low-resource languages...
Recent advances in generating monolingual word embeddings based on word co-occurrence for universal ...
We propose a simple yet effective approach to learning bilingual word embeddings (BWEs) from non-par...
Effective projection-based cross-lingual word embedding (CLWE) induction critically relies on the it...
Bilingual Word Embeddings (BWEs) are one of the cornerstones of cross-lingual transfer of NLP models...
Building bilingual lexica from non-parallel data is a long-standing natural language processing rese...
Word embeddings - dense vector representations of a word’s distributional semantics - are an indispe...
A joint-space model for cross-lingual distributed representations generalizes language-invariant sem...