Recent advances in generating monolingual word embeddings based on word co-occurrence for universal languages inspired new efforts to extend the model to support diversified languages. State-of-the-art methods for learning cross-lingual word embeddings rely on the alignment of monolingual word embedding spaces. Our goal is to implement word co-occurrence across languages using the universal-concepts method. Such concepts are notions that are fundamental to humankind and are thus persistent across languages, e.g., man or woman, war or peace, etc. Given bilingual lexicons, we built universal concepts as undirected graphs of connected nodes and then replaced the words belonging to the same graph with a unique graph ID. This intuitive desig...
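The graph-ID replacement step described in the abstract above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the lexicon is a list of (source, target) translation pairs, and the language tags ("en", "fr") are hypothetical labels added so identical spellings in different languages do not collide.

```python
def build_concept_ids(lexicon):
    """Union translation pairs into connected components ("universal
    concepts") and assign each component a unique graph ID."""
    parent = {}

    def find(w):
        parent.setdefault(w, w)
        while parent[w] != w:
            parent[w] = parent[parent[w]]  # path halving
            w = parent[w]
        return w

    def union(a, b):
        parent[find(a)] = find(b)

    for src, tgt in lexicon:
        # tag each word with its language (hypothetical "en"/"fr" labels)
        union(("en", src), ("fr", tgt))

    # map each connected component's root to a stable concept ID
    concept_id = {}
    word_to_concept = {}
    for w in parent:
        root = find(w)
        word_to_concept[w] = concept_id.setdefault(root, f"C{len(concept_id)}")
    return word_to_concept

# hypothetical English-French lexicon entries
lexicon = [("man", "homme"), ("woman", "femme"), ("peace", "paix")]
ids = build_concept_ids(lexicon)
```

After this step, a corpus preprocessor could replace both "man" and "homme" with their shared graph ID before training embeddings, since they fall into the same connected component.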
We study the role of the second language in bilingual word embeddings in monolingual semantic evalu...
The ability to accurately align concepts between languages can provide significant benefits in many ...
[EN] A novel method for finding linear mappings among word embeddings for several languages, taking ...
Recent research has discovered that a shared bilingual word embedding space can be induced by projec...
Cross-lingual embeddings are vector space representations where word translations tend to be co-loca...
Word embeddings - dense vector representations of a word’s distributional semantics - are an indispe...
A joint-space model for cross-lingual distributed representations generalizes language-invariant sem...
Recent work in learning bilingual representations tends to tailor towards achieving good performanc...
Distributed representations of meaning are a natural way to encode covariance relationships between ...
We develop a novel cross-lingual word representation model which injects syntactic information throu...
We propose a new unified framework for monolingual (MoIR) and cross-lingual information retrieval (C...
Distributed representations of words which map each word to a continuous vector have proven useful i...