Cross-lingual word embedding models learn a shared vector space for two or more languages so that words with similar meaning are represented by similar vectors regardless of their language. Although the existing models achieve high performance on pairs of morphologically simple languages, they perform very poorly on morphologically rich languages such as Turkish and Finnish. In this paper, we propose a morpheme-based model in order to increase the performance of cross-lingual word embeddings on morphologically rich languages. Our model includes a simple extension which enables us to exploit morphemes for cross-lingual mapping. We applied our model to the Turkish-Finnish language pair on the bilingual word translation task. Results show tha...
We propose a new unified framework for monolingual (MoIR) and cross-lingual information retrieval (C...
Some languages have very few NLP resources, while many of them are closely related to better-resourc...
Cross-lingual word embeddings aim to bridge the gap between high-resource and low-resource languages...
Word embeddings - dense vector representations of a word’s distributional semantics - are an indispe...
Word embeddings represent words in a numeric space so that semantic relations between words are repr...
Cross-lingual embeddings are vector space representations where word translations tend to be co-loca...
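The co-location idea above can be sketched concretely. A common way to obtain such a space (not necessarily the method of the paper this snippet comes from) is to learn an orthogonal map from source- to target-language embeddings over a seed dictionary, solved in closed form as an orthogonal Procrustes problem. The toy data below is hypothetical; in practice `X` and `Y` would hold pretrained monolingual vectors for dictionary word pairs.

```python
import numpy as np

# Hypothetical toy setup: 5 seed-dictionary word pairs, 4-dim embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))                         # source-language vectors
hidden_W = np.linalg.qr(rng.normal(size=(4, 4)))[0] # an unknown rotation
Y = X @ hidden_W                                    # target-language vectors (perfectly alignable toy case)

# Orthogonal Procrustes: W* = argmin_W ||XW - Y||_F with W orthogonal,
# obtained from the SVD of X^T Y.
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

# After mapping, each source vector is co-located with its translation.
print(np.allclose(X @ W, Y))
```

On real embeddings the alignment is only approximate, so translation is done by nearest-neighbour search over `X @ W` in the target space rather than by exact equality.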
Cross-lingual representations of words enable us to reason about word meaning in multilingual contex...
Cross-lingual word embeddings are becoming increasingly important in multilingual NLP. Recently, i...