One of the notable developments in current natural language processing is the practical efficacy of probabilistic word representations, where words are embedded in high-dimensional continuous vector spaces that are optimized to reflect their distributional relationships. For sequences of words, such as phrases and sentences, distributional representations can be derived by combining word embeddings using arithmetic operations like vector averaging, or by estimating composition parameters from data under various objective functions. The quality of these compositional representations is typically assessed by their performance as features in extrinsic supervised classification benchmarks. Word and compositional embeddings for a single langua...
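As a concrete illustration of the vector-averaging composition mentioned above, here is a minimal sketch. The vocabulary, dimensionality, and randomly initialized embedding matrix are hypothetical stand-ins for embeddings trained on real distributional data.

import numpy as np

rng = np.random.default_rng(0)
dim = 50
vocab = {"the": 0, "cat": 1, "sat": 2}    # hypothetical vocabulary
E = rng.normal(size=(len(vocab), dim))    # hypothetical embedding matrix

def compose_average(words, vocab, E):
    """Represent a phrase as the mean of its word vectors."""
    idx = [vocab[w] for w in words if w in vocab]
    return E[idx].mean(axis=0)

phrase_vec = compose_average(["the", "cat", "sat"], vocab, E)
print(phrase_vec.shape)  # (50,)

Averaging is parameter-free; the learned alternatives mentioned above replace the mean with composition functions whose parameters are fit to a training objective.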
Cross-lingual word embeddings are an increasingly important resource in cross-lingual methods for N...
Distributed representations of meaning are a natural way to encode covariance relationships between ...
We propose a new model for learning bilingual word representations from non-parallel document-aligne...
Cross-lingual word embeddings aim to bridge the gap between high-resource and low-resource languages...
Cross-domain alignment plays a key role in tasks ranging from machine translation to transfer learni...
Count-based word alignment methods, such as the IBM models or fast-align, struggle on very small par...
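To make "count-based" concrete, the following is a toy sketch of the expectation-maximization loop behind IBM Model 1, the simplest of the aligners named above; the two-sentence corpus is a hypothetical stand-in for real parallel data, and this is an illustrative formulation rather than any specific paper's implementation.

from collections import defaultdict

corpus = [
    ("the house".split(), "das haus".split()),
    ("the book".split(), "das buch".split()),
]

# Uniform initialization of the lexical translation table t(f | e).
src_vocab = {e for es, _ in corpus for e in es}
t = defaultdict(lambda: 1.0 / len(src_vocab))

for _ in range(10):                       # a few EM iterations
    counts = defaultdict(float)           # expected count(e, f)
    totals = defaultdict(float)           # expected count(e)
    for es, fs in corpus:
        for f in fs:                      # E-step: posterior alignment of f
            z = sum(t[(e, f)] for e in es)
            for e in es:
                p = t[(e, f)] / z
                counts[(e, f)] += p
                totals[e] += p
    for (e, f), c in counts.items():      # M-step: renormalize the counts
        t[(e, f)] = c / totals[e]

print(round(t[("house", "haus")], 3))     # rises toward 1.0 as EM iterates

Because the probabilities are re-estimated purely from expected co-occurrence counts, the model needs many sentence pairs for the counts to disambiguate; this is exactly why such methods degrade on very small parallel corpora.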
Cross-lingual embeddings are vector space representations where word translations tend to be co-loca...
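The co-location property can be made concrete with a toy retrieval sketch: in a shared space, a word's translation is found as its nearest neighbor by cosine similarity. The 2-dimensional embeddings below are hypothetical; in practice they would come from a trained bilingual model.

import numpy as np

src = {"dog": np.array([0.9, 0.1]), "house": np.array([0.1, 0.9])}   # e.g. English
tgt = {"hund": np.array([0.8, 0.2]), "haus": np.array([0.2, 0.8])}   # e.g. German

def translate(word, src, tgt):
    """Return the target word with the highest cosine similarity."""
    v = src[word]
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(tgt, key=lambda w: cos(v, tgt[w]))

print(translate("dog", src, tgt))   # -> "hund"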
The ability to accurately align concepts between languages can provide significant benefits in many ...