Word embeddings are useful for a wide variety of tasks, but they lack interpretability. By rotating word spaces, interpretable dimensions can be identified while preserving the information contained in the embeddings without any loss. In this work, we investigate three methods for making word spaces interpretable by rotation: Densifier (Rothe et al., 2016), linear SVMs and DensRay, a new method we propose. In contrast to Densifier, DensRay can be computed in closed form, is hyperparameter-free and thus more robust than Densifier. We evaluate the three methods on lexicon induction and set-based word analogy. In addition we provide qualitative insights as to how interpretable word spaces can be used for removing gender bias from ...
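The abstract only states that DensRay is a closed-form, hyperparameter-free rotation method; it does not give the construction. As a rough NumPy sketch of what such a closed-form "interpretable direction" can look like, under my own assumptions (two hypothetical seed-word matrices E_pos and E_neg for words that do and do not carry the target property, unit-normalized difference vectors between seed pairs, and an eigendecomposition of their accumulated outer products), and not the paper's exact procedure:

import numpy as np

def densray_like_rotation(E_pos, E_neg):
    # E_pos: (n_pos, d) embeddings of seed words with the property,
    # E_neg: (n_neg, d) embeddings of seed words without it.
    d = E_pos.shape[1]

    def pair_sum(A, B, sign):
        # Accumulate sign * sum over pairs (a, b) of u u^T, where u is the
        # unit-normalized difference vector a - b.
        M = np.zeros((d, d))
        for a in A:
            diffs = a - B                                    # (|B|, d)
            norms = np.linalg.norm(diffs, axis=1, keepdims=True)
            diffs = diffs / np.clip(norms, 1e-12, None)
            M += sign * diffs.T @ diffs
        return M

    # Spread cross-group pairs apart, keep within-group pairs close.
    A = pair_sum(E_pos, E_neg, +1.0) + pair_sum(E_neg, E_pos, +1.0)
    A += pair_sum(E_pos, E_pos, -1.0) + pair_sum(E_neg, E_neg, -1.0)

    # A is symmetric, so its eigendecomposition exists in closed form.
    eigvals, eigvecs = np.linalg.eigh(A)
    order = np.argsort(eigvals)[::-1]        # descending eigenvalues
    basis = eigvecs[:, order]                # orthogonal rotation matrix
    return basis[:, 0], basis                # top direction, full basis

Projecting every word vector onto the first column of the returned basis yields a scalar score for the property encoded by the seed lexicons; the remaining columns complete an orthogonal rotation, so no information is lost.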
Words are often mapped to vectors in a vector space (Euclidean space). Such mappings, also called em...
In this paper we propose the application of feature hashing to create word embeddings for natural la...
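The abstract is cut off before the hashing scheme is described, so the following is only a generic illustration of the feature-hashing idea applied to words: character n-grams are hashed into a fixed number of signed buckets, so a representation can be produced without storing a vocabulary table. Every concrete choice here (dimension, n-gram sizes, MD5, signed buckets) is my own assumption, not the paper's.

import hashlib
import numpy as np

def hashed_word_vector(word, dim=256):
    # Map a word to a dim-sized vector by hashing its character n-grams
    # into signed buckets (the classic hashing trick, vocabulary-free).
    vec = np.zeros(dim)
    padded = f"<{word}>"
    ngrams = [padded[i:i + n] for n in (3, 4)
              for i in range(len(padded) - n + 1)]
    for gram in ngrams:
        h = int(hashlib.md5(gram.encode("utf-8")).hexdigest(), 16)
        idx = h % dim                                    # bucket index
        sign = 1.0 if (h // dim) % 2 == 0 else -1.0      # signed hashing
        vec[idx] += sign
    return vec

Because shared n-grams hash to the same buckets, hashed_word_vector("language") and hashed_word_vector("languages") overlap heavily, which is what keeps the representation useful despite hash collisions.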
Word embeddings are useful in many tasks in Natural Language Processing and Information Retrieval, s...
Word embeddings seek to recover a Euclidean metric space by mapping words into vectors, starting fro...
Word embeddings, which represent words as dense feature vectors, are widely used in natural language...
Recent years have seen a dramatic growth in the popularity of word embeddings mainly owing to t...
In this paper we discuss the well-known claim that language analogies yield al...
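The "well-known claim" presumably refers to analogies such as man : king :: woman : queen being solvable by vector arithmetic. For concreteness, here is a minimal sketch of the standard 3CosAdd formulation (a common evaluation recipe, not necessarily the one this paper analyzes), assuming emb is a dict of unit-normalized word vectors:

import numpy as np

def analogy_3cosadd(emb, a, b, c, topn=1):
    # Solve a : b :: c : ? by ranking words by cosine similarity
    # to the offset vector b - a + c, excluding the query words.
    target = emb[b] - emb[a] + emb[c]
    target /= np.linalg.norm(target)
    scores = {w: float(v @ target) for w, v in emb.items()
              if w not in (a, b, c)}
    return sorted(scores, key=scores.get, reverse=True)[:topn]

# e.g. analogy_3cosadd(emb, "man", "king", "woman") ideally returns ["queen"]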
Word embeddings are vectorial semantic representations built with either counting or predicting tech...
Words are not detached individuals but part of a beautiful interconnected web of related concepts, a...
Real-valued word embeddings have transformed natural language processing (NLP) applications, recogni...
Word embedding methods make it possible to represent words as vectors in a space that is ...
Word embeddings encode the semantic meanings of words into low-dimensional word vectors. In most word emb...
Vector space models of words in NLP---word embeddings---have been recently shown to reliably encode ...
Word embedding models have been an important contribution to natural language processing; following ...