Distributed word representations are widely used for modeling words in NLP tasks. Most existing models generate a single representation per word and do not distinguish between the different meanings of a word. We present two approaches for learning multiple topic-sensitive representations per word using the hierarchical Dirichlet process. We observe that by modeling topics and integrating topic distributions for each document, we obtain representations that can distinguish between different meanings of a given word. Our models yield statistically significant improvements on the lexical substitution task, indicating that commonly used single word representations, even when combined with contextual information, are insufficient for this task.
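As a rough illustration of the idea (not the paper's actual model), a topic-sensitive representation can be formed by weighting per-topic word vectors with a document's topic distribution, such as one inferred by a hierarchical Dirichlet process. The function and data below are purely hypothetical; `per_topic_vecs` stands in for learned topic-specific embeddings and `theta_d` for an inferred document-topic distribution.

```python
import numpy as np

def topic_sensitive_vector(per_topic_vecs, theta_d):
    """Weight each topic-specific vector of a word by the document's
    topic probability and sum, yielding one context-dependent vector."""
    per_topic_vecs = np.asarray(per_topic_vecs, dtype=float)  # shape (K, dim)
    theta_d = np.asarray(theta_d, dtype=float)                # shape (K,)
    return theta_d @ per_topic_vecs                           # shape (dim,)

# Toy example: the word "bank" with K = 2 topics (finance vs. river).
bank_vecs = [[1.0, 0.0],   # topic 0: finance sense
             [0.0, 1.0]]   # topic 1: river sense
finance_doc = [0.9, 0.1]   # topic distribution of a finance document
river_doc   = [0.1, 0.9]   # topic distribution of a river document

print(topic_sensitive_vector(bank_vecs, finance_doc))  # leans to finance sense
print(topic_sensitive_vector(bank_vecs, river_doc))    # leans to river sense
```

The same word thus receives a different vector in each document, which is the property that lets such representations separate word senses.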