Word embeddings encode the semantic meanings of words into low-dimensional word vectors. In most word embeddings, one cannot interpret the meanings of specific dimensions of those word vectors. Non-negative matrix factorization (NMF) has been proposed to learn interpretable word embeddings via non-negativity constraints. However, NMF methods suffer from scalability and memory issues because they have to maintain a global matrix for learning. To alleviate this challenge, we propose online learning of interpretable word embeddings from streaming text data. Experiments show that our model consistently outperforms state-of-the-art word embedding methods in both representation ability and interpretability. The source code of this paper can be obtained...
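Since the abstract describes the NMF approach only at a high level, here is a minimal batch sketch of the underlying idea: factorize a non-negative word co-occurrence statistic so that each embedding dimension can be read off through its top-weighted words. The toy corpus, the PPMI weighting, the window size, and the use of scikit-learn's NMF are all illustrative assumptions; the paper's actual contribution is an online algorithm that learns from streaming text precisely to avoid materializing the global matrix built below.

```python
# Batch sketch of NMF-based interpretable word embeddings
# (illustrative; not the paper's online streaming algorithm).
import numpy as np
from sklearn.decomposition import NMF

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "stocks fell on the market".split(),
    "the market rallied as stocks rose".split(),
]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-2 word window.
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 2), min(len(sent), i + 3)):
            if j != i:
                counts[idx[w], idx[sent[j]]] += 1

# Positive pointwise mutual information (PPMI): a standard
# non-negative reweighting of raw co-occurrence counts.
total = counts.sum()
pw = counts.sum(axis=1, keepdims=True) / total
pc = counts.sum(axis=0, keepdims=True) / total
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((counts / total) / (pw * pc))
ppmi = np.where(np.isfinite(pmi), np.maximum(pmi, 0.0), 0.0)

# Non-negative factorization: rows of W are the word embeddings.
model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(ppmi)

# Interpretability check: the words loading most on each dimension.
for k in range(W.shape[1]):
    top = np.argsort(W[:, k])[::-1][:3]
    print(f"dim {k}: {[vocab[i] for i in top]}")
```

Because W is constrained to be non-negative, each dimension can only accumulate mass from related words rather than cancel contributions against each other, which is what makes the printed top-word lists readable as topics.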
This paper proposes a deep hierarchical Non-negative Matrix Factorization (NMF) method with Skip-Gra...
Prediction without justification has limited utility. Much of the success of neural models can be at...
Pre-trained word vectors are ubiquitous in Natural Language Processing applications. In this paper, ...
Word embeddings are vectorial semantic representations built with either counting or predicting tech...
The digital era floods us with an excessive amount of text data. To make sense of such data automati...
Recently significant advances have been witnessed in the area of distributed word representations ba...
Word embeddings, which represent words as dense feature vectors, are widely used in natural language...
Word Embeddings are low-dimensional distributed representations that encompass a set of language mod...
We propose some better word embedding models based on vLBL model and ivLBL model by sharing represen...
Deep learning models have become state of the art for natural language processing (NLP) tasks, howev...
Despite the growing interest in prediction-based word embedding learning methods, it remains unclear...
Word embeddings are a key component of high-performing natural language processing (NLP) systems, bu...
Word embedding, which encodes words into vectors, is an important starting point in natural language...
The DEDICOM algorithm provides a uniquely interpretable matrix factorization m...
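For context on the last entry: DEDICOM (Harshman's DEcomposition into DIrectional COMponents) factorizes a square asymmetric matrix A as A ≈ W R Wᵀ, with a single shared loading matrix W and a small matrix R capturing asymmetric interactions between factors, which is where its interpretability comes from. The sketch below fits this model by plain gradient descent on a random toy matrix; the sizes, learning rate, and iteration count are illustrative assumptions, not details from the cited paper.

```python
# Toy gradient-descent fit of the DEDICOM model A ≈ W R Wᵀ.
import numpy as np

rng = np.random.default_rng(0)
n, k = 8, 3
A = rng.random((n, n))           # toy asymmetric relationship matrix

W = 0.1 * rng.random((n, k))     # shared loadings, one row per item
R = 0.1 * rng.random((k, k))     # asymmetric factor-to-factor interactions
print("initial error:", np.linalg.norm(A - W @ R @ W.T))

lr = 0.001
for _ in range(20000):
    E = W @ R @ W.T - A                 # reconstruction residual
    gW = E @ W @ R.T + E.T @ W @ R      # gradient of 0.5*||E||_F^2 in W
    gR = W.T @ E @ W                    # gradient in R
    W -= lr * gW
    R -= lr * gR

print("final error:", np.linalg.norm(A - W @ R @ W.T))
```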