<p>We propose a new method for learning word representations using hierarchical regularization in sparse coding inspired by the linguistic study of word meanings. We show an efficient learning algorithm based on stochastic proximal methods that is significantly faster than previous approaches, making it possible to perform hierarchical sparse coding on a corpus of billions of word tokens. Experiments on various benchmark tasks—word similarity ranking, syntactic and semantic analogies, sentence completion, and sentiment analysis—demonstrate that the method outperforms or is competitive with state-of-the-art methods.</p>
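The abstract does not spell out the hierarchical regularizer or the stochastic proximal solver, but as background, the basic building block — sparse coding with an l1 penalty solved by proximal-gradient (soft-thresholding) updates — can be sketched as follows. All names, dictionary sizes, and step sizes here are illustrative, not the paper's actual configuration:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the l1 norm: shrink each coordinate toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_code(D, x, lam=0.05, lr=0.005, steps=500):
    """Encode one signal x against a fixed dictionary D (ISTA-style):
    minimize 0.5 * ||x - D a||^2 + lam * ||a||_1 over the code a."""
    a = np.zeros(D.shape[1])
    for _ in range(steps):
        grad = D.T @ (D @ a - x)                      # gradient of the smooth term
        a = soft_threshold(a - lr * grad, lr * lam)   # proximal (shrinkage) step
    return a

# Toy usage: encode a signal generated from a sparse code under a random dictionary.
rng = np.random.default_rng(0)
D = rng.normal(size=(20, 50))          # overcomplete dictionary: 50 atoms in 20 dims
true_a = np.zeros(50)
true_a[[3, 17]] = [1.0, -2.0]
x = D @ true_a
a_hat = sparse_code(D, x)
```

A hierarchical variant, as in the paper, would replace the plain l1 proximal operator with one derived from a structured (tree-organized) group penalty, and process examples stochastically rather than one fixed signal at a time.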