Combined with neural language models, distributed word representations offer significant advantages in computational linguistics and text mining. Most existing models estimate distributed word vectors from large-scale data in an unsupervised fashion and therefore do not take rich linguistic knowledge into consideration. Linguistic knowledge can be represented as either link-based knowledge or preference-based knowledge, and we propose knowledge regularized word representation models (KRWR) to incorporate this prior knowledge when learning distributed word representations. Experimental results demonstrate that our estimated word representations achieve better performance on the task of semantic relatedness ranking. This indicates that our met...
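The core idea of knowledge regularization can be sketched as adding a penalty term to a corpus-based embedding objective that pulls the vectors of linked words together. The snippet below is a minimal, hypothetical illustration (not the paper's actual model): it defines a link-based regularizer as the sum of squared distances between linked word vectors and runs plain gradient descent on that term alone; the vocabulary, the `links` pairs, and the weight `lam` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary and randomly initialized word vectors (illustrative only).
vocab = ["king", "queen", "man", "woman", "car"]
idx = {w: i for i, w in enumerate(vocab)}
dim = 8
E = rng.normal(scale=0.1, size=(len(vocab), dim))

# Hypothetical link-based knowledge: pairs of semantically related words,
# e.g. drawn from a lexicon or knowledge graph.
links = [("king", "queen"), ("man", "woman")]

def link_regularizer(E):
    """Sum of squared Euclidean distances between linked word vectors."""
    return sum(np.sum((E[idx[a]] - E[idx[b]]) ** 2) for a, b in links)

def reg_gradient(E):
    """Gradient of the link regularizer w.r.t. the embedding matrix."""
    g = np.zeros_like(E)
    for a, b in links:
        diff = E[idx[a]] - E[idx[b]]
        g[idx[a]] += 2 * diff
        g[idx[b]] -= 2 * diff
    return g

# Gradient-descent steps on the regularizer alone. In a full model this term
# would be added, weighted by lam, to the corpus-based embedding loss so that
# linked words end up with nearby vectors without ignoring corpus statistics.
lam, lr = 1.0, 0.1
before = link_regularizer(E)
for _ in range(50):
    E -= lr * lam * reg_gradient(E)
after = link_regularizer(E)
```

After the updates, `after` is strictly smaller than `before`: the linked pairs have been drawn closer together, which is the effect the regularizer contributes inside the combined objective.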
Distributed representations of meaning are a natural way to encode covariance relationships between ...
Distributed language representation has become the most widely used technique for language represent...
How to properly represent language is a crucial and fundamental problem in Natural Language Processi...
In this paper, we propose an approach for enhancing word representations twice based on large-scale ...
Vector-space distributed representations of words can capture syntactic and semantic regularities in...
Machine learning of distributed word representations with neural embeddings is a state-of-the-art ap...
Methods for representing the meaning of words in vector spaces purely using the information distribu...
Methods for learning word representations using large text corpora have received much attention late...
Most distributional lexico-semantic models derive their representations based on external language r...
We propose a new method for learning word representations using hierarchical regularization in spars...
The mathematical representation of semantics is a key issue for Natural Language Processing (NLP). A...
In this paper we propose a general framework for learning distributed represen-tations of attributes...
In recent years, there has been an increasing interest in learning a distributed representation of w...
We propose a new model for learning bilingual word representations from non-parallel document-aligne...
Representation learning is a research area within machine learning and natural language processing (...