In the literature, tensors have been effectively used to capture context information in language models. However, existing methods usually adopt relatively low-order tensors, which have limited expressive power in modeling language. Developing a higher-order tensor representation is challenging, both in deriving an effective solution and in showing its generality. In this paper, we propose a language model named the Tensor Space Language Model (TSLM), which utilizes tensor networks and tensor decomposition. In TSLM, we build a high-dimensional semantic space via the tensor product of word vectors. Theoretically, we prove that such a tensor representation is a generalization of the n-gram language model. We further show that t...
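The tensor-product construction described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the toy 3-dimensional word vectors below are hypothetical, and only the tensor product itself follows the abstract's description.

```python
import numpy as np

# Hypothetical toy word embeddings (3-dimensional for illustration).
v_the = np.array([0.1, 0.5, 0.2])
v_cat = np.array([0.7, 0.1, 0.3])
v_sat = np.array([0.2, 0.6, 0.4])

# The tensor product of n word vectors yields an order-n tensor:
# T[i, j, k] = v_the[i] * v_cat[j] * v_sat[k].
T = np.einsum('i,j,k->ijk', v_the, v_cat, v_sat)

print(T.shape)  # (3, 3, 3): the space grows exponentially with n,
# which is why tensor networks/decompositions are needed in practice.
```

The exponential growth of this representation motivates the tensor-network and tensor-decomposition machinery the abstract mentions.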
We provide a comparative study between neural word representations and traditional vector spaces ba...
Recent work has shown that compositional-distributional models using element-wise operations on con...
Neural network language models (NNLMs) have achieved ever-improving accuracy due to more sophisticat...
In this paper, we propose a text representation model, Tensor Space Model (TSM), which models the te...
This paper investigates the learning of 3rd-order tensors representing the semantics of transitive ...
Models of word meaning, built from a corpus of text, have demonstrated success in emulating human pe...
This paper develops and evaluates an enhanced corpus based approach for semantic processing. Corpus ...
While tensor factorizations have become increasingly popular for learning on various form...
A tensor network is a type of decomposition used to express and approximate large arrays of data. A ...
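To make the idea of approximating a large array by a decomposition concrete, here is a small CP-style sketch: an order-3 tensor expressed as a sum of rank-1 terms (outer products of factor vectors). The factor vectors below are made-up illustrative values, not drawn from any cited work.

```python
import numpy as np

# A rank-2 CP-style decomposition: T is a sum of two rank-1 tensors,
# each the outer product of three factor vectors.
a = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
b = [np.array([2.0, 1.0]), np.array([1.0, 3.0])]
c = [np.array([0.5, 0.5]), np.array([1.0, 0.0])]

T = sum(np.einsum('i,j,k->ijk', a[r], b[r], c[r]) for r in range(2))

# For a tensor of shape (d, d, d), the dense form needs d**3 numbers,
# while a rank-R factorization needs only 3*R*d -- exponentially less
# as the tensor order grows.
print(T.shape)  # (2, 2, 2)
```

The same principle, storing factors instead of the full array, underlies the tensor-network representations the surveyed works apply to machine learning.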
Most machine learning models for structured data encode the structural knowledge of a node by levera...
The paper surveys the topic of tensor decompositions in modern machine learning applications. It foc...
We present a novel compositional, generative model for vector space representations of meaning. Th...
We consider the problem of text representation and categorization. Conventionally, a text document i...
Rich semantic representations of linguistic data are an essential component to the development of ma...
Categorical compositional distributional models unify compositional formal semantic models and distr...