We propose a new neural model for word embeddings, which uses unitary matrices as the primary device for encoding lexical information. It uses simple matrix multiplication to derive matrices for larger units, yielding a sentence processing model that is strictly compositional, does not lose information over time steps, and is transparent, in the sense that word embeddings can be analysed regardless of context. This model does not employ activation functions, and so the network is fully accessible to analysis by the methods of linear algebra at each point in its operation on an input sequence. We test it in two NLP agreement tasks and obtain rule-like perfect accuracy, with greater stability than current state-of-the-art systems. Our propos...
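The core claim above rests on two standard linear-algebra facts: a product of unitary matrices is itself unitary, and applying a unitary matrix to a vector preserves its norm, so no information is lost over time steps. The following is a minimal NumPy sketch of that idea; the function and variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def random_unitary(n, rng):
    # QR decomposition of a random complex Gaussian matrix yields a unitary Q.
    z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    q, r = np.linalg.qr(z)
    # Rescale columns by unit-modulus phases; the result stays unitary.
    d = np.diag(r)
    return q * (d / np.abs(d))

rng = np.random.default_rng(0)
n = 4

# Hypothetical word embeddings: one unitary matrix per word.
word_a = random_unitary(n, rng)
word_b = random_unitary(n, rng)

# Composition for a larger unit is plain matrix multiplication;
# the product of unitary matrices is again unitary.
phrase = word_b @ word_a
assert np.allclose(phrase.conj().T @ phrase, np.eye(n))

# Norm preservation: a state vector keeps its length under the
# phrase matrix, so no information decays across time steps.
h = rng.standard_normal(n) + 1j * rng.standard_normal(n)
assert np.isclose(np.linalg.norm(phrase @ h), np.linalg.norm(h))
```

Because no nonlinear activation intervenes, each word's contribution remains a fixed linear operator that can be inspected in isolation, which is what makes the embeddings analysable regardless of context.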
Feature representation has been one of the most important factors for the success of machine learnin...
The present paper intends to draw the conception of language implied in the technique of word embedd...
Distilling knowledge from a well-trained cumbersome network to a small one has recently become a new...
The emergence of powerful deep learning systems has largely displaced classical symbolic algebraic...
Introduction Word embeddings, which are distributed word representations learned by neural language ...
Word representation or word embedding is an important step in understanding languages. It maps simil...
For a long time, natural language processing (NLP) has relied on generative models with task specifi...
The recent tremendous success of unsupervised word embeddings in a multitude of applications raises ...
Natural language processing (NLP) is one of the most important technologies of the information age. ...
Recent advances in deep learning have provided fruitful applications for natural language processing...
Word embedding is a feature learning technique which aims at mapping words from a vocabulary into ve...
Many modern NLP systems rely on word embeddings, previously trained in an unsupervised manner on lar...
When the field of natural language processing (NLP) entered the era of deep neural networks, the tas...
In this project we make a study on universal language agnostic sentence embeddings: internal neural ...
Historically, models of human language assume that sentences have a symbolic structure and that this...