Many natural language processing applications rely on word representations (also called word embeddings) to achieve state-of-the-art results. These numerical representations of language should encode both syntactic and semantic information to perform well in downstream tasks. However, common models (word2vec, GloVe) learn them from generic corpora such as Wikipedia, so the representations lack domain-specific semantic information. Moreover, storing them requires a large amount of memory, since the number of representations to save can be on the order of a million. The topic of my thesis is to develop new learning algorithms that both improve the semantic information encoded within the representations and reduce the memory space required to store them.
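To put the storage cost in perspective, here is a back-of-the-envelope sketch; the vocabulary size and dimensionality below are illustrative assumptions, not figures taken from the thesis:

# Rough memory estimate for a dense word embedding matrix,
# assuming a vocabulary of 1 million words and 300-dimensional
# float32 vectors (common word2vec/GloVe settings).
vocab_size = 1_000_000    # number of word representations to store
dim = 300                 # embedding dimensionality (assumed)
bytes_per_value = 4       # float32
total_bytes = vocab_size * dim * bytes_per_value
print(f"{total_bytes / 2**30:.2f} GiB")   # about 1.12 GiB before any compression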
Thesis (D.Phil.)--Stellenbosch University, 2016. ENGLISH ABSTRACT: In contrast to only a decade ago, i...
Learning word embeddings on large unlabeled corpora has been shown to be succe...
Word embedding models have been an important contribution to natural language processing; following ...
Word embedding is a feature learning technique which aims at mapping words from a vocabulary into ve...
Word embeddings are commonly used as a starting point in many NLP models to achieve state-of-the-art...
Distributional semantics has been revolutionized by neural-based word embedding methods such as wor...
Distilling knowledge from a well-trained cumbersome network to a small one has recently become a new...
Word embeddings are used as building blocks for a wide range of natural language processing and info...
Despite the large diffusion and use of embeddings generated through Word2Vec, there are still...
The digital era floods us with an excessive amount of text data. To make sense of such data automati...
Distributed word representations are popularly used in many tasks in natural l...
Word embedding algorithms produce very reliable feature representations of words that are used by ne...