Word embedding methods represent words as vectors in a space structured by word co-occurrences, so that words with close meanings are close in this space. These vectors are then provided as input to automatic systems that solve natural language processing problems. Because interpretability is a necessary condition for trusting such systems, the interpretability of embedding spaces, the first link in the chain, is an important issue. In this paper, we therefore evaluate the interpretability of vectors extracted with two approaches: SPINE, a k-sparse auto-encoder, and SINr, a graph-based method. This evaluation is based on a Word Intrusion Task with human annotators. It is operated using a large French corpus, and ...
Word Embeddings have proven to be effective for many Natural Language Processi...
Paper presented at: 56th Annual Meeting of the Association for Computational Linguistics celeb...
Frontier Prize (best paper). While graph embedding aims at learning low-dimensio...
Word embeddings serve as a useful resource for many downstream natural language processing tasks. T...
While word embeddings have proven to be highly useful in many NLP tasks, they are difficult to inter...
Word embeddings intervene in a wide range of natural language processing tasks...
Word embeddings are useful for a wide variety of tasks, but they lack interpretability. By rotatin...
A lot of current semantic NLP tasks use semi-automatically collected data, tha...
Contextualised embeddings such as BERT have become de facto state-of-the-art r...
Prediction without justification has limited utility. Much of the success of neural models can be at...
The digital era floods us with an excessive amount of text data. To make sense of such data automati...
In recent years it has become clear that data is the new resource of power and richness. The compani...
Real-valued word embeddings have transformed natural language processing (NLP) applications, recogni...
Word embeddings have developed into a major NLP tool with broad applicability. Understanding the sem...