In the last few years, the natural language processing community has witnessed advances in neural representations of free texts with transformer-based language models (LMs). Given the importance of knowledge available in tabular data, recent research efforts extend LMs by developing neural representations for structured data. In this work, we present a survey that analyzes these efforts. We first abstract the different systems according to a traditional machine learning pipeline in terms of training data, input representation, model training, and supported downstream tasks. For each aspect, we characterize and compare the proposed solutions. Finally, we discuss future work directions.
Artificial neural networks have obtained astonishing results in a diverse number of tasks. One of the...
The utility of linguistic annotation in neural machine translation seemed to have been established in...
This open access book provides an overview of the recent advances in representation learning theory,...
This chapter presents an overview of the state of the art in natural language processing, exploring ...
In the last few decades, text mining has been used to extract knowledge from free texts. Applying ne...
Current research state-of-the-art in automatic data-to-text generation, a major task in natural lang...
Today, there are many text data processing and classification models available, and the demand for a...
Unsupervised learning of text representations aims at converting natural languages into vector represen...
The utilisation of automated classification tools from the field of Natural Language Processing (NLP...
Natural language processing (NLP) techniques have significantly improved with the introduction of pre-trained l...
The goal of my thesis is to investigate the most influential transformer architectures and to apply ...
In 2017, Vaswani et al. proposed a new neural network architecture named the Transformer. That modern ar...
How to properly represent language is a crucial and fundamental problem in Natural Language Processi...