This chapter presents an overview of the state of the art in natural language processing, focusing on one computational architecture, the Transformer model, which plays a central role in a wide range of applications. This architecture condenses many advances in neural learning methods and can be exploited in many ways: to learn representations for linguistic entities; to generate coherent utterances and answer questions; and to perform utterance transformations, an illustration being automatic translation. These different facets of the architecture are presented in turn, which also allows us to discuss its limitations.
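The mechanism at the heart of the Transformer architecture is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. The following pure-Python toy is a minimal sketch of that computation (the names `softmax` and `attention` are illustrative, not taken from any particular library), operating on small lists of vectors rather than tensors.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V.

    Q, K, V are lists of equal-length float vectors; each output row is a
    convex combination of the rows of V, weighted by query-key similarity.
    """
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Toy example: one query attending over two key/value pairs.
print(attention([[1.0, 0.0]],
                [[1.0, 0.0], [0.0, 1.0]],
                [[2.0, 0.0], [0.0, 2.0]]))
```

In the full architecture this operation is applied in parallel over several learned projections ("multi-head" attention) and stacked in layers, but the weighted-average view above is the core idea.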
Introducing factors such as linguistic features has long been proposed in machine translation to imp...
Pre-trained models used in the transfer-learning scenario have recently become very popular. Such m...
Paper presented at: 14th Workshop on Building and Using Comparable Corpora (BUCC), held on ...
The goal of my thesis is to investigate the most influential transformer architectures and to apply ...
Deep learning has the potential to help solve numerous problems in cognitive science and education, b...
The article is an essay on the development of technologies for natural language processing, which fo...
With the recent developments in the field of Natural Language Processing, there has been a rise in t...
In 2017, Vaswani et al. proposed a new neural network architecture named Transformer. That modern ar...
These improvements open up many possibilities for solving downstream Natural Language Processing tasks. ...
In the last few years, the natural language processing community has witnessed...
Speech Translation has been traditionally addressed with the concatenation of two tasks: Speech Reco...
The utility of linguistic annotation in neural machine translation seemed to have been established in...
Transformers have been established as one of the most effective neural approaches for performing variou...