Thanks to deep neural networks and the availability of large-scale labeled training datasets, NLP has yielded results on a wide range of real-world tasks that were unimaginable only a few years ago. However, existing supervised methods rest on the unscalable assumption that labeled training data is available for every class: acquiring such data is prohibitively laborious and expensive. Zero-shot (or unsupervised) models that can seamlessly adapt to new, unseen classes are therefore indispensable if NLP methods are to work effectively in real-world applications, since they mitigate (or eliminate) the need to collect and annotate data for each domain. This dissertation addresses three critical NLP problems in contexts wher...
Computational linguistics explores how human language is interpreted automatically and then processe...
The rapid adoption of low-code/no-code software systems has reshaped the landscape of software devel...
Pretrained language models (PLMs) have demonstrated remarkable performance in various natural langua...
Instruction tuning is an emergent paradigm in NLP wherein natural language instructions are leverage...
The natural language processing field has seen task-oriented dialog systems emerge as a strong area ...
Pretraining deep neural networks to perform language modeling - that is, to reconstruct missing word...
In recent years, deep learning has made substantial improvements in various fields like image unders...
Large language models have recently been shown to attain reasonable zero-shot ...
This paper addresses zero-shot slot filling, which tries to build a system that can generalize to un...
One of the most impressive results of recent NLP history is the ability of pre-trained language mode...
Semantic parsing is an important NLP problem, particularly for voice assistants such as Alexa and Go...
Intent classification is known to be a complex problem in Natural Language Processing (NLP) research...
This thesis presents resources capable of enhancing solutions of some Natural Language Processing (N...
Conversational AI has seen tremendous progress in recent years, achieving near-human or even surpass...