Prompts have been shown to be an effective method to adapt a frozen Pretrained Language Model (PLM) to perform well on downstream tasks. Prompts can be represented by a human-engineered word sequence or by a learned continuous embedding. In this work, we investigate conditional and compositional differentiable prompting. We propose a new model, Prompt Production System (PRopS), which learns to transform task instructions or input metadata into continuous prompts that elicit task-specific outputs from the PLM. Our model uses a modular network structure based on our neural formulation of Production Systems, which allows the model to learn discrete rules -- neural functions that learn to specialize in transforming particular prompt input patterns...
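As a concrete illustration, below is a minimal PyTorch sketch of the conditional prompt-production idea described in the abstract: a small set of rule networks each maps an encoded condition (task instruction or metadata) to a block of continuous prompt vectors, and a learned attention over rule keys decides which rule applies. All names, dimensions, and the soft (attention-weighted) rule selection are illustrative assumptions, not the paper's released PRopS implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PromptProductionSketch(nn.Module):
    """Hypothetical sketch of a production-system-style prompt generator.

    Each "rule" is an independent MLP producing prompt_len * prompt_dim
    values; a learned key per rule is matched against the encoded input
    condition to weight the rules. Dimensions and the soft selection are
    assumptions for illustration only.
    """

    def __init__(self, cond_dim=768, prompt_len=8, prompt_dim=768, num_rules=4):
        super().__init__()
        self.prompt_len, self.prompt_dim = prompt_len, prompt_dim
        # One learned key per rule, used to match conditions to rules.
        self.rule_keys = nn.Parameter(torch.randn(num_rules, cond_dim))
        # Each rule is a small MLP that emits a flattened prompt block.
        self.rules = nn.ModuleList(
            nn.Sequential(
                nn.Linear(cond_dim, cond_dim),
                nn.ReLU(),
                nn.Linear(cond_dim, prompt_len * prompt_dim),
            )
            for _ in range(num_rules)
        )

    def forward(self, cond):
        # cond: (batch, cond_dim) -- an encoded instruction or metadata vector.
        scores = cond @ self.rule_keys.t()             # (batch, num_rules)
        weights = F.softmax(scores, dim=-1)            # soft rule selection
        # Every rule produces a candidate prompt; combine by the weights.
        outs = torch.stack([r(cond) for r in self.rules], dim=1)
        prompt = (weights.unsqueeze(-1) * outs).sum(dim=1)
        # Reshape into a sequence of continuous prompt embeddings that can
        # be prepended to a frozen PLM's input embeddings.
        return prompt.view(-1, self.prompt_len, self.prompt_dim)

# Usage: produce prompts for a batch of two encoded instructions.
props = PromptProductionSketch()
prompts = props(torch.randn(2, 768))
print(prompts.shape)  # torch.Size([2, 8, 768])
```

The abstract's discrete rules could instead be realized with a hard top-1 or straight-through rule choice; the soft mixture above is simply the most direct differentiable stand-in.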
Prompt-based learning has shown considerable promise in reformulating various downstream tasks as cl...
The meanings of words and phrases depend not only on where they are used (contexts) but also on who ...
The human ability to understand the world in terms of reusable "building blocks" allows us t...
Recent works have shown that attaching prompts to the input is effective at conditioning Language Mo...
Large language models (LLMs) offer potential as a source of knowledge for agents that need to acquire new ...
Neural language models have drastically changed the landscape of natural language processing (NLP). ...
Large language models (LLMs) transfer well to new tasks out-of-the-box simply given a natural langua...
We explore the idea of compressing the prompts used to condition language models, and show that comp...
Pretrained language models (PLMs) have made remarkable progress in text generation tasks via fine-tu...
Prompting has shown impressive success in enabling large pretrained language models (LMs) to perform...
Paper presented at the Conference on Empirical Methods in Natural Language Processing (EMNLP 20...
While Pre-trained Language Models (PLMs) internalize a great amount of world knowledge, they have be...
In recent years, there has been significant progress in developing pre-trained language models for N...
Fine-tuning continuous prompts for target tasks has recently emerged as a compact alternative to ful...
Probing Pre-trained Language Models (PLMs) using prompts has indirectly implied that language models...