Prompt-based learning has been an effective paradigm for large pretrained language models (LLMs), enabling few-shot or even zero-shot learning. Black-box prompt search has recently received growing interest for its distinctive property of gradient-free optimization, which makes it particularly useful for model-as-a-service usage. However, the discrete nature and combinatorial complexity of the search hinder the efficiency of modern black-box approaches. Despite extensive research on search algorithms, the crucial aspect of search space design and optimization has been largely overlooked. In this paper, we first conduct a sensitivity analysis by prompting LLMs, revealing that only a small number of tokens exert a disproportionate ...
The SEARCH (Search Envisioned As Relation & Class Hierarchizing) framework developed elsewhere (...
Dense retrieval (DR) converts queries and documents into dense embeddings and measures the similarit...
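The dense retrieval snippet above describes the core scoring step: encode queries and documents as dense vectors, then rank documents by embedding similarity. A minimal sketch, using random toy vectors in place of real encoder outputs (the encoder, dimensions, and cosine choice here are illustrative assumptions, not from the cited work):

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between rows of a and rows of b."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

# Toy stand-ins for encoder outputs: 1 query, 5 documents, 8-dim embeddings.
rng = np.random.default_rng(0)
query_emb = rng.normal(size=(1, 8))
doc_embs = rng.normal(size=(5, 8))

scores = cosine_sim(query_emb, doc_embs)  # shape (1, 5)
ranked = np.argsort(-scores[0])           # document indices, best match first
```

In practice the similarity is often a plain dot product over normalized embeddings, served by an approximate nearest-neighbor index rather than a dense matrix product.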
Direct Multisearch (DMS) and MultiGLODS are two derivative-free solvers for ap...
Black-Box Tuning (BBT) is a derivative-free approach to optimize continuous prompt tokens prepended ...
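The BBT snippet describes derivative-free optimization of continuous prompt tokens: a low-dimensional vector is projected into the prompt space by a fixed random matrix and tuned using only black-box function values. A toy sketch under stated assumptions: BBT itself uses CMA-ES against model-as-a-service scores; here simple random search and a quadratic loss stand in for the optimizer and the model:

```python
import numpy as np

rng = np.random.default_rng(0)
d, D = 4, 32                      # toy intrinsic dim and full prompt dim
A = rng.normal(size=(D, d))       # fixed random projection into prompt space

def blackbox_loss(prompt):
    # Stand-in for a model-as-a-service score: only function values,
    # no gradients, are available to the optimizer.
    target = np.ones(D)
    return float(np.sum((prompt - target) ** 2))

z = np.zeros(d)                   # low-dimensional variable being tuned
best = blackbox_loss(A @ z)
for _ in range(200):
    cand = z + 0.3 * rng.normal(size=d)   # random-search proposal
    loss = blackbox_loss(A @ cand)
    if loss < best:                        # keep only improving steps
        z, best = cand, loss
```

The key design point is that optimization happens in the small d-dimensional subspace, which keeps gradient-free search tractable even when the full prompt space is large.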
Since the emergence of large language models, prompt learning has become a popular method for optimi...
Enhancing the zero-shot performance of instruction-following models requires heavy computation, eith...
Prompt tuning learns soft prompts to condition frozen Pre-trained Language Models (PLMs) for perform...
Domain-specific text classification faces the challenge of scarce labeled data due to the high cost ...
Through in-context learning (ICL), large-scale language models are effective few-shot learners witho...
Large language models~(LLMs) are instruction followers, but it can be challenging to find the best i...
This paper presents AutoHint, a novel framework for automatic prompt engineering and optimization fo...
Why can pre-trained language models (PLMs) learn universal representations and effectively adapt to ...
Prompt learning is a new paradigm in the Natural Language Processing (NLP) field which has shown imp...
We propose a multitask pretraining approach ZeroPrompt for zero-shot generalization, focusing on tas...
In this work, we propose a simple method that applies a large language model (LLM) to large-scale re...