All You Need is One: Capsule Prompt Tuning with a Single Vector
By: Yiyang Liu, James C. Liang, Heng Fan, and more
Potential Business Impact:
Makes AI understand tasks better with less effort.
Prompt-based learning has emerged as a parameter-efficient fine-tuning (PEFT) approach that facilitates Large Language Model (LLM) adaptation to downstream tasks by conditioning generation on task-aware guidance. Despite its successes, current prompt-based learning methods rely heavily on laborious grid search for the optimal prompt length and typically require a considerable number of prompts, introducing additional computational burden. Worse yet, our pioneering findings indicate that task-aware prompt design is inherently limited by its absence of instance-aware information, leading to a subtle attention interplay with the input sequence. In contrast, simply incorporating instance-aware information as part of the guidance can enhance prompt-tuned model performance without additional fine-tuning. Moreover, we observe an interesting phenomenon, which we term the "attention anchor": placing instance-aware tokens at the earliest position of the sequence preserves strong attention to critical structural information and exhibits more active attention interaction with all input tokens. In light of these observations, we introduce Capsule Prompt-Tuning (CaPT), an efficient and effective solution that injects off-the-shelf, informative instance semantics into prompt-based learning. Our approach integrates both instance-aware and task-aware information in a nearly parameter-free manner (i.e., a single capsule prompt). Empirical results demonstrate that our method achieves superior performance across various language tasks (e.g., 84.03% average accuracy on T5-Large), with the capsule prompt serving as an "attention anchor," while enjoying high parameter efficiency (e.g., 0.003% of model parameters on Llama3.2-1B).
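The abstract does not spell out implementation details, but the core idea (one trainable prompt vector that fuses task-aware and instance-aware information, prepended at the earliest sequence position) can be sketched concretely. Below is a minimal, hypothetical PyTorch illustration; the gated fusion of a learned task vector with mean-pooled instance embeddings is an assumption for illustration, not the paper's actual mechanism.

```python
import torch
import torch.nn as nn

class CapsulePromptSketch(nn.Module):
    """Illustrative sketch (NOT the paper's implementation): fuse a single
    learned task vector with pooled instance embeddings into one "capsule
    prompt" and prepend it to the input sequence."""

    def __init__(self, hidden_size: int):
        super().__init__()
        # Task-aware component: one trainable vector -- the only sizable
        # parameter block, consistent with a "nearly parameter-free" budget.
        self.task_vec = nn.Parameter(torch.randn(hidden_size) * 0.02)
        # Hypothetical fusion gate between task and instance information.
        self.gate = nn.Linear(hidden_size, 1)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, hidden) from the frozen LLM embedder.
        # Instance-aware component: mean-pool the input embeddings.
        instance_vec = input_embeds.mean(dim=1)            # (batch, hidden)
        alpha = torch.sigmoid(self.gate(instance_vec))     # (batch, 1)
        capsule = alpha * self.task_vec + (1 - alpha) * instance_vec
        # Prepend the single capsule prompt at the earliest position,
        # where it can act as an "attention anchor" for all input tokens.
        return torch.cat([capsule.unsqueeze(1), input_embeds], dim=1)

# Usage sketch: the augmented embeddings would be fed to a frozen LLM,
# training only the capsule's parameters.
prompter = CapsulePromptSketch(hidden_size=768)
embeds = torch.randn(4, 32, 768)                  # dummy batch of embeddings
augmented = prompter(embeds)                      # (4, 33, 768)
```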
Similar Papers
IAP: Improving Continual Learning of Vision-Language Models via Instance-Aware Prompting
CV and Pattern Recognition
Helps AI learn new things without forgetting old ones.
CAPT: Class-Aware Prompt Tuning for Federated Long-Tailed Learning with Vision-Language Model
Machine Learning (CS)
Teaches computers to learn from messy, uneven data.
CBP-Tuning: Efficient Local Customization for Black-box Large Language Models
Computation and Language
Lets AI learn new things privately on your computer.