Fine-Tuned In-Context Learners for Efficient Adaptation
By: Jorg Bornschein, Clare Lyle, Yazhe Li, and more
When adapting large language models (LLMs) to a specific downstream task, two primary approaches are commonly employed: (1) prompt engineering, often with in-context few-shot learning, leveraging the model's inherent generalization abilities, and (2) fine-tuning on task-specific data, directly optimizing the model's parameters. While prompt-based methods excel in few-shot scenarios, their effectiveness often plateaus as more data becomes available. Conversely, fine-tuning scales well with data but may underperform when training examples are scarce. We investigate a unified approach that bridges these two paradigms by incorporating in-context learning directly into the fine-tuning process. Specifically, we fine-tune the model on task-specific data augmented with in-context examples, mimicking the structure of k-shot prompts. This approach, while requiring per-task fine-tuning, combines the sample efficiency of in-context learning with the performance gains of fine-tuning, leading to a method that consistently matches and often significantly exceeds both of these baselines. To perform hyperparameter selection in the low-data regime, we propose to use prequential evaluation, which eliminates the need for expensive cross-validation and leverages all available data for training while simultaneously providing a robust validation signal. We conduct an extensive empirical study to determine which adaptation paradigm (fine-tuning, in-context learning, or our proposed unified approach) offers the best predictive performance on concrete downstream tasks.
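To make the unified approach concrete, here is a minimal sketch of how fine-tuning data mimicking k-shot prompts could be constructed. The template, the separator, and the helper names (format_example, build_kshot_training_set) are illustrative assumptions, not the paper's actual implementation; the paper only specifies that each training example is augmented with in-context examples in a k-shot prompt structure.

```python
# Hedged sketch: turn (input, target) pairs into k-shot-style fine-tuning
# examples. All names and the prompt template are assumptions.
import random

SEP = "\n\n"  # assumed separator between in-context demonstrations

def format_example(x: str, y: str) -> str:
    """Render one (input, target) pair in a simple prompt template."""
    return f"Input: {x}\nOutput: {y}"

def build_kshot_training_set(pairs, k: int, seed: int = 0):
    """For each training pair, prepend k other pairs as in-context
    demonstrations, yielding (prompt, target) examples for fine-tuning.
    The fine-tuning loss would typically apply to the target tokens only."""
    rng = random.Random(seed)
    examples = []
    for i, (x, y) in enumerate(pairs):
        others = pairs[:i] + pairs[i + 1:]          # never use the target as its own shot
        shots = rng.sample(others, min(k, len(others)))
        context = SEP.join(format_example(sx, sy) for sx, sy in shots)
        prompt = f"{context}{SEP}Input: {x}\nOutput:"
        examples.append((prompt, f" {y}"))
    return examples

if __name__ == "__main__":
    data = [("2+2", "4"), ("3+5", "8"), ("7-2", "5"), ("9+1", "10")]
    prompt, target = build_kshot_training_set(data, k=2)[0]
    print(prompt + target)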
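Prequential ("predictive sequential") evaluation, which the abstract proposes for hyperparameter selection, scores each portion of the data with a model trained only on the data seen before it, then folds that portion into the training set. Every example thus serves once as held-out validation data and afterwards as training data, with no separate validation split. The sketch below assumes a caller-supplied train_and_eval routine (hypothetical, standing in for fine-tuning plus scoring); the chunking scheme is likewise an assumption.

```python
# Hedged sketch of prequential evaluation for hyperparameter selection.
from typing import Callable, Sequence

def prequential_score(
    data: Sequence,
    train_and_eval: Callable[[Sequence, Sequence], float],
    chunk_size: int = 8,
) -> float:
    """Score each chunk with a model fit on all earlier data, then fold
    the chunk into the training set. Returns the average per-example loss
    accumulated over the sequential passes."""
    total, n_scored = 0.0, 0
    for start in range(chunk_size, len(data), chunk_size):
        seen = data[:start]                        # training data so far
        chunk = data[start:start + chunk_size]     # next held-out chunk
        total += train_and_eval(seen, chunk) * len(chunk)
        n_scored += len(chunk)
    return total / max(n_scored, 1)

# Usage: compute the score for each hyperparameter setting and keep the
# best, e.g.
#   best = min(settings, key=lambda s: prequential_score(data, make_eval(s)))
```

The design choice worth noting is that, unlike cross-validation, no data is permanently withheld: the final model can be trained on the full dataset once the hyperparameters are chosen.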
Similar Papers
Teaching LLMs How to Learn with Contextual Fine-Tuning
Machine Learning (CS)
Teaches computers to learn new things faster.
Context Tuning for In-Context Optimization
Computation and Language
Teaches computers to learn from examples faster.
You Only Fine-tune Once: Many-Shot In-Context Fine-Tuning for Large Language Model
Computation and Language
Teaches computers to do many jobs well at once.