IKnow: Instruction-Knowledge-Aware Continual Pretraining for Effective Domain Adaptation
By: Tianyi Zhang, Florian Mai, Lucie Flek
Potential Business Impact:
Teaches AI new things without forgetting old skills.
Continual pretraining promises to adapt large language models (LLMs) to new domains using only unlabeled test-time data, but naively applying standard self-supervised objectives to instruction-tuned models is known to degrade their instruction-following capability and semantic representations. Existing fixes assume access to the original base model or rely on knowledge from an external domain-specific database, both of which pose a realistic barrier in settings where the base model weights are withheld for safety reasons or reliable external corpora are unavailable. In this work, we propose Instruction-Knowledge-Aware Continual Adaptation (IKnow), a simple and general framework that formulates novel self-supervised objectives in the instruction-response dialogue format. Rather than depending on external resources, IKnow leverages domain knowledge embedded within the text itself and learns to encode it at a deeper semantic level.
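The abstract does not spell out the concrete objectives, but the core idea, recasting unlabeled domain text as instruction-response dialogues so that continual pretraining stays in the format an instruction-tuned model expects, can be sketched. Below is a minimal, assumption-laden illustration: the prompt templates, the make_dialogue_example helper, and the loss-masking note are hypothetical choices for illustration, not the paper's method.

```python
# Hypothetical sketch: turn unlabeled domain passages into instruction-response
# pairs for continual pretraining in dialogue format. Templates and masking
# choices below are illustrative assumptions, not IKnow's actual objectives.

import random

# Illustrative instruction templates that ask the model to reproduce or
# continue a domain passage inside a chat-style exchange.
TEMPLATES = [
    "Summarize the key domain facts stated in the following passage:\n{passage}",
    "Continue the following domain text:\n{prefix}",
]

def make_dialogue_example(passage: str) -> dict:
    """Wrap one unlabeled domain passage as an instruction-response pair."""
    template = random.choice(TEMPLATES)
    if "{prefix}" in template:
        # Continuation-style objective: the response is the rest of the passage.
        cut = len(passage) // 2
        instruction = template.format(prefix=passage[:cut])
        response = passage[cut:]
    else:
        # Reproduction-style objective: the passage itself is the target.
        instruction = template.format(passage=passage)
        response = passage
    return {"messages": [
        {"role": "user", "content": instruction},
        {"role": "assistant", "content": response},
    ]}

# Usage: build a chat-formatted corpus from raw domain text, then continue
# training with the loss computed only on assistant tokens (SFT-style masking).
corpus = ["Raw unlabeled domain document ...", "Another domain document ..."]
chat_data = [make_dialogue_example(doc) for doc in corpus]
```

The point of the sketch is only that the self-supervised signal comes from the domain text itself, packaged in the dialogue format the instruction-tuned model was trained on, with no base-model weights or external knowledge base required.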
Similar Papers
Knowledge-Instruct: Effective Continual Pre-training from Limited Data using Instructions
Computation and Language
Teaches AI new facts without forgetting old ones.
Tackling Distribution Shift in LLM via KILO: Knowledge-Instructed Learning for Continual Adaptation
Computation and Language
Keeps AI smart when learning new things.
Efficient Domain-adaptive Continual Pretraining for the Process Industry in the German Language
Computation and Language
Teaches computers new languages faster, cheaper.