DeltaEdit: Enhancing Sequential Editing in Large Language Models by Controlling Superimposed Noise
By: Ding Cao, Yuchen Cai, Rongxi Guo, and more
Potential Business Impact:
Keeps AI accurate and up to date even after many successive edits.
Sequential knowledge editing techniques aim to continuously update the knowledge in large language models at low cost, preventing them from generating outdated or incorrect information. However, existing sequential editing methods suffer a significant decline in editing success rate after long-term editing. Through theoretical analysis and experiments, we find that as the number of edits grows, the model's output increasingly deviates from the desired target, driving the success rate down; we refer to this issue as the accumulation of superimposed noise. To address it, we identify the factors contributing to this deviation and propose DeltaEdit, a novel method that optimizes update parameters through a dynamic orthogonal-constraints strategy, effectively reducing interference between edits and thereby mitigating the deviation. Experimental results demonstrate that DeltaEdit significantly outperforms existing methods in edit success rate and retention of generalization capability, ensuring stable and reliable model performance even under extensive sequential editing.
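The abstract does not spell out the mechanism, but the core idea it names, constraining each new parameter update to be orthogonal to the directions used by earlier edits so that edits interfere less, can be sketched. The snippet below is a minimal illustration under that assumption, not DeltaEdit itself: it projects each candidate update matrix onto the orthogonal complement of previously applied update directions via Gram-Schmidt-style projection. All class, function, and variable names are hypothetical.

# Minimal sketch (not the paper's code): keep each new edit's update
# direction orthogonal to the directions of earlier edits, so that later
# edits are less likely to overwrite earlier ones. Names are hypothetical.
import numpy as np

class OrthogonalEditBuffer:
    """Stores an orthonormal basis of past (flattened) update directions."""

    def __init__(self):
        self.basis = []  # list of unit vectors, one per retained edit

    def project_out_past_edits(self, delta: np.ndarray) -> np.ndarray:
        """Remove from `delta` its components along all stored directions."""
        v = delta.ravel().astype(np.float64).copy()
        for b in self.basis:
            v -= (v @ b) * b  # subtract the projection onto each basis vector
        return v.reshape(delta.shape)

    def register_edit(self, delta: np.ndarray, tol: float = 1e-8) -> None:
        """Add the (already projected) update's direction to the basis."""
        v = delta.ravel().astype(np.float64)
        norm = np.linalg.norm(v)
        if norm > tol:  # skip near-zero updates
            self.basis.append(v / norm)

# Usage: given a raw update computed by any editing method,
# constrain it before applying it to the weight matrix W.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))
buf = OrthogonalEditBuffer()
for step in range(3):
    raw_delta = rng.normal(size=W.shape) * 0.01  # stand-in for an edit update
    delta = buf.project_out_past_edits(raw_delta)
    buf.register_edit(delta)
    W += delta  # apply the constrained edit

Because each applied update has no component along earlier update directions, later edits cannot cancel the parameter changes made by earlier ones, which is one plausible reading of how an orthogonality constraint would curb the superimposed-noise accumulation the abstract describes.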
Similar Papers
Constraining Sequential Model Editing with Editing Anchor Compression
Computation and Language
Fixes AI mistakes without breaking its other skills.
Resolving UnderEdit & OverEdit with Iterative & Neighbor-Assisted Model Editing
Computation and Language
Updates AI knowledge without breaking other facts.
Lifelong Knowledge Editing requires Better Regularization
Computation and Language
Fixes AI mistakes without breaking other knowledge.