Score: 1

CUFG: Curriculum Unlearning Guided by the Forgetting Gradient

Published: September 18, 2025 | arXiv ID: 2509.14633v1

By: Jiaxing Miao, Liang Hu, Qi Zhang, and more

Potential Business Impact:

Teaches AI to forget specific things safely.

Business Areas:
E-Learning Education, Software

As privacy and security take center stage in AI, machine unlearning (MU), the ability to erase specific knowledge from models, has garnered increasing attention. However, existing methods overly prioritize efficiency and aggressive forgetting, which introduces notable limitations. In particular, radical interventions such as gradient ascent, influence functions, and random label noise can destabilize model weights, leading to collapse and reduced reliability. To address this, we propose CUFG (Curriculum Unlearning via Forgetting Gradients), a novel framework that enhances the stability of approximate unlearning through innovations in both the forgetting mechanism and the data scheduling strategy. Specifically, CUFG integrates a new gradient corrector guided by forgetting gradients for fine-tuning-based unlearning and a curriculum unlearning paradigm that progressively forgets from easy to hard. These innovations narrow the gap with the gold-standard Retrain method by enabling more stable and progressive unlearning, thereby improving both effectiveness and reliability. Furthermore, we believe the concept of curriculum unlearning has substantial research potential and offers forward-looking insights for the development of the MU field. Extensive experiments across various forgetting scenarios validate the rationale and effectiveness of CUFG. Code is available at https://anonymous.4open.science/r/CUFG-6375.
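The abstract names two mechanisms without implementation details: a gradient corrector guided by forgetting gradients during fine-tuning, and an easy-to-hard curriculum over the forget set. The sketch below is a minimal, hypothetical illustration of how such a loop could be wired up in PyTorch; the difficulty heuristic, the projection-style correction, and all function names (`score_forget_difficulty`, `corrected_step`) are assumptions for illustration, not the paper's actual algorithm.

```python
# Illustrative sketch of curriculum unlearning with a forgetting-gradient corrector.
# The difficulty heuristic and projection-based correction are assumptions,
# not CUFG's actual method.
import torch
import torch.nn.functional as F


def score_forget_difficulty(model, forget_loader, device="cpu"):
    """Assumed heuristic: forget samples the model fits with low loss are
    treated as 'easy' and scheduled first (easy-to-hard curriculum)."""
    model.eval()
    scores = []
    with torch.no_grad():
        for x, y in forget_loader:
            x, y = x.to(device), y.to(device)
            loss = F.cross_entropy(model(x), y, reduction="none")
            scores.extend(loss.cpu().tolist())
    return scores  # lower score -> forgotten earlier in the curriculum


def corrected_step(model, retain_batch, forget_batch, optimizer, device="cpu"):
    """Fine-tune on retained data, but remove the component of the update
    that would re-strengthen the current forget batch (illustrative projection)."""
    model.train()
    xr, yr = (t.to(device) for t in retain_batch)
    xf, yf = (t.to(device) for t in forget_batch)

    # Gradient of the loss on the forget batch ("forgetting gradient").
    forget_loss = F.cross_entropy(model(xf), yf)
    forget_grads = torch.autograd.grad(forget_loss, model.parameters())

    # Standard fine-tuning gradient on retained data.
    optimizer.zero_grad()
    F.cross_entropy(model(xr), yr).backward()

    # Project out the direction that would relearn the forgotten samples.
    for p, g_f in zip(model.parameters(), forget_grads):
        if p.grad is None:
            continue
        coeff = (p.grad * g_f).sum() / (g_f.pow(2).sum() + 1e-12)
        if coeff > 0:  # correct only when the update reinforces forgotten knowledge
            p.grad -= coeff * g_f
    optimizer.step()
```

In this reading, the curriculum controls which forget samples the corrector sees at each stage, so the model is nudged away from easy samples before harder ones, which is one plausible way to approach the stability the abstract attributes to CUFG.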

Country of Origin
🇨🇳 🇦🇺 China, Australia

Page Count
19 pages

Category
Computer Science:
Machine Learning (CS)