Probing then Editing: A Push-Pull Framework for Retain-Free Machine Unlearning in Industrial IoT
By: Jiao Chen, Weihua Li, Jianhua Tang
Potential Business Impact:
Makes AI models forget unwanted knowledge without needing old training data.
In dynamic Industrial Internet of Things (IIoT) environments, models need the ability to selectively forget outdated or erroneous knowledge. However, existing methods typically rely on retain data to constrain model behavior, which increases computational and energy burdens and conflicts with industrial data silos and privacy compliance requirements. To address this, we propose a novel retain-free unlearning framework, referred to as Probing then Editing (PTE). PTE frames unlearning as a probe-edit process: first, it probes the neighborhood of the model's decision boundary for the to-be-forgotten class via gradient ascent and generates corresponding editing instructions from the model's own predictions. Subsequently, a push-pull collaborative optimization is performed: the push branch actively dismantles the decision region of the target class using the editing instructions, while the pull branch applies masked knowledge distillation to anchor the model's knowledge of the retained classes to its original state. Thanks to this mechanism, PTE achieves efficient and balanced knowledge editing using only the to-be-forgotten data and the original model. Experimental results demonstrate that PTE achieves an excellent balance between unlearning effectiveness and model utility across multiple general and industrial benchmarks, including CWRU and SCUT-FD.
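The abstract does not spell out the loss functions, so the following PyTorch sketch is only one plausible reading of the probe-edit and push-pull steps. The function name probe_and_edit_step, the signed-gradient ascent update, and every hyperparameter (ascent_steps, ascent_lr, push_weight, pull_weight, temperature) are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def probe_and_edit_step(student, teacher, x_forget, forget_class,
                        ascent_steps=5, ascent_lr=0.01,
                        push_weight=1.0, pull_weight=1.0, temperature=2.0):
    """One PTE-style update on a batch of to-be-forgotten samples.

    `teacher` is a frozen copy of the original model (eval mode);
    `student` is the model being edited. Only forget-class data is used.
    All names and hyperparameters here are illustrative assumptions.
    """
    targets = torch.full((x_forget.size(0),), forget_class,
                         dtype=torch.long, device=x_forget.device)

    # Probe: gradient ascent on the forget-class loss moves the samples
    # toward the decision boundary of the to-be-forgotten class.
    x_probe = x_forget.clone().detach().requires_grad_(True)
    for _ in range(ascent_steps):
        loss = F.cross_entropy(teacher(x_probe), targets)
        grad, = torch.autograd.grad(loss, x_probe)
        x_probe = (x_probe + ascent_lr * grad.sign()).detach().requires_grad_(True)

    # Editing instructions: the model's own prediction near the boundary,
    # with the forget class masked out.
    with torch.no_grad():
        probe_logits = teacher(x_probe)
        probe_logits[:, forget_class] = float('-inf')
        edit_labels = probe_logits.argmax(dim=1)

    # Push branch: retarget the forget samples onto the edit labels,
    # dismantling the forget class's decision region.
    student_logits = student(x_forget)
    push_loss = F.cross_entropy(student_logits, edit_labels)

    # Pull branch: masked knowledge distillation anchors the retained
    # classes' predictions to the frozen teacher.
    with torch.no_grad():
        teacher_logits = teacher(x_forget)
    keep = torch.ones(student_logits.size(1), dtype=torch.bool,
                      device=student_logits.device)
    keep[forget_class] = False
    pull_loss = F.kl_div(
        F.log_softmax(student_logits[:, keep] / temperature, dim=1),
        F.softmax(teacher_logits[:, keep] / temperature, dim=1),
        reduction='batchmean') * temperature ** 2

    return push_weight * push_loss + pull_weight * pull_loss
```

A training loop would call this on batches of forget-class data, back-propagate the returned loss into the student's parameters, and keep the teacher frozen throughout, so no retain data ever enters the update.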
Similar Papers
Distill, Forget, Repeat: A Framework for Continual Unlearning in Text-to-Image Diffusion Models
Machine Learning (CS)
Removes unwanted data from AI without retraining.
Machine Unlearning of Traffic State Estimation and Prediction
Machine Learning (CS)
Cleans traffic prediction models of old or private data.
Pre-Forgettable Models: Prompt Learning as a Native Mechanism for Unlearning
Machine Learning (CS)
Removes unwanted AI memories instantly and safely.