Score: 2

DeepTracer: Tracing Stolen Model via Deep Coupled Watermarks

Published: November 12, 2025 | arXiv ID: 2511.08985v1

By: Yunfei Yang, Xiaojun Chen, Yuexin Xuan, and more

Potential Business Impact:

Protects AI models from theft and unauthorized copying.

Business Areas:
Cloud Security, Information Technology, Privacy and Security

Model watermarking techniques embed watermark information into a protected model by constructing specific input-output pairs, enabling ownership declaration. However, existing watermarks are easily removed under model stealing attacks, making it difficult for model owners to verify the copyright of stolen models. In this paper, we analyze the root cause of these failures in model stealing scenarios and explore potential solutions. Specifically, we introduce a robust watermarking framework, DeepTracer, which leverages a novel watermark sample construction method and a same-class coupling loss constraint. DeepTracer induces tight coupling between the watermark task and the primary task, so that adversaries inevitably learn the hidden watermark task when stealing the primary task's functionality. Furthermore, we propose an effective watermark sample filtering mechanism that carefully selects the watermark key samples used in model ownership verification, enhancing the reliability of the watermark. Extensive experiments across multiple datasets and models demonstrate that our method surpasses existing approaches in defending against various model stealing attacks, as well as watermark removal attacks, achieving new state-of-the-art effectiveness and robustness.
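The abstract does not give the paper's exact loss, but the same-class coupling idea can be sketched as a joint training objective: cross-entropy on both clean and watermark samples, plus a term that pulls each watermark sample's features toward the mean clean-sample features of its assigned class. Everything below (the `TinyNet` model, the centroid-based coupling term, the `lam` weight) is an illustrative assumption, not DeepTracer's actual formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyNet(nn.Module):
    # Toy stand-in model: a feature extractor plus a classifier head
    # (hypothetical architecture, for illustration only).
    def __init__(self, dim=8, feat=16, classes=4):
        super().__init__()
        self.features = nn.Sequential(nn.Linear(dim, feat), nn.ReLU())
        self.head = nn.Linear(feat, classes)

def coupled_loss(model, clean_x, clean_y, wm_x, wm_y, lam=0.5):
    """Primary-task CE + watermark CE + a same-class coupling term that
    pulls each watermark feature toward the mean clean feature of its
    class, so a stolen copy of the primary task drags the watermark along."""
    clean_feat = model.features(clean_x)
    wm_feat = model.features(wm_x)
    ce = (F.cross_entropy(model.head(clean_feat), clean_y)
          + F.cross_entropy(model.head(wm_feat), wm_y))
    coupling = clean_x.new_zeros(())
    for c in wm_y.unique():
        mask = clean_y == c
        if mask.any():
            centroid = clean_feat[mask].mean(dim=0)
            coupling = coupling + (
                (wm_feat[wm_y == c] - centroid).pow(2).sum(dim=1).mean())
    return ce + lam * coupling

# Usage on random data: the loss is a scalar suitable for backprop.
torch.manual_seed(0)
net = TinyNet()
clean_x, clean_y = torch.randn(32, 8), torch.randint(0, 4, (32,))
wm_x, wm_y = torch.randn(8, 8), torch.randint(0, 4, (8,))
loss = coupled_loss(net, clean_x, clean_y, wm_x, wm_y)
```

During verification, the owner would query the suspect model with the retained watermark key samples and check whether their assigned labels are reproduced.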

Repos / Data Links

Page Count
25 pages

Category
Computer Science:
Cryptography and Security