Human-Imperceptible Physical Adversarial Attack for NIR Face Recognition Models
By: Songyan Xie, Jinghang Wen, Encheng Su, and more
Potential Business Impact:
Tricks face scanners with invisible ink patches.
Near-infrared (NIR) face recognition systems, which operate effectively in low-light conditions and in the presence of makeup, are nonetheless vulnerable to physical adversarial attacks. To demonstrate the potential risks in real-world applications, we design a novel, stealthy, and practical adversarial patch that attacks NIR face recognition systems in a black-box setting. We achieve this by using human-imperceptible infrared-absorbing ink to produce multiple patches whose shapes and positions are digitally optimized for infrared images. To address the optimization mismatch between digital and real-world NIR imaging, we develop a light reflection model for human skin that minimizes pixel-level discrepancies by simulating NIR light reflection. Experimental results show that, compared to state-of-the-art (SOTA) physical attacks on NIR face recognition systems, our method improves the attack success rate in both the digital and physical domains, and in particular remains effective across various face postures. Notably, the proposed approach outperforms SOTA methods, achieving an average attack success rate of 82.46% in the physical domain across different models, versus 64.18% for existing methods. The artifact is available at https://anonymous.4open.science/r/Human-imperceptible-adversarial-patch-0703/.
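To make the skin reflection idea concrete, here is a minimal sketch of how such a model might render a digitally optimized patch onto an NIR face image before scoring it against a target model. This is not the authors' released code: the Lambertian-style blending, the function name `render_patch_on_nir`, and the `absorption` and `ambient` values are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's implementation) of rendering an
# infrared-absorbing ink patch onto an NIR face image via a simple skin
# reflection model.
import numpy as np

def render_patch_on_nir(face_nir, patch_mask, absorption=0.85, ambient=0.05):
    """Simulate the NIR appearance of infrared-absorbing ink on skin.

    face_nir   : HxW float array in [0, 1], clean NIR face image.
    patch_mask : HxW float array in [0, 1], per-pixel patch coverage.
    absorption : assumed fraction of incident NIR light absorbed by the ink.
    ambient    : assumed ambient reflection floor, so inked skin is not pure black.
    """
    # Inked regions reflect only the un-absorbed fraction of incident light,
    # plus a small ambient term; bare skin is left unchanged.
    reflectance = (1.0 - absorption) * face_nir + ambient
    rendered = (1.0 - patch_mask) * face_nir + patch_mask * reflectance
    return np.clip(rendered, 0.0, 1.0)

# Usage: render a candidate patch, then query the black-box face model on it.
face = np.random.rand(112, 112).astype(np.float32)    # stand-in NIR image
mask = np.zeros_like(face)
mask[40:55, 30:60] = 1.0                              # candidate patch region
adv = render_patch_on_nir(face, mask)
```

Under these assumptions, the black-box optimization would adjust the patch shapes and positions (the mask) to maximize misidentification, while the reflection model keeps the digital simulation close to how the ink actually appears under NIR illumination.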
Similar Papers
The Invisible Threat: Evaluating the Vulnerability of Cross-Spectral Face Recognition to Presentation Attacks
CV and Pattern Recognition
Shows fake faces can fool cross-spectral scanners.
Targeted Physical Evasion Attacks in the Near-Infrared Domain
Cryptography and Security
Tricks infrared cameras into seeing the wrong thing.
Robustness Analysis against Adversarial Patch Attacks in Fully Unmanned Stores
Cryptography and Security
Stops sneaky patches from tricking store cameras.