A Single Set of Adversarial Clothes Breaks Multiple Defense Methods in the Physical World
By: Wei Zhang, Zhanhao Hu, Xiao Li, and more
Potential Business Impact:
Specially designed clothes can hide the wearer from camera-based detectors, even defended ones.
In recent years, adversarial attacks against deep learning-based object detectors in the physical world have attracted much attention. To defend against these attacks, researchers have proposed various defense methods against adversarial patches, a typical form of physically realizable attack. However, our experiments showed that simply enlarging the patch size could make these defense methods fail. Motivated by this, we evaluated various defense methods against adversarial clothes, which cover a large portion of the human body. Adversarial clothes provide a good test case for defenses against patch-based attacks because they are not only large but also look more natural on humans than a large patch. Experiments show that all the defense methods performed poorly against adversarial clothes in both the digital world and the physical world. In addition, we crafted a single set of clothes that broke multiple defense methods on Faster R-CNN. The set achieved an Attack Success Rate (ASR) of 96.06% against the undefended detector and ASRs above 64.84% against nine defended models in the physical world, unveiling a common vulnerability of existing adversarial defense methods to adversarial clothes. Code is available at: https://github.com/weiz0823/adv-clothes-break-multiple-defenses.
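As a rough illustration of the evaluation metric (not the authors' code, which is at the repository above), the sketch below shows one way an Attack Success Rate could be computed for person-detection evasion using torchvision's off-the-shelf Faster R-CNN: an attack counts as successful on an image when the detector no longer reports a person above a confidence threshold. The score threshold and the image-loading helper are illustrative assumptions, not values from the paper.

```python
# Minimal ASR-evaluation sketch for person-detection evasion.
# Assumptions (not from the paper): score threshold of 0.5, and a
# hypothetical load_adversarial_images() helper supplying float tensors
# in [0, 1] with shape (3, H, W).
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

PERSON_CLASS_ID = 1    # COCO label id for "person" in torchvision detectors
SCORE_THRESHOLD = 0.5  # assumed detection confidence threshold

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

@torch.no_grad()
def detects_person(image: torch.Tensor) -> bool:
    """Return True if the detector reports a person above the threshold."""
    output = model([image])[0]  # dict with "boxes", "labels", "scores"
    keep = (output["labels"] == PERSON_CLASS_ID) & (output["scores"] > SCORE_THRESHOLD)
    return bool(keep.any())

def attack_success_rate(adversarial_images) -> float:
    """Fraction of adversarial images on which no person is detected."""
    misses = sum(not detects_person(img) for img in adversarial_images)
    return misses / len(adversarial_images)

# Usage (hypothetical): photos of people wearing the adversarial clothes.
# asr = attack_success_rate(load_adversarial_images())
# print(f"ASR: {asr:.2%}")
```

A defended model would be evaluated the same way, with the defense (e.g., patch detection and masking) applied before or inside the detector's forward pass.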
Similar Papers
Physically Realistic Sequence-Level Adversarial Clothing for Robust Human-Detection Evasion
CV and Pattern Recognition
Makes you invisible to cameras in videos.
Adversarial Attacks on Event-Based Pedestrian Detectors: A Physical Approach
CV and Pattern Recognition
Makes cameras miss people wearing special clothes.
Revisiting Adversarial Patch Defenses on Object Detectors: Unified Evaluation, Large-Scale Dataset, and New Insights
CV and Pattern Recognition
Makes computer vision safer from sneaky tricks.