SUAD: Solid-Channel Ultrasound Injection Attack and Defense to Voice Assistants
By: Chao Liu, Zhezheng Zhu, Hao Chen, and more
Potential Business Impact:
Hacks voice assistants through walls.
As versatile AI applications, voice assistants (VAs) have become increasingly popular but remain vulnerable to security threats. Prior work has proposed various inaudible attacks, but these are limited by cost, distance, or line-of-sight (LoS) requirements. Therefore, we propose SUAD Attack, a long-range, cross-barrier, and interference-free inaudible voice attack via solid channels. We begin by thoroughly analyzing the dispersion effect in solid channels, revealing its unique impact on signal propagation. To avoid distortion of voice commands, we design a modular command generation model that parameterizes attack distance, victim audio, and medium dispersion features to adapt to variations in the solid-channel state. Additionally, we propose SUAD Defense, a universal defense that uses ultrasonic perturbation signals to block inaudible voice attacks (IVAs) without impacting normal speech. Since an attack can occur at arbitrary frequencies and times, we propose a training method that randomizes both time and frequency to generate perturbation signals that break ultrasonic commands. Notably, the perturbation signal is modulated to an inaudible frequency without affecting the functionality of voice commands for VAs. Experiments on six smartphones show that SUAD Attack achieves activation success rates above 89.8% and SUAD Defense blocks IVAs with success rates exceeding 98%.
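Both the attack (injecting commands) and the defense (injecting perturbations) rest on the standard inaudible-voice-attack primitive: amplitude-modulating a baseband audio signal onto an ultrasonic carrier so that nonlinearity in the microphone front end demodulates the envelope back into the audible band. The sketch below is a minimal, generic illustration of that primitive only, not the paper's solid-channel method; the carrier frequency, modulation depth, and sample rate are illustrative assumptions rather than SUAD's actual parameters.

```python
import numpy as np

def am_modulate_to_ultrasound(command, fs=192_000, carrier_hz=25_000, depth=1.0):
    """Amplitude-modulate a baseband voice command onto an ultrasonic carrier.

    Microphone front-end nonlinearity demodulates the envelope back into the
    audible band, which is the core idea behind inaudible voice attacks.
    All parameters here are illustrative, not SUAD's settings.
    """
    t = np.arange(len(command)) / fs
    command = command / (np.max(np.abs(command)) + 1e-12)  # normalize to [-1, 1]
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    return (1.0 + depth * command) * carrier               # classic AM: (1 + m(t)) * c(t)

# Example: a 1 kHz test tone standing in for a recorded voice command.
fs = 192_000
t = np.arange(int(0.5 * fs)) / fs
baseband = 0.5 * np.sin(2 * np.pi * 1_000 * t)
ultrasonic = am_modulate_to_ultrasound(baseband, fs=fs)
```

The paper's contribution is what happens on top of this primitive: compensating for solid-channel dispersion on the attack side, and training time- and frequency-randomized ultrasonic perturbations on the defense side.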
Similar Papers
Continual Audio Deepfake Detection via Universal Adversarial Perturbation
Sound
Finds fake voices without needing old examples.
SmartAttack: Air-Gap Attack via Smartwatches
Cryptography and Security
Smartwatches steal secrets from locked computers.
A Portable and Stealthy Inaudible Voice Attack Based on Acoustic Metamaterials
Cryptography and Security
Makes voice assistants hear secret commands.