MoCom: Motion-based Inter-MAV Visual Communication Using Event Vision and Spiking Neural Networks
By: Zhang Nengbo, Hann Woei Ho, Ye Zhou
Potential Business Impact:
Drones talk by dancing in the air.
Reliable communication in Micro Air Vehicle (MAV) swarms is challenging in environments where conventional radio-based methods suffer from spectrum congestion, jamming, and high power consumption. Inspired by the waggle dance of honeybees, which efficiently communicates the location of food sources without sound or contact, we propose a novel visual communication framework for MAV swarms based on motion signaling. In this framework, MAVs convey information, such as heading and distance, through deliberate flight patterns that are passively captured by event cameras and interpreted using a predefined visual codebook of four motion primitives: vertical (up/down), horizontal (left/right), left-to-up-to-right, and left-to-down-to-right, representing the control symbols ``start'', ``end'', ``1'', and ``0''. To decode these signals, we design an event-frame-based segmentation model and a lightweight Spiking Neural Network (SNN) for action recognition. An integrated decoding algorithm then combines segmentation and classification to robustly interpret MAV motion sequences. Experimental results validate the framework's effectiveness, demonstrating accurate decoding with low power consumption and highlighting its potential as an energy-efficient alternative for MAV communication in constrained environments.
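The codebook described in the abstract amounts to a simple framing protocol: a start marker, a bit payload, and an end marker, each rendered as a flight primitive. The sketch below illustrates that mapping in Python; the names (`CODEBOOK`, `encode_message`, `decode_motions`) are illustrative assumptions, not the paper's actual implementation, which performs the classification step with an SNN on event-camera frames.

```python
# Illustrative sketch of the MoCom-style codebook (names are hypothetical).
# Each motion primitive maps to one control symbol, per the abstract.
CODEBOOK = {
    "vertical": "start",          # up/down motion marks the start of a message
    "horizontal": "end",          # left/right motion marks the end
    "left-up-right": "1",         # left-to-up-to-right arc encodes bit 1
    "left-down-right": "0",       # left-to-down-to-right arc encodes bit 0
}
SYMBOL_TO_MOTION = {sym: motion for motion, sym in CODEBOOK.items()}

def encode_message(bits: str) -> list:
    """Wrap a bit string in start/end markers and map it to flight motions."""
    symbols = ["start", *bits, "end"]
    return [SYMBOL_TO_MOTION[s] for s in symbols]

def decode_motions(motions: list) -> str:
    """Recover the bit payload from a classified motion sequence."""
    symbols = [CODEBOOK[m] for m in motions]
    if not symbols or symbols[0] != "start" or symbols[-1] != "end":
        raise ValueError("motion sequence is missing its frame markers")
    return "".join(symbols[1:-1])

motions = encode_message("101")
print(motions)                 # start marker, three arcs, end marker
print(decode_motions(motions)) # "101"
```

In the actual system, the list of motion labels would come from the segmentation model plus SNN classifier rather than from `encode_message`; the framing check above is one plausible way the integrated decoder could reject sequences with missing markers.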
Similar Papers
MAVR-Net: Robust Multi-View Learning for MAV Action Recognition with Cross-View Attention
CV and Pattern Recognition
Helps drones understand each other's movements.
Drone Detection with Event Cameras
CV and Pattern Recognition
Finds tiny drones in any light.
MobiAct: Efficient MAV Action Recognition Using MobileNetV4 with Contrastive Learning and Knowledge Distillation
CV and Pattern Recognition
Lets tiny flying robots understand what they're doing.