Real-Time Inference for Distributed Multimodal Systems under Communication Delay Uncertainty
By: Victor Croisfelt, João Henrique Inacio de Souza, Shashi Raj Pandey, and more
Potential Business Impact:
Lets connected systems make accurate real-time decisions even when their data streams arrive with unpredictable delays.
Connected cyber-physical systems perform inference based on real-time inputs from multiple data streams. Uncertain communication delays across data streams challenge the temporal flow of the inference process. State-of-the-art (SotA) non-blocking inference methods rely on a reference-modality paradigm, requiring one modality input to be fully received before processing, while depending on costly offline profiling. We propose a novel, neuro-inspired non-blocking inference paradigm that primarily employs adaptive temporal windows of integration (TWIs) to dynamically adjust to stochastic delay patterns across heterogeneous streams while relaxing the reference-modality requirement. Our communication-delay-aware framework achieves robust real-time inference with finer-grained control over the accuracy-latency tradeoff. Experiments on the audio-visual event localization (AVEL) task demonstrate superior adaptability to network dynamics compared to SotA approaches.
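To make the idea of an adaptive temporal window of integration (TWI) concrete, the sketch below simulates a non-blocking gate that collects whatever modality features arrive within the current window, runs inference on the partial set (no reference modality is required to be complete), and then grows or shrinks the window based on how many streams made it in time. This is a minimal illustration under assumed details, not the paper's implementation; the names (TWIGate, deadline_ms, min_modalities), the two-stream audio/video setup, and the simple multiplicative adaptation rule are all hypothetical.

```python
import random

# Hypothetical sketch of a non-blocking temporal window of integration (TWI).
# All names and the adaptation rule are illustrative, not taken from the paper.

class TWIGate:
    """Collects per-modality features within an adaptive window, then fires inference."""

    def __init__(self, deadline_ms=120.0, min_modalities=1):
        self.deadline_ms = deadline_ms        # current window length (adapted online)
        self.min_modalities = min_modalities  # inference may proceed without a reference modality
        self.buffer = {}                      # modality name -> latest feature

    def on_packet(self, modality, feature, delay_ms):
        """Register an arriving stream packet; return True if it made the current window."""
        if delay_ms <= self.deadline_ms:
            self.buffer[modality] = feature
            return True
        return False                          # late packet: dropped instead of blocking inference

    def close_window(self, infer_fn):
        """Run inference on whatever arrived in time, then adapt the deadline."""
        if len(self.buffer) >= self.min_modalities:
            result = infer_fn(self.buffer)    # partial multimodal input is allowed (non-blocking)
        else:
            result = None                     # too little evidence; skip this time slot
        # Toy adaptation rule for a two-stream (audio/video) setting:
        # shrink the window when both streams arrived, grow it when one was missed.
        self.deadline_ms *= 0.95 if len(self.buffer) == 2 else 1.10
        self.buffer = {}
        return result


if __name__ == "__main__":
    random.seed(0)
    gate = TWIGate()

    def dummy_infer(features):
        # Stand-in for an AVEL model consuming audio/visual features.
        return f"event from {sorted(features)}"

    for t in range(5):
        # Simulated stochastic network delays per modality (ms).
        gate.on_packet("audio", feature=[0.1], delay_ms=random.gauss(80, 30))
        gate.on_packet("video", feature=[0.2], delay_ms=random.gauss(150, 50))
        print(t, gate.close_window(dummy_infer), f"next deadline {gate.deadline_ms:.1f} ms")
```

In this toy loop, the window lengthens when the slower stream keeps missing the deadline and tightens when both streams arrive early, which is one plausible way to trade accuracy (more complete inputs) against latency (shorter windows).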
Similar Papers
Multimodal Remote Inference
Machine Learning (CS)
Smartly picks sensor data to improve AI guesses.
Task-oriented Age of Information for Remote Inference with Hybrid Language Models
Information Theory
Adaptively picks a fast or a more capable language model to balance speed and accuracy.