Multimodal Remote Inference
By: Keyuan Zhang, Yin Sun, Bo Ji
Potential Business Impact:
Smartly picks sensor data to improve predictions.
We consider a remote inference system with multiple modalities, where a multimodal machine learning (ML) model performs real-time inference using features collected from remote sensors. When sensor observations evolve dynamically over time, fresh features are critical for inference tasks. However, timely delivery of features from all modalities is often infeasible because of limited network resources. To this end, in this paper, we study a two-modality scheduling problem that seeks to minimize the ML model's inference error, expressed as a penalty function of the Age of Information (AoI) vector of the two modalities. We develop an index-based threshold policy and prove its optimality. Specifically, the scheduler switches to the other modality once the current modality's index function exceeds a predetermined threshold. We show that both modalities share the same threshold and that the index functions and the threshold can be computed efficiently. Our optimality results hold for general AoI functions (which could be non-monotonic and non-separable) and heterogeneous transmission times across modalities. To demonstrate the importance of considering a task-oriented AoI function, we conduct numerical experiments based on robot state prediction and compare our policy with round-robin and uniform random policies (both are oblivious to the AoI and the inference error). The results show that our policy reduces inference error by up to 55% compared with these baselines.
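The switching rule described above can be illustrated with a minimal simulation sketch. The paper derives its own index functions and threshold from the AoI penalty; here the index function `index_fn` is a hypothetical placeholder supplied by the caller, and the AoI update (reset to the transmission time on delivery, linear growth otherwise) is a standard assumption, not taken from the paper.

```python
def simulate(index_fn, threshold, tx_times, horizon):
    """Sketch of an index-based threshold policy for two modalities.

    index_fn(aoi, m): HYPOTHETICAL index of modality m given the AoI vector.
    threshold: the common threshold shared by both modalities.
    tx_times[m]: transmission time of one feature from modality m.

    The scheduler keeps transmitting the current modality's features and
    switches to the other modality once the current index exceeds the
    threshold.
    """
    aoi = [0.0, 0.0]   # Age of Information of each modality
    current = 0        # modality currently being served
    schedule = []
    t = 0.0
    while t < horizon:
        T = tx_times[current]
        other = 1 - current
        # On delivery, the served modality's AoI resets to the transmission
        # time (the feature was generated T time units ago); the other
        # modality's AoI grows by T.
        aoi[current] = T
        aoi[other] += T
        t += T
        schedule.append(current)
        # Threshold rule: switch once the current modality's index exceeds
        # the (shared) threshold.
        if index_fn(aoi, current) > threshold:
            current = other
    return schedule, aoi
```

For example, with equal unit transmission times and the (illustrative) index "AoI of the other modality", the policy serves each modality for a few slots before switching, alternating in a threshold-driven pattern rather than strict round-robin.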
Similar Papers
Multimodal Remote Inference
Machine Learning (CS)
Smartly picks sensor data to improve AI guesses.
Task-oriented Age of Information for Remote Inference with Hybrid Language Models
Information Theory
Smart AI chooses fast or smart models for speed.
AoI-based Scheduling of Correlated Sources for Timely Inference
Networking and Internet Architecture
Makes computers guess better with old, mixed-up info.