Low-Altitude Satellite-AAV Collaborative Joint Mobile Edge Computing and Data Collection via Diffusion-based Deep Reinforcement Learning
By: Boxiong Wang, Hui Kang, Jiahui Li, and more
The integration of satellite and autonomous aerial vehicle (AAV) communications has become essential for scenarios requiring both wide coverage and rapid deployment, particularly in remote or disaster-stricken areas where terrestrial infrastructure is unavailable. Furthermore, emerging applications increasingly demand simultaneous mobile edge computing (MEC) and data collection (DC) capabilities within the same aerial network. However, jointly optimizing these operations in heterogeneous satellite-AAV systems presents significant challenges due to limited on-board resources and competing demands under dynamic channel conditions. In this work, we investigate a satellite-AAV-enabled joint MEC-DC system in which these platforms collaborate to serve ground devices (GDs). Specifically, we formulate a joint optimization problem to minimize the average MEC end-to-end delay and AAV energy consumption while maximizing the amount of collected data. Since the formulated problem is a non-convex mixed-integer nonlinear programming (MINLP) problem, we propose a Q-weighted variational policy optimization-based joint AAV movement control, GD association, offloading decision, and bandwidth allocation (QAGOB) approach. Specifically, we reformulate the optimization problem as an action space-transformed Markov decision process to accommodate the variable action dimensions and the hybrid action space. Subsequently, QAGOB leverages the multi-modal generation capabilities of diffusion models to optimize policies, achieving better sample efficiency while controlling the diffusion cost during training. Simulation results show that QAGOB outperforms five benchmarks, including traditional DRL and diffusion-based DRL algorithms. Furthermore, the joint MEC-DC optimization yields significant advantages over optimizing MEC and DC separately.
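As a concrete illustration of the multi-objective formulation described above, the minimal sketch below scalarizes the three objective terms (average MEC end-to-end delay, AAV energy consumption, and collected data) into a single per-step reward, and models the hybrid action with continuous parts (AAV movement, bandwidth allocation) and discrete parts (GD association, offloading decisions). The weights, reference values, and names such as `HybridAction` and `step_reward` are illustrative assumptions and do not reproduce the paper's actual formulation or the QAGOB training loop.

```python
# Illustrative sketch (not the paper's implementation): a hybrid action
# structure and a weighted per-step reward for the joint MEC-DC objective.
# All weights, reference values, and field names are assumptions.
from dataclasses import dataclass
import numpy as np


@dataclass
class HybridAction:
    velocity: np.ndarray         # continuous: AAV movement control
    bandwidth_share: np.ndarray  # continuous: per-GD bandwidth allocation (sums to 1)
    association: np.ndarray      # discrete: GD association per AAV
    offload: np.ndarray          # discrete: 0 = local compute, 1 = offload


def step_reward(avg_delay_s: float,
                energy_j: float,
                data_collected_bits: float,
                w_delay: float = 1.0,
                w_energy: float = 0.5,
                w_data: float = 1.0,
                delay_ref: float = 0.5,
                energy_ref: float = 200.0,
                data_ref: float = 1e6) -> float:
    """Scalarize the three objectives: penalize delay and energy, reward collected data."""
    return (- w_delay * (avg_delay_s / delay_ref)
            - w_energy * (energy_j / energy_ref)
            + w_data * (data_collected_bits / data_ref))


if __name__ == "__main__":
    action = HybridAction(
        velocity=np.array([3.0, -1.5]),
        bandwidth_share=np.array([0.4, 0.35, 0.25]),
        association=np.array([0, 2, 1]),
        offload=np.array([1, 0, 1]),
    )
    print(step_reward(avg_delay_s=0.3, energy_j=150.0, data_collected_bits=8e5))
```

In a diffusion-based DRL pipeline, the continuous components of such an action would typically be generated by a reverse denoising process, while the discrete components are handled by the action-space transformation; the exact mapping used by QAGOB is not detailed in the abstract and is not reproduced here.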
Similar Papers
UAV-assisted Joint Mobile Edge Computing and Data Collection via Matching-enabled Deep Reinforcement Learning
Neural and Evolutionary Computing
Drones help computers process info faster.
Task Delay and Energy Consumption Minimization for Low-altitude MEC via Evolutionary Multi-objective Deep Reinforcement Learning
Machine Learning (CS)
Drones help computers finish jobs faster, using less power.
Collaborative Intelligence for UAV-Satellite Network Slicing: Towards a Joint QoS-Energy-Fairness MADRL Optimization
Networking and Internet Architecture
Helps drones and satellites share internet better.