Deep Q-Learning-Driven Power Control for Enhanced NOMA User Performance

Published: December 2, 2025 | arXiv ID: 2512.02582v1

By: Bach Hung Luu, Sinh Cong Lam, Nam Hoang Nguyen

Potential Business Impact:

Drones relay signals to boost data rates for users far from the base station.

Business Areas:
Drone Management, Hardware, Software

Cell-edge users (CEUs) in cellular networks typically suffer from poor channel conditions due to long distances from serving base stations and physical obstructions, resulting in much lower data rates compared to cell-center users (CCUs). This paper proposes an Unmanned Aerial Vehicle (UAV)-assisted cellular network with intelligent power control to address the performance gap between CEUs and CCUs. Unlike conventional approaches that either deploy UAVs for all users or use no UAV assistance, the proposed model uses a distance-based criterion in which only users beyond a reference distance receive UAV relay assistance. Each UAV operates as an amplify-and-forward relay, enabling assisted users to receive signals from both the base station and the UAV simultaneously, thereby achieving diversity gain. To optimize transmission power allocation across base stations, the authors employ a Deep Q-Network (DQN) learning framework that learns power control policies without requiring accurate channel models. Simulation results show that the proposed approach achieves a peak average rate of 2.28 bps/Hz at the optimal reference distance of 400 m, a 3.6% improvement over networks without UAV assistance and a 0.9% improvement over networks where all users receive UAV support. The results also show that UAV altitude and reference distance are critical factors affecting system performance, with lower altitudes providing better performance.
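
To make the described mechanism concrete, below is a minimal sketch (not the authors' code) of DQN-style power control for a single base station, where a user beyond the reference distance also receives an amplify-and-forward UAV relay copy. The power levels, path-loss exponents, UAV geometry, noise power, state/reward design, and the one-step reward target are all illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch: distance-based UAV assistance + DQN-style power selection.
# All constants and the channel/reward model are assumptions for demonstration.
import random
from collections import deque

import numpy as np
import torch
import torch.nn as nn

POWER_LEVELS = np.linspace(0.1, 1.0, 10)  # candidate BS transmit powers (W), assumed
REF_DISTANCE = 400.0                      # reference distance (m) from the abstract
NOISE = 1e-9                              # noise power (W), assumed

def achievable_rate(power, dist, uav_alt=100.0):
    """Assumed rate model: direct BS link, plus a UAV amplify-and-forward copy
    for users beyond the reference distance (SNRs added as a diversity proxy)."""
    g_bs = dist ** -3.5                        # simple path-loss gain, assumed exponent
    snr = power * g_bs / NOISE
    if dist > REF_DISTANCE:                    # distance-based UAV assistance criterion
        d_uav = np.hypot(dist / 2.0, uav_alt)  # UAV hovers halfway out, assumed geometry
        g_uav = d_uav ** -2.5                  # milder air-to-ground loss, assumed
        snr += power * g_uav / NOISE           # diversity gain from the relayed copy
    return np.log2(1.0 + snr)                  # bps/Hz

class QNet(nn.Module):
    """Small Q-network mapping a 1-D state (normalized distance) to action values."""
    def __init__(self, n_actions):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(),
                                 nn.Linear(64, n_actions))
    def forward(self, x):
        return self.net(x)

# Minimal training loop: state = normalized user distance, action = power level,
# reward = achievable rate. Epsilon-greedy exploration with a replay buffer;
# the one-step reward is used directly as the target for brevity.
qnet = QNet(len(POWER_LEVELS))
opt = torch.optim.Adam(qnet.parameters(), lr=1e-3)
buffer = deque(maxlen=5000)

for step in range(2000):
    dist = random.uniform(50.0, 800.0)                   # random user placement (m)
    state = torch.tensor([[dist / 800.0]], dtype=torch.float32)
    if random.random() < max(0.05, 1.0 - step / 1500):   # decaying exploration rate
        action = random.randrange(len(POWER_LEVELS))
    else:
        action = int(qnet(state).argmax())
    reward = achievable_rate(POWER_LEVELS[action], dist)
    buffer.append((state, action, reward))

    if len(buffer) >= 64:                                 # one gradient step per sample
        batch = random.sample(buffer, 64)
        states = torch.cat([b[0] for b in batch])
        actions = torch.tensor([b[1] for b in batch])
        rewards = torch.tensor([b[2] for b in batch], dtype=torch.float32)
        q_pred = qnet(states).gather(1, actions.unsqueeze(1)).squeeze(1)
        loss = nn.functional.mse_loss(q_pred, rewards)
        opt.zero_grad(); loss.backward(); opt.step()
```

This sketch only illustrates the overall structure (a learned mapping from network state to discrete power levels, rewarded by achievable rate); the paper's actual state space, reward, multi-cell setup, and training details may differ.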

Page Count
16 pages

Category
Computer Science:
Information Theory