Towards Edge General Intelligence: Knowledge Distillation for Mobile Agentic AI

Published: November 25, 2025 | arXiv ID: 2511.19947v1

By: Yuxuan Wu, Linghan Ma, Ruichen Zhang, and more

Potential Business Impact:

Makes smartphone AI run faster and more efficiently.

Business Areas:
Intelligent Systems, Artificial Intelligence, Data and Analytics, Science and Engineering

Edge General Intelligence (EGI) represents a paradigm shift in mobile edge computing, where intelligent agents operate autonomously in dynamic, resource-constrained environments. However, deploying advanced agentic AI models on mobile and edge devices faces significant challenges due to limited computation, energy, and storage resources. To address these constraints, this survey investigates the integration of Knowledge Distillation (KD) into EGI, positioning KD as a key enabler for efficient, communication-aware, and scalable intelligence at the wireless edge. In particular, we emphasize KD techniques specifically designed for wireless communication and mobile networking, such as channel-aware self-distillation, cross-model Channel State Information (CSI) feedback distillation, and robust modulation-classification distillation. Furthermore, we review novel architectures natively suited for KD and edge deployment, such as Mamba, RWKV (Receptance, Weight, Key, Value), and cross-architecture distillation, which enhance generalization capabilities. Subsequently, we examine diverse applications in which KD-driven architectures enable EGI across vision, speech, and multimodal tasks. Finally, we highlight the key challenges and future directions for KD in EGI. This survey aims to provide a comprehensive reference for researchers exploring KD-driven frameworks for mobile agentic AI in the era of EGI.
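The KD objective underlying the techniques surveyed above can be illustrated with a minimal sketch of classic Hinton-style distillation: a small student model is trained against a blend of the teacher's temperature-softened output distribution and the ground-truth label. This is a generic illustration in pure Python, not the paper's specific method; the function names, the temperature, and the mixing weight `alpha` are illustrative defaults.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=2.0, alpha=0.5):
    # Soft-target term: KL(teacher || student) on softened outputs,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_t, p_s))
    soft_term = (temperature ** 2) * kl
    # Hard-target term: ordinary cross-entropy against the true label.
    hard_term = -math.log(softmax(student_logits)[true_label])
    return alpha * soft_term + (1 - alpha) * hard_term
```

When student and teacher logits agree, the KL term vanishes and only the weighted cross-entropy remains; edge-oriented variants such as channel-aware self-distillation keep this same structure but adapt which "teacher" signal is distilled.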

Country of Origin
Singapore, Hong Kong, China, United Kingdom

Page Count
21 pages

Category
Computer Science:
Information Theory