Contextualized Autonomous Drone Navigation using LLMs Deployed in Edge-Cloud Computing

Published: April 1, 2025 | arXiv ID: 2504.00607v1

By: Hongqian Chen, Yun Tang, Antonios Tsourdos, and more

Potential Business Impact:

Lets drones adjust their navigation in real time from spoken or written descriptions of the environment.

Business Areas:
Drone Management Hardware, Software

Autonomous navigation is typically trained offline on diverse scenarios and fine-tuned online from real-world experience. The real world, however, is dynamic and changeable, and many environmental encounters and effects cannot be accounted for in real time because they are difficult to describe in offline training data, or even to capture in online scenarios. A human operator, though, can describe these dynamic environmental encounters in natural language, adding semantic context. This research deploys Large Language Models (LLMs) to perform real-time contextual adjustment of autonomous navigation. A challenge not yet evaluated in the literature is which LLMs are appropriate and where these computationally heavy algorithms should sit within computation-communication edge-cloud architectures. In this paper, we evaluate how different LLMs can both adjust navigation map parameters dynamically (e.g., contour map shaping) and derive navigation task instruction sets. We then evaluate which LLMs are most suitable and where they should sit in future edge-cloud 6G telecommunication architectures.
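To make the idea of "contour map shaping" from natural language concrete, here is a minimal sketch of how an LLM-derived semantic hazard description might be applied to a navigation cost map. The JSON schema, parameter names, and the hand-written LLM response are illustrative assumptions, not details from the paper.

```python
import json

def apply_llm_adjustment(cost_map, llm_json):
    """Raise traversal cost around a hazard region reported by an LLM.

    llm_json is a hypothetical structured response the LLM would emit
    after translating an operator's natural-language description.
    """
    adj = json.loads(llm_json)
    cx, cy = adj["center"]   # hazard center cell (assumed schema)
    r = adj["radius"]        # radius of affected cells
    penalty = adj["penalty"] # extra cost added inside the region
    for y in range(len(cost_map)):
        for x in range(len(cost_map[0])):
            if (x - cx) ** 2 + (y - cy) ** 2 <= r * r:
                cost_map[y][x] += penalty
    return cost_map

# Example: an operator says "strong gusts near the center of the field";
# the LLM (hypothetically) translates this into a structured adjustment.
grid = [[1.0] * 5 for _ in range(5)]
response = '{"center": [2, 2], "radius": 1, "penalty": 5.0}'
grid = apply_llm_adjustment(grid, response)
print(grid[2][2])  # → 6.0 (center cell cost raised by the penalty)
```

A planner running on the drone or edge node could then replan over the reshaped map; where the LLM itself executes (edge vs. cloud) is exactly the trade-off the paper studies.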

Country of Origin
🇬🇧 United Kingdom

Page Count
6 pages

Category
Computer Science:
Robotics