Taking Flight with Dialogue: Enabling Natural Language Control for PX4-based Drone Agent
By: Shoon Kit Lim, Melissa Jia Ying Chong, Jing Huey Khor, and more
Potential Business Impact:
Lets drones understand and follow spoken commands.
Recent advances in agentic and physical artificial intelligence (AI) have largely focused on ground-based platforms such as humanoid and wheeled robots, leaving aerial robots relatively underexplored. Meanwhile, state-of-the-art unmanned aerial vehicle (UAV) multimodal vision-language systems typically rely on closed-source models accessible only to well-resourced organizations. To democratize natural language control of autonomous drones, we present an open-source agentic framework that integrates PX4-based flight control, Robot Operating System 2 (ROS 2) middleware, and locally hosted models using Ollama. We evaluate performance both in simulation and on a custom quadcopter platform, benchmarking four large language model (LLM) families for command generation and three vision-language model (VLM) families for scene understanding.
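The pipeline the abstract describes (a locally hosted LLM translating a natural-language instruction into a flight command that is then forwarded through ROS 2 to PX4) can be sketched as below. This is a minimal illustration, not the paper's actual interface: the Ollama endpoint shown is the tool's documented default, but the model name, prompt, and JSON command schema are assumptions made for the example, and the validation step stands in for whatever safety checks the framework performs before commands reach the flight controller.

```python
import json

# Illustrative sketch: natural-language instruction -> structured drone
# command via a local Ollama model, with validation before the command
# would be handed to the ROS 2 / PX4 layer. Schema and model name are
# hypothetical; only the endpoint is Ollama's documented default.

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint
ALLOWED_ACTIONS = {"takeoff", "land", "goto", "hover"}  # assumed command set

def build_request(instruction: str, model: str = "llama3") -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    prompt = (
        "Translate the instruction into a JSON drone command with keys "
        '"action" and optional "x", "y", "z" in meters. '
        f"Instruction: {instruction}"
    )
    # format="json" asks Ollama to constrain output to valid JSON;
    # stream=False returns a single complete response.
    return {"model": model, "prompt": prompt, "format": "json", "stream": False}

def parse_reply(reply_text: str) -> dict:
    """Validate the model's JSON reply before forwarding it downstream."""
    cmd = json.loads(reply_text)
    if cmd.get("action") not in ALLOWED_ACTIONS:
        raise ValueError(f"unsupported action: {cmd.get('action')!r}")
    return cmd

# Example (no network needed): validate a canned model reply.
req = build_request("fly to the point five meters ahead at ten meters altitude")
cmd = parse_reply('{"action": "goto", "x": 5.0, "y": 0.0, "z": 10.0}')
print(cmd["action"])  # goto
```

Keeping the validation step separate from generation mirrors the safety-conscious design such a framework needs: the LLM proposes, but only commands matching a known schema are ever forwarded to the autopilot.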
Similar Papers
General-Purpose Aerial Intelligent Agents Empowered by Large Language Models
Robotics
Drones can now figure out new jobs on their own.
Agentic UAVs: LLM-Driven Autonomy with Integrated Tool-Calling and Cognitive Reasoning
Artificial Intelligence
Drones learn to make better decisions in tough situations.
Air-Ground Collaboration for Language-Specified Missions in Unknown Environments
Robotics
Robots understand spoken commands to work together.