Quadruped-Legged Robot Movement Plan Generation using Large Language Model
By: Muhtadin, Vincentius Gusti Putu A. B. M., Ahmad Zaini, et al.
Traditional control interfaces for quadruped robots often impose a high barrier to entry, requiring specialized technical knowledge for effective operation. To address this barrier, this paper presents a novel control framework that integrates Large Language Models (LLMs) to enable intuitive, natural language-based navigation. We propose a distributed architecture in which high-level instruction processing is offloaded to an external server, overcoming the onboard computational constraints of the DeepRobotics Jueying Lite 3 platform. The system grounds LLM-generated plans into executable ROS navigation commands using real-time sensor fusion (LiDAR, IMU, and odometry). Experimental validation was conducted in a structured indoor environment across four distinct scenarios, ranging from single-room tasks to complex cross-zone navigation. The results demonstrate the system's robustness, achieving an aggregate success rate of over 90% across all scenarios and validating the feasibility of offloaded LLM-based planning for autonomous quadruped deployment in real-world settings.
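The abstract describes a grounding step that turns LLM-generated plans into executable ROS navigation commands. As a rough illustration of that idea only, the sketch below maps a hypothetical plan (a list of "go to" steps) onto navigation goal records drawn from a known waypoint map. Every name here (`WAYPOINTS`, `ground_plan`, the goal fields) is an assumption for illustration, not the authors' implementation; on the real robot such goals would be published to the ROS navigation stack and executed with LiDAR/IMU/odometry feedback.

```python
# Hypothetical sketch of plan grounding -- not the paper's actual code.
# The waypoint table and goal format are illustrative assumptions; a real
# system would publish these as goals to the ROS navigation stack.

# Known map locations (x, y, heading in radians) -- assumed values.
WAYPOINTS = {
    "kitchen": (3.0, 1.5, 0.0),
    "hallway": (1.0, 4.0, 1.57),
    "charging_dock": (0.0, 0.0, 3.14),
}

def ground_plan(plan_steps):
    """Convert plan steps like 'go to kitchen' into goal dicts.

    Unknown locations are rejected rather than guessed, so the robot
    never executes an instruction it cannot ground in its map.
    """
    goals = []
    for step in plan_steps:
        target = step.lower().replace("go to", "").strip()
        if target not in WAYPOINTS:
            raise ValueError(f"Cannot ground step: {step!r}")
        x, y, yaw = WAYPOINTS[target]
        goals.append({"frame": "map", "x": x, "y": y, "yaw": yaw})
    return goals

if __name__ == "__main__":
    for goal in ground_plan(["go to hallway", "go to kitchen"]):
        print(goal)
```

Rejecting ungrounded steps instead of guessing mirrors the safety concern implicit in deploying LLM planners on physical robots: a plan step that cannot be tied to a mapped location should fail loudly rather than move the platform.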