LangCoop: Collaborative Driving with Language
By: Xiangbo Gao, Yuheng Wu, Rujia Wang, and more
Potential Business Impact:
Cars talk to each other using simple words.
Multi-agent collaboration holds great promise for enhancing the safety, reliability, and mobility of autonomous driving systems by enabling information sharing among multiple connected agents. However, existing multi-agent communication approaches are hindered by the limitations of their communication media, including high bandwidth demands, agent heterogeneity, and information loss. To address these challenges, we introduce LangCoop, a new paradigm for collaborative autonomous driving that leverages natural language as a compact yet expressive medium for inter-agent communication. LangCoop features two key innovations: Mixture Model Modular Chain-of-thought (M³CoT) for structured zero-shot vision-language reasoning and Natural Language Information Packaging (LangPack) for efficiently packaging information into concise, language-based messages. Through extensive experiments conducted in the CARLA simulator, we demonstrate that LangCoop achieves a remarkable 96% reduction in communication bandwidth (< 2 KB per message) compared to image-based communication, while maintaining competitive driving performance in closed-loop evaluation. Our project page and code are at https://xiangbogaobarry.github.io/LangCoop/.
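To make the bandwidth claim concrete, here is a minimal sketch of what a LangPack-style message might look like: an agent's driving state rendered as a short natural-language string instead of raw sensor data. All field names and the message template are illustrative assumptions, not the paper's actual format.

```python
def pack_message(agent_id, state):
    """Render an agent's driving state as a concise natural-language
    message (a LangPack-style sketch; fields are hypothetical)."""
    # Summarize detected objects in a compact "type distance bearing" form.
    objects = "; ".join(
        f"{o['type']} {o['distance_m']} m {o['bearing']}" for o in state["objects"]
    )
    return (
        f"[{agent_id}] speed {state['speed_mps']} m/s, "
        f"heading {state['heading_deg']} deg, intent: {state['intent']}. "
        f"Seen: {objects or 'nothing'}."
    )

# Example ego state for one connected vehicle (made-up values).
state = {
    "speed_mps": 8.3,
    "heading_deg": 92,
    "intent": "merge left at the next gap",
    "objects": [
        {"type": "pedestrian", "distance_m": 14, "bearing": "front-right"},
        {"type": "truck", "distance_m": 30, "bearing": "rear-left"},
    ],
}

msg = pack_message("car_07", state)
print(msg)
print(f"{len(msg.encode('utf-8'))} bytes")  # far below a 2 KB budget
```

Even with several detected objects, a message like this stays in the low hundreds of bytes, which is why a language-based medium can undercut image-based communication by the margins the paper reports.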
Similar Papers
SafeCoop: Unravelling Full Stack Safety in Agentic Collaborative Driving
CV and Pattern Recognition
Makes self-driving cars safer with smart talking.
Automated Vehicles Should be Connected with Natural Language
Multiagent Systems
Cars talk to each other to drive safer.
CoLMDriver: LLM-based Negotiation Benefits Cooperative Autonomous Driving
CV and Pattern Recognition
Cars talk to each other to drive safer.