Score: 1

LRM-1B: Towards Large Routing Model

Published: July 4, 2025 | arXiv ID: 2507.03300v1

By: Han Li, Fei Liu, Zhenkun Wang, and more

Potential Business Impact:

Computes better delivery and vehicle routes more quickly, across many route-planning scenarios.

Business Areas:
A/B Testing, Data and Analytics

Vehicle routing problems (VRPs) are central to combinatorial optimization with significant practical implications. Recent advances in neural combinatorial optimization (NCO) have demonstrated promising results by leveraging neural networks to solve VRPs, yet model scaling within this domain remains underexplored. Inspired by the success of model scaling in large language models (LLMs), this study introduces a Large Routing Model with 1 billion parameters (LRM-1B), designed to address diverse VRP scenarios. We present a comprehensive evaluation of LRM-1B across multiple problem variants, distributions, and sizes, establishing state-of-the-art results. Our findings reveal that LRM-1B not only adapts to different VRP challenges but also delivers superior performance, outperforming existing models. Additionally, we explore the scaling behavior of neural routing models from 1M to 1B parameters. Our analysis confirms a power-law relationship between multiple model factors and performance, offering critical insights into optimal configurations for foundation neural routing solvers.
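The abstract's claim of a power-law relationship between model scale and performance can be illustrated with a short fitting sketch. The parameter counts, optimality gaps, and the extrapolation point below are hypothetical placeholders, not figures from the paper; the snippet only shows how such a scaling law is commonly fit via linear regression in log-log space.

```python
import numpy as np

# Hypothetical (parameter count, optimality gap %) pairs -- illustrative only,
# not results reported in LRM-1B.
params = np.array([1e6, 1e7, 1e8, 1e9])   # model sizes from ~1M to ~1B parameters
gap    = np.array([5.2, 3.1, 1.9, 1.1])   # made-up optimality gaps (%)

# Fit gap ~ a * N^(-b) by ordinary least squares in log-log space:
# log(gap) = log(a) - b * log(N)
slope, log_a = np.polyfit(np.log(params), np.log(gap), 1)
a, b = np.exp(log_a), -slope

print(f"fitted power law: gap ~ {a:.3g} * N^(-{b:.3f})")

# Extrapolate to a hypothetical 10B-parameter model (pure illustration).
print(f"predicted gap at N=1e10: {a * 1e10 ** (-b):.2f}%")
```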

Country of Origin
🇭🇰 Hong Kong

Page Count
21 pages

Category
Computer Science:
Machine Learning (CS)