Decompose, Plan in Parallel, and Merge: A Novel Paradigm for Large Language Models based Planning with Multiple Constraints
By: Zhengdong Lu, Weikai Lu, Yiling Tao, and more
Potential Business Impact:
Helps computers plan trips better and avoid mistakes.
Despite significant advances in Large Language Models (LLMs), planning tasks still present challenges for LLM-based agents. Existing planning methods face two key limitations: they struggle with heavy constraints and suffer from cascading errors. To address these limitations, we propose a novel parallel planning paradigm that Decomposes the task, Plans subtasks in Parallel, and Merges subplans into a final plan (DPPM). Specifically, DPPM decomposes the complex task into subtasks based on its constraints, generates a subplan for each subtask in parallel, and merges the subplans into a global plan. In addition, our approach incorporates a verification and refinement module, enabling error correction and conflict resolution. Experimental results demonstrate that DPPM significantly outperforms existing methods in travel planning tasks.
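To make the decompose/plan-in-parallel/merge flow concrete, here is a minimal sketch of such a pipeline. It is not the authors' implementation: the helper names (call_llm, decompose, plan_subtask, merge, verify_and_refine), the prompts, and the use of a thread pool are all illustrative assumptions, with a placeholder standing in for any real LLM client.

```python
# Illustrative sketch of a decompose -> parallel plan -> merge -> verify loop.
# All function names and prompts are hypothetical; call_llm is a stub for
# whatever LLM client is actually used.
from concurrent.futures import ThreadPoolExecutor

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM API call."""
    raise NotImplementedError

def decompose(task: str, constraints: list[str]) -> list[str]:
    # Ask the model to split the task into constraint-grouped subtasks.
    response = call_llm(
        "Decompose the task into subtasks, one per constraint group.\n"
        f"Task: {task}\nConstraints: {constraints}"
    )
    return [line for line in response.splitlines() if line.strip()]

def plan_subtask(subtask: str) -> str:
    # Generate a subplan for a single subtask.
    return call_llm(f"Produce a plan for this subtask:\n{subtask}")

def merge(subplans: list[str]) -> str:
    # Merge all subplans into one global plan.
    return call_llm(
        "Merge these subplans into one coherent plan:\n" + "\n---\n".join(subplans)
    )

def verify_and_refine(plan: str, constraints: list[str], max_rounds: int = 3) -> str:
    # Check the merged plan against the constraints and repair conflicts.
    for _ in range(max_rounds):
        report = call_llm(
            f"List any constraint violations.\nConstraints: {constraints}\nPlan:\n{plan}"
        )
        if "no violations" in report.lower():
            break
        plan = call_llm(f"Revise the plan to fix these issues:\n{report}\nPlan:\n{plan}")
    return plan

def dppm(task: str, constraints: list[str]) -> str:
    subtasks = decompose(task, constraints)
    with ThreadPoolExecutor() as pool:  # plan subtasks concurrently
        subplans = list(pool.map(plan_subtask, subtasks))
    return verify_and_refine(merge(subplans), constraints)
```

The key design point the abstract emphasizes is that subplans are produced independently and concurrently, so an error in one subplan does not cascade into the others; conflicts are instead resolved once, during the merge and verification steps.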
Similar Papers
Plan-over-Graph: Towards Parallelable LLM Agent Schedule
Artificial Intelligence
Helps computers do many tasks at once.
Inspire or Predict? Exploring New Paradigms in Assisting Classical Planners with Large Language Models
Artificial Intelligence
Helps computers solve big problems by breaking them down.
Large Language Models for Planning: A Comprehensive and Systematic Survey
Artificial Intelligence
Helps computers plan and solve problems better.