ConformalNL2LTL: Translating Natural Language Instructions into Temporal Logic Formulas with Conformal Correctness Guarantees
By: Jun Wang, David Smith Sundarsingh, Jyotirmoy V. Deshmukh, and more
Potential Business Impact:
Teaches robots to follow spoken commands correctly.
Linear Temporal Logic (LTL) has become a prevalent specification language for robotic tasks. To mitigate the significant manual effort and expertise required to define LTL-encoded tasks, several methods have been proposed for translating Natural Language (NL) instructions into LTL formulas; however, these methods lack correctness guarantees. To address this, we introduce a new NL-to-LTL translation method, called ConformalNL2LTL, that can achieve user-defined translation success rates over unseen NL commands. Our method constructs LTL formulas iteratively by addressing a sequence of open-vocabulary Question-Answering (QA) problems with LLMs. To enable uncertainty-aware translation, we leverage conformal prediction (CP), a distribution-free uncertainty quantification tool for black-box models. CP allows our method to assess the uncertainty in LLM-generated answers, so that it proceeds with translation when sufficiently confident and requests help otherwise. We provide both theoretical and empirical results demonstrating that ConformalNL2LTL achieves user-specified translation accuracy while minimizing help rates.
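To illustrate the conformal prediction step described in the abstract, below is a minimal Python sketch of how split conformal prediction can gate an LLM answer: a threshold is calibrated from nonconformity scores on held-out examples, and at translation time the method proceeds only when the resulting prediction set contains a single candidate, requesting help otherwise. The function names, the scoring choice (1 minus the LLM's confidence), and the calibration data here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def calibrate_threshold(cal_scores, alpha=0.1):
    """Split conformal calibration (sketch).

    cal_scores: nonconformity scores of the correct answer on held-out
    calibration examples (assumed here to be 1 - LLM confidence).
    Returns the finite-sample-adjusted (1 - alpha) quantile.
    """
    n = len(cal_scores)
    # Finite-sample correction so coverage holds for a fresh test point.
    q_level = np.ceil((n + 1) * (1 - alpha)) / n
    return np.quantile(cal_scores, min(q_level, 1.0), method="higher")

def prediction_set(candidate_scores, qhat):
    """Keep every candidate answer whose nonconformity score is <= qhat."""
    return [ans for ans, s in candidate_scores.items() if s <= qhat]

def answer_or_ask_for_help(candidate_scores, qhat):
    """Proceed only if the prediction set is a singleton; otherwise defer."""
    pset = prediction_set(candidate_scores, qhat)
    if len(pset) == 1:
        return pset[0]  # confident: use this answer in the next translation step
    return None         # ambiguous: request help from the user

# Hypothetical usage with made-up scores:
qhat = calibrate_threshold(np.array([0.12, 0.30, 0.05, 0.22, 0.41]), alpha=0.1)
candidates = {"eventually(goal_A)": 0.08, "always(not obstacle)": 0.55}
print(answer_or_ask_for_help(candidates, qhat))
```

The key design point is that the threshold comes from a distribution-free quantile over calibration scores rather than from a tuned confidence cutoff, which is what yields the user-specified success rate over unseen commands.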
Similar Papers
LTLCodeGen: Code Generation of Syntactically Correct Temporal Logic for Robot Task Planning
Robotics
Robots follow spoken directions to move around.
Automatic Generation of Safety-compliant Linear Temporal Logic via Large Language Model: A Self-supervised Framework
Logic in Computer Science
Makes sure computer instructions are safe.
Zero-Shot Instruction Following in RL via Structured LTL Representations
Artificial Intelligence
Teaches robots to follow complex, multi-step instructions.