ConformalNL2LTL: Translating Natural Language Instructions into Temporal Logic Formulas with Conformal Correctness Guarantees

Published: April 22, 2025 | arXiv ID: 2504.21022v1

By: Jun Wang, David Smith Sundarsingh, Jyotirmoy V. Deshmukh, and more

Potential Business Impact:

Teaches robots to correctly follow natural-language commands.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Linear Temporal Logic (LTL) has become a prevalent specification language for robotic tasks. To mitigate the significant manual effort and expertise required to define LTL-encoded tasks, several methods have been proposed for translating Natural Language (NL) instructions into LTL formulas; these methods, however, lack correctness guarantees. To address this, we introduce a new NL-to-LTL translation method, called ConformalNL2LTL, that can achieve user-defined translation success rates over unseen NL commands. Our method constructs LTL formulas iteratively by addressing a sequence of open-vocabulary Question-Answering (QA) problems with LLMs. To enable uncertainty-aware translation, we leverage conformal prediction (CP), a distribution-free uncertainty quantification tool for black-box models. CP enables our method to assess the uncertainty in LLM-generated answers, allowing it to proceed with translation when sufficiently confident and to request help otherwise. We provide both theoretical and empirical results demonstrating that ConformalNL2LTL achieves user-specified translation accuracy while minimizing help rates.
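The proceed-or-ask-for-help mechanism the abstract describes can be sketched with split conformal prediction. The sketch below is a minimal illustration, not the paper's exact construction: the nonconformity score (here assumed to be 1 minus the LLM's confidence in an answer), the alpha level, the calibration data, and the candidate LTL fragments are all hypothetical.

```python
import math

def conformal_threshold(cal_scores, alpha=0.25):
    """Split conformal prediction: return the empirical quantile q_hat of
    nonconformity scores from a held-out calibration set, so a fresh score
    falls at or below q_hat with probability at least 1 - alpha."""
    n = len(cal_scores)
    # Rank of the conformal quantile, clamped to the sample size.
    k = min(math.ceil((n + 1) * (1 - alpha)), n)
    return sorted(cal_scores)[k - 1]

def translate_step(candidates, qhat):
    """One QA step: keep every candidate answer whose nonconformity score
    is at most q_hat. Proceed only when the prediction set is a singleton;
    otherwise abstain and request help from the user."""
    pred_set = [ans for ans, score in candidates.items() if score <= qhat]
    if len(pred_set) == 1:
        return pred_set[0]  # sufficiently confident: continue translation
    return None             # ambiguous: request help

# Hypothetical calibration scores (1 - confidence on held-out NL commands).
calibration = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
qhat = conformal_threshold(calibration, alpha=0.25)

confident = translate_step({"G(p)": 0.05, "F(p)": 0.9}, qhat)  # one survivor
ambiguous = translate_step({"G(p)": 0.3, "F(p)": 0.4}, qhat)   # two survive
```

With this per-step guarantee, a union bound over the sequence of QA steps yields the user-specified success rate for the whole formula, which is why the method can trade a lower alpha against a higher help rate.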

Country of Origin
🇺🇸 United States

Page Count
9 pages

Category
Computer Science:
Computation and Language