Score: 1

Textual Gradients are a Flawed Metaphor for Automatic Prompt Optimization

Published: December 15, 2025 | arXiv ID: 2512.13598v1

By: Daniel Melcer, Qi Chen, Wen-Hao Chiang, and more

BigTech Affiliations: Amazon

Potential Business Impact:

Improves large language model performance through automatic prompt tuning, without human effort.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

A well-engineered prompt can increase the performance of large language models; automatic prompt optimization techniques aim to achieve this improvement without requiring human effort to tune the prompts. One leading class of prompt optimization techniques introduces the analogy of textual gradients. We investigate the behavior of these textual gradient methods through a series of experiments and case studies. While such methods often improve performance, our experiments suggest that the gradient analogy does not accurately explain their behavior. Our insights may inform the selection of prompt optimization strategies and the development of new approaches.
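To illustrate the textual-gradient analogy the abstract refers to, here is a minimal sketch of such an optimization loop. The function names and the stubbed "LLM" calls are illustrative assumptions, not the authors' implementation: in a real system, `critique` and `apply_feedback` would each be calls to a language model.

```python
# Sketch of a textual-gradient prompt-optimization loop.
# In the analogy: critique() plays the role of a backward pass that
# produces a natural-language "gradient", and apply_feedback() plays
# the role of the update step that edits the prompt accordingly.
# Both are stand-ins for LLM calls here, so the sketch is runnable.

def critique(prompt: str, failures: list[str]) -> str:
    """Stand-in for an LLM call that returns textual feedback
    (the 'textual gradient') explaining why the prompt failed."""
    return (f"The prompt {prompt!r} failed on {len(failures)} cases; "
            "it should ask for more specific reasoning.")

def apply_feedback(prompt: str, feedback: str) -> str:
    """Stand-in for an LLM call that rewrites the prompt using the
    feedback, analogous to applying a gradient update."""
    return prompt + " Be specific and show your reasoning."

def optimize(prompt: str, failures: list[str], steps: int = 3) -> str:
    """Iterate critique -> update, like a few steps of 'descent'."""
    for _ in range(steps):
        feedback = critique(prompt, failures)      # "backward pass"
        prompt = apply_feedback(prompt, feedback)  # "update step"
    return prompt

print(optimize("Answer the question.", ["q1", "q2"]))
```

The paper's point is that while loops of this shape often do improve prompts, the feedback text does not behave like a true gradient, so the analogy should be treated loosely.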

Country of Origin
🇺🇸 United States

Page Count
19 pages

Category
Computer Science:
Computation and Language