Score: 1

Progress Ratio Embeddings: An Impatience Signal for Robust Length Control in Neural Text Generation

Published: December 7, 2025 | arXiv ID: 2512.06938v1

By: Ivanhoé Botcazou, Tassadit Amghar, Sylvain Lamprier, and others

Potential Business Impact:

Lets AI generate text at exactly the length you specify.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Modern neural language models achieve high accuracy in text generation, yet precise control over generation length remains underdeveloped. In this paper, we first investigate a recent length-control method based on Reverse Positional Embeddings (RPE) and show its limits when control is requested beyond the training distribution. In particular, using a discrete countdown signal tied to the absolute remaining token count leads to instability. To provide robust length control, we introduce Progress Ratio Embeddings (PRE), continuous embeddings tied to a trigonometric impatience signal. PRE integrates seamlessly into standard Transformer architectures, providing stable length fidelity without degrading text accuracy under standard evaluation metrics. We further show that PRE generalizes well to unseen target lengths. Experiments on two widely used news-summarization benchmarks validate these findings.
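To make the contrast concrete, the core idea can be sketched as follows: rather than conditioning on a discrete countdown of remaining tokens (as in RPE), the model is conditioned on the continuous progress ratio r = t/T passed through sinusoids. This is a minimal illustrative sketch; the function name, frequency schedule, and embedding details are assumptions, not the paper's exact formulation.

```python
import math

def progress_ratio_embedding(t, T, d_model):
    """Hypothetical sketch of a Progress Ratio Embedding (PRE).

    Encodes the continuous generation progress r = t / T (rather than
    the absolute number of remaining tokens) through sine/cosine pairs,
    giving a trigonometric "impatience" signal that stays in-range for
    any target length T. The frequency schedule here is an assumption.
    """
    r = t / T  # progress ratio in [0, 1]
    emb = []
    for i in range(d_model // 2):
        freq = (i + 1) * math.pi  # assumed frequency schedule
        emb.append(math.sin(freq * r))
        emb.append(math.cos(freq * r))
    return emb

# Halfway through a 100-token target: r = 0.5, so the lowest-frequency
# sine component sin(pi * 0.5) is exactly 1.0.
vec = progress_ratio_embedding(t=50, T=100, d_model=8)
print(len(vec), round(vec[0], 3))
```

Because the signal depends only on the ratio t/T, the same embedding range is reused for any target length, which is consistent with the paper's claim of generalization to unseen target lengths, whereas a countdown over absolute token counts drifts out of the training distribution for long targets.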

Country of Origin
🇫🇷 France

Page Count
13 pages

Category
Computer Science:
Computation and Language