DIP: Dynamic In-Context Planner For Diffusion Language Models

Published: January 6, 2026 | arXiv ID: 2601.03199v1

By: Yang Li, Han Meng, Chenan Wang, and more

Potential Business Impact:

Speeds up inference for diffusion language models by selecting in-context examples on the fly, cutting compute cost without sacrificing output quality.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Diffusion language models (DLMs) have shown strong potential for general natural language tasks with in-context examples. However, due to the bidirectional attention mechanism, DLMs incur substantial computational cost as context length increases. This work addresses this issue with a key discovery: unlike the sequential generation in autoregressive language models (ARLMs), the diffusion generation paradigm in DLMs allows efficient dynamic adjustment of the context during generation. Building on this insight, we propose Dynamic In-Context Planner (DIP), a context-optimization method that dynamically selects and inserts in-context examples during generation, rather than providing all examples in the prompt upfront. Results show DIP maintains generation quality while achieving up to 12.9× inference speedup over standard inference and 1.17× over KV cache-enhanced inference.
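The abstract does not spell out the mechanism, but the core idea it describes (swapping the in-context examples between denoising steps instead of fixing the full set upfront) might look roughly like the sketch below. Everything here is an illustrative assumption: the function names (embed, select_examples, denoise_step), the similarity-based selection heuristic, and the replan_every schedule are stand-ins, not the authors' actual method.

```python
import hashlib
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy stand-in for a sentence encoder: deterministic pseudo-random unit vector."""
    seed = int.from_bytes(hashlib.md5(text.encode()).digest()[:4], "little")
    v = np.random.default_rng(seed).standard_normal(64)
    return v / np.linalg.norm(v)

def select_examples(draft: str, pool: list[str], k: int = 2) -> list[str]:
    """Hypothetical heuristic: pick the k pool examples most similar to the current draft."""
    d = embed(draft)
    scores = np.array([embed(ex) @ d for ex in pool])
    return [pool[i] for i in np.argsort(scores)[::-1][:k]]

def denoise_step(context: str, draft: str, step: int) -> str:
    """Placeholder for one diffusion denoising pass; a real DLM refines all tokens in parallel."""
    return draft + f" <refined@{step}>"

def generate_with_dip(prompt: str, pool: list[str],
                      steps: int = 8, replan_every: int = 4) -> str:
    draft = prompt
    # Start with a small context instead of the full example pool,
    # keeping the (quadratic) bidirectional-attention cost low.
    examples = select_examples(prompt, pool)
    for t in range(steps):
        if t > 0 and t % replan_every == 0:
            # Bidirectional attention lets a DLM swap its context mid-generation,
            # which an autoregressive model with a KV cache cannot do cheaply.
            examples = select_examples(draft, pool)
        context = "\n".join(examples) + "\n" + prompt
        draft = denoise_step(context, draft, t)
    return draft

pool = ["Q: 2+2? A: 4", "Q: capital of France? A: Paris", "Q: 3*3? A: 9"]
print(generate_with_dip("Q: 5+7? A:", pool))
```

Under these assumptions, the speedup comes from attending over only a few selected examples per denoising pass rather than the entire example set for the whole generation.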

Country of Origin
πŸ‡ΊπŸ‡Έ United States

Page Count
8 pages

Category
Computer Science:
Computation and Language