Masked Diffusion Models are Secretly Learned-Order Autoregressive Models

Published: November 24, 2025 | arXiv ID: 2511.19152v1

By: Prateek Garg, Bhavya Kohli, Sunita Sarawagi

Potential Business Impact:

Teaches generative language models to learn the best order in which to produce text, improving generation quality.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Masked Diffusion Models (MDMs) have emerged as one of the most promising paradigms for generative modeling over discrete domains. It is known that MDMs effectively train to decode tokens in a random order, and that this ordering has significant performance implications in practice. This observation raises a fundamental question: can we design a training framework that optimizes for a favorable decoding order? We answer this in the affirmative, showing that the continuous-time variational objective of MDMs, when equipped with multivariate noise schedules, can identify and optimize for a decoding order during training. We establish a direct correspondence between the decoding order and the multivariate noise schedule and show that this setting breaks the invariance of the MDM objective to the noise schedule. Furthermore, we prove that the MDM objective decomposes precisely into a weighted sum of autoregressive losses over these orders, which establishes MDMs as autoregressive models with learnable orders.
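To make the idea of a multivariate noise schedule concrete, below is a minimal PyTorch sketch of one MDM training step in which each token position carries its own learnable masking rate. This is an illustration of the mechanism described in the abstract, not the authors' code; the class name, the `log_rates` parameter, and the exponential schedule alpha_i(t) = exp(-r_i * t) are all assumptions made for the example.

```python
# Hypothetical sketch: MDM training step with a per-position (multivariate)
# noise schedule. Not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MDMSketch(nn.Module):
    def __init__(self, vocab_size=100, seq_len=8, d_model=32, mask_id=0):
        super().__init__()
        self.mask_id = mask_id
        # Learnable per-position log-rates: the multivariate schedule.
        # Sorting positions by these rates induces a decoding order.
        self.log_rates = nn.Parameter(torch.zeros(seq_len))
        self.embed = nn.Embedding(vocab_size, d_model)
        self.head = nn.Linear(d_model, vocab_size)

    def alpha(self, t):
        # Per-position survival probability alpha_i(t) = exp(-r_i * t):
        # the chance that token i is still unmasked at time t in [0, 1].
        return torch.exp(-self.log_rates.exp() * t)

    def loss(self, x):
        # x: (batch, seq_len) clean token ids.
        b, n = x.shape
        t = torch.rand(b, 1)                  # random diffusion time
        a = self.alpha(t)                     # (batch, seq_len)
        masked = torch.rand(b, n) > a         # mask i w.p. 1 - alpha_i(t)
        x_t = torch.where(masked, torch.full_like(x, self.mask_id), x)
        logits = self.head(self.embed(x_t))   # (batch, seq_len, vocab)
        ce = F.cross_entropy(logits.transpose(1, 2), x, reduction="none")
        # Continuous-time ELBO weight for this exponential schedule:
        # |alpha_i'(t)| / (1 - alpha_i(t)) = r_i * alpha_i(t) / (1 - alpha_i(t)).
        w = self.log_rates.exp() * a / (1.0 - a).clamp_min(1e-6)
        # Only masked positions contribute to the loss.
        return (w * masked.float() * ce).mean()

model = MDMSketch()
x = torch.randint(1, 100, (4, 8))  # toy batch, avoiding the mask id 0
print(model.loss(x).item())
```

In this parameterization, a position's rate governs when it is typically masked in forward time and hence when it is revealed during reverse decoding, which is one simple instance of the correspondence between a multivariate noise schedule and a decoding order that the paper formalizes.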

Country of Origin
🇮🇳 India

Page Count
13 pages

Category
Computer Science:
Machine Learning (CS)