Soft-Masked Diffusion Language Models
By: Michael Hersche, Samuel Moor-Smith, Thomas Hofmann, and more
Potential Business Impact:
Helps computers write better code, faster.
Diffusion models have demonstrated strong potential in language modeling, offering various advantages over traditional autoregressive approaches. Their ability to generate and revise entire responses in parallel enables faster generation and built-in self-correction mechanisms. Most modern diffusion-based language models employ masked diffusion, where decoding iteratively processes masked tokens through a binary decision: either retain the mask or replace it with the predicted token. However, this binary choice discards valuable predictive information whenever the mask is retained. To address this limitation, we introduce soft-masking (SM), a novel method that, for each retained mask, dynamically blends the embedding of the mask token with the embeddings of the top-$k$ predicted tokens from the previous decoding step. This provides the model with a more informative prior, preserving context from earlier computations and allowing partial information about masked tokens to propagate beyond a single step. We propose a training methodology that adapts a pretrained masked diffusion language model to incorporate SM. We demonstrate that continued pretraining of a 169M-parameter model with SM improves perplexity and MAUVE scores. Furthermore, we finetune two state-of-the-art diffusion models, Dream-7B and Dream-Coder-7B, with SM. SM consistently improves performance across multiple coding benchmarks, particularly in high-throughput settings.
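The blending step described in the abstract can be sketched in a few lines. The following is a minimal sketch, not the authors' implementation: the abstract does not specify how the blending weight is chosen or how probabilities are normalized, so the function name `soft_mask_embeddings`, the fixed `mask_weight` parameter, and the renormalization over the top-$k$ probabilities are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def soft_mask_embeddings(logits, mask_embedding, token_embeddings, k=8, mask_weight=0.5):
    """Blend the mask-token embedding with the top-k predicted token embeddings.

    logits:           (batch, seq, vocab) predictions from the previous decoding step
    mask_embedding:   (dim,) input embedding of the mask token
    token_embeddings: (vocab, dim) input embedding table
    Returns (batch, seq, dim) soft-masked input embeddings for retained masks.
    """
    # Keep only the k most likely tokens per position and renormalize over them.
    topk_logits, topk_ids = logits.topk(k, dim=-1)             # (batch, seq, k)
    topk_probs = F.softmax(topk_logits, dim=-1)                # sums to 1 over top-k
    # Look up the embeddings of the top-k candidates and take their expectation.
    topk_embs = token_embeddings[topk_ids]                     # (batch, seq, k, dim)
    blended = (topk_probs.unsqueeze(-1) * topk_embs).sum(-2)   # (batch, seq, dim)
    # Convex combination of the mask embedding and the predicted-token mixture.
    return mask_weight * mask_embedding + (1.0 - mask_weight) * blended

# Toy usage with a random embedding table.
vocab, dim = 100, 16
emb = torch.randn(vocab, dim)
logits = torch.randn(2, 5, vocab)
soft = soft_mask_embeddings(logits, emb[0], emb)
print(soft.shape)  # torch.Size([2, 5, 16])
```

Renormalizing the softmax over only the top-$k$ logits keeps the result a convex combination of embeddings, so the soft-masked input stays within the span of the embedding table rather than collapsing back to the plain mask embedding.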
Similar Papers
Masks Can Be Distracting: On Context Comprehension in Diffusion Language Models
Machine Learning (CS)
Makes AI understand long sentences better.
Masked Diffusion Language Models with Frequency-Informed Training
Computation and Language
Teaches computers language with less text.
Simple Denoising Diffusion Language Models
Machine Learning (CS)
Makes computers write better stories and sentences.