Softly Constrained Denoisers for Diffusion Models
By: Victor M. Yeom Song, Severi Rissanen, Arno Solin, and more
Diffusion models struggle to produce samples that respect constraints, a common requirement in scientific applications. Recent approaches add regularization terms to the loss or apply guidance during sampling to enforce such constraints, but both bias the generative model away from the true data distribution. This is especially problematic when the constraint is misspecified, a frequent issue when formulating constraints on scientific data. In this paper, instead of changing the loss or the sampling loop, we integrate a guidance-inspired adjustment into the denoiser itself, giving it a soft inductive bias towards constraint-compliant samples. We show that these softly constrained denoisers exploit constraint knowledge to improve compliance over standard denoisers, while retaining enough flexibility to deviate from the constraint when it conflicts with observed data.
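To make the idea concrete, here is a minimal sketch of what a guidance-inspired adjustment folded into the denoiser might look like. Everything here is an illustrative assumption, not the paper's actual method: `base_denoiser` is a stand-in for a trained network, the nonnegativity `constraint_penalty` is a toy constraint, and the `weight` and `sigma**2` scaling are one plausible choice for keeping the bias "soft" (stronger at high noise, vanishing as sigma goes to zero).

```python
import numpy as np

def constraint_penalty(x):
    # Hypothetical toy constraint: samples should be nonnegative.
    # Penalty is zero when the constraint holds, quadratic otherwise.
    return np.sum(np.minimum(x, 0.0) ** 2)

def constraint_grad(x):
    # Gradient of the penalty above with respect to x.
    return 2.0 * np.minimum(x, 0.0)

def base_denoiser(x_t, sigma):
    # Placeholder for a trained denoising network: here, a simple
    # shrinkage toward zero that mimics the optimal denoiser for a
    # standard-normal prior under Gaussian noise of scale sigma.
    return x_t / (1.0 + sigma ** 2)

def soft_denoiser(x_t, sigma, weight=0.5):
    # Guidance-inspired adjustment built into the denoiser itself:
    # nudge the denoised estimate down the constraint penalty's
    # gradient, scaled by the noise level so the inductive bias is
    # soft rather than a hard projection.
    x0_hat = base_denoiser(x_t, sigma)
    return x0_hat - weight * sigma ** 2 * constraint_grad(x0_hat)

x = np.array([-1.0, 2.0])
base = base_denoiser(x, sigma=1.0)
soft = soft_denoiser(x, sigma=1.0)
print(constraint_penalty(base), constraint_penalty(soft))
```

Because the adjustment is only a gradient nudge rather than a projection, the soft denoiser reduces the constraint violation without being forced to satisfy the constraint exactly, which is what lets it deviate when the constraint is misspecified relative to the data.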
Similar Papers
Constraint-Guided Prediction Refinement via Deterministic Diffusion Trajectories
Artificial Intelligence
Refines model predictions so they follow real-world rules.
Enhancing Diffusion Model Guidance through Calibration and Regularization
CV and Pattern Recognition
Makes AI create better, more diverse pictures.
Conditional Diffusion as Latent Constraints for Controllable Symbolic Music Generation
Machine Learning (CS)
Lets musicians precisely control music creation.