Score: 2

Noise-Level Diffusion Guidance: Well Begun is Half Done

Published: September 17, 2025 | arXiv ID: 2509.13936v1

By: Harvey Mannering, Zhiwu Huang, Adam Prugel-Bennett

Potential Business Impact:

Makes AI-generated pictures look better and match what you ask for more closely, without extra training or heavy computation.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Diffusion models have achieved state-of-the-art image generation. However, the random Gaussian noise used to start the diffusion process influences the final output, causing variations in image quality and prompt adherence. Existing noise-level optimization approaches generally rely on extra dataset construction, additional networks, or backpropagation-based optimization, limiting their practicality. In this paper, we propose Noise Level Guidance (NLG), a simple, efficient, and general noise-level optimization approach that refines initial noise by increasing the likelihood of its alignment with general guidance, requiring no additional training data, auxiliary networks, or backpropagation. The proposed NLG approach provides a unified framework generalizable to both conditional and unconditional diffusion models, accommodating various forms of diffusion-level guidance. Extensive experiments on five standard benchmarks demonstrate that our approach enhances output generation quality and input condition adherence. By seamlessly integrating with existing guidance methods while maintaining computational efficiency, these results establish NLG as a practical and scalable enhancement to diffusion models. Code can be found at https://github.com/harveymannering/NoiseLevelGuidance.
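The abstract does not spell out the update rule, but it states that the initial noise is refined toward a guidance signal using only forward passes (no backpropagation). A minimal sketch of one plausible reading, assuming a classifier-free-guidance-style shift applied once to the starting noise and a renormalization step to keep the noise Gaussian-scaled: the `eps_cond`/`eps_uncond` stand-ins below are hypothetical toy functions, not the paper's actual network calls.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the diffusion model's conditional and unconditional
# noise predictions at the initial timestep. In practice these would be
# two forward passes of the denoising network (with and without the
# prompt); here they are simple linear maps for illustration only.
def eps_cond(x):
    return 0.1 * x + 0.05

def eps_uncond(x):
    return 0.1 * x

def nlg_refine(noise, scale=1.0):
    """Refine initial noise along a guidance direction (assumed form).

    Nudges the starting noise in the direction that the conditional
    prediction differs from the unconditional one, then rescales the
    result back to the original norm so it still looks like a standard
    Gaussian sample to the sampler. Forward passes only, no gradients.
    """
    direction = eps_cond(noise) - eps_uncond(noise)
    refined = noise + scale * direction
    # Renormalize so the refined noise keeps the original magnitude.
    refined *= np.linalg.norm(noise) / np.linalg.norm(refined)
    return refined

noise = rng.standard_normal((4, 64, 64))
refined = nlg_refine(noise)
print(refined.shape)
```

The refined tensor would then be handed to an ordinary diffusion sampler in place of the raw noise, which is why the approach composes with existing guidance methods at sampling time.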

Country of Origin
🇬🇧 United Kingdom

Repos / Data Links
https://github.com/harveymannering/NoiseLevelGuidance

Page Count
15 pages

Category
Computer Science:
Computer Vision and Pattern Recognition