CFG-EC: Error Correction Classifier-Free Guidance
By: Nakkyu Yang, Yechan Lee, SooJean Han
Potential Business Impact:
Makes AI pictures match your words better.
Classifier-Free Guidance (CFG) has become a mainstream approach for simultaneously improving prompt fidelity and generation quality in conditional generative models. During training, CFG stochastically alternates between conditional and null prompts so that the model learns both conditional and unconditional generation. During sampling, however, CFG evaluates the null and conditional prompts simultaneously and combines their noise predictions, leading to inconsistent noise estimates between the training and sampling processes. To reduce this error, we propose CFG-EC, a versatile correction scheme that can augment any CFG-based method by refining the unconditional noise predictions. CFG-EC actively realigns the unconditional noise error component to be orthogonal to the conditional error component. This correction prevents interference between the two guidance components, thereby constraining the sampling error's upper bound and establishing more reliable guidance trajectories for high-fidelity image generation. Our numerical experiments show that CFG-EC handles the unconditional component more effectively than CFG and CFG++, delivering a marked performance increase in the low-guidance sampling regime and consistently higher prompt alignment across the board.
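To make the two ingredients concrete, the sketch below shows the standard CFG sampling rule (extrapolating from the unconditional toward the conditional noise estimate) and a generic orthogonal projection of the kind CFG-EC uses to realign the unconditional error component. The function names and the projection helper are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def cfg_combine(eps_uncond: np.ndarray, eps_cond: np.ndarray, w: float) -> np.ndarray:
    """Standard CFG sampling rule: push the unconditional noise estimate
    toward the conditional one with guidance scale w."""
    return eps_uncond + w * (eps_cond - eps_uncond)

def project_orthogonal(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Remove from u its component parallel to v, leaving the part of u
    orthogonal to v. This is the generic operation behind realigning the
    unconditional error to be orthogonal to the conditional error
    (illustrative only; see the paper for the exact CFG-EC scheme)."""
    v_flat = v.ravel()
    coeff = float(np.dot(u.ravel(), v_flat)) / (float(np.dot(v_flat, v_flat)) + 1e-12)
    return u - coeff * v
```

With `w = 1` the combined estimate reduces to the conditional prediction, and larger `w` amplifies the guidance direction; the projection guarantees the returned vector has (numerically) zero inner product with `v`.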
Similar Papers
Guidance Free Image Editing via Explicit Conditioning
CV and Pattern Recognition
Makes AI image tools faster and better.
Learn to Guide Your Diffusion Model
Machine Learning (CS)
Makes AI pictures match words better.
Unconditional Priors Matter! Improving Conditional Generation of Fine-Tuned Diffusion Models
CV and Pattern Recognition
Makes AI pictures and videos look better.