Score: 1

ETC: training-free diffusion models acceleration with Error-aware Trend Consistency

Published: October 28, 2025 | arXiv ID: 2510.24129v1

By: Jiajian Xie, Hubery Yin, Chen Li and more

BigTech Affiliations: Tencent

Potential Business Impact:

Makes AI art faster without losing quality.

Business Areas:
EdTech Education, Software

Diffusion models have achieved remarkable generative quality but remain bottlenecked by costly iterative sampling. Recent training-free methods accelerate the diffusion process by reusing model outputs. However, these methods ignore denoising trends and lack error control tuned to model-specific tolerance, leading to trajectory deviations under multi-step reuse and exacerbating inconsistencies in the generated results. To address these issues, we introduce Error-aware Trend Consistency (ETC), a framework that (1) introduces a consistent trend predictor that leverages the smooth continuity of diffusion trajectories, projecting historical denoising patterns into stable future directions and progressively distributing them across multiple approximation steps to achieve acceleration without deviation; and (2) proposes a model-specific error tolerance search mechanism that derives corrective thresholds by identifying the transition point from volatile semantic planning to stable quality refinement. Experiments show that ETC achieves a 2.65x acceleration over FLUX with negligible degradation in consistency (-0.074 SSIM).
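
To make the reuse-with-error-control idea concrete, here is a minimal Python sketch of a training-free sampling loop in the spirit of the abstract: it extrapolates a linear trend from recent model outputs to skip full evaluations, and falls back to a real model call whenever a cheap error proxy exceeds a tolerance. This is not the paper's algorithm; the names (model, scheduler, error_tolerance, max_reuse) and the diffusers-style scheduler interface are illustrative assumptions.

```python
# Minimal sketch (assumed interfaces, not the ETC implementation): reuse model
# outputs by linear trend extrapolation, with an error-tolerance fallback.
import torch

@torch.no_grad()
def trend_reuse_sampling(model, scheduler, latents, num_steps,
                         error_tolerance=0.05, max_reuse=3):
    history = []   # last two noise predictions (true or approximated)
    reused = 0     # consecutive approximation steps since the last real call
    for t in scheduler.timesteps[:num_steps]:
        if len(history) >= 2 and reused < max_reuse:
            # Project the recent denoising trend one step ahead.
            trend = history[-1] - history[-2]
            # Cheap proxy for approximation error: relative trend magnitude.
            rel_err = trend.norm() / (history[-1].norm() + 1e-8)
            if rel_err < error_tolerance:
                noise_pred = history[-1] + trend   # reuse via extrapolation
                reused += 1
            else:
                noise_pred = model(latents, t)     # error too large: real call
                reused = 0
        else:
            noise_pred = model(latents, t)         # warm-up or budget exhausted
            reused = 0
        history = (history + [noise_pred])[-2:]
        latents = scheduler.step(noise_pred, t, latents).prev_sample
    return latents
```

In this sketch the tolerance is a fixed hyperparameter; the paper instead searches for a model-specific threshold tied to the transition from semantic planning to quality refinement.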

Country of Origin
πŸ‡¨πŸ‡³ China

Page Count
17 pages

Category
Computer Science:
CV and Pattern Recognition