MixCache: Mixture-of-Cache for Video Diffusion Transformer Acceleration
By: Yuanxin Wei, Lansong Diao, Bujiao Chen, and more
Potential Business Impact:
Makes video generation faster without losing quality.
Leveraging the Transformer architecture and the diffusion process, video DiT models have emerged as a dominant approach for high-quality video generation. However, their multi-step iterative denoising process incurs high computational cost and inference latency. Caching, a widely adopted optimization in DiT models, exploits the redundancy of the diffusion process to skip computation at different granularities (e.g., step, cfg, block). Nevertheless, existing caching methods are limited to single-granularity strategies and struggle to balance generation quality and inference speed flexibly. In this work, we propose MixCache, a training-free caching-based framework for efficient video DiT inference. It first characterizes the interference and boundaries between different caching strategies, then introduces a context-aware cache triggering strategy to determine when caching should be enabled, along with an adaptive hybrid cache decision strategy that dynamically selects the optimal caching granularity. Extensive experiments on diverse models demonstrate that MixCache significantly accelerates video generation (e.g., 1.94$\times$ speedup on Wan 14B and 1.97$\times$ speedup on HunyuanVideo) while delivering superior generation quality and inference efficiency compared to baseline methods.
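To make the idea of hybrid-granularity caching concrete, below is a minimal sketch of how a context-aware trigger and granularity decision could look in a denoising loop. This is an illustrative assumption, not the paper's actual MixCache implementation: the class name, the relative-change heuristic, and the thresholds `step_tau` and `block_tau` are all hypothetical.

```python
# Hypothetical sketch: decide, from the change between consecutive latents,
# whether a whole denoising step, only some blocks, or nothing can be reused.
# Names and thresholds are illustrative assumptions, not the authors' code.
import torch

class HybridCache:
    def __init__(self, step_tau=0.05, block_tau=0.15):
        self.step_tau = step_tau    # tight threshold: whole step output reusable
        self.block_tau = block_tau  # looser threshold: only block outputs reusable
        self.prev_latent = None
        self.cached_step_out = None
        self.cached_block_outs = {}

    def relative_change(self, latent):
        # Relative L2 change of the latent vs. the previous step.
        if self.prev_latent is None:
            return float("inf")
        num = (latent - self.prev_latent).norm()
        den = self.prev_latent.norm().clamp_min(1e-8)
        return (num / den).item()

    def decide(self, latent):
        """Return 'step', 'block', or 'none': the coarsest safe reuse level."""
        delta = self.relative_change(latent)
        self.prev_latent = latent.detach()
        if delta < self.step_tau and self.cached_step_out is not None:
            return "step"   # skip the entire model forward, reuse cached output
        if delta < self.block_tau:
            return "block"  # recompute a few blocks, reuse the cached rest
        return "none"       # context changed too much: full recomputation
```

In a sampling loop, `decide` would be called once per timestep; a "step" verdict skips the model forward entirely, while a "block" verdict recomputes only selected Transformer blocks and reuses the cached outputs of the rest. The actual MixCache triggering and decision strategies are described in the paper.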
Similar Papers
BWCache: Accelerating Video Diffusion Transformers through Block-Wise Caching
CV and Pattern Recognition
Makes video creation much faster without losing quality.
QuantCache: Adaptive Importance-Guided Quantization with Hierarchical Latent and Layer Caching for Video Generation
CV and Pattern Recognition
Makes video creation faster without losing quality.