Efficient Conditional Generation on Scale-based Visual Autoregressive Models
By: Jiaqi Liu, Tao Huang, Chang Xu
Potential Business Impact:
Makes AI draw pictures better, faster, and cheaper.
Recent advances in autoregressive (AR) models have demonstrated their potential to rival diffusion models in image synthesis. However, for complex spatially-conditioned generation, current AR approaches rely on fine-tuning the pre-trained model, leading to significant training costs. In this paper, we propose the Efficient Control Model (ECM), a plug-and-play framework featuring a lightweight control module that introduces control signals via a distributed architecture. This architecture consists of context-aware attention layers that refine conditional features using real-time generated tokens, and a shared gated feed-forward network (FFN) designed to maximize the utilization of its limited capacity and ensure coherent control feature learning. Furthermore, recognizing the critical role of early-stage generation in determining semantic structure, we introduce an early-centric sampling strategy that prioritizes learning early control sequences. This approach reduces computational cost by lowering the number of training tokens per iteration, while a complementary temperature schedule at inference compensates for the resulting under-training of late-stage tokens. Extensive experiments on scale-based AR models validate that our method achieves high-fidelity and diverse control over image generation, surpassing existing baselines while significantly improving both training and inference efficiency.
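The abstract names three components: context-aware attention layers that refine control features against already-generated tokens, a single gated FFN whose capacity is shared across those layers, and a temperature schedule that adjusts sampling for later, less-trained scales at inference. No code accompanies this listing, so the PyTorch-style sketch below is only an illustration of how such a plug-and-play module might be wired; the class names, dimensions, and the linear form and direction of the temperature schedule are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a lightweight control module in the spirit of ECM.
# Names (SharedGatedFFN, ContextAwareControlBlock, temperature_for_scale) and
# all hyperparameters are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedGatedFFN(nn.Module):
    """One gated feed-forward network reused by every control block."""

    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.gate = nn.Linear(dim, hidden)
        self.up = nn.Linear(dim, hidden)
        self.down = nn.Linear(hidden, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Gated activation: silu(gate(x)) * up(x), then project back to dim.
        return self.down(F.silu(self.gate(x)) * self.up(x))


class ContextAwareControlBlock(nn.Module):
    """Refines control features by attending to the tokens generated so far."""

    def __init__(self, dim: int, num_heads: int, shared_ffn: SharedGatedFFN):
        super().__init__()
        self.norm_q = nn.LayerNorm(dim)
        self.norm_kv = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.ffn = shared_ffn  # the same FFN instance is shared across blocks

    def forward(self, control: torch.Tensor, generated: torch.Tensor) -> torch.Tensor:
        # Cross-attention: control features (queries) look at generated tokens.
        q = self.norm_q(control)
        kv = self.norm_kv(generated)
        refined, _ = self.attn(q, kv, kv)
        control = control + refined
        return control + self.ffn(control)


def temperature_for_scale(step: int, num_scales: int,
                          t_early: float = 1.0, t_late: float = 1.2) -> float:
    """Assumed linear schedule across scales; the paper only states that
    inference-time temperature scheduling offsets under-trained late stages."""
    frac = step / max(num_scales - 1, 1)
    return t_early + frac * (t_late - t_early)
```

In a scale-based sampler, one such block could sit beside each base-model layer it conditions: at every scale the control features are refined against the current token map before being injected, and the logits at scale s would be divided by `temperature_for_scale(s, num_scales)`. Whether later scales should be sampled more or less sharply is not specified in this abstract, so the schedule direction here is a placeholder.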
Similar Papers
EasyControl: Adding Efficient and Flexible Control for Diffusion Transformer
CV and Pattern Recognition
Makes AI art creation faster and more flexible.
Context-Aware Autoregressive Models for Multi-Conditional Image Generation
CV and Pattern Recognition
Makes pictures from many different instructions.
Guidance Free Image Editing via Explicit Conditioning
CV and Pattern Recognition
Makes AI image tools faster and better.