Semantic Context Matters: Improving Conditioning for Autoregressive Models
By: Dongyang Jin, Ryan Xu, Jianhao Zeng, and more
Potential Business Impact:
Makes AI better at editing pictures using written instructions.
Recently, autoregressive (AR) models have shown strong potential in image generation, offering better scalability and easier integration with unified multi-modal systems than diffusion-based methods. However, extending AR models to general image editing remains challenging due to weak and inefficient conditioning, which often leads to poor instruction adherence and visual artifacts. To address this, we propose SCAR, a Semantic-Context-driven method for Autoregressive models. SCAR introduces two key components: Compressed Semantic Prefilling, which encodes high-level semantics into a compact and efficient prefix, and Semantic Alignment Guidance, which aligns the last visual hidden states with the target semantics during autoregressive decoding to enhance instruction fidelity. Unlike decoding-stage injection methods, SCAR builds on the flexibility and generality of vector-quantization-based prefilling while overcoming its semantic limitations and high cost. It generalizes across both next-token and next-set AR paradigms with minimal architectural changes. SCAR achieves superior visual fidelity and semantic alignment on both instruction-editing and controllable-generation benchmarks, outperforming prior AR-based methods while maintaining controllability. All code will be released.
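To make the two components more concrete, here is a minimal PyTorch sketch of what they could look like. This is not the authors' released implementation: all class names, function names, tensor shapes, the query-based cross-attention compressor, and the hidden-state-correction form of the guidance are assumptions inferred from the abstract's description.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CompressedSemanticPrefix(nn.Module):
    """Hypothetical sketch of Compressed Semantic Prefilling:
    learnable queries cross-attend to high-level semantic features
    and compress them into a short prefix that conditions the AR
    decoder, keeping the prefix length fixed and cheap."""

    def __init__(self, sem_dim: int, model_dim: int, prefix_len: int = 16):
        super().__init__()
        self.queries = nn.Parameter(torch.randn(prefix_len, model_dim))
        self.proj = nn.Linear(sem_dim, model_dim)
        self.attn = nn.MultiheadAttention(model_dim, num_heads=8, batch_first=True)

    def forward(self, sem_feats: torch.Tensor) -> torch.Tensor:
        # sem_feats: (B, N, sem_dim) features from a semantic encoder
        kv = self.proj(sem_feats)                               # (B, N, model_dim)
        q = self.queries.unsqueeze(0).expand(kv.size(0), -1, -1)
        prefix, _ = self.attn(q, kv, kv)                        # (B, prefix_len, model_dim)
        return prefix  # prepended to the decoder's input sequence


def semantic_alignment_guidance(last_hidden: torch.Tensor,
                                target_sem: torch.Tensor,
                                to_sem: nn.Linear,
                                from_sem: nn.Linear,
                                scale: float = 0.5) -> torch.Tensor:
    """Hypothetical sketch of Semantic Alignment Guidance: project the
    last visual hidden state into the semantic space, measure its offset
    from the target semantics, and map the correction back into the
    decoder's hidden space before the token logits are computed."""
    h_sem = F.normalize(to_sem(last_hidden), dim=-1)   # (B, sem_dim)
    t_sem = F.normalize(target_sem, dim=-1)            # (B, sem_dim)
    return last_hidden + scale * from_sem(t_sem - h_sem)
```

Under this reading, compressing the semantics into a fixed-length prefix keeps the conditioning cost constant regardless of how many tokens the semantic encoder produces, which is one way to realize the "compact and efficient prefix" the abstract claims, and the per-step hidden-state correction operates only at decoding time, matching the abstract's point that guidance is applied during autoregressive decoding.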
Similar Papers
Understand Before You Generate: Self-Guided Training for Autoregressive Image Generation
CV and Pattern Recognition
Makes AI better at understanding and creating pictures.
Context-Aware Autoregressive Models for Multi-Conditional Image Generation
CV and Pattern Recognition
Makes pictures from many different instructions.
CoAR: Concept Injection into Autoregressive Models for Personalized Text-to-Image Generation
CV and Pattern Recognition
Makes AI draw pictures featuring your own personal concepts.