I2E: From Image Pixels to Actionable Interactive Environments for Text-Guided Image Editing
By: Jinghan Yu, Junhao Xiao, Chenyu Zhu, and more
Potential Business Impact:
Changes pictures based on your words.
Existing text-guided image editing methods primarily rely on an end-to-end pixel-level inpainting paradigm. Despite its success in simple scenarios, this paradigm struggles with compositional editing tasks that require precise local control and complex multi-object spatial reasoning. It is limited by 1) the implicit coupling of planning and execution, 2) the lack of object-level control granularity, and 3) the reliance on unstructured, pixel-centric modeling. To address these limitations, we propose I2E, a novel "Decompose-then-Action" paradigm that recasts image editing as an actionable interaction process within a structured environment. I2E uses a Decomposer to transform unstructured images into discrete, manipulable object layers, and then introduces a physics-aware Vision-Language-Action Agent that parses complex instructions into a series of atomic actions via Chain-of-Thought reasoning. We also construct I2E-Bench, a benchmark designed for multi-instance spatial reasoning and high-precision editing. Experimental results on I2E-Bench and multiple public benchmarks demonstrate that I2E significantly outperforms state-of-the-art methods in handling complex compositional instructions, maintaining physical plausibility, and ensuring multi-turn editing stability.
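To make the "Decompose-then-Action" idea concrete, the sketch below shows a toy version of the pipeline the abstract describes: a decomposition step that yields object layers, an agent step that turns an instruction into atomic actions, and an execution step that applies those actions to the layers. This is a minimal illustration only; the class names, action schema, and helper functions (`ObjectLayer`, `AtomicAction`, `decompose`, `plan_actions`, `apply_actions`) are assumptions, not the paper's actual API, and the learned Decomposer and VLA agent are replaced with hard-coded stand-ins.

```python
# Hypothetical sketch of a "Decompose-then-Action" editing loop.
# All names here are illustrative assumptions; the real Decomposer and
# Vision-Language-Action agent in I2E are learned models, not rule-based code.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class ObjectLayer:
    """One manipulable layer produced by decomposition (hypothetical schema)."""
    name: str
    bbox: Tuple[int, int, int, int]   # (x, y, w, h) in pixels
    z_order: int = 0                  # stacking order used when compositing


@dataclass
class AtomicAction:
    """A single atomic edit emitted by the agent's plan."""
    op: str                           # e.g. "move", "remove"
    target: str                       # name of the object layer to edit
    params: Dict[str, float] = field(default_factory=dict)


def decompose(image_path: str) -> List[ObjectLayer]:
    """Stand-in for the learned Decomposer: return object layers.
    Here we just return a fixed toy scene for illustration."""
    return [
        ObjectLayer("table", bbox=(50, 300, 400, 150), z_order=0),
        ObjectLayer("cup", bbox=(120, 260, 60, 60), z_order=1),
        ObjectLayer("book", bbox=(300, 270, 120, 40), z_order=1),
    ]


def plan_actions(instruction: str, layers: List[ObjectLayer]) -> List[AtomicAction]:
    """Stand-in for the physics-aware agent: map an instruction to atomic
    actions. A real agent would reason over the layers with chain-of-thought;
    this toy version handles one hard-coded instruction."""
    if "cup onto the book" in instruction:
        return [AtomicAction(op="move", target="cup",
                             params={"dx": 190.0, "dy": -20.0})]
    return []


def apply_actions(layers: List[ObjectLayer],
                  actions: List[AtomicAction]) -> List[ObjectLayer]:
    """Execute atomic actions on the object layers (the composition step)."""
    by_name = {layer.name: layer for layer in layers}
    for act in actions:
        layer = by_name[act.target]
        if act.op == "move":
            x, y, w, h = layer.bbox
            layer.bbox = (int(x + act.params["dx"]),
                          int(y + act.params["dy"]), w, h)
        elif act.op == "remove":
            by_name.pop(act.target)
    return list(by_name.values())


if __name__ == "__main__":
    layers = decompose("scene.png")
    actions = plan_actions("Put the cup onto the book.", layers)
    edited = apply_actions(layers, actions)
    for layer in sorted(edited, key=lambda l: l.z_order):
        print(layer)
```

The point of the separation is visible even in this toy: planning (`plan_actions`) produces an inspectable list of object-level edits before any pixels change, which is what allows the precise local control and multi-turn stability the abstract claims, in contrast to end-to-end pixel inpainting where planning and execution are entangled.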
Similar Papers
RegionE: Adaptive Region-Aware Generation for Efficient Image Editing
CV and Pattern Recognition
Makes editing pictures faster by changing only parts.
IE-Critic-R1: Advancing the Explanatory Measurement of Text-Driven Image Editing for Human Perception Alignment
CV and Pattern Recognition
Helps computers judge edited pictures like people do.
I2I-Bench: A Comprehensive Benchmark Suite for Image-to-Image Editing Models
CV and Pattern Recognition
Tests AI image editing better, faster, and more fairly.