Palette Aligned Image Diffusion
By: Elad Aharoni, Noy Porat, Dani Lischinski, et al.
Potential Business Impact:
Lets users specify the color palette of AI-generated images.
We introduce the Palette-Adapter, a novel method for conditioning text-to-image diffusion models on a user-specified color palette. While palettes are a compact and intuitive tool widely used in creative workflows, they introduce significant ambiguity and instability when used for conditioning image generation. Our approach addresses this challenge by interpreting palettes as sparse histograms and introducing two scalar control parameters: histogram entropy and palette-to-histogram distance, which allow flexible control over the degree of palette adherence and color variation. We further introduce a negative histogram mechanism that allows users to suppress specific undesired hues, improving adherence to the intended palette under the standard classifier-free guidance mechanism. To ensure broad generalization across the color space, we train on a carefully curated dataset with balanced coverage of rare and common colors. Our method enables stable, semantically coherent generation across a wide range of palettes and prompts. We evaluate our method qualitatively, quantitatively, and through a user study, and show that it consistently outperforms existing approaches in achieving both strong palette adherence and high image quality.
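The two scalar controls described above can be illustrated with a minimal sketch. The paper does not specify its binning scheme or distance metric, so the following is a hypothetical implementation assuming a uniform RGB bin grid, Shannon entropy for the histogram-entropy control, and a simple total-variation distance as the palette-to-histogram measure; function names (`color_histogram`, `palette_to_histogram_distance`) are illustrative, not from the paper.

```python
import numpy as np

def quantize(colors, bins=8):
    # Map RGB values in [0, 1] to flat bin indices over a bins^3 grid.
    idx = np.clip((np.asarray(colors) * bins).astype(int), 0, bins - 1)
    return idx[:, 0] * bins * bins + idx[:, 1] * bins + idx[:, 2]

def color_histogram(pixels, bins=8):
    # Normalized color histogram of an image's pixels (N x 3 array).
    hist = np.bincount(quantize(pixels, bins), minlength=bins ** 3).astype(float)
    return hist / hist.sum()

def histogram_entropy(hist, eps=1e-12):
    # Shannon entropy (nats); higher values mean more color variation.
    p = hist[hist > eps]
    return float(-(p * np.log(p)).sum())

def palette_to_histogram_distance(palette, hist, bins=8):
    # Treat the palette as a sparse histogram with uniform mass per swatch,
    # then compare it to the image histogram via total-variation distance.
    sparse = np.zeros(bins ** 3)
    for b in quantize(palette, bins):
        sparse[b] += 1.0 / len(palette)
    return 0.5 * float(np.abs(sparse - hist).sum())
```

Under these assumptions, a low entropy target pushes generation toward a few dominant colors, while a small palette-to-histogram distance enforces strict adherence to the given swatches; relaxing either scalar loosens the corresponding constraint.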
Similar Papers
Color Alignment in Diffusion
CV and Pattern Recognition
Makes AI create pictures with exact colors you want.
Exploring Palette based Color Guidance in Diffusion Models
Graphics
Lets you pick exact colors for any picture.
Leveraging Semantic Attribute Binding for Free-Lunch Color Control in Diffusion Models
Graphics
Makes AI pictures have the exact colors you want.