Exploring Palette-Based Color Guidance in Diffusion Models
By: Qianru Qiu, Jiafeng Mao, Xueting Wang
Potential Business Impact:
Lets you pick the exact color scheme for a whole picture, not just named objects.
With the advent of diffusion models, Text-to-Image (T2I) generation has seen substantial advancements. Current T2I models allow users to specify object colors using linguistic color names, and some methods aim to personalize color-object association through prompt learning. However, existing models struggle to provide comprehensive control over the color schemes of an entire image, especially for background elements and less prominent objects not explicitly mentioned in prompts. This paper proposes a novel approach to enhance color scheme control by integrating color palettes as a separate guidance mechanism alongside prompt instructions. We investigate the effectiveness of palette guidance by exploring various palette representation methods within a diffusion-based image colorization framework. To facilitate this exploration, we construct specialized palette-text-image datasets and conduct extensive quantitative and qualitative analyses. Our results demonstrate that incorporating palette guidance significantly improves the model's ability to generate images with desired color schemes, enabling a more controlled and refined colorization process.
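The abstract describes feeding a color palette to the model as a guidance signal separate from the text prompt. The paper explores several palette representations, but does not fix one here, so the following is only a minimal hypothetical sketch: normalized RGB colors are projected (by an untrained random linear map standing in for a learned encoder) into embedding tokens and concatenated with text-token embeddings, the combined sequence being what a diffusion model's cross-attention would condition on. All names (`encode_palette`, `build_conditioning`) and dimensions are illustrative assumptions, not the authors' method.

```python
import numpy as np

def encode_palette(palette_rgb, d_model=8, n_tokens=4, seed=0):
    """Encode a K-color RGB palette as n_tokens embedding tokens.

    Hypothetical sketch: flatten normalized RGB values and project them
    with a random (untrained) linear map; in a real system this
    projection would be learned jointly with the diffusion model.
    """
    rng = np.random.default_rng(seed)
    colors = np.asarray(palette_rgb, dtype=np.float64) / 255.0  # (K, 3)
    flat = colors.reshape(-1)                                   # (3K,)
    W = rng.standard_normal((n_tokens * d_model, flat.size))
    return (W @ flat).reshape(n_tokens, d_model)

def build_conditioning(text_tokens, palette_tokens):
    # Concatenate palette tokens after the prompt tokens so that
    # cross-attention can attend to both kinds of guidance.
    return np.concatenate([text_tokens, palette_tokens], axis=0)

# Example: a three-color palette plus a stand-in 5-token prompt embedding.
palette = [(220, 20, 60), (255, 215, 0), (25, 25, 112)]
pal_tok = encode_palette(palette)
text_tok = np.zeros((5, 8))
cond = build_conditioning(text_tok, pal_tok)
print(cond.shape)  # (9, 8)
```

Because the palette enters as extra conditioning tokens rather than words in the prompt, it can steer the colors of background regions and objects the prompt never mentions, which is the gap the paper targets.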
Similar Papers
Color Me Correctly: Bridging Perceptual Color Spaces and Text Embeddings for Improved Diffusion Generation
CV and Pattern Recognition
Makes AI draw exact colors from descriptions.
Leveraging Semantic Attribute Binding for Free-Lunch Color Control in Diffusion Models
Graphics
Makes AI pictures have the exact colors you want.
Palette Aligned Image Diffusion
CV and Pattern Recognition
Lets you pick colors for AI art.