Chord: Chain of Rendering Decomposition for PBR Material Estimation from Generated Texture Images
By: Zhi Ying, Boxiang Rong, Jingyu Wang, and more
Potential Business Impact:
Creates realistic 3D object textures from simple ideas.
Material creation and reconstruction are crucial for appearance modeling but traditionally require significant time and expertise from artists. While recent methods leverage visual foundation models to synthesize PBR materials from user-provided inputs, they often fall short in quality, flexibility, and user control. We propose a novel two-stage generate-and-estimate framework for PBR material generation. In the generation stage, a fine-tuned diffusion model synthesizes shaded, tileable texture images aligned with user input. In the estimation stage, we introduce a chained decomposition scheme that sequentially predicts SVBRDF channels by feeding previously extracted representations as input into a single-step image-conditional diffusion model. Our method is efficient, produces high-quality results, and enables flexible user control. We evaluate our approach against existing material generation and estimation methods, demonstrating superior performance. Our material estimation method shows strong robustness on both generated textures and in-the-wild photographs. Furthermore, we highlight the flexibility of our framework across diverse applications, including text-to-material, image-to-material, structure-guided generation, and material editing.
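The two-stage pipeline described above can be sketched as a simple loop: stage one produces a shaded texture, and stage two extracts SVBRDF channels one at a time, each prediction conditioned on the texture plus all previously extracted channels. The sketch below is illustrative only; the function names, channel set, and stub implementations are assumptions, not the authors' actual API or models.

```python
# Hypothetical sketch of a chained generate-and-estimate pipeline.
# The real system would call diffusion models; here both stages are
# stubbed with strings so the chaining structure is visible.

def generate_texture(prompt):
    """Stage 1 (stub): a fine-tuned diffusion model would synthesize a
    shaded, tileable texture image aligned with the user prompt."""
    return f"texture({prompt})"

def estimate_channel(texture, channel, extracted):
    """Stage 2 (stub): a single-step image-conditional diffusion model
    would predict one SVBRDF channel, conditioned on the texture and the
    channels extracted so far (the 'chain' in chained decomposition)."""
    return f"{channel}|cond={sorted(extracted)}"

def chord_pipeline(prompt,
                   channels=("albedo", "normal", "roughness", "metallic")):
    texture = generate_texture(prompt)
    svbrdf = {}
    for ch in channels:
        # Each new channel sees every previously extracted representation.
        svbrdf[ch] = estimate_channel(texture, ch, dict(svbrdf))
    return svbrdf

mat = chord_pipeline("rusted metal")
```

The key design point the sketch captures is the sequential dependency: later channels (e.g. metallic) are predicted with earlier ones (albedo, normal, roughness) already available as conditioning, rather than estimating all channels jointly in one pass.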
Similar Papers
MatSpray: Fusing 2D Material World Knowledge on 3D Geometry
CV and Pattern Recognition
Makes 3D game worlds look real when lights change.
An Evaluation of SVBRDF Prediction from Generative Image Models for Appearance Modeling of 3D Scenes
CV and Pattern Recognition
Makes 3D objects look real with computer-made pictures.
VideoMat: Extracting PBR Materials from Video Diffusion Models
Graphics
Makes 3D objects look real from text or pictures.