MatPedia: A Universal Generative Foundation for High-Fidelity Material Synthesis
By: Di Luo, Shuhui Yang, Mingxin Yang, and others
Potential Business Impact:
Creates realistic material textures for computer graphics from simple descriptions.
Physically-based rendering (PBR) materials are fundamental to photorealistic graphics, yet their creation remains labor-intensive and requires specialized expertise. While generative models have advanced material synthesis, existing methods lack a unified representation bridging natural image appearance and PBR properties, leading to fragmented task-specific pipelines and an inability to leverage large-scale RGB image data. We present MatPedia, a foundation model built upon a novel joint RGB-PBR representation that compactly encodes materials into two interdependent latents: one for RGB appearance and one for the four PBR maps encoding complementary physical properties. By formulating them as a 5-frame sequence and employing video diffusion architectures, MatPedia naturally captures their correlations while transferring visual priors from RGB generation models. This joint representation enables a unified framework that handles multiple material tasks (text-to-material generation, image-to-material generation, and intrinsic decomposition) within a single architecture. Trained on MatHybrid-410K, a mixed corpus combining PBR datasets with large-scale RGB images, MatPedia achieves native $1024\times1024$ synthesis that substantially surpasses existing approaches in both quality and diversity.
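The abstract's core idea is packing an RGB appearance image together with four PBR maps into a single 5-frame sequence that a video diffusion model can process. The sketch below illustrates that packing with NumPy; the specific map names (albedo, normal, roughness, metallic), their ordering, and the channel handling are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

# Assumed set of four PBR maps; the paper does not enumerate them here.
PBR_CHANNELS = ["albedo", "normal", "roughness", "metallic"]

def pack_material_sequence(rgb, pbr_maps):
    """Stack RGB + 4 PBR maps as a 5-frame sequence: (frames=5, H, W, 3).

    Scalar maps (e.g. roughness, metallic) are broadcast to three
    channels so every frame shares the same shape, as a video model
    would expect.
    """
    frames = [rgb]
    for name in PBR_CHANNELS:
        m = pbr_maps[name]
        if m.ndim == 2:  # single-channel map -> replicate to RGB channels
            m = np.repeat(m[..., None], 3, axis=-1)
        frames.append(m)
    return np.stack(frames, axis=0).astype(np.float32)

# Usage with dummy data at a reduced resolution (the paper targets 1024x1024)
H = W = 64
rgb = np.random.rand(H, W, 3)
maps = {
    "albedo": np.random.rand(H, W, 3),
    "normal": np.random.rand(H, W, 3),
    "roughness": np.random.rand(H, W),
    "metallic": np.random.rand(H, W),
}
seq = pack_material_sequence(rgb, maps)
print(seq.shape)  # (5, 64, 64, 3)
```

Treating the five maps as frames of one clip lets a pretrained video diffusion backbone model their cross-map correlations the same way it models temporal coherence.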
Similar Papers
MatLat: Material Latent Space for PBR Texture Generation
CV and Pattern Recognition
Makes 3D objects look real with better textures.
MaterialMVP: Illumination-Invariant Material Generation via Multi-view PBR Diffusion
CV and Pattern Recognition
Makes 3D objects look real in any light.
LumiTex: Towards High-Fidelity PBR Texture Generation with Illumination Context
CV and Pattern Recognition
Creates realistic textures for computer graphics.