MaterialMVP: Illumination-Invariant Material Generation via Multi-view PBR Diffusion
By: Zebin He, Mingxin Yang, Shuhui Yang, and more
Potential Business Impact:
Makes 3D objects look real in any light.
Physically-based rendering (PBR) has become a cornerstone of modern computer graphics, enabling realistic material representation and lighting interactions in 3D scenes. In this paper, we present MaterialMVP, a novel end-to-end model for generating PBR textures from 3D meshes and image prompts, addressing key challenges in multi-view material synthesis. Our approach leverages Reference Attention to extract and encode informative latent features from the input reference images, enabling intuitive and controllable texture generation. We also introduce a Consistency-Regularized Training strategy that enforces stability across varying viewpoints and illumination conditions, ensuring illumination-invariant and geometrically consistent results. Additionally, we propose Dual-Channel Material Generation, which separately optimizes albedo and metallic-roughness (MR) textures while maintaining precise spatial alignment with the input images through Multi-Channel Aligned Attention. Learnable material embeddings are further integrated to capture the distinct properties of albedo and MR. Experimental results demonstrate that our model generates PBR textures that behave realistically across diverse lighting scenarios, outperforming existing methods in both consistency and quality for scalable 3D asset creation.
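To make the two central mechanisms in the abstract concrete, here is a minimal PyTorch-style sketch of how Reference Attention and the consistency-regularized objective could look. This is an illustration under assumptions, not the paper's actual implementation: the names ReferenceAttention, consistency_regularized_loss, the tensor shapes, and the weighting lam are all hypothetical, and the backbone is assumed to be a standard latent diffusion denoiser.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ReferenceAttention(nn.Module):
    """Illustrative Reference Attention block (names and shapes are assumptions):
    tokens of the multi-view latents being denoised attend to latent features
    encoded from the reference image, injecting its appearance into generation."""

    def __init__(self, dim: int, heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, view_tokens: torch.Tensor, ref_tokens: torch.Tensor) -> torch.Tensor:
        # view_tokens: (B, N_view, dim) latents for the views under denoising
        # ref_tokens:  (B, N_ref, dim)  latents encoded from the reference image
        out, _ = self.attn(query=view_tokens, key=ref_tokens, value=ref_tokens)
        return view_tokens + out  # residual injection of reference appearance


def consistency_regularized_loss(model, x_noisy, t, ref_a, ref_b, target, lam=0.5):
    """Illustrative consistency-regularized objective (hypothetical form): the
    same noisy latent is denoised twice, conditioned on two reference renderings
    of the object under different illumination (ref_a, ref_b). On top of the
    usual diffusion loss, a consistency term penalizes disagreement between the
    two predictions, encouraging illumination-invariant material estimates."""
    pred_a = model(x_noisy, t, ref_a)  # prediction under lighting condition A
    pred_b = model(x_noisy, t, ref_b)  # prediction under lighting condition B
    diffusion_loss = F.mse_loss(pred_a, target) + F.mse_loss(pred_b, target)
    consistency_loss = F.mse_loss(pred_a, pred_b)  # cross-illumination agreement
    return diffusion_loss + lam * consistency_loss
```

The design intuition is that an illumination-invariant albedo or MR map should not depend on which lighting the reference image was captured under, so any gap between pred_a and pred_b is treated as error regardless of the ground truth.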
Similar Papers
MVPainter: Accurate and Detailed 3D Texture Generation via Multi-View Diffusion with Geometric Control
CV and Pattern Recognition
Makes 3D objects look real with detailed colors.
PBR3DGen: A VLM-guided Mesh Generation with High-quality PBR Texture
CV and Pattern Recognition
Makes computer pictures look real with shiny, bumpy stuff.
VideoMat: Extracting PBR Materials from Video Diffusion Models
Graphics
Makes 3D objects look real from text or pictures.