Intrinsic Image Fusion for Multi-View 3D Material Reconstruction

Published: December 15, 2025 | arXiv ID: 2512.13157v1

By: Peter Kocsis, Lukas Höllein, Matthias Nießner

Potential Business Impact:

Reconstructs physically based materials from ordinary photos, enabling sharp, high-quality relighting of 3D scenes.

Business Areas:
Image Recognition Data and Analytics, Software

We introduce Intrinsic Image Fusion, a method that reconstructs high-quality physically based materials from multi-view images. Material reconstruction is highly underconstrained and typically relies on analysis-by-synthesis, which requires expensive and noisy path tracing. To better constrain the optimization, we incorporate single-view priors into the reconstruction process. We leverage a diffusion-based material estimator that produces multiple, but often inconsistent, candidate decompositions per view. To reduce the inconsistency, we fit an explicit low-dimensional parametric function to the predictions. We then propose a robust optimization framework that uses soft per-view prediction selection together with a confidence-based soft multi-view inlier set to fuse the most consistent predictions of the most confident views into a consistent parametric material space. Finally, we use inverse path tracing to optimize the low-dimensional parameters. Our results outperform state-of-the-art methods in material disentanglement on both synthetic and real scenes, producing sharp and clean reconstructions suitable for high-quality relighting.
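The robust fusion step described above can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes each view contributes K candidate material predictions (e.g., per-texel albedo from a diffusion estimator) and alternates two steps: a softmin-weighted selection among each view's candidates, and a confidence-based soft inlier weighting across views, so that outlier views are down-weighted in the fused estimate. Function names, the temperature parameter, and the iteration count are illustrative assumptions.

```python
import numpy as np

def fuse_material_predictions(candidates, temperature=0.1, iters=20):
    """Fuse inconsistent per-view material candidates (illustrative sketch).

    candidates: (V, K, D) array -- V views, K candidate decompositions
    per view, D material parameters (e.g., albedo RGB).
    Returns a fused (D,) parameter vector.
    """
    V, K, D = candidates.shape
    fused = candidates.reshape(-1, D).mean(axis=0)  # init: global mean
    for _ in range(iters):
        # Soft per-view prediction selection: softmin over candidate
        # distances to the current fused estimate.
        dists = np.linalg.norm(candidates - fused, axis=-1)       # (V, K)
        sel = np.exp(-dists / temperature)
        sel /= sel.sum(axis=1, keepdims=True)
        per_view = (sel[..., None] * candidates).sum(axis=1)      # (V, D)
        # Confidence-based soft multi-view inlier weights: views whose
        # selected prediction agrees with the consensus weigh more.
        view_err = np.linalg.norm(per_view - fused, axis=-1)      # (V,)
        conf = np.exp(-view_err / temperature)
        conf /= conf.sum()
        fused = (conf[:, None] * per_view).sum(axis=0)
    return fused
```

In a toy setting with one corrupted view, the soft inlier weights suppress the outlier and the fused parameters land near the consensus of the consistent views; in the actual method this fused parametric material space is then refined by inverse path tracing.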

Page Count
14 pages

Category
Computer Science:
Computer Vision and Pattern Recognition