Neural Inverse Rendering from Propagating Light
By: Anagh Malik, Benjamin Attal, Andrew Xie, and more
Potential Business Impact:
Makes 3D scans of real scenes more accurate by modeling how light travels and bounces.
We present the first system for physically based, neural inverse rendering from multi-viewpoint videos of propagating light. Our approach relies on a time-resolved extension of neural radiance caching -- a technique that accelerates inverse rendering by storing infinite-bounce radiance arriving at any point from any direction. The resulting model accurately accounts for direct and indirect light transport effects and, when applied to captured measurements from a flash lidar system, enables state-of-the-art 3D reconstruction in the presence of strong indirect light. Further, we demonstrate view synthesis of propagating light, automatic decomposition of captured measurements into direct and indirect components, as well as novel capabilities such as multi-view time-resolved relighting of captured scenes.
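The key idea in the abstract is a neural radiance cache extended with a time dimension: the cached quantity is the infinite-bounce radiance arriving at a point, from a direction, at a given time delay, so the renderer can query the cache instead of tracing further bounces. Below is a minimal sketch of such a cache, assuming a plain MLP over (position, direction, time); the class name, network sizes, and usage are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a time-resolved neural radiance
# cache: a small MLP queried with a 3D point, an incoming direction, and a
# time-of-flight value, returning the cached multi-bounce radiance arriving
# at that point from that direction at that delay. All names and sizes here
# are illustrative assumptions.
import torch
import torch.nn as nn


class TimeResolvedRadianceCache(nn.Module):
    """Maps (position, direction, time) -> cached incoming radiance."""

    def __init__(self, hidden_dim: int = 128, num_layers: int = 4):
        super().__init__()
        in_dim = 3 + 3 + 1  # xyz position, unit direction, time-of-flight
        layers = []
        dim = in_dim
        for _ in range(num_layers):
            layers += [nn.Linear(dim, hidden_dim), nn.ReLU()]
            dim = hidden_dim
        layers.append(nn.Linear(dim, 3))  # RGB radiance
        self.mlp = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor, d: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # x: (N, 3) surface points, d: (N, 3) unit directions, t: (N, 1) delays
        return torch.relu(self.mlp(torch.cat([x, d, t], dim=-1)))


# During inverse rendering, the renderer can query the cache at the first
# hit point to approximate all indirect light arriving there at a given
# time delay, rather than tracing many additional bounces.
cache = TimeResolvedRadianceCache()
x = torch.rand(1024, 3)                                   # hypothetical hit points
d = nn.functional.normalize(torch.randn(1024, 3), dim=-1)  # unit directions
t = torch.rand(1024, 1)                                    # normalized delays
radiance = cache(x, d, t)                                  # (1024, 3) cached radiance
```

In the paper's setting, a cache like this would be supervised jointly with the scene reconstruction so that its predictions stay consistent with the time-resolved flash-lidar measurements; the training loop is omitted here.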
Similar Papers
Neural Visibility Cache for Real-Time Light Sampling
Graphics
Makes computer pictures look real with many lights.
Inverse Image-Based Rendering for Light Field Generation from Single Images
CV and Pattern Recognition
Makes one picture look like many from different angles.
Physically-based Lighting Generation for Robotic Manipulation
Robotics
Makes robots learn new tasks in different lights.