Physically Controllable Relighting of Photographs
By: Chris Careaga, Yağız Aksoy
Potential Business Impact:
Changes how lights look in real pictures.
We present a self-supervised approach to in-the-wild image relighting that enables fully controllable, physically based illumination editing. We achieve this by combining the physical accuracy of traditional rendering with the photorealistic appearance made possible by neural rendering. Our pipeline works by inferring a colored mesh representation of a given scene using monocular estimates of geometry and intrinsic components. This representation allows users to define their desired illumination configuration in 3D. The scene under the new lighting can then be rendered using a path-tracing engine. We send this approximate rendering of the scene through a feed-forward neural renderer to predict the final photorealistic relighting result. We develop a differentiable rendering process to reconstruct in-the-wild scene illumination, enabling self-supervised training of our neural renderer on raw image collections. Our method represents a significant step in bringing the explicit physical control over lights available in typical 3D computer graphics tools, such as Blender, to in-the-wild relighting.
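The pipeline described above can be sketched as a sequence of stages. The following is a minimal illustrative mock-up, not the authors' implementation: all function names are hypothetical, the "geometry" and "intrinsic" estimates are simple placeholders standing in for learned monocular estimators, the path-traced preview is approximated by Lambertian shading from depth-derived normals, and the neural renderer is a stand-in identity map.

```python
import numpy as np

def estimate_geometry(image):
    # Placeholder monocular depth: a smooth left-to-right ramp.
    # (The paper uses a learned monocular geometry estimator.)
    h, w, _ = image.shape
    return np.tile(np.linspace(1.0, 3.0, w), (h, 1))

def estimate_albedo(image):
    # Placeholder intrinsic decomposition: divide out mean brightness.
    # (The paper uses learned intrinsic-component estimates.)
    shading = image.mean(axis=2, keepdims=True) + 1e-6
    return image / shading

def render_with_new_light(albedo, depth, light_dir):
    # Stand-in for the path-tracing engine: Lambertian shading
    # using surface normals derived from the depth map.
    dz_dy, dz_dx = np.gradient(depth)
    normals = np.dstack([-dz_dx, -dz_dy, np.ones_like(depth)])
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)
    shading = np.clip(normals @ light_dir, 0.0, None)[..., None]
    return albedo * shading

def neural_refine(approx_render):
    # Stand-in for the feed-forward neural renderer that maps the
    # approximate physical rendering to a photorealistic result.
    return np.clip(approx_render, 0.0, 1.0)

# Toy example: a flat gray image relit by a user-chosen light direction.
image = np.full((4, 4, 3), 0.5)
light = np.array([0.0, 0.0, 1.0])          # head-on directional light
relit = neural_refine(
    render_with_new_light(estimate_albedo(image),
                          estimate_geometry(image), light))
print(relit.shape)  # (4, 4, 3)
```

In the actual method, the user edits lights in 3D on the inferred colored mesh and the path tracer handles global illumination; this sketch only mirrors the data flow between the stages.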
Similar Papers
ReLumix: Extending Image Relighting to Video via Video Diffusion Models
Graphics
Changes video lighting easily after filming.
Light-X: Generative 4D Video Rendering with Camera and Illumination Control
CV and Pattern Recognition
Creates new videos with changing camera and light.
Comprehensive Relighting: Generalizable and Consistent Monocular Human Relighting and Harmonization
CV and Pattern Recognition
Changes how people look in pictures with new light.