A Neural Field-Based Approach for View Computation & Data Exploration in 3D Urban Environments
By: Stefan Cobeli, Kazi Shahrukh Omar, Rodrigo Valença and more
Potential Business Impact:
Finds the best city views for planning and analysis.
Despite the growing availability of 3D urban datasets, extracting insights remains challenging due to computational bottlenecks and the complexity of interacting with the data. In particular, the intricate geometry of 3D urban environments results in high degrees of occlusion and requires extensive manual viewpoint adjustment, making large-scale exploration inefficient. To address this, we propose a view-based approach for 3D data exploration, in which a vector field encodes views of the environment. To support this approach, we introduce a neural field-based method that constructs an efficient implicit representation of 3D environments. This representation enables both faster direct queries, which compute view assessment indices, and inverse queries, which help avoid occlusion and facilitate the search for views that match desired data patterns. Our approach supports key urban analysis tasks such as assessing visibility, evaluating solar exposure, and gauging the visual impact of new developments. We validate our method through quantitative experiments, case studies informed by real-world urban challenges, and feedback from domain experts. Results show its effectiveness in finding desirable viewpoints, analyzing building facade visibility, and evaluating views from outdoor spaces. Code and data are publicly available at https://urbantk.org/neural-3d.
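To make the direct/inverse query idea from the abstract concrete, here is a minimal sketch, not the authors' implementation: a small MLP stands in for the neural field, mapping a viewpoint (position and view direction) to a few view assessment indices, and the inverse query is posed as gradient-based optimization of the viewpoint toward a target index pattern. All names (ViewField, direct_query, inverse_query) and the choice of indices (e.g., visibility, solar exposure) are hypothetical; the paper's actual representation and query machinery may differ.

```python
# Minimal sketch of view-based direct and inverse queries (hypothetical names,
# not the authors' code). The neural field maps a viewpoint to view indices.
import torch
import torch.nn as nn

class ViewField(nn.Module):
    """MLP mapping a viewpoint (3D position + view direction) to k view indices."""
    def __init__(self, num_indices: int = 3, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(6, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_indices), nn.Sigmoid(),  # indices in [0, 1]
        )

    def forward(self, position: torch.Tensor, direction: torch.Tensor) -> torch.Tensor:
        direction = direction / direction.norm(dim=-1, keepdim=True)
        return self.net(torch.cat([position, direction], dim=-1))

def direct_query(field: ViewField, position, direction):
    """Direct query: evaluate the view assessment indices at a given viewpoint."""
    with torch.no_grad():
        return field(position, direction)

def inverse_query(field: ViewField, target, init_position, init_direction,
                  steps: int = 200, lr: float = 1e-2):
    """Inverse query: optimize the viewpoint so its indices match a target pattern."""
    position = init_position.clone().requires_grad_(True)
    direction = init_direction.clone().requires_grad_(True)
    opt = torch.optim.Adam([position, direction], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(field(position, direction), target)
        loss.backward()
        opt.step()
    return position.detach(), direction.detach()

# Example: search for a viewpoint whose (visibility, solar exposure, sky view)
# indices are close to a desired pattern.
field = ViewField(num_indices=3)
target = torch.tensor([0.9, 0.7, 0.5])
pos, direction = inverse_query(field, target,
                               init_position=torch.zeros(3),
                               init_direction=torch.tensor([1.0, 0.0, 0.0]))
```

Because the field is an implicit, differentiable function of the viewpoint, both query types avoid re-rendering the full 3D scene, which is the source of the speedups the abstract describes.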
Similar Papers
Learning Neural Exposure Fields for View Synthesis
CV and Pattern Recognition
Makes 3D pictures look good in any light.
Emergent Extreme-View Geometry in 3D Foundation Models
CV and Pattern Recognition
Makes 3D pictures work even with weird camera angles.
Neural Field Representations of Mobile Computational Photography
CV and Pattern Recognition
Makes phones create amazing 3D pictures from photos.