Topology Aware Neural Interpolation of Scalar Fields
By: Mohamed Kissi, Keanu Sisouk, Joshua A. Levine, and more
Potential Business Impact:
Lets computer models accurately fill in missing time steps in simulation data.
This paper presents a neural scheme for the topology-aware interpolation of time-varying scalar fields. Given a time-varying sequence of persistence diagrams, along with a sparse temporal sampling of the corresponding scalar fields, denoted as keyframes, our interpolation approach aims at "inverting" the non-keyframe diagrams to produce plausible estimations of the corresponding, missing data. For this, we rely on a neural architecture which learns the relation from a time value to the corresponding scalar field, based on the keyframe examples, and reliably extends this relation to the non-keyframe time steps. We show how augmenting this architecture with specific topological losses exploiting the input diagrams improves both the geometric and topological reconstruction of the non-keyframe time steps. At query time, given an input time value for which an interpolation is desired, our approach instantaneously produces an output, via a single propagation of the time input through the network. Experiments interpolating 2D and 3D time-varying datasets show the superiority of our approach over reference interpolation schemes, in terms of both data fitting and topological fitting.
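The sketch below is a minimal illustration (not the authors' implementation) of the overall idea: a time-conditioned network maps a time value to a full scalar field sampled on a fixed grid, is trained on the sparse keyframes, and then produces an interpolated field for any query time with a single forward pass. The grid size, MLP architecture, and the `topological_loss` placeholder are assumptions; the paper's actual topological losses compare persistence diagrams of the predicted fields to the input diagrams, which requires a differentiable persistence backend not shown here.

```python
# Minimal sketch, assuming a 2D scalar field on a 64x64 grid and a plain MLP.
# The topological term is a hypothetical placeholder standing in for the
# paper's diagram-based losses.
import torch
import torch.nn as nn

GRID = 64 * 64      # number of grid vertices of each scalar field (assumed)
HIDDEN = 256        # hidden width (assumed)

class TimeToField(nn.Module):
    """Maps a scalar time value to a flattened scalar field."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, HIDDEN), nn.ReLU(),
            nn.Linear(HIDDEN, HIDDEN), nn.ReLU(),
            nn.Linear(HIDDEN, GRID),
        )

    def forward(self, t):        # t: (batch, 1)
        return self.net(t)       # (batch, GRID)

def topological_loss(pred_field, diagram):
    # Hypothetical stand-in: the paper penalizes discrepancies between the
    # persistence diagram of the predicted field and the given input diagram.
    # A real implementation needs differentiable persistence computation.
    return torch.zeros((), device=pred_field.device)

# Toy keyframe data: times in [0, 1] and their (random stand-in) scalar fields.
key_times = torch.linspace(0, 1, 5).unsqueeze(1)   # (5, 1)
key_fields = torch.rand(5, GRID)

model = TimeToField()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(200):
    opt.zero_grad()
    pred = model(key_times)
    loss = nn.functional.mse_loss(pred, key_fields)   # keyframe data fit
    # + weight * topological_loss(...) on non-keyframe predictions, per the paper
    loss.backward()
    opt.step()

# Query: one forward pass yields the interpolated field at any time value.
with torch.no_grad():
    field_at_query = model(torch.tensor([[0.37]]))    # (1, GRID)
```

In this sketch the keyframe loss is a plain mean-squared error against the known fields; the distinguishing ingredient of the paper, the topology-aware term driven by the non-keyframe persistence diagrams, is only indicated as a commented-out placeholder.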
Similar Papers
Deep Parameter Interpolation for Scalar Conditioning
Image and Video Processing
Lets AI learn better by mixing its knowledge.
Multitask Learning with Stochastic Interpolants
Machine Learning (CS)
Creates one AI that does many different jobs.