Geometric Neural Distance Fields for Learning Human Motion Priors
By: Zhengdi Yu, Simone Foti, Linguang Zhang, and more
Potential Business Impact:
Makes computer-animated people move more realistically.
We introduce Neural Riemannian Motion Fields (NRMF), a novel 3D generative human motion prior that enables robust, temporally consistent, and physically plausible 3D motion recovery. Unlike existing VAE- or diffusion-based methods, our higher-order motion prior explicitly models human motion as the zero level set of a collection of neural distance fields (NDFs) corresponding to pose, transition (velocity), and acceleration dynamics. Our framework is rigorous in the sense that our NDFs are constructed on the product space of joint rotations, their angular velocities, and angular accelerations, respecting the geometry of the underlying articulations. We further introduce: (i) a novel adaptive-step hybrid algorithm for projecting onto the set of plausible motions, and (ii) a novel geometric integrator to "roll out" realistic motion trajectories during test-time optimization and generation. Our experiments show significant and consistent gains: trained on the AMASS dataset, NRMF generalizes remarkably across multiple input modalities and to diverse tasks ranging from denoising to motion in-betweening and fitting to partial 2D/3D observations.
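The core idea of projecting a motion onto the zero level set of a distance field can be illustrated with a minimal sketch. The snippet below is a hypothetical illustration, not the paper's algorithm: it replaces the learned NDF over joint rotations, velocities, and accelerations with a simple analytic signed-distance field (to the unit circle in 2D), and applies the standard Newton-like projection step x ← x − d(x) ∇d(x) / ‖∇d(x)‖², which drives a point to the set where the distance vanishes.

```python
import numpy as np

def distance_field(x):
    # Toy stand-in: signed distance to the unit circle in R^2.
    # In NRMF this would be a learned neural distance field over the
    # product space of joint rotations and their derivatives.
    return np.linalg.norm(x) - 1.0

def grad_distance_field(x, eps=1e-6):
    # Central finite-difference gradient (a learned NDF would use autodiff).
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (distance_field(x + e) - distance_field(x - e)) / (2 * eps)
    return g

def project_to_zero_level_set(x, n_steps=50, tol=1e-8):
    # Newton-like projection: step along the gradient by the signed distance,
    # scaled by the inverse squared gradient norm.
    x = np.asarray(x, dtype=float).copy()
    for _ in range(n_steps):
        d = distance_field(x)
        if abs(d) < tol:
            break
        g = grad_distance_field(x)
        x = x - d * g / (np.dot(g, g) + 1e-12)
    return x

x0 = np.array([2.0, 1.0])           # an "implausible" point off the set
x_proj = project_to_zero_level_set(x0)
print(abs(distance_field(x_proj)) < 1e-6)  # projected point lies on the zero level set
```

The paper's adaptive-step hybrid projection and its geometric integrator on the rotation manifold are more involved (fixed-size Euclidean steps are not valid on SO(3)); this sketch only conveys the level-set-projection intuition.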
Similar Papers
Polar Coordinate-Based 2D Pose Prior with Neural Distance Field
CV and Pattern Recognition
Improves sports movement tracking, even with blur.
Joint Optimization of Neural Radiance Fields and Continuous Camera Motion from a Monocular Video
CV and Pattern Recognition
Makes 3D pictures from videos without knowing camera angles.
3D Gaussian Representations with Motion Trajectory Field for Dynamic Scene Reconstruction
Robotics
Makes videos show moving things from new angles.