InterSyn: Interleaved Learning for Dynamic Motion Synthesis in the Wild
By: Yiyi Ma, Yuanzhi Liang, Xiu Li, and others
Potential Business Impact:
Makes computer characters move together realistically.
We present Interleaved Learning for Motion Synthesis (InterSyn), a novel framework that targets the generation of realistic interaction motions by learning from integrated motions that consider both solo and multi-person dynamics. Unlike previous methods that treat these components separately, InterSyn employs an interleaved learning strategy to capture the natural, dynamic interactions and nuanced coordination inherent in real-world scenarios. Our framework comprises two key modules: the Interleaved Interaction Synthesis (INS) module, which jointly models solo and interactive behaviors in a unified paradigm from a first-person perspective to support multiple character interactions, and the Relative Coordination Refinement (REC) module, which refines mutual dynamics and ensures synchronized motions among characters. Experimental results show that the motion sequences generated by InterSyn exhibit higher text-to-motion alignment and improved diversity compared with recent methods, setting a new benchmark for robust and natural motion synthesis. Additionally, our code will be open-sourced in the future to promote further research and development in this area.
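To make the "interleaved learning strategy" concrete, here is a minimal, hypothetical sketch of how solo-motion and multi-person interaction samples might be mixed within a single training loop rather than trained in separate stages. The function name, the `interaction_ratio` parameter, and the two batch labels are illustrative assumptions, not the paper's actual implementation.

```python
import random

def interleaved_schedule(num_steps, interaction_ratio=0.5, seed=0):
    """Hypothetical interleaving: label each training step as a 'solo'
    or 'interaction' batch so both dynamics are learned jointly in one
    unified loop, rather than in separate training phases."""
    rng = random.Random(seed)
    return [
        "interaction" if rng.random() < interaction_ratio else "solo"
        for _ in range(num_steps)
    ]

# Example: a 10-step schedule mixing both kinds of batches.
schedule = interleaved_schedule(10)
```

The key design point this sketches is that neither behavior is frozen while the other is trained; every pass over the data exposes the model to both solo and interactive dynamics.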
Similar Papers
Uni-Inter: Unifying 3D Human Motion Synthesis Across Diverse Interaction Contexts
CV and Pattern Recognition
Makes computer characters move realistically together.
Towards Immersive Human-X Interaction: A Real-Time Framework for Physically Plausible Motion Synthesis
CV and Pattern Recognition
Makes robots move and react like real people.
InterPose: Learning to Generate Human-Object Interactions from Large-Scale Web Videos
CV and Pattern Recognition
Makes computer characters interact with objects realistically.