Controllable Long-term Motion Generation with Extended Joint Targets
By: Eunjong Lee, Eunhee Kim, Sanghoon Hong, and more
Potential Business Impact:
Makes animated characters move realistically with easy control.
Generating stable and controllable character motion in real-time is a key challenge in computer animation. Existing methods often fail to provide fine-grained control or suffer from motion degradation over long sequences, limiting their use in interactive applications. We propose COMET, an autoregressive framework that runs in real time, enabling versatile character control and robust long-horizon synthesis. Our efficient Transformer-based conditional VAE allows for precise, interactive control over arbitrary user-specified joints for tasks like goal-reaching and in-betweening from a single model. To ensure long-term temporal stability, we introduce a novel reference-guided feedback mechanism that prevents error accumulation. This mechanism also serves as a plug-and-play stylization module, enabling real-time style transfer. Extensive evaluations demonstrate that COMET robustly generates high-quality motion at real-time speeds, significantly outperforming state-of-the-art approaches in complex motion control tasks and confirming its readiness for demanding interactive applications.
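The reference-guided feedback idea above can be illustrated with a toy sketch (this is not the paper's code; the constant-drift predictor, the `feedback` blending weight, and the zero reference pose are all illustrative assumptions): an unconstrained autoregressive rollout accumulates error frame by frame, while blending each predicted frame slightly toward a reference pose keeps the trajectory bounded.

```python
import numpy as np

def autoregressive_rollout(x0, step_fn, n_steps, reference=None, feedback=0.0):
    """Roll out an autoregressive predictor frame by frame.

    If `reference` is given, each new frame is blended toward it with
    strength `feedback` in [0, 1] -- a toy stand-in for a
    reference-guided feedback mechanism that bounds error accumulation.
    """
    poses = [x0]
    x = x0
    for _ in range(n_steps):
        x = step_fn(x)  # next-frame prediction (here: a drifting toy model)
        if reference is not None:
            x = (1.0 - feedback) * x + feedback * reference
        poses.append(x)
    return np.stack(poses)

# Toy predictor whose error grows by a constant 0.1 per frame per joint,
# mimicking drift over a long sequence.
drift_step = lambda x: x + 0.1

x0 = np.zeros(3)                       # toy 3-DoF "pose"
free = autoregressive_rollout(x0, drift_step, 500)
guided = autoregressive_rollout(x0, drift_step, 500,
                                reference=np.zeros(3), feedback=0.05)

# The free rollout drifts without bound; the guided one converges to a
# fixed point near the reference instead of accumulating error.
print(np.linalg.norm(free[-1]), np.linalg.norm(guided[-1]))
```

With these numbers the free rollout ends at 0.1 x 500 = 50 per joint, while the guided one settles at the fixed point 0.95 x 0.1 / 0.05 = 1.9, showing how even a weak feedback term keeps long-horizon synthesis stable.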
Similar Papers
RealisMotion: Decomposed Human Motion Control and Video Generation in the World Space
CV and Pattern Recognition
Lets you make videos of anyone doing anything.
CoMo: Compositional Motion Customization for Text-to-Video Generation
CV and Pattern Recognition
Makes videos show many actions at once.
MotionStream: Real-Time Video Generation with Interactive Motion Controls
CV and Pattern Recognition
Makes videos play instantly as you create them.