Animate Any Character in Any World
By: Yitong Wang, Fangyun Wei, Hongyang Zhang, and more
Recent advances in world models have greatly enhanced interactive environment simulation. Existing methods mainly fall into two categories: (1) static world generation models, which construct 3D environments without active agents, and (2) controllable-entity models, which allow a single entity to perform limited actions in an otherwise uncontrollable environment. In this work, we introduce AniX, which leverages the realism and structural grounding of static world generation while extending controllable-entity models to support user-specified characters capable of performing open-ended actions. Users can provide a 3DGS (3D Gaussian Splatting) scene and a character, then direct the character through natural language to perform diverse behaviors, from basic locomotion to object-centric interactions, while freely exploring the environment. We formulate this task as a conditional autoregressive video generation problem: AniX synthesizes temporally coherent video clips that preserve visual fidelity to the provided scene and character. Built upon a pre-trained video generator, AniX uses a training strategy that significantly enhances motion dynamics while maintaining generalization across actions and characters. Our evaluation covers a broad range of aspects, including visual quality, character consistency, action controllability, and long-horizon coherence.
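To make the conditional autoregressive formulation concrete, here is a minimal Python sketch of such a generation loop: each short clip is conditioned on the scene, the character, the language instruction, and the tail frames of the previous clip. All names (`GenerationInputs`, `generate_clip`, the chunk sizes) are illustrative assumptions, not the authors' actual API.

```python
# Hypothetical sketch of a conditional autoregressive video generation loop.
# The model interface below is assumed for illustration only.

from dataclasses import dataclass
from typing import Any, List

import numpy as np


@dataclass
class GenerationInputs:
    scene_gaussians: Any      # user-provided 3DGS scene (3D Gaussian Splatting)
    character_reference: Any  # user-specified character (e.g., reference images)
    instruction: str          # natural-language action command


def generate_long_video(model: Any, inputs: GenerationInputs,
                        num_chunks: int = 8,
                        frames_per_chunk: int = 16,
                        context_frames: int = 4) -> List[np.ndarray]:
    """Autoregressively synthesize a long video as a sequence of short clips.

    Each chunk is conditioned on (scene, character, instruction) plus the
    tail frames of the previous chunk, keeping the rollout temporally coherent.
    """
    video: List[np.ndarray] = []
    context: List[np.ndarray] = []  # no prior frames before the first chunk

    for _ in range(num_chunks):
        clip = model.generate_clip(        # assumed interface of the video generator
            scene=inputs.scene_gaussians,
            character=inputs.character_reference,
            instruction=inputs.instruction,
            context=context,               # condition on previously generated frames
            num_frames=frames_per_chunk,
        )
        video.extend(clip)
        context = clip[-context_frames:]   # carry the clip tail forward as conditioning

    return video
```

In this sketch, long-horizon coherence comes from re-feeding the last few generated frames as context for the next clip, which is one common way to realize autoregressive conditioning on top of a pre-trained short-clip video generator.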
Similar Papers
Animate-X++: Universal Character Image Animation with Dynamic Backgrounds
CV and Pattern Recognition
Makes cartoon characters move realistically in videos.
Hunyuan-GameCraft-2: Instruction-following Interactive Game World Model
CV and Pattern Recognition
Makes game worlds react to your spoken commands.
AnimateScene: Camera-controllable Animation in Any Scene
CV and Pattern Recognition
Makes animated people fit perfectly into real scenes.