PanFlow: Decoupled Motion Control for Panoramic Video Generation
By: Cheng Zhang, Hanwen Liang, Donny Y. Chen, and more
Potential Business Impact:
Creates realistic panoramic videos with smooth, complex motion.
Panoramic video generation has attracted growing attention due to its applications in virtual reality and immersive media. However, existing methods lack explicit motion control and struggle to generate scenes with large and complex motions. We propose PanFlow, a novel approach that exploits the spherical nature of panoramas to decouple the highly dynamic camera rotation from the input optical flow condition, enabling more precise control over large and dynamic motions. We further introduce a spherical noise warping strategy to promote loop consistency in motion across panorama boundaries. To support effective training, we curate a large-scale, motion-rich panoramic video dataset with frame-level pose and flow annotations. We also showcase the effectiveness of our method in various applications, including motion transfer and video editing. Extensive experiments demonstrate that PanFlow significantly outperforms prior methods in motion fidelity, visual quality, and temporal coherence. Our code, dataset, and models are available at https://github.com/chengzhag/PanFlow.
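The loop-consistency idea behind spherical noise warping can be illustrated for the simplest case: in an equirectangular panorama, a pure camera yaw corresponds to a horizontal circular shift of the latent noise, which wraps seamlessly across the left/right boundary. The sketch below is a minimal illustration under that assumption only (pure yaw, equirectangular latents); full spherical warping with pitch and roll requires true sphere resampling, and none of these names come from the PanFlow codebase.

```python
import numpy as np

def warp_noise_by_yaw(noise, yaw_rad):
    """Illustrative noise warp for a pure yaw rotation (hypothetical helper).

    A yaw of `yaw_rad` maps to a circular shift along the panorama width,
    so the warped noise stays continuous across the panorama seam.
    """
    w = noise.shape[-1]
    shift = int(round(yaw_rad / (2 * np.pi) * w))
    return np.roll(noise, shift, axis=-1)

# Example: apply a per-frame yaw trajectory to a shared base noise tensor.
rng = np.random.default_rng(0)
base = rng.standard_normal((4, 16, 32))  # (channels, height, width) latent noise
frames = [warp_noise_by_yaw(base, yaw) for yaw in np.linspace(0.0, np.pi, 5)]
```

Because the shift is circular, a full 2π yaw returns the original noise exactly, which is the boundary (loop) consistency property the paper's warping strategy targets.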
Similar Papers
CamPVG: Camera-Controlled Panoramic Video Generation with Epipolar-Aware Diffusion
CV and Pattern Recognition
Makes 360-degree videos follow camera movement.
Beyond the Frame: Generating 360° Panoramic Videos from Perspective Videos
CV and Pattern Recognition
Turns ordinary perspective videos into 360-degree videos.
Dual-Projection Fusion for Accurate Upright Panorama Generation in Robotic Vision
CV and Pattern Recognition
Straightens robot-captured panoramas for more accurate perception.