Generate the browsing process for short-video recommendation
By: Chao Feng, Yanze Zhang, Chenghao Zhang
Potential Business Impact:
Makes video apps keep you watching longer.
This paper proposes a generative method that dynamically simulates a user's short-video watching journey to predict watch time in short-video recommendation. Unlike existing methods that rely on multimodal features for video content understanding, our method simulates a user's sustained interest by learning collaborative information: interest changes inferred from previously watched positive- and negative-feedback videos, together with user interaction behaviors, implicitly model the user's video watching journey. By segmenting videos based on duration and adopting a Transformer-like architecture, the method captures sequential dependencies between segments while mitigating duration bias. Extensive experiments on industrial-scale and public datasets demonstrate that the method achieves state-of-the-art performance on watch time prediction tasks. The method has been deployed on Kuaishou Lite, achieving a significant improvement of +0.13% in app duration and reaching an XAUC of 83% for single-video watch time prediction on industrial-scale streaming training sets, far exceeding other methods. The proposed method provides a scalable and effective solution for video recommendation through segment-level modeling and user engagement feedback.
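The abstract describes predicting watch time from duration-based video segments. The paper's exact architecture isn't given here, but the general segment-level idea can be sketched as follows: split a video into equal-length segments, assume a model emits a continuation probability per segment (the probability the user keeps watching through that segment given they reached it), and sum segment lengths weighted by cumulative survival. The function name and probabilities below are illustrative, not from the paper.

```python
# Hypothetical sketch of segment-level watch-time estimation.
# A video of duration D is split into K equal segments; cont_probs[k]
# stands in for a model's P(user watches through segment k | reached it).
# Expected watch time = sum of segment lengths weighted by the
# cumulative probability of still watching.

def expected_watch_time(duration: float, cont_probs: list[float]) -> float:
    """Expected watch time from per-segment continuation probabilities."""
    seg_len = duration / len(cont_probs)
    total, survive = 0.0, 1.0
    for p in cont_probs:
        survive *= p          # P(still watching through this segment)
        total += survive * seg_len
    return total

# Example: 60s video, 4 segments, interest decaying over the video.
print(round(expected_watch_time(60.0, [0.9, 0.8, 0.6, 0.4]), 2))  # -> 33.37
```

A duration-bias mitigation in this spirit is that the model scores segments rather than raw seconds, so long and short videos are compared on comparable per-segment terms.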
Similar Papers
Short Video Segment-level User Dynamic Interests Modeling in Personalized Recommendation
Information Retrieval
Shows videos you'll like, even within clips.
Explicit Uncertainty Modeling for Video Watch Time Prediction
Information Retrieval
Makes videos more interesting, people watch longer.
Research on the Design of a Short Video Recommendation System Based on Multimodal Information and Differential Privacy
Information Retrieval
Keeps your video likes private while showing you good videos.