PlugTrack: Multi-Perceptive Motion Analysis for Adaptive Fusion in Multi-Object Tracking
By: Seungjae Kim, SeungJoon Lee, MyeongAh Cho
Potential Business Impact:
Tracks moving objects more reliably by adaptively mixing classical and learning-based motion prediction.
Multi-object tracking (MOT) predominantly follows the tracking-by-detection paradigm, where Kalman filters serve as the standard motion predictor due to their computational efficiency but inherently fail on non-linear motion patterns. Conversely, recent data-driven motion predictors capture complex non-linear dynamics but suffer from limited domain generalization and computational overhead. Through extensive analysis, we reveal that even in datasets dominated by non-linear motion, the Kalman filter outperforms data-driven predictors in up to 34% of cases, demonstrating that real-world tracking scenarios inherently involve both linear and non-linear patterns. To leverage this complementarity, we propose PlugTrack, a novel framework that adaptively fuses the Kalman filter and data-driven motion predictors through multi-perceptive motion understanding. Our approach employs multi-perceptive motion analysis to generate the adaptive blending factors. PlugTrack achieves significant performance gains on MOT17/MOT20 and state-of-the-art results on DanceTrack without modifying existing motion predictors. To the best of our knowledge, PlugTrack is the first framework to bridge classical and modern motion prediction paradigms through adaptive fusion in MOT.
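As a rough sketch of the adaptive-fusion idea described in the abstract (not PlugTrack's actual implementation), the Python snippet below blends a Kalman-style prediction with a data-driven prediction using a blending factor derived from a simple non-linearity score. The constant-velocity and quadratic stand-in predictors, the center-point track representation, and all function names are illustrative assumptions.

# Minimal sketch of adaptive fusion between a Kalman-style prediction and a
# data-driven motion prediction. All predictors, the (cx, cy) track format,
# and the blending heuristic are illustrative assumptions, not PlugTrack's code.
import numpy as np

def kalman_predict(history: np.ndarray) -> np.ndarray:
    """Constant-velocity extrapolation as a stand-in for a full Kalman filter."""
    velocity = history[-1] - history[-2]
    return history[-1] + velocity

def learned_predict(history: np.ndarray) -> np.ndarray:
    """Placeholder for a data-driven predictor (e.g., a small sequence model);
    here a quadratic extrapolation acts as a cheap non-linear proxy."""
    v1, v2 = history[-1] - history[-2], history[-2] - history[-3]
    return history[-1] + v1 + (v1 - v2)

def blending_factor(history: np.ndarray) -> float:
    """Map a simple non-linearity score to a weight in [0, 1].
    Higher score -> trust the data-driven predictor more."""
    accel = np.linalg.norm((history[-1] - history[-2]) - (history[-2] - history[-3]))
    speed = np.linalg.norm(history[-1] - history[-2]) + 1e-6
    return float(1.0 / (1.0 + np.exp(-(accel / speed - 0.5) * 5.0)))

def fused_predict(history: np.ndarray) -> np.ndarray:
    """Adaptively blend the two predictions for one track."""
    alpha = blending_factor(history)
    return (1.0 - alpha) * kalman_predict(history) + alpha * learned_predict(history)

if __name__ == "__main__":
    # Track history of (cx, cy) object centers over the last three frames.
    track = np.array([[10.0, 5.0], [12.0, 5.5], [15.0, 6.5]])
    print(fused_predict(track))

In the paper's framework, the blending factor would come from the learned multi-perceptive motion analysis rather than from a hand-crafted heuristic like the one above.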
Similar Papers
Towards Accurate State Estimation: Kalman Filter Incorporating Motion Dynamics for 3D Multi-Object Tracking
CV and Pattern Recognition
Improves 3D tracking of moving objects, even through occlusions, by building motion dynamics into the Kalman filter.
StableTrack: Stabilizing Multi-Object Tracking on Low-Frequency Detections
CV and Pattern Recognition
Keeps tracking stable when the detector runs only infrequently, saving compute.
Multi-tracklet Tracking for Generic Targets with Adaptive Detection Clustering
CV and Pattern Recognition
Follows generic targets, even through occlusions, by linking tracklets with adaptive detection clustering.