Benchmarking SAM2-based Trackers on FMOX
By: Senem Aktas, Charles Markham, John McDonald, and more
Several object tracking pipelines extending the Segment Anything Model 2 (SAM2) have been proposed in the past year. These approaches follow and segment an object from a single exemplar template provided by the user on an initialization frame. We propose to benchmark these high-performing trackers (SAM2, EfficientTAM, DAM4SAM and SAMURAI) on datasets containing fast-moving objects (FMO), which are specifically designed to be challenging for tracking approaches. The goal is to better understand the current limitations of state-of-the-art trackers by providing more detailed insights into their behavior. We show that, overall, DAM4SAM and SAMURAI perform well on the more challenging sequences.
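To illustrate the single-exemplar prompting protocol described above, the sketch below shows how a sequence can be initialized with one bounding-box prompt on the first frame and then propagated to obtain per-frame masks. It is a minimal sketch assuming the video predictor API from the public facebookresearch/sam2 repository; the checkpoint path, config name, sequence path, and example box are placeholders rather than values taken from the paper.

```python
# Minimal single-exemplar tracking sketch (assumed SAM2 video predictor API).
import numpy as np
import torch
from sam2.build_sam import build_sam2_video_predictor

checkpoint = "checkpoints/sam2.1_hiera_large.pt"   # placeholder checkpoint path
model_cfg = "configs/sam2.1/sam2.1_hiera_l.yaml"   # placeholder config name

predictor = build_sam2_video_predictor(model_cfg, checkpoint)

with torch.inference_mode():
    # Load the sequence (a directory of JPEG frames) and initialize the memory state.
    state = predictor.init_state(video_path="path/to/fmo_sequence_frames")

    # Single exemplar template: one bounding box on the initialization frame.
    predictor.add_new_points_or_box(
        inference_state=state,
        frame_idx=0,
        obj_id=1,
        box=np.array([100, 150, 220, 260], dtype=np.float32),  # x1, y1, x2, y2 (placeholder)
    )

    # Propagate the prompt through the rest of the video to collect per-frame masks.
    masks_per_frame = {}
    for frame_idx, obj_ids, mask_logits in predictor.propagate_in_video(state):
        masks_per_frame[frame_idx] = (mask_logits[0] > 0.0).cpu().numpy()
```

The resulting binary masks can then be compared against ground-truth annotations (e.g. per-frame mask IoU) to score a tracker on an FMO sequence; the other trackers mentioned above expose analogous propagation interfaces built on top of SAM2.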