UCMCTrack: Multi-Object Tracking with Uniform Camera Motion Compensation
Kefu Yi^1,* Kai Luo^2 Xiaolei Luo^2 Jiangui Huang^2 Hao Wu^2 Rongdong Hu^3 Wei Hao^1,*
Abstract
Multi-object tracking (MOT) in video sequences remains a challenging task, especially in scenarios with significant camera movements. This is because targets can drift considerably on the image plane, leading to erroneous tracking outcomes. Addressing such challenges typically requires supplementary appearance cues or Camera Motion Compensation (CMC). While these strategies are effective, they also introduce a considerable computational burden, posing challenges for real-time MOT. In response, we introduce UCMCTrack, a novel motion model-based tracker robust to camera movements. Unlike conventional CMC, which computes compensation parameters frame by frame, UCMCTrack consistently applies the same compensation parameters throughout a video sequence. It employs a Kalman filter on the ground plane and introduces the Mapped Mahalanobis Distance (MMD) as an alternative to the traditional Intersection over Union (IoU) distance measure. By leveraging projected probability distributions on the ground plane, our approach efficiently captures motion patterns and adeptly manages the uncertainties introduced by homography projections. Remarkably, UCMCTrack, relying solely on motion cues, achieves state-of-the-art performance across a variety of challenging datasets, including MOT17, MOT20, DanceTrack, and KITTI. More details and code are available at https://github.com/corfyi/UCMCTrack
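The core idea behind the Mapped Mahalanobis Distance can be sketched in a few lines: project a detection's bounding-box bottom-center to the ground plane through a homography, then measure its Mahalanobis distance against a track's predicted ground-plane distribution. The sketch below is a minimal illustration of this idea, not the paper's implementation; the homography matrix, track state, and covariance are invented for the example.

```python
import numpy as np

def image_to_ground(uv, H):
    """Project an image point (e.g. a bounding box's bottom-center)
    to the ground plane via a 3x3 homography H."""
    p = H @ np.array([uv[0], uv[1], 1.0])
    return p[:2] / p[2]  # dehomogenize

def mapped_mahalanobis(det_uv, track_xy, track_cov, H):
    """Squared Mahalanobis distance between a detection (projected to
    the ground plane) and a track's predicted ground-plane position."""
    z = image_to_ground(det_uv, H)
    r = z - track_xy
    return float(r @ np.linalg.inv(track_cov) @ r)

# Hypothetical homography and track state, for illustration only.
H = np.array([[0.02, 0.0,   -5.0],
              [0.0,  0.05, -10.0],
              [0.0,  0.001,  1.0]])
track_xy = np.array([0.0, 0.0])   # predicted ground-plane position
track_cov = np.eye(2)             # predicted ground-plane covariance

d0 = mapped_mahalanobis((250.0, 200.0), track_xy, track_cov, H)  # on-track detection
d1 = mapped_mahalanobis((300.0, 200.0), track_xy, track_cov, H)  # offset detection
```

In the full tracker this distance replaces IoU as the association cost, and the covariance comes from the ground-plane Kalman filter's predicted uncertainty rather than being fixed as above.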