Yan Bin, Jiang Yi, Sun Peize, Wang Dong, Yuan Zehuan, Luo Ping, Lu Huchuan

Abstract
We present a unified method, termed Unicorn, that can simultaneously solve four tracking problems (SOT, MOT, VOS, MOTS) with a single network using the same model parameters. Due to the fragmented definitions of the object tracking problem itself, most existing trackers are developed to address a single task or a subset of tasks and overspecialize on the characteristics of specific tasks. By contrast, Unicorn provides a unified solution, adopting the same input, backbone, embedding, and head across all tracking tasks. For the first time, we accomplish the great unification of the tracking network architecture and learning paradigm. Unicorn performs on par with or better than its task-specific counterparts on 8 tracking datasets, including LaSOT, TrackingNet, MOT17, BDD100K, DAVIS16-17, MOTS20, and BDD100K MOTS. We believe that Unicorn will serve as a solid step towards the general vision model. Code is available at https://github.com/MasterBin-IIAU/Unicorn.
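The core design claim above is that one set of parameters, one backbone, one embedding, and one head serve all four tracking tasks. A minimal sketch of that idea is below; it is purely illustrative (the class, its methods, and the toy scalar "embedding" are assumptions for illustration, not Unicorn's actual code), but it shows how a single model can expose the same forward path regardless of which task is requested.

```python
# Hypothetical sketch (not the authors' implementation): one network whose
# backbone, embedding, and head are shared across SOT, MOT, VOS, and MOTS.
from dataclasses import dataclass


@dataclass
class Detection:
    box: tuple        # (x, y, w, h) placeholder box
    embedding: float  # toy scalar embedding; real models use feature vectors


class UnifiedTracker:
    """Toy stand-in for the unified design: the same parameters and the
    same forward path handle every tracking task, instead of one
    specialised model per task."""

    def backbone(self, frame):
        # Placeholder feature extraction: sum of "pixel" values.
        return sum(frame)

    def embed(self, feature):
        # Shared embedding used for target association in every task.
        return feature * 0.5

    def head(self, embedding):
        # Shared head producing a (box, embedding) pair.
        return Detection(box=(0, 0, 1, 1), embedding=embedding)

    def track(self, frame, task):
        assert task in {"SOT", "MOT", "VOS", "MOTS"}
        # Note: the task label does not change the computation path --
        # that is the point of the unified design.
        return self.head(self.embed(self.backbone(frame)))


tracker = UnifiedTracker()
# The identical call (and identical parameters) serves every task:
results = {t: tracker.track([1, 2, 3], t) for t in ("SOT", "MOT", "VOS", "MOTS")}
```

The sketch omits everything task-specific (reference-frame inputs, mask decoding, identity management); it only illustrates the shared input-backbone-embedding-head pipeline described in the abstract.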
Benchmarks
| Benchmark | Method | Metrics |
|---|---|---|
| multi-object-tracking-and-segmentation-on-3 | Unicorn | mMOTSA: 29.6 |
| multi-object-tracking-on-mot17 | Unicorn | HOTA: 61.7, IDF1: 75.5, MOTA: 77.2 |
| multi-object-tracking-on-mots20 | Unicorn | IDF1: 65.9, sMOTSA: 65.3 |
| multiple-object-tracking-on-bdd100k-val | Unicorn | TETA: -, mIDF1: 54.0, mMOTA: 41.2 |
| video-object-tracking-on-nv-vot211 | Unicorn | AUC: 34.52, Precision: 47.77 |
| visual-object-tracking-on-lasot | Unicorn | AUC: 68.5, Normalized Precision: 76.6, Precision: 74.1 |
| visual-object-tracking-on-trackingnet | Unicorn | Accuracy: 83.0, Normalized Precision: 86.4, Precision: 82.2 |