open-mmlab / mmtracking

OpenMMLab Video Perception Toolbox. It supports Video Object Detection (VID), Multiple Object Tracking (MOT), Single Object Tracking (SOT), and Video Instance Segmentation (VIS) with a unified framework.
https://mmtracking.readthedocs.io/en/latest/
Apache License 2.0

Custom MOT data #530

Open whiteplatin opened 2 years ago

whiteplatin commented 2 years ago

I've got a video labelled in the MOT format, with a gt.txt that looks something like:

1,1,1126.93,422.16,85.66999999999985,163.32,1,1,1.0
2,1,1127.48,421.92,85.67000000000007,163.32,1,1,1.0
3,1,1128.03,421.68,85.67000000000007,163.32,1,1,1.0
4,1,1128.58,421.44,85.67000000000007,163.32,1,1,1.0
5,1,1129.13,421.2,85.66999999999985,163.32,1,1,1.0
6,1,1129.68,420.96,85.66999999999985,163.32,1,1,1.0
...

It uses custom classes.

The converter at https://github.com/open-mmlab/mmtracking/blob/master/tools/convert_datasets/mot/mot2coco.py mentions DET and Results. Are those required? I was hoping to get DET and Results after running mmtracking.
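For reference, the gt.txt columns above appear to follow the usual MOTChallenge layout: frame, track id, bbox left, bbox top, width, height, a confidence/consider flag, class id, and visibility. Below is a minimal parsing sketch under that assumption; the helper name is made up and the column meanings should be double-checked against your labelling tool.

```python
import csv


def load_mot_gt(path):
    """Parse MOTChallenge-style gt.txt rows:
    frame, track_id, x, y, w, h, conf/consider, class_id, visibility."""
    annotations = []
    with open(path) as f:
        for row in csv.reader(f):
            if not row:
                continue
            frame, track_id = int(row[0]), int(row[1])
            x, y, w, h = map(float, row[2:6])
            conf, class_id, vis = float(row[6]), int(row[7]), float(row[8])
            annotations.append(dict(frame=frame, track_id=track_id,
                                    bbox=[x, y, w, h], conf=conf,
                                    category_id=class_id, visibility=vis))
    return annotations
```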

dyhBUPT commented 2 years ago

Hi, DET is required only if your tracker needs detections as input (e.g., DeepSORT). "Results" are generated by running mmtracking.

Best wishes.
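To make the "Results are generated by running mmtracking" part concrete, here is a hedged sketch using the mmtrack.apis helpers; the config, checkpoint, and video paths are placeholders, and the exact result keys may differ across mmtracking versions.

```python
import mmcv
from mmtrack.apis import inference_mot, init_model

# Placeholder paths: substitute your own config/checkpoint/video.
config_file = 'configs/mot/bytetrack/bytetrack_yolox_x_crowdhuman_mot17-private-half.py'
checkpoint_file = 'checkpoints/bytetrack_mot17.pth'

model = init_model(config_file, checkpoint_file, device='cuda:0')

for frame_id, img in enumerate(mmcv.VideoReader('demo.mp4')):
    result = inference_mot(model, img, frame_id=frame_id)
    # result is typically a dict; result['track_bboxes'] holds per-class arrays
    # of [track_id, x1, y1, x2, y2, score] for the current frame.
```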

noahcao commented 2 years ago

To be precise, DET is designed for the public-detection setting, in which different tracking algorithms associate the same shared detection results. If you are not targeting that setting, DET files are not necessary.
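As an illustration of the public-detection setting, a det.txt in the MOTChallenge layout carries only class-agnostic boxes with scores (frame, -1, x, y, w, h, score, -1, -1, -1). The snippet below writes such a file with made-up values, just to show the shape of the data a DeepSORT-style tracker would consume.

```python
# Hypothetical public detections: (frame, -1, x, y, w, h, score, -1, -1, -1).
detections = [
    (1, -1, 1126.9, 422.2, 85.7, 163.3, 0.98, -1, -1, -1),
    (2, -1, 1127.5, 421.9, 85.7, 163.3, 0.97, -1, -1, -1),
]
with open('det.txt', 'w') as f:
    for det in detections:
        f.write(','.join(str(v) for v in det) + '\n')
```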

noahcao commented 2 years ago

If no public detections are provided, you can define your own MOT dataset by imitating the DanceTrack dataset support added in PR #543.
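For example, a custom dataset defined by analogy with the DanceTrack support might look like the sketch below. The class name is hypothetical and the import/registry paths mirror mmtracking 0.x, so treat the PR itself as the reference for your version.

```python
from mmdet.datasets import DATASETS  # registry used by mmtracking 0.x datasets
from mmtrack.datasets import MOTChallengeDataset


@DATASETS.register_module()
class MyCustomMOTDataset(MOTChallengeDataset):
    """MOT-style dataset with custom classes (hypothetical example)."""
    CLASSES = ('my_object',)  # replace with the classes used in your gt.txt
```

Your config would then set dataset_type = 'MyCustomMOTDataset' and point ann_file at the COCO-style annotations produced by mot2coco.py.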