Closed — briannlongzhao closed this issue 2 years ago
Also, in the paper you report mIDF1, but the table in this repo shows IDF1, while the numbers are the same. Just to confirm: are you referring to the "AVERAGE" or the "OVERALL" IDF1, assuming you are using the official BDD100K evaluation code?
We are not using the BDD100K image detection set, which could improve detection performance. The reported results are mMOTA and mIDF1.
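For clarity on the question above: in the BDD100K MOT literature, mIDF1 usually denotes the unweighted mean of per-class IDF1 (the "AVERAGE" row in the evaluation output), while "OVERALL" pools the ID counts across all classes before computing IDF1. A minimal sketch of the difference, using made-up IDTP/IDFP/IDFN counts purely for illustration:

```python
# Hypothetical per-class ID counts (IDTP, IDFP, IDFN) -- illustrative numbers only.
counts = {
    "car":     (80, 20, 20),  # common class, tracked well
    "bicycle": (10, 30, 30),  # rare class, tracked poorly
}

def idf1(idtp, idfp, idfn):
    # Standard IDF1 definition: 2*IDTP / (2*IDTP + IDFP + IDFN)
    return 2 * idtp / (2 * idtp + idfp + idfn)

# "AVERAGE" style: unweighted mean of per-class IDF1 -> mIDF1
m_idf1 = sum(idf1(*c) for c in counts.values()) / len(counts)

# "OVERALL" style: IDF1 computed from counts pooled across classes
pooled = [sum(col) for col in zip(*counts.values())]
overall_idf1 = idf1(*pooled)

print(round(m_idf1, 4), round(overall_idf1, 4))  # the two can differ substantially
```

Because mIDF1 weights every class equally, rare classes that are tracked poorly pull it well below the pooled "OVERALL" IDF1, which is why the two rows can diverge on a long-tailed dataset like BDD100K.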
Thanks for your response, but I don't quite understand what you mean; could you please elaborate? My question in short: have you tried other existing methods on BDD100K besides the two listed in your paper? Do you have any ideas on why transformer-based models generally do not work very well on BDD100K compared to some traditional trackers such as QDTrack (if that is actually the case)? Thank you!
Hello, which CUDA version and which versions of torch and torchvision did you use when training the BDD100K branch? I have been stuck on some problems for many days and hope to get your help. Thanks!!
Hi, I found that the performance of the model on the multi-class dataset BDD100K is lower than that of traditional detector-based trackers such as QDTrack, and I think the same holds for other transformer-based trackers such as GTR. In my experiments, QDTrack reaches about 50 mIDF1 and about 70 IDF1 on BDD100K, but transformer-based trackers do not work nearly as well. On single-class tracking datasets like MOT17, however, transformer-based models work quite well. Have you run any experiments on this, and do you have any insights into the possible reasons? Please correct me if I am wrong, thank you!