ifzhang / ByteTrack

[ECCV 2022] ByteTrack: Multi-Object Tracking by Associating Every Detection Box
MIT License

track_id bug with FP16 #182

Closed HanGuangXin closed 2 years ago

HanGuangXin commented 2 years ago

Hi, I found a subtle bug when the model is trained with FP16. In yolox/core/trainer.py, when we get targets with shape [batchsize, 1000, class_id + tlwh + track_id], the track_id is correct. But when targets is converted to FP16, the track_id loses precision, resulting in wrong labels for ReID. This bug is not easy to spot.
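
For illustration, here is a minimal snippet (not from the repo) showing how FP16 rounds large integer IDs:

```python
import torch

# float16 has a 10-bit mantissa, so integers above 2048 cannot all be
# represented exactly; larger track IDs get rounded to nearby values.
track_ids = torch.tensor([1024.0, 2049.0, 4097.0])
print(track_ids.to(torch.float16))
# tensor([1024., 2048., 4096.], dtype=torch.float16)
```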

Although this bug does not affect ByteTrack's performance, since ByteTrack only uses the detection annotations, it will severely harm ReID performance when trying to combine ByteTrack with a ReID module in the JDE paradigm.

I can make a PR if you think it is needed :)

ifzhang commented 2 years ago

Thank you very much for finding the bug! I tried to add Re-ID features in ByteTrack and the results dropped a lot. It would be great if you could make a PR!

HanGuangXin commented 2 years ago

Hi, I'm thinking about separating the track_id annotations from the targets variable: keep targets in torch.float16 as in the current code, but keep track_id in torch.float32.
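
As a rough sketch of the idea (variable names here are illustrative, not the exact trainer.py code):

```python
import torch

def split_targets(targets: torch.Tensor):
    # targets: [batch, max_objs, 6] laid out as class_id + tlwh + track_id
    det_targets = targets[..., :5].to(torch.float16)  # class_id + tlwh tolerate FP16
    track_ids = targets[..., 5].to(torch.float32)     # track_id must stay exact
    return det_targets, track_ids
```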

If you think this is OK, I will try to make a PR tonight; more details will be provided there.

ifzhang commented 2 years ago

Sure, thank you very much!

HanGuangXin commented 2 years ago

@ifzhang The PR is made. Could you check and merge it?

ifzhang commented 2 years ago

I have merged the PR, thanks very much!

HanGuangXin commented 2 years ago

> Thank you very much for finding the bug! I tried to add Re-ID features in ByteTrack and the results dropped a lot. It would be great if you could make a PR!

@ifzhang Hi! Maybe you can re-run the ReID experiments now, in case the old results were affected by this bug. I'm looking forward to seeing another awesome work from you!