megvii-research / MOTR

[ECCV2022] MOTR: End-to-End Multiple-Object Tracking with TRansformer
633 stars 93 forks

About 'memory-optimized version' mentioned in paper #48

Closed HELLORPG closed 2 years ago

HELLORPG commented 2 years ago

Hi, I've read your ECCV 2022 paper and believe that your research is truly meaningful.

I'm trying to reproduce your experiments. In Section 4.2 of your paper, you mention that you "provide a memory-optimized version that can be trained on NVIDIA 2080 Ti GPUs", but I didn't find any details in this repository.

Did you release the memory-optimized code? If not, will you release this part?

Thanks a lot for your contribution.

zyayoung commented 2 years ago

Yes, checkpointing (memory optimization) is released. Just add the --use_checkpoint argument (I think it is already added in the script).
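For readers unfamiliar with the technique: "checkpointing" here refers to gradient (activation) checkpointing, which discards intermediate activations during the forward pass and recomputes them during backward, trading extra compute for lower memory. A minimal sketch of how a flag like `--use_checkpoint` is typically wired up with `torch.utils.checkpoint` (the `Block`/`Stack` modules below are hypothetical stand-ins, not MOTR's actual code):

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint


class Block(nn.Module):
    """Hypothetical stand-in for one transformer layer."""

    def __init__(self, dim: int = 64):
        super().__init__()
        self.ff = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.ReLU(), nn.Linear(4 * dim, dim)
        )

    def forward(self, x):
        return x + self.ff(x)


class Stack(nn.Module):
    """A stack of blocks with an optional checkpointing switch."""

    def __init__(self, depth: int = 4, use_checkpoint: bool = False):
        super().__init__()
        self.blocks = nn.ModuleList(Block() for _ in range(depth))
        self.use_checkpoint = use_checkpoint

    def forward(self, x):
        for blk in self.blocks:
            if self.use_checkpoint and self.training:
                # Activations inside blk are freed after the forward pass
                # and recomputed during backward, saving memory at the
                # cost of a second forward through blk.
                x = checkpoint(blk, x, use_reentrant=False)
            else:
                x = blk(x)
        return x
```

Checkpointing changes only the memory/compute trade-off, not the math: with identical weights and inputs, the checkpointed and non-checkpointed forward passes produce the same outputs and gradients.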

HELLORPG commented 2 years ago

It's my fault that I misunderstood the meaning of the --use_checkpoint argument. Thanks a lot!

Bian-666 commented 2 years ago

> It's my fault that I misunderstood the meaning of the --use_checkpoint argument. Thanks a lot!

Hi, is the memory-optimized version trained on 8 2080 Ti GPUs or only a single 2080 Ti?

HELLORPG commented 2 years ago

> It's my fault that I misunderstood the meaning of the --use_checkpoint argument. Thanks a lot!
>
> Hi, is the memory-optimized version trained on 8 2080 Ti GPUs or only a single 2080 Ti?

I used this argument and trained on 8 TITAN Xp GPUs.

Using fewer cards for training will degrade the performance.

HELLORPG commented 2 years ago

> Hi, is the memory-optimized version trained on 8 2080 Ti GPUs or only a single 2080 Ti?

And if you use a 2080 Ti for training, 11 GB of CUDA memory may not be enough. But if you change some of the code, I think it will be fine on a 2080 Ti.

Bian-666 commented 2 years ago

> Hi, is the memory-optimized version trained on 8 2080 Ti GPUs or only a single 2080 Ti?
>
> And if you use a 2080 Ti for training, 11 GB of CUDA memory may not be enough. But if you change some of the code, I think it will be fine on a 2080 Ti.

Thanks for your prompt reply, thanks a lot!

HELLORPG commented 2 years ago

> Thanks for your prompt reply, thanks a lot!

😄 You're welcome.