Bonum opened this issue 3 years ago
You can follow the commands listed here: https://github.com/amazon-research/siam-mot#dataset-evaluation-and-training. You only need to substitute your own train_dir and model_suffix.
Training the whole network on 8 GPUs (V100) with the default configuration takes around 10 hours.
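For concreteness, a minimal sketch of what such a distributed launch might look like. The script path (`tools/train_net.py`) and the flag names (`--train-dir`, `--model-suffix`) are assumptions inferred from the `train_dir`/`model_suffix` wording above, not confirmed by this thread; the linked README is the authoritative source for the exact command.

```shell
# Hypothetical sketch only: the script path and flag names below are
# assumptions; check the siam-mot README for the actual training command.
TRAIN_DIR=./outputs/siammot_dla34_emm   # replace with your own train_dir
MODEL_SUFFIX=dla34_emm                  # replace with your own model_suffix

# Build the assumed 8-GPU (V100) launch command and print it rather than
# running it, since the repo is not checked out here.
CMD="python3 -m torch.distributed.launch --nproc_per_node=8 \
tools/train_net.py --train-dir ${TRAIN_DIR} --model-suffix ${MODEL_SUFFIX}"
echo "${CMD}"
```

With the default configuration, a run launched this way reportedly takes around 10 hours on 8 V100s.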
Could you publish the training command settings that reproduce SiamMOT-DLA34-EMM from the Model Zoo, along with the approximate training time?
Thank you for your great work.