MIV-XJTU / ARTrack

Apache License 2.0
228 stars 33 forks

Performance on GOT-10k #30

Closed — qkdkralsgh closed this 11 months ago

qkdkralsgh commented 1 year ago

Hello, first of all thank you for the good work.

I have one question: is the reported performance of the ARTrack_L384 model on the GOT-10k dataset (AO: 78.5%) obtained by training on the full dataset?

AlexDotHam commented 1 year ago

No, the GOT-10k result we report comes from a model trained only on the GOT-10k full train set. As you can see on the GOT-10k leaderboard, the model trained on the full datasets, including LaSOT, TrackingNet, COCO2017, and GOT-10k_vottrain, achieves higher performance (AO: 79.6%). I can also provide full-dataset results for the other models: ARTrack_256 reaches about 75.5% AO, and ARTrack_384 about 77.2% AO.

qkdkralsgh commented 11 months ago

Thank you for the answer. So, where is the yaml file for training the large model with only the GOT-10k dataset?

AlexDotHam commented 11 months ago

You can follow the large model's yaml, set the number of training epochs to match base_256_got, and keep the batch size from the large model's yaml. All other hyper-parameters are the same as the large model's.
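To make this concrete, here is a rough sketch of the kind of override such a GOT-10k-only config could contain. The file names, key names, and the specific epoch and batch-size values below are all assumptions for illustration, not the repository's actual schema or settings; check the existing base_256_got and large-model yaml files in the repo for the real keys and values.

```yaml
# Hypothetical artrack_l384_got.yaml sketch (key names are illustrative).
# Start from the large-model yaml and change only what the maintainer noted:
#   - restrict training data to the GOT-10k full train split
#   - take the epoch count from the base_256_got config
#   - keep the batch size from the large model's yaml
DATA:
  TRAIN:
    DATASETS_NAME: ["GOT10K_train_full"]  # GOT-10k full train set only
    DATASETS_RATIO: [1]
TRAIN:
  EPOCH: 60        # assumed: copied from base_256_got, verify against the repo
  BATCH_SIZE: 32   # assumed: kept from the large model's yaml
# Everything else: identical to the large model's yaml.
```

The point of the sketch is the override pattern — only the dataset list, epoch count, and batch size change relative to the large model's config.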

qkdkralsgh commented 11 months ago

Thank you very much!