Cysu / open-reid

Open source person re-identification library in python
https://cysu.github.io/open-reid/
MIT License

Triplet-loss on Market1501 Batch_size is ? #1

Closed. miraclebiu closed this issue 7 years ago

miraclebiu commented 7 years ago

The script for training with triplet loss doesn't give an explicit value for the batch size, so the default batch size of 256 is used (I'm not quite sure)? But in the paper the number is 18. Maybe that's the reason?

Cysu commented 7 years ago

Thank you very much for the suggestion! The default setting is a batch size of 256 on 4 GPUs, so each GPU gets a mini-batch of 64 images, covering 16 identities with 4 instances each (--num-instances 4). It is similar to the paper's setting.
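
For concreteness, here is a minimal sketch of such an identity-balanced (P x K) sampler in PyTorch. The class and argument names are illustrative only, not the library's actual implementation:

```python
import random
from collections import defaultdict

from torch.utils.data.sampler import Sampler


class IdentityBatchSampler(Sampler):
    """Yield indices so that every mini-batch holds `num_instances` images
    for each of `batch_size // num_instances` identities
    (e.g. 64 = 16 identities x 4 instances)."""

    def __init__(self, labels, batch_size=64, num_instances=4):
        self.batch_size = batch_size
        self.num_instances = num_instances
        self.num_pids_per_batch = batch_size // num_instances
        # Group dataset indices by person identity.
        self.index_by_pid = defaultdict(list)
        for index, pid in enumerate(labels):
            self.index_by_pid[pid].append(index)
        self.pids = list(self.index_by_pid.keys())

    def __iter__(self):
        pids = self.pids[:]
        random.shuffle(pids)
        stop = len(pids) - self.num_pids_per_batch + 1
        for start in range(0, stop, self.num_pids_per_batch):
            for pid in pids[start:start + self.num_pids_per_batch]:
                indices = self.index_by_pid[pid]
                if len(indices) >= self.num_instances:
                    chosen = random.sample(indices, self.num_instances)
                else:
                    # Too few images for this identity: sample with replacement.
                    chosen = random.choices(indices, k=self.num_instances)
                yield from chosen

    def __len__(self):
        return (len(self.pids) // self.num_pids_per_batch) * self.batch_size
```

Passing this sampler to a `torch.utils.data.DataLoader` with the same `batch_size` keeps every 64-image batch identity-balanced.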

I have actually tried using exactly the same setting as the paper (one GPU, 18 identities x 4 instances/id). The result is similar to the default setting above. Maybe I've missed something critical and need to wait for the authors' code.

miraclebiu commented 7 years ago

Thanks! I will do some experiments too.

liangbh6 commented 6 years ago

@Cysu I modified the market1501 split to keep the standard protocol and trained the model using exactly the same setting as the paper (one GPU, 18 identities x 4 instances/id), but I only get rank-1 78.5%. Could you give me some suggestions on reproducing the results of TriNet?