PatrickHua / SimSiam

A PyTorch implementation of the paper "Exploring Simple Siamese Representation Learning"
MIT License

Is "DistributedSampler" necessary? #16

Closed · Kungbohan closed 3 years ago

Kungbohan commented 3 years ago

Hello, I noticed that there is no `DistributedSampler` in the code. Is that intentional? Without it, I think each GPU would load the full shuffled dataset independently, so the model would process the same data multiple times (once per GPU) in a single epoch. I'm not sure if this is expected. Thank you very much.
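For context, a minimal sketch of the behavior being asked about: `DistributedSampler` partitions the dataset so each process (rank) sees a disjoint shard, instead of every GPU iterating over the same shuffled data. This toy example is not from the repo; it simulates 2 GPUs by passing `num_replicas`/`rank` explicitly (in real DDP training these come from `torch.distributed`).

```python
import torch
from torch.utils.data import DataLoader, TensorDataset, DistributedSampler

# Toy dataset of 8 samples; simulate 2 GPUs by passing rank/num_replicas
# explicitly (normally these are inferred from the initialized process group).
dataset = TensorDataset(torch.arange(8))

shards = []
for rank in range(2):
    sampler = DistributedSampler(dataset, num_replicas=2, rank=rank,
                                 shuffle=True, seed=0)
    sampler.set_epoch(0)  # call every epoch so the shuffle order changes
    loader = DataLoader(dataset, batch_size=4, sampler=sampler)
    shards.append({int(x) for (batch,) in loader for x in batch})

# Each rank gets a disjoint half of the data; together they cover it all.
assert shards[0].isdisjoint(shards[1])
assert shards[0] | shards[1] == set(range(8))
```

Without the sampler, a plain `DataLoader(dataset, shuffle=True)` on each process would yield all 8 samples on every rank, which is the duplication the question describes.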

PatrickHua commented 3 years ago

I've changed the code. The fix came from someone else's pull request.