AidenDurrant / MoCo-Pytorch

An unofficial PyTorch implementation of "Improved Baselines with Momentum Contrastive Learning" (MoCoV2) by X. Chen et al.

Hello, do we need to pretrain the contrastive model? #2

Closed trinhvg closed 2 years ago

trinhvg commented 2 years ago

Hello, do we need to pretrain the contrastive model? I have seen other codebases that don't do this.

trinhvg commented 2 years ago

Sorry, I misunderstood your pretrain module. It is the contrastive learning module itself, so it makes sense.
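
For anyone else landing on this issue: the "pretrain" stage here refers to the MoCo contrastive pretraining itself, not an extra step before it. Below is a minimal, generic sketch of what that stage does (momentum encoder, negative-key queue, InfoNCE loss). It is not this repository's actual API; all class and variable names are illustrative, and the toy encoder is a stand-in for a real backbone such as a ResNet.

```python
# Generic MoCo-style contrastive pretraining sketch (illustrative, not this repo's code).
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoCoSketch(nn.Module):
    """Query encoder + momentum key encoder + queue of negative keys."""

    def __init__(self, encoder: nn.Module, feat_dim: int = 128,
                 queue_size: int = 4096, momentum: float = 0.999,
                 temperature: float = 0.07):
        super().__init__()
        self.encoder_q = encoder
        self.encoder_k = copy.deepcopy(encoder)   # momentum (key) encoder
        for p in self.encoder_k.parameters():
            p.requires_grad = False               # updated only by EMA, not by gradients
        self.m = momentum
        self.t = temperature
        self.register_buffer("queue", F.normalize(torch.randn(feat_dim, queue_size), dim=0))
        self.register_buffer("ptr", torch.zeros(1, dtype=torch.long))

    @torch.no_grad()
    def _momentum_update(self):
        # EMA update: key encoder slowly tracks the query encoder.
        for pq, pk in zip(self.encoder_q.parameters(), self.encoder_k.parameters()):
            pk.data.mul_(self.m).add_(pq.data, alpha=1.0 - self.m)

    @torch.no_grad()
    def _enqueue(self, keys: torch.Tensor):
        # Replace the oldest keys in the queue (assumes queue_size % batch_size == 0).
        n = keys.shape[0]
        ptr = int(self.ptr)
        self.queue[:, ptr:ptr + n] = keys.T
        self.ptr[0] = (ptr + n) % self.queue.shape[1]

    def forward(self, im_q: torch.Tensor, im_k: torch.Tensor) -> torch.Tensor:
        q = F.normalize(self.encoder_q(im_q), dim=1)
        with torch.no_grad():
            self._momentum_update()
            k = F.normalize(self.encoder_k(im_k), dim=1)
        # InfoNCE: one positive logit (q·k) against negatives drawn from the queue.
        l_pos = (q * k).sum(dim=1, keepdim=True)
        l_neg = q @ self.queue.clone().detach()
        logits = torch.cat([l_pos, l_neg], dim=1) / self.t
        labels = torch.zeros(logits.shape[0], dtype=torch.long, device=logits.device)
        self._enqueue(k)
        return F.cross_entropy(logits, labels)


# Usage sketch: two augmented views of the same batch go through the model.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128))  # toy stand-in for a backbone
moco = MoCoSketch(encoder)
im_q, im_k = torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32)
loss = moco(im_q, im_k)
loss.backward()
```

After this contrastive pretraining stage, the usual MoCo workflow keeps the encoder weights and trains a linear classifier (or fine-tunes a head) on top of them for the downstream task, which may be why other codebases that start from already-pretrained weights appear to skip the pretraining step.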