Closed wengdunfang closed 2 years ago
Sorry, the open-source model requires strict approval from the company, so we can not upload Moco and Moby pre-trained models. You can use the original repos to pre-train the models. We used the default training settings.
OK, thanks for your reply. In MoCo v3, is the batch size equal to 1024?
Yes. Here are the training settings (image size = 256×128):
python main_moco.py \
-a vit_small -b 1024 \
--optimizer=adamw --lr=1.5e-4 --weight-decay=.1 \
--epochs=300 --warmup-epochs=40 \
--stop-grad-conv1 --moco-m-cos --moco-t=.2 \
--data ./datasets/LUP
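For reference, the `--lr=1.5e-4` above is a base value: MoCo v3's `main_moco.py` linearly scales it by `batch_size / 256` before applying a linear warmup followed by half-cosine decay. A minimal sketch of that schedule (the helper name `effective_lr` is ours, not from the repo):

```python
import math

def effective_lr(base_lr, batch_size, epoch, warmup_epochs, total_epochs):
    """Linear lr scaling with linear warmup, then half-cosine decay."""
    lr = base_lr * batch_size / 256  # linear scaling rule
    if epoch < warmup_epochs:
        return lr * epoch / warmup_epochs  # linear warmup
    progress = (epoch - warmup_epochs) / (total_epochs - warmup_epochs)
    return lr * 0.5 * (1.0 + math.cos(math.pi * progress))  # cosine decay

# With the settings above (lr=1.5e-4, batch=1024), the peak lr after the
# 40-epoch warmup is 1.5e-4 * 1024/256 = 6e-4.
print(effective_lr(1.5e-4, 1024, 40, 40, 300))  # 0.0006
```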
thank you
Hi, can you provide pre-trained model weights of MoCo and MoBy on the LUPerson dataset?