damo-cv / TransReID-SSL

Self-Supervised Pre-Training for Transformer-Based Person Re-Identification
MIT License
172 stars · 20 forks

Pre-trained Models #4

Closed wengdunfang closed 2 years ago

wengdunfang commented 2 years ago

Hi, can you provide the pre-trained model weights of MoCo and MoBy on the LUPerson dataset?

michuanhaohao commented 2 years ago

Sorry, open-sourcing the models requires strict approval from the company, so we cannot upload the MoCo and MoBy pre-trained models. You can use the original repos to pre-train the models; we used the default training settings.

wengdunfang commented 2 years ago

> Sorry, open-sourcing the models requires strict approval from the company, so we cannot upload the MoCo and MoBy pre-trained models. You can use the original repos to pre-train the models; we used the default training settings.

OK, thanks for your reply. In MoCoV3, is the batch size equal to 1024?

michuanhaohao commented 2 years ago

> OK, thanks for your reply. In MoCoV3, is the batch size equal to 1024?

Yes. Here are the training settings (image size = 256×128):

```shell
python main_moco.py \
  -a vit_small -b 1024 \
  --optimizer=adamw --lr=1.5e-4 --weight-decay=.1 \
  --epochs=300 --warmup-epochs=40 \
  --stop-grad-conv1 --moco-m-cos --moco-t=.2 \
  --data ./datasets/LUP
```

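For context on what these defaults imply: the upstream moco-v3 recipe scales the base `--lr` by the linear scaling rule (lr × batch / 256) and then follows a linear-warmup-plus-cosine-decay schedule over the epochs. A minimal standalone sketch with the numbers from the command above (the function is illustrative, not code from this repo):

```python
import math

# Numbers taken from the command above.
BASE_LR = 1.5e-4
BATCH_SIZE = 1024
EPOCHS = 300
WARMUP_EPOCHS = 40

# Linear scaling rule applied before training: lr = base_lr * batch / 256.
SCALED_LR = BASE_LR * BATCH_SIZE / 256  # 6e-4 for batch 1024

def lr_at_epoch(epoch: float) -> float:
    """Learning rate at a given (possibly fractional) epoch."""
    if epoch < WARMUP_EPOCHS:
        # Linear warmup from 0 up to the scaled peak LR.
        return SCALED_LR * epoch / WARMUP_EPOCHS
    # Cosine decay from the peak down to 0 over the remaining epochs.
    progress = (epoch - WARMUP_EPOCHS) / (EPOCHS - WARMUP_EPOCHS)
    return SCALED_LR * 0.5 * (1.0 + math.cos(math.pi * progress))

print(lr_at_epoch(0))    # start of warmup: 0.0
print(lr_at_epoch(40))   # peak after warmup: 6e-4
print(lr_at_epoch(300))  # end of training: ~0
```

Note that with batch size 1024 the effective peak learning rate is 6e-4, not the 1.5e-4 passed on the command line.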
wengdunfang commented 2 years ago

Thank you!