facebookresearch / moco

PyTorch implementation of MoCo: https://arxiv.org/abs/1911.05722
MIT License

Question about transferring to COCO with MoCo v1 and MoCo v2 checkpoints #127

Open mZhenz opened 2 years ago

mZhenz commented 2 years ago

Hi, I have some questions about reproducing the results on COCO.

When I use the MoCo v1 pretrained checkpoint released in this repo, I can reproduce the results reported in the paper:

Evaluation results for bbox:

| AP | AP50 | AP75 | APs | APm | APl |
|--------|--------|--------|--------|--------|--------|
| 38.658 | 58.372 | 41.765 | 21.351 | 43.372 | 51.629 |

Evaluation results for segm:

| AP | AP50 | AP75 | APs | APm | APl |
|--------|--------|--------|--------|--------|--------|
| 34.014 | 55.207 | 36.171 | 14.915 | 37.526 | 50.783 |
However, when I switch to the MoCo v2 pretrained checkpoint, I get worse results:

Evaluation results for bbox:

| AP | AP50 | AP75 | APs | APm | APl |
|--------|--------|--------|--------|--------|--------|
| 33.921 | 52.401 | 36.497 | 19.247 | 37.598 | 45.513 |

Evaluation results for segm:

| AP | AP50 | AP75 | APs | APm | APl |
|--------|--------|--------|--------|--------|--------|
| 30.113 | 49.341 | 31.897 | 13.713 | 32.557 | 45.618 |

There is a large gap of about 4 AP between the MoCo v1 and MoCo v2 results. The code I used is the detection code in this repo with the official settings. Is this expected, or am I making a mistake somewhere? I would appreciate any advice or explanation. Thanks a lot!
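One thing worth double-checking in a setup like this is the checkpoint conversion step: only the query encoder's backbone weights should be transferred, and MoCo v2's MLP projection head must be dropped (the repo's `detection/convert-pretrain-to-detectron2.py` script handles this). As a minimal illustrative sketch of that key filtering (function and variable names here are hypothetical, not from the repo):

```python
# Hypothetical sketch of the key filtering a MoCo -> detection conversion
# performs: keep only query-encoder backbone weights and strip the
# "module.encoder_q." prefix so a plain ResNet can load them.
PREFIX = "module.encoder_q."

def extract_backbone_state(checkpoint):
    """Return only the backbone weights, renamed; the projection head
    (fc.*) and the key encoder (encoder_k.*) are dropped."""
    backbone = {}
    for key, value in checkpoint["state_dict"].items():
        if key.startswith(PREFIX) and not key.startswith(PREFIX + "fc"):
            backbone[key[len(PREFIX):]] = value
    return backbone

# Tiny demo with dummy values standing in for tensors:
ckpt = {"state_dict": {
    "module.encoder_q.conv1.weight": 0,
    "module.encoder_q.fc.0.weight": 0,   # MoCo v2 MLP head -> dropped
    "module.encoder_k.conv1.weight": 0,  # key encoder -> dropped
}}
print(sorted(extract_backbone_state(ckpt)))  # ['conv1.weight']
```

If the v2 head or key-encoder weights leak into the detector's initialization (or the prefix stripping differs between the two checkpoints), the fine-tuned results can degrade, so verifying the converted state dict for both checkpoints is a cheap sanity check.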