clovaai / wsolevaluation

Evaluating Weakly Supervised Object Localization Methods Right (CVPR 2020)

OpenImages PxAP performance changed between MaxBoxAcc and MaxBoxAccV2. Why? #49

Closed sbelharbi closed 3 years ago

sbelharbi commented 3 years ago

Hi, in https://arxiv.org/pdf/2001.07437.pdf the PxAP performance on OpenImages changed between using MaxBoxAcc (Tab. 2) and MaxBoxAccV2 (Tab. 8). I didn't expect that; why is it? Since validation (for model selection) on OpenImages is done with PxAP (as is the test evaluation), choosing MaxBoxAcc or MaxBoxAccV2 in the code configuration should not affect the PxAP results, right?

Or did you run a new set of 30 random trials for the hyper-parameter search, leading to different best hyper-parameters for the PxAP numbers in Tab. 8? Or is it simply due to randomness, if the code is not exactly reproducible (i.e., running the same experiment twice with the same settings does not give exactly the same results)?
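To make my assumption about the metric explicit: as I understand it, PxAP is a pixel-wise average precision over thresholded CAM scores, so no bounding-box criterion enters the computation. A minimal sketch of what I mean (not the repository's actual code; `pxap`, `heatmaps`, `gt_masks`, and the use of scikit-learn are just my own illustration):

```python
# Rough sketch of pixel-wise average precision (PxAP): treat every pixel as a
# binary classification sample, with the CAM score as the prediction and the
# ground-truth mask as the label, and take the area under the pixel
# precision-recall curve. No box thresholding (MaxBoxAcc / MaxBoxAccV2) is
# involved anywhere in this computation.
import numpy as np
from sklearn.metrics import average_precision_score

def pxap(heatmaps, gt_masks):
    """heatmaps: list of HxW float arrays in [0, 1] (CAM scores).
    gt_masks: list of HxW binary arrays (1 = object pixel)."""
    scores = np.concatenate([h.ravel() for h in heatmaps])
    labels = np.concatenate([m.ravel() for m in gt_masks])
    return average_precision_score(labels, scores)
```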

(screenshot of Table 2 from the paper)

(screenshot of Table 8 from the paper)

Thanks!

junsukchoe commented 3 years ago

Thanks for your interest in our work :)

Yes, the numbers on the OpenImages dataset have changed. This is because we ran a new set of 30 random hyper-parameter search trials for OpenImages. Due to that randomness, the new results are not identical to the initial experiments. We would like to note, however, that our main conclusion does not change: WSOL methods after CAM have not improved significantly.
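To illustrate why a fresh search shifts the numbers, here is a rough sketch of a 30-trial random search selected by validation PxAP (the hyper-parameter ranges and the `train_and_eval` callback are hypothetical, not our exact search script):

```python
# Sketch of a random hyper-parameter search: without a fixed seed, re-running
# the search samples a different set of candidate configurations, so the best
# configuration (and hence the reported test PxAP) can differ between runs.
import random

def random_search(train_and_eval, n_trials=30, seed=None):
    """train_and_eval(lr, wd) -> validation PxAP (hypothetical callback)."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        lr = 10 ** rng.uniform(-5, 0)   # log-uniform learning rate
        wd = 10 ** rng.uniform(-6, -2)  # log-uniform weight decay
        score = train_and_eval(lr, wd)
        if best is None or score > best[0]:
            best = (score, lr, wd)
    return best
```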

sbelharbi commented 3 years ago

OK, thanks for the clarification.