hellomuffin / exif-as-language

official repo for the paper "EXIF as Language: Learning Cross-Modal Associations Between Images and Camera Metadata"
MIT License

Splicing Localization Issue #7

Closed. HighwayWu closed this issue 1 year ago.

HighwayWu commented 1 year ago

Thanks for your awesome work and for open-sourcing it. However, I cannot reproduce the localization cIoU or mAP reported in the paper. Specifically, on the Columbia dataset I can only obtain 0.76 cIoU, whereas the paper reports 0.98. Could you please provide some generated "pred['ms']" outputs for checking? E.g., could you release the "pred['ms']" for the "canong3_canonxt_sub_03.tif" file? My generated one is:

[attached image: canong3_canonxt_sub_03]
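For reference, here is a minimal sketch of how I am computing cIoU; the 0.5 threshold and taking the better of the map and its complement are my assumptions about the protocol, so please point out anything that differs from your evaluation:

```python
# Minimal cIoU sketch. Assumptions (not taken from the paper's code):
#   - pred['ms'] is a per-pixel consistency map in [0, 1], resized to the
#     shape of the binary ground-truth mask
#   - the map is binarized at 0.5
#   - since the polarity of the map is ambiguous, the better of the map
#     and its complement is scored
import numpy as np

def iou(pred_bin: np.ndarray, gt: np.ndarray) -> float:
    inter = np.logical_and(pred_bin, gt).sum()
    union = np.logical_or(pred_bin, gt).sum()
    return inter / union if union > 0 else 1.0

def ciou(ms: np.ndarray, gt_mask: np.ndarray, thresh: float = 0.5) -> float:
    pred = ms >= thresh
    # score both the map and its inverse, keep the better one
    return max(iou(pred, gt_mask), iou(~pred, gt_mask))
```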

HighwayWu commented 1 year ago

In addition, I also tried to reproduce the localization on the DSO dataset. For example, the pred['ms'] for the file "splicing-40" is as follows, which is much worse than the result shown in the first row of Fig. 9 in the appendix.

[attached image: splicing-40]
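If it helps to compare the raw arrays directly, I am dumping them like this (the dummy map and file name below are illustrative only):

```python
# Dump pred['ms'] to .npy so the raw maps can be shared and diffed;
# the dummy map shape and the file name are illustrative only.
import numpy as np

pred = {'ms': np.random.rand(480, 640)}  # stand-in for the model's output dict

np.save('splicing-40_ms.npy', pred['ms'])
reloaded = np.load('splicing-40_ms.npy')
assert np.allclose(reloaded, pred['ms'])
```
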
hellomuffin commented 1 year ago

Hi, thank you for trying out our model! It seems that the model I uploaded is a bit undertrained (trained for 48k steps, while, as stated in the paper, the full model is trained for ~73k steps). Sadly, the original full model and data were accidentally auto-cleaned by the cluster after a long period without access. To make up for this, I have uploaded another full model, trained for 75k steps on a different random sample of 1.5M images from the YFCC100M dataset. Qualitatively, its performance is extremely similar to the original full model; quantitatively there is a small difference, perhaps due to variance in the training data.

Specifically, the performance of this version of the full model on Columbia and DSO is as follows:

| Dataset  | mAP  | cIoU |
|----------|------|------|
| Columbia | 0.93 | 0.88 |
| DSO      | 0.65 | 0.80 |
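(For anyone checking their own numbers: a rough sketch of the kind of pixel-level mAP loop meant here is below. It assumes mAP is average precision over the pixels of each image, averaged across the dataset; the sklearn call is a stand-in, not our exact evaluation code.)

```python
# Sketch of pixel-level mAP: average precision per image over its pixels,
# then averaged across the dataset. Assumes each consistency map ms is a
# float array in [0, 1] aligned with a binary ground-truth mask gt.
import numpy as np
from sklearn.metrics import average_precision_score

def dataset_map(maps, masks):
    aps = [average_precision_score(gt.ravel().astype(int), ms.ravel())
           for ms, gt in zip(maps, masks)]
    return float(np.mean(aps))
```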

Thanks again for raising this issue. We are looking into regenerating the deleted data and training the model for the full number of steps to replicate the results reported in the paper. We will get back to you soon. Sorry for the inconvenience.

HighwayWu commented 1 year ago

Thank you for your reply and for addressing this issue. Great work again!