mjkwon2021 / CAT-Net

Official code for CAT-Net: Compression Artifact Tracing Network. Image manipulation detection and localization.

About class weight #18

Open huytuong010101 opened 2 years ago

huytuong010101 commented 2 years ago

Hi @CauchyComplete again, have a nice day! In your paper and in your training code, I see that you put fivefold more weight on the tampered class (0.5/2.5 in your code). I wonder whether this depends on the dataset (the number of tampered and authentic images)? If it depends on the number of tampered and authentic images, how can I calculate this ratio? Thank you for your reply <3

CauchyComplete commented 2 years ago

Hi, I chose the class weights based on the number of authentic and tampered pixels across the datasets. To be specific, # auth pixels : # tamp pixels = tamp class weight : auth class weight. But you don't have to follow this protocol. It's up to you.
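The rule above (weights inversely proportional to pixel counts) can be sketched as follows. This is a hypothetical helper, not code from the repo; the `weight_sum=3.0` normalization is an assumption chosen so that the example counts reproduce the 0.5/2.5 weights mentioned in the thread.

```python
def class_weights(n_auth_pixels: int, n_tamp_pixels: int, weight_sum: float = 3.0):
    """Compute (auth_weight, tamp_weight) so that
    n_auth_pixels : n_tamp_pixels == tamp_weight : auth_weight,
    i.e. the rarer class gets proportionally more weight.

    weight_sum is an arbitrary scale (assumed here; 0.5 + 2.5 = 3.0
    matches the values quoted from the CAT-Net training code).
    """
    total = n_auth_pixels + n_tamp_pixels
    auth_w = weight_sum * n_tamp_pixels / total
    tamp_w = weight_sum * n_auth_pixels / total
    return auth_w, tamp_w

# Example: if authentic pixels outnumber tampered pixels 5:1,
# the tampered class gets five times the weight of the authentic class.
auth_w, tamp_w = class_weights(5_000_000, 1_000_000)
print(auth_w, tamp_w)  # → 0.5 2.5
```

The resulting pair can then be passed to a weighted loss, e.g. as the `weight` tensor of a pixel-wise cross-entropy loss.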

huytuong010101 commented 2 years ago

> Hi, I chose the class weights by the number of authentic and tampered pixels throughout the datasets. To be specific, # auth pixels : # tamp pixels = tamp class weight : auth class weight. But you don't have to follow this protocol. It's up to you.

Thank you! Do you have a plan to release the code used to generate the custom dataset?

CauchyComplete commented 2 years ago

Okay, I would like to support your work of implementing CAT-Net in TensorFlow. I will upload the tampCOCO and JPEG RAISE datasets to Baidu cloud tomorrow. I cannot read Chinese, but Baidu cloud gave me 105 GB of storage, so I may use it. Other English-supporting drives do not offer that much storage, as far as I know. If I fail to upload them, I'll send them to you personally via e-mail or something :)

huytuong010101 commented 2 years ago

> Okay, I would like to support your work of implementing CAT-Net with Tensorflow. I will upload the tampCOCO and JPEG RAISE datasets tomorrow to Baidu cloud. I cannot read Chinese but Baidu cloud gave me 105GB of storage so I may use it. Other English-supporting drives do not offer that much storage as far as I know. If I fail to upload them, I'll personally send you via e-mail or something :)

I'm really happy to hear that. Thank you so much <3

CauchyComplete commented 2 years ago

@huytuong010101 I've just uploaded all the custom datasets used in the paper to Google Drive. The datasets exceeded 105 GB, so I couldn't upload them to Baiduyun. Fortunately, I found that my university account gives me unlimited Google Drive storage, so I uploaded them there. Refer to README.md (the front page of this repo). Plus, if you finish the implementation, I kindly recommend releasing your code publicly.

huytuong010101 commented 2 years ago

Yes, thank you so much, it's really helpful.