junyanz / pytorch-CycleGAN-and-pix2pix

Image-to-Image Translation in PyTorch

The single-direction test result is so weird #873

Open A-Little-Nut opened 4 years ago

A-Little-Nut commented 4 years ago

Hi. I have trained the CycleGAN model on the cityscapes dataset. Now I want to test from testA (the real images) to testB (the labels). I renamed latest_net_G_A.pth to latest_net_G.pth and then ran "python test.py --dataroot datasets/cityscapes/testA --name cityscapes_cyclegan --model test --no_dropout". I expected the result to be "real A" and "rec B", but to my surprise it is "real A" and "fake B". What should I do to get my desired result? [screenshot 2019-12-11 22-03-49]
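
For reference, a minimal shell sketch of the steps described above, assuming the repository's default ./checkpoints/<name>/ layout:

```bash
# Copy the A->B generator weights to the name the single-direction "test" model expects
# (paths assume --checkpoints_dir ./checkpoints and --name cityscapes_cyclegan).
cp ./checkpoints/cityscapes_cyclegan/latest_net_G_A.pth \
   ./checkpoints/cityscapes_cyclegan/latest_net_G.pth

# Run single-direction inference on testA (real photos -> labels).
python test.py --dataroot datasets/cityscapes/testA \
               --name cityscapes_cyclegan --model test --no_dropout
```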

junyanz commented 4 years ago

The labels are correct: "real A" is the input image, and "fake B" is the generated output. That said, the results look worse than normal. Have you trained the model for 200 epochs?

A-Little-Nut commented 4 years ago

Thanks for your reply. I have trained the model for 200 epochs. [screenshot 2019-12-13 14-45-08] It's so weird. I also downloaded the pretrained model and tested it on the same dataset, and the results are not good either. [screenshot 2019-12-13 15-05-05] [screenshot 2019-12-13 15-05-20] Are the above results normal, or am I doing something wrong?

junyanz commented 4 years ago

It looks worse than normal. Could you share your training and test scripts with us? What is the size of your input images?

A-Little-Nut commented 4 years ago

Thanks for your reply again. My training and test scripts are as follows.
Training script: python train.py --dataroot ./datasets/cityscapes --name cityscapes_cyclegan --model cycle_gan
Test script: python test.py --dataroot datasets/cityscapes/testA --name cityscapes_cyclegan --model test --no_dropout
Why did I get those results?

junyanz commented 4 years ago

I don't remember the training details, but the pre-trained models might have been trained on 128x128 crops. Please try adding --load_size 143 --crop_size 128 to your test script.
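
A sketch of the test command with these flags added, assuming the same experiment name as above:

```bash
# Resize the input to 143 and crop to 128 so the test-time resolution matches
# the resolution the pre-trained models were (likely) trained on.
python test.py --dataroot datasets/cityscapes/testA \
               --name cityscapes_cyclegan --model test --no_dropout \
               --load_size 143 --crop_size 128
```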

CR-Gjx commented 4 years ago

I get weird results like yours. Did you solve this problem? If so, please give me some tips; I used the same training script.

CR-Gjx commented 4 years ago

@A-Little-Nut Thanks!

A-Little-Nut commented 4 years ago

I don't really understand the reason behind it, but I think it may be that the model does not fully converge on this dataset. If you have trained the model yourself, you will notice that the intermediate results are not that bad. So you can stop training early, or simply use an intermediate checkpoint you have already saved. I hope this helps.
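
A sketch of how an intermediate checkpoint could be tested instead of the latest one, assuming the repository's --epoch option and its epoch-numbered checkpoint files; the epoch number 100 is only an example:

```bash
# Copy the A->B generator saved at an intermediate epoch (here: 100, as an example)
# to the name the single-direction "test" model expects.
cp ./checkpoints/cityscapes_cyclegan/100_net_G_A.pth \
   ./checkpoints/cityscapes_cyclegan/100_net_G.pth

# Load that checkpoint via --epoch instead of the default 'latest'.
python test.py --dataroot datasets/cityscapes/testA \
               --name cityscapes_cyclegan --model test --no_dropout \
               --epoch 100
```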

CR-Gjx commented 4 years ago

I also observe this interesting phenomenon, but I don't know whether it is caused by some mistake of mine. I will try it again. Thanks!

A-Little-Nut commented 4 years ago

Have you evaluated your model using the evaluate.py script that the author provides?

CR-Gjx commented 4 years ago

Yes, I have reproduced the results. For the label -> photo direction you need to set batch_size=1, and for the photo -> label direction batch_size=3. In both cases, use load_size=143 and crop_size=128.
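
A sketch of the training settings described above; the experiment names are hypothetical, and the flags follow the repository's train.py options:

```bash
# Run used for the photo -> label results: batch_size 3 (experiment name is hypothetical).
python train.py --dataroot ./datasets/cityscapes --name cityscapes_photo2label \
                --model cycle_gan --batch_size 3 --load_size 143 --crop_size 128

# Run used for the label -> photo results: batch_size 1 (experiment name is hypothetical).
python train.py --dataroot ./datasets/cityscapes --name cityscapes_label2photo \
                --model cycle_gan --batch_size 1 --load_size 143 --crop_size 128
```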

A-Little-Nut commented 4 years ago

Thanks! Is the model you tested the pretrained one, or a model you trained yourself?

CR-Gjx commented 4 years ago

I trained it myself! @A-Little-Nut