xiaowei-hu / CycleGAN-tensorflow

Tensorflow implementation for learning an image-to-image translation without input-output pairs. https://arxiv.org/pdf/1703.10593.pdf

I have trained CycleGAN on a couple of datasets; after testing, the displayed images are not in RGB color format, but in a different format #28

Open charan223 opened 6 years ago

cschar commented 6 years ago

@charan223 Had the same issue, but ended up realizing I hadn't trained for enough epochs; I had initially trained 10 on ukiyoe2photo. After training for about 5 more hours, the weird color/RGB format 'effect' resolved itself.

hyunSo commented 6 years ago

@cschar Does that mean I need to pre-train the networks on some verified dataset and then fine-tune with my own dataset? In my case, I trained the networks on iphone2dslr_flower for about 10 epochs but still got the color 'effect'.

hyunSo commented 6 years ago

@cschar Also, since you mentioned ukiyoe2photo: I'm working with that dataset as well, but the same effect remains... This is a sample result from epoch 16. b_15_0269
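
(Aside: before retraining from scratch, it can be worth ruling out a simple channel-order mixup in whatever loader/viewer you use to inspect the samples. A minimal sketch below; the sample path is hypothetical, not the repo's exact file naming.)

```python
# Minimal check (not from this repo): view a saved sample with and without the
# R and B channels swapped, to rule out an RGB<->BGR ordering bug in the viewer.
import numpy as np
from PIL import Image

sample = np.array(Image.open("sample/b_15_0269.jpg"))   # hypothetical sample path
swapped = sample[..., ::-1].copy()                       # reverse channel order

Image.fromarray(sample).show()    # image as saved
Image.fromarray(swapped).show()   # same image with channels reversed
# If neither view looks natural, the odd colors come from the model itself
# (under-training / initialization), not from a channel-order bug.
```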

cschar commented 6 years ago

@hyunSo perhaps this link might be helpful: https://hardikbansal.github.io/CycleGANBlog/#Final-Comments

During training we noticed that the output results were sensitive to initialization. Thanks to vanhuyz for pointing this out and suggesting training multiple times to get the best results. You might notice the background color being reversed, as in the following image. This effect can be observed only after 10-20 epochs, and you can try to run the code again.

Perhaps it is just bad initial weights being set.
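
For what it's worth, a minimal sketch of that "run the code again" advice, assuming a TensorFlow 1.x setup like this repo's; `build_and_train` is a hypothetical placeholder, not a function that exists in this repo.

```python
# Sketch: launch a few independent training runs with different random seeds,
# since the output quality appears sensitive to initialization.
import numpy as np
import tensorflow as tf


def build_and_train(run_name):
    # hypothetical placeholder: build the CycleGAN graph and run its training loop
    pass


for seed in (0, 1, 2):            # a few independent runs
    tf.reset_default_graph()      # fresh graph for each run
    tf.set_random_seed(seed)      # seeds TF's variable initializers
    np.random.seed(seed)          # seeds numpy-side shuffling/cropping
    build_and_train("cyclegan_seed%d" % seed)
# Keep the run whose samples look best after 10-20 epochs; the reversed
# colors often disappear with a different initialization.
```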